When the President’s Son-In-Law Truly Was a Great Success

 

For many Americans, the idea of the president tasking his son-in-law with solving national, even international, crises seems problematic, if not absurd. But it happened once before, and it turned out to be the kind of “great success story” our current first family wants us to believe in again. Slightly over a century ago, as the US mobilized for the First World War, the nation faced devastating breakdowns of its financial and transport systems. In response, President Woodrow Wilson leaned heavily on his talented and experienced Treasury Secretary, William McAdoo, who just happened to be his son-in-law. Looking back at this episode tells us a lot about what makes for successful emergency management at the highest levels of government.

 

Any analysis needs to start with the fact that Wilson and McAdoo had been political allies before becoming in-laws. The two shared a vision of an activist federal government that could respond to the rise of giant corporations that began to dominate the national economy in the late 19th century. Also, they shared similar personal backgrounds, both growing up in the post-Civil War South before heading north to achieve success and fame. McAdoo joined Wilson’s cabinet after a career as a lawyer-entrepreneur that turned him into a celebrity – a business executive skilled at advancing the public interest, as well as the bottom line.

 

McAdoo’s first business venture – a pioneering effort to convert an existing horse-drawn trolley line in Knoxville, Tennessee, into one of the country’s first electrified light-rail systems – failed when construction delays outran his credit lines. To dig himself out of debt, he moved to New York, where he worked reorganizing financially shaky railroad companies. Still a visionary risk taker eager to exploit the latest technologies, but now with greater skills, McAdoo tackled one of the greatest construction challenges of the day: drilling train tunnels under the Hudson River to replace time-consuming commuter ferries. Others had tried before, only to give up in the face of fatal accidents and crushing costs. But McAdoo’s attempt succeeded – a result of his insistence on the latest engineering advances, his ability to assemble tens of millions of investor dollars, and his expertise as a manager. The public hailed him as a hero. 

 

As president of the Hudson and Manhattan Railway, McAdoo’s commitment to passenger service, progressive labor policies, and the larger public good only added to his charisma. The company’s motto, “Let the Public Be Pleased,” was meant to evoke the infamous response of the fabulously wealthy William Henry Vanderbilt, head of the New York Central Railroad. When confronted by rider complaints, Vanderbilt had blurted out: “The public be damned!” Asked to explain, he insisted that the sole responsibility of a business should be to its owners, not its customers, much less the larger society. 

 

In his role as Secretary of the Treasury, McAdoo was central to passage of much of the legislation for which Wilson’s administration is remembered, most notably the Federal Reserve Act. Despite their strong working relationship, however, the President was less than elated when his widowed 51-year-old cabinet member asked for the hand of his youngest daughter, Eleanor, only 25 at the time. But Wilson reconciled himself to the match, with the wedding taking place in the White House in the spring of 1914, just months before hostilities broke out in Europe.

 

The war presented enormous challenges clearly within the purview of the Treasury Department. When fighting began, European investors scrambled to sell their US securities and convert the proceeds into gold. The run on US gold reserves threatened to bring down the banking system. McAdoo responded with a bold decision to close the New York Stock Exchange for four months, and the dollar stabilized. To help finance the war, he staged the famous “Liberty Loan” bond drives, aimed not only at raising money but at drawing it primarily from the mass public so as to dampen inflation. Each issue was oversubscribed, with over 80% of American households contributing.

 

With Wilson’s encouragement, McAdoo’s portfolio expanded. When German submarine warfare threatened a collapse of ocean shipping, Congress approved a scaled-back version of McAdoo’s proposal for a federally owned and managed merchant marine. The Emergency Fleet Corporation set up modernized shipbuilding facilities that used simplified designs and standardized parts to construct ships in record time. Meanwhile it provided the model for other so-called war corporations, a nimble new breed of administrative agencies by which the government could coordinate particular sectors of the economy.

 

Resolving the paralysis of the country’s railroads is the war achievement for which McAdoo is best remembered. With practically all rail traffic headed east to deliver soldiers and supplies for shipment to European allies, the country’s rail network seized up in late 1917. Trains stood waiting to be unloaded at eastern ports unequipped to deal with the volume, while agricultural producers in the South and West faced ruin for want of empty boxcars. In response, Wilson temporarily nationalized the railroads, putting McAdoo in charge. With a team of rail executives to advise him, McAdoo unsnarled the system by eliminating competition among the more than 400 private railroad companies then in existence, responding to appeals from railway unions for better pay and working conditions, and operating the country’s quarter-of-a-million miles of rails as a single system. Meanwhile, as with ships, he instituted rationalized designs and mass production processes to deliver new rolling stock faster than previously thought possible.

 

What can we learn about public management in emergencies by looking at this son-in-law who stepped up so capably? There is no doubt that having a personal relationship with the President, a shared political philosophy, and similarities in background helped. Being a son-in-law probably helped, too, although the marriage came after McAdoo already had the President’s ear and trust.

 

But these factors would not have made McAdoo so effective without his own strengths. From his failures as well as successes, he had learned how to run investor-owned enterprises that produced value not just for owners, but for all stakeholders. His deep experience in the worlds of finance and transportation meant that he possessed what is called “domain expertise” – fine-grained, specific knowledge, not just brain-power or general business experience. And in addition to his understanding and appreciation of the private sector, he believed in government and was comfortable working within it.  

 

Unfortunately not all sons-in-law are William McAdoos.

Disruption and Resilience: Lessons from the Ancient History of the 2000s

 

 

In the very first episode of Mr. Rogers’ Neighborhood in 1967, the puppet character King Friday XIII declared, “Down with the changers – because we don’t want anything to change!” Fred Rogers astutely recognized how difficult it is for many to accept change. It is thrust upon us, unwelcome, and we have to adapt to our new reality. Disruption happens.

 

This has certainly been the case with the COVID-19 pandemic that erupted in the U.S. in early 2020. It brought on a swift and unexpected recession that threw millions out of work. People have compared this not just to the Great Recession of a decade earlier, but to the Great Depression of the 1930s. Indeed, as I write this in the midst of the crisis, no one knows how long this era of public distancing, mask-wearing and daily death counts will last. We have all had our lives severely disrupted.

 

We like to believe that we are in control, but control is often a myth. Disruption is always around the corner, and with it comes anxiety, uncertainty, and an unknown future that leads to toilet paper hoarding. When will this public health crisis end? When will life return to normalcy? When will it be safe to gather with friends? How quickly will the economy rebound once the crisis ends? When is it safe to go on a date, or be intimate, or dine in a restaurant, or board an airplane? No one knows.

 

The pandemic is a huge disruption that will likely last into 2021, with the hope that deploying a viable vaccine will bring it to an end. Millions are unemployed with uncertain prospects of businesses reopening. Schools are canceled for the academic year and children are home, driving their parents bonkers. Entire families have found their lives disrupted.

 

But I will tell you: this isn’t forever. This is just for now. We will get through this. 

 

How do we know? Because we’ve been through this many times before in our history. Maybe not exactly this, but many other disruptions, and we’ve always managed to come out the other side. True, this is my first pandemic, as it most likely is yours too. History is full of disruption but also catharsis — and we humans have proved resilient.

 

In my latest book, A Decade of Disruption: America in the New Millennium 2000 – 2010, I composed a narrative history of a controversial decade that we all lived through, bookended by two major financial crises: the dot.com meltdown and the Great Recession – and plenty of disruptive events in between. Here we are a decade later with a strong sense of déjà vu. Remember how the nation was glued to our television sets in the days after the September 11 terrorist attacks? It rather feels that way now as the news is full of stories about the coronavirus. Everything changed after 9/11, just as things will be different once this pandemic ends.

 

The story of our lives is one of disruption but also resilience. 9/11 was enormously disruptive, an event that shaped the first decade and reminded Americans that the world is a complicated and dangerous place. It disproved the post-Cold War notion that history was over (hint: it’s never over). I worked at WorldCom at the time, and witnessed firsthand the meltdown of a once-proud company, brought down by simple corporate greed aimed at boosting the stock price. The Iraq War, the biggest strategic mistake of the decade, was a self-inflicted wound that damaged our country’s reputation abroad. The Great Recession was a seismic disruption, costing millions of people their homes and prompting a giant, unpopular bailout of the financial sector that planted the seeds of the Tea Party.

 

We’ve been blessed with the internet, an amazing, resilient tool designed to survive a nuclear war, which puts all of the world’s information (and mis/disinformation as well) right at our fingertips. But how much of our privacy have we yielded by embracing new technologies, when many companies are more than happy to sell our personal data to advertisers?  

Digital technology has proved massively disruptive. Amazon steamrolled over the retail market. Google overturned so many business models, and now consumers expect online content to be free. Netflix undermined the movie theater business. Ride sharing companies Lyft and Uber have disrupted the taxi industry. Natural gas and renewables are putting the coal industry out of business. Someone is always building a better mousetrap, rendering your business model obsolete. 

 

And yes, in our own time, President Donald Trump has proved the ultimate disruptor. He is a symptom, but not a cause, of white working-class disaffection after the Great Recession shook out many of the remaining high-paying, low-skill jobs. He tapped into white America’s racial anxiety of a nation becoming demographically ever browner. As I wrote in A Decade of Disruption, Trump “drove a red-hot poker into America’s cultural fissures and fanned the flames, widening an already divided country into hostile camps.”

 

Trump’s management of the pandemic has been disastrous. He ignored the looming pandemic for a crucial seventy days, actively undermined governors and scientists, endlessly contradicted himself, spewed out nonsense like injecting bleach and disinfectant, set a bad example by refusing to wear a face mask, and stoked tribal divisions. He showed no ability to unite the country behind a common purpose, let alone empathize with people’s suffering. There is no Harry Truman “the buck stops here” sense of responsibility with Trump, who simply refuses to lead. He is a terrible crisis manager who increasingly resembles a flailing Herbert Hoover during the Great Depression. Trump’s ineptness in the pandemic far outstrips George W. Bush’s sluggish response to Hurricane Katrina in 2005.

 

We are reminded how senseless tribalism is in the face of the COVID-19 pandemic. The coronavirus does not discriminate between old and young, rural and urban, Democrat and Republican. But is the pandemic a shared national experience like World War II or 9/11? Yes and no. We shared the isolation, and we shared it on social media, but the coronavirus affected different parts of the country differently. Some chafed at the restrictions; others wanted to reopen businesses and face the dire consequences. Still others thought it was no worse than the flu. 

 

And yet.

 

Faced with the Darwinian choice of allowing an estimated 2.2 million people to die – largely the elderly and those with underlying health conditions – in order to save the economy, Americans chose collectively to sacrifice by staying home so that others might live. We rallied around the idea that we had to protect others. We demonstrated incredible and unexpected solidarity as the pandemic became part of our shared national experience. And we watched a lot of Netflix, learned how to create a sourdough starter, and made ice cream in a Mason jar.

 

With the pandemic, we essentially put our economy into a medically-induced coma to defeat this nasty pathogen. We hope the patient can spring back once revived. A huge downside is that many of the most vulnerable are suffering the most – those in the service economy. These are people with low wages who tend to live paycheck-to-paycheck.

 

Knowledge workers have the luxury of working from home, even if their kids are a major distraction. But if a service economy employee can’t get to work, they don’t get paid, simple as that. As a full-time author and tour guide, I learned firsthand that there is no safety net in the gig economy. In March 2020 – the start of the busy season for tour guides – the hospitality industry was simply eviscerated as the pandemic gained steam. Nearly 41 million people were laid off in the first ten weeks as the health emergency fueled an economic crisis unprecedented in our nation’s history.

 

The coronavirus has disproportionately impacted communities of color, where poverty and race intersect to produce asthma, diabetes, heart and lung ailments, and hypertension – all underlying health conditions that the pathogen has exploited. We have seen far too many of our fellow citizens die from COVID. People of color are also more likely to be “essential workers” in food delivery, grocery stores, hospitals, and nursing homes. They stand in harm’s way to protect the rest of us.

 

Franklin Delano Roosevelt stated in his first inaugural address on March 4, 1933, in the worst days of the Great Depression, when a quarter of the American workforce was unemployed: “If I read the temper of our people correctly, we now realize as we have never realized before our interdependence on each other; that we cannot merely take but we must give as well.” We are bound by community. 

 

Just as the Great Depression changed our country – and the role of the federal government – the pandemic will do the same. The coronavirus has exposed how fragile our social compact is. Inequality is soaring to where it was in the 1920s. For decades, conservatives have disparaged and undermined the government, and we see this come home to roost in the pandemic, where the federal response has been a patchwork. We have no national plan to address the coronavirus, only denial, wishful thinking, and vigorous hand-washing.

 

How do we rebuild the social compact that joins all Americans together after this pandemic? How do we renew trust in government, which we sorely need? How do we reduce inequality and improve economic mobility for the many who are stuck?

 

In his 1651 masterpiece Leviathan, published at the end of the decade-long English Civil War, political philosopher Thomas Hobbes warned that without intermediating institutions such as a reliable government, humanity would have “no arts; no letters; no society; and which is worst of all, continual fear, and danger of violent death; and the life of man, solitary, poor, nasty, brutish, and short.” 

 

But what I have witnessed during this pandemic doesn’t resemble a Hobbesian outcome; rather, I have seen extraordinary resilience and solidarity among Americans. People are pulling together to help others. Strangers are speaking from a distance, asking one another how they are doing. Neighbors are running errands for one another, risking their health to protect those who are most at risk. It may not feel like much, but it makes a difference as we weather this storm together.

 

This pandemic will end, I promise. We will get through it, like hundreds of crises our nation has faced before. We have all had our lives disrupted, and we must remain resilient. Just don’t expect things to be the same afterward. A catharsis is coming. 

 

And please: vote this November like your country depends on it.

Why Study History? More Than Ever, a Question that Needs Answering

 

 

One of the few bright features of the past two months has been the extent to which the pandemic crisis has clarified the real public need for historical perspectives. Columns by historians have been amplified by a growing number of press releases incorporating historical findings (in Germany, historians are even serving on reentry committees). Among many topics, here are some of the questions which require historical answers: how serious is this crisis compared to others in the past; what features of past plagues are cropping up or might crop up today (including attacks on minorities and social violence); what kinds of structural change, if any, do epidemics generate (alas, the most recent overall analogue, the Spanish flu, changed almost nothing in the United States despite impressive outpourings of altruism during the crisis itself); and, of course, what seems to be novel in our present situation? 

Here is ample evidence of why history is essential – which all of us historians have known all along. And coincidentally, it provides a framework in which a new effort to explain history’s blessings to interested students may be particularly timely (among other things, thanks to flexible publishers, the new book I’ve co-authored on history even includes a brief comment on the implications of the pandemic).

History departments, and historians within them, have been spending a lot of time in recent years trying to lay out the reasons that interested students should go on and major in history. Many of us, raised in earlier times when the decision seemed far less problematic, have had to put a lot more effort into this than we ever anticipated.  When my British colleague Marcus Collins and I were invited to write up the current  rationale, it seemed an ideal chance to put  the quite varied, though complementary, arguments into a form that, among other things, can be offered directly to those most concerned: students, their sometimes more difficult parents, advisors and teachers. 

The resulting short book, obscurely titled Why Study History?, has just been issued by the London Publishing Partnership and may be acquired through bookstores, Amazon, etc. Because so many of us are concerned with protecting and advancing the discipline, it seems worthwhile to call this publication to wider attention -- for it is really intended to help the ongoing effort.

We try to do several things in the book. First, we very consciously combine the arguments about the skills history imparts with the really encouraging evidence about career outcomes. On the skills side we emphasize both capacities that history promotes such as writing and critical thinking, AND more distinctively historical skills such as evaluation of the phenomenon of change. The book has given us the chance to advance our thinking on these points, including some new praise for the sheer experience in dealing with ambiguity.

 On the more narrowly pragmatic side, we note of course that history graduates get a wide range of jobs (teaching recruits only about 13% on both sides of the Atlantic), which superficially complicates the picture but actually adds spice. The discussion of skills is intended not only to be persuasive but to help students themselves, when they enter the job market, explain their strengths, an area where many remain somewhat deficient despite our best intentions.  We are careful to note problems: this is not meant to be a sales blitz, particularly for a discipline that contributes so much to the evaluation of fake news. Thus: history graduates end up making as much money as business majors, but less than engineers. While history really trains well in most of the skills employers seek, it lags in advancing group work. But we also take the occasion to highlight the massive expansion of the topics and methods that history programs deal with (including digital and environmental history), something that high school and even college students may not realize.

The book also deals with history’s contributions to citizenship and public understanding, which must not be forgotten amid utilitarianism. And, as Mary Lindemann kindly notes in her contribution to the cover blurb, it also “successfully conveys the joy” of doing history, something we also hoped would shine through. 

Anyway, the book is available, and we truly hope it will prove useful. And, back to the introductory point, a final suggestion: maybe we should be collecting the evidence on the various uses of and appeals for history in the current crisis, as another way of invigorating the argument for relevance that our new book seeks to enhance.  

 

Editor's note: this essay was accepted for publication prior to announcements that at least one public university is closing its history department due to financial distress; HNN is more enthusiastic than ever to promote arguments for the necessity of history in our universities and our society.

"Fiction Makes a Better Job of The Truth"? Telling the Erased Story of Lucia Joyce

 

 

The Joyce Girl tells the hidden story of Lucia Joyce, a dancer who lived in 1920s Paris, and happened to be the daughter of novelist James Joyce.

It was the day after Christmas when I sent the manuscript of my novel, The Joyce Girl, to my editor. I’d spent several weeks drafting and re-drafting, while my family cooked, shopped, wrapped and ate with festive abandon. I didn’t get the break I’d hoped for: my editor asked me to write a new scene.  A scene he felt was vital, but a scene that I had no circumstantial evidence for. 

‘I can’t do that,’ I said, ‘because I don’t know that it happened.’ After three years scratching around for every bit of factual accuracy I could find, having to invent an entire scene made me feel deeply uncomfortable. Over the previous years I had traced the 120 homes in which my protagonist lived, tracked down the dance schools she attended and the theatres she performed at, read every surviving letter and diary entry that referred to her. I’d sat in archives, libraries and every park I thought she might have walked through - from Italy to Switzerland to Paris. I’d even spent months learning to dance in the modern freeform style my protagonist pursued, not to mention endlessly listening to the music she danced to. I’d also read every biography and letter of every person alleged to have spent time with her, from Alexander Calder to Sylvia Beach. Most challenging of all, I’d pored over two books that routinely top lists of ‘the most challenging books ever written’: Ulysses and Finnegans Wake, by the legendary Irish author James Joyce.

It hadn’t been easy because my protagonist, Lucia Joyce (James Joyce’s daughter), had been intentionally erased from history. Most of her letters (to her, from her, about her) had been destroyed.  Despite her spending months in analysis with Carl Jung, there were no surviving notes. Despite seeing more than two dozen doctors during her lifetime, not a single medical record remained. Her diaries, poems and the draft of a novel she had supposedly written had also been lost or destroyed.  What remained of her life was little more than a few bald facts strung together and viewed through the lens of other people. She was the subject of a meticulously researched biography, but one with a curious void at its heart. Hardly surprising with so little source material.

I’d realised early on that only a novel was going to give me the emotional truth of Lucia.  Only fiction could provide the emotional access to the past I was looking for.  To experience her life, both in the intense, claustrophobic Joyce household and in colourful, creative jazz-age Paris, required imagination, not another biography or history book.  

And yet everything I’d written was justified by some scrap of empirical evidence – a newspaper article, words from the report of a bystander, lines from Joyce’s own works. These, however flimsy, had been my anchors, my beacons of veracity.  

My editor pointed out that there was no indication that the innocuous scene he wanted me to write – in which an admirer of Lucia watches her in a dance class - hadn’t taken place. He urged me to have a go, persuading me that it would enhance the narrative flow.  Eventually I agreed.  But first I had to banish the academic who squats perpetually on my shoulder like a petulant parrot, demanding a footnote (by way of explanation my father is a university professor).  Once banished, I felt a huge liberation. As if my imagination had been freed from its self-imposed cage. But although I wrote unconstrained by facts, I preserved the fastidious accuracy and exactitude that brings historical fiction to sparkling life – details of clothing, interiors, landscapes, the smells and sounds of 1920s Paris, the traits of my real-life characters. 

I began to see that an evocative sense of place and time is often more important than precision. And I realised that the shedding of accumulated knowledge takes imaginative courage. Kill your darlings, indeed.

The experience was so revelatory that I returned to my manuscript determined to rewrite it yet again. Freeing myself from the noose of veracity enabled me to slip beneath the skin of Lucia Joyce in a very different way.  I didn’t veer from a single fact.  But instead of evading the gaps, I embraced them, realising that these gaping speculative holes, while anathema to the scholar, are the shadowlands where historical fiction lives and breathes.  They are the spaces closed to biographers and historians – the omissions, repressions, evasions of history, the tantalising voids that paradoxically illuminate what Virginia Woolf called ‘the thoughts and emotions which meander darkly and obscurely through the hidden channels of the soul.’  

Besides, my research had revealed just how slippery ‘facts’ can be.  Because Lucia’s own voice had been effectively smothered, most ‘facts’ came from those later responsible for incarcerating her in a series of mental asylums and hospitals.  Few sources are genuinely independent, memory is notoriously fickle, and all facts are open to interpretation. My fellow writer, Louisa Treger, describes historical fiction as a lie through which the truth can emerge.

Writers exploring lost voices, or restoring marginalised voices, have to be capable of shedding the weight of historical precision.  Fiction provides, uniquely, the chance to view major events and significant personalities through the eyes of a narrator displaced from history. It gives a voice to the dispossessed, the disempowered and the suppressed: women, the poor and illiterate, people of colour.  When centre-stage is occupied by someone with a different viewpoint, the reader experiences history differently, perhaps in a more complex and nuanced way.

Personally, I don’t like huge departures from the truth.  If we’ve learnt anything from the doyenne of historical fiction, Hilary Mantel, it’s this: interpret and imagine thoughts and emotions as much as you like, but make sure your research is rigorous and thorough.  

For me, this meant writing tens of versions of The Joyce Girl in order to find that delicate balance between historical accuracy and imaginative flair. It also meant a lengthy Historical Note and a full bibliography where I listed all my sources. And while Mantel may dislike the ‘cultural cringe’ of an author’s Historical Note, I always want to know where I can find more.

When I started my second novel, Frieda: The Original Lady Chatterley, my liberation was complete. I had learnt to lean on the empirical evidence in order to explore experience, rather than letting the evidence confine me. I had learnt to construct an historically detailed and precise scaffolding on which to string imagined emotional truth – without misrepresenting the facts.  

And as Doris Lessing famously said, ‘There’s no doubt fiction makes a better job of the truth.’

COVID and the Deadly Logic of Efficiency in Meatpacking and Elder Care

 

 

It is estimated that 80 percent of COVID-19-related deaths in Minnesota have been linked to long-term care facilities for the aged, and in six states that are reporting data, half of all deaths have been in similar institutions. Meanwhile, 6,500 meatpacking and food processing workers have caught COVID-19.  Twenty died from it. Three USDA meat inspectors have also died of the COVID-19 virus.

These stories are not unrelated. Over the last 40 years, the US has begun producing livestock and poultry in large confinement operations and then slaughtering the animals at lightning speed. At the same time, we have taken to warehousing our elders, who too often are confined in for-profit nursing homes.

And who among our elders wind up in nursing homes? They are the poorest among us, who have neither the wealth to spur economic growth with their purchases nor the capacity to work for wages. Like the packinghouse workers who face the virus on the front lines, our elders in nursing homes are being revealed as disposable people.

Who takes care of our elders? They are low wage workers, not unlike those who cut up our meat. When a COVID-19 outbreak hit the JBS slaughterhouse in Grand Island, Nebraska, it spread into the community, killing more than 26 people, most of whom were in the city’s nursing homes.  The virus spreads easily in packing houses and nursing homes, and where there is an outbreak in one kind of facility there are often outbreaks in the other.

How did we get here? The massive depopulation of rural America combined with the cult of efficiency that guides our every action as a nation. 

For most of history, aging was not a “problem.” For one thing, fewer people reached old age, and those who did lived with their families.  They cared for children and the sick, gardened, cooked, cleaned, told stories, and mediated disputes. As people moved to cities, space became expensive. Every square foot cost money, and too often children could not afford to take in and care for their parents. 

This movement away from farms and small towns toward cities set in motion a series of events in rural America that is still unfolding. Farms grew larger, rural towns and cities grew smaller, commodity, livestock, and poultry production and processing concentrated, and all of this was driven by the cult of efficiency.  

Consider what many of us have learned about how pigs are raised and turned into food. Most pigs are raised in large, complex confinement operations and reach a market weight (around 282 pounds) in 22-26 weeks. Slaughterhouses are organized to receive shipments of tens of thousands of nearly identical pigs a day. When this ultra-efficient supply chain is interrupted, either at the production or processing end, chaos results. 

As we transformed ourselves into an urban nation, we began to confine our impoverished elders, our livestock, and our poultry in strikingly similar ways. In both settings, the low-wage workers who care for the old and cut up animals and birds are often poor people of color and immigrants.

And now, in the face of this horrible pandemic, we have classified these poor workers, the ones who care for our elders and prepare our meat, as essential. Though rules differ from state to state, if you are an essential worker who refuses to go to work for fear of catching COVID-19, you can lose your right to unemployment insurance, and so, under some circumstances, you must choose between risking your life at work and going without a paycheck.

Often, essential workers are poor, living paycheck to paycheck. Many do not have paid sick days, and some who do are afraid to take them for fear of managerial retaliation. This twisted combination of low pay without sick days causes workers to show up to work no matter their condition, worsening the public health crisis. 

It is time to slow down and ask ourselves why we have turned our nursing homes and slaughterhouses into death traps for our elders, those who care for our elders, and those who prepare our food. Nobody wants sick people processing our food and caring for our elders.  It’s in everyone’s self-interest to advance policies that change the way we do both.  There are alternatives.  It’s time to pursue them.

We should guarantee universal paid sick days, affordable health care, and a living wage for every worker in the United States. These commonsense policies will ensure that people infected with a contagion stay home when sick and avoid infecting others. These policies will make everyone safer, protect our elders and our food supply, and allow the economy to reopen sooner. Public health is good for business and for workers.

What kind of nation do we want to be? Hubert Humphrey once said, “… that the moral test of government is how that government treats those who are in the dawn of life, the children; those who are in the twilight of life, the elderly; and those who are in the shadows of life, the sick, the needy and the handicapped.” 

Are we meeting that test? 

 

Trump and the Puritans

 

 

 

Whatever one feels about it, the ‘Trump phenomenon’ is often described as the US version of a populist trend that has impacted on many areas of contemporary global politics.  However, despite the global political similarities, Donald Trump’s success is also rooted in a peculiarly American experience, since a very large and influential part of his support base lies among Christians of the so-called ‘evangelical right’. 

     

The presidential inauguration, in 2017, featured six religious leaders, more than any other inauguration in history.  Since then many evangelical leaders have (controversially) claimed that God has placed Trump in the White House, despite his character flaws, because he is the man who will get God’s work done at this – in their  view – critical point in US and world history. As a result, the influence of evangelical Christians on American politics has never been more pronounced. From the appointment of Supreme Court judges to US relations with Israel, from support for ‘The Wall’ to abortion legislation, the power of this extraordinary lobby is seen in the changing politics and policies of the nation. A veritable culture war appears to be occurring over the future direction of the USA; a battle for the ‘soul of America’.

     

In this, religious faith has an impact which is unique to the USA among twenty-first-century ‘Western’ states; and it stands comparison with the impact of Islam in other countries. There is clearly something distinctive about US culture and politics that sets it apart from comparably developed democratic societies and states. In 2014 an extensive programme of research revealed that 70.6 per cent of Americans identified as Christians of some form. And, of the total adult US population, 25.4 per cent identified as ‘evangelicals’. About eighty-one per cent of them voted for Trump in 2016. This evangelical Christian base remains solid, even after three turbulent and controversial years. Recent evidence reveals that between May 2019 and March 2020, the share of weekly-church-attending white Protestants who believe that President Trump is anointed by God increased from 29.6 per cent to 49.5 per cent. And, while some are beginning to wonder whether Trump, the ‘chaos candidate’ who thrives in situations of uncertainty and turbulence, may have finally met his match in the ‘chaos event’ of the Covid-19 pandemic, there is evidence that his support remains solid among his base, many of whom are evangelical Christians. 

 

This raises the question of why the USA is so different from otherwise comparable modern states. The evidence indicates that the answer ultimately lies in the seventeenth-century Puritan experience. This is not to claim one causal factor, nor is it to trace a simple trajectory from ‘then’ to ‘now’. Rather, it is to recognise that this – for all the undoubted complexity since – set America on a distinct course which is still being seen in key aspects of modern culture, identity and politics. It is a major factor which helps explain why the USA is as it is.

    

Remarkably, 2020 is not only the year of the next US presidential election, it is also the 400th anniversary of the arrival of the Mayflower ‘Pilgrims’ in North America and the beginnings of Puritan New England. And both the election and the anniversary occur in the same month: November. There is, undeniably, an historic link between the origins of Puritan settlement of North America (particularly that in Massachusetts Bay from 1630 onwards) and the remarkable events which have shaken the nation since 2016.  What is going on in the modern USA has very deep roots. They are roots that stretch back into the almost mythological origins of the nation in the seventeenth century. Modern events are part of a very deep story, and this ‘Puritan connection’ helps explain a unique feature of modern US society: the influence of conservative evangelicals.

     

Co-writing Trump and the Puritans emphasised the extraordinary differences in the Puritan legacy either side of the Atlantic. In the UK, the restoration of the British monarchy in 1660 cast Puritans into the political wilderness. It was not until the nineteenth century that their descendants (termed ‘dissenters’ and ‘non-conformists’) were finally fully readmitted into the UK’s political system.  In contrast, while the British crown reined in the theocrats of New England in the 1680s and 1690s, Puritans and their descendants remained political players, ‘insiders’. Consequently, their legacy (even when diluted and adapted) continued to be potent as it connected later American colonists with their foundational roots – even when it was highly mythologised. The Mayflower Compact and the later covenanting agreements of the Puritan congregations of Massachusetts Bay contributed a profound sense of ‘calling’, a commitment to active formation of communities, and moral self-confidence to a venture which saw itself as an exceptional people, an ‘American Israel’. This was later adapted, manipulated and developed by those who came after. These concepts fed into the Declaration of Independence and the US Constitution, along with Enlightenment ideas. This was accompanied by a providential outlook which from early on considered the colonists as having been granted title to the land. This ‘American Israel’ was culturally and psychologically armed to dominate the original inhabitants of this ‘American Canaan’. These ideas of American exceptionalism and divine calling became secularised in nineteenth-century claims regarding Manifest Destiny and the subjugation of native peoples. 

    

Despite huge changes since the seventeenth century, this has had a profound input into the US cultural DNA and still informs much of the outlook of many people concerning what it means to be an American. This may be explicitly referenced in events such as Thanksgiving or implicitly found in a confident sense of American exceptionalism and being ‘a city on a hill’ to inspire others. 

     

While Trump is no Puritan, his appeal to evangelicals has seventeenth-century roots. It draws on a deep story of American exceptionalism, providential calling, elimination of the ‘alien other’, distrust of foreign states, the jeremiad against current sinfulness, and an apocalyptic confidence of being the fulfilment of history and gate-keeper to the fulfilment of biblical prophecy. We might call these ‘foundational phenomena’ and they reverberate with Puritan resonances. ‘America First’ and ‘Make America Great Again’ may be twenty-first-century slogans but they appeal to something very deep within the US psyche and cultural mythology. Furthermore, ‘Make America Great Again’ looks backwards as well as forwards.

     

In addition, among many modern evangelicals there has occurred a revitalising of the theocratic and providential aspects inherent in many of the original concepts. Consequently, though a secular state, the USA contains millions of voters who harbour theocratic ambitions that would have been readily understood in seventeenth-century New England. Even Trump’s behaviour during the Covid-19 pandemic can still resonate positively with some core voters who are inclined to pit faith against science, and who also have a deep distrust of federal influence.

     

As a result, the current manifestations of these ‘foundational phenomena’ can be traced back through a very American style of imperialism and nationalism, to Manifest Destiny and frontier expansion, which has its ultimate roots in the seventeenth-century drive to create a New Jerusalem in the New World.  Much cultural water has passed under the bridge since the seventeenth century, and many tributaries have flowed into and diluted that primordial Puritan stream. Despite this, that original current has, arguably, played a greater role in forming a sense of American exceptionalism – both in terms of political culture and religious identity – than any other single causal factor. From the 1620 Mayflower landing and the 1630 Winthrop Fleet to the 2016 election of Donald Trump, from the 1692 Salem Witch Hunts to the divided USA of today, the Puritans have left a unique and complex legacy that helps explain the role of the evangelical right in modern US politics and, ultimately, the phenomenon that we know as the Trump presidency. That Puritan-derived influence and energy is far from exhausted. Whether consciously recognised and referenced, or not, seventeenth-century Puritan history has not finished with the USA. 

COVID-19 is Devastating Native American Communities. History Illuminates Why.

 

For centuries, disparities have made disease outbreaks particularly devastating in Native American and other indigenous communities. The COVID-19 pandemic is no different.

Global, national, and even local crises reveal the fault lines in our social and political systems, and historically disadvantaged people inevitably bear the brunt of these crises. We are seeing this again with COVID-19: the homeless, the imprisoned, the elderly and disabled are especially vulnerable, as are black, Latinx, and Native American populations in the United States, Canada, and Latin America. 

Preliminary studies show that historical racial inequalities have made people of color more susceptible than whites to complications from COVID-19. News reports from the Navajo Nation document extremely high rates of COVID-19. This is made even more life-threatening by the inadequate funding afforded the Indian Health Service and the impoverished situation of many tribal members. 

Meanwhile, in news stories from the Brazilian Amazon we hear of loggers and miners finding opportunity in the political chaos of COVID-19 to ramp up their activities, killing indigenous people who get in their way. 

In 1492 and for centuries afterwards, Europeans, Africans, Asians, and others introduced waves of novel diseases into the Americas, resulting in devastating serial epidemics. Smallpox, measles, cholera and other foreign diseases did not kill indigenous people because of weak bodies and immune systems, as some have alleged. Rather, the losses of population due to disease were compounded by increased warfare, the losses from commercial slaving and other forced labor systems imposed on Native people, and other consequences of colonialism. All told, Native American population losses topped 90% between 1500 and 1800. 

Native Americans continue to die in disproportionate numbers today because colonization also imposed specific structures of inequality designed to maintain forced labor regimes, to remove indigenous people from their national lands, and to undermine their livelihoods. The efforts of colonizers to eliminate indigenous languages, sovereignty, and traditions have also caused further destruction to indigenous senses of community, kinship, spirituality, and health. 

Centuries of unequal access to resources and decision-making served to weaken indigenous health across the Americas and beyond. In the United States alone, Native communities face not only unequal access to health care but also increased rates of diabetes, heart disease, and respiratory ailments, all comorbidity factors for COVID-19 mortality. 

Increased government and public action is needed throughout the Americas not only to address the short-term issues related to COVID-19 but also the deeper structural issues we have raised. We call on all people to listen to the voices emanating from indigenous communities of the Americas and elsewhere and to support their demands. The Navajo Nation, for example, has called for the federal government’s COVID-19 relief package to be justly distributed to Native American tribal health-care systems. Only by alleviating the structural damage done by 500 years of colonial rule can we fully combat the COVID-19 pandemic in indigenous communities and help create a more equitable future for all. 

For more information or to find ways you can help, go to ethnohistory.org.

Things Have Been Worse

Could that be comforting? I don’t think it should be, but that comparison can at least put our current distress in perspective.

 

I was reminded how bad it can become in America by the movie “Trumbo” on Netflix. Netflix provides a service we’re happy to pay for. We can watch a film without the constant interruptions of cable channels and can stop it to go make popcorn. “Trumbo” is the story of Dalton Trumbo, a writer of books and screenplays, who was also a member of the Communist Party in the 1940s. In a series of infamous hearings before the House Un-American Activities Committee, Trumbo and 9 other writers and directors refused to answer questions like, “Are you now, or have you ever been, a member of the Communist Party?” Citing their First Amendment right to free speech didn’t work. They were charged with contempt of Congress for refusing to name other members of the CPUSA. Trumbo went to jail for a year, and the Hollywood Ten were blacklisted from work in the film industry for a decade. Walt Disney used charges of communism to attack cartoonists who were trying to unionize. Eventually the blacklist extended to hundreds of people who had been named as communist sympathizers by conservative publications.

 

The wider witch hunt for dangerous social reformers that is identified with Sen. Joseph McCarthy sought to prevent any questioning of the American political status quo. Thousands of people who had voiced political ideas anywhere to the left of center lost their jobs, homes, and social positions. Wikipedia summarizes the injustices of the McCarthy era: “Most of these punishments came about through trial verdicts that were later overturned, laws that were later declared unconstitutional, dismissals for reasons later declared illegal or actionable, or extra-legal procedures, such as informal blacklists, that would come into general disrepute.”

 

Far-right groups prospered in the 1950s by attacking anything that involved public collective action as “communist”. The Keep America Committee published a flyer denouncing fluoridation, the polio vaccine, and mental health programs as steps toward a “communistic world government”. The Motion Picture Alliance for the Preservation of American Ideals, the self-appointed loyalty watchdog co-founded by Disney, announced ideological prohibitions of certain “subversive” ideas: “Don't smear the free-enterprise system ... Don't smear industrialists ... Don't smear wealth ... Don't smear the profit motive ... Don't deify the 'common man' ... Don't glorify the collective.” The FBI used the Red scare to target union activists and civil rights leaders, delaying the end of segregation by a decade or more. The House Un-American Activities Committee functioned under both Republican and Democratic leadership.

 

Government and private propaganda about the communist “menace” convinced half of Americans to support McCarthy in a Gallup poll in 1950. Good, loyal Americans who kept their mouths shut except to praise capitalism had little to worry about.

 

The Hollywood blacklist was broken by courageous people in 1960, who defied the witch-hunters. Otto Preminger asked Trumbo to write the screenplay for “Exodus” and promised he would credit him openly. Kirk Douglas, in the midst of starring in and producing “Spartacus”, announced that Trumbo had written the script. The American Legion boycotted the film, but President John F. Kennedy went anyway. After that the blacklist crumbled, but the effects of years of hiding and public vilification had long-lasting repercussions for many people.

 

Things got better in my lifetime when the blacklist and McCarthyism were defeated in the 1950s and 1960s, when the work of white supremacists began to be dismantled in the 1960s, and when the work of male supremacists started crumbling in the 1970s. Our situation now will keep getting worse, unless people like us make it stop.

 

Perhaps it doesn’t make much sense to compare how bad these different times were, because in each case, they were very bad for different groups of people. The Red-hunting era was bad for people who had committed themselves to reforming American society. More like our current crisis, the HIV/AIDS plague killed over 2 million people a year between 1982 and 2002, a far higher worldwide mortality than the coronavirus at its peak, “wreaking the most havoc among the world's poorest and most underprivileged communities.” In the US, AIDS briefly became the greatest cause of death in men and women 25–44 years old from 1993 to 1995, before anti-retroviral therapy significantly reduced mortality. Most in danger were particular groups of people: men who have sex with men, drug users, and blood recipients.

 

I am certainly comparing very different social problems: the sudden emergence of a pandemic is not like the long history of racial or gender injustice. But it still seems useful to me to recognize that American life has been much worse in the recent past for large groups of people. In fact, the same social groups keep reappearing as the most affected: the poor and minorities, the main sufferers from COVID-19. I think the high incidence of death among health care workers and nursing home residents is a new feature of this human disaster. Familiar is the much lower disruption to the lives of the well-off. 

 

Each of these human disasters preys on or exacerbates existing social inequalities. Each probes our empathy for others, our willingness to challenge the social conventions that prop up inequality. Each requires government action or government reforms to end the crisis. Each tests American leaders, some of whom fail miserably or succeed gloriously.

 

Most of us will survive until the next crisis.

Stop the Machine: Why We Should Resist Online Learning

Photo: Damian Gadal, 2010 (CC BY 2.0)

 

 

 

Now that the Covid epidemic has forced universities to go online, some voices want to make this a permanent state of affairs. In the Wall Street Journal, Eric Mazur (a Harvard physics professor), Bob Kerrey (formerly a US Senator and president of the New School) and Ben Nelson (founder and CEO of the for-profit Minerva Project, which purports to offer online instruction superior to an Ivy League education) argue that “Higher-education leaders should seize this period of upheaval as an opportunity to focus on learning.” Evidently these three gentlemen are under the impression that college teachers and administrators never thought to focus on learning. They urge us to “shift to student-centered instruction,” when in fact they recommend tech-centered instruction. And they say that we must “refocus on student outcomes, on the universal skills that will enable graduates to respond to the next crisis, to create resilience and adapt to unfamiliar territory, and to help lead society forward.” Reading these vacuous platitudes, one has to wonder whether the Minerva Project teaches its students how to express themselves meaningfully on paper – or on a screen for that matter.

 

One problem with the Mazur/Kerrey/Nelson logic (if it can be called that) is that high school seniors aren’t buying it. Charlie Baker of Millburn, New Jersey, who has been admitted to Vanderbilt University, says he will probably take a year off if the college remains online in the fall. “I was excited to live somewhere else and be more independent and get to know people on my floor and in my classes. That’s the college experience I’ve been looking forward to,” he says. “I need to be there in person. It allows me to understand the material much better. [Online teaching] wouldn’t be a waste of money, but it definitely does make college less valuable.” His classmate Jamie Serruto will attend online classes at Fordham University, but with great reluctance: “It really, really stinks, but we are all in the same boat.” Cailla Cruz of Westchester, New York, is definitely unhappy with the online classes she had to take this spring: “It’s a difficult thing to do, taking math class in bed. The work gets done, but it’s not the same.”

 

Having just finished teaching such a course, I have to agree emphatically with Ms. Cruz. Online courses are a poor second best. When the instructor is just a face on a screen, he cannot provoke, dramatize, wade into the classroom, and get into students’ faces. Discussion dries up. Presenting visuals and videos is possible but awkward.  It’s just not the same.

 

Consider, as an illustration, my department’s capstone seminar, which is required of all majors. The students are expected to do exactly what professional historians do: write an original research paper based on primary sources and a review of the secondary literature. They must start with a research prospectus and then produce four successive drafts. These are discussed in class, and students are encouraged to offer constructive criticisms of each other’s projects. The students must also present their penultimate drafts at the “History Happening,” a mini-conference, open to the public, where faculty serve as commentators. Whether they major in history, theater, art, or music, students are motivated to do their best when they know they must perform in public. But this time around we had to seriously truncate the course and cancel the History Happening. This seminar teaches the research, writing, and public speaking skills that students will need no matter what future technology may bring, but it requires a lot of group interaction and individual attention, which are very difficult online. And how could remote education possibly work for the theatre arts and the laboratory sciences?

 

Consider also all the education that takes place outside of the classroom. Students come to college to stage plays, join musical groups, display their artwork, organize political protests, play on (or cheer on) athletic teams, listen to controversial speakers (assuming they haven’t been disinvited), and engage in interminable bull sessions debating everything under the sun. At Cambridge University, the “Cambridge Apostles” carried on what has essentially been a two-hundred-year-long bull session, and its alumni included major members of the Bloomsbury Group: John Maynard Keynes, Lytton Strachey, and Leonard Woolf. That extracurricular ferment doesn’t only happen in elite universities. At City College in the Great Depression, students argued furiously over Stalin and Trotsky in the cafeteria, and several of them went on to become eminent New York intellectuals: Irving Howe, Irving Kristol, Daniel Bell, Nathan Glazer.

 

That subversive cross-fertilization simply can’t be Zoomed. In his 1909 dystopian tale “The Machine Stops,” E. M. Forster (a Cambridge Apostle) predicted with astonishing accuracy the locked-down tech world we live in today. Everyone shelters in his or her little cubicle, communicates via screens, has stuff delivered in, and almost never ventures outside. In effect, Amazon and Microsoft have taken over everything, and because there is no real human interaction or intellectual exchange, the whole system eventually collapses.

 

In a survey by the nonprofit mental health organization Active Minds, 80% of college students reported that the Covid crisis has negatively affected their psychological health. Obviously they are worried about personal finances and the virus itself, but they are also stressed by online instruction, and they sorely miss the supportive community of a college campus. “People really crave that person-to-person contact, maybe more than they realized”, says Jaclyn Friedman-Lombardo, director of counseling services at Montclair State University. “This generation is very, very accustomed to screens. But they don’t feel quite as engaged online.”

 

College is where students make lifelong friends and often meet lifetime spouses. They invite themselves over to cafeteria tables and chat up potential romantic partners. Colleges of course also offer sex education, most of which takes place outside of the classroom. Needless to say, there are plenty of online instructional materials on this topic – but it’s just not the same.

 

How many healthy, curious 18-year-olds want to live with their parents and work all day at a computer? A recent survey by the educational research firm SimpsonScarborough finds that, if colleges only offer online instruction this fall, fully 20 percent of high school seniors will not enroll, and 14 percent of current college students will not reenroll.  Add to that the economic fallout of the Covid Depression: the out-of-work parents who can no longer afford higher education for their children, the severe budget cuts that cash-strapped state legislatures will inevitably impose on public colleges.

 

 Given all that, SimpsonScarborough is now predicting a “domestic undergraduate enrollment decline…that would be truly catastrophic for our industry.” Colleges should therefore make every effort (including lobbying state governors) to reopen in the fall with a full range of on-campus courses and extracurricular activities. It never made much sense to send all our students home, given that under-25s account for just over one-tenth of one percent of all Covid deaths. Of course we should observe reasonable social distancing, and those faculty who are older and medically vulnerable should be granted the option of taking a leave of absence or teaching their courses online. But I for one (at age 67) would choose to be with my students. After all, I would run fewer risks than first responders and health care workers. Getting out of the house and back to the classroom would probably be good for my mental and physical well-being. And in any case, I believe that education is an essential business.

Little Richard: Bold, Pioneering, Complex, and Unapologetic

 

Like millions of people all over the world, I was saddened to hear about the death of Little Richard earlier this month. The 87-year-old singer succumbed to cancer. To say that Little Richard was a pioneer is obvious as well as an understatement. He was an innovator and an artist who personified the word rebel. He integrated thunderous gospel vocals with R&B and boogie-woogie in foot-tapping, energetic songs such as “Long Tall Sally,” “Tutti Frutti,” and “Rip It Up.” He unabashedly screamed, shouted, kicked, pounded on his piano, and belted out vocals, all of which cultivated the brash style that gave rock & roll one of its most distinctive sounds.

But more than that, Little Richard brought an uncompromised strand of black Southern music, the sound of the Chitlin’ Circuit itself, into the lives of teenagers around the globe, an irrevocable development whose impact cannot be overstated. He was a performer who garnered appeal across racial boundaries at a time when the nation was highly segregated, and he helped pave the way for future Black artists such as James Brown, Parliament Funkadelic, Sly and the Family Stone, and others. 

The third of twelve children, born on December 5, 1932 in Macon, Georgia, Richard Wayne Penniman was a child of the Great Depression. He grew up in a segregated, Jim Crow environment where racism, poverty, stark health disparities and other social maladies were rampant. Despite such emotionally and psychologically demoralizing circumstances, the young and talented Richard found an escape in the world of music. It was here that the future rock and roll icon and hall of fame artist would hone his craft, feverishly developing his piano and vocal skills as he anticipated his eventual career break, which did in fact come to fruition.

While race was indeed a defining factor in his identity, another element shaped his career in complex ways: sexuality. Throughout his childhood, he was taunted as effeminate, and in adulthood he had run-ins with authorities while acting on same-sex desires. He occasionally condemned homosexuality as sinful; at other times, he described himself as gay or omnisexual. Many of his songs focused on pursuing intensely sexual women while leaving open a queer subtext. Through it all, his radically flamboyant public persona upended gender boundaries, and much of his appeal lay in the fact that he never seemed all that unnerved by his obvious contradictions. During the early 1970s, an interviewer asked him why he was wearing makeup. “You’re supposed to wear makeup,” Richard replied. “Just like when you toast your bread or put sugar in your coffee, you’re supposed to add a little touch to it.”

Long before it was hip to do so, Little Richard daringly and unapologetically pushed the boundaries of sexual fluidity. He wore garish clothes and provocative adornments — eyeliner, nail polish, mascara, and tight pants — as he brazenly twisted, hopped, hollered, tapped, snapped, and gyrated in front of television cameras and live audiences. His language, appearance, and disposition defied standards and norms. Indeed, as time progressed, his increasingly eccentric antics had many people intensely debating his sexuality, religious beliefs, and other facets of his personal life. Intensely religious yet sexually adventurous, he was the personification of contradictions, moral sacredness, and ambiguity. He could never be easily defined nor neatly tucked into any specific category.

It should not be lost on anyone that displaying such an unrestrained degree of sexually charged behavior was an audacious stance to take, particularly for Black men. He defiantly embraced these antics, wearing wigs, lipstick, eyeliner, and colorful, sexually fluid clothes and adopting high-pitched, effeminate voice patterns, all of which were largely frowned upon and deeply at odds with the mores, customs, and larger culture of the era. This was a risk that very few performers, regardless of race, were willing to take.

Entertainers, even notably successful ones, experience career highs and lows. Little Richard was no exception. The late 1960s and the 1970s were challenging for the rock pioneer. New musical trends driven by new artists made it difficult for him to carve out a niche as effectively as he had a decade earlier. Many viewed him as a relic whose time had passed. Various attempts at comebacks met with minimal success, and he eventually returned to the ministry.

What a difference a decade made. The 1980s placed Little Richard back into the spotlight with a number of profitable ventures for the then fifty-something entertainer. He appeared in the successful 1986 film “Down and Out in Beverly Hills,” penned an autobiography, The Life and Times of Little Richard, appeared on a number of television programs and music videos, and began touring again.

Historically speaking, it has not been unusual for performers to pattern themselves after forebears of previous decades. Oasis were a direct adaptation of The Beatles. The Black Crowes mimicked the Rolling Stones, as REM did The Byrds, Rod Stewart Bobby Womack, Patti LaBelle Stephanie Mills, Tracy Chapman Odetta, Michael Jackson James Brown, Anita Baker Nancy Wilson, and so on. This truth aside, the fact is that Little Richard had an enormous influence on a disproportionate number of artists across the racial and gender spectrum as well as various musical genres.

Overall, there were few Black artists of the 1950s who were as innovative, bold, experimental, and willing to take risks with music as Little Richard was. Moreover, he managed to create music that resonated with so many people across different backgrounds and political inclinations. Integrating pop, R&B, and big band sound was second nature to him. He, along with Buddy Holly, Jerry Lee Lewis, and Elvis Presley, rankled the psyche of more than a few conservative, suburban parents and simultaneously earned fierce popularity and devotion among many young people.

Later decades would witness Tiny Tim, David Bowie, Boy George of Culture Club, Annie Lennox of the Eurythmics, Marilyn Manson, Prince, Grace Jones, and others also embrace gender fluid behavior. In fact, it is safe to say that Prince was the Little Richard of his era and was undoubtedly influenced by him. Other artists such as Elton John, Jimi Hendrix, Otis Redding, The Rolling Stones and the Beatles were committed devotees of Little Richard as well. His talent and legacy were so appreciated that he was among the inaugural group of performers inducted into the Rock & Roll Hall of Fame along with James Brown, Sam Cooke, Ray Charles, and a few others.

Arguably, Little Richard’s influence has been felt outside of the musical arena in the acceptance of more gender-fluid identities, and specifically gender-fluid Black identities. Today, more than half a century later, growing numbers of men, especially millennials and post-millennials, embrace and accept gender fluid behavior that was often seen as unacceptable in previous generations.

Little Richard was bold, daring, visionary, and fearless in a way that few Black male artists — past or present — have dared to be. While his influence and presence may be more limited among younger generations, he was one of the most distinctive, definitive, and pioneering voices ever produced in the world of music. His music was larger than life, and indeed his life was anything but dull. May he rest in peace.

Fly me to the Moon: Covering the Early Space Program

 

Recently NASA has been talking about man returning to the moon in a renewed space program. The target date is 2024 for a walk on the moon and 2028 for a permanent colony. It is ambitious but not surprising because space has always been one of America's macho concepts. Called Artemis, the new program calls for a series of rocket tests by the end of 2021 followed by a crewed moon landing several years later. 

It is hard to deny the thrill of men and women flying in space. The idea of adventuring into space to further conquer the moon or the audacity of trying to step on Mars will be with us forever. Recently there was an obituary about a space pioneer, which was reverent, yet, in the context of the times, an afterthought. Al Worden, who had been the command module pilot on the Apollo 15 lunar landing in 1971, had died at 88. A member of the fifth astronaut class of 1966, he had retired from the space program in 1975 after 11 years. During his trip into space, only the fourth lunar landing, his fellow astronauts David Scott and James Irwin walked on the moon's surface, where they tested the first lunar rover. He circled the moon for three days, not exactly a minor task. Though Apollo 15 was his only space flight, he was the first astronaut to walk in deep space, outside his spacecraft. More than 200,000 miles from earth, Worden spent 38 minutes outside the command module he had been piloting. He did repairs on the "space ship" and retrieved film of the mission. It is a story of a pioneer in one paragraph, a deed that held the world's attention at a time when space flights were part of everyday life and of extensive news coverage. 

In the early 1960s I was part of the NBC News space unit with its headquarters in New York. My first assignment was to write fifteen- and thirty-second introductions to almost 250 prepared pieces that we would use during our broadcasts. It may have been the dullest assignment I ever had. But I did it, wrote the intros and then filed them away until we needed them.  

As a writer and producer I spent many weeks at Cape Canaveral covering the Gemini flights and, later, the space shuttles Challenger and Columbia. I had a lot of fun doing it despite the primitive conditions we had to endure. We worked in a sub-tropical climate where humidity ruled. Often the question of the day was which rocket would land in the nearby Banana River, where failed rockets went to die. The answer came when a flight failed, but fortunately no astronaut ever found his way into that sparse waterway. 

Having covered America's space program off and on over many years, with my favorite time being the 1960s, there is much I could tell you, though almost nothing about the science of space flight itself. The mechanics of rocketry have always eluded me. I cared about the adventure and the daring men, and eventually women, who put themselves at risk as pioneers going where no one had previously ventured. Covering space in the 1960s was freer and more open for journalists, with only the three major networks spending a great deal of money to show a live launch of astronauts before dawn and then, often that same night, to air a prime-time special program about that day's launch. 

In those early days at Cape Canaveral fandom had not yet exploded. It was not yet the modern age of celebrity in America, but there were hints of its coming on the horizon. In 1963, Cape Canaveral became Cape Kennedy in honor of the recently slain president. Then in 1973 it became Cape Canaveral again. Back then, astronauts were everywhere in public. Gene Cernan (who journeyed to the moon twice), Buzz Aldrin, Alan Bean, Michael Collins and other astronauts walked the streets of Cocoa Beach and Melbourne, Florida, known as the Space Coast, where they lived, worked, ate, and drank in restaurants without autograph hunters besieging them. People were more respectful than they have since become when dealing with celebrities. Keep in mind there was no Internet, no social media, Instagram, Facebook, Twitter or Google. For TV there were only the three networks, ABC, NBC and CBS. On flight days groupies, followers and gawkers came to watch liftoffs and landings and to frequent the bars and motels along the strip, where they enjoyed themselves between launches. Sometimes they slept in their cars or on the beach, waking in time for the morning's launch of a rocket or satellite. 

Most of the press usually stayed at the Starlight Motel or the Satellite Motel. We drank late night beers in their bars and lounge where you would often find an astronaut or two and the main TV anchors, even sometimes Walter Cronkite. We ate our meals in restaurants such as Ramon's, Alma's Italian, and the Moon Hut. The food was uneven at best, but meeting with and sharing time with NASA engineers and other technicians was worth every minute. In those days the press had open access to everyone who worked at NASA and who lived the experience to the fullest. They needed us as much as we needed them, allowing it to become a successful marriage. 

NBC News had three knowledgeable reporters covering space at that time. Frank McGee was the anchor of our coverage: calm, collected and very knowledgeable. Roy Neal and Jay Barbree were the two principal reporters who did admirable work. I and the other writers and producers left it to them to explain what was going on. When not at the Cape for a launch, McGee worked out of a huge mockup of a space capsule that took up a whole wing of a major studio in the NBC Building at 30 Rockefeller Plaza. Jim Kitchell was our executive producer and he knew space as well as anyone.  

In Florida we worked out of barely serviceable reconditioned aluminum trailers that had white asbestos siding and no air conditioning. We kept the small windows closed to keep out the bugs, sand flies and sand in general. Our working conditions were terrible but it hardly mattered considering the story we were covering. We used manual and electric typewriters. Computers and smart phones did not exist. We wrote our notes with ballpoint pens and Number 2 pencils and kept those notes on legal size lined yellow pads or in stenographer notebooks. When we typed our stories, we hoped the ribbons would not tear or snap because replacements were near impossible to find. There were no hardware stores near the space center to buy needed supplies.  

The men and the few women who ran things for NASA lived and worked under conditions similar to ours. The NASA launch operations center on Merritt Island, Launch Complex 39, was a sparsely furnished building with rows of equally Spartan computers and earnest engineers hard at work. They sat at rows of plain desks with primitive computers. As many calculations as they did by hand, they also used those basic computers to calculate the flight, manned or not, that they would direct on the day of a launch. The men wore either short-sleeve or long-sleeve white shirts with dark ties pulled as tight as possible against their close-shaven necks. Their upper left-hand pocket usually held a plastic pen and pencil holder filled with more pens and pencils than one could count. In this time before our ubiquitous pocket-computer smartphones, they did many difficult calculations with a slide rule kept in its own leather case, which, when not in use, was attached to the belt at the engineer's waist. A plastic holder with their nametag hung around their neck. The low-level buzz at mission control sounded as if it was coming from a moderately active beehive. It was as if the Stepford husbands were in charge. Their genius was that these engineers conducted their work under the most trying of conditions and succeeded marvelously. 

In our network production trailers we had black plastic dial phones that were off the scrap heap. Every ring sounded the same. We did not have the option of choosing our own ring tone. Our tape editing equipment, flown in from New York, surprised us when it worked despite the heat, sand, and bugs, our common enemies. Using razor blades, a magnifying glass, store-bought Scotch tape and, sometimes, only prayer, we edited pieces slowly, carefully, hoping the tape we joined the edits with would stick when we fed it into a network program for broadcast. Wastebaskets overflowed with discarded newspapers, empty paper coffee cups, and various iterations of scripts that never made it to air. People shared battered desks and broken chairs in seriously overcrowded conditions, but it did not matter. The story mattered more.

What also mattered were the good times that went with what we did for a living. Part of being at the Cape or any other venue as a journalist was what happened after we finished working for the day. Most rocket launches took place in the very early morning, usually in the dark before sunup. As a Today Show producer, I was there along with other early morning staff. Bleary eyed from lack of sleep because we played hard the night before -- too many drinks, too little decent food -- we always arrived on time for the launch. We prepared for our broadcast, sipped bad, mostly tasteless coffee, chewed on a hard roll or an uninteresting Danish and allowed the caffeine in the coffee to cascade through our system.

Set and ready to go, we waited patiently for the launch because that was all we could do. Dawn rose slowly. The bright orange sun dominated the sky. The astronauts walked slowly to the capsule that would carry them into outer space. With little fanfare, they entered their home away from home, and settled into their cramped quarters. The countdown began. Then, usually on a tight, computerized and targeted schedule the rocket blasted off with a thunderous sound and bright yellow-orange light. The earth beneath our feet shook and rumbled, as if an earthquake had struck the sands at Cape Canaveral, while the space ship made its way into the sky above. 

The big crowds of people, many of whom had slept overnight in campers, in the backs of their cars or in sleeping bags on the ground, and who numbered in the thousands outside the perimeter of the launch site, cheered, applauded and yelled encouragement. Soon the rocket was out of sight. An announcement told us the flight had been successful. Our work done for the day, it was time to head home. Our bags packed and loaded into our cars, we made our way from the NASA launch site and into heavy traffic crowded with the people who had just witnessed the launch. Many of us were on the same road leading to the highway and the Orlando airport across the state. There we would make our flights home, me to New York, others to the cities where they lived. 

For all the excitement of covering the many launches I witnessed at Cape Canaveral and, later, after it was renamed Cape Kennedy, one of my biggest thrills was more mundane. There were days when life was boring on the Cape, when we had little to do but wait for the perfect weather a launch required. Then we played touch football. Usually it was Nightly News versus the Today Show. The sand at the Cape was rocky, not soft and almost silky like the sand I grew up with in Brighton Beach and Coney Island. Walking on the ground was not easy and often the rocks and shale cut into the bottoms of our sneakers. But we played touch football with fury. Usually the Today Show lost because we had more women on the team who were not good at the game, especially roughhouse touch football. 

One day after a launch, we had played a few hours in the hot sun in our usual desultory fashion drinking water -- no beer permitted -- until we on the Today side of things had one series of plays remaining. I lined up in the backfield and sprinted for all I had into what was our end zone. I turned and saw the football floating to me and I thought it beyond where I could catch it. I looked over my shoulder again, and to my eternal surprise there was the ball. I reached out, grabbed the ball going away into the end zone for the winning touchdown. I bent over, breathing hard, panting and sweating. At first, only silence followed the catch. But, no. Soon came cheering. And through the cheers, a voice rang out above all the other voices: "I didn't know he had it in him." That was the perfect end to my day. For one brief moment in time, life did not get better than that.

With the world seeming to be in free fall, as we learn to live with a pandemic and a virus that will probably be with us for many years, will the public love space flight as it once did? Going to the moon or Mars is expensive for the taxpayer as well as for news organizations. Will people, many of whom will still be struggling to get through a normal day, attach themselves to space for its entertainment and daring as in the past? Will TV, which helped keep NASA's programs alive through extensive coverage, fund the showcasing of these new space programs into the wild blue yonder for its audiences? At least I knew there was a time when we all had it in us. For the future, only time will tell.

 

 

 

 

Presidential Rivalry and Bad Blood in American History (Part 2) Ronald L. Feinman is the author of Assassinations, Threats, and the American Presidency: From Andrew Jackson to Barack Obama (Rowman & Littlefield Publishers, 2015). A paperback edition is now available.

 

For part 1 of this blog series, read here. 

 

Harry Truman had two famous feuds with other presidents: his successor Dwight D. Eisenhower and Richard Nixon, who as a Congressman from California accused Truman of being “soft on Communism,” even before the rise of Senator Joseph McCarthy.

Truman and Ike originally got along well. Truman at one point made a stunning proposal to Ike that he run for the Democratic presidential nomination in 1948 with Truman, who had low public opinion ratings, serving as his vice president.  But Ike was then nonpolitical and uninterested, and turned Truman down.  Later, when the Korean War was raging, Ike agreed to run for the Republican nomination in 1952. Even though Truman quickly ended his reelection campaign, the two men were now at loggerheads, as Ike was critical of Truman’s Korean policies.  So the two men did not get along during the Eisenhower Administration, and only met and agreed to reconcile at the 1963 funeral of John F. Kennedy. They were never close again in the six years before Eisenhower passed away in 1969.

The Truman-Nixon hatred was clear in 1948, when Nixon rose to prominence as a member of the House Un-American Activities Committee investigating Alger Hiss. At that time, Truman crudely insulted Nixon, then a freshman Congressman from California, whom he did not personally know. And when Nixon was vice president under Eisenhower, Truman was a regular critic, most pointedly when Nixon ran for president in 1960 against John F. Kennedy. But when Nixon finally won the presidency in 1968, he decided to make the first move to heal the bad blood, arranging for the piano Truman had used in the White House (it was still there, as Nixon also played) to be permanently housed in the Truman Presidential Library and Museum in Independence, Missouri. The scene of Nixon meeting with Truman, then 85 years old, was a nationally televised event; Truman seemed not very thrilled but accepted the gift, and overcoming the rivalry made Nixon look more presidential. But after Nixon was forced out of office in 1974, two years after Truman’s passing, comedians joked that the dirt moved over Truman’s grave in Missouri, as if Truman in heaven were laughing that Nixon had received his “just deserts”!

Gerald Ford and Jimmy Carter competed against each other in 1976. Ford was very bitter about his defeat, and considered joining Ronald Reagan as his vice presidential running mate against Carter in 1980. Ford was critical of Carter’s record, but within a short time after Carter’s loss to Reagan, the two men and their wives became fast friends, visited each other and their presidential libraries, and held symposiums together. Some considered this rapprochement the most impressive since that of Adams and Jefferson in the early 19th century. The two men agreed between themselves that the survivor would give the eulogy at the other’s funeral, and Carter did precisely that in December 2006.

Jimmy Carter and Ronald Reagan also had a major rivalry in 1980. The two men did not get along or interact during the Reagan presidency. One famous moment displaying their continued rivalry occurred when Reagan ordered the solar panels that Carter had installed on the White House roof removed. But Carter went to the funeral of Egyptian President Anwar El-Sadat in 1981 at Reagan’s invitation, and also attended the opening of the Ronald Reagan Presidential Library and Museum in 1991 and the funerals of Ronald and Nancy Reagan in 2004 and 2016 respectively. So in a sense, there was a mild reconciliation.

The same situation occurred between George H. W. Bush and Bill Clinton during the presidential Election of 1992, and over the next eight years. A reconciliation came during the administration of George W. Bush, whose father and Clinton became close to the point that the senior Bush and his wife Barbara jokingly said Bill Clinton was an adopted son from another mother. Clinton and the elder Bush engaged in Hurricane Katrina relief in 2005, with Clinton offering his sleeping quarters on their shared plane to the older Bush. 

Finally comes the story of Donald Trump’s rivalry with both Bill Clinton and his 2016 opponent Hillary Clinton, and the conflict between Trump and Barack Obama.

It may be hard to imagine, but Trump had invited the Clintons to his wedding to Melania in 2005 and had said good things about both of them, only to turn against them and run a vicious campaign of insults in 2016. The Trump attack on both Clintons has continued to this day, as has the totally nasty and bitter Trump attack on Obama, beginning with the “Birther” conspiracy theory that Obama was not an American citizen and was born in Kenya, and continuing incessantly ever since.

Trump has set out to destroy everything that Barack Obama accomplished in both domestic and foreign policy during his eight years in the White House. Trump has accused Obama and all of the people in his administration of treason, and no attack by him or his two older sons, Donald Jr. and Eric, on Obama and his wife is beyond the pale. And now, the baseless accusation that Trump calls “Obamagate” is becoming the newest line of attack in the 2020 presidential campaign against Obama’s vice president, Joe Biden. So anything is possible in the campaign of hatred and division being waged by the 45th president against the 44th president and his vice president. It is probably no coincidence that this is happening in the midst of the Coronavirus-driven collapse of the American economy from the peak of the recovery accomplished under Obama after the Great Recession of 2008, arguably the greatest economic recovery in American history (surpassing FDR’s after Herbert Hoover).

While presidential rivalries have happened before, they have reached a new peak with the current president, more vicious and divisive than any other in American history.

JFK at 103: What We Miss

 

 

John F. Kennedy, whose life and brief presidency have generated countless books and intense interest, was born 103 years ago today. For many baby boomers his presidency stands as a beacon for wit, grace, and youth. His abrupt death remains a watershed in modern American history, a point at which the United States took a different turn.

 

I met Kennedy when I was ten years old and was smitten. I have studied him for years. He was a complex man who sometimes did infuriating things but also some magnificent things. He was flawed, as are all political figures, but he also was special.

 

Theodore H. White, the keen presidential observer who helped craft the Camelot mystique after the president’s death, later explained that time in more measured ways. He called the hagiography “a misreading of history. The magic Camelot of John F. Kennedy never existed. Instead, there began in Kennedy’s time an effort of government to bring reason to facts which were becoming almost too complicated for human minds to grasp.” He added that a more accurate appraisal of Kennedy and his advisors presented a challenge: “What kind of people are we Americans? What do we want to become?”

 

Those 1,036 days of the Kennedy presidency were a time of promise. He began a journey to bring the nation to a better place in Cold War tensions and peace, civil rights, the environment, public service, the arts, and other areas. He said in his inaugural address that the quest would extend beyond his time, but great progress was made.

 

John Kennedy understood the importance of symbolism, inspiration, and aspiration. He was a unifier when unity was important. He also grasped the importance of science and research to the country and the world; how we miss that commitment today.

 

Camelot was a special time but for more than the glamour: it represented a brief period when presidential leadership was strong and clear. As Teddy White said, Kennedy asked us—cajoled us—to search for “What do we want to become?”

 

If this were an ordinary time, I would be heading to Martin’s Tavern in Georgetown today to eat a bowl of clam chowder in the Kennedy booth as I did on the one hundredth anniversary of his birth. But we can still toast a different time and hope that we can recapture what we have lost.

The Decline and Fall of Socialist Zionism (Review)

1950s Mapam May Day meeting. Slogan reads '1 of May for Peace and Brotherhood of the Peoples.'

Photo Hashomer Hatzair Archives Yad Yaari, CC BY 2.5

 

 

Review of Kibbutz Ha'Artzi, MAPAM, and the Demise of the Israeli Labor Movement by Tal Elmaliach (translated from Hebrew by Haim Watzman), Syracuse University Press, 2020.

 

It may seem surprising that this work by a young Israeli historian does not depict recent events that have all but destroyed the political parties formerly associated with the “Labor Movement” in the title. Instead, Tal Elmaliach concentrates on the 1950s through 1977, the year that Israel’s politics were shaken to the core when a Likud-led, right-leaning government was elected for the first time. (A post-doctoral fellow at the University of Wisconsin-Madison from 2015-17, he is currently at Ben-Gurion University in Beersheva, Israel.)   

 

The Labor Party has won, lost and tied electoral contests since, along with its frequent ally, the more left-wing Meretz.  Both have been stigmatized by the failed peace process of the 1990s, and have experienced catastrophic decline in support, as noted by Haaretz columnist Chemi Shalev in an April 1st article, reviewing this past March round of elections:

 

“The remnants of the movement that founded Israel and ruled it for decades, Labor and Meretz, were forced to band together to avert electoral extinction; they wound up with 5.83 per cent of the vote, which yielded a grand total of 7 seats in the new Knesset.” 

 

This dwindled to six out of 120 Members of Knesset, as a right of center ally defected to Netanyahu’s side shortly after the election.  Labor and Meretz have also reverted to their former status as totally separate parties.  

 

In 1992, Labor and Meretz won 56 seats to form the core of Yitzhak Rabin’s peace-oriented coalition. They returned briefly to power under Ehud Barak in 1999-2000, but have struggled ever since the Second Intifada led to the election of Ariel Sharon in 2001. In alliance with a small dovish party led by Tzipi Livni, Labor gave Prime Minister Benjamin Netanyahu a run for his money in 2015, but has now fallen to the brink of extinction.  

 

The other entities in the title, “Kibbutz Ha’Artzi” and “MAPAM,” were respectively the federation of kibbutzim (communes) and the left-socialist political party founded by the first pioneering Zionist youth movement, Hashomer Hatzair (the “young guard”).  MAPAM, Israel’s second largest party in the early 1950s, became a major component of the Meretz party created in the 1990s. 

 

There were two other kibbutz federations of the secular-Zionist left, each associated with distinct political parties. One of these, MAPAI, became Israel’s largest and most influential party under the leadership of David Ben-Gurion, the country’s first and, until recently, longest-serving prime minister. MAPAI was also the original political home of Prime Ministers Levi Eshkol and Golda Meir; it merged with two other parties to form the Labor Party in the late 1960s, and eventually formed a common electoral slate with MAPAM, known as the Labor Alignment.  

 

Dr. Elmaliach embeds his analysis of Kibbutz Ha’Artzi (translatable as the “National Kibbutz”) and MAPAM (the Hebrew acronym for “United Workers Party”) with many instances of cooperation and conflict with the other two left-Zionist movements, and their functioning within the Histadrut trade union federation, which MAPAI more or less dominated in periodic elections.  The Histadrut doubled as the holding company for a number of member-owned industrial and agricultural cooperatives that largely shaped the economy under the British Mandate and Israel’s first decades as an independent state.     

 

This plethora of parties, movements and related institutions makes this book a challenge for anyone not familiar with this history.  Elmaliach weaves a complicated picture that toggles back & forth between the Kibbutz Ha’Artzi/MAPAM movement and the “labor movement” of the title.  He advances an economic structural and sociological analysis for the decline of Israel’s Zionist left.   

 

His work is grounded in historical facts that do not reflect well on MAPAM, the most left-wing of pioneering Zionist movements and the broader labor movement of which it was a part. The Kibbutz Ha’Artzi/MAPAM movement’s leadership was dominated by two strong personalities, Ya’akov Hazan and Meir Ya’ari, for decades into the 1970s.  Elmaliach (who was brought up on a kibbutz) views their duopoly in caustic terms: “Ya’ari may have been accused of suffocating paternalism, but Hazan was an expert at axing new ideas.”  Together, they resisted merging with the rest of the labor camp into the Labor Party, but also ensured that they were not fully divorced from it.

 

The differences among the three major labor streams are not deeply examined, but they bring to mind the expression “the narcissism of small differences.” Both MAPAM and the Achdut Ha’Avoda (“brotherhood of labor”) movement, with which it merged from 1946 until 1954, were pro-Soviet. MAPAM continued to regard the USSR as the “socialist motherland” until Nikita Khrushchev’s dramatic revelations about Stalin to the Communist Party Congress in 1956, and even after the Soviet Union had totally switched from materially allying with Israel to arming Egypt and other Arab forces after the rise of Gamal Abdel Nasser.    

 

There were some significant differences occasioned by the “statism” of Ben-Gurion, who favored building national institutions in the new Jewish state at the expense of the separate kibbutz/political movements.  The three movements maintained separate institutional infrastructures, which included different training facilities, teacher colleges, and publications.  They zealously defended the distinct legacies of their founders and emphasized their ideological differences, but none more energetically than the Kibbutz Ha’Artzi/MAPAM movement, which regarded itself as Marxist-Leninist.  These differences began to break down when Hazan and Ya’ari were effectively forced into retirement in the early ‘70s.  

 

Elmaliach writes of the emergence of a “New Left” at that time, akin to similar currents in the US and Europe, that reached across the movements for a fresh look at society by a new generation in revolt against the establishment -- which in Israel meant the Old Left.  In this connection, he writes of the Tzavta cultural club, a Tel Aviv center sponsored by the Kibbutz Ha’Artzi/MAPAM originally to advance its Party-approved ideology; eventually young activists succeeded in opening Tzavta to diverse contributors and views, and it became a popular source of entertainment for the general public.  

 

The author includes in his analysis the emergence of a technocratic managerial elite in the running of kibbutz enterprises, which pressured all the kibbutz movements to rationalize their operations for the sake of profit, including industrialization and the hiring of non-kibbutz workers (ideologically anathema to many) to raise living standards.  He sees this as part of a global trend among Western democracies, including within established leftwing parties, toward “neoliberalism.”  

 

Israel rose with astonishing rapidity from Third World scarcity to the technologically-advanced productive capacity of a First-World Western economy.  Sadly, this rapid development accentuated the difficult adjustment of the new masses of Jewish immigrants and refugees from Arab and other Muslim-majority countries, who doubled Israel’s population in the 1950s and early ‘60s.  They were culturally distinct from Israel’s founding generations of Ashkenazi Jews (of European origin) who had already created the early ethos of a country striving to be Western and mostly secular.  The newcomers were generally more traditional and religious in their outlook, and not receptive to the unconventional and communal lifestyle of the kibbutz.  

       

This Mizrachi (Middle Eastern) working class suffered in its initial decades in Israel from lacking the protection (known then as “proteksia”) that came with being embedded within one of the left movements. The author writes of the inequitable impact of the mid-1960s recession: there was no national unemployment insurance for workers in the construction trades and other private enterprises where most Mizrachim were employed; by way of contrast, most Ashkenazi workers benefited from civil service employment or membership in kibbutzim or the Histadrut labor federation. For cultural reasons, few Mizrachim joined a kibbutz, and few could afford the membership dues to join the Histadrut at that time.  

 

By the time a similar recession hit during Yitzhak Rabin’s first term as prime minister in the mid 1970s, the government had instituted unemployment insurance and provided other measures designed to soften the blow for the mostly Mizrachi working class, but this was too late politically, with Menachem Begin’s nationalist rightwing bloc making decisive strides to win over their votes.  (One surprise finding of Elmaliach’s research is that the Mizrachim did not massively desert Labor for Likud until the 1970s.)     

 

Since the author ends his analysis in 1977 with Likud’s rise to power, he misses the final period of MAPAM’s existence, when it broke with the Labor Party to oppose Israel’s invasion of Lebanon in 1982, and did not renew its alliance with Labor in the unified electoral slate of the Labor Alignment.  Labor balked at Ariel Sharon’s expansion of the war to Beirut, but had previously backed a limited operation to push Arafat’s PLO forces away from Israel’s northern border.

 

Somewhat typically for MAPAM, the movement gave its adherents a complicated message.  Whether they were older reservists called up for the war or younger members doing their initial army service, members of the party or its aligned kibbutzim were advised to serve with their units on the frontline, and then to join the demonstrations being mounted against the war, when on leave.  

 

A large grassroots Israeli peace movement began during the first Lebanon war; as once noted ironically by the late Prof. Edward Said, Tel Aviv was the only place in the Middle East where there was a massive protest against the slaughter of Palestinians in two refugee camps by a Lebanese Christian militia, allied with Israel.  MAPAM pressed its nuanced position that the integrity of Israel’s armed forces was vital for the country’s survival, but as citizens of a free country, they could act politically to change Israel’s policies.  

 

Some members of the Kibbutz Ha’Artzi took a more consistent position against the war and advocated refusing to serve.  They broke with MAPAM to become active in a small, ultimately fringe leftwing political party known as Sheli.  The Lebanon war of the 1980s and ‘90s, which transformed from a conflict with the Palestinians to a war with the new Lebanese Shi’ite Hezbollah movement, was the first time that Israel’s national consensus broke down.  It was also the first time that MAPAM lost the support of some members of its own kibbutz federation.  

MAPAM ended its independent existence by merging into the new Meretz party during the 1990s. As Meretz, it remains the most clearly identified leftwing pole of the wide spectrum of parties and currents that consider themselves Zionist.  It has consistently advocated for an equitable two-state solution with Palestinians in the West Bank and Gaza Strip, while simultaneously defending the rights and participation of Arab citizens within Israeli society; most Meretz activists back Israel as both a state for Jews and for all its citizens.  It remains to be seen whether Meretz can maintain itself against the crosscurrents of competing nationalisms, but as long as it endures, the legacy of MAPAM continues.  

Roundup Top Ten for May 29, 2020

Why Does the Minneapolis Police Department Look Like a Military Unit?

by Philip V. McHarris

The violent response to protests by the Minneapolis Police Department shows how police departments that have been granted subsidized access to military equipment have adopted a posture of battle against civilians. 

 

Immigrant Workers Have Borne the Brunt of COVID-19 Outbreaks at Meatpacking Plants

by Anya Jabour

The COVID-19 epidemic should remind us of the hazards faced by immigrant meatpacking workers a century ago, and the labor and industry reforms needed to secure their safety.

 

 

How the Disappearance of Etan Patz Changed the Face of New York City Forever

by Paul Renfro

Concerns about the “safety” and “security” of specific children—particularly those who resemble Etan Patz—played a considerable role in New York’s extraordinary late twentieth-century transformation.

 

 

COVID-19 Didn’t Break the Food System. Hunger Was Already Here.

by Carla Cevasco

The COVID-19 pandemic is revealing the hunger underneath the rhetoric of American plenty.

 

 

Understanding the Origins of American Gun Culture Can Help Reframe Today’s Gun Debate

by Jim Rasenberger

The past is a morally untidy place. As a result, it is also a place, perhaps the last one left, where we can meet and lower our weapons for a while. 

 

 

There’s No Historical Justification for One of the Most Dangerous Ideas in American Law

by Julian Davis Mortenson and Nicholas Bagley

The delegation of regulatory power to federal agencies is the indispensable foundation of modern American governance. And it is under siege.

 

 

’A Black Man in a White Space’: America has a Long and Troubled History of Segregated Public Parks

by Victoria W. Wolcott

Public spaces are infused with the power of history: the legacy of segregation, police brutality, and white supremacy. If there was ever a time that called for compassion in our shared spaces, it is now.

 

 

From Associate to Full Professor

by Keisha N. Blain

Although securing tenure and tenure-track jobs has received great attention lately, it is important that historians from underrepresented groups successfully pursue promotion to full professorship in their institutions to diversify leadership in the profession. 

 

 

The Link Between the Video of Ahmaud Arbery’s Death and Lynching Photos

by Grace Elizabeth Hale

Lynching images can only be created in a context where both killing and observing are allowed by law and society.

 

 

The Great Depression, Coronavirus Style: Crashes, Then and Now

by Nomi Prins

Monetary policy responses to the current crisis can't fix either the structural problems that make the economy vulnerable to severe disruption or the virus and public health crisis that underlie that disruption. Governments must choose to take coordinated action on multiple fronts. 

 

Tolerance and Violence: The Fate of Religious Minorities During The Plague under Christianity and Islam

 

 

 

Pandemics are nothing new—they scythed through the ancient world as they did the pre-modern and, as we know to our grief and confusion, they are still mowing us down today.

 

We might think that human nature is fairly invariant across time and space, and expect the response to these catastrophes to be perennially the same. Certainly, in the 21st century there are disturbing echoes of the way Jews were blamed by European Christians of the 14th century for the Black Death. From the US to the UK, from Iran to Indonesia (the largest Muslim country in the world), there has recently been an escalation of abuse and violence against Chinese and Asian-looking people. And not just Asians. Political groups and politicians have latched on to coronavirus as a weapon in their anti-immigration policies, urging their partisans to hunker down and suspect the alien minority. In a bid for votes President Trump seems to be using coronavirus to whip up anti-Chinese feeling. In India, egged on by the BJP, the ruling Hindu nationalist party, Muslims have been viciously attacked and accused of conspiring to kill Hindus by deliberately spreading the disease.

 

But in equal measure we can see that rulers across time and space have given startlingly different responses to pandemics, depending on their religious, political and economic circumstances. 

 

Take the Black Death. The plague swept through Christian Europe and Islamdom at roughly the same time--between 1347 and 1351. In Christian Europe, the bubonic plague, which probably originated in China, killed about 25 million; a deadlier strain, the pneumonic plague, which was in fact spread by particles in the air, killed about 75 million in Islamdom (Muslim Spain, North Africa and the Middle East). Like some ghastly laboratory experiment, the plague tested how both Islamic and Christian worlds responded to the same terrifying catastrophe.

 

The Christian world responded by turning on its usual scapegoat: Jews, ‘the sons of the crucifiers’. Reports spread that Jews were emptying little leather pouches of poison into the wells of Europe’s cities. Jews were rounded up in village and city squares–from Christian Spain through France and Germany to Holland—and burned in their thousands. It would be easy to blame the ignorant mob, but it was usually the secular and religious authorities—the city councils run by merchants and craftsmen, the great lords and dukes who ruled cities and territories, and the Holy Roman Emperor himself—who arrested Jews, initiated the tortures, presided over the confessions and ‘trials’ and organised the massacres. In Basel, Switzerland, 600 Jewish men and women were herded into a wooden barn and burned to death with the support of the prince-bishop; perhaps the worst massacre occurred in Strasbourg, France, where the bishop, his senior clergymen and the city guild authorised the rounding up of some 2,000 Jews and burned them to death. 

 

To give him his due, Pope Clement VI condemned the conspiracy theorists, arguing that Jews were also dying of the plague, though actually they died in lower numbers, thanks to their religious cleansing rituals. But to no avail. More European Jews were killed during this time than at any other time until the Holocaust (there were no Jews to burn in England, because they had all been expelled in 1290).

 

By contrast, when bodies were piling up in the Islamic world and funeral corteges jostled each other in the streets, when 1,000 people were dying every day in Damascus (compared to 300 a day in London), the Mamluk authorities called on the city’s Christians, Jews and Muslims to pray and fast for three days. On the fourth day, the three communities of monotheists processed together barefoot out of the city gates to the Mosque of the Footprint where they spent the day chanting and praying together.

 

Of course there were outbursts of violence directed against Christians and Jews, but nothing on the scale of the violence exacted in the Christian world. 

 

Judging by the standards of pre-modern Christendom, the Muslim world was astonishingly tolerant of its religious minorities. It had had to be, ever since Arab tribesmen conquered the Persian and most of the eastern Roman empires in the 7th century and began to build their vast Islamic empire. The Arab conquerors desperately needed the trade and administrative skills of the Christian, Zoroastrian and Jewish people they were conquering, and the result was a series of agreements, probably drawn up in the 7th century. The ‘Pact of Umar’, as it was called, was essentially a charter for tolerance, but it demonstrates why, even if tolerance is infinitely preferable to persecution, it is undesirable. 

 

In exchange for being allowed to practise their religion without being killed, the dhimmis, the protected ‘people of the Book’, agreed that they would pay a special tax, the jizya. But even more importantly they agreed that ‘We [the dhimmis] shall show deference to the Muslims and shall rise from our seats when they wish to sit down’. Dhimmis could never build a house higher than a Muslim’s, nor be employed in a position of superiority over a Muslim. They had to wear clothes that distinguished them from Muslims—in the 9th century, for instance, Jews and Christians had to wear an identifying yellow patch on their cloaks. In other words, as the caliph Umar I said: ‘Humiliate them [the dhimmis] but do them no injustice’. The Pact (strictly enforced by Muslim authorities in times of fear and crisis) became the model for the Muslim world’s policy towards its religious minorities, and is still so today in certain Islamic countries. 

 

Tolerance was infinitely preferable to the massacres and persecutions inflicted on Jews in Christendom. Nevertheless, it is based on a relationship of the superior who puts up with (tolerates) the inferior. It is, in fact, based on dislike, ‘To tolerate is to insult’, as Goethe pointed out. 

 

Not that the Islamic authorities, secular or religious, thought tolerance was praiseworthy. As in the Christian world, tolerance was frowned on: it encouraged heresy and its political sister, rebellion. But it was a political necessity in the Islamic empire because of its multi-faith history. In the Christian world tolerance was not considered to be virtuous until Europe had been bled dry by a century of wars fought in the name of religion, culminating in the Thirty Years’ War of the 17th century. Tolerance then became not just a political necessity but began to be seen, thanks to religious pioneers like the Rhode Island founder Roger Williams, as a quintessentially Christian virtue. 

 

Christendom has, indeed, turned the tables on Islamdom. In the pre-European world Islamdom was the model of tolerance compared to Christendom’s shameful record of intolerance. Now the West prides itself on its tolerance - which it contrasts with Islamic intolerance. But, to repeat, tolerance is a questionable virtue.

 

Look, for instance, at the response to HIV/AIDS, known in the 80s as ‘the gay disease’. In 14th century Christendom gay men would have been burned at the stake; in the 20th century West (except for a handful of religious fundamentalists who saw AIDS as God’s punishment on homosexuality) gay men were just about tolerated. But they were not accepted, nor treated as the equals of heterosexuals—one reason perhaps why the search for a cure for ‘the gay disease’ took a low priority despite the carnage it wrought. It took vigorous campaigning to move AIDS up the political agenda.

 

So have our authorities done any better with this latest pandemic?

 

Compare how virtually the whole of the world has responded to this killer epidemic with the way we reacted to the Hong Kong flu in 1968. Scarcely remembered now, that flu killed 100,000 people in the US and 1-4 million worldwide. There were no lockdowns - who knows how many lives could have been saved if there had been. But this time around, the world has done a remarkable thing. Across the world from West to East and North to South, no matter what the majority religion of the country, many of our rulers have reacted in the same way to the same catastrophe. They have shut down our economies in order to save lives. The scapegoating still goes on, of course. We still have our anti-immigrant nationalists and our jihadists, who are using the blame card for all it is worth—scapegoating immigrants, Muslims, Jews, or, in the case of some Iranian and jihadist fanatics, the decadent West. Yet, as President Macron of France said in a recent interview with the Financial Times, ‘we have half the planet at a standstill to save lives. There is no precedent in history.’ 

 

Even if other, less altruistic, factors are also at work, the fact is that our countries, from the US to Abu Dhabi, are throttling their economies to save lives. 

 

Collectively we have shown that we value our humanity - far more than our economic order and wealth. We have shown that we are capable of moving beyond the terrors of blame, and the humiliations of tolerance towards a better world where everyone’s life is valued - including the lives of the elderly, often the least valued people in the West.

 

Selina O’Grady is the author of In the Name of God: The Role of Religion in the Modern World. A History of Judeo-Christian Tolerance (Pegasus, June 2020). She is also the author of And Man Created God: A History of the World at the Time of Jesus (St Martin’s Press and Picador).

 

 

 

]]>
Mon, 01 Jun 2020 19:38:59 +0000 https://historynewsnetwork.org/article/175624 https://historynewsnetwork.org/article/175624 0
Honor the Work of Brazil's Villas-Bôas Brothers by Protecting the Amazon's Indigenous

Xingu men wrestle. Photo by Author

 

 

Brazil’s 800,000 indigenous people, in over two hundred tribes, are under the greatest threat that they have seen in recent times.  President Jair Bolsonaro claims that they want to leave their communal societies and ancient cultures, to make money as ordinary materialistic Brazilians.  He wants to open their territories to mining and other commercial exploitation.  And he favours destroying tropical rain forests by penetration roads, colonisation schemes, logging and hydroelectric dams.  Every indigenous leader – world figures like Raoni and Megaron Metuktire, Davi Yanomami, Aritana Yawalapiti, and scores of younger chiefs – as well as every anthropologist and environmental activist knows that he is totally wrong.  I, who have been with forty-five indigenous peoples and written a three-volume, 2,200-page history of their travails during the past five centuries, know that there is not a single people that wants to leave its lands and communal way of life.

 

Until now, Brazil could be proud of its recent treatment of indigenous people. Its 1988 Constitution declares that they are descendants of the original Brazilians. It protects their way of life and all the lands that they need for their physical and spiritual wellbeing.  The result is some seven hundred indigenous territories, for 240 ethnic groups, covering almost 1,150,000 square kilometres – an area as large as the original European Union.  Much of this is forested, a significant portion of the world’s surviving tropical rain forests – Earth’s richest ecosystem, essential for sequestering carbon, generating rain and combating global warming.  To paraphrase Winston Churchill: rarely have so many owed so much to so few – the rest of humanity to the indigenous peoples, custodians of the forests within which they alone can live sustainably.

 

The system that is now under acute threat is to a large extent the creation of the three remarkable Villas-Bôas brothers, the subjects of my new book People of the Rainforest.  In the late 1940s the President of Brazil announced a great national venture, to cut into its Amazonian forests rather than explore them by river, as had been done during the previous four centuries.  The Villas-Bôas brothers were the only middle-class young men excited by this expedition.  Each left a desk job in São Paulo and travelled 1,500 kilometres to join it; with their dynamism, education and leadership potential, they were soon in charge.  After eighteen months of tough trail-cutting across unexplored, dangerous ‘Indian country’ the expedition reached its goal: the upper Xingu River, in the forested heart of Brazil. After leading this national endeavour, the brothers made further challenging explorations, and they opened a chain of airstrips across the endless expanse of forests. All this made the glamorous brothers their country’s most famous explorers.   

 

The headwaters of the upper Xingu were inhabited by a dozen indigenous peoples, who had remarkably survived into the mid-twentieth century with their elaborate cultures intact. The Villas-Bôas soon progressed from being government explorers to helping these splendid Indians (as they were then known).  They spent thirty years among them, and the rest of their lives campaigning for indigenous rights.  Despite having no anthropological training or experience (or perhaps because of this), the brothers evolved a new modus operandi.  They treated Indians as equals, sometimes jokingly and affectionately, and saw themselves as friends and helpers rather than colonialist representatives of the central authorities.  Leonardo, the youngest brother, died during a 1961 heart operation, but the surviving Orlando and Cláudio each ran a ‘post’ on the Xingu river, 150 kilometres apart. They visited villages only when invited, usually for a festival or ceremony; but Indians came to them at any time and they knew all about tribal politics.  These methods were adopted throughout other indigenous posts.    

 

The well-ordered societies of the Xingu’s sources were under attack from warlike tribes in surrounding forests and rivers.  So during the 1950s and ‘60s the Villas-Bôas brothers made a series of expeditions to contact and pacify five of these peoples. Each first contact was very tough and often dangerous – adventures that made Indiana Jones’s exploits look like a stroll in the park.  The last was to win over the belligerent Panará people.  There was an unsuccessful five-month expedition in 1967-68 followed by a thirteen-month attempt that finally achieved face-to-face contact in 1973.  These ventures endured three rainy seasons, when torrential rains meant the explorers were never dry.  The result was what is known as the Pax Xinguana: the brothers got the indigenous peoples to unite in the face of encroaching national society.  Their contact techniques have been used in all subsequent approaches to isolated peoples.

 

The Brazilian Air Force ran a service of regular flights by WW2 C-47 Dakotas to the string of forest airstrips. The brothers could thus exclude unwelcome adventurers, even missionaries; but they could invite anthropologists and serious journalists. The combination of handsome Indians, famous glamorous explorers and idyllic forests was a media dream, and the Villas-Bôas got superb coverage.  This completely changed public opinion, leading Brazilians to see indigenous societies not as embarrassing primitives but as a source of pride in a purely Brazilian culture.  Politicians took note.

 

In 1952 Orlando Villas-Bôas and three others had a radical idea: to get a vast tract of forests protected, just for the indigenous peoples living in them. Nine years of frantic political and media struggle ensued, until in 1961 Brazil elected a dynamic young President, Jânio Quadros – who happened to be a family friend of the Villas-Bôas in São Paulo and had seen their achievements in the Xingu.  So Quadros rammed through, as a presidential decree, the 26,000-square-kilometre Xingu Indigenous Park – the first of its kind in the world, and the prototype for similar huge reserves throughout Amazonia that have had a powerful environmental impact on a global scale.

 

From the outset, the brothers knew that change was inevitable.  But their mantra was: ‘Change, but only at the speed the Indians want’.  The challenge was to make indigenous peoples aware of what was offered by national society – but without losing their pride in their way of life.  In this they succeeded brilliantly, bringing their tribes from pre-literate, pre-metal hunting and fishing to co-existence with modern life – in a mere two generations.  Most of the grand old indigenous spokesmen are their protégés, and young Xinguanos are totally computer-literate.  But they all know that their ancient communal societies, in their beloved forests, are infinitely preferable to life on the rough settlement frontier.

 

The Villas-Bôas left the Xingu in 1976, by then ‘national treasures’ loaded with medals and honours.  They were twice nominated for Nobel Peace Prizes, with every Amazonian anthropologist, headed by Claude Lévi-Strauss, endorsing them. They wrote and lectured tirelessly for the cause throughout the last quarter of the twentieth century, and helped to get the favourable clauses into Brazil’s Constitution. So they would be appalled by President Bolsonaro’s threats to everything they and their indigenous friends achieved.

]]>
Mon, 01 Jun 2020 19:38:59 +0000 https://historynewsnetwork.org/article/175625 https://historynewsnetwork.org/article/175625 0
"The News Will Get Out Soon Enough": Homefront Jim Crow and the Integration of the U.S. Navy

"The Golden Thirteen", the first African-American U.S. Navy commissioned officers. They are (bottom row, left to right): Ensign James E. Hare, USNR; Ensign Samuel E. Barnes, USNR; Ensign George C. Cooper, USNR; Ensign William S. White, USNR; Ensign Dennis D. Nelson, USNR; (middle row, left to right): Ensign Graham E. Martin, USNR; Warrant Officer Charles B. Lear, USNR; Ensign Phillip G. Barnes, USNR; Ensign Reginald E. Goodwin, USNR; (top row, left to right): Ensign John W. Reagan, USNR; Ensign Jesse W. Arbor, USNR; Ensign Dalton L. Baugh, USNR; Ensign Frank E. Sublett, USNR.

 

 

This Memorial Day will be particularly poignant given the year’s ongoing events and remembrances marking the 75th anniversary of the end of World War II.  As Americans honor those who fought for freedom and democracy in Europe and the Pacific, they would do well to also commemorate the bravery of the thirteen African American men who integrated the U.S. Navy’s officer corps during the war. 

Politico reporter Dan Goldberg spent eight years researching the story of these forgotten heroes, intent on bringing them the attention they deserve. In The Golden Thirteen: How Black Men Won the Right to Wear Navy Gold he draws on that research, as well as interviews with the men’s family members, to create a detailed picture of their lives and the opposition they faced, from the pseudoscience that propped up racist policies to the everyday degradation and brutality of America’s Jim Crow era.  

The following is adapted from Chapter 9 of The Golden Thirteen.

 

The headlines today – replete with armed protesters, fury at the government and violence – recall an earlier time in the nation’s history, a period that has been whitewashed over 80 years to hide the stains of sins committed when unity against a common enemy was needed most. 

 

Then, as now, critics of the president felt he was indifferent to their suffering and more concerned about alienating his base than about using the bully pulpit to bring the country together.

 

In 1943, as the Allies took control of North Africa with a succession of victories abroad, race riots were roiling the home front, but one can draw a straight line from some of America’s most shameful days to the Navy’s decision to commission the Golden Thirteen, the first black officers in the U.S. Navy.  

 

For two years, the drafting and subsequent movement of thousands of black men from segregated towns in the South or racially tolerant cities in the North to boot camps in Southern metropolises or overcrowded cities added fuel to embers that had been smoldering for decades.

 

Riots erupted in the nation’s major urban centers as well as cities critical to defense efforts, including Mobile, Detroit, New York, and Los Angeles. The Social Science Institute at Fisk University counted 242 such outbreaks during 1943, producing what a later observer labeled “an epidemic of interracial violence.”

 

William Hastie, who had resigned his position as Secretary of War Stimson’s civilian aide in January 1943 because no one was taking his complaints about the mistreatment of black soldiers seriously, told the National Lawyers Guild at the end of May that “civilian violence against the Negro in uniform is a recurrent phenomenon. It may well be the greatest single factor now operating to make 13 million Negroes bitter and resentful.”

 

Deadly rioting broke out in Beaumont, Texas, and Detroit, Michigan, where, in early June, more than 25,000 white workers went on strike after the Packard Motor plant promoted three black men to work on the assembly line beside white men. One striker shouted, “I’d rather see Hitler and Hirohito win the war than work beside a nigger on the assembly line.”

Later that month rioting broke out at Belle Isle, a municipal park on an island in the Detroit River.  Twenty-five blacks and nine whites were killed, and more than 750 were injured before the riot, the worst of the era, ended. 

 

Letters poured into the White House demanding federal action, and Walter White, head of the NAACP, begged the president to intervene, to marshal the nation as he had done so many times before when a national crisis threatened to overwhelm the republic.

 

“No lesser voice than yours can arouse public opinion sufficiently against these deliberately provoked attacks, which are designed to hamper war production, destroy or weaken morale, and to deny minorities, Negroes in particular, the opportunity to participate in the war effort on the same basis as other Americans,” White wrote. “We are certain that unless you act these outbreaks will increase in number and violence.”

 

But the White House made no move, paralyzed by fear of making the situation worse. For every concerned voice that demanded the President intervene to stop Jim Crowism and call for racial equality, there was an equally concerned voice saying it was the very push for racial equality that was causing all these riots, and that Eleanor Roosevelt, in her never-ending quest to promote black men in the factories and the fields, in the Army and the Navy, was responsible for the national discord.

 

“It is my belief Mrs. Roosevelt and Mayor [Edward] Jeffries of Detroit are somewhat guilty of the race riots here due to their coddling of Negros [sic],” John Lang, who owned a bookstore in Detroit, wrote in a letter to FDR. “It is about time you began thinking about the men who built this country.”

 

The Jackson Mississippi Daily News declared the Detroit riots were “blood upon your hands, Mrs. Roosevelt” and said she had “been . . . proclaiming and practicing social equality. In Detroit, a city noted for the growing impudence and insolence of its Negro population, an attempt was made to put your preachments into practice.”

 

Inside the White House, the thought of devoting a Fireside Chat to the subject of race riots was deemed “unwise” by the president’s counselors. At most, Attorney General Francis Biddle argued, the president “might consider discussing it the next time you talk about the overall domestic situation as one of the problems to be considered.”

 

Roosevelt thought even that too much, and when he gave a Fireside Chat on July 28, one month after the Detroit riots, he devoted not one word to race. 

 

Historians Philip A. Klinkner and Rogers M. Smith have argued that Roosevelt’s famous political antennae failed to pick up the changes taking place in the spring and summer of 1943. Before the war, it was almost universally accepted by white Americans that they were a superior race. Even among the most progressive class, only a few believed much could or should be done about inequality in the near term. In 1942, a National Opinion Research Center Poll found that 62 percent of whites interviewed thought blacks were “pretty well satisfied with things in this country,” while 24 percent thought they were dissatisfied. But by 1943 attitudes were shifting, and a year later, 25 percent of white Americans thought black people were satisfied with their status and 54 percent thought they were dissatisfied.

 

“True, white southerners were becoming more restive, but it seems clear that in the context of the war, nationally public attitudes on race had shifted enough that [Roosevelt] could have been more outspoken for reform,” the historians argued.

 

In August, another large riot began—this time in New York City—when Margie Polite, a thirty-five-year-old black woman, was arrested by Patrolman James Collins for disorderly conduct outside the Braddock Hotel on 126th Street in Harlem. Robert Bandy, a black soldier on leave, intervened. He and Collins scuffled, and at some point Bandy allegedly took hold of Collins’s nightstick and struck him with it. Bandy tried to run, and Collins shot him in the left shoulder.

 

The incident was like a spark to kindling on a hot, sweaty night in the city, the kind where the air is thick and humid, and tempers rise to meet the mercury.

 

Men and women sitting on their fire escapes seeking relief from the stifling heat climbed down the ladders and formed a mob. They lived in those overstuffed, sweltering tenements because of the color of their skin, because the city wouldn’t let them leave the ghetto. They were packed into apartments like animals, and now that they were ready to die so that the best ideals of their country might live, their countrymen beat and slaughtered them like animals.

 

The Harlem Hellfighters, the black men who made up the 369th Infantry Regiment, had been sending letters home from Camp Stewart in Georgia in which they told friends and relatives, often in graphic detail, of the gratuitous insults and violence they endured. Harlem’s black press reported on how soldiers were beaten and sometimes lynched in camps across the South. Residents knew of the riots in Detroit and Beaumont. They knew that airplane factories on Long Island, even though desperate for workmen and -women, would not “degrade” their assembly lines with African Americans.

 

It took 8,000 New York State guards and 6,600 city police officers to quell the violence. In all, 500 people were arrested—all black, 100 of them women. One week later, when the New York Times examined the causes of the riot, it declared that no one should be surprised: “The principal cause of unrest in Harlem and other Negro communities has been [the] complaint of discrimination and Jim Crow treatment of Negroes in the armed forces.”

 

The Navy responded to the racial tensions by creating the Special Programs Unit. Its mission was to coordinate policies and protocols for black sailors so that they were used to their full potential and protected—as much as possible—from humiliation and violence.

 

At its helm was Lieutenant Commander Christopher Sargent, a thirty-one-year-old who had clerked for Supreme Court Justice Benjamin Cardozo and worked in the law firm of Dean Acheson, a future secretary of state. Sargent would later be described as “a philosopher who could not tolerate segregation,” and he waged “something of a moral crusade to integrate the Navy.”

 

Unlike many more senior officers, Sargent thought the war was the best time to integrate the fleet and told superiors that racial cooperation would create a more efficient fighting force. 

 

Among the unit’s highest priorities was to see to it that black men were no longer bunched together at ammunition depots or other installations with little real work to do, and that graduates of Class A naval training schools were given proper assignments.

 

The Special Programs Unit then convinced Admiral Ernest King, chief of naval operations, to place 196 enlisted African Americans along with 44 white officers on the USS Mason, a destroyer escort expected to traverse the Atlantic on convoy missions. The ship, still under construction at the Boston Navy Yard, was named for Ensign Newton Henry Mason, a fighter pilot shot down during the Battle of the Coral Sea. Many in the Navy gave it a different name: “Eleanor’s folly” they called it, another slight aimed at the First Lady for her advocacy of integration.

 

Manning the ship did not, of course, represent total integration or full equality. The ship would have all-black crews serving under white officers. White and black men would still sleep in different quarters and eat at different tables.

 

“We are trying to avoid mixing crews on ships,” Navy Secretary Frank Knox told reporters. “That puts a limitation on where we can employ Negro seamen.”

 

Still, the black press heralded the announcement. For years, civil rights leaders had said that the right to fight and die for one’s country was a crucial step toward making the United States a more perfect union. Having black sailors outside the messman branch serve at sea marked “a distinct departure from present Navy policy and is the culmination of a five-year fight,” the Pittsburgh Courier told its readers.

 

But one problem remained beyond the unit’s reach, one symbol of inequality so glaring that it outshone all other successes: at the end of 1943, there were no black officers.

 

The job of convincing Navy Secretary Frank Knox that it was finally time to commission black officers fell to Adlai Stevenson, the secretary’s speechwriter and confidant. 

 

On the question of integrating the officer corps, Stevenson explained to Knox, an efficiency expert, that refusing to commission black men was now unquestionably inefficient. The Army, he said, was still recruiting better-educated, better-disciplined black men in large part because that branch offered a path for advancement. If the Navy wanted to keep up, it would have to consider commissioning African American officers.

 

There were 60,000 black men in the Navy, and 12,000 more were entering every month, Stevenson wrote to Knox on September 29, 1943. “Obviously, this cannot go on indefinitely without accepting some officers or trying to explain why we don’t. I feel very emphatically that we should commission a few Negroes.”

 

Stevenson suggested “10 or 12 Negroes selected from top notch civilians just as we procure white officers.” He ended his memo by telling Knox, “If and when it is done, it should not be accompanied by any special publicity but rather treated as a matter of course. The news will get out soon enough.”

 

Excerpted from The Golden Thirteen: How Black Men Won the Right to Wear Navy Gold by Dan C. Goldberg.  Copyright 2020.  Excerpted with permission by Beacon Press.

 

 

]]>
Mon, 01 Jun 2020 19:38:59 +0000 https://historynewsnetwork.org/article/175618 https://historynewsnetwork.org/article/175618 0
Will History Judge Trump? Only if We Preserve Our Capacity for Judgment

 

 

 

“If Republican Senators choose a cover-up, the American people and history will judge it with the harshness it deserves.” Nancy Pelosi’s angry tweet last January was a response to the Senate’s failure to call witnesses in President Trump’s impeachment, an event that has faded from our collective memory as the very real fear of the coronavirus consumes our lives.  The White House’s chaotic response to COVID-19 will, by some estimates, result in an even harsher judgment by history than the Senate’s craven actions earlier this year. The assertion that “history will judge” assumes that actions in the present gain their full meaning only once the motivations of historical actors and the impact of their decisions have been assessed in the widest possible context – and that such an assessment can only be done in retrospect. That is the primary task of historians.

 

So how will “history” judge the actions of Republican senators and the president? The question takes on urgency because of the public health crisis. At least since February, Republican senators have chosen to ignore both the severity of the crisis and the dubious competence of Trump’s response. To hold Trump and the GOP accountable, even in future textbooks, will require a shared understanding of what constitutes truth.  A hallmark of any modern society is the idea that rational inquiry and unbiased evidence can lead us to the “truth,” a reality that most can accept. This understanding of the truth is at least 400 years old, emerging with gusto during the Scientific Revolution in the seventeenth century and then coming to shape thinking about government and society during the Enlightenment in the eighteenth. This idea of truth based on reason and evidence is what supports almost all research from life-saving medical breakthroughs (such as the coronavirus vaccine we are nervously awaiting) to the development of the iPhone.  But it is not the monopoly of research. Rational evidence-based inquiry is the hallmark of journalism, the work of intelligence agencies, and even the legal system, however imperfectly. 

 

Yet it is exactly this understanding of the truth that is at risk, and that puts us all in danger right now. Last January, Republican Senate leaders worked hard to make sure that their members would not have to listen to evidence that would force them to confront truths they wanted to ignore. Today, many governors are reopening their states even as cases of COVID-19 and deaths continue to mount, ignoring the evidence that shouts “too soon!” The president seems determined to create a narrative that the pandemic is under control and it is time to get back to work, desperate to revive the economy even though many Americans continue to question whether it is safe to end social distancing and reopen businesses yet.

 

Of course, “fake news,” blatant lies, twisting reality to hold on to power – these are nothing new in our political leaders. In a 2004 New York Times piece, Ron Suskind attributed the following infamous quote to a senior Bush official (thought to be Karl Rove):

 

People like you are still living in what we call the reality-based community. You believe that solutions emerge from your judicious study of discernible reality. That's not the way the world really works anymore. We're an empire now, and when we act, we create our own reality. And while you are studying that reality—judiciously, as you will—we'll act again, creating other new realities, which you can study too, and that's how things will sort out. We're history's actors, and you, all of you, will be left to just study what we do.

 

But until recently, this gleeful willingness to “create our own reality” has not been the norm, nor have institutions that uphold basic truth standards been publicly attacked by the government in the way that has become the hallmark of the Trump presidency. In fact, President Trump has demanded that Republicans publicly accept his version of reality – from the size of the crowd at his inauguration to the idea that Russia did not interfere in the 2016 elections. Now, he is demanding that the American people accept his claim of victory over the spread of COVID-19, and that they quietly ignore the deaths that continue to multiply in their communities. 

 

It is a cliché that history is written by the winners. It is not, at least not any more. History is, for the most part, written by professionals who undergo years of training to learn how to use historical sources according to strict rules of evidence to explain what happened in the past. To enforce these rules, the work of historians undergoes peer review before publication, weeding out biases and inaccuracies. It is not a perfect system, but it does fairly well.  But for history to judge this moment, and for citizens to accept that judgment, we will need a common understanding of truth. Even though, as historians, we write, we teach, and we tweet, we cannot bring the force of history’s judgment to bear alone. We need our fellow citizens to believe that our work provides a context predicated on modern notions of truth. We need our political leaders to put the truth ahead of fear and the desire to remain in power, especially when the public health, and indeed, our public institutions, are at stake. We need them all to accept an evidence-based reality. Only then can history make a judgment that will actually matter. 

]]>
Mon, 01 Jun 2020 19:38:59 +0000 https://historynewsnetwork.org/article/175622 https://historynewsnetwork.org/article/175622 0
The Social Psychology of Popular Right-Wing Conservatism

An 1856 cartoon depicts Republican John C. Fremont as a representative not only of abolition but of a host of threatening social movements

 

Many people who see little rational basis for supporting Donald Trump ask themselves: Why is he so popular?  Relatedly, why did so many people support Richard Nixon, Adolf Hitler, and other avatars of popular right-wing conservatism?  There are, of course, many different reasons for each situation.  But there are also key commonalities that have been identified in meta-analyses of the topic written by the psychologist John T. Jost and colleagues.  In relation to Jost’s work, I have examined aspects of the antebellum South in order to better understand its political culture, especially the elements of that culture that prompted many Southerners to become more emotionally receptive to the appeals of “fire-eater” secessionist conservatives.  More broadly, this historical lens can help illuminate the mass appeal of conservatism in general, focusing particularly on the psychological factors that tend to underlie this appeal.

 

Among the various scholars writing on the topic since World War II, Jost found agreement that an inclination to respect tradition in resisting change and to respect hierarchy in sustaining inequality are two key interrelated core conservative ideological characteristics.  He found that several psychological motives increase the likelihood that people will manifest these ideological characteristics.  These psychological motives include tendencies toward “fear and aggression . . ., dogmatism and intolerance of ambiguity . . ., uncertainty avoidance [including less openness to new experiences] . . ., need for cognitive closure . . ., need for structure . . ., and [support for] group-based dominance.”  The desire to keep under tight reins any conditions that evoke uncertainty or fear is the most prominent and overarching of these motives, which often translates into ideologically-based preferences to hold tight to elements of society and political culture identified with traditional authority. 

 

For decades, antebellum Southern whites were exposed to a regular drumbeat of fearful rumors that bloody slave insurrections would result if slaves were emancipated, which rabidly pro-secessionist political leaders (the fire-eaters) contended would happen if Abraham Lincoln were elected president in 1860.  This fear, along with racist attitudes toward African-Americans, reinforced the Southern white craving for order, the inclination to value the certainties that they felt they knew, and wariness about the possibilities of change.

 

Fear boosted the strength of leaders in politics, press, and pulpit whose cries about the dangers of slave revolt made people more inclined to emotion-based political decision-making rather than reason-based thought.  In his role as one of the secession commissioners sent by the early-seceding states to coax other Southern states to join their cause, Stephen F. Hale of Alabama expressed many Southerners’ fears in portraying Lincoln’s election as “[inaugurating] all the horrors of a San Domingo servile insurrection [referring to the bloody slave liberation struggle during the era of the French Revolution in the country later known as Haiti], consigning her citizens to assassinations and her wives and daughters to pollutions and violation to gratify the lust of half civilized Africans.”  Slave emancipation and equal rights for African-Americans would result in the obsessively-feared “amalgamation” of the races through interracial sex, or “an eternal war of races, desolating the land with blood, and utterly wasting and destroying all the resources of the country.” “With us [Southerners],” Hale observed, “it is a question of self-preservation.  Our lives, our property, the safety of our homes and our hearthstones, all that men hold dear on earth, is involved in the issue.”

 

Contention over slavery was also part of a broader web of controversy arising from what many Southerners thought of as a decades-old culture war over the differing social arrangements and values of the North and South.  Pointing to the fact that abolitionists were usually active in other reform movements in the North (that were virtually non-existent in the South), Southern leaders saw these movements as being of one piece with abolitionism, in dangerously seeking to extend the concept of human rights and opportunity to out-groups—often singling out the feminist movement in this regard—and in seeking to lure people away from traditional social relations and socioeconomic structures.

 

Southern leaders sought to stereotype reform activism as more radical and pervasive in the North than was actually the case.  The historian Manisha Sinha observes that the North was portrayed as an atheistic land “in which property, marriage, female subordination, religious fidelity, and other pillars of social order were constantly challenged.”  The fire-eater James D.B. De Bow on the eve of secession spoke of Southerners as “adhering to the simple truths of the Gospel and the faith of their fathers, they have not run hither and thither in search of all the absurd and degrading isms which have sprung up in the rank [Northern] soil of infidelity.  They are not Mormons or Spiritualist [sic], they are not Owenites, Fourierites, Agrarians, Socialists, Free-lovers or Millerites.”

 

The South’s dominant approach to the Christian religion produced an atmosphere conducive to the fearful imaginings of a culture war, and reinforced the sorts of personality characteristics that Jost believes are conducive to the formation of right-wing conservative political beliefs.  The tendency to demonize the North and its reform movements grew, in part, from a more general Manichean outlook emphasized within Southern churches, which divided the world into harshly drawn categories of good and evil.  In this perspective, the only alternative to a slave-based traditional way of life in the South was considered apocalyptic ruin.  Thus, the influential Rev. James H. Thornwell of South Carolina thundered that “the parties in this [cultural] conflict are not merely abolitionists and slaveholders—they are atheists, socialists, communists, red republicans, Jacobins, on the one side, and the friends of order and regulated freedom on the other.  In one word, the world is the battle ground—Christianity and Atheism the combatants; and the progress of humanity the stake.”

 

Rather than pressing their congregants to realistically face the moral challenge of slavery, clergymen were crucial in helping Southerners circle their attitudinal wagons in defense of the peculiar institution, using a narrow literal interpretation of the Bible to defend it while insisting that the South had a God-given stable order far superior to that of the urbanizing “ism”-plagued North.  Seeking to contain the moral ferment of the North and its potentially destabilizing effects, clergymen focused their congregants on personal salvation, shunning the direction of many Northern churches that sought to reform society in order to remove impediments to godliness.

 

Rather than seeking to treat certain passages of the Bible as allegory or historical custom in order to reconcile Scripture with more conscientious attitudes toward slavery (and to reconcile with scientific discoveries such as the geological findings proving a much older earth than that found in the Bible’s accounts), clergymen sought instead to squeeze reality into a rigid literal interpretation of the Bible.  Thus, they offered their adherents a way of escape from the use of reason and dialogue in performing moral calculus, and demonstrated how to ground their beliefs and behaviors in what made them feel emotionally secure in the present.  This helped hide the immorality of slavery from Southern eyes, and it hid the falsity of Southerners’ belief that their security depended on the continuation of slavery.

 

This religious approach reinforced the closed nature of a society in which dissenting viewpoints and the give and take of civil dialogue were only tolerated when concerning issues that did not challenge the structures of society.  This was most particularly the case from the early 1830s onward, as the rise of abolitionism and other reforms prompted the highly defensive approach to issues that would mark the final decades of the antebellum era.  Using the “isms” of the North as a foil, Southern right-wing leaders began to frame slavery as a “positive good,” in contrast to the previous view of slavery as a morally troublesome institution that was nevertheless too difficult to abandon due to economic and security concerns.  Leaders thus placed the South on a fateful path, rejecting what had been a relatively open political culture in which civil discussion about weaning the South off slavery had been tolerated.  

 

Believing themselves to be sitting atop a social volcano, Southern leaders sought order by isolating the region’s people from the culture of what they considered to be a decadent North and indeed from a decadent Western civilization in general.  Psychological, economic, and physical bullying upheld the closed nature of the South, as did the forceful exclusion of antislavery newspapers, magazines, and pamphlets—backed by the federal government’s acquiescence in banning the distribution of supposedly incendiary works through the mail in Southern states.  Those who voiced heterodox views could readily find their businesses shunned and their standing in society ruined.  “Vigilance committees” filled with men itching to prove their masculinity sprang into action at rumors about whites advocating abolition and rumors that slaves might be planning to revolt.  Many were forcibly exiled from the region, beaten up, or hanged as a result.

 

The suppression of thought in the antebellum South helped ensure that the individual dispositions of many whites would gravitate toward Jost’s categories of fear, dogmatism, intolerance of ambiguity, uncertainty avoidance, need for cognitive closure, and support for group-based dominance, all of which left many Southerners ill-equipped—in their lack of knowledge and in their fearful, conformist attitudes—to stand up to the fire-eaters during the secession crisis of 1860-1861.  Seeking order above all, the South disallowed the possibility of evolutionary change from the 1830s onward, obtaining as a result bloody revolutionary change in the 1860s.

 

Ideally, more history and political science professors will recognize the value of using historical case studies as a lens for better understanding the psychology of right-wing conservatism, particularly in understanding its mass appeal.  Given the propensity of right-wing conservatism toward authoritarianism, such a curricular element could help sensitize citizens to the potential danger that the emotion-driven inclinations characteristic of conservatism pose to the maintenance of a civil democratic polity.  I would recommend the study of the social psychology of past right-wing political movements as a normative element in college-level U.S. history courses and other humanities and social sciences courses geared toward teaching the foundations of civic knowledge.  Ideally, an awareness of the social psychological basis of right-wing conservatism might even suffuse down into the quotidian teaching of history and civics on a secondary school level.  The ascendancy of Donald Trump illustrates how important it is to increase our citizens’ ability to take such historical issues into account in examining current events.

 

]]>
Mon, 01 Jun 2020 19:38:59 +0000 https://historynewsnetwork.org/article/175623 https://historynewsnetwork.org/article/175623 0
No More Business as Usual! It’s Time for Joe Biden to Defend our Democracy

Senate Counsel Joseph Welch and Sen. Joseph McCarthy, June 9, 1954

 

In 1996 Recep Tayyip Erdogan, now the de facto dictator of Turkey, likened democracy to a streetcar.   “When it arrives at your stop, you get off.”  In 1928, a few years before Adolf Hitler seized power, Joseph Goebbels announced, “We come as enemies!... As the wolf attacks the sheep. That’s how we come.” 

Leaders with authoritarian instincts typically do not disguise their intentions. Their bravado stems from the loyalty of their base, but more importantly, the belief that defenders of democracy will succumb without a fight. 

Donald Trump does not claim to be a dictator, but he behaves as if he already wields unchecked authority.  Will Joe Biden fight? 

In recent weeks, President Trump fired five inspectors general, career civil servants, who are appointed by the president to ferret out administrative and criminal cases of waste, fraud, mismanagement, and misconduct in executive agencies.  The first, on April 3, was Michael Atkinson, inspector general for the intelligence community, who last year judged the anonymous whistleblower’s complaint as "urgent," "credible," and worth investigating.  Next, on April 7, came Glenn Fine, acting inspector general of the Department of Defense, who chaired the committee charged by Congress to investigate fraud in the $2.2 trillion pandemic-related programs.  On May 1, the president fired Christi Grimm, Principal Deputy Inspector General of Health and Human Services, after she reported serious supply shortages and testing delays at hospitals during the coronavirus pandemic. Then Steve A. Linick, inspector general of the State Department, was dismissed last Friday. Linick’s firing may be linked to his inquiry into Mike Pompeo’s misuse of staff for personal tasks. Or it may have been a response to two investigations related to Saudi Arabia: Pompeo’s approval last year of billions of dollars in arms sales to Saudi Arabia in defiance of Congress and an effort, originally led by General Michael Flynn, to bypass established channels and export nuclear technology to Saudi Arabia.  The fifth dismissal, on May 15, removed career civil servant Mitch Behm from his post as Acting Inspector General of the Department of Transportation (DOT).  Behm was reported to be investigating conflicts of interest related to Elaine Chao, Secretary of the DOT and wife of Mitch McConnell.  His replacement, Howard R. Elliott, is the Administrator of the Pipeline and Hazardous Materials Safety Administration (PHMSA) and reports to Elaine Chao, Secretary of the DOT.  Because Elliott will remain in his first job while overseeing the DOT in his new position, he will report to the head of the department he is inspecting.  

All five inspectors general were fulfilling their duty either to protect Americans from a pandemic or to curtail executive authority. These dismissals sent up warning flags about the fragility of our democracy. They show, yet again, that Trump expects officials to put loyalty to him above the law. 

History has many examples of heads of state who demand unconditional obedience, but they are kings and dictators, not presidents of a robust democracy. Trump’s extreme behavior offers Joe Biden a chance to demonstrate the leadership skills he says the nation needs. 

Until the latest revelations about Linick’s firing, Biden displayed little concern about the purge.  The basement-bound Biden continued his benign stance and assured us that, “Our best days still lie ahead.” He smiled and modeled decency, while describing a smorgasbord of 35 proposals aimed at specific demographic and interest groups. By campaigning as not-Trump, Biden allows his rival to frame the conversation while he plays catch-up.  No wonder Trump gloated after a Biden speech: “When a man has to mention my name 76 times in his speech, that means he’s in trouble.”  

In the “war room” of the Democratic National Committee, too, it was business as usual. Staffers hyped debate about Biden’s likely running mate and announced a new digital initiative, the soul squad, honoring Biden’s promise to restore the country’s soul.  

Yet, to restore our soul, we need a robust democracy.  

The time has come for Joe Biden to convert his basement into a command bunker and make the war room live up to its name.  On Sunday he declared, “We’ve got to demonstrate respected leadership on the world stage.”  Democrats need that leadership now, as the President continues to purge high officials who speak the truth – justifying such purges by ranting incoherently about the “worst crime in history.”  

Others are modeling civil courage.  Over 2,000 former members of the Department of Justice and FBI called for William Barr’s resignation.  On Saturday Department of Justice Inspector General Michael Horowitz issued an unqualified rebuke of his boss, President Donald Trump, for firing Atkinson. Biden could also take a page from Senator Chris Murphy: “Using foreign aid to destroy rivals. Weaponizing the judiciary. Firing all the inspectors general. Democracies begin to die when a leader starts to destroy the limits on his power, and his faction decides that he is more important than the republic. Welcome to that moment.”  

Joe Biden can take his place in a revered American tradition of patriots who stood up to demagogues.  Think of Joseph Welch, the attorney for the U.S. Army in 1954 when Senator Joseph McCarthy leveled unfounded charges of Communist infiltration in the highest levels of the government. Defending the Army against libelous accusations, Welch said: “Looking at you, Senator McCarthy, you have, I think, sir, something of a genius for creating confusion—creating a turmoil in the hearts and minds of the country…. Have you no sense of decency?”  McCarthy’s reign of fear was finished. 

Turkey in 1996 and Germany in 1928 had only briefly experienced liberal democracy. By contrast, as the Joseph Welch example reminds us, the United States has a distinguished tradition of resilient patriotism. If he rallies Americans to defend our democracy, Joe Biden can take his place in that tradition.  

]]>
Mon, 01 Jun 2020 19:38:59 +0000 https://historynewsnetwork.org/article/175621 https://historynewsnetwork.org/article/175621 0
The Pandemic Exposed a Major Flaw in America’s Health Care System  

 

 

The coronavirus emergency exposed a serious shortcoming in America’s health care system: insurance provided through the workplace. The U.S. approach to health security, an unusual arrangement among modern societies, did not serve the American public adequately during flush times. Now, in hard times, it is glaringly insufficient. The system is in desperate need of reform.

The United States’ largely employer-based insurance arrangement became prominent during World War II. Millions of workers left home to serve in the armed services, creating labor shortages. President Franklin D. Roosevelt worried that inflation would spike because companies needed to raise wages to compete for an inadequate supply of workers. He responded with an executive order in 1942 that froze wages. Employers managed to attract laborers nevertheless by offering employer-sponsored health insurance. In 1943 Congress made that benefit tax-exempt, an action that reduced the cost of health coverage for individuals and families. In the prosperous decades after World War II, numerous American businesses provided health insurance as an employment perk.

World War II contributed to the formation of a different approach to health insurance in Europe. Economies were in ruins after the war. The Europeans could not establish employer-based systems in an unstable business environment. Programs that emerged on the European continent and in the United Kingdom varied greatly and sometimes included large components of private insurance. But governments served as the primary organizers of medical insurance. Government-sponsored programs brought universal health care to the public. 

The Europeans were more aggressive than the Americans in keeping health costs under control. Washington had less political clout than European governments when dealing with medical industries. In the USA, special interests, including insurance companies, doctors, hospitals, drug companies, and medical device firms, merged, lobbied, and competed tenaciously for profits. In this fragmented, free-market environment spending on medical care surged. Health care in the United States became, by far, the most expensive arrangement among modern developed nations. Health care spending represents 18% of gross national product (GNP) in the USA. Switzerland, second highest, spends 12% of GNP on medical care. 

America’s employer-based insurance system served nicely during the booming Fifties and Sixties, but over the years changes in employment practices left more and more people with inadequate coverage. Years ago, millions of American workers received impressive medical benefits secured by unions. No longer.  Union membership has declined substantially. Many Americans now work independently as contract workers or consultants. They must secure health insurance on their own, often at considerable expense. Others hold temporary positions that provide no coverage or limited coverage.

Then the pandemic hit, making employment less secure. In just a few months, businesses placed numerous workers on furlough or released them completely. Unemployment claims jumped to 38.6 million in nine weeks. Workers in restaurants, retail stores, hospitality industries, airlines, manufacturing, and other fields joined the ranks of America’s unemployed. The coronavirus produced more than just a job crisis. It created a health insurance crisis. 

A survey in February 2020 revealed that many Americans suffered from medical insecurity before the Covid-19 emergency. The NBC News/Commonwealth Fund reported in early 2020 that one in three Americans worried they could not afford health care. One in five respondents said they had problems paying or were unable to pay medical bills during the past two years. Now, with the pandemic destroying jobs, additional millions are insecure. 

During the Twentieth Century, reform-minded political leaders in the United States tried to improve health security, but they were unable to achieve what European societies accomplished. Franklin D. Roosevelt wanted to include medical insurance in the 1935 Social Security bill. He dropped the plan to win support in Congress. Harry S. Truman sought universal coverage as part of his Fair Deal but encountered enormous pushback from the American Medical Association and other organizations. A huge Democratic victory in the 1964 elections helped Lyndon Baines Johnson and Congress to establish Medicare and Medicaid in 1965, yet those measures targeted only a portion of the total population. In 1994 President Bill Clinton and his wife, Hillary, had to back away from a bold plan to expand coverage after encountering fierce opposition. 

The program created by President Obama and Democratic lawmakers in 2010 has been under attack in recent years. GOP lawmakers and conservatives at the Supreme Court chipped away at Obamacare’s provisions. Now Republican governors and President Trump threaten to take down Obamacare entirely. On May 6, Trump said his administration would continue backing a constitutional challenge at the Supreme Court. “We want to terminate health care under Obamacare,” Trump declared. The president and Republican legislators have long targeted the Affordable Care Act but never identified a comparable replacement. If the GOP’s challenge at the Supreme Court succeeds, twenty million Americans will lose medical insurance provided through the Affordable Care Act.

The United States is the only modern, industrialized nation in the world without universal health coverage. Its insurance system, based primarily on employer-sponsored programs, failed to protect the public before the pandemic. With unemployment now approaching Great Depression levels, the system is woefully insufficient. It needs radical overhaul. Reform will not be generated by the Trump administration or the current Senate leadership. The system might get an overhaul if Democrats win control of the White House and both houses of Congress.

In view of the long history of unsuccessful efforts to reform health insurance in America, many pundits expect powerful groups will once again crush proposals for change. But the severity of the present economic downturn could make a difference. The United States is now in a position somewhat like Europe after World War II. Europeans dealt with the crisis of smashed economies by establishing government-organized health insurance. Now American society is reeling from fierce attacks by an invisible enemy. Covid-19 damaged American businesses. It produced a severe employment problem and a related medical insurance problem. Employer-sponsored health insurance now appears less promising because many jobs are at risk. The present emergency could excite broad-based support for universal health coverage, a reform Europeans embraced long ago.  

During the 2008-2009 financial crisis, Rahm Emanuel, President Obama’s Chief of Staff, made an insightful comment that is relevant to the current situation. “Never allow a good crisis to go to waste,” advised Emanuel. “It’s an opportunity to do the things you once thought were impossible.” 

]]>
Mon, 01 Jun 2020 19:38:59 +0000 https://historynewsnetwork.org/article/175628 https://historynewsnetwork.org/article/175628 0
"This Too, Shall Pass." History, and Life, Say So!

 

 

Our title, the quoted part used by Lincoln and in the Bible, has it exactly right. The passage of time can change everything. There can be wartime. Then it’s over and “real life” begins again. Those who lived through a major war know that truth entirely too well, on the one hand, yet gratefully in another and important sense: It ended! For many of us there was the decade that America suffered, that time of no-kidding Depression, 1930 to 1942. It was an era of real concern, one that disturbed nearly one and all. Later came a time that had children ducking under their desks, a time that was called The Cold War. Some in the know worried through it; others didn’t understand things atomic.

Going back to 1917-18, an ugly yet so patriotic war came, and that awful World War I of poison gas and trenches and machine guns ended in peace at last with a still memorable Armistice Day. I would be born during that war (October 10, 1917), but would live happily through two decades of peace before the next one. Things change, thank goodness. (Change can, of course, be for the better—or the worse.)

Some eras begin but don’t seem to come to a close. The Atomic Age started with secret preparation and then actual use of two atomic bombs on a people far away who attacked us. They were people who lived a very different kind of life in far Japan. Postwar, time and effort brought great change. A new day seemed off to a start with the United Nations, a day that had started weakly with the League of Nations. All in all, the concept was that the idea of “nationhood” might shrink with the end of international conflict, and the beginning of international understanding. The League was off to a shaky start in the Twenties. Its successor was organized at the close of World War II. New enthusiasts and patriotic nationalists differed for a long time. Real change had not come, the kind Woodrow Wilson hoped for. But the U.N. lived on.

All kinds of “ages” get started, unexpectedly, and affect us profoundly, then may get taken for granted. The time of gasoline-powered automobiles is a good example. Horse-drawn vehicles seemed to work well enough. But gradually Henry Ford and so many others provided an alternative. Eisenhower thought interstate highways a good idea, and we were at last united across and up and down our landscape. Lady Bird Johnson tried to get us to concentrate on beauty in the terrain, arousing some interest.

It was in President Nixon’s administration that enthusiasts won an Environmental Protection Agency and high hopes lived for an ERA. Trade unions tried mightily to change working conditions in America, but problems never died out, somehow. And the time came when young people moved from place to place with augmented telephone gadgets capable of uniting one instantly with a person of choice. Being alone was to be out of date. Amazing!

The age of the coming and total triumph of radio—first AM, then FM—changed evenings, then daytime. Radio changed our whole lives--as we listened to comedy, "popular" then "country" music of all kinds, hour after hour. All of a sudden, it was the Age of Television (even color!), and we were hooked into a new way of learning and living and communicating, all inside our homes (and the local bar).

Our kitchens changed and, with that, our very food! The microwave profoundly altered the lives lived by single persons. New gadgets and conveniences arrived everywhere: the ease of keeping clean with washers and dryers, and soaps and shampoos, and electric razors, and new materials for clothing. En route, it became surprising, really impossible, to lag behind still dirty, unkempt, ignoring change forced by a variety of inventions.

Malls changed our shopping profoundly. Reclining chairs; electric clocks; changes over the decades in musical instruments and how they got used; new habits in recreation phased in: skiing, for example. Games, puzzles, comics and major changes in newspapers brought changes in their readers. Magazines, readable and illustrated, changed dramatically as Life phased in during the mid-thirties, and Time joined in changing the communication of information permanently, it seemed. "Commentators" moved in on the airwaves and columnists could be aggressive with their editorial efforts. Many cared deeply what we thought. (It was not always like that.) 

Radio and TV moved in on newspapers and school teachers noticed. Somewhat aggressive professors had egos that demanded to be heard in books and articles. Many, it seemed, wanted us to be informed and persuaded with or without our awareness. Parents hoping to mold their own heirs free of outside interference were finding the task uphill as education and media reached out hoping to conquer attitudes long advanced and absorbed in one’s home. As for preaching, ministers lost ground in the 20th century without doubt.

None of that, in a way, was "atomic"; that is to say, life and death. Detouring there for a second, coffins gave way substantially to cremation (with savings at the end). Funeral parlors caught on. The entire, variable insurance industry, innovating with all kinds of new concepts (reverse mortgages), involved financial invasion, but did wonders for widows and others when loved ones abandoned their livelihoods. Vitally important was the influence of Medicare and of modern medicine. Vital was the beginning of Social Security payments (in 1940, from the 1935 law); it was a truly revolutionary alteration in national life, no doubt about it.

There was for a time on these pages the title: Time Passes. And why is that fact of considerable interest to us now? First, at this point in the lives of so many at home and abroad, changes in what we do and how we live can add up to making life difficult. That Virus has wiped out our lives as we knew them, very fast indeed. We are overwhelmed. We say to ourselves, "the world changed when we weren't looking!" We put in so much time reflecting morbidly on our passing lives at this crucial, critical, awful time. It was the spring of 2020, and a little before, when the lives of many lost all stability! For nearly unbearable months we developed and endured a vast upheaval in the lives we thought were laid out for us. Time is Passing, whether we like it or not, and at this moment in time and space it's bringing a life that's demoralizing. What to do?

"That Virus"--as a great many have come to refer to it--is now recognized as a Destroyer First Class. There is little need here to run through a list of what it has done: to home life, to that loved place of just being and living daily; to incomes, to all kinds of habits, to people we see daily—and no longer do—to what we used to talk about. To the very level at which we can live! Especially to schools and to much schooling! No need here to keep harping on the devastating Changes that are moving in on habits we thought virtually eternal!

Our ordinary movements seem controlled—no longer ours! Within a day or two a university president conjectured openly that “we may have to close permanently.” The stable institution’s a century and a half old, for Heaven’s sake. Yet the infrastructure is intact, and the need is exactly as before: virtually permanent--unless, unlikely, students turn out to be happy learning on TV, sans live teachers, “all alone.” 

Back to our chosen Title. TIME does indeed pass. Things change. Think about that often, please. I tend to think that even two years of Intermission, or interval, will have trouble permanently harming you and yours. I think idly of those eras when our lives changed profoundly. Nothing permanent.  (Think of when we evolved from our ages 15 to 17….) 

Fortunately, in most cases, our lives didn’t really change forever—although in so many matters it proved better that they did. Take the invention that brought Polio’s downfall; or, say, the dominance of computers on how and what we do. Soon there came a new normalcy. On reflection, what happened was, well, marvelous, changes that became treasured. A new and different life emerged—almost when we weren’t looking. Maybe we got a new house to live in. Different employment. An end to days of woe that came and then went. There have been times when things altered for every one of us, and, by golly, we survived. Opportunities long out of reach were now there for us. (Admittedly, some damage we suffer can be permanent, ugly, unthinkable.  But “the good fight“ can be won.)

An example in my young life: change destroyed my father’s life as a Philadelphia engineer. Houses and cars were seized; he and mother moved in with my sister 50 miles from our town. The Depression crushed us. For a year I lived down our earlier street with a kindly Aunt and her husband. Then my parents moved several thousand miles away and “started over.” I joined them. Soon, he was happily productive; my life was changed by my major newspaper delivery job and I even washed the school’s lunch dishes (but hardly noticed).  I had new friends in school, different from kids I had known, but who cared? Time had indeed passed, and still was, but in my case, life went on.

You, dear reader, are at this moment in a position to return to a paragraph of your choice above, say 24 months from now. You will have the happy opportunity, then, of seeing how CHANGE ultimately did well for you and some friends. It probably wasn't able to really wreck everything. Nearly all those alterations, let's hope, were endurable. Now: you can, indeed, you must, recall when you first started thinking to yourself: "This, too, shall pass." Time changes everything. Most of us, I expect, will be living still in 2022 or 2023. That virus will quite possibly be conquered in one way or another. Scientists worldwide are its dedicated enemy, maybe as never before. Yet: maybe in essentials that extraordinary virus will survive quite a while as good as New. No. No. Let's Hope and work for its early demise due to effective vaccines, and better treatments changing both onset and time of contagion. And we—behaving ourselves as told?

Join me in this: things don't remain the same. Believe it. Whatever is devastating our people at this time, well, the era is ahead when, just maybe, it won't. Looking at the history of your own family—a decade or more ago, then living uneasily in a European country, a South or Central American small nation, when in a hostile City, or endlessly working by the hour in some fields—Time may have brought improvement for you. Let's hope so.

In my case, my life was uneven! There was unexpectedly losing a rewarding job and then came one I disliked intensely. There was then a morning in 1977 when a heart infarction surprised me and proved devastating. In no time at all, my world was entirely different with half my heart gone. A few years made for a giant difference. Endurable for me and mine. First came survival. Next, circumstances changed. Yet: that virus doesn't own us forever, however awful things may seem to be at some passing moment.

You and I can count on time passing. We both can count on it.  Endless moping is for the needlessly disheartened. Be thoughtful about a world in which “things change.”  That damn Virus will be set back on its butt (to use language that communicates). Either a New or an Old Normalcy will return—count on it. Maybe you’ll love some of the changes that come—but I admit that’s a lot to expect! 

Many who now enjoy radio or TV once said, "Turn off that racket." Those who loved horses abhorred early autos. Many said: "I like the appliances I have." Or, "As a woman, I really enjoyed working during the Forties in that plant. I was happy then." It's unlikely, but maybe Virus Change will alter one or two things for the better; or not! Somehow, I rather doubt it.

I just watched Vera Lynn, heroine vocalist of World War II, singing the hopeful songs of survival that meant so much to the British (and us) back then—The White Cliffs of Dover: “Tomorrow, just you wait and see”—and others that make one tear up. “We’ll meet again, don’t know where, don’t know when….” Those were awful days for our coming Allies. The years 1939 to 1945, ARE GONE! (For those who won, and those who lost, alike! Remembered sadly are the dead boy heroes from Oxford and Cambridge who fell.)

“Good fortune attend ye,” ‘tis said in Ireland and by the Irish everywhere. I don’t have to tell you, the reader, that Times Change. Admit it: many of the changes turned out to be “for the better.” So, go ahead and memorize the idea that “Time Passes.” Consider this slogan as untoward events seize us and try to be permanent: History teaches that “This, too, shall pass.” 

]]>
Mon, 01 Jun 2020 19:38:59 +0000 https://historynewsnetwork.org/article/175684 https://historynewsnetwork.org/article/175684 0
Pandemic, Pandering and Prejudice: Trump's Effort to Weaponize Xenophobia

 

 

Trump has now officially blamed China for the coronavirus. This should not be surprising: his business and political history has been riddled with xenophobia—a pathological fear or hatred of foreigners. He has long used paranoid fear of foreigners to obtain business or political advantage.

 

In 1993, he appeared on the Don Imus radio show. Asked what he thought about an Indian tribe’s plans to open a casino in New Jersey that would compete with his Atlantic City casino business, he thundered: “A lot of these reservations are being, in some people’s opinion, at least to a certain extent, run by organized crime elements. There’s no protection. There’s no anything. And it’s become a joke.” 

 

Imus mentioned the Mashantucket Pequot Tribal Nation, which had opened the successful Foxwoods Resorts Casino in Mashantucket, Connecticut, in 1991. “I think if you’ve ever been there,” said Trump, “you would truly say that these are not Indians. One of them was telling me his name is Chief Running Water Sitting Bull, and I said, ‘That’s a long name.’ He said, ‘Well, just call me Ricky Sanders.’” The tribe termed Trump’s comments disrespectful and racist, and filed a complaint against him.

Later in 1993, Trump testified before Congress that organized crime "is rampant—I don't mean a little bit—is rampant on Indian reservations." In that testimony, he also delivered the offensive line he would repeat on television again and again, claiming that the Mashantucket Pequots did not look like real Native Americans. "They don't look like Indians to me," he testified. "They don't look like Indians to Indians." Trump's remarks about organized crime on Indian reservations, of course, had no foundation.

 

Coming down the escalator of Trump Tower in June 2015 to kick off his presidential candidacy, he complained that Mexicans were not "sending us their best people." He did not claim the Mexicans were diseased, but rather that they were criminals. "They're bringing drugs. They're bringing crime. They're rapists." The speech on the escalator was the harbinger of "truth decay" in our political discourse. By "truth decay," I mean the willingness to smear any person or any group, or to lie about any event, to obtain political advantage.

 

Before the virus hit, Trump's re-election seemed assured. He had the better part of the economic argument, with unemployment at its lowest in half a century, the stock market soaring to record highs, the economy at a boil, and his probable opponent a self-avowed Socialist. Then the ground shifted pretty quickly, which is what is so fascinating about politics. Biden became the opponent. Tough to call him Socialist. More than 1.6 million Americans have now been infected with coronavirus, nearly 95,000 have died, and more than 38.6 million people have filed for unemployment in the last nine weeks—the highest levels since the Great Depression. Trump had to quickly resurrect a campaign that was on a ventilator. The solution, the darling of his nursery: xenophobia.

 

Trump is a day trader. In March, Trump called Covid-19 the "Chinese virus," because he saw political advantage in it. Then he dialed it back, praising Xi Jinping for his handling of the pandemic. Most recently, he claimed he had evidence for the unsubstantiated theory that the virus arose out of a lab accident in China. Secretary of State Pompeo claims there is "enormous evidence" supporting the lab accident theory.

 

Trump's virus policy has become xenophobic without even the virtue of consistency. It is about short-term politics, not policy. In a late-night tweet, he said that he would "temporarily suspend immigration into America," a variant of "last one in, close the door." With Trump, there is often space between the cup and the lip. His executive order that followed was full of loopholes, possibly reflecting the business community's concerns about the obviously essential nature of immigrant labor to the economy. Trump clearly plans to make the falsehood that China is responsible for the virus a feature of his presidential campaign, deflecting attention from his own mismanagement of the problem as "Doctor in Chief" and painting "Beijing Biden" as soft on China while casting himself as the true hardliner.

 

Strange that we identify diseases with foreigners. Hurricanes and typhoons are named after men and women in alphabetical order, not after the country where the storm system arose. Famines in history have been largely unnamed, even though two of the most egregious famines of the 20th century were the products of man's inhumanity to man: the Nazis' starvation of the Warsaw ghetto and their nearly 900-day blockade of Leningrad (now St. Petersburg).

 

Trump knows that xenophobia plays well with his political base. Throughout history people have conflated foreigners and disease. In the modern world, there is a kernel of truth to the idea. “International mobility is central to the globalization of infectious and chronic diseases,” reported the World Health Organization in 2007. Pandemics and globalization go together like a horse and carriage with foreign policy waiting in the wings. No one doubts that international travel can spread disease, but why must it be assumed that it is “foreigners” infecting us, and not home-grown Americans returning from abroad?

 

When the Bubonic Plague infected Europe in the mid 1300s, there was no CDC or FDA. There was no science to test the population or to race for a cure. There was no Dr. Fauci to recommend quarantining the population either. But there was plenty of fear and hatred of foreigners to go around, just as there is today. The society at the time blamed foreigners, beggars, lepers, gypsies and Jews for the 200 million who died in the scourge.

 

Blame went so far as to stigmatize peoples identified with the origin of pandemics. And those stigmatized were invariably others, foreigners, ethnics, them, people not like us. 

And so, syphilis was called the "French disease." The first written records of an outbreak come from 15th-century Naples, following a French invasion during the Italian War of 1494-98. But the sting of the stigma was that syphilis implicated sexual promiscuity, for which the French were (and still are) supposedly notorious.

 

People blamed the Yellow Fever epidemic of 1793, which decimated Philadelphia, on immigrants arriving on ships. It took only two months for the epidemic to kill 5,000 of the city’s 21,000 inhabitants. According to publisher Matthew Carey, Philadelphians practiced social distancing, and refused to shake hands with their neighbors. A century later, scientists established that the infection was spread not by immigrants but by mosquito bites. 

 

There followed the Cholera pandemic of 1832, blamed on unsanitary conditions in Asia; the "Russian" flu of 1889; the "Spanish" flu of 1918 (the first recorded cases of which were ironically in Kansas), which killed 675,000 Americans, and even infected President Wilson (he survived); the "Asian flu" of 1957; the Hong Kong flu of 1968; the "West Nile" virus discovered in Uganda in 1937; the Ebola virus of 1976 (named for a river in the Congo; it was to have been named the Yambuku virus after the village where it first surfaced, but scientists named it after the nearby river to avoid stigmatizing the village); and now the "Wuhan virus," the pejorative coined by our Secretary of State, who is commissioned with promoting stable relations with foreign countries.

 

Trump has become the inflection point of our national anxieties, which is exactly what he wants. No one wants to die of this deadly virus, and the president, along with the governors and mayors, should keep the public fully informed with honest answers, not political tropes. 

 

But blaming China and threatening reprisals is not going to save any lives. It won’t open up the country any sooner. It won’t create a single job. But it might just get Trump re-elected. Despite a 45% approval rating. At least he thinks so, and that’s what our politics is sadly all about. 

]]>
Mon, 01 Jun 2020 19:38:59 +0000 https://historynewsnetwork.org/article/175626 https://historynewsnetwork.org/article/175626 0
Citizen’s Arrest: Racist at its Roots

 

 

The video-recorded murder of Ahmaud Arbery, a young Black man killed by two white vigilantes while jogging near Brunswick, Georgia, has focused attention on Georgia’s Civil War era Citizen’s Arrest law. 

 

The current version of Georgia Citizen’s Arrest Law, 17-4-60 (2010), states: “A private person may arrest an offender if the offense is committed in his presence or within his immediate knowledge. If the offense is a felony and the offender is escaping or attempting to escape, a private person may arrest him upon reasonable and probable grounds of suspicion.” A private individual who makes a “citizen’s arrest” is instructed “without any unnecessary delay” to “take the person arrested before a judicial officer . . . or deliver the person and all effects removed from him to a peace officer of this state.”  

Georgia's laws were formally codified in 1861 by Thomas Cobb, a lawyer and slaveholder who died at the Battle of Fredericksburg in 1862. It was the first formal codification of state common law in the United States. It was also racist. In the original code, African Americans were assumed to be enslaved unless they could prove free status. Georgia's Citizen's Arrest statutes were first entered into the Law Code of Georgia in 1863.

 

Thomas Cobb was the author of An Inquiry into the Law of Negro Slavery in the United States of America (1858). In the book, Cobb argued “[T]his inquiry into the physical, mental, and moral development of the negro race seems to point them clearly, as peculiarly fitted for a laborious class. The physical frame is capable of great and long-continued exertion. Their mental capacity renders them incapable of successful self-development, and yet adapts them for the direction of the wiser race. Their moral character renders them happy, peaceful, contented and cheerful in a status that would break the spirit and destroy the energies of the Caucasian or the native American” (46). Cobb’s views on race and slavery shaped the Georgia legal code.

 

The Law Code of Georgia was heavily revised after the 1865 passage of the Thirteenth Amendment ended slavery in the United States, and it has been revised and reenacted a number of times during the last 150-plus years. However, the Civil War era citizen's arrest provision remains.

 

In 1863, when the citizen's arrest provision was added to the Law Code, slavery and Georgia law enforcement were in serious disarray. Georgian units in the Confederate army were primarily stationed in Virginia. The Union army was preparing to invade the state from Tennessee. Enslaved Africans were fleeing plantations to join Union forces. Confederate deserters were hiding in inaccessible Appalachian counties in the north and swamp regions of the south. With the state's criminal justice system collapsing, the 1863 code revision empowered white Georgians to act in place of law enforcement and slave patrols to keep the enslaved Black population under control.

 

After the Civil War, citizen’s arrest supported Ku Klux Klan violence against Black Georgians. In 1868 alone, there were over 300 reported cases of the Klan murdering or attempting to murder Georgia’s black citizens.  According to the Mary Turner Project database, there were 454 documented lynching murders of Black men and women in the state of Georgia between July 29, 1880 and April 28, 1930.  The Fulton County Lynching Project lists 589 documented lynchings between 1877 and 1950.

 

Lynchings of African American men and women in Georgia by white mobs making "citizen's arrests" have a particularly gruesome history. On January 22, 1912, four African Americans in Hamilton--three men and a woman--were seized under the guise of a citizen's arrest and lynched, accused of killing a white planter who was sexually abusing Black girls and women. On July 25, 1946, two African American couples were dragged from their car at Moore's Ford in Walton County and shot about sixty times by a mob of white men making a "citizen's arrest." No one was ever charged with their murders.

 

At 1 PM on February 23, 2020, a 25-year-old black man named Ahmaud Arbery was killed by two white men, Gregory McMichael and his son Travis McMichael. His executioners told police they were in the process of making a citizen's arrest because they believed Arbery fit the description of a man suspected of break-ins in Glynn County. Gregory McMichael was a police officer in Glynn County in the 1980s and an investigator in its district attorney's office until he retired in 2019. Police and local prosecutors let the men go without further investigation until a national uproar over the killing three months later.

]]>
Mon, 01 Jun 2020 19:38:59 +0000 https://historynewsnetwork.org/article/175619 https://historynewsnetwork.org/article/175619 0
A Memorial Day Lament for Capt. Wilfred Owen, Sgt. Joyce Kilmer, and the Needless Dead of Foolish Wars

 

 

Since Memorial Day is a holiday to honor U. S. veterans who died in war, it is fitting to honor one, the poet Sgt. Joyce Kilmer. And since even more English families lost loved ones in the war in which he died (WWI), it is also proper to memorialize the English poet Captain Wilfred Owen. 

 

A main problem with many war accounts is their failure to evoke the full tragedy that people suffered. As one historian has written, “Among the major [European] combatants, it is not an exaggeration to suggest that every family was in mourning: most for a relative–a father, a son, a brother, a husband–others for a friend, a colleague, a lover, a companion.” 

 

Ian McEwan’s novel Black Dogs (1993) captures well the scope of such tragedy when he writes of his main character: “He was struck by the recently concluded war [World War II in Europe] not as a historical, geopolitical fact but as a multiplicity, a near-infinity of private sorrows, as a boundless grief minutely subdivided without diminishment among individuals who covered the continent like dust…. For the first time he sensed the scale of the catastrophe in terms of feeling; all those unique and solitary deaths, all that consequent sorrow, unique and solitary too, which had no place in conferences, headlines, history, and which had quietly retired to houses, kitchens, unshared beds, and anguished memories.” 

 

At least, however, the war McEwan writes of (WWII) was fought against a major evil, Adolf Hitler's Nazi regime, and such a fight was more justifiable than WWI. Kaiser Wilhelm II's erratic, insecure, and belligerent personality shares part of the blame for the war's start in the summer of 1914, but none of the other European leaders did all they might have to prevent the conflict from occurring. They had all valued security, prestige, influence, and allies more than peace, and they had all failed to imagine the full scale of the tragedy they were about to unleash.

 

Recalling on this Memorial Day the lives of just two soldiers who died, Owen and Kilmer, can help at least open the symbolic gates of that massive WWI millions-of-bodies cemetery.

 

Wilfred Owen was born in 1893 in Shropshire, the English county that sits east of Wales. The youth of this shy, sensitive boy was spent in Shrewsbury and in Birkenhead, near Liverpool, the two towns where he received most of his education. He was the oldest of four children, and his father worked for the railway, serving for a time as stationmaster of a small station. Paul Fussell, in his widely praised The Great War and Modern Memory (1975), tells us that his father's "salary sufficed to maintain the family in genteel poverty," and that as a boy Wilfred was "closer to his mother," who was "pious, puritanical, and strong-willed." The poet C. Day Lewis tells us that "his relationship with his mother… remained the closest one in his short life."

 

Although she encouraged him to pursue a religious vocation, he was from a young age determined to be a poet, a choice his father frowned upon, insisting poetry would not provide an adequate living. Although attending a few university classes in Reading, Wilfred could not afford a full higher education. Instead, from 1911 to 1913 he served as a lay assistant to the Vicar of Dunsden (near Reading). In that capacity, he sometimes visited poor families and was disappointed that the Anglican Church could not provide more aid to the needy. 

 

Later in 1913 he went to Bordeaux, France, first to teach English in a Berlitz School, and then in 1914 to act as a tutor in a French family. He did not return to England until September 1915, at age 22, thirteen months after WWI had begun. A month later he enlisted in the Army, and the following June he received his officer’s commission. Not until January 1917 was he sent to France, specifically to the trenches of the Somme battlefield. There, over more than four months in the previous year, the British had suffered 420,000 casualties, including 125,000 deaths. If German and French casualties are added, more than one million men were killed or wounded on that battlefield from July to November.

 

The exact nature of Owen's private life from 1914 to 1916 is disputed, with some sources suggesting or stating he was gay. More important for our purposes, however, is his development as a man and a poet. Day Lewis notes that in looking at Owen's poetry of 1917 and 1918 he found himself "more and more amazed at the suddenness of his development from a very minor poet to something altogether larger. It was as if, during the weeks of his first tour of duty in the trenches, he came of age emotionally and spiritually."

 

Another poet and soldier, Edmund Blunden, in his “Memoir” of Owen, quotes letters Owen wrote home. In mid-January 1917 he writes of moving “along a flooded trench. After that we came to where the trenches had been blown flat out…. It was of course dark, too dark, and the ground was… an octopus of sucking clay, 3, 4, and 5 feet deep, relieved only by craters full of water. Men have been known to drown in them. Many stuck in the mud and only got on by leaving their waders, equipment, and in some cases their clothes. High explosives were dropping all around, and machine-guns spluttered every few minutes. But it was so dark that even the German flares did not reveal us.” 

 

Blunden mentions “the winter of 1916-1917 will long be remembered for its scarcely tolerable cold,” and he quotes an Owen letter in later January: “In this place my platoon had no dug-outs, but had to lie in the snow under the deadly wind. By day it was impossible to stand up, or even crawl about, because we were behind only a little ridge screening us from the Boche’s [German’s] periscope.”

 

Fussell sums up Owen’s frontline experiences in 1917 and 1918 this way: “In the less than two years left to him, the emotions that dominated were horror, outrage, and pity: horror at what he saw at the front; outrage at the inability of the civilian world-- especially the church--to understand what was going on; pity for the poor, dumb, helpless, good-looking boys victimized by it all. He was in and out of the line half a dozen times during the first four months of 1917.” 

 

Then in late April 1917 Owen wrote, “For twelve days we lay in holes, where at any moment a shell might put us out. I think the worst incident was one wet night when we lay up against a railway embankment. A big shell lit on the top of the bank, just 2 yards from my head. Before I awoke, I was blown in the air right away from the bank! I passed most of the following days in a railway cutting, in a hole just big enough to lie in, and covered with corrugated iron. My brother officer of B Co., 2nd Lt. G., lay opposite in a similar hole. But he was covered with earth, and no relief will ever relieve him.” Finally, Owen was evacuated; a doctor diagnosed his condition as neurasthenia and forbade him to go back into action. 

 

For the next year Owen found himself in various hospitals or on duty back in England. In a hospital near Edinburgh he met two other English officers and war poets, Siegfried Sassoon and Robert Graves. The older Sassoon, from a wealthy Jewish family, had been wounded in combat and arrived at the hospital after Owen, but not before writing a famous protest against the war, a protest that was read in the House of Commons. It states: "I am making this statement as an act of willful defiance of military authority, because I believe that the War is being deliberately prolonged by those who have the power to end it. I am a soldier, convinced that I am acting on behalf of soldiers. I believe that this War, upon which I entered as a war of defence and liberation, has now become a war of aggression and conquest…. I have seen and endured the sufferings of the troops, and I can no longer be a party to prolonging those sufferings for ends which I believe to be evil and unjust.” Owen admired and was influenced by the older officer and poet, but even before meeting him had expressed his own criticism of the war.

 

While still at the hospital Owen wrote a first draft of one of his most famous poems, “Dulce et Decorum Est.” 

 

Bent double, like old beggars under sacks,

Knock-kneed, coughing like hags, we cursed through sludge,

Till on the haunting flares we turned our backs,

And towards our distant rest began to trudge.

Men marched asleep. Many had lost their boots,

But limped on, blood-shod. All went lame; all blind;

Drunk with fatigue; deaf even to the hoots

Of gas-shells dropping softly behind.

 

Gas! GAS! Quick, boys!—An ecstasy of fumbling

Fitting the clumsy helmets just in time,

But someone still was yelling out and stumbling

And flound’ring like a man in fire or lime.—

Dim through the misty panes and thick green light,

As under a green sea, I saw him drowning.

 

In all my dreams before my helpless sight,

He plunges at me, guttering, choking, drowning.

 

If in some smothering dreams, you too could pace

Behind the wagon that we flung him in,

And watch the white eyes writhing in his face,

His hanging face, like a devil’s sick of sin;

If you could hear, at every jolt, the blood

Come gargling from the froth-corrupted lungs,

Obscene as cancer, bitter as the cud

Of vile, incurable sores on innocent tongues,—

My friend, you would not tell with such high zest

To children ardent for some desperate glory,

The old Lie: Dulce et decorum est

Pro patria mori [It is sweet and fitting to die for one’s country].

 

Before Owen left the hospital he became friendly with a Mrs. Mary Gray, who had an infant daughter. Blunden quotes Gray at length on what a fine man Owen was. She speaks of his “sensitiveness, his sympathy,” his “wonderful tenderness,” and adds that the bond which drew them “together was an intense pity for suffering humanity—a need to alleviate it, wherever possible, and an inability to shirk the sharing of it, even when this seemed useless.” 

 

Despite Owen’s criticism of the war, Fussell tells us that he was anxious “to return to the front although he knew he was going to be killed. Having seen the suffering of the men, he had to be near them. As the voice of inarticulate boys, he had to testify on their behalf.” He returned to France in September, 1918. “In an attack during the first days of October he won the Military Cross. In another attack… on November 4, he was machine-gunned to death.” One week later the war ended, and an hour after it did, with bells still ringing in celebration, his parents received the telegram informing them of their son’s death. 

 

Of the impact of Owen's poetry, most of which was published after his battlefield death, Day Lewis wrote that thirty-five years after first reading it, he realized "how much it has become part of my life and thinking." In addition, while praising other WWI soldier-poets, Day Lewis stated, "It is Owen, I believe, whose poetry came home deepest to my own generation, so that we could never again think of war as anything but a vile, if necessary, evil."

Entering the war later, but dying earlier than Owen was the American poet Kilmer. For him we have two excellent online sources. The first is a memoir by his mother, which also contains some of his poems and letters. The second is a memoir by his friend Robert Cortes Holliday, who was a fellow writer and editor.  

Kilmer was born in New Jersey in 1886 to a well-off family--his father invented Johnson & Johnson’s Baby Powder. Like Owen, Joyce was close to his mother, and her memoir details that relationship and all the honors and accomplishments of her son up through and beyond his college years at Rutgers and Columbia. He graduated from the latter university in May 1908, and the next month married Aline Murray. She was the step-daughter of a Harper’s Magazine editor and herself a poet. Before Joyce went off to war in 1917, the couple had five children, but their oldest daughter, Rose, died shortly before he left. In 1913, she had been stricken with infantile paralysis, an incident that convinced Joyce and Aline to convert to Catholicism. For the rest of his short life his new religion would be very important to him, and by the time he was killed in 1918, he was considered the leading U. S. Catholic poet. 

 

The same year as his conversion saw the publication of his most famous poem, “Trees.”

 

I think that I shall never see 

A poem lovely as a tree. 

A tree whose hungry mouth is prest 

Against the earth's sweet flowing breast; 

A tree that looks at God all day, 

And lifts her leafy arms to pray; 

A tree that may in Summer wear 

A nest of robins in her hair; 

Upon whose bosom snow has lain; 

Who intimately lives with rain. 

Poems are made by fools like me, 

But only God can make a tree.

 

The poem is typical of his simple, straightforward style, as well as his religious sentiments. And although a popular poem with the general reading public, critics do not rank it or him with the best American poems and poets. Nevertheless, his friend Holliday and others depict him as a good and decent man who possessed a fine sense of humor and loved life and his family. Just a few examples from Holliday: “His smile, never far away, when it came, was winning, charming.” “I really doubt very much whether anybody ever enjoyed food more than Kilmer.”    

 

Although several books of his poems appeared during his lifetime, like most poets he could not support his growing family on his poetry alone. Thus, he worked at various jobs, mainly in publishing, editing, and writing reviews and essays, in and around New York City. He also often lectured, at which he was very good.

 

After a brief infatuation with socialism as a young married man, his enthusiasms were much more literary (e.g., for Walter Scott, Charles Dickens, and W. B. Yeats) than political. When WWI began, he was upset at the German invasion of neutral Belgium, but until May 1915, when a German U-boat sank the British passenger liner Lusitania, he was not anti-German. But the sinking, with the loss of 1,198 lives, including 128 Americans, turned him against the Kaiser’s aggression. His poem of that time, “The White Ships and the Red,” reflects his abhorrence at the Lusitania’s sinking with its loss of civilian lives, including women and children, and his growing belief that the war was a battle for freedom. His poem of about the same time, “In Memory of Rupert Brooke,” recalls the death “to make men free” of the well-known English poet and soldier.

 

Kilmer was, however, also critical of the English for their oppression of the Irish, especially their execution of the poet Patrick Pearse and others for their involvement in the Easter Rising of 1916. Nevertheless, soon after the U. S. declared war on Germany in April 1917, Kilmer enlisted and soon got himself transferred to New York City’s “Fighting 69th” infantry regiment, whose men were overwhelmingly Irish-American and Catholic. (Kilmer himself was predominately of English background, but identified more with the Catholic Irish.)  

 

Kilmer and his unit left for France in October 1917, and we get a partial glimpse of the anguish of his wife and parents in his mother's words: "I lived through the months he was in France, as mothers did in that horrible time, writing him nearly every day and sending parcels." As for the poet-soldier's actions and thoughts in France, on the battlefields and off, we get some idea from his few poems and short prose works from France, his more plentiful letters (including to his mother, wife, and two of his children), and Holliday's quotes from chaplains and others who served with him. For example, "He was worshiped by the men around him." "He was absolutely the coolest and most indifferent man in the face of danger I have ever seen." 

 

One of his short prose pieces, “Holy Ireland,” reflects the same basic decency and goodness that we see in so many of his works. It tells of a night he and a group of his fellow soldiers spent in the simple hut of a French widow--her husband killed in the war--and her three young children. “There was a gentle dignity about that plain, hard-working woman.” She fixed them a fine meal, accompanied by ample wine, and they sang songs and hymns in front of a cozy fire. She was delighted that he and his fellow soldiers (Irish-American Catholics) shared her Catholic religion.

 

On July 28, 1918, Sgt. Kilmer led a small group of men to scout out a German machine-gun emplacement in a French wood. A few hours later his body was discovered with a bullet hole through the brain.

 

Of the tremendous agony suffered by his family and friends, we can get some small inkling from the memoirs of his mother and his friend Robert Holliday. Although she wrote “I cannot write of that time” about the moment she received news of her son’s death, she added, “I have written these Memories with no other object than to solace a sorely wounded heart.” She also noted that she filled her “sleeping room” with “his pictures from six months to thirty years.” Kilmer’s oldest child, Kenton, later authored a loving book about his father, Memories of my Father, Joyce Kilmer (1993). And still later Kenton’s daughter, Miriam, wrote about her grandfather: “Though some call him a ‘great poet,’ I believe it is fair to say that his work showed promise; that had he not been struck down in his prime, his talent would most likely have developed in later years into something approaching greatness.” 

 

Among the many millions of young men who lost their lives in WWI, we have looked at just two, Owen and Kilmer. But their stories help us sense (in McEwan’s words) “the scale of the catastrophe in terms of feeling; all those unique and solitary deaths, all that consequent sorrow, unique and solitary too, which had no place in conferences, headlines, history, and which had quietly retired to houses, kitchens, unshared beds, and anguished memories.” 

One of the main problems with political leaders who go to war is their lack of imagination and empathy. In a February 1968 speech, the writer Wendell Berry said: "We have been led to our present shameful behavior in Vietnam by this failure of imagination, this failure to perceive a relation between our ideals and our lives." Less than a year before, Martin Luther King Jr. had displayed such empathy when he stated:

All the while the [Vietnamese] people read our leaflets and received the regular promises of peace and democracy and land reform. Now they languish under our bombs and consider us, not their fellow Vietnamese, the real enemy. They move sadly and apathetically as we herd them off the land of their fathers…. So they go, primarily women and children and the aged. They watch as we poison their water, as we kill a million acres of their crops. They must weep as the bulldozers roar through their areas preparing to destroy the precious trees. They wander into the hospitals with at least twenty casualties from American firepower for one Vietcong-inflicted injury…. They wander into the towns and see thousands of the children, homeless, without clothes, running in packs on the streets like animals. They see the children degraded by our soldiers as they beg for food. They see the children selling their sisters to our soldiers, soliciting for their mothers.

But even if the European leaders of 1914 failed to imagine all the suffering they might cause soldiers and civilians on both sides of the conflict--and none of them even came close--one might think they could at least have imagined what might happen to themselves. The monarchs of Germany, Russia, and Austria-Hungary all lost their thrones, and the Russian tsar and his family eventually lost their lives as the Communists under Lenin prevailed in a civil war. The Austro-Hungarian Empire disintegrated into separate nations, and Germany and Russia both lost territory. Historian Niall Ferguson states that "twentieth-century violence…. was in large measure a consequence of the decline and fall of the large multi-ethnic empires that had dominated the world in 1900." The Austro-Hungarian and Russian empires, entering WWI on opposite sides, were two of them, and the Ottoman Turkish Empire, allied with Germany and Austria-Hungary, was another. 

How about the victorious powers? Did the war make the “world… safe for democracy,” as U. S. President Woodrow Wilson hoped it would when he asked Congress to declare war on Germany in April 1917? With Lenin and the communists assuming power in Russia, with Mussolini becoming prime minister in 1922 in Italy (one of the victorious nations), with Hitler playing upon Germans’ dissatisfaction with the post-WWI treaty imposed upon them and becoming German chancellor in 1933, and with much of Europe losing some of its best future leaders to battlefield deaths, it is difficult to argue that Wilson’s hope was fulfilled. 

Moreover, Europe as a whole was weakened financially and politically, and so too, eventually, was its imperial power in the rest of the world. True, France regained the lost provinces of Alsace and Lorraine, but was it worth the lives lost--for example, those of three out of every ten French men between the ages of eighteen and twenty-eight?  

 

]]>
Mon, 01 Jun 2020 19:38:59 +0000 https://historynewsnetwork.org/article/175620 https://historynewsnetwork.org/article/175620 0
The Post Office is Mentioned, but Not Protected, by the Constitution

The Privateer Chasseur captures HMS St. Lawrence, 1815.

 

 

The Founding Fathers thought so highly of a certain institution that they expressly authorized it in the Constitution. The institution served the nation well. It protected the people's liberty. It fueled business. It provided jobs for some of America's most vulnerable citizens, who, if they were fortunate, might retire secure, if not prosperous. Though it actually predated the American Revolution, and Benjamin Franklin was involved early on, the institution's profitability was suspect and it came to be seen as outmoded. Worst of all, the institution faced an aggressive new competitor that, its advocates claimed, could do the job better. The institution had no choice but to contemplate its own extinction.

I'm not talking about the postal service, however. I'm talking about privateering, the Age of Sail practice in which governments augmented their naval forces during times of war by issuing commissions, sometimes called "letters of marque," to privately owned and operated vessels. Like the postal service, privateering is mentioned in the Constitution. Article I, section 8 gives Congress the power to both "establish Post Offices and post Roads" and, a few lines later, to "grant Letters of Marque and Reprisal." Though enshrined in our founding document, privateering disappeared in the nineteenth century. And the Constitution remained intact. If the United States Postal Service suffers the same fate, the Constitution will also endure.

Although it seems exotic now, when the Constitution authorized privateering, it was as unremarkable at the time as a chamber pot. Eighteenth-century European powers employed privateers whenever war broke out and had done so in one form or another since the late Middle Ages. When the Revolution began, state legislatures and the Continental Congress encouraged privateering as a quick way to swarm the seas with American ships while building a navy. As a diplomat in France, Benjamin Franklin handed out letters of marque.

Privateering was always part patriotic service, part business. Merchant ships and sailors idled by war found new employment as privateers, and although a private warship was more dangerous than a merchantman, the rewards could be greater. Enemy vessels seized by a privateer belonged to the captors—as long as a court declared the vessels "good prize"—and everybody from the ship owners down to the ship's greenest greenhorn took home a piece. Or they got nothing, if they captured nothing.

The War of 1812 proved the last hurrah for American privateers. A long period of relative peace followed in which European powers built large navies. Privateers, who were always a bit allergic to following the law, became more hindrance than help. At the end of the Crimean War in 1856, 55 nations swore off privateering for good in the Declaration of Paris. The United States, however, wasn't one of them. As a small-navy nation, the United States wanted to keep privateering around, just in case. The need never came, and in the 1890s, when American politicians aspired to build a big navy proper to a rising imperial power, privateering withered, never to revive.

With rescue funds so far denied to the USPS, friends of the mail have lauded its many benefits, both current and historical.

 

The USPS employs a large workforce, including many veterans; processes a plurality of the world's total volume of mail; enjoys high popularity with the public; provides low cost service to rural areas; and promises a way to vote, a way to receive medicine, and a way to participate in the census. In the past, the postal service fostered connections among weakly unified states in the early republic, boosted the abolition movement before the Civil War, and gave black men and women a chance at a middle-class life in an era of segregation. That the Constitution mentions Congress's power to establish post offices seems to make it a slam dunk: the nation must have the USPS or the Constitution will be violated, the Founders betrayed, and our democracy imperiled.  

I like the USPS. I enjoy getting the mail. There’s something Christmas morning-like about opening the mailbox. Who knows what’s inside?

Still, the post office has no more of a special status than privateering did. Privateering lasted hundreds of years and was a part of every Atlantic power’s arsenal. Until it wasn’t.

 

If the day comes when the USPS, like privateering, has outlived its usefulness, the Constitution will prove no obstacle. The only question that matters is the practical one: does the postal service accomplish its mission better than the alternatives? The Constitution can survive without a postal service, just as it has survived without privateering.

]]>
Mon, 01 Jun 2020 19:38:59 +0000 https://historynewsnetwork.org/article/175627 https://historynewsnetwork.org/article/175627 0
Life during Wartime 508

]]>
Mon, 01 Jun 2020 19:38:59 +0000 https://historynewsnetwork.org/blog/154351 https://historynewsnetwork.org/blog/154351 0
Shining Stars and Rogues: Presidential Offspring in American History (Part 2) Ronald L. Feinman is the author of Assassinations, Threats, and the American Presidency: From Andrew Jackson to Barack Obama (Rowman & Littlefield Publishers, 2015).  A paperback edition is now available.

For part 1 of this series, see here

 

Franklin D. Roosevelt had five surviving children, all of whom were in some way involved in their parents' public lives. The four sons all served in the military in World War II, and all were controversial in some way or other for their business dealings. Outside observers thought they took advantage of their political position as sons of the President, and it rings true upon examination. And all had multiple marriages.

Anna Eleanor Roosevelt, the only daughter and namesake of her mother, lived at the White House in 1944-1945, and kept the secret that her father had resumed his earlier affair with Lucy Mercer Rutherfurd, a secret that alienated her from her mother when the truth came out upon FDR's death in April 1945. Anna had accompanied her father to the Yalta summit meeting with Joseph Stalin and Winston Churchill in February 1945.

James Roosevelt, FDR’s oldest son, served in the House of Representatives from 1955-1965.  Earlier, he had served as Administrative Assistant, Secretary, and White House Coordinator for eighteen government agencies between 1936 and 1938.

Elliott Roosevelt served as Mayor of Miami Beach, Florida from 1965-1967, and wrote books giving intimate details of the life and affairs of his father. He was seen as scandalous, and he was accused of improprieties and investigated by Congress, but no action was ever taken against him.

Franklin D. Roosevelt, Jr., who looked the most like his father, served as a New York Congressman from 1949-1955, and as Chairman of the Equal Employment Opportunity Commission under President Lyndon B. Johnson from 1965-1966, as well as earlier serving on the President’s Committee on Civil Rights for President Harry Truman in 1946.

John Aspinwall Roosevelt, the youngest son, veered away from his family's politics, becoming a Republican and alienating his mother and his brother Elliott.  He was the only son who did not seek elective office, but he was as controversial as his brothers in his business dealings.

Margaret Truman Daniel, the only child of Harry and Bess Truman, became notable as a concert singer, actress, and journalist, but also wrote historical biographies of her parents and a well-received series of murder mysteries centered around government buildings.

John Eisenhower, the only surviving child of Dwight D. Eisenhower, spent his career in the military, and then served as Ambassador to Belgium from 1969-1971 under President Richard Nixon.  He also wrote several military histories, and a short biography of President Zachary Taylor.

Caroline Kennedy gained more attention as a Presidential daughter than anyone other than Alice Roosevelt. Being in the White House from ages 3-6, she was regularly the center of attention, and had to bear the loss of her father, later the loss of her uncle, Robert F. Kennedy, and then the tragic death of her brother, John F. Kennedy, Jr., in a small plane accident in July 1999. Her public contribution was as US Ambassador to Japan under President Barack Obama from 2013-2017.  

Her brother had published a political magazine, George, from 1995 until his death. He was often considered a potential political candidate for the Senate, and maybe, the Presidency. In that regard, he can be compared to Quentin Roosevelt, TR’s son, tragically killed in World War I, but thought of as likely to have a future political career.

Michael Reagan, the adopted son of Ronald Reagan and his first wife, Jane Wyman, became notable as a conservative talk show host on radio, and as a political commentator and author. 

His half-brother, Ron Reagan, the son of Reagan's marriage to Nancy Davis, has been a liberal political commentator and radio talk show host, and the two have been in regular conflict over the legacy and record of their father.

George H. W. Bush and his wife Barbara had three children who became notable. George W. Bush became the 43rd President of the United States, serving from 2001-2009, and both parents survived to see him in office throughout both terms, the only Presidential couple to do so; John Quincy Adams only had his father alive when he was elected, and the elder Adams died in the second year of his son's one-term presidency. Before his presidency, George W. Bush had served six years as Texas Governor, from 1995-2001.

Additionally, Jeb Bush served as Governor of Florida from 1999-2007, and sought the Presidency unsuccessfully in 2016.  Many political observers thought it would have been Jeb, rather than George W., who would have sought the Presidency in 2000, if he had not lost his first race for Florida governor in 1994, the year his brother won the Texas governorship. 

Neil Bush became controversial for his business dealings, personal contacts, and the activities of his educational corporation under the "No Child Left Behind" policy promoted by his brother during his years in the White House.

Chelsea Clinton, the only child of her parents, has been involved in the activities of the Clinton Foundation, and is an author of several children’s books.  She also campaigned actively for her mother in her 2008 and 2016 Presidential campaigns.

Jenna Bush Hager, the daughter of George W. Bush, has been a public figure as a news personality, author and journalist, and presently cohosts the Today show on NBC, after being a contributor to the show since 2009.  

Finally, three children of Donald Trump and his first wife Ivana have become highly prominent and controversial. Donald Trump Jr. and Eric Trump have been responsible for the Trump Organization activities, but also have engaged in constant political disputes, promoted conspiracy theories and false information, and have aroused anger among environmentalists with their publicity surrounding their big game hunting in Africa of endangered species, including elephants and leopards.  Also, Donald Jr. was involved in Trump Tower meetings with agents from Russia, discussing spreading misinformation about Hillary Clinton, his father’s 2016 Presidential opponent.

Ivanka Trump, married to Jared Kushner, is a Presidential advisor to her father, along with her husband. Both are unpaid, but they have a great impact on public policy, which is seen as controversial due to ethics concerns. Their roles skirt the federal anti-nepotism law passed in the mid-1960s, which bans officials from appointing their relatives. She has gained business benefits, particularly trademarks in China, while serving as a Senior Advisor in the White House.

So the major “rogues,” in one form or another, were Alice Roosevelt and Archibald Roosevelt, both children of Theodore Roosevelt; the sons of Franklin D. Roosevelt; George H. W. Bush’s son Neil Bush; and the three older children of Donald Trump. And when this history is written, it may well be that the Trump children will stand out as more “roguish” than the others.

The Lynching of David Wyatt

Lincoln Hotel, Belleville, IL

 

 

At the turn of the last century, Rev. Charles Thomas of the African Methodist Church was the unofficial leader of the 500 or so African Americans living in Belleville, Illinois, across the Mississippi River from St. Louis. He and his flock had a small presence in the largely German-heritage cities on the Illinois side of the metro region. At the time even East St. Louis was still overwhelmingly white, but black people were slowly moving into the area, attracted by the growing industrial economy, including the then-booming stockyards. Black people suffered from de facto segregation, with white and black areas strictly if informally divided.

     

Whatever his motive, whether it was to strike an early blow against discrimination or just a matter of pride, Rev. Thomas entered the all-white barber shop of Henry Baumgarten in downtown Belleville and asked to have his boots shined. After Baumgarten refused, the Reverend sued him for $200. Even though Thomas lost the case and filed an appeal, the incident created great resentment in the majority-white town. Black people were not welcome in Belleville, a fact brought home with every hostile look and verbal assault. The stage was set for a greater incident, and the spark came unexpectedly.

       

By 1903 enough black people had settled north on the Illinois side of the river to establish the village of Brooklyn as the first black-majority community. One of its leading citizens was David Wyatt, a schoolteacher and principal. Wyatt was a noted orator and community leader. Among other accomplishments, he opened the school to adult learners, a program ahead of its time. So it must have come as a shock when Wyatt was notified by the County Superintendent of Schools that his teaching license would not be renewed.

      

St. Clair County was, much as it is today, one of the most diverse jurisdictions in the country. On its western edge along the Mississippi were densely populated urban areas, many of which would develop into some of the worst slums in America. Moving east over the top of the river bluffs were small, established towns like the county seat of Belleville. On the eastern edge were, and are, farms and bucolic small towns seemingly a world away from the rest of the county.

      

On June 6, 1903, Wyatt traveled to the courthouse to confront Superintendent Charles Hertel. Although it was a Saturday, Wyatt found Hertel working in his office. In a heated exchange, Hertel told Wyatt his license would not be renewed because of allegations of abuse against his students. Wyatt left the office.

     

Around 6 pm Wyatt returned to the courthouse, entered Hertel’s office, and shot him in the chest with a revolver. Hertel’s secretary and his son grabbed Wyatt and held him until the police led him away to the nearby county jail. The sound of the shot and the sight of a black man being dragged away from the courthouse set off anger among the passersby. As the crowd gathered, the false rumor that Hertel had been murdered spread, fueling the growing anger and calls to ‘lynch the nigger.’

     

A self-appointed corps of men decided to carry out the mob’s wishes. Wyatt had been safely locked in a cell behind secured doors, but soon the sound of the jailhouse door being battered by a sledgehammer rang out into the streets. Small boys stood at open windows delivering progress reports to the crowd below. For hours the mob worked at battering down the doors and then the cell as Wyatt loudly begged for mercy or prayed to God. The mob broke through the final cell door and swarmed over Wyatt.

      

He was dragged out of the building and into the street leading to the courthouse square. By now thousands were watching and cheering as St. Clair County officials, including the State’s Attorney, looked on. Belleville police also stood by, under orders not to fire into the mob. The fire department was blocked by the crowd, which tied knots in the hoses to keep firefighters from spraying water to break up the mob.

     

Details of exactly what happened next vary in press reports. Wyatt was dragged a few blocks to either a telephone or telegraph pole, where he was lynched. Either before or after the actual lynching, a fire was prepared under the body. Whether he was already dead or, as some said, barely clinging to life, Wyatt’s body was burned as the crowd cheered loudly. What little was left of him was taken to the dump. An estimated 10,000 people from all over the region visited the site of the lynching.

     

The lynching of David Wyatt provoked a national reaction from New York to Chicago. Some wondered how such a thing could happen so far north. Belleville clergymen hastily rewrote their sermons to denounce the action. City police and county officials who witnessed the attack did almost nothing. Fourteen members of the mob were arrested and let out on bail until the charges were quietly dropped. That was fine by most people in Belleville. Two weeks after what the mayor of Belleville called a “somewhat irregular execution,” George R. Long, president of the local Good Government League, said, “I think the matter better be dropped. I am tired of the vilification of Belleville. I am against mob law under all circumstances but I believe the negro got what he deserved.” One of the few who dared to disagree was Charles Hertel, who fully recovered from his gunshot wound and regretted the lynching.

     

After the lynching the thinly veiled hostility against Rev. Thomas and his small flock turned to open death threats. Black people were told in no uncertain terms to get out of Belleville.  Rev. Thomas gave his guns to his congregation and fled the town in the middle of the night. 

 

In the century plus that has followed, things have gotten better in Belleville and the county. For example, an African American was appointed chief of police in Belleville. Race relations are for the most part peaceful, but there is still a geographical division in the region. Many white people in Belleville and the relatively new town of Fairview Heights trace their roots to East St. Louis, which their families abandoned en masse as they moved east out of the once prosperous city. But under the surface, many whites still see black people as the other, the people who don’t quite belong in Belleville. In Ferguson, on the other side of the river, everything was fine until it wasn’t. This is the challenge Belleville must face. The history of David Wyatt is a stark warning for Belleville and the region, and a lesson not safely forgotten.

The Roundup Top Ten for May 22, 2020

The Lessons of the Great Depression

by Lizabeth Cohen

The larger lesson the New Deal offers is that recovery is a complex and painful process that requires the participation of many, not directives from a few. And that, ultimately, we’re all in this together.

 

The Neoliberal Era is Ending. What Comes Next?

by Rutger Bregman

From higher taxes for the wealthy to more robust government, the time has come for ideas that seemed impossible just months ago.

 

 

Ida Taught Me

by Koritha Mitchell

As the United States seems determined to repeat the horrors of the last turn of the century, I remain grateful for Wells’s example. Here is just some of what she taught me.

 

 

The West Is Relevant to Our Long History Of Anti-Blackness, Not Just The South

by Walter Johnson

The Missouri Compromise paved the path to the Civil War. But it also signaled what would follow: western settlement driven by the idea of expanding a country of, by, and for white men.

 

 

When University Leaders Fail

by François Furstenberg

A university governed by long timelines and long-term thinking grows conservatively and cautiously and prepares itself prudently for potential crises. If you turn a university into a giant corporation, on the other hand, it will rise and fall with the business cycle.

 

 

Most Medical Professionals Aren’t Racist — But Our Medical System Is

by Ella St. George Carey

The medical profession perpetuates racial inequality in health outcomes in large part because medical training still treats white and nonwhite bodies as fundamentally different.

 

 

When the Economy Collapses, Talk Is Cheap—Just Look What Led Up to the Great Depression

by Robert Dallek

For American leaders, the Great Depression is as much a lesson in what not to do as it is in what to do.

 

 

How White Backlash Controls American Progress

by Lawrence Glickman

Individual backlashes are part of a common strain of reactionary politics that rejects challenges to social hierarchies. 

 

 

The ‘American Way of Life’ Is Shaping Up to Be a Battleground

by Keeanga-Yamahtta Taylor

The median wealth of a U.S. senator was $3.2 million as of 2018, and $900,000 for a member of the House of Representatives. These elected officials voted for one-time stimulus checks of $1,200 as if that was enough.

 

 

The Potential Risk of Chasing a COVID-19 Vaccine

by Heidi Morefield

Vaccine development is important to fighting viruses, but can’t come at the expense of other means of prevention.

 

Morning or Mourning in America? Political Advertising and the Politics of Emotion

 

 

The promise of American greatness knew no bounds in “Morning in America,” the iconic ad created by the group of political consultants and advertising gurus (the “Tuesday Team”) who worked for Ronald Reagan’s 1984 reelection campaign. In the ad, Americans were working, getting married, and buying homes – confident that the country they lived in was “prouder and stronger and better” under President Reagan’s leadership than four short years before.

 

Advertising typically sells us life as we want it to be, not life as it truly is. Maybe that’s why “Mourning in America,” the new ad from the Lincoln Project, an anti-Trump conservative super PAC, is such a gut punch. As misery and despair unfold in scene after scene of job loss and death, there’s no escaping the dystopian nightmare America is now mired in as the country battles the coronavirus. After seeing the ad, President Donald Trump started rage tweeting at the ad’s creators – attorney George Conway, Republican strategists Steve Schmidt and John Weaver, and media consultant Rick Wilson – at nearly 1 a.m. on May 5.

 

The history of political advertising offers a glimpse into why ads like this one can come to define a politician’s candidacy for better or worse. Ads that effectively tap into the collective cultural zeitgeist can become harbingers of a candidate’s fate in a way that political polling will never be able to do. Numbers can’t effectively capture the moment when advertising causes perceptions to harden into actual beliefs that govern voter decisions come election time.

 

We’ve seen such moments before. The 2004 Swift Boat Veterans for Truth ad turned Democratic presidential candidate John Kerry, a decorated Vietnam vet, into an unpatriotic East Coast elitist. Compare the 2008 will.i.am “Yes We Can” video in support of Barack Obama to the 2007 viral video “Vote Different,” which superimposed Hillary Clinton’s face in the place of Big Brother in a makeover of the 1984 Apple computer ad. Putting Obama’s speeches to music communicated hope and love. Depicting Clinton as an authoritarian evoked all the negative feelings against her.

 

It’s worth a trip down memory lane to recall exactly how much advertising has influenced presidential campaigns and their outcomes. According to the former Museum of Television & Radio’s (now the Paley Center for Media) 2004 exhibit, “Madison Avenue Goes to Washington: The History of Presidential Campaign Advertising,” historians credit Eisenhower’s decision to accept advertising with changing how modern presidential campaigns would be conducted. Whistle-stop tours ended and the era of slick ads designed to enhance a presidential candidate’s image began.

 

Eisenhower hired veteran ad man Rosser Reeves in 1952 to create what would become the first presidential campaign spots to air on TV. Republican Presidential candidate Thomas Dewey had rejected Reeves’s advances four years earlier because he considered mixing puffery with politics undignified.

 

Reeves, later known for the famous M&M candy campaign “Melts in your mouth,” crafted Eisenhower’s image by removing his glasses and filming him answering questions from Americans about taxes and Korea. The strategy behind the $60,000 campaign, called “Eisenhower Answers America,” allowed Reeves to present the candidate as fatherly, decent, responsible and trustworthy. Democrats complained that Eisenhower was selling the highest office in the land like advertisers sold soap or laundry detergent.

 

John F. Kennedy’s 1960 campaign featured the first political ads shot on location. One positive ad showed Kennedy talking directly about his Catholic religion, a controversial issue at the time. The Kennedy campaign also produced a damaging attack ad that undermined the carefully honed image Republican opponent Richard Nixon had earlier created during his famous Checkers speech. When Nixon ran against Kennedy, Eisenhower was asked at a press conference to name something his vice president had done. Eisenhower answered, “If you give me a week, I might think of one. I don’t remember.” The Kennedy campaign used the remarks in TV and radio ads that questioned Nixon’s experience and capability.

 

Stoking fear about Republican Barry Goldwater became the key Democratic advertising strategy used to keep Lyndon B. Johnson in the White House in the 1964 presidential race. The Democrats hired the ad agency Doyle Dane Bernbach (today DDB Worldwide) to produce a series of hard-hitting ads. One TV spot featured a saw slowly cutting off the eastern section of a model of the U.S. Just before the piece fell off and floated away, the announcer told viewers that Goldwater thought the nation would be better off without the Eastern Seaboard.

 

Perhaps the most controversial and memorable presidential campaign ad of all time, the “Daisy Girl” spot, played off of the very real fear Americans had of nuclear annihilation, as historian Robert Mann recounted in his book Daisy Petals and Mushroom Clouds: LBJ, Barry Goldwater and the Ad That Changed American Politics. The countdown to a nuclear missile launch is told through the eyes of a young girl pulling petals from a daisy flower as the mushroom cloud appears in her eye. The spot aired just once, on Labor Day, during NBC’s Monday Night at the Movies, but the subsequent media interest brought it fame as an attack ad.

 

The 1968 three-way race between Richard Nixon, Hubert Humphrey and George Wallace saw the birth of the media specialist in presidential campaigns, a role that became a major contributor to campaign costs, even as attack ads grew in prominence. One Humphrey spot featured a man’s voice laughing uproariously as the words “Agnew for Vice President?” appeared on a TV screen.

 

When Nixon’s media advisors created their own in-house agency called “The November Group” in 1972, they “saved the 15 percent of the amount of television and radio air costs an agency holds back as the fee for placing an ad,” wrote Kathleen Hall Jamieson in Packaging the Presidency: A History and Criticism of Presidential Campaign Advertising. Advertising teams would appear again in Reagan’s Tuesday Team – named after election day held on the first Tuesday in November – and in 2000 for George W. Bush’s “Park Avenue Posse,” headed by the ad agency Young & Rubicam’s Jim Ferguson.

 

At the time, the 1988 race between George H.W. Bush and Michael Dukakis was considered the most negative contest since the 1964 Daisy Girl spot. Two ads painted a devastating picture of Dukakis as weak on crime. Jamieson described “Revolving Door” in her book this way: “A procession of convicts circles through a revolving gate and marches toward the nation’s living rooms.” Jamieson said the ad made a false inference “that 268 first-degree murderers were furloughed by Dukakis to rape and kidnap,” since the facts revealed that only William Horton, a first-degree murderer whose saga the Bush campaign chronicled in the “Willie Horton” ad, escaped his furlough in Massachusetts and committed a violent crime.

 

As journalists stepped up their scrutiny of advertising claims made in the 1992 race, Bill Clinton’s ad team used real news footage in attack ads and played on a positive association by including a photo of a young Clinton shaking JFK’s hand.

 

The people who produce advertising understand how emotions can drive decision making, and voting choices are no different from product purchasing decisions. One’s instincts, often translated into the question of which candidate you would rather have a drink with, influence the ballot box, as the Lincoln Project’s Rick Wilson knows only too well.

 

Speaking of his homage to the original Morning in America spot, Wilson wrote in his May 6 column in The Daily Beast, “That brilliant, evocative minute caught a moment of uplift in the minds and hearts of American voters in that rarest of political spots; it was true in the audience’s gut.”

 

The darker Mourning in America video, where America is presented as “weaker and sicker and poorer,” might well be the 60 seconds that decide Trump’s fate in November.

100 Years of Political Spectacle: Women, Political Protest, and the White House

Photo composite of National Woman’s Party White House protest for women’s suffrage, February 1917 (Library of Congress), and National Nurses United White House protest for health care workers lacking PPE, May 7, 2020 (photo courtesy of National Nurses United).

 

 

"As such, their silence rang indignant: their banners directly addressed President Wilson, their location directly confronted the president,

and their public demonstration enacted the freedom they sought."

-- Belinda A. Stillion Southard, “Militancy, Power, and Identity: The Silent Sentinels as Women Fighting for Political Voice”

 

 

On April 21, 2020, nurses stood in front of the White House and recited the names of colleagues who had died from COVID-19. By standing at 1600 Pennsylvania Avenue, they hoped to bring their demands for more personal protective equipment (PPE) “directly to President Trump’s doorstep.” These powerful images of (mostly women) nurses hoisting signs visually recall the first White House protest— in favor of women’s suffrage. The National Woman’s Party’s fight with President Woodrow Wilson a century ago not only sheds light on the nurses' demonstration and our assumptions about women, patriotism, and protest, but offers possible strategies for the nurses who seek long-term change in healthcare practices.

 

On June 20, 1917, Lucy Burns, co-founder of the National Woman’s Party (NWP), and Dora Lewis gathered with other suffragists in front of the White House. They held a banner criticizing President Wilson’s opposition to women’s suffrage: “We, the Women of America, tell you that America is not a democracy…. President Wilson is the chief opponent of their national enfranchisement.” Their immediate audience was a visiting Russian delegation -- as well as President Wilson and members of Congress. As the NWP suffragists continued to protest for the 19th Amendment in front of the White House for the next 10 months, they carefully crafted a spectacle. They were silent (and later known as the Silent Sentinels), dressed in the colors of the NWP (purple, white, and gold), and meticulously followed DC’s traffic and assembly ordinances. Their audience was complex and multi-layered: President Wilson and Congress; their own NWP membership; the wider women’s suffrage movement; and the general public. Many, both inside and outside the suffrage movement, labelled the NWP protesters as militant and unpatriotic. Criticism of the Silent Sentinels derived from widely held beliefs about femininity and patriotic duty during wartime—as well as the sanctity of the White House.

 

A century later, unionized nurses took a page from the suffragists. They gathered at the White House to call attention to the 9,000 health care workers in the United States who have tested positive for COVID-19. A spokesperson for National Nurses United (NNU) read the names of 45 deceased nurses and called directly upon President Trump and Congress to mandate that the federal Occupational Safety and Health Administration (OSHA) “issue an emergency temporary standard to protect health care workers and other frontline workers from COVID-19 exposure.” Dressed mostly in red (the color of their union’s logo), the nurses silently held photographs of deceased co-workers -- using the spatial aesthetics of social distancing to create a 21st-century spectacle. A week later, the number of fallen nurses had almost doubled, and the NNU returned with eighty pairs of empty, pristine white shoes. Like the suffragists, the nurses carefully choreographed space, speech, silent body rhetoric, and images to explain their cause and influence lawmakers, but also to signal strength to NNU members.

 

But such displays have often alienated more mainstream groups. Just as the NWP split from the more mainstream National American Woman Suffrage Association, the NNU split from the American Nurses Association as it sought to “more aggressively fight hospital management for better working conditions for nurses.” After a one-day strike at 34 California hospitals in 2011, the California Hospital Association called the NNU nurses “inappropriate and irresponsible.” Additionally, during the 2014 Ebola epidemic, the NNU highlighted hospitals’ lack of preparation and the conditions faced by the nurses who cared for Ebola patients, and its protests over “subpar” PPE have been called “militant” and “aggressive” by labor historian Nelson Lichtenstein.

 

Militant or not, the membership of the NNU has grown while other American unions have experienced decline. The Atlantic calls the NNU “one of the smallest, but most powerful unions in the country” and Berkeley’s Harley Shaiken highlights their innovative and bold approach -- which has resulted in policy successes even in right-to-work states. Political Scientist Laura Bucci emphasizes that “This kind of action harkens back to other female-led labor struggles, from workplace conditions in the garment industry to those in America's classrooms where public action is used as a measure to achieve political change outside of bargaining periods.”

 

The National Woman’s Party emphasized their femininity when they believed it would amplify their message and, similarly, the NNU has leaned into the gendered nature of the nursing profession. RoseAnn DeMoro, former executive director, claims that “There's something in the culture of women that’s collective and cohesive....They play for each other and their organization, and the patients in a way that’s in their social structure really [and]..an extension, in some ways, of the family—and I think that’s why it’s devalued by employers.” As a result, nurses are often hindered from combining their militancy with their femininity. Writing before COVID-19, activist and scholar Linda Briskin presciently concluded that nurses’ professional commitment to care often conflicts with their work environments. Confronted with “healthcare restructuring, nursing shortages, intensification of work, precarious employment and gendered hierarchies,” nurses are increasingly left unable to do their jobs, leading to a “militant discourse around the public interest, and a reconstitution and reclamation of ‘caring’” that may lead to more strikes.

What lessons might the 1917 National Woman’s Party protest hold for nurses? The Silent Sentinels turned President Wilson’s own political communication and strategy against him -- something the NNU should consider. When he consulted public opinion (the cutting edge of early 20th-century politics), the Sentinels rallied the public's support to their cause. Although the suffragists targeted Wilson, he became their greatest potential ally because his “rhetorical presidency facilitated the generation of direct and indirect appeals.” The NNU might likewise interrogate the unique communication style of President Trump. Rather than allow Trump to control the narrative on Twitter and speak to the public through his (until recently) daily press conferences, could the nurses challenge Trump on his own turf by using spectacle and social media to affect the public's understanding of the PPE shortages?

 

The suffragists appropriated Wilson’s war ethos to construct themselves into “soldiers joining the fight for democracy." The nurses must likewise consider the tropes and themes of the Trump presidency to keep healthcare reform on the agenda. Their success may depend upon their ability to use spectacle and gender to impact public opinion during a new form of war.

Hong Kong Apocalypses: Teaching the Recent Past and the Speculative Future

 

 

Perhaps it should have been predictable that when the recent past tries to predict the near future, the two collide in an unimaginable present. In that collision, unsettling though it is, the idea of a past that is distinct from the present has unraveled in an era that from any number of angles has been “history in the making.” I wonder if it would be different if we thought of it as “the present in the making?” And I’m left with the idea that the two are just mirrors held up to one another.

 

Most historians are comfortable with the idea that history is the interaction of the past and the present, mediated by our sources. From our perch in the present, we interpret, reconstruct, and analyze what happened in the past. Yet, though we tend to acknowledge the importance of all three of these—past, present, and sources—I have found that most historians are more comfortable with, even fetishize, the past and the sources, and only grudgingly admit the role of the present. Yet, as historians and humans, it is our burden to be trapped in the present. Phrases like “time travel” or “immersed in the sources” reveal the extent to which many historians see their profession, at least sometimes, as an escape from the present. Indeed, the closer one’s subject of study comes to the current page of the calendar, the less pure it is often seen to be.

 

The line between past and present (presumed to isolate the concerns of the present from the proper objects of history) often fades, though. The temporal border between the objects of “history” and “journalism” is (half-jokingly?) placed at World War II, or at the French Revolution, or any other time that suits the audience. Although the demarcation may be meant as a joke, its fluidity illustrates that the whole enterprise is somewhat arbitrary.

 

This spring I taught “Contemporary China” as a History class, aware that I was walking a confusing line. The point of the class was largely to be a vehicle for a study tour (cancelled by the pandemic) that would bring students to China for two weeks following the end of term, but the topic raised questions. What would distinguish this as history, and not sociology, or political science, or journalism? Finishing the class—remotely, of course, though that had nothing to do with the course’s subject—I cannot be sure I know the answer to that even now, but I have been privileged, and burdened, to see and live the most extraordinary interplay among past, present, and source.

 

One of the featured topics for the course was Hong Kong and the protests that had swept across the territory during 2019, while I was planning the class. Determined to make the course topical, multidisciplinary, and also grounded in historical context, I arranged guest speakers and film screenings to accompany the more traditional readings and discussion. 

 

I arranged our discussion of Hong Kong to accommodate the launch of a new book on the protests, Jeffrey Wasserstrom’s Vigil: Hong Kong on the Brink. Students read the book just as it was released, and then welcomed Wasserstrom to campus to talk about the events as they unfolded. As if to further strain the chronology, we combined Wasserstrom’s visit with a screening and discussion of the Hong Kong film Ten Years.

 

The historical context was this: Hong Kong had been ceded to Britain in 1842, spoils of the first Opium War. To this original colony were added the Kowloon Peninsula in 1860, and the so-called New Territories, in 1898. The first concessions were granted to Britain “in perpetuity,” but the New Territories—which comprised almost 90% of the colony’s area—was leased for 99 years, until 1997. As all three areas became both intertwined and dependent on the mainland for utilities like electricity and water, and the People’s Republic of China made clear that the lease would not be renewed, 1997 became an end date for Britain’s colonial endeavor on the China coast.

 

Ten Years was acclaimed upon its 2015 release, exceeding expectations at the box office and outperforming even the new Star Wars installment when it opened. It is a work of speculative fiction, a suite of five short films giving different directors’ takes on what Hong Kong might look like in another decade, in 2025, just about halfway through the 50-year period during which, according to the Basic Law, Hong Kong’s “way of life shall remain unchanged” (Hong Kong Basic Law, Article 5). The five stories vary in tone, but all of them paint a dark future for the territory: staged assassinations, censorship run amok, ritual suicides, families torn apart. All of them grim, but all of them fictional.

 

Historians often resist predicting the future, claiming instead that their purview is the past. Similarly, speculative fiction, which trades on suggesting possible futures, tends to resist a date that is too specific or too near. Ten Years defied that tendency, giving a date that was both precise and imminent. Even more imminent than many expected. 

 

Somewhat counterintuitively, speculative fiction relies on history to work. Works that explicitly explore an altered past, like Philip K. Dick’s The Man in the High Castle, clearly rely on our understanding of the history to make their points, but even less concrete—more speculative—examples depend on how we know the world to be, so that they can point out ways it could be different (or perhaps ways in which it is not quite how we believe it to be). If we are shown a date too near, then the power of the work is often evaluated as a kind of parlor game—what did it get right?—rather than as a commentary on our world—the present.

 

There were no moon bases, HAL computers, or Jupiter missions when the actual year 2001 arrived, but by 2019—six years before Ten Years’ fictional setting—university campuses were aflame. Scenes of students weaving through plumes of tear gas were meant to be provocative in the film; instead, they were merely descriptive. There were no staged assassinations in 2019, as there were in the film, but many of Hong Kong’s most prominent pro-democracy activists were jailed. In April 2020, this list came to include most of the movement’s first-generation leaders, including legislators like Martin Lee.

 

Lee, now 81, was arrested on April 18. He had served on the Hong Kong Legislative Council (LegCo) and been the chair of the Hong Kong Democratic Party. In normal times, his arrest would have been front-page news (as, in fairness, it eventually was, a week or so after the event), but its lack of prominence was part of the plan. With the world distracted, attention was hard to come by. Even my class, with its focus on contemporary China and a reading list that included Hong Kong, struggled to find time to give to it.

 

This was without doubt the intention behind the timing of these arrests. Counting on distraction, the PRC government moved to silence some of its most outspoken and consistent critics. Lee, though, was not quite silenced: Interviewed by the Foreign Press Association on April 23, Lee said that Hong Kong is “the key to China’s future,” insisting that the People’s Republic of China would be judged on its ability to keep its word, as it seeks greater prominence and influence on the world stage. How well Hong Kong preserved the autonomy and freedom promised in the Basic Law, Lee argued, would be critical to the world’s judgment of the PRC.

 

And what of the world’s judgment? What of my students’ judgment, for that matter? While Covid-19 dominates headlines, and frames China’s relationship to the rest of the world, it is far from the only factor. It is, however, a controlling one. The often unspoken goal of many history classes is to reach closure. Sometimes that is a point in the recent past (the lament of the second half of the US survey: where do I stop?) or a more or less arbitrary date, often in the title of the course. But Contemporary China didn’t end just because the course did. Our last class, held April 28, didn’t provide an end point for anything other than our (virtual) meetings.

 

That was partly because we were unable to have the familiar rituals of an on-campus class, and partly because many of the topics we were studying—the repression of Uighurs in China, the Hong Kong protests, US-China trade wars, and of course the global responses to Covid-19—remain unresolved, but it was largely because the capstone of the class was not to be. One principle I have always tried to follow as an educator is exposing students to ideas and information and letting them judge for themselves. This was the idea behind taking students to China: with a little exposure, including some formal meetings but, ideally, with lots of informal interactions with Chinese students and others, my students could flesh out what they had learned. Maybe some would be inspired to further classes or even careers focused on China. For now, at least, those plans are on hold.

 

I finished “Contemporary China” feeling deeply unsatisfied. Instead of boarding a flight to Beijing, I was convening Zoom sessions, wondering what had been accomplished. But in the end, I think I understood what it means to do history better than I did when I started teaching the class.

 

Left without the capstone, without the trip toward which the course had been pointing, students had to do the work of the historian: to assign meaning amid uncertainty. We never have complete sources, and we never have all the information. It is tempting to think that, in the present, we can know “everything,” and that the further into the past our subjects recede, the greater our challenge. But historians (ought to) know better than that. Both the past and the present elude our understanding, and only through sources do we have a chance at making sense of either of them.

 

 

Interview: Historian and Professor Nancy K. Bristow on the Forgotten Police Shooting of Black Students at Jackson State College

Robin Lindley is a Seattle-based writer and attorney. He is features editor for the History News Network (hnn.us), and his work also has appeared in Writer’s Chronicle, Crosscut, Documentary, NW Lawyer, Real Change, Huffington Post, Bill Moyers.com, Salon.com, and more. He has a special interest in the history of visual imagery, medicine, law, human rights and conflict. He can be reached by email: robinlindley@gmail.com.

Just after midnight on May 15, 1970, officers of the Jackson Police Department and the Mississippi Highway Patrol attacked unarmed students at Jackson State College, a historically black college in Mississippi. The law enforcement officers unleashed a murderous fusillade of hundreds of gunshots into a group of African American students who were simply standing in front of a women’s dormitory. After just half a minute of intense gunfire, two young men were left dead and a dozen young men and women were wounded, including several women in the dorm.

This tragic moment of brutal racial violence was soon largely forgotten in history. And, in subsequent accounts, the incident seems eclipsed in memory by the massacre at Kent State University ten days earlier, on May 4, 1970, when National Guard troops confronted antiwar protesters and wildly fired live ammunition into the throng of students, killing four white students and wounding nine others.

In her groundbreaking new book Steeped in the Blood of Racism: Black Power, Law and Order and the 1970 Shootings at Jackson State College (Oxford), distinguished historian and Professor Nancy K. Bristow unravels the complex story of this tragic and overlooked racial incident at Jackson State. The book is based on her meticulous examination of the history of racism in Jackson, the role of the college in the community, the horrific shooting, and the subsequent investigations and unsuccessful attempts of surviving victims and relatives to find justice. She also considers the role of memory, deeply ingrained racism, and the cultural response in understanding this act of state violence against African Americans.

Steeped in the Blood of Racism presents the results of Professor Bristow’s painstaking investigation of the bloody eruption of violence at Jackson State and its historical context.  In addition to wide-ranging archival research of official records, photographs, news reports, and other material, she interviewed dozens of witnesses and others. As with her previous work, her powerful and deeply humane book offers a moving and sensitive account of an episode of overlooked history with consequences that still resonate today.

Professor Bristow teaches American history with an emphasis on race and social change at the University of Puget Sound in Tacoma, Washington, and she is a founding member of the university’s African American Studies Program. She also serves on the Race and Pedagogy Institute’s leadership team. She is a past Washington State Professor of the Year. Her other books include the critically acclaimed American Pandemic: The Lost Worlds of the 1918 Influenza Pandemic and Making Men Moral: Social Engineering during the Great War.

Professor Bristow graciously responded to a series of questions by email.

 

Robin Lindley: Congratulations on your groundbreaking and heartbreaking new book, Professor Bristow. In Steeped in the Blood of Racism, you have done an extensive investigation into the police shootings of students 50 years ago at Jackson State College in Mississippi.

Your new book seems a departure from your previous books: American Pandemic on the 1918 influenza epidemic in America, and Making Men Moral on social engineering during the First World War, although those books also explore issues of memory, race and class, among other matters. What inspired your meticulous exploration of the horrific events at Jackson State?

Professor Nancy K. Bristow: I have been teaching African American history for thirty years, and running throughout this history is a central theme of the brutality of white supremacy.

I learned about the Jackson State shootings through a student’s research paper, and recognized both its importance in illuminating the consequences of the “law and order” perspective championed by Richard Nixon and its resonances with the ongoing crisis of state-sanctioned violence against people of color today.

As I began researching what happened at Jackson State, I realized how effectively this one story illustrates so many of the themes that emerge in the history of state violence against African Americans—how easily law enforcement fires on African Americans, the racism that makes this action such a quick response, the reality that officers rarely face consequences for their actions, the ongoing trauma that results from this brutality, and the resistance of so many in the white community to seeing this violence for the ongoing crisis it represents.

The story of what happened at Jackson State in May 1970 is one that every American should know if they want to make sense of the world they live in today.

What happened at Jackson State College on May 15, 1970, when police unleashed a tremendous fusillade of gunshots on a group of African American students, fatally wounding two young men and injuring many other students?

The night of May 14-15, 1970 was the second night of conflict between students and law enforcement. The night before, some young people—students or local youths—had thrown rocks at white motorists on Lynch Street. This thoroughfare bisected the campus, and was a longstanding source of harassment for students. White commuters often endangered them by racing through the campus, and were also known to abuse students with racial epithets and insults. In 1964 a student was struck and hospitalized, and in the aftermath, and on many occasions in subsequent years, the young people protested the mistreatment and laid claim to the street.

On the night of May 13, police were called to close the street to motorists. A few trash bins were set on fire, and there was even a hapless attempt to attack the ROTC building. But law enforcement did not enter the campus, and things soon quieted. 

The next night, though the president of the university, John A. Peoples, asked the city to close the street to forestall any additional problems, the city refused.  Late that evening, rock throwing resumed, and the street was again closed. A crowd gathered in front of a men’s dormitory, Stewart Hall, on the western edge of campus. Someone drove a dump truck up Lynch Street from a nearby construction site and it stalled in front of Stewart Hall, and someone ignited it.  A city fire truck, called to put out the blaze, brought with it both city policemen and the Mississippi Highway and Safety Patrol (MHSP). Their mission was to protect the fire truck. But with the fire out, these forces did not retreat to the periphery of the campus as they were charged to do, but instead marched up Lynch Street through the center of campus.  They took with them the city’s armored tank, shotguns loaded with heavy buckshot, two submachine guns, and two rifles with armor-piercing bullets. 

The forces halted in front of the women’s dormitory, Alexander Hall, where young people were enjoying the warm Mississippi night, and turned to face them, weapons leveled. Women’s curfew was at 11:30, and a little after midnight plenty of men were still hanging out, talking with the women through the windows.  They were shocked by the arrival of the troops, because they knew they were on their own campus, and had done nothing wrong.  Even so, they retreated behind the chain link fence between the dorm and the street when an officer commanded them to. And then a bottle crashed on the pavement and law enforcement opened fire.  For 28 seconds.  They fired more than 150 rounds, leaving over 400 bullet and shot marks on the exterior of Alexander Hall.  Two young men—James Earl Green and Phillip Gibbs—were killed, and twelve other young people were injured in the barrage.

When the firing finally stopped, the officers turned to picking up their spent shells, leaving the students to tend to one another. They offered no aid to the injured. The on-site commander of the MHSP, Lloyd Jones, bullied the students, ordering one to check on the bodies of Green and Gibbs and lacing his words with racial epithets. One reporter noted a mood of “levity” among the officers. Finally, the National Guard, which was to have replaced the city police and MHSP on campus, arrived and began helping the students.

You detail the history of Jackson State—a historically black college—and its place in the Jackson community and the state of Mississippi, a notorious site of racial violence through history. The shootings occurred early in the days of desegregation and in the time of a growing Black Power movement, but wasn’t Jackson State a quiet, conservative school that was tightly controlled by a white board of directors?

Though some of this is true—certainly the repression faced by African Americans in Mississippi—the situation at the college was more complex than the traditional narrative has suggested. 

Yes, Jackson State College was controlled by an all-white Board of Trustees of Institutions of Higher Learning in Mississippi, and the board attempted to keep a tight rein on students across the state. For the black colleges, this meant students faced particularly harsh repercussions if they engaged in activism of any sort, and especially civil rights actions.

Throughout the 1950s and into the 1960s the college’s president, Jacob Reddix, conceded to the board’s demands in hopes of building a school that could serve its students well. In 1957 Reddix withdrew the school’s basketball team from the NCAA playoffs to avoid facing a white team in the next round of the tournament.  In 1961, when students from Tougaloo College north of Jackson attempted to desegregate the city’s downtown library and were arrested, some 800 students at Jackson State rallied in support, and even attempted a march to downtown.  Law enforcement cracked down on the march, and the president expelled two students believed to be leaders in the actions.  He also suspended the student government. Students learned early the consequences of activism on the Jackson State campus.

But by 1970 Jackson State was a changing institution. A new president, alumnus John A. Peoples, saw himself as part of “the new breed of college presidents,” people who were “proud to be black and who would speak up for the freedom of black people.” He wanted to create a “true university,” and this included expanding student rights and allowing greater opportunities for student expression and racial consciousness. The school newspaper increasingly carried stories about contemporary issues, from the war in Vietnam to issues of black identity. Students brought local activist and Fayette Mayor Charles Evers and Black Power activist Stokely Carmichael to campus. The institution opened its Institute for the Study of History, Life and Culture of Black People, with noted writer and faculty member Margaret Walker as its director. And some students became advocates of Black Power. It was this changing campus that law enforcement assaulted in May 1970.

The horrific shootings at Jackson State happened just 10 days after National Guard troops at Kent State University in Ohio killed four white students and wounded others during a protest against the US invasion of Cambodia. What should readers know about the historical context of these two college campus shootings?

On April 30, President Richard Nixon informed the American public that US troops had invaded Cambodia. People were shocked, and antiwar activists were outraged. Nixon had run on the promise of ending the war, had recently announced troop withdrawals, and was now expanding the war. Protests erupted around the country. On the fourth day of protests at Kent State University, National Guardsmen opened fire on students there, killing four young people and injuring nine more. In the aftermath, students at campuses nationwide organized memorial services and letter writing campaigns, vigils and mock funerals. The largest student strike in US history took hold.

Though there were no established antiwar groups on the Jackson State campus, activists there nevertheless mobilized to protest both the Cambodian invasion and the Kent State shootings. Some 200 to 300 students boycotted classes, and still more participated in a protest rally, and conversations about “KSU/Cambodia/Viet Nam/Draft/etc.” continued on the campus in the days to come. It was in the midst of this nationwide unrest that law enforcement opened fire on students at Jackson State.

The story of the Kent State killings has overshadowed the deadly Jackson State police assault since both incidents occurred. You stress that both incidents were conflated as two shootings of student protestors, but you found that the incidents were very different. What did you learn about how the shootings at Jackson State were seen and why they were so readily forgotten by most, and then virtually disappeared from history?

The important distinction between the two shootings is the white supremacy that caused the shootings at Jackson State and accounted for the amnesia that surrounds them. 

The students at Jackson State College were assaulted because they were black students attending an HBCU in one of the most viciously racist states in the country.  As the President’s Commission on Campus Unrest concluded, “racial antagonisms” were essential to understanding law enforcement’s behavior, as was their awareness that if they fired, they would face neither disciplinary action nor criminal charges.

Though many of the students at Kent State understood this distinction, the broader American public conflated the two events. TIME headlined its piece on the Jackson State shootings “Jackson: Kent State II.” This quick linkage unfortunately hid the essential role of racism in the events at Jackson State College. In the years to come, as anniversaries passed, the media increasingly ignored the Jackson State shootings, using the violence at Kent State to serve as the iconic example of all of the campus conflicts of the era.  If they were all the same, it seemed, one could stand for the whole.  The victims at Jackson State became, as the Chronicle of Higher Education listed them, simply “others who died.”       

What was the historical problem you began with for your book and how did your book evolve during the course of your research?

I initially imagined a book exploring the ways the law and order narrative was used to justify state violence in the latter part of the civil rights era.  I wanted to understand how, as the nation professed to be moving into a more just future in the wake of major civil rights legislation, new mechanisms were employed to continue to repress African Americans.  I planned to write a book looking at how white representatives of the state used violence to control various elements of the black community in the late 1960s and early 1970s—not only college students but also black power advocates, people engaging in civil disturbances, the incarcerated.

I began my research with the shootings at Jackson State College, and stumbled into the opportunity to write this more focused book for Oxford University Press. I jumped at the opportunity, because my initial trip to Jackson State convinced me how important it was that more people learned this story.

What was your research process? For me, the book reads in part like a gripping, scrupulously detailed account of a complex crime.

I began with the archival sources, I think because that felt easiest.  But once I was situated with the essentials of what took place, I realized the book would be hollow, would lack the humanity it demanded, if I did not talk directly with people who had been affected by the shootings—those who knew James Earl Green and Phillip Gibbs, those who were injured, those who were witnesses, those who fought to keep the memory of that night alive, and those who continue the work of commemoration and remembering.

Talking to people, having so many people entrust their story to me, was the most humbling experience I have had as a historian. I will never forget this gift, and the responsibility it carries.

I have felt my duty as a historian in an entirely new way with this work. With any project one needs the research to be comprehensive and careful, the story that is told complex and correct.  With this charged story of violence and loss, and with issues of white supremacy at its center, the responsibilities seemed dramatically heightened. 

Two young African American men were killed in the shooting: James Earl Green and Phillip Gibbs. What would you like readers to know about these men?

People need to understand, most importantly, that they did not deserve what happened to them, that they were murdered by white supremacist members of law enforcement. And people need to know that these young men were full of life, with talents they did not get to develop in full.  And people need to know how beloved they were, how costly their deaths were for their friends, families, communities, and honestly their nation. 

More specifically, people should know that Phillip Gibbs was a junior at Jackson State, where he was studying politics and considering a career in law.  He and his wife Dale had a son, Phillip Jr., and she was, unknown to them at the time, pregnant with their second son, Demetrius. His friends remembered him as a “caring, sharing person,” someone who would “help anyone.” Gibbs was visiting with his sister and her roommate through the window of Alexander Hall just before he was shot.

James Green was just 17 years old and was close to his high school graduation. He was the middle of nine children and had a wonderful sense of humor. His sisters remember his loving personality, and how he could lift a person’s spirits no matter how down they were. And, they told me, “he could make a joke out of anything.” He was killed on his way home from his job at the Wag-a-Bag grocery, where he had worked since he was eleven.

The students at Jackson State were shot by notoriously racist officers of the Jackson police and the Mississippi Highway Patrol. In this instance, it seems the National Guard urged restraint. What are some things you learned about deep-seated white supremacist views of the police officers and their record of relationships with the African Americans in Jackson and Mississippi?

The Jackson city police had a long history of racism and of mistreatment toward African Americans, and were a constant source of harassment for the students at Jackson State.  They had brutalized civil rights activists throughout the decade. Though the force had added a few African Americans by 1970, none of them were officers, and black and white policemen still did not partner or ride together.  It had been the chasing of a student onto the campus that provoked two nights of unrest in 1967, ending in the shooting of four young people and the death of local activist Benjamin Brown. Brown had not been involved in the trouble, and was shot from behind as he ran away from law enforcement.

The Mississippi Highway and Safety Patrol (MHSP) remained an all-white force in 1970, and was well known not only in Mississippi but also nationwide for its brutal white supremacy. Lloyd “Goon” Jones, the on-site commander at Jackson State on the night of May 14-15, might help illustrate this. On the force since 1956, he had been a part of several moments of abuse during the civil rights struggle.  He had arrested Freedom Riders in 1961, he had been among those who failed to intervene in the riots at the University of Mississippi in 1962, he had ordered the tear-gassing of one of the March Against Fear campsites in 1966, and in 1967 had fired his weapon the night law enforcement shot and killed Benjamin Brown. The tapes of his transmissions both before and after the shootings offer a visceral understanding of the utter disregard he felt for African Americans as he used derogatory language, and dismissed the importance of the slain and injured.  Interviews with eyewitnesses and victims, both those by investigators in 1970 and those I conducted, only confirm the absolute absence of humanity of this man, who commanded the MHSP that night.

As you detail, no police perpetrators of the attack were criminally charged or disciplined for their actions. Indeed, they won praise from the white Southern establishment for advancing “Law and Order.” You explore the “Law and Order” narrative and how it came to define responses to African Americans in the Jim Crow era and the recently desegregated South. What should we know about the racially-coded appeal of “Law and Order” to many Americans in 1970?

Perhaps most importantly, the appeal of this racism, cloaked in the language of “law and order,” reached well beyond Mississippi or the South.

Linked to longstanding historical stereotypes that framed African Americans as inherently dangerous, misrepresentations first used to justify slavery, this revised rhetoric of law and order emerged on the American political landscape in Barry Goldwater’s 1964 run for the presidency. Though his candidacy failed, others recognized that some voters were attracted to his criticism of federal intervention on behalf of civil rights, and to his calls for law and order. The success of George Wallace’s 1968 campaign, in which he avoided overtly derogatory language even as he continued to appeal to white supremacy through the language of “states’ rights,” reinforced the idea that there were many white Americans who would welcome the opportunity to cloak their racism in the guise of a concern for law and order. 

Richard Nixon’s Southern Strategy was based in part on the idea that this kind of racist appeal could bring white southern voters, once solidly Democratic, into the Republican party.  He was right.  Part of the tragedy is that these appeals did not go down with the Nixon presidency, but became standard practice across the political spectrum, especially among Republicans, facilitating not only the ongoing crisis of state violence against people of color but also helping to build the carceral state.

What happened with official investigations of the Jackson State police shootings?

There were multiple investigations of the shootings.  The first was a local one, conducted by a bi-racial committee appointed by the mayor. It concluded that there was “no evidence that the crowd in front of Alexander Hall threatened the officers prior to firing.” A more thorough investigation was carried out by the President’s Commission on Campus Unrest.  Supported by a team of investigators, as well as the materials gathered by both the FBI and the mayoral committee, this politically and racially balanced body released a special report on the Jackson State shootings that emphasized the role of racism in producing the shootings. It concluded, “The 28-second fusillade from police officers was an unreasonable, unjustified over-reaction” and “clearly unwarranted.” Neither of these bodies, though, had any power to indict or prosecute.

The justice system proved anything but just.  Though US Attorney General John Mitchell assured the campus that he would order a full investigation, the appointment of Judge Harold Cox to oversee the federal grand jury undercut that promise. Cox was a well-known racist and segregationist who used racial epithets regularly. He had established his white supremacist credentials when he threw out the felony charge of “conspiracy to deprive the victims of their civil rights” against seventeen men in the murder of James Chaney, Mickey Schwerner, and Andrew Goodman during Freedom Summer. 

The federal grand jury produced neither a report nor any indictments.  The Hinds County grand jury modeled itself on the federal one, with the judge borrowing from the words of Cox. Suggesting jurors should see their work as a “bulwark against those who seek . . . to oppress,” the court concluded that no one who engaged in civil unrest or did not extricate themselves from it “has any right to expect to avoid serious injury or even death” if “extreme measures or harsh treatment” proved necessary. This grand jury, too, would return no indictments for the shootings.

It may surprise and upset some readers that wounded survivors of the shootings and families of the fatally-wounded men sued for some compensation, yet they never found justice or restitution. Why was it so difficult for these worthy plaintiffs to find some sort of recompense in the post-segregation South of the early seventies?

In 1972, three of the injured students and the families of James Earl Green and Phillip Gibbs brought a civil suit.  Led by the gifted Constance Slaughter, the first African American woman to graduate from the University of Mississippi Law School, the case was a very strong one.  But it was not strong enough to succeed against the white supremacy of the defense, which relied on the rhetoric of law and order to depict the students as criminals, or of the all-white jury, which ruled for the defendants. 

A victory on appeal proved hollow when the court also ruled that the city, state, and their representatives, were covered by sovereign immunity.  When the Supreme Court refused to hear the case in 1974, the doors to justice were closed. 

You are a distinguished historian and an expert on historical memory. How would you like to see the Jackson State shootings remembered in our history?

This event needs to be part of the way we think about the black power era, the late 1960s and early 1970s.  We need to understand that in the aftermath of the civil rights successes of the mid-1960s, old-fashioned racist violence by law enforcement continued, even as it was dressed up in new rhetoric. 

By 1970 police in Mississippi and elsewhere could not, at least theoretically, gun down unarmed innocents, even if they were black.  The reality is that they could, so long as they could justify themselves as acting in the interests of law and order against a purportedly criminal element.  Being black in the United States was all it took to be cast as such. This reality is laid bare in the horrors of the Jackson State shootings and their aftermath.

Thank you, Professor Bristow, for your illuminating comments and for your original research on this overlooked history. Is there anything you’d like to add about your book or your work as a historian?

Mostly I want to thank you for your willingness to share this story with your readers, and to ask readers to hold the stories of those who were hurt—physically, psychologically, emotionally, and in other ways—in their minds and in their hearts. 

Also, I want to urge readers to recognize the direct linkages between this past and our present.  The crisis of state-sanctioned violence against people of color is ongoing. We must demand accountability when those who represent us as officers of the state--law enforcement officers--act on white supremacy and commit this kind of violence.

Fifty years is too long for these crimes to continue to happen, and to go unpunished by our justice system.

It’s an honor for me to share your thoughtful words, Professor Bristow. I admire your humane and creative approach to history that breathes life into the past and inspires hope for a more tolerant and just world. Thanks again for your generosity and for your pioneering, timely, and provocative history of the 1970 Jackson State shootings. Congratulations on this important new book.

A Mathematical Duel in 16th Century Venice (Excerpt)

 

 

The Secret Formula is the story of two Renaissance mathematicians whose jealousies, intrigues, and contentious debates led to the discovery of a formula for the solution of the cubic equation.

 

Niccolò Tartaglia was an ambitious teacher who possessed a secret formula – the key to unlocking a seemingly unsolvable, two-thousand-year-old mathematical problem. He wrote it down in the form of a poem to prevent other mathematicians from stealing it. Gerolamo Cardano was a physician, gifted scholar, and notorious gambler who would not hesitate to use flattery and even trickery to learn Tartaglia’s secret. In this era when mathematicians challenged each other in intellectual duels held outdoors before enthusiastic crowds, their contentious relationship would change the history of mathematics forever.

 

 

 

Sixteenth-century mathematical duels had a long history, rigid procedures, and a lasting impact on the duelists’ standing in their communities. They also played a hugely influential role in the development of mathematics.

When a duel was to take place, a mathematician or scholar would send another a list of problems to be solved in a given amount of time—the “challenge gauntlet”—after which the recipient would propose a further set of problems to his rival. Tradition required that in case of disagreement a public debate should be held in which the contenders would discuss the disputed problems and solutions in front of judges, notaries, government officials, and a large crowd of spectators. It was not unusual in those duels for tempers to flare, and for personal abuse to take the place of scientific argument. Admittedly, the stakes could be very high: the winner of a public mathematical duel—whoever had solved the largest number of problems—gained not only glory and prestige but possibly also a monetary prize, new fee-paying disciples, appointment (or confirmation) to a chair, a salary increase, and, often, well-paid professional commissions. The defeated contender’s future career, on the other hand, risked being seriously compromised.

At the beginning of 1535, Antonio Maria Fior began a dispute with Niccolò Tartaglia, challenging him to a mathematical duel to be conducted according to the rules of the time. He proposed thirty problems to Tartaglia, who in response sent the challenger thirty other questions. As was customary, the winner would be whoever answered the most questions in a given period of time. On February 22, 1535, Fior and Tartaglia entrusted their respective lists of questions to Iacomo Zambelli, a notary public in Venice, agreeing to hand over the solutions forty or fifty days later. The stakes of the contest? First of all, honor and reputation, and then a lavish dinner at the tavern for each unsolved problem, with the bill to be footed by the unfortunate contestant who capitulated before his rival’s questions.

In the ensuing months, echoes of the Venetian challenge spread well beyond the city, carrying the news of its resounding outcome: Tartaglia had humiliated Fior by solving in a couple of hours all thirty problems posed by his opponent, while the latter had not been able to answer a single one of Niccolò’s questions (the victor, the story goes, magnanimously gave up his right to the dinners). In fact, far more astonishing was the topic on which Fior had challenged Tartaglia. Such details about the contest slowly trickled out of Venice and eventually reached Tartaglia’s birthplace, Brescia, causing many friends to reach out to him to confirm the details of the duel. One such acquaintance, the imperishable Messer Zuanne de Tonini da Coi, wrote:

I heard many days ago that you had entered into a challenge with Maestro Antonio Maria Fior, and that finally you had agreed on this: that he would propose to you thirty really different problems written under seal and in the custody of Maestro Iacomo di Zambelli, notary; and that, similarly, you would propose to him thirty other, truly different ones. And so you did, and each of you was given forty or fifty days to solve the said problems; and you established that, at the end of this period, whoever had solved the largest number of problems would receive the honors, in addition to whatever small sum you had wagered for each problem. And it has been reported and confirmed to me in Brescia that you solved all his thirty problems in two hours, which is something I find hard to believe.

“Everything you have been told is true,” replied Tartaglia, who right away explained how and why he had been able to perform a feat that was to be the peak of his scientific career and a decisive turning point in the history of algebra. 

According to this account, Niccolò Tartaglia discovered the formula for the solution of third-degree equations of the type x³ + bx = c on February 12, 1535, a formula that a few years later would prove crucial for solving any cubic equation. It was an epoch-making, extraordinary result: the first true algebraic discovery since the time of the Babylonians, one that would open up new and boundless horizons for algebra and that posterity would remember as the most significant mathematical result of the sixteenth century.
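For readers who want to see the result itself, it can be written out in modern notation (a rendering added here for illustration; the excerpt does not spell the formula out). For the depressed cubic x³ + bx = c, with b and c positive, the rule Tartaglia found, and Cardano later published in the Ars Magna, gives the root

\[
x = \sqrt[3]{\frac{c}{2} + \sqrt{\frac{c^{2}}{4} + \frac{b^{3}}{27}}}
  + \sqrt[3]{\frac{c}{2} - \sqrt{\frac{c^{2}}{4} + \frac{b^{3}}{27}}}.
\]

Tartaglia hid this same recipe in the verses of his poem; the radicals above are simply modern shorthand for the quantities his lines describe.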

A word of caution, however: Renaissance mathematical duels obeyed a gentlemen’s agreement: a contender could not propose to his opponent problems he did not know how to solve himself. Therefore, if Fior felt confident in challenging Tartaglia with problems involving cubic equations of the type mentioned earlier, then the Venetian abbaco master must have discovered the fateful formula before his Brescian rival—unless Fior had not played fair and sought to intimidate Tartaglia with problems beyond his own ability to solve, just as Tonini da Coi had done five years earlier. On the other hand, if we rule out this possibility, how could a teacher known only for being a good expert in accounting calculations—and, what is more, not particularly cultivated or at ease with algebraic abstractions—have been capable of such an extraordinary undertaking?

Tartaglia’s opinion of Fior’s intellectual capabilities—he did not possess the science but only great experience—is consistent with that of Matthäus Schwarz, the German accountant who had taken lessons from Fior. Once it is established that Fior lacked the education and fertile mind required to find the formula on his own, and assuming that he already knew it when he threw down the gauntlet to challenge Tartaglia, under what circumstances did he acquire this knowledge? And, especially, who was the “great mathematician” who had revealed the secret to him thirty years earlier, and who therefore should have been credited with the exceptional discovery? How was it possible for such a breakthrough to remain secret through all this time? Was Fior the only one to benefit from the confidences of the mysterious personage? And why, after his initial hesitation, had Tartaglia taken Fior’s surprising revelation seriously, causing him to put off his “studies, care, and work” to find—or find again—the crucial formula?

Tartaglia was therefore not willing to reveal the solution process he had used to answer Fior’s questions for fear that Messer Zuanne would infer from it the solution formula for third-degree equations of the form x³ + bx = c, which for the moment Niccolò had no intention of giving away. As mentioned before, in those days it was the custom among mathematicians to refrain from announcing their results; keeping them secret could help attract students as well as serve as a weapon in public contests. Tartaglia followed the custom, as obviously did the unknown “great mathematician” mentioned by Fior, if he existed at all.

Who Can Learn From Taiwan? Apparently not WHO

Ma Ying-jeou (Taiwan) and Xi Jinping (PRC) meet in Singapore, 2015, the highest-level meeting between leaders of the two proclaimed Chinese nations since 1945.

 

 

In early April, World Health Organization Director-General Tedros Adhanom Ghebreyesus pleaded, “please don’t politicize this virus.” Tedros made his remarks after U.S. President Donald J. Trump threatened to withdraw funding from the organization, a threat Trump made reality to the dismay of leaders around the world. States around the globe are dealing with the outbreak of the worst viral infection since the 1918 Flu. COVID-19, the disease caused by the novel coronavirus SARS-CoV-2, has upended societies from where it originated in Wuhan, China, to New York City, United States. Trump’s decision to cut funding for the WHO in the midst of a pandemic was political theater, which Tedros noted and sought to prevent. Tedros’s request that leaders not politicize COVID-19 was valid, but it was also dismissive of the WHO’s political role.

 

By the end of April, Taiwan had recorded 429 COVID-19 infections and 6 deaths. Researchers at Johns Hopkins originally forecast that Taiwan would have one of the highest numbers of infections outside China. How has an island of nearly 24 million, a global transportation hub separated from China by 180 kilometers and just over 1,000 kilometers from where the disease originated, managed to contain the spread of COVID-19 so effectively? To answer that question, one cannot turn to the WHO. This was dramatically revealed when a Hong Kong journalist asked WHO Assistant Director-General Bruce Aylward over Skype about Taiwan’s response to the virus. Dr. Aylward first pretended not to hear the question, then ended the call when asked again. When the journalist called back to get an answer about Taiwan, she was told they’d already discussed China.

 

The WHO’s exclusion of Taiwan is an historical artifact. The United Nations (UN) founded the WHO in 1948. The Republic of China (ROC) was a founding member of the UN and the WHO. China was also a permanent member of the Security Council, along with Britain, France, the Soviet Union, and the United States. Permanent members had, and still have, veto power over matters brought before the council.

 

After 1949 two states claimed to represent the Chinese nation. The Communist Party of China established the People’s Republic of China (PRC) after winning the Chinese Civil War (1946-1949). The Kuomintang, or nationalist party, relocated to Taiwan, where they set up the ROC in exile. One of the communists’ first international acts was an attempt to assume China’s place in the UN. The nationalists, however, controlled the organization’s China seat and, along with their U.S. allies, had no intention of losing the power that came with it.

 

For 22 years the nationalists and their allies kept the PRC out of the UN and its rapidly expanding portfolio of international agencies, such as the WHO. Nationalist delegates to the UN used Beijing’s participation in the Korean War (1950-1953) against U.S.-led UN forces as evidence the PRC was not a “peace-loving state,” a key criterion for membership. Indeed, UN participation in the Korean War resulted from China’s contested status in the world body; Soviet delegates were boycotting the organization in protest of PRC exclusion when the UN voted to intervene in Korea. Taipei and its allies used the Korean War to block discussion of which state represented China in the UN until 1961. As decolonization resulted in more states joining the organization, nationalist representatives and their allies allowed the General Assembly to debate Chinese representation, but argued it was an “important question,” thus requiring a two-thirds majority for any change to the ROC’s status in the UN.

 

In 1971 the UN passed Resolution 2758 (XXVI) that determined the Communist PRC was the legitimate government of China. It also expelled Taipei from all positions in the world body, including the WHO in 1972. Ever since then, Beijing has proactively excluded Taiwan from international organizations, particularly those related to the UN. From 1949 to the 1980s, both Beijing and Taipei adhered to a “one-China policy,” which essentially meant that any organization or state that recognized one could not recognize the other. After Taiwan began democratizing in the 1990s, however, its leaders began taking a more pragmatic approach to international relations. To put it succinctly, Taipei stopped representing the ROC as China. Beijing, however, maintained a one-China policy that included Taiwan. Consequently, Taiwan remains locked out of, or is only able to participate marginally, in many international organizations.

 

The nationalists excluded the PRC from the UN and its affiliated organizations for 22 years. In comparison, the communists have now excluded the ROC for nearly five decades. Taipei’s claim it represented all of China was a fantasy, but no more so than Beijing’s claim to represent Taiwan. When both parties claimed to represent China, Beijing’s exclusion of Taipei was defensible. But since 1993 Taiwan has sought to reengage the international community in its own right, only to be stymied by China. Shortly after the SARS epidemic in 2002 Taipei won the ability to cooperate more closely with the WHO. In recent years, however, Beijing has curtailed Taipei’s participation with the WHO. It may be more than a coincidence that after Taiwanese elected current president Tsai Ing-wen, who favors a more independent Taiwan, the island nation found the WHO less accommodating.

 

The WHO claims it is learning from all areas, including Taiwan. After Dr. Aylward’s interview, the organization put out a statement that it was following the outbreak in Taiwan closely. According to a Reuters article, however, Taiwanese officials remain frustrated that the WHO has not shared Taiwan’s experience controlling SARS-CoV-2 with its members and reports Taiwan’s number of cases under China’s. Taipei was aggressive and proactive in preventing the spread of the novel coronavirus to Taiwan. Taiwan’s success at controlling COVID-19 spread has not gone unnoticed in medical communities. By not differentiating Taiwan’s health measures from China’s, the WHO is doing a disservice to international efforts to control the pandemic.

 

The world is facing a pandemic caused by a novel coronavirus. Taiwan has been successful at controlling that virus and consequently the spread of COVID-19 among its citizens. The global community could learn from Taiwan’s health officials and practices. Maybe it is time for international organizations like the WHO to admit that Taiwan, while still claimed by Beijing as a part of China, neither coordinates with nor is represented by the PRC. The WHO, and other international organizations, exclude Taiwan based on antiquated political considerations. Historical artifacts belong in museums, not international policy.

What Happened to My Future: The Pivotal Years of 1914, 1929, and 2020

This car carried Archduke Franz Ferdinand when he was assassinated. 

Museum of Military History, Vienna. Photo Alexf CC BY-SA 3.0

 

 

As the years 1914, 1929, and 2020 began, many people thought their futures looked good. This optimistic feeling was captured well in the memoir of the Austrian Jewish writer Stefan Zweig: 

 

When I attempt to find a simple formula for the period in which I grew up, prior to the First World War, I hope that I convey its fullness by calling it the Golden Age of Security. . . .

 

. . . The world offered itself to me like a fruit, beautiful and rich with promise, in that radiant summer [of 1914]. And I loved it for its present, and for its even greater future. 

 

Then, . . . [on 28 June 1914], in Sarajevo, the shot was fired which in a single second shattered the world of security and creative reason in which we had been educated, grown up and been at home--shattered it like a hollow vessel of clay.

 

That assassin’s shot, which killed Austrian Archduke Franz Ferdinand, began a sequence of events that by August 4 entangled most of Europe’s powers in World War I (WWI). In 1917, the USA entered the conflict, and later that year the war was among the factors that brought the Communists under Lenin into power in Russia. By the time the war ended in November 1918, about 10 million fighting men had been killed--in France, a horrific 3 of every 10 between the ages of 18 and 28. 

 

In addition to the military deaths, millions of civilians died because of the war. Even after it ended, many more millions of men and women continued to die in the chaos of the Russian Civil War and a continuing worldwide influenza pandemic, both partly due to WWI. 

 

The Centers for Disease Control and Prevention (CDC) estimates those pandemic deaths at a minimum of “50 million worldwide with about 675,000 occurring in the United States,” where in the spring of 1918 it was first discovered among military personnel. The movement and dislocation of people caused by war aided its global spread. Like the war itself, the flu epidemic killed many young adults, those who before 1914 still had hopes for a brighter future. 

By 1920 the world had changed greatly, especially in Europe, where so many young men lay dead. Old empires, like the Austro-Hungarian one, had disintegrated and new nations and borders sprang up. In the USA, the Republican candidate Warren Harding was elected, seeking a return to “normalcy.” “Poise has been disturbed, and nerves have been racked,” he said. What America needed was “not heroics, but healing; not nostrums, but normalcy; not revolution, but restoration . . . not submergence in internationality, but sustainment in triumphant nationality.”

And throughout the 1920s, until the Great Depression began in October 1929, many people tried to reaffirm traditional values. Prohibition began in January 1920 and lasted until 1933. Three Republican presidents dominated the decade. Refusal to join the League of Nations and isolationism characterized our foreign policy, and in the famous Scopes “monkey trial” of 1925, a high school biology teacher was found guilty of teaching Darwinian evolution instead of the biblical account of creation. But rapidly developing technology like the radio, movies, the Ford Model T, and a constant stream of other new consumer products undercut old ways. In Europe Zweig noted that “all values were changed,” and “anything that gave hope of newer and greater thrills… found a tremendous market.” In the USA, despite prohibition there were speakeasies that sold illegal liquor, and new dances that shocked traditionalists. In 1920, women across the USA for the first time were able to vote for a president. A U.S. college president expressed feelings shared by many conservative Americans when he said, “The low‑cut gowns, the rolled hose, and the short skirts were born of the Devil.” 

The cataclysmic events of 1914-1919--WWI, Russian revolution and civil war, influenza epidemic, Paris Peace Conference and the treaties that resulted from it-- greatly affected the young adults who survived. This was especially true for many European veterans who served in the war--notably Hitler and Mussolini (ages 25 and 31 respectively in August 1914)--but also for some who never fought, like the American writer F. Scott Fitzgerald (only 21 when he dropped out of Princeton to join the army in 1917). 

 

In the 1920s he and some of his fellow writers like Ernest Hemingway became known as part of the “Lost Generation.” The latter author began his The Sun Also Rises (1926) with the epigraph “You are all a lost generation.” The term suggested that because of the horrific and often pointless deaths of millions, many of the young had lost faith in abstract ideals and values like patriotism and courage. They were “lost” in being aimless, lacking purpose and drive. In his novel of WWI, A Farewell to Arms (1929), Hemingway has his protagonist Frederic Henry say that “abstract words such as glory, honor, courage or hallow were obscene.”

 

Europe’s writers were even more affected by the war. Some like the poet Wilfred Owen died in battle. Others, as Robert Wohl indicated in The Generation of 1914, remained “under the spell of the experience they had undergone during the critical decade between 1910 and 1920. They were never able to free themselves from the conviction, given lasting shape by the war, that they were living through an apocalypse…. They had yet to absorb what it meant to live in a society characterized by persistent rather than occasional change…. They wavered uncertainly and unpredictably between a desire to spring forward into the future and a longing to return to the hierarchies and faith of the past.” 

 

But the past could not be brought back. The war had changed too much. Unemployment and inflation troubled many European nations, as did left-right political clashes. Unsettled conditions helped Mussolini come to power in 1922 and Hitler to attempt a takeover in Munich in 1923. Although that putsch failed, the Great Depression later helped Hitler become German chancellor in 1933. 

 

Until that cataclysmic economic meltdown began in October 1929 in the United States, conditions looked rosy for many young Americans. At the beginning of 1928 Calvin Coolidge was still president, and in 1925 he had said “the chief business of the American people is business.” On January 1, 1929 The New York Times editorialized that 1928 had been a year “of unprecedented advance, of wonderful prosperity…. If there is any way of judging the future by the past, this new year will be one of felicitation and hopefulness.” Between 1921 and 1929, U. S. industrial production doubled. As compared with pre-war years, many more students were now going to college, and jobs were plentiful. In Middletown, the Lynds’ famous study of 1920s Muncie, Indiana, they indicated that while the state’s population increased only about 25 percent from 1890 to 1924, the number of those graduating from “the State University” increased almost 800 percent. In late 1929, unemployment (3.2%) and inflation (less than 1%) were still low.

 

But then the Depression began, and with it increased misery for many Americans, young and old alike. By 1933 the unemployment rate was 25 percent and, despite Franklin Roosevelt’s New Deal, declined only to 17 percent by 1939. Salaries also decreased for many, like professors, still lucky enough to have a job.

 

In Germany the unemployment rate was even higher by January 1933, which bolstered the popularity of Hitler and his Nazi party. He became chancellor of Germany at the end of that month. For the next six years he increased German armaments and made increasing demands on other countries before beginning World War II in September 1939 by invading Poland.

 

Finally, we have 2020 and the coronavirus pandemic, which by 15 May had sickened more than 4,431,700 people and killed at least 303,000 in at least 177 countries, including at least 86,700 deaths in the USA. How many more will get sick and die is anyone’s guess. Based on past experience and available scientific knowledge, however, the number will be large--one reliable source in early May more than doubled its death forecast, mainly because so many states were beginning to ease social distancing and other requirements. The effects of the pandemic on numerous phases of our life are also likely to be significant. Wearing of face masks, social distancing, massive unemployment, educational and entertainment disruptions, closing of businesses, especially restaurants, an acceleration of online shopping, rethinking of the relationships between governmental bodies (at all levels) and individuals--all of these phenomena and more are now occurring. Like 1914 and 1929, 2020 will likely prove to be a pivotal year for many people, young and old, around the globe.

 

How pivotal will depend in major ways on the results of vital U. S. elections in November. President Trump, who will be seeking reelection, is in some ways like the three Republican presidents elected in the 1920s. Like Harding, he wishes for a return to normalcy, and Trump’s “make America great again” reminds us of Harding’s call for a “triumphant nationality.” Like Coolidge, he champions business interests before those of the needy--though his demeanor is hardly like that of “silent Cal.” And like Hoover, who started off his presidency with an expanding economy and low unemployment only then to plummet into the Depression, Trump has not heretofore handled the most serious crisis of his presidency (coronavirus) well.

 

If the Democratic nominee (presumably Joe Biden) is elected, the pandemic, plus four years of Trumpism, is likely to spur major changes just as the Depression and four years of Herbert Hoover helped produce FDR’s New Deal. Dealing with the coronavirus, plus reversing Trump’s environmental, tax, and medical policies, might just be a start. 

 

Newspapers and journals are full of predictions about what our future might look like. But three of the wisest commentaries come from historians and contributors to the History News Network (HNN). Ed Simon, an HNN contributing editor, states that “what happens next is unclear for all of us,” and he quotes historian Rebecca Spang’s “The Revolution Is Under Way Already” (in The Atlantic): “everything is up for grabs.” Historian Steve Hochstadt, also on HNN, writes “as for what that future will be like, at the moment I haven’t a clue.” But like Simon and Spang he asserts, “we, men and women, can influence how we come out of this chaos.” How we vote in November will dramatically impact that “how”--and all of our futures.

Graduate Students: NOT THE WORST

Cudahy Library, Loyola University, Chicago. Photo Amerique, CC BY 3.0

 

There is a lot to be depressed about if you are a college instructor right now.  Virtual teaching reveals itself, again, as a less than ideal platform for communicating with students and generating classroom camaraderie.  The budgetary forecast gets grimmer by the day as we face plummeting enrollments, pay cuts, furloughs, and lay-offs. 

 

Yet teaching, especially in these abysmal conditions, has buoyed me.  My students are lifting me up with their resilience, creativity, and shockingly good performance.

 

I have the privilege of teaching only one class this semester; and it’s a graduate research seminar.  I know, it’s cushy.  My university made this arrangement with me—front-loading most of my course work in the fall—so I could leave early this semester to teach a Fulbright semester in Germany.  I was not supposed to be sitting here in my Chicago home, grading and writing while supervising my 11-year-old’s virtual learning (I won’t pretend to call this home schooling).  I had been making arrangements for an exciting summer of teaching at the University of Heidelberg, scaling the Alps, cooking courses in Paris, and trips to places I had never visited before.  I do not need to tell you what happened to all that.

 

As lockdown continued into March 40th, 41st, 42nd, I encountered many of the problems we are lamenting to each other on Twitter and Facebook.  The tech failed multiple times before it worked, Zoom discussions felt awkward, and students expressed their shared sense that historical research felt a lot less meaningful than it had weeks earlier.  Their fatigue and frustration set me up to do what my administrators and colleagues had been advising since February—lower expectations. 

 

The first drafts of their papers arrived in my inbox right as stay-at-home orders came down.  My addled lockdown brain could not honestly assess if they were any rougher than drafts of earlier semesters.  Like my colleagues, I kept thinking: “get them through this semester.  Your job is to help them finish as best they can.”  They then dazzled me with their final papers.

 

I still cannot believe it.  When we spoke on Zoom last week, one student was recovering from COVID while another was nursing his infected partner. One was trying to finish her work while starting a full-time archive job, but virtually. Another was still teaching elementary school full-time, but from her living-room sofa.  Several had completely relocated themselves and their belongings to points out of state, either because they were kicked out of campus housing or because family circumstances demanded a transfer.  One moved her elderly parents into her home so she and her spouse could provide care for her mother, who is recovering from a recent cancer surgery.  Thank God for her wicked-sharp sense of humor which somehow still makes appearances, like when she logged into Zoom with a virtual display of hell as her background.  

 

I feel relieved that some of them have had the good sense to ask for incompletes so they could focus their attention on the pressing matters at hand.   The others?  Their work is the sunshine breaking through this cloudy Chicago spring.  Polished writing and vivid details are filling me with joy.  I had expected something less-than; I would have been fine with errors and superficial analysis this time around, but instead they whiplashed me with improvements.  I don’t know how, but they rallied as a collective.  Graduate students are sometimes picked on for their quirky intensity (cue the 30-Rock “graduate students are the worst!” memes), but it is precisely these idiosyncrasies that animate and inspire me at the moment.  My civil-war-history enthusiast asked if he could have an extra day on the project because he was feeling the weight of finishing the very last piece of written work of his entire graduate career.  The care I saw in that day-late draft practically moved me to tears—I don’t think there is one typo.  Another, who scared me by going incommunicado in March, emerged from his isolation with a captivating study of Vietnam veteran artists.  I have another urban history paper that is, I think, almost ready for publication.  Its writer just wrote back a nervous “okay” to my earlier email about sharing it with some colleagues.  

 

I will admit, it’s not just the classwork.  At least one of these students moves me with his masterful kitchen experiments.  Wowed (along with his other followers) by his Instagram posts of sourdough tartines, smothered porkchops and bread puddings, I convinced him to barter some of his chocolate chip cookies for a homemade mask.  Somehow, he also produced a compelling history of burlesque performance in turn-of-the-century Chicago.  Another invited me to participate in a virtual lecture on an architectural tour site about the 19th-century theater scene in Chicago, which was also the topic of his seminar paper.

 

I want to hug them, throw them a party.  But, of course, there’s no way that’s going to happen.  It especially kills me that I can’t celebrate my graduating students.  I am one of those professors who actually likes the ceremony and bravado of commencement, especially at Loyola where we process to the drone of bagpipes—gets me every time.  Damn this virus! Some of my students are already getting ready to move, trying to figure out what to do with the piles of books that the library does NOT want them to cram into campus return boxes.  I just imagine them closing the trunks of their cars and driving off… no applause, no toasts, no fist bumps (er…elbow bumps?).  

 

New York Times columnist David Brooks recently broke from his usual rallying cry for broad-based liberal arts curricula to speculate that health-care workers are showing great heroism in the moment because their hard-core education experiences have given them exceptional training and grit.  “While most academic departments slather students with A’s,” he writes, “science departments insist on mastery of the materials.”  I do not dispute the problem of grade inflation; and I think that doctors, nurses, and first-responders deserve our respect for their sacrifices.   Still, the supposedly “coddled” students in my history class have shown me what it means to rise to the occasion and perform like the world is watching—even when no one is.  They don’t need a grade boost from me.

Coronavirus Exposed the Contradictions Behind the Curtain. Will Americans Pay No Attention?

American Nazi Party leader George Lincoln Rockwell, 1967. CC0 

 

 

Historians study the past in its own right, but also to better understand the present and even to anticipate the future.

 

The coronavirus crisis has illuminated historically rooted contradictions in American life whose implications are beyond profound. The virus is not the cause of the deep-seated economic, political, and social crisis that now imperils the American nation. Rather, it has acted as an accelerant and a powerful illuminator.

 

Efforts to mitigate the virus, to be sure, have taken a wrecking ball to the American economy but, President Donald Trump’s boasts notwithstanding, the economy was already teetering despite the apparent boom that preceded the virus. Economically, the United States has become a plutocracy. The unprecedented maldistribution of wealth and equally unprecedented public and private indebtedness promised eventual economic turmoil, a bursting of the bubble. The only question was when it would arrive. The virus has now answered that question. The odds are great that recovery, again contra Trump, will not come quickly.

 

Turning to politics, the presidential election in 2020 is nothing less than the most important in American history since 1860 and it, too, could have earth-shaking consequences. A Trump reelection will not only affirm the plutocracy and accelerate horrendous environmental degradation as well as militarization of the planet, it will moreover signal the failure of American democracy.

 

Americans have enthusiastically elected and reelected flag-draped populist reactionaries before—Andrew Jackson and Ronald Reagan are the two best examples—but the stakes have never been greater than they are today. Unanticipated and quirky developments in history happen all the time, and perhaps the 2016 election will be seen as such, but the reelection of such a palpably reckless narcissistic demagogue as Trump will lay bare that the United States is a deeply dysfunctional democracy with an electorate too uneducated and too politically immature to make choices remotely coincident with the challenges that we face. The anti-science responses to climate change and now to the virus are two of the most significant of many examples one could cite of the devastating consequences of ignorance.

 

The United States retains substantial social strength and sources of cohesion—neighborliness, decency, generosity, and national patriotism—but it also nurtures, and always has nurtured, a deeply reactionary strain. There is no denying that a virulent white supremacy lies at the center of the Trump phenomenon. Anyone who seriously studies American history should grasp that these reemergent racist, reactionary elements—the same forces that ensured the persistence of slavery well after other countries had abandoned it, that produced an enduring apartheid state, and that buttressed a persistently imperial foreign policy—remain powerfully ensconced in our body politic.

 

History thus helps illuminate the present crisis, but what can be done about it? Bernie Sanders had an answer: democratic socialism. The virus has powerfully illuminated the weaknesses of our economic and political system that democratic socialism--or whatever you want to call it--could address through a more equitable distribution of wealth and social welfare, science-based national health care, a coherent public health response to the pandemic, environmental protection and emission controls, arms control (rather than escalation) in concert with a more cooperative internationalism that replaces solicitude for reactionary regimes.

 

Liberal centrists grapple with these issues around the edges rather than attacking them head on, hence they do not inspire and they do not deliver meaningful change. Joe Biden, Nancy Pelosi and many others, albeit sometimes well intentioned, have proven better at accruing personal wealth than effecting meaningful political change.

 

The aversion to democratic socialism is equally rooted in American history. It is a consequence of decades of global cold war and the suppression of the American political left that went along with it. This is a critical story that some historians have grasped, yet overall it has been woefully under-analyzed and under-appreciated. The left remains suppressed but it, too, is trying to reemerge, as popular support for Sanders in his two campaigns has shown. If liberal centrism, kept afloat by Obama, continues to founder under his would-be successors, the choice may be between socialism and an American variant of fascism that Trump so well personifies.

 

It is not altogether clear what choice the American public would make.

Tax Protesting on the Cheap

 

 

My years in grad school (1964-68) coincided with LBJ’s escalation of the Vietnam War. The more I learned about our actions, the less defensible they seemed. Continuing and even expanding the war was neither humane nor in our country’s Realpolitik best interest. Finally, I vowed not to support it with my tax dollars.

 

By my calculation, about half of our budget went to the military, especially if one included interest on the national debt, which had largely stemmed from World War II, the Korean War, and now the Vietnam War.

 

But how to protest? From my Harvard teaching stipend, the government already took out more tax dollars than I owed.

 

Luckily, I was the “proctor” of the “Harvard Co-op House,” which meant I lived and ate free, so I had saved a few hundred dollars from my fellowship and invested them in stocks. My choices had made money, so I owed taxes on my profits. Indeed, I owed exactly $9.96.

 

I filled out my IRS tax return and paid exactly half. Then to my tax return I attached a letter explaining why my check was for $4.98.

 

For weeks I heard nothing. Well, each month I got a reminder notice that my obligation had gone up, with penalties and interest, first to $5.20 and then I think $5.50. Then one morning an undergraduate rushed up to my room. “Loewen, Loewen!” he gasped, “there’s an IRS guy downstairs in a trench coat and he’s asking for you!”

 

His anxiety was contagious. I combed my hair, straightened my clothing, and came down the stairs, heart pounding.

 

The man indeed wore an ominous dark trench coat, but he proved to be a sweetheart. “Sorry to bother you,” he began, “but it’s about your taxes.” He went on to tell me that he understood my motivation perfectly; his own son was doing the same thing! He went on to explain that he wasn’t a collector. His job was to explain to me my options, which were three:

 

-- “You could pay what you owe.” I was considering doing just that, but he went on immediately, “I don’t suppose you’ll do that, or you would have paid in the first place.” “Oh, right,” I replied.

 

-- “You could do what Joan Baez and all those other celebrity tax resisters do,” he went on. “Tell us where your assets are, and the government will seize the money. That way, you’ll be off the hook, but not through any act of yours.”

 

That sounded a bit hypocritical to me, so I asked, “What’s the third option?”

 

“Oh, we’ll hound you to death,” he replied.

 

That actually sounded interesting. “I’ll take number three,” I said.

 

“OK, then,” he said, explaining again that his job that morning was to give me my options and record my response.

 

Later that day I went to my two banks, one where I had my checking account and the other a small savings and loan where I had my savings, and closed both accounts. I figured why make it easy for the government to seize my assets?

 

I had forgotten about my paycheck, however. At the end of the month, my usual envelope came from Harvard with my salary for being House Tutor. It contained not just my usual check, but a letter on government letterhead, signed, or at least stamped, by the Treasurer of the United States. He explained that the government had garnished some $6 from me, which explained why my check was short that month. Since a citizen cannot sue the government without its consent, the letter went on, my only recourse was to ask the Treasurer to give it back to me.

 

That would be unlikely to succeed, I concluded. But what if the government had made a simple error in computing my taxes and instead of $6 claimed I owed $6,000,000? Could they garnish all my pay? Forever? With no recourse? A scary imbalance of power!

 

Although I had lost $6, I was satisfied. Surely I had cost the government hundreds of dollars to collect my six. That had to have registered somewhere in the bureaucracy.

 

After finishing my degree, I moved to Mississippi to teach at Tougaloo College. One or two war protesters had flatly refused to pay taxes in Mississippi and were facing prison terms in Parchman as I recall, even though Parchman Penitentiary is a state facility. Again, however, that option was not open to me, because at year’s end, the government would owe me money back from my withholding. However, I now had my own phone, which meant I now had my own phone bill. Every month I paid not only Southern Bell (the temptation to add an “e” is almost overwhelming) for my calls but also an “excise tax” to the federal government. This 10% levy was put on phone bills during World War II, explicitly as a war tax. After the Korean War ended, it was being phased out, when Lyndon Johnson reimposed it to help pay for the Vietnam War. Across the country, antiwar activists were refusing to pay it.

 

At first, phone companies simply carried over the unpaid amount as if it were part of the phone bill. When the amount grew too large and went unpaid for too long, they shut off the phone. Having no phone made it hard to participate in modern society. By 1968, most phone companies were taking the reasonable position that they were not a collection agency for the federal government.[i] They reported the unpaid assessment to the government, then dropped the amount from the next bill. Then they repeated the process for that month’s new excise tax.

 

In Mississippi, no one had raised the issue, so Southern Bell kept piling up my unpaid amounts. I could see where this was headed, so I phoned them. (Back then, you could phone the phone company.) As soon as they heard my plea, with the precedent of other companies elsewhere, they changed their policy to match. Now my excise tax obligations were piling up where they belonged, between the federal government and me.

 

Again, the amounts were miniscule – less than $2/month. Again the government spent much more money threatening me each month. I honestly don’t remember how this tempest wound down, but I think I might have taken the easy way out. I was now engaged in social change in Mississippi, a full-time calling, as well as teaching at Tougaloo, and didn’t have time or energy to spare. But I still salute those heroes, from Joan Baez and Jane Fonda to lesser-known folk like Steve Trimm and John McAuliffe, who opposed the war with their whole being. They played a major role in ending it. Me, only a minor one.

 

[i] This position may have resulted from a court ruling.

 

The Unfulfilled Potential of Progressive Democracy after World War II

British Minister of Health Aneurin Bevan on the first day of National Health Service operation, July 5, 1948.

Photo Liverpool University Faculty of Health and Life Sciences, CC BY-SA 2.0

 

 

Looking back in 1999, the media pondered the twentieth-century’s most significant event. Among several credible possibilities, wise voices argued for the defeat of the Nazis -- for what kind of world would we have otherwise inhabited during the rest of the century?  Yet victory in the titanic conflicts of World War II did not in itself determine what would ensue.  In the three allied democracies -- Britain, the United States, and Resisting France  -- progressive forces girded to oppose a drift back to pre-war normalcy and to promote a transformational agenda for the postwar.  The wartime experiences of Britain, France and the U.S. of course differed profoundly, as did the trajectories of the postwar struggles in the three historic democracies. But the progressive impetus in each nation reflected a common assumption: victory over Nazism would be drained of meaning if society, economy and government at home simply slipped back into their customary pre-war ways and inequities.

 

Well before V-E Day, progressive impulses crystalized in three contemporaneous but independent manifestos, with comparable themes and aspirational values.  In France, the National Council of the Resistance (CNR), established in 1943,  coordinated various movements of the internal Resistance and formally linked them to General de Gaulle’s Free French in London and Algiers. In March 1944, after extensive debate, the clandestine, sixteen-member council unanimously adopted a Common Program for  post-Liberation France. In the U.S. the CIO labor federation mounted an unprecedented effort for the Democratic Party in the election of November 1944. As an inspiration, the CIO-PAC revived FDR’s earlier call for an expansion of the New Deal with an “Economic Bill of Rights” -- which the president himself had then relegated to a bottom drawer -- and infused its People’s Program for 1944 with that vision. In Britain, as the end of the wartime coalition government finally approached, the Labour Party’s long-gestating social democratic program could be launched as its manifesto, Let Us Face the Future, for the general election of June 1945, the first in ten years.

 

Each agenda rested on expansive visions of human dignity, equal opportunity, and social rights. Each sought to promote “full employment,” enhanced social security, decent wages for workers, and expanded trade union rights. Each insisted that a healthy democracy must regulate its economic system to promote greater productivity and a sounder distribution of wealth to undergird mass purchasing power – to increase economic growth in comparison to past performance and to better distribute its bounties.

 

Hanging in the balance were such matters as priorities for postwar reconstruction or reconversion; the possibilities for national planning to stimulate economic growth; extending price controls to prevent a damaging wage-price inflationary spiral; what sectors (if any) to nationalize and on what terms; trade union powers as against the prerogatives of management; the potential reach of social security benefits; the allocation of veterans benefits; the expansion of educational opportunity at various levels; and how to meet desperate needs for new, affordable housing. In the broadest terms, would the elusive ideals of greater equity and social solidarity become core democratic values?

 

The CNR Common Program served as a cement for France’s postwar governments, starting with de Gaulle’s provisional unity cabinet after the Liberation in 1944-45, followed by elected “tripartist” governments sans de Gaulle in 1946-47. The  new tripartist alliance of communists, socialists, and progressive social-Catholics had emerged from the wreckage of the Nazi occupation and the Vichy regime.  The tripartist years saw the nationalization of utilities, transport, and large banks; planning for industrial modernization; the expansion of social security, family allowances for children, and national health insurance; and unprecedented powers for trade unions in negotiations and over shop floor issues. The failures of tripartism included a comprehensive blueprint for educational reform in a democratic spirit that ended up on the shelf.  

 

After its decisive electoral victory in July 1945, the British Labour Party struggled with the severe shortages of dollars to pay for imports of food and raw materials. The Attlee government used rationing and budgetary strategies to sustain “fair shares” and mass purchasing power without runaway inflation, and improvised central planning for investment in a “mixed economy.” On another track, Labour designed a comprehensive  welfare state with a National Health Service at its core, and a remarkable public housing program featuring semi-attached houses rather than high-rise towers. Labour nationalized energy, utilities, transportation, and (after prolonged hesitation) steel; insured strong trade union bargaining powers; and constantly reiterated egalitarian values, including a commitment “to raise the living standards of the people as a whole.”  

 

In the U.S. the Truman administration and Congressional progressives fought mightily in 1945-46 for a major extension of wartime price controls to protect consumers, and for a Full Employment Bill in 1946 to assure that “every man and woman in the country who is willing to work and capable of working has the right to a job.” The bill would have instituted economic planning so that federal “investment and expenditure” might preempt any major recession.  Progressives lost both those hard-fought struggles, leaving only the G.I. Bill of Rights as a pathway to social change.  Then in the mid-term election of November 1946 the Republicans won control of Congress and focused on two policies: a “rich man’s tax cut” -- as Truman called it when he successfully vetoed the bill -- and the Taft-Hartley labor relations bill, which restricted the rights of trade unions and which Congress handily passed over Truman’s veto.

 

Was America, then, an exception to the postwar progressive tide in the Western democracies? The British Labour Party, at any rate, did not believe so as it celebrated Truman’s fierce campaign for an expansion of the New Deal in 1948:  “Fortunately the re-election of President Truman has shown that the American people are more progressive than most of their newspapers and Congressmen. The Fair Deal, backed by a politically conscious labor movement, is based on… moral principles which inspire our own socialism….  Over a wide field the Truman Administration and the Labour Government have the same interests and ideals – and the same enemies.”

 

What was true in 1948 still applies. For as historian Marc Bloch cautioned: “more or less unconsciously every national tradition of historians has elaborated its own vocabulary in which issues are problematized. Different [national] schools of historiography almost never ask the same questions.” That caution certainly applies when considering the lineage of progressive forces in the three historic democracies. In the postwar moment the shapes and trajectories of those forces differed, but all struggled in the electoral and legislative arenas for national health care provision, affordable housing, “full employment,” trade union power, and broader educational opportunity. Outcomes unsurprisingly varied, but the parallel struggles should impress and inspire.

 

Never definitively resolved in Britain, France, or the U.S., those issues have resurfaced time and again through changing contexts and political reversals up to the fraught present, even as new kinds of issues (including global financialization, gross income inequality, climate change, and the erosion of democratic institutions) have begun to dominate argument in the public square. At the moment, progressive forces seem suddenly laid low: the French Socialist Party at its nadir, the Labour Party a shrunken and powerless opposition in the UK, and the Democratic Party in the U.S. momentarily stifling its left-wing standard-bearers. It is easy enough to believe, however, that progressive forces will regroup and rekindle their struggles, old and new.

https://historynewsnetwork.org/article/175513

Amid the Coronavirus Pandemic, America’s Billionaires Thrive and Prosper

The superyacht Bravo Eugenia, owned by Jerry Jones. Photo: Nicolas SYT, CC BY-SA 4.0

 

Although most Americans currently face hard times, with unemployment surging to the levels of the Great Depression and enormous numbers of people sick or dying from the coronavirus pandemic, the nation’s super-rich remain a notable exception.

Financially, they are doing remarkably well.  According to the Institute for Policy Studies, between March 18 and April 28, as nearly 30 million Americans applied for unemployment benefits, the wealth of America’s 630 billionaires grew by nearly 14 percent.  During April 2020 alone, their wealth increased by over $406 billion, bringing it to $3.4 trillion.  According to estimates by Forbes, the 400 richest Americans now possess as much wealth as held by nearly two-thirds of American households combined.

Some of the super-rich have fared particularly well. Jeff Bezos (the wealthiest man in the world) saw his wealth soar between January 1 and early May 2020 to $142 billion―an increase of $27.5 billion. During that same period, Elon Musk’s wealth grew by $11.4 billion to $39 billion, and the wealth of Steve Ballmer (ranking sixth) increased by $8 billion to $66.1 billion. The gains of Mark Zuckerberg (ranking third) were more modest, but his wealth did rise to $79.3 billion.

Although some billionaires lost money, this was not likely to put them out on the streets.  The wealth of Bill Gates (ranking second) dropped from about $113 billion to $106 billion, while the wealth of Larry Ellison (ranking ninth) slipped from $58.8 billion to $58.7 billion.

During this time of economic crisis, two features of the U.S. government’s economic bailout legislation facilitated the burgeoning of billionaire fortunes:  first, the provision of direct subsidies to the wealthy and their corporations, and, second, the gift of huge tax breaks to rich Americans and their businesses.  Consequently, although the U.S. economy continues to deteriorate, stock prices, helped along by this infusion of cash, are once again soaring.

In terms of health, American billionaires are also doing quite nicely, with no indication that any of them have been stricken with the coronavirus.  When news of the disease hit, billionaires immediately began renting superyachts at fantastic prices to ride out the pandemic.  As one yacht broker explained, a yacht “in a nice climate isn’t a bad place to self-isolate.”  Such yachts can carry supplies that will last for months, and “clients are arranging for their children to be schooled on board, with cooking lessons from the yacht’s chef and time with the crew in the engine room learning about technology.”  Other super-wealthy Americans took refuge in their fortress-like country estates or flew off in their private jets to fashionable, secluded areas.

Of course, the ability of the rich to stave off a serious or fatal illness is enhanced by their easy access to the best of medical care.  On Fisher Island―a members-only location off the coast of Florida where the average income of residents is $2.2 million per year and the beaches are made from imported Bahamian sand―the residents, unlike other Floridians, had no problem purchasing thousands of rapid Covid-19 blood tests.  To secure immediate and near-unlimited access to healthcare, including such tests, billionaires often employ “concierge doctors”―for a hefty annual fee.

Naturally, thanks to their soaring wealth and relatively secure health, America’s billionaires are able to continue the kind of lifestyle to which they are accustomed.

Housing is not a major problem.  Although journalists have trouble keeping track of the bewildering array of mansions purchased by America’s billionaires, Jeff Bezos reportedly owns 14 homes, including a newly-acquired $165 million Beverly Hills mansion.  Another of his lavish dwellings, located in an exclusive section of Washington, DC, contains 11 bedrooms and 25 bathrooms. Although Mark Zuckerberg apparently possesses only 10 homes, Larry Ellison has bought dozens of incredibly expensive mansions and real estate properties, plus (at a price of $300 million) a Hawaiian island.

Just how many homes Bill Gates owns remains unclear, as he has made a number of secretive real estate purchases. Nevertheless, they are known to include multiple luxurious horse ranches scattered across the United States. He spends most of his time, though, at Xanadu 2.0, his $127 million, 66,000-square-foot mansion in Medina, Washington. Requiring 300 workers to construct, this behemoth contains a host of unusual high-tech features: a trampoline room with a 20-foot ceiling, six kitchens, a dining room able to seat up to 150 people for dinner, a 22-foot-wide video screen, a home theater, garages for 23 cars, and 18.75 bathrooms. There is also a lakefront shore, replenished every year with large quantities of sand delivered by barge from the Caribbean.

The super-rich have more than enough wealth to squander upon a variety of extravagant items.  Their superyachts cost as much as a billion dollars each, and boast such fixtures as night clubs, swimming pools, helipads, and even missile defense systems.  In 2019, the United States ranked first in the world in superyachts, with 158 in operation.  Many billionaires also own private superjets, such as the $403 million (“before any customization work”) “Queen of the Skies,” featuring a full office, bedroom, and “a stately dining room that can be converted into a corporate boardroom.” (Both Jeff Bezos and Bill Gates are superjet owners.)  Moreover, the ultra-affluent possess luxury car collections, multiple passports (available for millions of dollars), and gold toilets.

Billionaires do face problems, of course, including boredom, finding the necessary household “help,” fending off challenges from their increasingly desperate workers, and defeating politicians who dare to champion taxing great wealth to fund vital public services.

Nevertheless, the vast gulf separating the lives of the super-rich from those of most Americans raises the question of whether this small, parasitic stratum of U.S. society should be maintained in such splendor. Many Americans might already be wondering about this as they cope with economic collapse and a mounting death toll.

https://historynewsnetwork.org/article/175516

What the New Evidence on Pope Pius XII and the Holocaust Means

 

 

Researchers combing through the recently opened wartime records of the Vatican have discovered that a senior papal adviser, Angelo Dell’Acqua, told Pius XII in 1942 that reports of the slaughter of European Jews were unreliable because Jews “easily exaggerate.” Despite the numerous other reports Pius XII had received, which amply documented the atrocities, he chose to embrace Dell’Acqua’s perspective.

 

This new evidence dovetails with an episode that emerged from declassified U.S. government files many years ago. Those documents revealed that in January 1943, the American ambassador to the Vatican, Harold Tittman Jr., asked the Pope why his recent Christmas message, in which he mentioned the persecution of innocent people in Europe, did not mention either the Jews or the Nazis.

 

Tittman reported back to Washington that Pius XII replied “that he ‘feared’ that there was foundation for the atrocity reports of the Allies [about the Jews] but led me to believe that he felt that there had been some exaggeration for purpose of propaganda.”

 

Some government officials in London and Washington likewise were inclined to doubt reports about Jewish suffering—especially if the reports came from Jews. Regarding German atrocities against Jews in occupied Poland in 1940, Reginald Leeper, the chief of the British government’s Political Intelligence Department, wrote to a colleague: “As a general rule Jews are inclined to magnify their persecutions.”

 

“Sources of information [about anti-Jewish atrocities] are nearly always Jewish whose accounts are only sometimes reliable and not seldom highly colored,” Foreign Office staff member I. L. Henderson insisted. “One notable tendency in Jewish reports on this problem is to exaggerate the numbers of deportations and deaths.” He wrote those words in early 1945, nearly six months after the liberation of the Majdanek death camp.

 

A State Department memorandum to the Office of Strategic Services in August 1942, summarizing the latest reports it had received on the Nazis’ plans to annihilate European Jewry, asserted that the information seemed to be nothing more than a “wild rumor inspired by Jewish fears.”

 

There is no record that President Franklin D. Roosevelt disbelieved the reports concerning the Nazi mass murder. But his characterizations of Jewish suffering sometimes bordered on dismissive.

 

Consider, for example, Roosevelt’s response in October 1940, when refugee advocate James G. McDonald clashed with Assistant Secretary of State Breckinridge Long, who was in charge of refugee affairs for the State Department and vehemently opposed more immigration. Both men took their cases to the president. Long wrote in his diary afterwards that Roosevelt was “100% in accord with my ideas.” FDR also told Long that when McDonald came to the White House, the president “told him not to ‘pull any sob stuff’ on him” regarding the plight of refugees seeking a haven.

 

Walter Kirschner, a longtime family friend of the Roosevelts, encountered a somewhat similar attitude when he approached the president regarding the plight of European Jewry, in 1944. According to Kirschner, FDR replied: “I haven’t time to listen to any Jewish wailing.”

 

Why did Pope Pius XII view reports of anti-Jewish atrocities as “exaggerations”? What prompted President Roosevelt to dismiss “Jewish wailing” and “sob stuff”? 

 

There is no way to know if either man was personally influenced by the centuries-old stereotypes of Jews as incorrigible liars who fabricate claims in order to advance their interests. 

 

However, there can be no doubt there were practical considerations at stake. In the case of the Pope, the “Concordat” which the Vatican signed with Hitler in 1933 pledged that German Catholic clergy would refrain from political activities in exchange for the regime’s non-interference in German Catholic religious life. Despite the Nazis’ many violations of the agreement, Pius XII insisted on keeping his end, even to the point of not raising questions about the mass murder of the Jews.

 

President Roosevelt, for his part, had a practical consideration of his own. Fully acknowledging the extent of the Jews’ plight and drawing attention to it could have policy implications. Senior State Department official R. Borden Reams candidly noted in a 1942 memo to a colleague that if the administration publicized the mass murder, “the way will then be opened for further pressure from interest groups” for steps such as admitting more refugees to the United States—something FDR and his administration strongly opposed.

 

In the end, whatever their respective reasons, the Pope and the president both opted to look away from the greatest humanitarian crisis of their time.

https://historynewsnetwork.org/article/175517

Roundup Top Ten for May 15, 2020

We Must Not Forget The Jackson State Massacre

by Robert Luckett

The killings at Jackson State in 1970 should be a reminder that state-sanctioned violence aimed at the marginalized remains a systemic part of American life.

 

‘Mrs. America’ Reminds Us that More Women in Politics Won’t Necessarily Mean More Liberal Policies

by Leandra Zarnow

The recent streaming hit "Mrs. America" underestimates the influence of New York Representative Bella Abzug on future progressive politics and the complex impact of women's activity in the political arena.

 

 

Shanghai’s Past, Hong Kong’s Future

by James Carter and Jeffrey Wasserstrom

The story of Hong Kong and Shanghai isn’t simply a defining story of the last two centuries of Chinese history. It is really the story of all world cities around the globe today: how they thrive and how they decline.

 

 

Brave New Classroom: Lessons from the First Six Weeks

by Hannah Leffingwell

This crisis has pulled the ground out from under us all—professors, researchers, doctoral students, and undergraduates alike. But in truth, the weak spots revealed through this crisis have long existed.

 

 

Jogging Has Always Excluded Black People

by Natalia Mehlman Petrzela

The most enduring legacy of the racialized experience of recreational running is the surveillance and suspicion to which black people have long been subjected when using public space.

 

 

How We Got to Sesame Street

by Jill Lepore

The beloved children's program grew out of an effort to use America's overabundance of televisions to compensate for its dearth of preschools in the 1960s. Jill Lepore assesses how the show has changed along with society.

 

 

Will Covid-19 Lead to Men and Women Splitting Care Work More Evenly?

by Sarah Keyes

History shows men have always been able to handle care work — when they have to.

 

 

Flight Status

by Sarah Rose

During the Vietnam War, the women who served on special Pan Am flights flew into a war zone to transport soldiers. Why has their role been forgotten?

 

 

An Unlikely Bohemia: Athens, Georgia, in Reagan's America

by Grace Elizabeth Hale

Athens kids built the first important small-town American music scene and the key early site of what would become alternative or indie culture.

 

 

How Covid-19 Exposed the Deep Divide between White Rural Georgia and Atlanta

by James C. Cobb

Rural antagonism toward Atlanta has been a defining element in Georgia politics for almost 150 years and today underlies Governor Brian Kemp's potentially disastrous response to Covid-19. 

 

https://historynewsnetwork.org/article/175503