Experts Beware: Is America Headed for a Scopes Moment over Critical Race Theory?

Clarence Darrow and William Jennings Bryan during the trial of John Scopes, Dayton, TN 1925.

 

 

In a recent debate over a law to ban the teaching of Critical Race Theory, Tennessee legislator Justin Lafferty (R) explained to his colleagues that the 3/5th Compromise of 1787, used to determine a state’s representation in Congress by counting enslaved people as “three fifths of all other Persons,” was designed with “the purpose of ending slavery.” Lafferty had his facts spectacularly wrong, but that did nothing to derail the law’s passage.

 

Anti-Critical Race Theory laws like the one passed in Tennessee -- as well as in Texas, Iowa, Oklahoma, and Florida -- do not just aim to push back against the heightened awareness of the nation’s history of racial injustice that followed the popularity of the 1619 Project and last summer’s massive protests over the murder of George Floyd. They are also attacks on educators -- and on expertise itself. As Christine Emba explained in a recent Washington Post article on conservatives’ current obsession with Critical Race Theory, “disguising one’s discomfort with racial reconsideration as an intellectual critique is still allowed.” Not only is it allowed in these public debates, it is an effective strategy for curbing movements for social change. It is also not new.

 

A century ago, a similar right-wing outrage campaign was launched against the teaching of evolution in public schools. The 1925 Scopes “Monkey Trial” remains a touchstone of that era of conservatism. When John Scopes, a substitute teacher in Dayton, Tennessee, was charged with violating a new state law against teaching evolution, the case became an international story. Scopes was found guilty and fined $100.

 

The Scopes Trial’s legacy rests perhaps too comfortably on defense lawyer Clarence Darrow’s skewering of the anti-evolution hero William Jennings Bryan in that hot Tennessee courtroom, memorialized in the play (and film) Inherit the Wind. Darrow’s withering questioning made Bryan appear ignorant and incurious. In response to Darrow’s questions about other religious and cultural traditions, Bryan acknowledged that he did not know about them, but added that he did not need to know since through his Christian faith, “I have all the information I need to live by and die by.” 

 

Bryan’s responses were more clever than the popular legend of the trial might lead us to believe. By asserting that he did not need to know what Darrow and the scientists knew, Bryan was calling into question the social value of modern expertise itself. When Darrow asked if Bryan knew how many people there were in Egypt three thousand years ago, or in China five thousand years ago, Bryan answered simply, “No.” Darrow pressed on, “Have you ever tried to find out?” Bryan: “No sir, you are the first man I ever heard of who was interested in it.” Translation: experts studied subjects that no one needed to care about. When asked if he knew how old the earth was, Bryan again responded he did not, but added that he could “possibly come as near as the scientists do.” Here Bryan rejected the premise that the experts really knew what they were talking about any more than he -- presenting himself to the court and the public as a simple man of faith -- did.

 

The legacy of these tactics is on full display today. As David Theo Goldberg wrote in the Boston Review recently, Republican critics of Critical Race Theory “simply don’t know what they’re talking about.” Goldberg is correct of course, but their ignorance is not a hole they are looking to fill anytime soon. It is rather both a shield and a weapon used to go on the offensive against the experts themselves. What the experts “know” about the 3/5th Compromise or the history of racial injustice generally (or climate change, or the dangers posed by COVID-19, or the outcome of the 2020 election) threatens their beliefs in how American society should look and function.   

 

Similar to what we’re seeing today, the attack on the teaching of evolution in the 1920s was an effective means of challenging all manner of troubling developments that always seemed to emanate from the latest pronouncement of some expert somewhere. Mordecai Ham, for example, was a popular Baptist preacher who would later convert Billy Graham. In a 1924 sermon he moved seamlessly from attacking evolution as false to warning parents that having Darwinism taught to their children would assuredly lead to communism and sexual promiscuity. He thundered, “you will be in the grip of the Red Terror and your children will be taught free love by that damnable theory evolution.” That Ham skipped effortlessly from the teaching of evolution to Bolshevism to free love makes sense only if one remembers that winning a debate over evolution was not the goal -- condemning the modern-day teaching of evolution was. Evolution thus served as the entry point to attack educators and expertise in general as existential threats to their way of life.

 

After Bryan’s death in 1925 sidelined the evolution debate, conservatives continued to connect expertise with unwelcome social change. When University of North Carolina (UNC) sociologists began to investigate the often-poor living conditions in nearby textile mill villages, David Clark, publisher of the Southern Textile Bulletin, the voice of the powerful textile industry, became irate. Clark was convinced that university sociologists were not “just” interested in research. In response, he accused the school’s experts of promoting “dangerous tendencies” and “meddling” in the business community’s affairs. The university, he charged, “was never intended as a breeding place for socialism and communism.” UNC sociologists like Howard Odum, a fairly conservative but well-respected expert, were taken aback by Clark’s virulence. But, as it had been for Bryan and Ham, linking expertise with radicalism was central to Clark’s strategy.

 

As Goldberg observed of today’s critics of Critical Race Theory, David Clark actually knew very little about sociology or socialism. This became clear when UNC invited him to campus in 1931 to make his case before the faculty and students themselves. During the question and answer period an exasperated audience member asked Clark if he knew what socialism actually was. He responded: “I don’t know, and I don’t think anybody else does.” A newspaper account recorded that “The audience fairly howled.”

 

Clark’s followers would not have been bothered by his concession on socialism -- and they would not have been surprised that the university audience laughed at him. Once again, the goal was not to win a debate over socialism; it was to stop social change they objected to. The experts represented a movement aimed at them, they believed -- a movement that also seemed to take delight in pointing out all that people like David Clark did not know. UNC, so proud of its accomplished faculty, was actually, in the conservatives’ view, a “breeding ground for reformers” and “radicals.”

 

The factual misstatements by today’s Republicans can seem breathtaking to those who value living in an evidence-based reality. These include historical errors like Representative Lafferty’s forehead-smacking claim about the 3/5th Compromise or Congressman Madison Cawthorn’s (R-NC) reference to James Madison signing the Declaration of Independence. And there is the ongoing misrepresentation, and even outright denial, of present-day events that happened in plain sight -- the January 6th insurrection, for example. But attempts by “the experts” to set the record straight will most likely be seen as more proof that the world is out to get them. For the rest of us, perhaps some comfort can be taken in remembering that facts, as John Adams once pointed out, “are stubborn things.”

It's Time for a "Don't Trust, Do Verify" US-Russia Cybersecurity Treaty

Last month's Colonial Pipeline hack shows the urgency of US-Russia cybersecurity negotiations.

 

 

 

As he started his European tour this week, which culminates in a summit in Geneva with Vladimir Putin, President Joe Biden said he wants a stable, predictable relationship with Russia. Moscow has been echoing that sentiment. Although each side has a different understanding of what those qualifiers mean and expectations for the meeting are very low, the hacks on SolarWinds and Colonial Pipeline demonstrate that cyberspace is the most glaring threat to stability and predictability.

 

In November 1985, when Ronald Reagan and Soviet leader Mikhail Gorbachev had their first meeting in that Swiss city, expectations were also low. The Soviet downing of a Korean commercial airliner in 1983 and Reagan’s hot-mic joke in 1984 about outlawing and bombing the USSR clearly indicated just how tense relations were.

 

The upcoming Biden-Putin summit provides an opportunity to begin discussing a framework for an Internet version of the most significant U.S.-Russian cooperation to date, the product of work done at the Geneva and, later, Reykjavik summits: the Intermediate-Range Nuclear Forces (INF) Treaty signed by Reagan and Gorbachev in 1987.

 

Reagan adopted the Russian phrase “Trust but Verify” and developed respect for a Soviet leader whose ideology he loathed. Gorbachev vanquished internal foes to ensure successful treaty implementation. The result had been thought impossible: military intelligence officers inspected the opposing countries’ missile storage and launch facilities. Another thirty inspectors from each side took up residence at the gate of their former enemy’s most secret rocket-motor manufacturing facility.

 

The idea of on-site inspection had been discussed for years, but no one believed both sides could push the boundaries of sovereignty and counter-intelligence concerns to make it work. Yet INF did work. All 2,692 short- and intermediate-range nuclear missiles were destroyed and mutual trust was established. The treaty ushered in two decades of bilateral cooperation, including the Cooperative Threat Reduction program, which secured and eliminated strategic and chemical weapons across the former Soviet Union.

 

Critics, no doubt, will regard applying the arms-control approach to cybersecurity as naïve, impractical, and even dangerous. But it’s worth remembering that big problems require bold solutions. And the incentive is clear: hackers threaten governments, the private sector and individuals, electric grids, transportation and energy facilities, defense installations, and intellectual property. A tit-for-tat response to an attack may well escalate into armed conflict.

 

A cyber treaty is certain to be based on little trust, with lots to verify. Technological challenges, however, can be overcome. Both sides have extensive experience in monitoring public communications. From Solzhenitsyn’s days in a “sharashka” (scientific labor camp) developing decoding technology for Stalin, to the now ubiquitous SORM (the Russian acronym for “system for operative investigative activities”) boxes attached by security services to the equipment of every telco and internet provider in the country, Russian officials know who is doing what to whom.

 

American systems are more poetically nicknamed: PRISM, MYSTIC, Carnivore, Boundless Informant. Government agencies conduct packet sniffing and people snooping—at home to benefit local law enforcement and abroad to spy on friends and enemies, counter ISIS and track monsters like Bin Laden.

 

What if each side allowed the other to install such systems at the Internet Exchange Points (IXPs) on its territory and let loose the algorithms and other tools necessary to identify botnets, hackers, and disinformation campaigns?

 

A monitoring center staffed by experts from both countries could be established with anomalies and threats displayed in real time. The UN could supply neutral inspectors and arbitrate disputes.  The treaty should provide protocols for deterring and punishing bad actors.

 

As with INF, the devil will be in the details. Thousands of IXPs will have to be monitored. Though many Russians and Americans understand that their digital privacy has already been compromised, meta-anonymity could be maintained to protect individuals. 

 

A cyber treaty could also help both countries combat drug trafficking, terrorism and child pornography.

 

The advantages of a don’t trust, do verify “cyber-INF” seem clear. But do our leaders have the political will to go forward? Without in any way minimizing the obstacles, we believe there are reasons for cautious optimism.  In 2015, Russia and China agreed not to conduct cyberattacks against each other that would “disturb public order” or “interfere with the internal affairs of the state.”   In September 2020, President Putin proposed a cyber agreement with the United States. President Biden seems cautiously open to seeking out commonalities, without, of course, the unrequited bromance his predecessor had with Putin.

 

There’s no time to lose. A digital iron curtain is descending. Russia continues to turn the screws on internet freedom and is examining ways to isolate itself from the WWW, while pressing foreign content providers to submit to local rules about appropriate content and come on shore with their customer data – or face fines, restrictions and eventual blocking.

 

Should our leaders find the courage to create a monitorable digital peace, perhaps they’ll be willing to turn their attention to the other urgent problems of the 21st century – climate change, terrorism, inequality, pandemics and unchecked artificial intelligence.

 

The aphorism Robert Kennedy “borrowed” from George Bernard Shaw seems appropriate for addressing the prospects of a substantive cyber treaty: “Some men see things as they are and ask, ‘Why?’ I dream things that never were and ask, ‘Why not?’”

The Night Vietnam Veterans Stormed Bunker Hill

Vietnam Veterans Against the War (VVAW) members vote to remain on Lexington Green in defiance of an order by local government to vacate, May 30, 1971. VVAW members were subjected to a mass arrest, but gained support from town residents who gave them rides to the Bunker Hill monument in Charlestown to continue the group's march from Concord to Boston. Photo: Richard Robbat.

 

 

Citing continuing public health concerns about COVID-19, the city of Boston has declined again this year to issue a permit allowing the annual and always much-anticipated Bunker Hill Day Parade to proceed through the streets of Charlestown. 

The hiatus is an opportunity to recall the holiday’s history and the summer fifty years ago when, with the country as politically divided as it is today, Vietnam Veterans Against the War, or VVAW, insisted on celebrating Bunker Hill Day early.

Bunker Hill Day was initially intended to commemorate the role Massachusetts played in securing the nation’s independence.  Fought on June 17, 1775, the Battle of Bunker Hill was a pyrrhic victory for the imperial British.  The newly formed Continental Army was forced to retreat but not before inflicting enough damage that British forces were confined to Boston.  Famously fought on nearby Breed’s Hill, the battle’s anniversary was first observed with a parade in 1785.  On the fiftieth anniversary, the newly formed Bunker Hill Monument Association organized the first Bunker Hill Day.  While very much a local holiday then as now, the entire nation observed it in 1843, when the Association’s soaring 221-foot granite obelisk was dedicated. 

After Irish immigrants moved into the neighborhood in the final quarter of the nineteenth century, Charlestown became “the only place on the planet,” as famously noted by actor Will Rogers, “where the Irish celebrate a British military victory.”  A cartoonist from that era was prompted to draw a picture of the obelisk with the words “Erected by the Irish in Memory of Patrick O’Bunker of Cork.”  Observances came to include companies of reenactors marching in colonial attire to the cadence of fife and drum, as well as elements of Irish peasant culture, including carnivals, fireworks, and alcohol.  As journalist J. Anthony Lukas put it, Bunker Hill Day became “an exuberant statement of Charlestown's independence from the rest of the world.”

The late 1960s and early seventies were difficult years for Charlestown.  To the dismay of many white parents, the Massachusetts legislature was insisting on school desegregation.  And, as a working-class neighborhood, Charlestown was sending a disproportionate number of its children to fight in Southeast Asia.  The Charlestown community engaged in activism on behalf of anti-busing efforts, sometimes resorting to violence; however, few joined what became the most vocal and sustained antiwar movement in US history out of fear of hurting troop morale.

No one could predict how this community, very much on edge in the spring of 1971, would respond on the Sunday evening of Memorial Day Weekend when not the British, but a wave of American Vietnam veterans swept up Breed’s Hill towards the obelisk the Irish-Americans in Charlestown had made their own.

Forty-eight hours earlier, over one hundred members of VVAW dressed in jungle fatigues had commenced a three-day march that was intended to retrace Paul Revere’s mythic midnight ride in reverse.  Like Revere, the antiwar veterans were seeking to bring a message to the people, in their case that the country had shamefully reversed its earlier course and become the type of imperial aggressor the colonists had once fought to vanquish.  The march route passed through four Revolutionary War battlefields where the veterans planned to demonstrate their patriotic respect for their colonial brothers-in-arms while illustrating with their physical wounds and anguished spirits how far the nation had fallen from its founding ideals. 

VVAW’s march kicked off without incident in Concord, where officials from the National Park Service had granted the veterans permission to camp next to the Old North Bridge, and townspeople served the veterans a hearty dinner.  In marked contrast, the Lexington Selectmen (the Massachusetts equivalent of a town council) refused to grant the veterans permission to camp the second night of the march on the town’s sacred Battle Green.  Intent on punishing the veterans for what he later described as deflating the spirits of those troops still in harm’s way, the Chairman of the Board of Selectmen ordered a mass arrest.

When the veterans were released from the town’s makeshift jail and had paid their fine in county court, they considered skipping Bunker Hill, the final Revolutionary War battlefield on their itinerary.  The mass arrest had taken up a lot of time and they were now at risk of arriving late to the Memorial Day antiwar rally on Boston Common to which they had invited the public. 

Of greater concern was the fact that Charlestown might not be as welcoming as the liberal elites of Lexington, many of whom had decided to get arrested with the veterans and who would later ensure the Chairman was not re-elected. 

Over a dinner prepared for them by one of Lexington’s congregations, the veterans conferred about what to do.  Buoyed by the national media’s sympathetic coverage of the mass arrest, a wounded veteran living at the Bedford VA hospital urged the veterans onwards.

“We’ve already begun the Battle of Lexington,” he enthused about VVAW’s success thus far in unleashing the energy that birthed the nation.  “The whole country knows it.  So let’s go on to Bunker Hill.”

The problem of lost time was solved by hitching rides to Charlestown from their Lexington supporters.  Disembarking in Sullivan Square so they could respectfully approach the Bunker Hill battlefield on foot as the descendants of those who fought and died there, some of the veterans later recalled feeling very worried about how they would be received.

“Was it gonna be food and acceptance or sticks and stones?” one wondered.

As the veterans started uphill toting the very real-looking toy M16s they had carried from Concord as a sign of their authority to speak about the war, windows in the tenement buildings lining the narrow streets flew open and cheers erupted from them.  The veterans had served alongside Charlestown’s own sons and were being honored as such.  Minutes later, when the veterans set foot on the hallowed ground where so many Americans had died so that their children could be free, Charlestown’s residents bore silent witness as the veterans ceremoniously rejected their weaponry in a message that the war must end.

“We love you and we are happy to be here with you,” one of the still stunned veterans exclaimed to these new supporters who hours before VVAW considered avoiding.  “We must begin to share with one another the peace we need right now.”

The next morning, when the veterans emerged from their tents, countless residents returned to their side, offering food and coffee to fuel the veterans’ final push for Boston.

Fifty years ago this summer, Bunker Hill Day was celebrated early by Charlestown’s residents and a new breed of antiwar activists who came together around the idea that the Vietnam War did not reflect the values for which the colonists gave their lives on Breed’s Hill nor those of the Bunker Hill Irish-American community whose children were being forced to fight it.  It was a victory for VVAW and the antiwar movement as great as the ones traditionally celebrated on Bunker Hill Day.

Valor Roll: American Newsies in the Great War and the Flu Pandemic

"Scotty and His  Beloved Sho-Sho Gun", Gayle Porter Hoskins. Appeared in Ladies Home Journal, June 1919. Image Delaware Art Museum

Newsboy Albert Edward Scott of Brookline, MA enlisted at age 15 in 1917. The scene depicts his death by a sniper's bullet after defending a road as a machine gunner. "Scotty" was the youngest American casualty of the war.

 

 

World War I presented new opportunities to honor newsboys, particularly those who joined the armed forces. The Boston Globe made a minor celebrity of Fifekey Bernstein, the first Boston newsboy to enlist in the war. The Chicago Tribune placed former Loop news crier Joe Bagnuola on its “valor roll” after he distinguished himself as a battlefield messenger. And the Hartford Courant lauded the fighting spirit of Nat Fierberg, who joined the army to avenge the death of his brother Sam, a former Main Street hawker. Sam had enlisted at age 14 and died at Seicheprey, the first major action involving US ground troops. The Courant commended the boys who joined up “to make news instead of sell it.”

 

Newspapers also applauded newsboys who demanded to see the draft cards of suspected “slackers,” who taunted those who drove on gasless days as “Hun lovers,” or, in the case of a 10-year-old in New York, who gut-punched a suspected German spy as he was being led through Penn Station under armed guard.

 

Newsboys were anything but slackers. Newsboys’ homes supplied many raw recruits. Father Dunne’s home in St. Louis sent 126 residents into the armed forces, five of whom were killed in action. The Brace Memorial Home in New York contributed 2,890 current or past residents to the military. Its superintendent signed enlistment papers for 1,600 boys. Fifteen were killed in action and twenty were wounded. The first to fall, at Château-Thierry, was George “Blackie” Kammers. Others included “Libby” Labenthal, a pitcher on the home’s baseball team, and Peter Cawley and Jackie Levine, who starred in the home’s minstrel shows. Their inch-long obituaries mention their affiliation with the Newsboys’ Lodging House, just as those of Ivy Leaguers mention their association with Harvard, Princeton, or Dartmouth.

 

Eighteen was the minimum age for induction into the army, yet boys like Sam Fierberg sometimes lied their way into service. The foremost example is Albert Edward Scott, a newsboy from Brookline, Massachusetts, who enlisted in July 1917 at age 15 and became a machine gunner in the 101st Regiment. The youngest American casualty of the war, “Scotty” died defending a road near Epieds, killing thirty “boches” before a sniper got him. He received a hero’s burial in France and posthumous honors at home. An oil painting by Gayle Porter Hoskins showing four soldiers gathered around Scotty’s body in the woods was one of the Ladies’ Home Journal’s “souvenir pictures of the Great War.” The Roosevelt Newsboys’ Association raised funds to install a bronze tablet and bas-relief sculpture of the painting in Brookline’s town hall. Vice President Calvin Coolidge ordered two navy destroyers to convey a three-hundred-piece newsboy band from New York for the dedication ceremony, attended by former secretary of state William Jennings Bryan. Scotty was eulogized as a “steady, self-reliant, manly American boy” who “did his duty in war and in peace, in France and in Brookline.” On a visit to Boston, Marshal Ferdinand Foch, supreme commander of the Allied armies, left a wreath of roses bearing the French tricolor to be placed on Scotty’s tablet. The next year the corner of Chambers and Spring Streets in Boston was renamed Benjamin Rutstein Square after a popular West End newsboy killed in the Argonne in 1918. Thus did newsboys participate in the culture of commemoration that followed the war. Plaques, parades, paintings, wreath layings, and street dedications helped give meaning to the slaughter and replenish the wellsprings of nationalism.

 

Movies and songs about newsboys proliferated during the period, and some directly addressed the theme of making the world safe for democracy. The silent film Ginger, the Bride of Chateau Thierry, follows two tenement sweethearts who are separated when Ginger is adopted by a judge. She befriends his son Bobby but stays true to newsboy Tim Mooney. Once grown up, the two men vie for her hand, but they call a truce when war is declared; they ship out to France, and Ginger follows with the Red Cross. When Tim is wounded, Bobby risks his life to carry him to a hospital, where Tim dies in Ginger’s arms, freeing her to marry Bobby. The movie, which includes actual scenes of trench combat, portrays the war as a great crusade unifying the classes. Striking the same note musically was the 1919 Tin Pan Alley flag-waver “I’d Rather Be a Newsboy in the USA than a Ruler in a Foreign Land.” One critic called it “pathetic patriotic piffle,” but it got a smile from AEF commander Pershing when sung for him by a wounded Yank at Walter Reed Hospital.

 

One of the most devastating effects of the war was the influenza pandemic of 1918, which killed 650,000 Americans and 50 million people worldwide. The scourge pushed war news off the front pages and took its toll on many of the children and elderly who sold those papers. One casualty, “Mullen the newsboy,” was a Chicago Loop vendor who gained notoriety after passage of the Seventeenth Amendment by vowing to run for the Senate and introduce legislation allowing indigent newsboys to live in old-soldiers’ homes. Chicago newsies came under scrutiny as potential carriers of the disease because of their habit of spitting for good luck on every nickel earned. Newsies in Norwood, Massachusetts, had to put their money on a table for the manager to spray with a disinfectant before he’d touch it. In Harrisonburg, Virginia, the Daily Independent suspended publication after its entire workforce fell ill. Newspapers from Pueblo, Colorado, to Winnipeg, Manitoba, outfitted carriers with gauze masks to protect their health and alleviate subscribers’ fears. The Wichita Eagle went further, offering its newsboys mittens, health insurance, and the services of a physician.

A Celebrity Apology and the Reality of Taiwan

Students in Taipei protest a trade agreement with the People's Republic of China in 2014.

Photo: Max Lin, CC BY-SA 2.0

 

 

 

John Cena made international headlines this week while promoting his new movie, The Fast and the Furious 9. The professional wrestler turned action star referred to Taiwan as “the first country” where people would be able to see the film. Chinese citizens were outraged. Cena quickly issued a video apology, spoken in Chinese, for the “mistake.” The apology provoked another round of criticism. The reaction on social media and cable news was unforgiving, calling Cena everything from gutless to disgusting. I do not intend to join the pile-on. Cena’s had enough punishment for one news cycle.

 

However, underneath the scathing takes lies an inconvenient truth: most Americans couldn’t find Taiwan, which lies off China’s southeastern coast between Japan and the Philippines, on a map, much less trace the origins of Taiwanese identity. Cena’s foray into international affairs provides an opportunity to examine the entrenched misunderstandings about the history of Taiwan at a time when the U.S., and the world, is paying attention.

 

The oft-repeated dictum that “Taiwan is an integral part of China’s historic territory” was not widely held within China in 1895, the year that the Qing Dynasty ceded Taiwan to Japanese colonization. The Chinese government did not begin to assert control over much of the island until the 1870s, and in 1895 officials expressed less interest in protecting Taiwan than in other territories demanded by Japan. They were particularly interested in avoiding Japanese rule of Taiwan, but suggested to British and French diplomats that those countries could annex Taiwan. Within Taiwan itself, when given the option of remaining to live under Japanese colonialism or to move to China, fewer than 10,000 of Taiwan’s roughly 2.5 million inhabitants chose to make the journey across the Taiwan Strait. None of this indicates that Taiwan rose to the level of integral territory before the twentieth century.

 

During the 50 years of Japanese rule, the majority of those residents and their descendants came to think of themselves as Taiwanese, albeit in ways that reinforced divisions between indigenous and non-indigenous groups. Violent and non-violent resistance to the Japanese colonial regime remained a feature of Taiwan’s history, but it was couched in terms of preventing either encroachment into indigenous lands or the eradication of social and religious practices, and rarely if ever in the language of reunification with China. Taiwanese remained interested in China, of course, but as an ancestral homeland or a site for lucrative business activities. Instead, they developed new identities as Taiwanese and displayed them in calls for independence from Japan, drives for voting rights within Japan and an autonomous legislature for Taiwan, and a wide range of social and cultural behaviors, from social work to religious festivals. All of these behaviors clearly distinguished them from the Japanese settlers and the colonial government that attempted to transform them into loyal Japanese subjects. Instead, they became Taiwanese.

 

That they had not remained Chinese—at least not as people in China defined that term during the early twentieth century—became very clear to everyone on the scene soon after the end of World War II. Members of the Nationalist Party and the government of the Republic of China (ROC), and Chinese popular opinion, had begun to speak of Taiwan as a part of China during the 1930s and 1940s, in the context of anti-Japanese sentiment and war. However, government officials and many Chinese settlers looked upon the Taiwanese as backwards people who had been tainted by Japanese influence. Those Taiwanese viewed themselves as having resisted Japanese assimilation and having built their identities in burgeoning modern metropolises and in relation to modern capitalist industries. Even though many Taiwanese began to study the new national language of Chinese, as they had Japanese, they felt no connection to the national struggles and heroes that they were told to embrace.

 

All of this was evident before 1947, when the separation between Taiwanese and Chinese came into high relief during the 2-28 Uprising and its brutal suppression by Nationalist Chinese military forces, and the White Terror that began soon thereafter. Decades of single-party rule under martial law by Chiang Kai-shek’s regime did not effectively instill most of Taiwan’s residents with a new sense of Chinese national identity. The ROC nevertheless perpetuated Taiwan’s political separation from China, a condition that has now held for almost all of the past 126 years, and Chinese insistence on the idea of Taiwan as a part of China has failed to convince the roughly twenty-three million Taiwanese. Chinese views have been more effective in shaping international opinion, but they do not change Taiwan’s modern history or the reality that Taiwan is a country.

 

To close, the controversy surrounding Mr. Cena’s apology highlights two things: the power of ideas—in this case, the idea of Taiwan as a part of China—and the geopolitical and economic power of countries like China to shape opinions and actions both domestically and around the world.  People, companies, and countries should make their own decisions about what accommodations they are willing to make to do business with China and its citizens. But they should do so with an understanding of the history that lies behind and challenges such ideas.

American Conference for Irish Studies Connects the Past, Future of Irish-American Relations

Republic of Ireland President Michael Higgins addresses the ACIS virtual meeting. 

 

 

 

 

U.S. and Irish leaders have suggested the historic bonds between the two countries must evolve to meet modern challenges. This includes Ireland’s role in post-Brexit relations with the European Union, U.S. support for maintaining peace in Northern Ireland, and other robust alliances to protect democracy against rising tides of illiberalism and authoritarianism.

 

Irish President Michael D. Higgins, Irish Ambassador to the United States Dan Mulhall, and U.S. Rep. Brendan Boyle (D-Pa.) made separate remarks at the 2021 American Conference for Irish Studies (ACIS). The annual conference was based at Ulster University’s Magee campus in Derry, Northern Ireland, but held virtually June 2-5 due to COVID-19 restrictions.

 

In addition to their usual studies of history and literature, Mulhall suggested scholars explore how globalism has shaped modern Ireland since the start of the 21st century. Of particular interest, he said, is how the country--so far--has managed to resist anti-immigrant populism and other right-wing ideology that has taken root elsewhere.

 

“The challenges stem from societal changes, especially in Ireland,” Mulhall said. Not only must Ireland explain how it has changed, especially in contrast to outdated and clichéd archetypes, he said, but it must also expand its outreach to new generations of the more than 33 million Americans who claim Irish ancestry. This includes those who are Black, Hispanic, or LGBTI.

 

From either side of the Atlantic, Mulhall said, “it is unreasonable for us to expect a monolithic outlook.”

 

Ireland plans to open its eighth consulate office next year in Miami, a more extensive U.S. presence than that of many larger nations. Ireland is now the ninth-largest source of foreign direct investment in the U.S., with Irish firms employing over 110,000 American workers. More than 750 U.S. multinationals have made Ireland their base for European operations.

 

“In the wake of Brexit, the relationship grows in importance to the U.S.,” said Boyle, who has represented a Philadelphia district since 2015. “Ireland is the only English-speaking country around the E.U. table.”

 

Boyle joined the 2019 congressional delegation led by the U.S. House Speaker Nancy Pelosi (D-Calif.) to the Republic of Ireland, part of the E.U., and Northern Ireland, which remains tied to Britain. In his ACIS remarks, Boyle reiterated that U.S. President Joe Biden and members of Congress from both parties will not accept a “hard border” on the island of Ireland, especially if it threatens peace in the north.  

 

More importantly, he warned of the danger of allowing segments of any population to slip into economic isolation and social resentment, which can “manifest in unhealthy ways.” He suggested this has happened more in America than in Ireland.

 

“Ireland most shares American values,” Boyle said. “With the world in a democracy recession, we need Ireland to speak up for these values.”

 

Higgins praised Biden’s inaugural speech for its “offer of a moral reawakening on our global responsibilities, including how we respond to COVID-19 and climate change, global conflicts gone on too long.”

 

Returning to the traditional academic ground of ACIS, Higgins said American scholarship of Irish history, especially the late 19th and early 20th century, is “a debt never to be forgotten.”

 

ACIS was founded in 1960 and has about 800 members in the U.S., Ireland, Canada, and other countries. More than three dozen U.S. colleges and universities offer Irish studies programs, and about 12,000 American students visit Ireland annually under non-pandemic conditions.

The Roundup Top Ten for June 11, 2021

The Fog of History Wars

by David W. Blight

Nations have histories, and someone must write and teach them, but the 1990s battle over the National Standards for History remains a warning to all those who try – setting a history curriculum is politics by other means, and the right has always been willing to fight over it.

 

Conspiracies in the Classroom

by Elizabeth Stice

"Colleges and universities and faculty have a responsibility to ground their disciplines in truth claims that go deeper than the rabbit holes of the internet and to graduate students who are capable of distinguishing between conspiracy and reality."

 

 

The Push For LGBTQ Equality Began Long Before Stonewall

by Aaron S. Lecklider

Pride month is based on an origin story of the LGBTQ liberation movement that starts with Stonewall. There is a longer history of queer political activism that has been erased because of its origins in the left.

 

 

The Fissure Between Republicans and Business is Less Surprising than it Seems

by Jennifer Delton

Friction between the Trump-led Republican Party and big business organizations like the Chamber of Commerce over supposed "woke capitalism" isn't a new story. Big business's partisan allegiances have shifted according to capital's interests for decades. 

 

 

A Supreme Court Case Poses a Threat to L.G.B.T.Q. Foster Kids

by Stephen Vider and David S. Byers

State and local social service agencies for decades have been actively working to protect the safety and dignity of queer youth in the foster care system. A Supreme Court case threatens that progress in the name of "religious freedom." 

 

 

Protesters in Elizabeth City, N.C. are Walking in the Footsteps of Centuries of Fighters for Black Rights

by Melissa N. Stuckey

A historian living and working at the site of Andrew Brown Jr.'s killing by police explains that local protesters are following generations of freedom seekers. 

 

 

It’s Time for an Overhaul of Academic Freedom

by Emily J. Levine

The idea of academic freedom doesn't account for the present precarity of most university teachers, and doesn't rest on a positive concept of what professors should do with students and the public. 

 

 

The Problem with a U.S.-Centric Understanding of Pride and LGBTQ Rights

by Samuel Huneke

The histories of gay liberation politics in divided Germany offer surprising insight into what it means for LGBTQ people to live freely in a society. 

 

 

‘Lady of Guadalupe’ Avoids Tough Truths About the Catholic Church and Indigenous Genocide

by Rebecca Janzen

"Although it portrays the story of the Virgin of Guadalupe for a broad audience, ultimately this film sanitizes the real-life brutality of the Church toward Indigenous peoples in the 16th century."

 

 

The Last Time There Was a Craze About UFOs and Aliens

by Daniel N. Gullotta

A recent resurgence of interest in UFOs in respectable public discourse recalls the 1990s, when the X Files reflected a similar moment of distrust in authority and conspiratorial thinking. 

 

America's First Peaceful (Just Barely!) Transfer of Power

 

 

On July 14, 1798—nine years to the day after the storming of the Bastille—President John Adams signed an American Sedition Act into law. The 1789 Parisian incident had set in motion events that ultimately toppled and killed King Louis XVI; his queen, Marie Antoinette; and their heir to the throne, the dauphin. Adams’s signature likewise led to his own ouster, but the president; his lady, Abigail; and their heir, John Quincy, got to keep their heads in the transition and thereafter. On two telling dimensions—orderliness of regime change and avoidance of bloodshed—Federalist-era America showed itself vastly superior to Revolutionary France. But the events of 1798-1801—America’s first peaceful transfer of power from one presidential party to another—were in fact far more fraught than is generally understood today and in myriad respects cast an eerie light on the not entirely peaceful transfer of presidential power in 2020-21.   

UNDER THE TERMS OF THE Sedition Act, anyone who dared to criticize the federal government, the president, or Congress risked a fine of up to $2,000 and a prison term of up to two years. But venomous criticism, even if knowingly false and violence-inciting, that targeted the vice president was fair game under the law. Thus, in the impending 1800 electoral contest between Adams and his main rival, Thomas Jefferson—who was also Adams’s sitting vice president—Adams and his Federalist Party allies could malign Jefferson, but Jefferson and his allies, the Democratic Republicans, could not reciprocate with equal vigor. Congressional aspirants attacking Congressional incumbents would need to watch their words, but not vice versa. Just in case the Democratic Republicans managed to win the next election, the act provided that it would poof into thin air on March 3, 1801, a day before the new presidential term would begin.

 

On its surface, the act seemed modest. It criminalized only “false, scandalous, and malicious” writings or utterances that had the “intent to defame” or comparable acidic motivation. The defendant could introduce into evidence “the truth of the matter contained in the publication charged as a libel.”

 

This was more generous than libel law at the time in Britain, where truth was no defense. Indeed, truth could actually compound a British publisher’s liability. “The greater the truth, the greater the libel,” because the libelee would suffer a greater reputational fall if the unflattering story was, in fact, true. British law was thus all about protecting His Majesty and His Lordship and His Worshipfulness from criticism; it was the product of a residually monarchial, aristocratic, and deeply deferential legal and social order. British freedom of the press meant only that the press would not be licensed or censored prepublication. Anyone could freely run a printing press, but printers might face severe punishment after the fact if they used their presses to disparage the powerful.

 

Back in the 1780s, Jefferson had urged James Madison and other allies to fashion a federal Bill of Rights that would go beyond English law—but not by miles. As Jefferson envisioned what would ultimately become America’s First Amendment, “a declaration that the federal government will never restrain the presses from printing any thing they please, will not take away the liability of the printers for false facts printed.” Jefferson evidently could live with publisher liability for “false facts printed.” But what if the falsehood was a good-faith mistake, or a rhetorical overstatement in a vigorous political give-and-take? Could an honest mistake or mere exuberance ever justify serious criminal liability and extended imprisonment?

 

Also, who would bear the burden of proof? The Sedition Act purported to criminalize only “false” statements, but in the 1790s many derogatory comments were legally presumed false. The Sedition Act said that a defendant could “give in evidence in his defence, the truth of the matter,” but many edgy statements mixed truth with opinion and rhetoric. If a critic wrote that John Adams was a vain and pompous ass who did not deserve a second term, how exactly could the critic establish the courtroom “truth of the matter”?

 

ADAMS ERRED NOT SIMPLY in signing the Sedition Act but in mindlessly and mercilessly prosecuting and punishing, and never pardoning, men under it. He and his minions hounded tart but peaceful speakers and printers whose only real crime was dislike of John Adams, his party, and his policies, in cases whose facts were miles apart from treason, riot, or mayhem. Indeed, under the ridiculously strict standards of his own administration, a young John Adams himself should have been fined and imprisoned back in the 1760s and 1770s for his vigorous denunciations of colonial Massachusetts royal Governor Thomas Hutchinson.

 

In the first high-profile sedition case, brought in October 1798, the Adams administration targeted a sitting Democratic Republican congressman from Vermont, Matthew Lyon, for political writings and harangues, some of them at campaign rallies. In one passage highlighted by the prosecution, Lyon had written that Adams had “swallowed up” every proper “consideration of the public welfare” in “a continual grasp for power, in an unbounded thirst for ridiculous pomp, foolish adulation, or selfish avarice.” Adams, wrote Lyon, had “turned out of office . . . men of real merit [and] independency” in favor of “men of meanness.” Lyon had also read at public meetings a communication from a French diplomat bemoaning the “extremely alarming” state of relations between France and the United States, worsened by the “bullying speech of your president and stupid answer of your senate.” Congress, wrote the diplomat in words that Lyon publicly repeated, should send Adams “to a mad house.”

 

How exactly could Lyon prove in a courtroom the technical truth of these words, blending as they did fact, opinion, analysis, interpretation, and rhetoric? The jury convicted and the court sentenced Lyon to a fine of $1,000 and a four-month imprisonment.

 

Dozens of newspapers across the continent brought readers detailed reports of the cause célèbre. While in prison, Lyon wrote an account of his travails that Philadelphia’s Aurora General Advertiser published in early November, followed by newspapers in many other localities. The congressman vividly described his conditions of confinement: “I [am] locked up in [a] room . . . about 16 feet long by 12 feet wide, with a necessary in one corner, which affords a stench about equal to the Philadelphia docks, in the month of August. The cell is the common receptacle for horse-thieves, money makers [counterfeiters], runaway negroes, or any kind of felons.” When Lyon stood for reelection—from prison!—in December, his constituents gave him a roaring vote of confidence, returning him to his House seat. Adams thus won the first courtroom battle but was beginning to lose the war of public opinion.

A year and a half later, the last big Sedition Act trial before the election of 1800 resulted in an even harsher sentence—nine months’ imprisonment. The defendant was the trashy but talented journalist James Callender—the man who broke the Alexander Hamilton sex-scandal story in 1797 and would later, in 1802, expose Jefferson’s affair with his slave mistress Sally Hemings (who was also his deceased wife’s half sister). In the run-up to the election of 1800, Callender published a campaign pamphlet, The Prospect Before Us.

 

Callender painted in bright colors and attacked Adams for just about everything: “Take your choice, then, between Adams, war and beggary, and Jefferson, peace and competency!” The “reign of Mr. Adams has been one continued tempest of malignant passions. As president, he has never opened his lips, or lifted his pen without threatening and scolding.” The administration’s “corruption” was “notorious.” Indeed, the president had appointed his own son-in-law, William Stevens Smith, to a plum federal office, surveyor of the port of New York, thus “heap[ing] . . . myriads of dollars upon . . . a paper jobber, who, next to Hamilton and himself is, perhaps, the most detested character on the continent.”

 

Notably, Callender also blasted the Sedition Act itself, and Adams’s abuse of it: “The grand object of his administration has been . . . to calumniate and destroy every man who differs from his opinions.” The “simple act of writing a censure of government incurs the penalties, although the manuscript shall only be found locked up in your own desk,” noted Callender. Here, the Sedition Act did indeed approximate mind control, yet Adams apparently never shuddered to think about his own diary diatribes against Hutchinson and other governmental figures in the 1760s and 1770s. Finally, Callender, who showed more self-awareness than Adams on this point, connected his critique of the act to the very nature of the election-year pamphlet in which his more general critiques of Adams were appearing. The act made it virtually “impossible to discuss the merit of the candidates.” If a person proclaimed that he “prefer[red] Jefferson to Adams”—as Callender was of course doing in this very pamphlet—wouldn’t that itself be an actionable slur on Adams?

 

The Adams administration apparently agreed, and prosecuted Callender in the spring of 1800 for what today looks like a rather typical, if overstated, campaign tract.

 

Callender’s nine-month sentence drew the gaze of printers and readers across the continent, just as the Adams-Jefferson race was unfolding in a series of statewide contests for electoral votes. Alongside the conviction of Lyon, Callender’s case cast Adams in an unflattering light, as did other lower-profile cases. (One featured a Newark drunkard, Luther Baldwin, who made a crude joke about the president’s rear end.)

 

All told, the Adams administration initiated more than a dozen—indeed, one recent historian says many dozen—prosecutions under the Sedition Act and closely related legal theories. Some cases never came to trial but still captured attention. For example, the feisty printer of Philadelphia’s Aurora General Advertiser, Benjamin Franklin Bache, named for his famous printer-grandfather, died while under indictment—the victim of a yellow fever pandemic. The Aurora was a high-profile anti-administration paper published in an iconic city. Going after Bache was the eighteenth-century equivalent of a Republican president today seeking to imprison the editors of the Washington Post or a modern Democratic president aiming to criminalize the publishers of the National Review.

 

Indeed, Jefferson himself had secretly financed Callender (a fact which only later came to light).  If Callender was guilty, why not his accomplice Jefferson? So Adams’s policies were in fact the eighteenth-century equivalent of, say, Donald Trump trying to imprison Joe Biden in 2020 for speaking ill of Trump and supporting others who did the same.

 

TWO SUPREME COURT JUSTICES riding circuit had sided with Adams, but America’s ultimate supreme court consists of the sovereign American people, who express themselves most consequentially via constitutional amendments and pivotal elections. The Adams-Jefferson contest was just such a pivotal election, and the court of public opinion ultimately sided with Jefferson and Madison, as has the court of history.

 

The biggest problem with the Sedition Act of 1798 was its self-sealing quality. Anyone in the press who harshly criticized this horrid law (such as Callender) risked prosecution under the law itself.

 

But each state legislature was a special speech spot. Even if newspapers risked prosecution under the Sedition Act if they initiated their own critiques of the act, or reprinted other newspapers’ critiques, surely they would enjoy absolute immunity if they merely told their readers what had been said in the special speech spots in state capitals. Thus, Madison and Jefferson quietly composed resolutions for adoption in the Virginia and Kentucky legislatures, respectively.

 

Madison was by far the abler constitutional theorist and practitioner, and his version has aged better than Jefferson’s. On Christmas Eve 1798, the Virginia General Assembly denounced the provisions of the Sedition Act as “palpable and alarming infractions of the Constitution.” That act, “more than any other, ought to produce universal alarm, because it is levelled against that right of freely examining public characters and measures, and of free communication among the people thereon, which has ever been justly deemed, the only effectual guardian of every other right.”

 

Over the next six weeks, newspapers in most states reprinted or excerpted Virginia’s protest. In the short run, Madison and Jefferson did not succeed in getting other state legislatures to join the Virginia and Kentucky bandwagon. But in the end, it did not matter whether the two statesmen immediately convinced a majority of state lawmakers, just as it did not matter whether they immediately convinced a majority of sitting Supreme Court justices. What mattered most in 1800–1801 was winning a majority of Electoral College votes in the Jefferson-Adams slugfest.

 

And that Jefferson did. When the American people, having now seen quite clearly what freedom meant to Adams and what freedom meant to Jefferson, decided between these two icons of 1776, they decided for Jefferson.

 

BUT THERE WAS A CATCH, involving palace intrigue eerily similar to some of the strangest moments that would unfold in America 220 years later, in January 2021.

 

The backstory to this episode of palace intrigue and near mayhem in 1800–1801 began, fittingly enough, with the early 1790s rivalry between Jefferson and Hamilton. Who was truly Washington’s prime minister? In particular, who should succeed to the presidency if both Washington and Adams were to die, become disabled, or resign?

 

The Constitution’s Vacancy Clause left this question for the federal legislature to decide: “Congress may by Law . . . declar[e] what Officer shall then act as President.” The text authorized an ex officio designation—not who but what, not which person but “what Officer” qua officer would serve as acting president as part of his regular office. In 1791 Jefferson’s partisans in Congress, led by Madison, proposed to designate the secretary of state as the officer next in line, a move that would bolster the status of Thomas Jefferson (who then held that office) and deflate the pretentions of then Treasury Secretary Alexander Hamilton. Hamilton’s Congressional admirers balked. As a compromise, some proposed to designate the chief justice—a post then held by the Hamilton-leaning John Jay. After bouncing between House and Senate and various committees thereof, the bill as finally adopted in 1792 placed America’s top senator—the Senate president pro tempore—first in line, followed by the Speaker of the House.

 

Alas, this was unconstitutional. As Madison and others persuasively pointed out, senators and House members were not, strictly speaking, “officers” within the letter and spirit of the Constitution’s Vacancy Clause. Only judges and executive officials—those who acted upon private persons, and were not mere lawmakers—were proper “officers” for succession purposes. Indeed, Article I, section 6 expressly prohibited sitting congress members from holding executive or judicial office: “no Person holding any Office under the United States, shall be a Member of either House during his Continuance in Office.”

 

All this set the scene for the post-election drama of 1800–1801. The Democratic Republicans won the election, with 73 electoral votes for Jefferson compared to 65 for Adams. But the fledgling party blundered, slightly.

 

Under the original Constitution, there was no separate balloting for the vice presidency. Rather, each member of the Electoral College cast two votes for president. The top vote-getter, if backed by a majority of Electors, would win the presidency, and whoever came in second in the presidential balloting would become vice president. The Democratic Republicans aimed to catapult Jefferson into the presidency and his running mate, New Yorker Aaron Burr, into the vice presidential slot, but every Jeffersonian Elector also voted for Burr. The party should have designated one Elector to throw away his second vote to ensure that Jefferson would outpoint Burr, but somehow failed to do this. Thus there was a tie at the top, a tie that would need to be untied by the lame-duck, Federalist-dominated House of Representatives.

 

The House could surely pick Jefferson—the only proper outcome, thought the Jeffersonians. Indeed, this is what the House ultimately did, thanks in no small measure to Hamilton’s emphatic appeals to Congressional Federalists on behalf of Jefferson. Hamilton told his correspondents that despite his own fierce feuds with Jefferson and the personal dislike that each man had for the other, the former secretary of state was an honorable and capable public servant committed to his country’s welfare. Once in power, Jefferson would, Hamilton hoped, eventually see the (Hamiltonian) light and govern in a way that would protect America’s vital interests at home and abroad. (Hamilton guessed right on this, in general.) Hamilton told his Federalist allies that Burr, by contrast, was a charming but corrupt wild card, who might sell the nation out to the highest bidder merely to line his own pocket.

 

Still, the Federalist-dominated Congress could lawfully pick Burr. Many Jeffersonians considered this scenario underhanded, because none of Burr’s Electors had truly wanted to see him president. From a legal point of view, however, Burr’s votes were no different from Jefferson’s. If Federalists actually preferred Burr, why shouldn’t he win as the consensus candidate? After all, had Federalist Electors known long in advance that Adams was a lost cause, they could have chosen to vote for Burr in the Electoral College balloting in the several states. Had even a single Federalist so voted, Burr in fact would have received more electoral votes than Jefferson, and thus would have won under the strict letter of the rules. How was the matter any different if Federalist House members opted to back Burr over Jefferson when allowed to untie the Electoral College tally? If this flipping of their ticket irked Jeffersonians, they had only themselves to blame for having picked Burr as their second man. After all, even if Burr were selected by the Federalist-dominated House, nothing would stop (President) Burr from resigning in favor of (Vice President) Jefferson. Easier still, nothing stopped Burr from publicly urging all House members to endorse Jefferson, mooting any need for post-inaugural heroics.

 

What if the House failed to pick either Jefferson or Burr? This sounded lawless, but it wasn’t, really. The Constitution required the House to untie the election under special voting rules reminiscent of the old Articles of Confederation. Each state delegation in the House would cast one vote, and the winner would need a majority of state delegations. If a state delegation were equally divided or abstained, its vote would count for zero, not one-half for each candidate. It was thus imaginable that neither Jefferson nor Burr would have an absolute majority of state-delegation votes in the House—nine out of sixteen—when Adams’s term expired at the end of March 3.

 

If so, could Adams simply hold over for a short period past his constitutionally allotted four years? For, say, a month? For a year? For four years? Or would the Succession Act spring to life when Adams’s term expired, allowing the Senate’s president pro tempore to become the president of all America? Even if that person were a Federalist? (The Federalists had a comfortable majority in the lame-duck Senate; the new Senate would be closely divided.) What about the argument that the Succession Act was in fact unconstitutional?

 

Enter “Horatius,” stage right. In a pair of newspaper essays initially published in early January 1801 in the Alexandria Advertiser and widely reprinted in both the capital area and beyond, the anonymous Horatius offered a cute way of untying the “Presidential Knot.” Horatius argued that the Succession Act was indeed unconstitutional. The lame-duck Congress should thus enact, and the lame-duck president, Adams, should sign, a new Succession Act designating a proper “officer” to take charge after March 3 in the event of a Jefferson-Burr House deadlock. Horatius did not explicitly state what officer should now fill the blank, but the obvious choice, legally and politically, for the lame-duck Federalists, was the secretary of state. After all, he was the highest-ranking officer, except for the arguable possibility of the treasury secretary and the chief justice. But the position of chief justice was vacant in early January. And although Horatius said none of this—he didn’t need to—the sitting secretary of state in early 1801 just happened to be the Federalists’ most popular and able politician: Jefferson’s old rival and first cousin, once removed, John Marshall.

 

It was an elegant and brilliant idea, a political and legal stroke of genius—evil genius, from a Jeffersonian perspective. But whose genius idea was it to crown John Marshall? Who was this Horatius? Most likely, according to modern scholars, John Marshall himself!

 

Even if Marshall was somehow not Horatius, Marshall surely agreed with Horatius. In mid-January 1801, James Monroe sent Jefferson a letter bristling with concern: “It is said here that Marshall has given an opinion in conversation …that in case 9 States should not unite in favor of one of the persons chosen [by the Electoral College—that is, Jefferson or Burr], the legislature may appoint a Presidt. till another election is made, & that intrigues are carrying on to place us in that situation.” In an earlier letter to Jefferson, Monroe had also identified Marshall as a likely beneficiary of the Horatius gambit: “Some strange reports are circulating here of the views of the federal party in the present desperate state of its affrs. It is said they are resolved to prevent the designation by the H. of Reps. of the person to be president, and that they mean to commit the power by a legislative act to John Marshall,. . . or some other person till another election.”

 

Jefferson responded by treating the situation as 1776 all over again, rallying his troops and rattling his saber. In mid-February 1801, he told Monroe that he “thought it best to declare openly & firmly, one & all, that the day such [a succession] act passed, the middle states would arm, & that no such usurpation even for a single day should be submitted to.” This was not casual chitchat. In 1801 Monroe was the sitting governor of Virginia, which of course bordered on the new national capital city. Jefferson was telling Monroe to ready his militia to march on Washington—with weapons—and Monroe was listening carefully.

 

Jefferson’s were the words of a sloppy, rash, and trigger-happy politico. What was his legal warrant for threatening to incite states near the national capital (“the middle states”) to take up arms against the central government? The Horatius gambit was surely sharp dealing, given that it aimed to give the presidency to neither Jefferson nor Burr, but how was it illegal? The Jeffersonians themselves had created the mess that Horatius slyly offered to tidy up. After all, Jefferson himself and his party had picked the ethically challenged Aaron Burr to be—under their own plan—a heartbeat away from the presidency.

 

If Burr were supremely honorable, he could simply declare, publicly and unequivocally, that he would not accept the presidency even if offered the post by the lame-duck Federalist-dominated House. Had Burr made such a clear and public declaration, it is impossible to imagine that the House could have deadlocked. Jefferson would have become president by process of elimination, much as if Burr were dead. (Imagine, say, an early 1801 duel in which Hamilton killed Burr!)

 

To his credit, Burr did not actively lobby in his own behalf. He did not hasten to Washington City to meet with House members, nor did he make any promises by letter or via intermediaries in exchange for House votes. But he did not, as he easily could have done, emphatically and openly disavow willingness to be selected over his senior partner.

 

Four years earlier, Jefferson had acted with more modesty when he had faced a remarkably similar situation. In mid-December 1796, he wrote a letter to his campaign manager, Madison, that ended up yielding enormous political dividends. If, upon the unsealing and counting of Electoral College ballots in early 1797, he and Adams ended up tied in the contest to succeed the retiring George Washington, thus obliging the House to break the tie, he wrote, “I pray you and authorize you fully to solicit on my behalf that Mr. Adams may be preferred. He has always been my senior from the commencement of our public life, and the expression of the public will being equal, this circumstance ought to give him the preference.” As events unfolded, Adams ended up with an outright majority over Jefferson in the Electoral College tally, rendering Jefferson’s sacrificial offer moot.

 

Adams himself learned of the letter and was charmed. (Jefferson, who had far more self-possession and politesse, generally knew how to play Adams—via professions of friendship and fulsome praise of the senior statesman’s early services to the republic.) In an exultant note to Abigail written on New Year’s Day, 1797, John regaled his wife with (imagined and inflated) details of Jefferson’s admiration and deference:

 

So many Compliments, so many old Anecdotes. . . . [Dr. Benjamin Rush] met Mr. Madison in the Street and ask’d him if he thought Mr. Jefferson would accept the Vice Presidency. Mr. Madison answered there was no doubt of that. Dr. Rush replied that he had heard some of his Friends doubt it. Madison took from his Pocket a Letter from Mr. Jefferson himself and gave it to the Dr. to read. In it he tells Mr. Madison that he had been told there was a Possibility of a Tie between Mr. Adams and himself. If this should happen says he, I beg of you, to Use all your Influence to procure for me [Jefferson] the Second Place, for Mr. Adams’s Services have been longer more constant and more important than mine, and Something more in the complimentary strain about Qualifications &c.

 

Perhaps Jefferson in late 1796 knew all along that Adams had more votes, and the letter to Madison was a brilliant ploy designed mainly to flatter Adams and put him off guard. (If so, it worked.) Or perhaps Jefferson meant everything he said (which was less than Adams recounted; the tale grew in the telling). Either way, it is notable that Aaron Burr did not follow in Jefferson’s deferential footsteps, even though Burr, in 1800–1801, had infinitely more reason to yield to his senior partner and teammate Jefferson than Jefferson in 1796 had to yield to his old friend, but now rival, Adams.

 

On Wednesday, February 11, 1801, Congress met in the new capital city of Washington in the District of Columbia to unseal the presidential ballots that had been cast by electors in the several states. Per the Constitution’s explicit provisions, the Senate’s presiding officer—that is, the incumbent vice president, Thomas Jefferson himself—chaired the proceedings. As expected, there was the tie at the top: 73 votes for Jefferson and 73 votes for Burr. The House immediately began balloting by state delegation. House rules said that the House “shall not adjourn until a choice be made.”

 

All through the night and the next morning, the House voted over and over, but neither Jefferson nor Burr could reach the requisite nine states (out of sixteen total). After twenty-eight continuous rounds of balloting, the exhausted legislators broke off shortly after noon on Thursday to get some sleep. Friday the 13th brought no resolution. Nor did Saturday. Still nothing when Congress reconvened on Monday the 16th. Adams’s term of office was due to expire on Tuesday, March 3—a mere fortnight away.

 

If the impasse continued, would Adams audaciously (illegally?) hold over past his allotted four years? Or would the lame-duck and electorally repudiated Federalist Congress in its final hours ram through a new Succession Act, à la Horatius, crowning Marshall ex officio as acting president, either in his capacity as secretary of state or in his new and additional role as America’s chief justice? (He was nominated for this post by President Adams on January 20 and confirmed by the Senate on January 27; he received his judicial commission on January 31 and took his judicial oath on February 4. Thus for the last month of the Adams administration, he wore both an executive and judicial hat.) If Adams or Marshall took steps to act as president on March 4, would Jeffersonian middle-state militias in Virginia and Pennsylvania respond with force as threatened? Would the self-proclaimed acting president Adams or Marshall counter with federal military force? Whom would the federal military salute? Would Federalist New England militias mobilize and march south? Would Hamilton try to jump into the fray? (In the late 1790s, he had been commissioned as a high general, second in command to George Washington, in anticipation of possible military conflict with France.) With the irreplaceable Washington no longer alive to calm the country and rally patriots from all sides to his unionist banner, would the American constitutional project ultimately collapse in an orgy of blood and recrimination, like so many Greek republics of old and the fledgling French republic of late?

 

These and other dreadful questions darkened the horizon in mid-February. And then, suddenly—as if a strong blast of fresh air abruptly swept across the capital city—the impasse ended. On the thirty-sixth ballot, on the afternoon of Tuesday, February 17, enough House members changed their minds to swing the election to Jefferson, by a vote of ten states to four, with the remaining two states professing neutrality. Most historians believe that Jefferson gave certain assurances to fence-sitting Federalists. Jefferson denied having made any promises, but he was a master wordsmith; his carefully crafted statements of intent (as distinct from promises) had sufficed. Thus, various Federalists crowned Jefferson with the expectation, confirmed by winks and nods from Jefferson and his authorized intermediaries, that he would govern as a moderate.

 

ON MARCH 4, 1801, America’s new chief justice administered the presidential oath of office to his rival and kinsman to complete the nation’s first peaceful (?) transfer of power. Adams was not there to witness the event. Earlier that day, he had left the capital city on a coach bound for his family homestead, brooding about what might have been.

Sun, 20 Jun 2021 03:17:27 +0000 https://historynewsnetwork.org/article/180415 https://historynewsnetwork.org/article/180415 0
Why a Culture War Over Critical Race Theory? Consider the Pro-Slavery Congressional "Gag Rule"

 

 

 

What is Critical Race Theory and why are Republican governors and state legislators saying such terrible things about it? If you are among the 99% of Americans who had never heard of this theory before a month or two ago, you might be forgiven for believing that it poses a grave threat to the United States through the indoctrination of our schoolchildren. To clarify the reasons behind the sudden rise in attacks against this little-known theory, it can first help to consider an earlier campaign of silencing in US history—the effort to shut down any discussion of slavery in Congress through a gag-rule that lasted for almost a decade in the 1830s and 1840s.   

 

In 1836, in response to a flood of anti-slavery petitions, the House of Representatives passed a resolution (Rule 21) that automatically tabled all petitions on slavery without a hearing. By doing so, they effectively prohibited even the discussion of slavery in Congress. The Senate, for its part, regularly voted not to consider such petitions at all. Southern Representatives and their Democratic allies in the North believed that any attention paid to slavery was divisive in that it heightened regional tensions and promoted slave rebellions. They argued that the drafters of the Constitution never intended for the subject of slavery to be discussed or debated in Congress.

           

At the beginning of each session after 1837, during discussion of the House rules, the ex-President and then Representative John Quincy Adams would attempt to read anti-slavery petitions he had received. Originally, only Whigs supported his efforts, but more Democrats joined him each session, so that the majority against Adams gradually decreased until the gag-rule was repealed at the beginning of the 1845 session.

 

Parallels between the gagging of anti-slavery petitions and the campaign to prohibit the teaching of Critical Race Theory are clear, if unnoticed before now. Like the Southern delegations who opposed discussion of slavery, opponents of Critical Race Theory believe that any discussion of persistent racial inequities in legal and other institutions is unacceptable because it is “divisive.” Ben Carson and South Dakota Governor Kristi Noem (R) have asserted that Critical Race Theory is “a deliberate means to sow division and cripple our nation from within.”

 

In fact the theory, based on an understanding that race is not biological but socially constructed, yet immensely significant for everyday life, provides a way to investigate systemic racism and its consequences. It recognizes that racism does not exist solely in the past: structures embedded in laws and customs persist in the present and permeate social institutions. These structures, intentionally or not, lead to the treatment of people of color as second-class citizens or less-than-full human beings.

 

As their central charge, critics frequently take the theory’s argument that in the US racism is “structural” or “systemic” as synonymous with saying that the United States is “systematically” or “inherently” racist. However, doing so conflates “systemic” with “systematic”: “systemic” practices are those that affect a complex whole of which they are a part; “systematic” practices are planned and methodical. To say an attitude or pattern is structural does not mean that it is unavoidable and unchangeable, that it cannot be addressed and its effects reduced through reforms. Indeed, a central tenet of the theory is that racism has produced its effects through specific, historical institutions, and that reduction of racial inequities can be accomplished, but only once the existence of such injustices is recognized.

 

Most lines of attack on Critical Race Theory depend in similar ways on misunderstandings or distortions. Whether subtle or not-so-subtle, unintentional or willful, their effect is the same: they misrepresent the theory. The opponents criticize what they call the theory’s “race essentialism”— their misconception of Critical Race Theory as saying that an individual, based on their race, is “inherently” racist or oppressive. Against the idea of structural or “inherent” racism, the critics assert that racism only expresses personal choices and actions. But we need not accept their assumption that racism must be either structural or personal; both can surely exist at the same time.

 

Nor do we need to agree with the opponents that the theory considers all white people “inherently privileged” because of their race. In the 1930s, Social Security benefits were denied to domestic workers, the right to organize a union was withheld from propertyless farmworkers, and federally funded mortgages were denied to people of color generally through the practice of “redlining.” The vote was denied to many people of color via poll taxes and other legal obstacles. Recognizing this pattern is not the same as saying that white workers, voters, and mortgage holders are “inherently privileged.” Yet recognizing such a pattern does mean that some of the inequalities and disadvantages under which people of color have labored as a result of discriminatory legislation can be addressed through reformative legislation.       

 

When State Senator Brian Kelsey of Tennessee supported a ban on teaching Critical Race Theory in public schools, he stated that the theory teaches “that the rule of law does not exist and is instead a series of power struggles among racial groups.” However, to acknowledge that laws have been shaped by social structures and cultural assumptions of a particular time does not mean that the rule of law does not exist. Rather, it poses a challenge for us to root out the racist patterns and practices that have been invisibly at work in the idea of “equality under the law.”  

 

Finally, the detractors charge teachers with “imposing” or “forcing” the theory on their students. But these critics are not in fact calling for independence of thought. Rather, their charge seeks to suppress thought that questions historic and continuing inequities and inequalities, just as, almost two hundred years ago, representatives of Southern slave-owners and their Northern sympathizers imposed a gag-rule on their anti-slavery Congressional colleagues.

 

It is instructive that opponents of Critical Race Theory deny what the theory does not assert—that each white person is inherently, essentially racist, and that the institutions of American society are fundamentally, unchangeably racist. It may be easier to legislate these denials and to gag educators than to acknowledge what the theory does assert, and then work to make the difficult changes that are called for in the legal and the educational systems of our country. By denying that racism is entrenched and unyielding, they render it more entrenched and more resistant to attempts to address its consequences. 

Sun, 20 Jun 2021 03:17:27 +0000 https://historynewsnetwork.org/article/180452 https://historynewsnetwork.org/article/180452 0
The Legacy of Same-Sex Love in Ancient Thebes

 

 

 

Among the many roads leading up to the Supreme Court’s 2015 decision on same-sex marriage, one of the more significant routes passes through Boeotia in central Greece.  This region, and its principal city, Thebes, established a precedent for male same-sex unions that deeply impressed the ancient Greek world as well as gay rights pioneers in nineteenth-century England and the U.S. 

 

The story is little known compared with, say, that of the poetess Sappho of Lesbos, whose homoerotic verses have made the name of her home island, in adjective form, a virtual synonym for female same-sex love.  The Thebans wrote little compared with other Greeks, and those who wrote about them were often biased against them.  But traces survive of their uniquely gay-friendly culture, including a set of archaeological sketches made in 1880 but brought to light only very recently.

 

The long trail begins not with a Theban but with a Corinthian, a wealthy aristocrat named Philolaus.  Sometime in the 8th century BC this man left Corinth with his male lover, an Olympic athlete named Diocles, and landed in Thebes.  The pair were fleeing the incestuous passion of Diocles’ mother – a drama worthy of Sophocles, one supposes, but Aristotle, the source for their flight, gives no details.

 

Committed male couples, willing to go into exile together, were as yet uncommon in ancient Greece.  Homoerotic affairs were more typically short-lived, ending when the junior partner – who may have been pre-pubescent at the outset – began to grow facial hair.  That is the model described by Plato’s speakers in the dialogue Symposium, one of our fullest sources for ancient sexual mores.  But Philolaus and Diocles were both mature men.

 

Did this pair go to Thebes because they knew that their bond would be welcomed there?  Or did their arrival help make Thebes a more gay-positive place?  Aristotle says that Philolaus crafted laws for the Thebans, and other writers make clear that those laws gave special support to male unions.  It’s the first we hear, in any Greek city, of a legislative program designed to encourage same-sex pair bonding; some other Greek law codes explicitly discouraged it.

 

Fast-forward to the 4th century BC, where evidence of Theban uniqueness is more widespread.  In Athens, observers like Xenophon noted that male lovers among the Boeotians (the ethnic group that included the Thebans) lived together “as yoke-mates,” a metaphor usually used of heterosexual marriage.  Aristotle, in a work now lost but cited by Plutarch, described how Theban male couples swore vows of fidelity to one another beside the tombs of Heracles and Iolaus, a mythic pair of heroes assumed by most Greeks to have also been sexual partners.

 

By far the most significant evidence of same-sex bonding among Theban adults is the legendary Sacred Band, described by Plutarch (himself a Boeotian) in his work Parallel Lives.  This infantry regiment was formed from 150 male couples in which both partners were clearly of military age.  The Thebans established this corps in 378 BC in response to Spartan aggression, and with its help they defeated the Spartans in open battle only seven years later.

 

Our information about the Band is scanty enough that a handful of scholars have doubted Plutarch and suggested the erotic principle described in the Lives was only a fiction.  But the mass grave of the Band was uncovered in 1880 at Chaeronea, the spot where they fell in battle, and sketches were made of their remains by Panagiotis Stamatakis, the chief excavator.  These sketches, uncovered by Greek archivists only in the past few years, reveal that pairs of corpses were interred with arms linked – dramatic confirmation of Plutarch’s account.

 

The effectiveness of the Band, says Plutarch, was based on the way that two lovers, fighting side by side, would strive to impress one another with prowess and courage.  It’s just the same reason a pair of chariot horses run faster than any one horse, says Plutarch (who seems to know this for a fact).  Plato makes a related point in Symposium, discussing a hypothetical army of lovers: No one, he says, would want his beloved to see him turn and run in the face of danger. 

 

The tomb of the Sacred Band was unearthed at just the moment, in the late nineteenth century, that gay men, in Europe and the U.S., were first coming out of the closet.  Walt Whitman wrote in Leaves of Grass of a city where “manly love” flourished, seemingly inspired by Plutarch’s account of Thebes.  In England, the classical scholar and essayist J.A. Symonds wrote an impassioned defense of Greek male homosexual culture, including that of Thebes, in his seminal pamphlet “A Problem in Greek Ethics.”

 

The work of both Symonds and Whitman influenced George Cecil Ives, a gay Victorian man and a friend of Oscar Wilde, to make the Sacred Band his emblem of gay male pride.  He formed a secret society, the Order of Chaeronea, to provide a forum for closeted homosexuals and to work toward a more inclusive society.  The voluminous diary he kept during the late 19th and early 20th century is today considered a vital source for the start of the gay rights movement.

 

Ives became so entranced with the Sacred Band that he began dating his diary entries according to the years elapsed from 338 BC, the date of the Battle of Chaeronea.  He thought the new age of the world had begun not with the birth of Christ but with the destruction of the Sacred Band by Alexander the Great.  The extinction of the Band, in his eyes, had marked a great fall from grace, the end of an era when gay men like himself could live a life unimpeded by condemnation.

 

Ives was optimistic that the golden age that ended at Chaeronea could someday return.  “I believe that Liberty is coming,” he wrote in his diary in 1893.  “I sometimes think that some of us will live to see the victory.”  How gratified he would have been by the legalization of same-sex marriage and other kinds of “victory” for inclusion -- milestones attained with the help, in some small part, of Thebes and its Sacred Band.

Sun, 20 Jun 2021 03:17:27 +0000 https://historynewsnetwork.org/article/180453 https://historynewsnetwork.org/article/180453 0
Paying People to Get Vaccines is an Old Idea Whose Time has Come Again

John Haygarth founded the Smallpox Society of Chester in 1778 to promote the then-unpopular practice of inoculation.

 

 

Several states now offer incentives for COVID vaccinations, hoping that enough people will sign up to drive the infection rate down and protect the entire community. When this was first tried in the late eighteenth century, it met with mixed success. The originator was John Haygarth of Chester in Northwest England, who published his plan for a “general inoculation” of the poor as “An Inquiry How to Prevent the Small-pox” in March 1778. Haygarth argued that “social distancing” and other preventive steps could quell urban smallpox epidemics. At that time, smallpox most often attacked poor young children whose families could not afford to shield them from all contact with infectious people. It was by far the most fatal disease in Britain, causing about half of all deaths among children under ten. Most adult city residents had contracted the disease as children and become immune. Outbreaks struck every few years when enough children had been born to sustain a fresh epidemic, although inoculation, a preventive measure, was increasingly accessible.

When it was first introduced into England in 1721, smallpox inoculation, also known as “engrafting,” or “variolation,” was a brutal and dangerous procedure.  By inserting matter from a smallpox pustule under the skin of a new patient, a practitioner could usually produce a comparatively mild infection.  Although it was less lethal than naturally contracted smallpox, which killed about one patient in five, inoculation initially had a fatality rate of about one in fifty.  By mid-century, practitioners had become more adept, making the procedure safer and less complex.  Members of the entrepreneurial Sutton family were especially skilled. Daniel Sutton claimed that between 1763 and 1766, he had inoculated 22,000 people with only 3 deaths. However, this created a new problem: because inoculation caused an actual infection with smallpox, recently inoculated patients could infect any susceptible person who came near them before they had fully recovered.  Haygarth set out to solve this double problem: save more children without spreading the disease to others.

After studying smallpox outbreaks, Haygarth decided that it spread only by contagion from person to person and was transmitted primarily by an airborne vapor over a very short distance. He drew up “Rules of Prevention” warning those with smallpox to avoid going out in public and those who were susceptible to avoid entering any house that held a smallpox patient. Everyone and everything touched by any discharges from a patient should be washed and exposed to fresh air, and all medical attendants must wash their hands. Then he launched a campaign in Chester for mass inoculations. Inoculating groups of people at the same time could also reduce the odds that they would transmit smallpox.

With his ally Thomas Falconer, Haygarth founded a “Society for Promoting Inoculation at Stated Periods and Preventing the Natural Smallpox” in Chester. They raised donations to pay local doctors to perform the procedure and to pay poor families for bringing their children. All the doctors volunteered to participate without charge, increasing the fund for the families. One concern was that recently inoculated children might spread smallpox. Even worse, some families exposed their children deliberately. To prevent these problems, the Society offered two payments to any poor family that (1) inoculated its children and (2) faithfully followed Haygarth’s “Rules of Prevention.” The Society wrestled with the ethics of paying parents for inoculations but concluded that it could be considered compensation for the wages lost while they nursed their children. They also hired an “inspector” to follow up with the families and ensure that they observed the quarantine rules. Mass inoculations were not new, but this may have been the first time anyone tried to create widespread immunity by offering a reward to families for inoculations. At first the Society was successful, inoculating hundreds of children, suppressing incipient epidemics, and halving the death rate in Chester from smallpox. In 1781, however, with their funds dwindling, they decided to stop paying families for inoculating their children and focus on rewards for obeying the rules of prevention. When the scheme was first initiated, they thought paying for the inoculation itself was necessary to overcome “inveterate prejudices” against the procedure, but it had come to be seen as a bribe for doing something wrong.

After six years the plan collapsed in the face of frequent re-importation from nearby cities, the transit of infected soldiers through the city, and the resistance of many Cestrians. The Society admitted that a single payment of five shillings for following the rules was too small to attract parents. With so many new cases emerging, the inspectors were overwhelmed, and the Society could not raise enough money to sustain its campaign or attain widespread compliance (Proceedings, 205).

Haygarth, a very tenacious man, was undaunted. As the constant re-importation had contributed to the Society's collapse, he decided that only a nationwide inoculation effort could succeed. He filled in the details in 1793 with a sprawling two-volume compilation entitled Sketch of a Plan to Exterminate the Casual Small-pox from Great Britain. This ambitious plan was out of step with the realities of eighteenth-century British governance. In any case, it was soon pre-empted by the safer practice of “cowpox” vaccination. Even vaccination did not eliminate smallpox, although child mortality began to plummet after a reformed British government made it mandatory in 1853.

Yet Haygarth's efforts were not wasted. His research on smallpox produced new information about the behavior of contagious diseases, including incubation times, infectivity, conditions for exponential growth, epidemiology, and methods of control. Haygarth drew on his growing expertise to investigate influenza, typhus and, less successfully, yellow fever, and to establish isolation wards for typhus at the Chester Infirmary. Haygarth was among the leaders of a small group of British physicians who redefined infectious diseases and claimed that contagion could be prevented by relentless cleanliness, separation, and fresh air. Although their work was always controversial in Britain, it was widely read both at home and abroad and created a more secure foundation for medical research.

So when you line up for your shots, win the vaccination lottery, or throw away your mask, take a minute to acknowledge the activists of the eighteenth century who worked to improve public health with mass inoculation.

Sun, 20 Jun 2021 03:17:27 +0000 https://historynewsnetwork.org/article/180451 https://historynewsnetwork.org/article/180451 0
Understanding Gun and Police Violence Lies Between History and Power

Navy Junior ROTC Cadets use the firearms training simulator at Great Lakes Naval Air Station (IL).

 

 

National Gun Violence Awareness Day is here, yet given the continued spikes in gun and police violence over the past year, one could argue a reminder isn't necessary. According to the Alliance for Gun Responsibility, African Americans are 10 times more likely than whites to die by gun homicide. They are also more than three times as likely to be killed during a police encounter.

Gun and police violence are not separate affairs. Police shootings are part of America’s gun problem: police violence is a leading cause of death for young men. There is a correlation between police killings, states’ gun control laws, and gun ownership rates. And peoples of African ancestry, more than others, pay for this mix of gun culture and militarized police with their lives.

After Derek Chauvin's conviction, many believed justice had prevailed and that the George Floyd Justice in Policing Act would become law. Yet further police killings just 24 hours after the verdict, and President Biden marking Floyd's murder with a discussion rather than a law, show why that belief may be premature.

U.S. police kill civilians at higher rates and in larger numbers than police in other democracies, and policy changes, body cameras, and media scrutiny have not reduced the racial disparity in fatal police shootings. We need to understand the limits of policy prescriptions in the face of deeply rooted cultural and social norms related to policing and gun violence. The Supreme Court ruled racial segregation in public schools was unconstitutional in 1954, but today public schools are more segregated by race and income than they were six decades ago.

Likewise, we cannot reform our way out of policing and guns, because they are tethered to settler colonialism and slavery's ongoing violence. Understanding this past in the present outweighs any act of protest or congressional bill. But grasping the past means reckoning with the fact that a third of U.S. adults cannot pass a U.S. citizenship exam and that most K-12 students, along with many lawmakers, have a poor grasp of U.S. history. Indeed, billionaire David Rubenstein undertook a project to teach politicians U.S. history.

The issue of police and gun violence, especially against non-white peoples, is marginalized if not absent in the public's understanding of both. There are stubborn perceptions: guns don't kill, people do; most cops are good with only a few bad ones; the threat of police violence experienced by non-whites is overblown or justified; and high-profile cases of police "misconduct" are anomalies that are fixable through reform. These perceptions ignore the history of the power derived from control over people.

In the colonies that later formed the United States, policing and violence were tightly braided. If policing and gun violence were circles of a Venn diagram, the overlap would be enforcement—the act of power over another. Enforcement animated police violence, and as policing became bureaucratized, that legitimacy increasingly shielded police and gave them greater discretion in the use of deadly force.

Policing in the United States and in England evolved from community watches. These were supplemented by unarmed and unpaid constables without uniforms. In urban areas like Boston and New York City, centralized police forces became publicly funded, full-time bureaucracies. Their mandates were to ensure social control rather than crime control. Private businesses transferred the cost of their protection to the state, which paid for policing.

Rural, southern areas of the U.S. used a mixture of "slave patrols," bounty-hunting, deputization, and general surveillance of chattel to apprehend, deter revolt, and enforce racist laws. Together, rural and urban policing were corrupt and brutal, operating under the control of politicians, who were beholden to economic elites. Those political and economic elites created the venues for public drinking, prostitution, and worker strikes, then criminalized those behaviors, assigning them to an identifiable "class" dangerous to social order.

Protests or strikes were criminalized as “riots.” Police were legally authorized to use force under the guise of rule of law, to patrol and surveil, and to wear uniforms which signaled a clear difference between them and the “dangerous” elements. The central problem that led to policing was never a crime, but political and economic power. As centralized police departments became the norm, they decided to arm officers after officers had already armed themselves. Indeed, white people were armed in the United States long before centralized police forces, and so white police forces simply formalized the “right to bear arms” argument, seemingly reserved for white people. Viewed from this perspective, gun control has also meant limiting gun access to peoples of African ancestry.

Though gun and police violence are in the crosshairs of current political debates, the real targets are the history and the very coercive power used to build this nation. The nation cannot be made anew and accountable through feel-good implicit bias training, anti-racist workshops, and policy prescriptions. Facing deeply rooted cultural and social norms around policing and gun violence requires confronting histories and concentrations of power which make them viable, and at a minimum making police violence an integral part of National Gun Violence Awareness.

Sun, 20 Jun 2021 03:17:27 +0000 https://historynewsnetwork.org/article/180449 https://historynewsnetwork.org/article/180449 0
The New Meaning of "The Loyal Opposition"

 

 

 

 

The phrase “the loyal opposition” was coined by John Hobhouse in a debate in the English Parliament in 1826.  Less than a hundred years later, A. Lawrence Lowell, a political scientist (and later president of Harvard University) proclaimed the loyal opposition “the greatest contribution of the nineteenth century to the art of government.”

 

Designed to make space for the political party out of power to dissent and hold the majority party accountable without facing accusations of treason, the concept of a loyal opposition depends on the deference of non-governing parties to the authority of democratic institutions and the normative framework in which they operate.

 

The saving assumption of the loyal opposition, Michael Ignatieff, former leader of the Liberal Party in Canada and President of the Central European University, has written, is that “in the house of democracy, there are no enemies.”  When politicians treat each other as enemies, “legislatures replace relevance with pure partisanship.  Party discipline reigns supreme… negotiation and compromise are rarely practiced, and debate within the chamber becomes as venomously personal as it is politically meaningless.”

 

It now seems clear that Republicans in the United States Congress, many of whom endorsed groundless claims that the 2020 presidential election was rigged, have changed the meaning of “loyal” to obeisance to party rather than to democratic principles.  And the decision of GOP leaders in the House and Senate to block a bi-partisan commission to investigate the January 6 assault on the Capitol serves as the most recent example:

 

According to John Katko, the New York Republican Congressman who negotiated the provisions of the draft legislation with his Democratic counterpart Bennie Thompson of Mississippi, the bill was modeled on the 9/11 commission to ensure it was “depoliticized entirely.”  The commission would have been composed of an equal number of Republicans and Democrats, with equal subpoena powers, an inability to subpoena a witness without bi-partisan agreement, and shared authority to hire staff.

 

Although Democrats incorporated the provisions Kevin McCarthy (R-California) demanded into the bill, the House Minority Leader declared last month that he opposed the commission because its “shortsighted scope” omitted “interrelated forms of political violence in America… I just think a Pelosi commission is a lot of politics.”

 

Senator Mitch McConnell (R-Kentucky), who declared in 2010 that “the single most important thing we want to achieve is for President Obama to be a one-term president” and in 2021 that “100% of his focus” would be on “stopping the [Biden] administration,” claimed, without evidence, that Pelosi, Thompson and Company negotiated “in bad faith” in order to “centralize control over the commission’s process and conclusion in Democratic hands.”  Although the Justice Department is limited to investigating crimes and lacks the power to subpoena individuals with knowledge of the assault who did not break the law, the Minority Leader opined that the DOJ probe rendered a bi-partisan commission “redundant.”  To this allegedly good reason, he added his real reason: winning majorities in the House and Senate in 2022 requires Republicans to prevent Democrats from continuing “to debate things that occurred in the past.”  McConnell then orchestrated the filibuster that prevented the Senate from considering the legislation.

 

Ditto John Thune, Republican Minority Whip.  Without addressing the need to determine what happened on January 6, who was responsible, and how another assault might be prevented, Thune expressed his fear that an investigation “could be weaponized” in 2022.  Senator John Cornyn, who had agreed in February “with Speaker Pelosi – a 9/11 type commission is called for to help prevent this from happening again,” also began to sing along with Mitch.  “The process has been highjacked for political purposes,” he declared.  Democrats are “going to try to figure out what they can do to win the election.  Just like 2020 was a referendum on the previous problem, they want to make 2022 one.”

 

In the closing pages of 1984, George Orwell’s dystopian novel, O’Brien (whose first name is never revealed), a functionary in the totalitarian state of Oceania, predicts that in the not-too-distant future “there will be no loyalty, except loyalty to the Party… There will be no laughter, except the laugh of triumph over a defeated enemy.”

 

It has been said that “when the loyal opposition dies, the soul of America dies with it.”  And it may not be unreasonable to fear that, unless principle begins to trump party, that time may be at hand.

Sun, 20 Jun 2021 03:17:27 +0000 https://historynewsnetwork.org/article/180455 https://historynewsnetwork.org/article/180455 0
John Cena's Taiwan Controversy Recalls Richard Nixon's Biggest Mistake

Richard Nixon Meets Chairman Mao, February 1972

 

 

John Cena, a former wrestler turned actor, found himself enmeshed in a controversy after giving an interview to a Taiwanese television station. Promoting his new film, F9, Cena said, “Taiwan is the first country to watch Fast and Furious 9.”  Calling Taiwan a country sparked a furious backlash in mainland China, where Cena is a major star and the Fast and Furious franchise has raked in billions over the years. Cena quickly backtracked, offering a groveling and pathetic apology to the Chinese government. In an effort to appease communist China, Cena took to the social media site Weibo and, speaking in Mandarin (in which he is fluent), issued the following statement: “I love and respect China and Chinese people. I’m very, very sorry for my mistake.”

 

Cena deserves condemnation for bowing to Beijing, though he’s hardly alone in doing so, but he can perhaps be forgiven for not understanding the geopolitical issue involved between China and Taiwan. Situated roughly 100 miles off the Chinese mainland, Taiwan occupies an ambiguous place in world affairs. In 1949 Chiang Kai-Shek and the Nationalists fled there as the civil war ended and Mao’s forces took control of China. Despite the military defeat, Chiang proclaimed himself the legitimate ruler of China and denied that Mao’s China had any right to rule over Taiwan. For two decades the United States accepted his claims and refused to establish ties with Mao Zedong’s People’s Republic of China.

 

The situation did not change until Richard Nixon went to China in February 1972 and abandoned America’s longtime ally. At the end of Nixon’s visit the two sides issued what became known as the “Shanghai Communique.” China’s part of the statement included the following: “The Government of the People’s Republic of China is the sole legal government of China; Taiwan is a province of China which has long been returned to the motherland; the liberation of Taiwan is China’s internal affair in which no other country has the right to interfere; and all U.S. forces and military installations must be withdrawn from Taiwan.” Nixon, the man who owed his rise in politics to his fierce anti-communism, willingly went along with China’s demands. The American response in the communique was that the United States acknowledged “one China and that Taiwan is part of China.”[i]

 

Nixon’s betrayal of Taiwan was far worse than anything he did during the Watergate scandal. However flawed Chiang Kai-Shek may have been as a leader, he was a man who fought against Mao’s tyranny and stood loyally with the United States for decades. Taiwan was a haven for many who had fled Mao’s terror and they counted on Americans to defend the island from a communist takeover. Although not quite another Munich, Nixon’s selling out of Taiwan is a stain upon his presidency and a dark chapter in American foreign policy.

 

While China has never invaded Taiwan, the threat of intervention looms over the island. Today Taiwan maintains that it is in fact an independent nation while Beijing insists that it is part of the People’s Republic. Over a dozen countries, including the Vatican, have diplomatic relations with Taiwan, but the United States is not among them. That shameful fact is part of Richard Nixon’s legacy. In retrospect, his opening to China is not the success he believed it was. At some point the two nations were going to establish formal ties. The line that “Only Nixon could go to China” was perhaps correct then, but one of Nixon’s successors would have done it anyway. Further, Nixon’s adulation of Chairman Mao looks worse as time has gone on. Mao’s brutal dictatorship, especially his Great Leap Forward, which one historian estimates led directly to the deaths of 45 million Chinese, does not make Nixon’s effusive praise of Mao look wise.[ii] His ignoring of the PRC’s appalling human rights record and his willingness to forsake an old and valued friend are a blight on his presidency. A man who did much good and was a far better president than he is given credit for, Nixon deserves nothing but censure for his capitulation to China and Mao.

 

[i] Richard Nixon, RN: The Memoirs of Richard Nixon (New York: Simon & Schuster, 1978), 576-577.

[ii] For the estimate of 45 million killed see Frank Dikotter, Mao’s Great Famine: The History of China’s Most Devastating Catastrophe, 1958-1962 (New York: Bloomsbury, 2010).

Sun, 20 Jun 2021 03:17:27 +0000 https://historynewsnetwork.org/article/180448 https://historynewsnetwork.org/article/180448 0
Whataboutism Didn't Get Nixon Off the Hook. It Shouldn't Stop Investigation of the Capitol Riots

 

 

The idea of there being an equivalency—moral or otherwise—between the Capitol riot of January 6 and Black Lives Matter and Antifa street agitation is preposterous. Republican attempts to submarine the January 6 commission proposal based on isolated incidents of urban violence that arose during legitimate citizen protest over police abuses are a complete ruse. But whataboutism is a favorite tactic of politicians who have no good response for their own malfeasance or that of their followers.

Let’s take Richard Nixon and Watergate as the example.

In January 1973, Richard Nixon faced a sticky situation. He was trying to end the Vietnam War for the United States through brutal tactics because diplomacy had failed. In December, Nixon, without Congressional approval, instituted a punishing bombing campaign of Hanoi and Haiphong Harbor to bring the North Vietnamese back to the bargaining table in Paris. The strategy was having some success, though at the expense of international outrage over the bombing of civilian centers.

Nixon wanted support from the former president, Lyndon Johnson, and reached out to him on January 2, 1973, for what would be their last phone call. Johnson encouraged Nixon to keep at it. “Well, I just feel the torture you are going through on Vietnam,” Johnson said. Nixon replied: “As you know, and I’m sure you feel the same way, we’ve got to get this finished in the right way and not in the wrong way.”

Johnson responded, almost inaudibly, “You’re doing it, and I just wish the best for you.”

The complication for Nixon was that the Watergate burglars’ trial was just about to commence in Judge John Sirica’s courtroom in Washington. Howard Hunt, one of the leaders of the burglars, pleaded guilty before the trial started, believing he had an implicit promise of a pardon through his lawyer’s talks with Chuck Colson, a Nixon adviser. All of this seemed suspicious to Senate Democrats, including Ted Kennedy, Sam Ervin and Mike Mansfield, who were making noises about an investigation into Watergate and the political shenanigans of the 1972 campaign.

Nixon wanted a strategy to starve the Congressional appetite for a full-blown Watergate investigation. John Dean, Nixon’s White House Counsel, suggested to Nixon’s Chief of Staff Bob Haldeman that Nixon rummage around at the FBI to see if there might be corroboration to the rumor that LBJ had wiretapped Nixon’s campaign plane in the 1968 campaign. J. Edgar Hoover supposedly told Nixon after he was elected that his plane had been bugged by Johnson.

Nixon never forgot it. And now Dean was recommending they try to “turn off” the Watergate investigation by threatening to expose Democratic wrongdoing in 1968, or whataboutism.

Nixon ordered a deep dive into FBI files and had his staff contact former FBI assistant to Hoover, Cartha “Deke” DeLoach, to find out if hard evidence existed of the 1968 plane bugging. This was a dangerous game, as Nixon knew LBJ would react with anger if he found out. Nonetheless, Nixon persisted.

On January 11, Nixon asked for an update from Haldeman. John Mitchell, Nixon’s attorney general, had spoken with DeLoach, who, according to Haldeman and an Oval Office tape, confirmed that the spying on Nixon did take place. DeLoach offered to help find confirming evidence but refused to provide an affidavit. Nixon was unhappy. “Bob, I want it from DeLoach.”

All concerned were wary of Johnson’s reaction. Schemer that he was, Nixon suggested that someone tell Johnson that the Washington Star was on to the 1968 bugging story and that together they had to squelch the story by telling Congress to back off all campaign investigations, whether of 1968 or 1972.

The ploy backfired. “LBJ got very hot,” according to Haldeman and called DeLoach and said, “if the Nixon people are going to play with this, that he would release [deleted material—national security], saying that our side was asking for certain things to be done.” Whatever the counterthreat was, it was serious enough to be classified to this day.

In the end, the gamesmanship did nothing to deter the Watergate investigation. The Senate voted in February to start its inquiry. That famous investigation, led by North Carolina Democrat Sam Ervin and Tennessee Republican Howard Baker, broke the back of the Nixon administration’s criminal cover-up of the Watergate break-in. John Dean broke ranks and testified about his own culpability in the cover-up and his warning to Nixon that there was a “cancer growing on the presidency.” Dean’s testimony was fully corroborated when the White House tapes were ordered turned over by the Supreme Court in July 1974. Weeks later, Nixon resigned.

Whataboutism is a dangerous strategy. At least in the Watergate example the two instances were of some equivalence, if true—both involved presidential surveillance that was probably unlawful. The current situation is vastly different. Comparing protest violence connected to street demonstrations might be apples to apples: violence at BLM protests and violence by Trump's Proud Boys supporters in the street protests in Washington in December 2020 might be considered of a kind.

But a violent insurrection at the United States Capitol with the purpose of stopping Congress from certifying a presidential election is of a different character and magnitude altogether. A frontal attack on democracy is entirely different from street violence that is easily controlled by law enforcement. We have had riots in our cities and looting related to civil unrest, but we have never had the seat of government placed under siege, except in time of war with Britain in 1812.

If Republicans want to investigate street violence, they are free to look into it. But it is imperative to our very form of government that the insurrection of January 6 be completely and fully investigated.

Two Films Show the Historical Toll and Present Danger of Ethnic Violence

Still from Habermann (2010)

 

 

With ethnic tensions in the USA much in evidence, as witnessed by attacks on Asian Americans, increased antisemitism, and continuing Trumpian white resentment against minorities and immigrants, two films recently available on Amazon Prime Video that display ethnic hatred are timely indeed. They also reflect real historical happenings.

 

The first, listed as “Hatred” on Prime, is a Polish drama that first came out in 2016 under the title “Volhynia.” The new title fits: the movie shows hatred aplenty, including some brutal killings--especially of Poles by Ukrainians.

The original title refers to a region just south of modern-day Belarus. Now part of Ukraine, it has historically bounced back and forth between Russian and Polish control: Russia ruled it in the nineteenth century, and Poland controlled its western part between WWI and WWII. A graphic at the film’s beginning cites the ethnic makeup of Volhynia prior to WWII as 70 percent Ukrainian, 16 percent Polish, and 10 percent Jewish.

Many viewers of the film will probably just shake their heads and ask how humans can be so hateful and cruel to each other. All the ethnic and religious killings remind one of the lines from Indian-born writer Salman Rushdie’s The Moor’s Last Sigh: “In Punjab, Assam, Kashmir, Meerut--in Delhi, in Calcutta--from time to time they slit their neighbor’s throats. . . . They killed you for being circumcised and they killed you because your foreskins had been left on. Long hair got you murdered and haircuts too; light skin flayed dark skin and if you spoke the wrong language you could lose your twisted tongue.”

 

But in Volhynia, as in India and Pakistan, not only did different ethnic groups (e.g., Poles, Ukrainians, Jews, and Russians) clash, but so also did differing religious beliefs--in Volhynia’s case, Catholicism, Orthodoxy, and Judaism. In eastern Europe as a whole, similar conflicts occurred from even before the assassination of the Austrian archduke Franz Ferdinand by a Bosnian Serb in 1914 up to the 1990s conflicts among Serbs, Croats, Bosnian and Kosovar Muslims, and other ethnic groups in the former Yugoslavia, which led to the deaths of hundreds of thousands and created millions of refugees. In his sweeping The War of the World: Twentieth-Century Conflict and the Descent of the West, historian Niall Ferguson lists ethnic conflict as one of the three main causes of the “extreme violence” of the century, and central and eastern Europe as the most deadly of the “killing spaces.”

 

Hatred centers on the story of a young Polish girl, Zosia Głowacka, living in a Volhynian village. It begins with the wedding of her sister to a Ukrainian. Zosia is also in love with a young Ukrainian, Petro--an early scene shows them physically intimate. But in exchange for farmland and some animals, her father marries Zosia to Maciej, an older widowed Polish landowner with children (In the interwar years, the Polish government had helped many war veterans and other Polish colonists settle in Volhynia, which contributed to Ukrainian resentments). The wedding occurs on the eve of Germany’s 1939 invasion of western Poland, and Maciej is soon drafted to help the Poles fight the Germans.  

 

After the Germans quickly rout them, Maciej and other Poles attempt to return to their homes in Volhynia, but many of them are captured, tortured, and killed by local Ukrainians. Maciej, however, returns to his village by disguising himself as a Ukrainian. But he will not remain there long because, according to the secret agreement attached to the Nazi-Soviet Pact of August 1939, part of “Polish” Volhynia is to be taken over by the Russians. After they do so, they arrest many Poles, including Maciej, and send them to forced labor in Siberia or Kazakhstan; meanwhile, the new teacher tells her young students (in Russian) that religion is a superstition.

 

From late 1939 to the summer of 1941, Zosia remains at Maciej’s farm with his children and an infant son of her own, probably fathered by Petro, who gets killed soon after helping Maciej’s children and Zosia avoid deportation.

 

In June 1941 the Germans attack the USSR and quickly take over the Volhynian area where Zosia lives. Some of her fellow villagers who are Ukrainian greet the German troops and cooperate in the arrest and killing of Jews and Poles. Zosia, however, risks her own life to help some Jews.  

 

By the summer of 1943 the Ukrainian Insurgent Army (UPA) has grown, as has the number of local Poles it has killed. The film shows two Ukrainian Orthodox priests preaching to their congregations. The first warns against excessive nationalism, but the second states, “We need to fill all the rivers with Polish blood because Ukraine has to be pure.”

 

Shortly thereafter some of the most horrific scenes of the film appear as local Ukrainians burn Polish huts and kill by burning, stabbing, axing, and other means, while shouting, “Death to the Poles.”

 

Zosia escapes with her little son, but sees her stepson burned alive. Eventually, she arrives at the home of her sister, Helena, and her Ukrainian husband, Vasyl, who is urged by his brother to kill the Polish Helena. Instead, Vasyl ends up killing his own brother with an axe.

 

Shortly thereafter, however, it is the Poles’ turn to be barbaric. They kill Helena’s whole family, including Helena herself for having married a Ukrainian. Zosia escapes again, hiding in the woods with her son.

 

The film’s final scene shows Zosia on a long dirt road, lying with her son in the back of a horse-drawn cart, being transported by a kindly young man who found them in the woods. The following wording is then displayed on the screen: “In the period of 1943-45 an estimated 80 to 100 thousand Poles and 10 to 15 thousand Ukrainians had fallen victim to Ukrainian nationalists’ attacks and Polish retaliations in the Eastern Borderlands” (According to various historical sources, these estimates seem a bit high, but a joint Polish-Ukrainian conference in 1994 agreed that 50,000 Polish deaths was a moderate estimate).

 

The second film on Amazon Prime, Habermann, is a 2010 Czech-German production that reflects tensions between Czechs and Germans in the Sudetenland, the border region between Germany and Czechoslovakia, during the years 1938 to 1945. This region was part of Czechoslovakia from 1918 until 1938, when Hitler annexed it. On the first page of Mein Kampf [1925], Hitler had written that all German-speaking people should be united in an enlarged Germany. In March 1938 he began this process by absorbing (German-speaking) Austria into Germany. Later that year, in late September, he got the governments of England and France to “appease” him (in the infamous Munich Agreement) by agreeing that the Sudetenland, where ethnic German speakers were in the majority, was to be given to Germany.

 

August Habermann is a Sudeten German sawmill owner whose family has run the mill for generations. He is married to the Czech Jana, whose father (unbeknownst to her) was Jewish. They have a young daughter. August’s best friend is a Czech forester named Jan Brezina, who is married to Martha, an ethnic German. Most of the employees at the mill are ethnic Czechs, and August treats them fairly. But he begins having major problems after the German takeover, when SS Major Koslowski starts making demands on him and the sawmill and complains that Habermann employs mainly Czechs as opposed to Sudeten Germans.

 

The film then displays various examples of Nazi cruelty--for example, Major Koslowski demands that Habermann select 20 Czech civilians for execution to avenge the deaths of two German soldiers, and Habermann’s wife Jana is sent to a concentration camp. It also shows various examples of Czech resentment of Nazi control. Although some Sudeten Germans like August Habermann are unhappy about Nazi demands, others, like his younger brother Hans, who joins the German army, are fervent Nazi supporters.

 

The movie’s final scenes, like its opening one that foreshadows them, display the violent wrath (including killing) of the Czechs against their Sudeten German neighbors after the Nazis pull out in 1945. Unfortunately for August, the local Czechs blame him for cooperating with the Nazis. They even direct their hatred at his Czech wife, Jana, who has been freed from the concentration camp. “Habermann’s whore,” they call her.

 

Like Hatred, Habermann visualizes for its audiences many unpleasant truths about how beastly we humans can be to one another. According to the Czech historian Tomas Stanek, in Verfolgung 1945 [Persecution 1945], from May until early September 1945, Czechs brutalized and killed hundreds of thousands of Sudeten Germans as they drove them out of Czechoslovakia.

 

As we watch all this ethnic hatred on display, we naturally ask ourselves, why?  How can we act so inhumanely toward other human beings?

 

In Chapter 1, “A Century of Violence,” of my An Age of Progress? Clashing Twentieth-Century Global Forces (2008), I attempted to explain ethnic and other 20th-century violence. In doing so, I cited the Nobel Prize-winning economist Amartya Sen, who wrote that much of it flowed from “the illusion of a unique and choiceless identity,” for example, that of nationality, race, or class. He added that “the art of constructing hatred takes the form of invoking the magical power of some allegedly predominant identity that drowns other affiliations and in a conveniently bellicose form can also overpower any human sympathy or natural kindness that we may normally have.”

 

I also indicated that

 

there are many reasons why the deaths of foreigners or those considered fundamentally different seemed to matter much less to people than the deaths of those more similar. . . .  It is natural for people to feel more compassion for those closer to them--for family members, neighbors, or members of a group or nation with whom they identify. In addition, in the case of a nation or state, patriotism and nationalism were often reinforced by education, by media, and by social and cultural rituals such as the singing of national anthems, and, especially in wartime, by government propaganda. (For more of my thoughts on the motivations for violence and the dehumanization that often precedes it, see here.)

 

Looking specifically at the two films reviewed here, a dwelling on past grievances and an ideology of ethnic nationalism are two main causes of much of the bloodshed: Volhynian Ukrainians remembering mistreatment by the Polish government and by individual Poles in the first film, and Czech revenge for Nazi oppression in the second. In an interview about his Volhynian film, director Wojciech Smarzowski said it is “against extreme nationalism. The film is a warning–it shows what a human being is capable of doing when equipped with a relevant ideology, political or religious doctrine and is allowed to kill.”

 

I have often written against any nationalism, dogmatism, or ideology that makes us less tolerant of others. And I closed a recent essay with the hope that the USA could be “a land of many ethnic groups and various believers and non-believers [who] can live harmoniously together,” and can become “stronger, not in spite of its many elements, but because of them.”

How exactly to do so is complex, but a starting point might be to look at the example of the South Africans Nelson Mandela and Archbishop Desmond Tutu, who in the 1990s respectively created and chaired a Truth and Reconciliation Commission, which attempted to move South Africa’s Whites and Blacks beyond the cycle of violence and counter-violence.

Governing With an Evenly Divided Senate is a Rare Tightrope Act

Sen. Kyrsten Sinema (D, AZ) may ultimately decide how much legislation passes the Senate before the midterm elections.

 

 

Ronald L. Feinman is the author of Assassinations, Threats, and the American Presidency: From Andrew Jackson to Barack Obama (Rowman & Littlefield Publishers, 2015). A paperback edition is now available.

 

 

Today, America has, for the fourth time, an evenly divided US Senate. Already this has complicated the ability of President Joe Biden and the Democratic Party to accomplish their goals.  Senate Democrats need party unity in an unusually urgent way. Passing most legislation under current Senate rules is blocked by the ability of the Republicans to filibuster. While 50 Democrats plus Vice President Kamala Harris could vote to change the rules, two party moderates, Joe Manchin of West Virginia and Kyrsten Sinema of Arizona, have been prominently opposed to changing the filibuster despite Republican obstructionism.

The tactic of budget reconciliation has allowed the passage in March of the “American Rescue Plan,” and may be pursued for the “American Jobs Plan,” the much-debated infrastructure bill, due to constant refusal of Republicans in the Senate to present a counterproposal close enough to the Biden Administration plan to begin real bipartisan negotiation.  Another major initiative, the “American Families Plan,” a major federal initiative promoting education, healthcare and child care by raising taxes on individuals who earn more than $400,000 in income and on corporations, also faces political barriers.  

It is becoming apparent that only modifying or removing the filibuster, which effectively requires a 60-vote supermajority to end debate and move legislation toward a final vote (and thus allows a minority faction to control legislation), will make it possible to accomplish such ambitious goals, the greatest since the Great Society or the New Deal. Likewise for Biden’s goals for civil rights, gun regulation, voting rights, climate change, immigration, the minimum wage, criminal justice reform, and education.  What’s more, Biden faces another opponent in the calendar; Republicans are betting on the filibuster continuing to prevent Democrats from passing potentially popular legislation before the midterm Congressional elections in 2022, which historically favor the party opposed to the president.

The average age of current US Senators is 63. Five Senators are older than 80, 25 are in their 70s, and 18 of these 30 Senators are Democrats. There is also concern and alarm over the fact that if one of those 18 Democrats should become incapacitated or die, the Republican Party would hold a majority of the Senate.

The Senate has been evenly split three times in the past, with the 83rd Congress (1953-1955) covering the first two years of the Eisenhower presidency being the most chaotic.  In January 1953 the Senate had 48 Republicans, 47 Democrats, and Independent Wayne Morse of Oregon.  Three senators died in 1953, and six died in 1954, a total number never reached before or since that time. After the death of Senate Majority Leader Robert Taft of Ohio, on July 31, 1953, the Democrats had more members for nearly a year, until July 7, 1954.  However, while Independent Wayne Morse had left the Republican Party, he agreed to caucus with them to keep their majority. The Democrats would not be able to take over leadership in the 83rd Congress.  However, in 1955, with the Democrats controlling the Senate by one vote, Morse finally joined the Democratic Party, although he voted in an independent fashion.

Seven decades earlier, the 47th Congress (1881-1883) convened with 76 Senators from 38 states. With the Democrats and Republicans evenly divided, the two Independents divided their party support. Republicans could rely on the vote of Chester A. Arthur, the vice president under James Garfield, to break the tie. Arthur’s accession to the presidency in September 1881 after Garfield’s death by assassination made the Republican hold on the Senate more tenuous, as the vice presidency remained vacant for the remainder of Arthur’s only term and the Senate elected presidents pro tempore from within its own ranks.

Most recently, the 107th Congress (2001-2003) under President George W. Bush saw party control of the US Senate switch a total of three times in a two year period.  The Democrats technically organized the evenly divided chamber for the first seventeen days of January 2001, while Al Gore was still vice president and could break ties. The Republicans took control on Inauguration Day, January 20, 2001, and held it until June 6, 2001, when Republican Senator Jim Jeffords of Vermont became an Independent and agreed to caucus with the Democrats, switching the Senate from a 50-50 tie to 51-49 Democratic control.  It would remain that way for most of the rest of the Congress, although the Democrats technically lost the majority at the very end. Minnesota Senator Paul Wellstone died in a small plane crash on October 25, 2002, and interim Senator Jean Carnahan of Missouri was defeated at the polls that November.  Her husband, Mel Carnahan, had been elected posthumously to the Senate in 2000, three weeks after he was killed (also in a small plane crash), and she had been appointed to the seat until the next regular Congressional election in 2002. Since the Senate was not in session or doing any important business in November and December 2002, this final party switch had no practical consequences.

One must hope that the aging Senate will not see the loss of members of either party, but a death could be politically significant, depending on the timing and the partisan affiliation of the deceased.  History suggests the Democrats, the party in the White House, are likely to lose seats and thus control of the Senate in the 2022 midterm elections.  But two thirds of the seats coming up for election are now held by Republicans, and a number of veteran Republicans are not running for reelection. The Democrats could defy that pattern, win seats, and achieve a solid majority in the Senate for the third and fourth years of the Biden term.  Having a record of popular legislation will be essential in that effort. If the Democrats can’t accomplish enough through reconciliation, scrapping the filibuster may be their only chance.

The Roundup Top Ten for June 4, 2021

What We Believe About History

by Kristin Kobes Du Mez

"Understanding that beliefs have a history does not preclude a commitment to truths outside of history. But it does prompt believers to consider how historical forces and cultural allegiances may have shaped their own deeply held convictions."

 

Anti-Vaxxers are Claiming Centuries of Jewish Suffering to Look like Martyrs

by Sarah E. Bond

"Anti-vaxxers and anti-maskers would have us believe that the evil of being encouraged to get a vaccine is the same as the project of ethnic labeling and cleansing undertaken by the Third Reich. It appears at first a farcical analogy, but it’s not without its dangers."

 

 

Beyond the Nation-State

by Claire Vergerio

Much of what has been told about the rise of the nation-state from the Peace of Westphalia in 1648 is wrong. Reevaluating the history of the nation-state is essential for conceptualizing solutions to local and global problems that defy the logic of the nation-state.

 

 

The Racist Roots of Campus Policing

by Eddie R. Cole

Campus police forces often trace their origins to moments when Black demands for expanded housing opportunity clashed with universities' ambitions for expansion or desire to maintain white residential areas near their campuses. 

 

 

The Unbearable Easiness of Killing

by Arie M. Dubnov

"As a colleague justly commented, it is only helpful to call a situation ‘complicated’ if one is committed to unfolding the package, willing to examine its contents and prepared to be surprised by what one finds hidden inside."

 

 

How Cruelty Became the Point of Our Labor and Welfare Policies

by Gail Savage

The persistence of Malthusian thinking in social welfare debates is leading to policies that create needless suffering and a corrosion of the common bonds of humanity that sustain a society.

 

 

The Reconstruction Origins of "Black Wall Street"

by Alexandra E. Stern

Understanding Tulsa's Black Wall Street as a product of the rise and fall of Reconstruction helps to think more productively about how the Tulsa massacre speaks to the policy problems of racial justice. 

 

 

It’s Time to Break Up the Ivy League Cartel

by Sam Haselby and Matt Stoller

Ivy League institutions have an unfair hold on the distribution of opportunity and on the diversity of ideas in America and the world. 

 

 

James Meredith Reminds Us that Powerful Movements can Include those with Very Different Ideas

by Aram Goudsouzian

Meredith’s historical meaning is slippery, but that very inability to pin him down can teach important lessons – not only for how to remember the 1960s, but for how to think about social change.

 

 

Race, Free Speech, and the Purge of Campus Blasphemers

by Jonathan Zimmerman

An adjunct literature instructor at St. John's University has fallen victim to an administration's desire to turn complex teaching challenges – like how to evaluate Twain's use of racial slurs in the context of satire – into simple rules.

 

In 1844, Nativist Protestants Burned Churches in the Name of Religious Liberty

A mob burns St. Augustine's Catholic Church in Philadelphia, 1844, from John B. Perry, A Full and Complete Account of the Late Awful Riots in Philadelphia

 

 

Former U.S. senator Rick Santorum has deservedly lost his position at CNN for his April speech in which he described all of Native American culture as “nothing.” But he made that remark in service to an equally suspect claim: that America “was born of the people who came here pursuing religious liberty to practice their faith, to live as they ought to live and have the freedom to do so. Religious liberty.” Contrary to Santorum’s rosy picture, many of the English settlers of what is now the east coast of the United States were as devoted to denying religious liberty to others as they were to securing their own ability to worship as they pleased. And as a committed Catholic, Santorum should know that for many Protestants, “religious liberty” meant attacking the Catholic Church.

 

The first English monarchs to back colonization hoped to contain Catholic expansion with what historian Carla Gardina Pestana calls “a Protestant empire.” While some colonies persecuted dissenters—whipping Baptists and Quakers—most tolerated varieties of Protestantism. But the settlers often drew the line at Catholicism. Each November, colonists celebrated “Pope’s Day” by lighting bonfires, firing cannon, and marching effigies of the pontiff through the streets, all to celebrate their common Protestant identity. Colonial governments outlawed Catholic priests, threatening them with life imprisonment or death. Even Maryland, founded in part as a Catholic haven, eventually restricted Catholic worship.

 

The Revolution—secured with the help of Catholic Spain and France, as well as that of many American Catholics—toned down some of the most vicious anti-Catholicism. Most American Protestants learned to respect and live with their Catholic neighbors. But while the United States Constitution forbade the establishment of religion or religious tests for office, individual states continued to privilege Protestantism. Some limited office holding to Protestants, declared Protestantism the official religion, and, most commonly, assigned the King James Bible in public schools, over the objections of Catholics.

 

Political anti-Catholicism gained new adherents in the 1830s, in response to both Catholic Emancipation in the British Empire and increased Irish Catholic immigration to the United States. In 1835, New York’s Protestant Association debated the question, “Is Popery compatible with civil liberty?” In 1840, a popular Protestant pastor warned that “It has been the favourite policy of popish priests to represent Romanism as a harmless thing.” “If they ever succeed in making this impression general,” he continued, “we may well tremble for the liberties of our country. It is a startling truth that popery and civil and religious liberty cannot flourish on the same soil; popery is death to both!”

 

Such beliefs led anti-Catholics to attack Catholic institutions as alien intruders. In August 1834, a mob burned down the Ursuline convent in Charlestown, Massachusetts, acting in the conviction that they were protecting American liberty against an institution that “ought not to be allow[e]d in a free country.” Five years later, a Baltimore mob threatened a convent there with a similar fate. As Irish immigrants filled both the pews and pulpits of American Catholic churches, such anti-Catholicism merged with a nativist movement that hoped to restrict immigration and make naturalization difficult.

 

The most sustained attack against Catholics came in Philadelphia in the spring and summer of 1844. Inspired by the success of a third-party nativist candidate in New York City’s mayoral election, Philadelphia nativists staged their own rallies throughout the city and its surrounding districts. In May, rallies in the largely Irish Catholic Third Ward of Kensington sparked three days of rioting. On the third day, nativist mobs burned two Catholic churches, along with the adjacent rectories and a seminary. Outside of one church, they built a bonfire of Bibles and other sacred texts, and cheered when the cross atop the church’s steeple collapsed in flame. In a nearby Catholic orphan asylum, the superioress wondered how she could evacuate nearly a hundred children if the mob attacked. “They have sworn vengeance against all the churches and their institutions,” she wrote. “We have every reason to expect the same fate.”

 

In the aftermath of the May riots, a priest in the heavily nativist district of Southwark resolved to prepare his church against future attacks. Along with his brother, he organized parishioners into a security force, armed with a collection of weapons ranging from surplus military muskets to bayonets stuck on brush handles. When, in July, the church’s neighbors realized the extent of his preparations, they concluded that the Catholics were planning to murder their Protestant neighbors in their sleep. Mobbing the church, they launched a second wave of riots, and even bombarded the church with a stolen cannon. Eventually, the county’s militia arrived in force and fired into the crowd. By the time the fighting was over, two dozen Americans were dead, and the nation was in shock.

 

Throughout all of this, leading nativists insisted that they tolerated all religions. “We do not interfere with any man’s religious creed or religious liberty,” asserted one. “A man may be a Turk, a Jew or a Christian, a Catholic, Methodist or a Presbyterian, and we say nothing against it, but accord to all a liberty of conscience.” He then immediately revealed the limits of his tolerance: “When we remember that our Pilgrim Fathers landed on Plymouth rock, to establish the Protestant religion, free from persecution, we must contend that this was and always will be a Protestant country!” That second sentiment—the insistence that the country truly belonged to members of one creed—explains the fury of the mob.

 

The same cramped view of religious liberty echoes in Santorum’s speech. As a Catholic, Santorum unsurprisingly identifies America with “the morals and teachings of Jesus Christ,” rather than only Protestantism. He also calls the United States “a country that was based on Judeo-Christian principles,” letting Jews halfway into his club. But any effort to privilege some religions over others reminds us that purported advocates of tolerance may be religious supremacists under the skin. Pursuing religious liberty for one’s own kind is only the beginning of freedom. Securing liberty to all is the true achievement.

The War Beat, Pacific: How the American News Media Went to War Against Japan

Reporters observe the surrender of Japan on the USS Missouri, Tokyo Bay, September 2, 1945. Still from footage by William Courtenay

 

 

 

Even the barest outline of what had happened was enough to transform America.

 

The news broke of the Japanese attack on Pearl Harbor just after lunch on the East Coast. Within minutes, mayors had placed their cities on a war footing, FBI agents were rounding up Japanese Americans, and tearful families were gathering at train stations and bus depots as soldiers and sailors, their leaves canceled, headed off to fight. In Congress the next day, every legislator, bar one, voted for the declaration of war, while the Republican leadership pledged to adjourn politics “for the duration.” One opinion survey found the nation “deeply resentful of the treachery.” Another concluded that “commentators of all political hues are in agreement that the first Japanese bomb dropped upon Hawaii wrought suddenly the miracle which no amount of logic or persuasion had previously been able to achieve,” making isolationism “the initial casualty of the war.”

 

Yet Americans had received surprisingly little information about what had happened in Hawaii on Sunday, December 7, 1941. Early that afternoon, the Navy Department had instructed officers to “place naval censorship in effect.” As a result of this order, one reporter had a news alert cut off halfway through transmission—although he, like others in the US press corps scattered across the Pacific, did not initially mind. These correspondents recognized the extreme importance of denying operational information, including the numbers of ships sunk on Battleship Row, to the marauding Japanese enemy. It was only over time, as the Navy persisted with its policy of super-strict censorship, that their frustration grew, especially once the military’s concerted effort to control the flow of news was compounded by other problems.

 

Just getting close to the battlefield presented the first challenge. A sea voyage offered an uninviting mixture of long periods of boredom on an alcohol-free ship, punctuated with short bursts of terror whenever someone spotted an enemy plane or submarine. An air trip could be infinitely worse. Some of the planes that the military used to ferry men back and forth to islands like New Guinea and Guadalcanal were unreliable. Correspondents who boarded them knew they were taking their life in their own hands, even if their aircraft managed to miss the terrible tropical turbulence that, as one noted, made the sky “as rough and solid seeming as an angry sea.” Once safely on the ground, the climate was another trial, from the vicious heat and humidity in the summer to the torrential downpours during the rainy season. And then there were the many unsavory aspects of jungle life: “spiders as big as saucers . . .,” observed one correspondent in New Guinea, “butterflies as big as birds, lightning bugs like flashlights.” The mosquitoes posed the biggest menace of all, since they spread malaria; but other nasty diseases also took their toll, including dysentery, dengue fever, and “jungle rot.”

 

The correspondents who managed to witness the fighting then had to overcome a further series of obstacles. “Stories,” one of them remarked, “had to be flown back to Pearl Harbor, to Australia, or long distances to ships that were far enough out of range of the [enemy] to break radio silence before they could be transmitted. Often, planes were not immediately available for this courier service,” with the result that it “was usually a matter of days before the stories reached the US and found their way into print.” Even then, editors often considered the Pacific War less newsworthy than other theaters, much to the annoyance of the correspondents who risked their neck to report it. A Time reporter in the Solomon Islands in 1943 was so angry when he discovered that his magazine had cut his story to make space for “a new batch of Russian War pictures,” that he sent a cable to his editors protesting “desolate Time’s treatment.” A correspondent who covered the bloody Saipan invasion in the summer of 1944 was equally upset when he discovered how his reports had rated, protesting they had been placed a “very bad fourth,” way behind the D-Day invasion in Normandy and the Republican National Convention.

 

This sense of neglect was a surprisingly common refrain throughout much of the war. On one occasion in late 1942, Hanson Baldwin of the New York Times became so frustrated that he claimed the Pacific War had become “the ‘unknown war.’” This was an exaggeration. Even at times when military censorship was overly stringent, or when editors were preoccupied with events elsewhere, the home front usually received some basic information on the fight against Japan.

 

As the war progressed, the situation also began to improve. During 1943 and 1944, both the Army and the Navy made a conscious effort to lift the veil that had descended over the Pacific War. In fact, the two services engaged in a dynamic rivalry, as first one, and then the other, tried to grab the biggest share of the headlines. Douglas MacArthur invariably took the lead, prodding his PROs to devise new methods to speed the flow of stories about his numerous invasions in New Guinea and then the Philippines during 1943 and 1944. The culmination came during the liberation of Luzon in January 1945, when MacArthur’s officers established a “floating press headquarters on three small ships . . . whereby correspondents, PROs, censors, and a transmission set-up were able to move in just behind the assault waves and begin functioning immediately.” The Navy eventually responded, so that during the battle for Iwo Jima in February 1945 it had the capability to send both dispatches and photographs to America in time for publication on the same day—an innovation the press praised as ranking “among the miracles of modern transmission.”

 

As the war reached its murderous conclusion, not every story was reported in such graphic detail. When the Japanese decided to contest every street in Manila, resulting in the death of more than 100,000 civilians, MacArthur had little desire to draw attention to this aspect of his return to the Philippines. The Navy, meanwhile, remained leery of releasing news on the kamikaze attacks, not wanting the Japanese to know how much damage their suicide pilots were inflicting. Navy censors even blocked the release of one major story on the kamikazes until September 2, 1945, when, as the correspondent who had written it remarked, “it couldn’t hope to compete with the signing of the surrender aboard the [USS] Missouri.”

 

More than three hundred correspondents, broadcasters, and photographers assembled to record this surrender ceremony, making it, as one reporter noted, “the most thoroughly covered [story] of the war in all theaters.” It was certainly a complete contrast to the first three years of the Pacific War, when Americans received surprisingly little battlefield information about some of the biggest, most controversial events, from Pearl Harbor and the Bataan death march to the Marines’ heroics on Guadalcanal. In this period, the Pacific had been “the shrouded war.”

Politics Aside, Was Trump Even an Effective Leader?

 

 

 

One of the most intriguing aspects of current politics is the fealty of the Republican Party to Donald Trump. Central to this loyalty is the view that Trump was an effective leader. As a candidate in 2016, the future president claimed that he was uniquely qualified to lead the country, unite the public, and overcome gridlock in Congress. To accomplish these goals would require successful persuasion. Was this talented self-promoter able to win public support for his initiatives? Was this experienced negotiator able to overcome polarization in Congress and obtain agreement on his proposals? Was Donald Trump an effective leader?

 

Did the public follow the president’s lead?

 

At the core of Donald Trump’s political success were his public relations skills. He possessed well-honed promotional talents sharpened over a lifetime of marketing himself and his brand. Once in office, the president wasted no time in conducting a permanent campaign to win the public’s support. On the day of his inauguration, Trump filed for reelection with the Federal Election Commission. Less than a month afterwards, on February 18, 2017, he held the first of what were to be dozens of political rallies around the country.

 

Did he succeed in winning support for himself and his policies? As I have shown in great detail, he did not. Instead, he consistently failed to win the public’s backing for either his policies or his own handling of them. Indeed, he seemed to turn the public in the opposite direction. He made the previously unpopular Affordable Care Act popular, and made the health care policies he backed unpopular. Similarly, in the face of a general desire to control our borders and protect the country from terrorists, Trump managed to alienate the public from his immigration policies. Perhaps most remarkably, his tax cut for nearly all taxpayers and businesses was unpopular. In addition, the public remained supportive of free trade and critical of his handling of trade policy.

 

Capping Trump’s failure to win public support was his earning the lowest average level of general job approval of any president in the history of polling. Moreover, this approval was also the most polarized, with the difference among members of the two major parties averaging 81 percentage points.

 

As president, Trump dominated the news, but his impulsive, undisciplined, and divisive communications created distractions from his core message and alienated the public. His discourse was characterized by ad hominem attacks aimed at branding and delegitimizing critics and opponents, exaggerated threats and inappropriate offers of reassurance, blurred distinctions between fact and fiction, encouragement of cultural divisions and racial and ethnic tensions, and challenges to the rule of law. The public was not persuaded by this inflammatory rhetoric and concluded that he was an untrustworthy source of information.

 

Trump was more successful in solidifying his core supporters—those who already agreed with him. Although we cannot know for certain, it appears that his rallies, tweets, and other communications—along with affective polarization and motivated reasoning—kept Republicans in the public in his camp, making it more difficult for congressional Republicans to challenge him.   

 

Most significantly, it appears that Trump’s efforts to influence the public were detrimental to the polity. His rhetoric encouraged incivility in public discourse, accelerated the use of disinformation, legitimized the expression of prejudice, increased the salience of cultural divisions and racial and ethnic tensions, and undermined democratic accountability. For the Republicans who followed him, he distorted their knowledge about politics and policy, warped their understanding of policy challenges, and chipped away at their respect for the rule of law.

 

Did the president succeed in leading Congress?

 

Donald Trump claimed a unique proficiency in negotiating deals. Announcing his candidacy for the presidency on June 16, 2015, he proclaimed, “If you can’t make a good deal with a politician, then there’s something wrong with you. . . . We need a leader that wrote The Art of the Deal.” Was he able to exploit his experience to win congressional approval for his policies?

 

He was not. Once in office, he floundered. His passivity, vagueness, inconsistency, and lack of command of policy made him an unskilled, unreliable, and untrustworthy negotiator. He often adopted a reactive posture and easily lost focus. He was not successful in closing deals and convincing wavering members, principally Republicans, to support him. His shifting positions, inconsistent behavior, exclusion of Democrats in developing policies, and use of threats and ridicule squandered whatever potential for compromise might have existed. As a result, he received historically low levels of support from Democratic senators and representatives. His high levels of support from Republicans in both chambers of Congress were largely the product of agreement on policy and party leaders keeping votes he might lose off the agenda. When they were resistant, the president could not convince Republicans to defer to him, and his customary tools of threats and disparagement gained him little.

 

Trump was successful in preventing bills he opposed from passing, as are most presidents, but Congress passed little significant legislation at his behest. He was even less successful after Democrats gained control of the House in the 2018 midterm elections. He could not win support for new health care policy, immigration reform, or infrastructure spending. By 2020, he had virtually no legislative agenda. Congress took the lead on pandemic-related bills. Government shutdowns and symbolic slaps at his foreign policies characterized his tenure, even when Republicans were in control of the legislature.  

 

Abandoning Leadership

 

Donald Trump wrote off the majority of the public and much of Congress. His genius for politics focused on playing to his base, with all its attendant detriments for the success of his presidency and the health of the polity. Governing by grievance may have met his personal needs but it did little to enhance his effectiveness as a leader. In the end, his response to his failure to persuade was to push the boundaries of presidential power and violate the norms of the presidency.

The Enduring Fascination – And Challenge – Of World War II

 

 

More than seven decades after the end of World War II, why are we still so fascinated by it?  On a primal level, World War II is the complete package. Violence, action, adventure, romance, drama, death defying feats, passions, race, gender, new inventions, crisis decision making, colorful personalities and leaders, evil personalities and leaders, horror, heroism, and a triumphant ending. It doesn’t get any better (or worse) in the realm of human experience.

World War II also serves to remind us what happens when a country is caught flatfooted and unprepared to respond to a crisis. The Pearl Harbor attack made clear that preparedness for a crisis is paramount, and failure to learn that lesson almost always leads to disaster. The 9/11 attack in 2001 came first. Then, nearly two decades later, the nation failed to prepare and have plans in place to combat the COVID pandemic. In both cases, the U.S. paid a terrible price for its lack of preparedness, just as it had at Pearl Harbor.

It’s simplistic to say that World War II is a case of wanting to hang onto a feel-good, nostalgic past triumph. History is never past. It continues to repeat itself in many ways, and most importantly in many of the eternal issues--war and peace, violence and non-violence, authoritarian rule and democratic government, conservative and liberal ideology, civil liberties and national security, and terrorism and intervention. 

Author and World War II expert Michael Bess says the war continues to challenge us to never lose sight of the nation’s principles and values:

The issue raised here is a vital one for any democratic society: how to balance a commitment to constitutional rights and liberties with the demands of security in wartime.  The lesson of World War II, in this regard, is clear: take the long view; don’t get lost in the panic of the moment.  In 1942, in the name of national security, we Americans seized a racially demarcated subset of our citizenry and threw them in the slammer.  In both cases, the justification was the same: We are at war.  We have to do this in order to survive. But this turned out not to be true.  Not a single case of Japanese-American subversion was ever prosecuted during World War II.

History should be approached as a living, breathing organic day-to-day experience. The events of the past that continually influence, shape, and contain important lessons for the present and the future are perpetually invaluable. One of my favorites is nicely summed up on the University of People website: 

Learn from the past and notice clear warning signs: We learn from past atrocities against groups of people, genocides, wars, and attacks. Through this collective suffering, we have learned to pay attention to the warning signs leading up to such atrocities. Society has been able to take these warning signs and fight against them when they see them in the present day. Knowing what events led up to these various wars helps us better influence our future.

Do “genocide,” “atrocities,” “wars,” “attacks,” “collective suffering,” “warning signs,” “fight against them,” or “better influence our future” sound familiar? The message is that to be forewarned is to be forearmed. That’s the purpose of knowing and taking to heart the great lessons of, and from, the past. In the end the past is the present and the future.

Here are three immediate examples that painfully underscore that. The U.S. stamped an everlasting stain on its claim to be the global champion of democracy when it interned 120,000 Japanese Americans during the war. The interned not only had committed no crime but were productive citizens who made integral contributions to the nation in agriculture, trade, and manufacturing.

The U.S. learned from that heinous act. In the aftermath of the 9/11 attack, fear and hysteria did not run rampant in the nation. There was no wholesale lock-up of Muslims in the country under the guise that they posed a threat to national security. Nearly two decades later, then-President Trump’s demand to exclude citizens of nations deemed “terrorist” from entering the U.S. ignited major resistance and legal challenges. It was soon modified and later scrapped. We learned again.

There were assorted identifiable white nationalist, white supremacist, and neo-Nazi supporters involved in the violence during the Capitol takeover on January 6, 2021. The reaction from the government, media, and public was swift: condemnation, mass arrests, and prosecutions of the perpetrators. Congressional hearings were held that decried the laxity of the response and the ignoring of intelligence warnings of possible violence. There would be no Reichstag-style takeover here.

Then there is the ever-present public tremor over the use of atomic power. When the Biden administration in April 2021 approved a plan to bankroll a multibillion-dollar project in New Mexico to manufacture key components for the nation’s nuclear arsenal, antinuclear and environmental watchdog groups sprang into action, threatening lawsuits, court action, and public protests over the plan.

I could name many more examples of how World War II holds lessons for the present.

The monumental destruction World War II wreaked should never blind us to the fact that the war was first and foremost a major historical event. Such events happen in a continuum of time and place, and they have important social, political, and economic consequences long after their end. In What is History?, the eminent historian E.H. Carr ruminated at length about the inseparable linkage between the past and the present: “It is at once the justification and the explanation of history that the past throws light on the future, and the future throws light on the past.”

Carr goes further, insisting that history has value only when it sheds light on the present and future: “History establishes meaning and objectivity only when it establishes a coherent relation between past and future.”

America’s master oral history chronicler Studs Terkel published many books in which regular folk told their stories about nearly every aspect of American life. It was no surprise, then, that The Good War had the sledgehammer impact on the public that it did when it was released in 1984.

The stories the men and women of World War II told had instant and moving resonance for legions of readers born years, even decades, after the war. They could identify with the human emotions and drama that poured forth in those remembrances. It was the epitome of living history. It was no accident that in May 2021, thirty-seven years after The Good War was published and thirty-six years after it won a Pulitzer Prize, the book still ranked among the top 20 bestsellers in two non-fiction categories on Amazon.

This speaks volumes about why World War II, the “good war,” still fascinates us. And it undoubtedly will continue to.

Review: Lesley Blume's “Fallout: The Hiroshima Cover-up and the Reporter Who Revealed It to the World”  

 

 

In this crisply written, well-researched book, Lesley Blume, a journalist and biographer, tells the fascinating story of the background to John Hersey’s pathbreaking article “Hiroshima,” and of its extraordinary impact upon the world.

In 1945, although only 30 years of age, Hersey was a very prominent war correspondent for Time magazine—a key part of publisher Henry Luce’s magazine empire—and living in the fast lane.  That year, he won the Pulitzer Prize for his novel, A Bell for Adano, which had already been adapted into a movie and a Broadway play.  Born the son of missionaries in China, Hersey had been educated at upper class, elite institutions, including the Hotchkiss School, Yale, and Cambridge.  During the war, Hersey’s wife, Frances Ann, a former lover of young Lieutenant John F. Kennedy, arranged for the three of them to get together over dinner.  Kennedy impressed Hersey with the story of how he saved his surviving crew members after a Japanese destroyer rammed his boat, PT-109.  This led to a dramatic article by Hersey on the subject—one rejected by the Luce publications but published by the New Yorker.  The article launched Kennedy on his political career and, as it turned out, provided Hersey with the bridge to a new employer – the one that sent him on his historic mission to Japan.

Blume reveals that, at the time of the U.S. atomic bombing of Hiroshima, Hersey felt a sense of despair—not for the bombing’s victims, but for the future of the world.  He was even more disturbed by the atomic bombing of Nagasaki only three days later, which he considered a “totally criminal” action that led to tens of thousands of unnecessary deaths.

Most Americans at the time did not share Hersey’s misgivings about the atomic bombings.  A Gallup poll taken on August 8, 1945 found that 85 percent of American respondents expressed their support for “using the new atomic bomb on Japanese cities.”

Blume shows very well how this approval of the atomic bombing was enhanced by U.S. government officials and the very compliant mass communications media.  Working together, they celebrated the power of the new American weapon that, supposedly, had brought the war to an end, producing articles lauding the bombing mission and pictures of destroyed buildings.  What was omitted was the human devastation, the horror of what the atomic bombing had done physically and psychologically to an almost entirely civilian population—the flesh roasted off bodies, the eyeballs melting, the terrible desperation of mothers digging with their hands through the charred rubble for their dying children.

The strange new radiation sickness produced by the bombing was either denied or explained away as of no consequence.  “Japanese reports of death from radioactive effects of atomic bombing are pure propaganda,” General Leslie Groves, the head of the Manhattan Project, told the New York Times.  Later, when it was no longer possible to deny the existence of radiation sickness, Groves told a Congressional committee that it was actually “a very pleasant way to die.”

When it came to handling the communications media, U.S. government officials had some powerful tools at their disposal.  In Japan, General Douglas MacArthur, the supreme commander of the U.S. occupation regime, saw to it that strict U.S. military censorship was imposed on the Japanese press and other forms of publication, which were banned from discussing the atomic bombing.  As for foreign newspaper correspondents (including Americans), they needed permission from the occupation authorities to enter Japan, to travel within Japan, to remain in Japan, and even to obtain food in Japan.  American journalists were taken on carefully controlled junkets to Hiroshima, after which they were told to downplay any unpleasant details of what they had seen there.

In September 1945, U.S. newspaper and magazine editors received a letter from the U.S. War Department, on behalf of President Harry Truman, asking them to restrict information in their publications about the atomic bomb.  If they planned to do any publishing in this area of concern, they were to submit the articles to the War Department for review.

Among the recipients of this warning were Harold Ross, the founder and editor of the New Yorker, and William Shawn, the deputy editor of that publication.  The New Yorker, originally founded as a humor magazine, was designed by Ross to cater to urban sophisticates and covered the world of nightclubs and chorus girls.  But, with the advent of the Second World War, Ross decided to scrap the hijinks flavor of the magazine and begin to publish some serious journalism.

As a result, Hersey began to gravitate into the New Yorker’s orbit.  Hersey was frustrated with his job at Time magazine, which either rarely printed his articles or rewrote them atrociously.  At one point, he angrily told publisher Henry Luce that there was as much truthful reporting in Time magazine as in Pravda.  In July 1945, Hersey finally quit his job with Time.  Then, late that fall, he sat down with William Shawn of the New Yorker to discuss some ideas he had for articles, one of them about Hiroshima.

Hersey had concluded that the mass media had missed the real story of the Hiroshima bombing.  And the result was that the American people were becoming accustomed to the idea of a nuclear future, with the atomic bomb as an acceptable weapon of war.  Appalled by what he had seen in the Second World War—from the firebombing of cities to the Nazi concentration camps—Hersey was horrified by what he called “the depravity of man,” which, he felt, rested upon the dehumanization of others.  Against this backdrop, Hersey and Shawn concluded that he should try to enter Japan and report on what had really happened there.

Getting into Japan would not be easy.  The U.S. Occupation authorities exercised near-total control over who could enter the stricken nation, keeping close tabs on all journalists who applied to do so, including records on their whereabouts, their political views, and their attitudes toward the occupation.  Nearly every day, General MacArthur received briefings about the current press corps, with summaries of their articles.  Furthermore, once admitted, journalists needed permission to travel anywhere within the country, and were allotted only limited time for these forays.

Even so, Hersey had a number of things going for him.  During the war, he was a very patriotic reporter.  He had written glowing profiles about rank-and-file U.S. soldiers, as well as a book (Men on Bataan) that provided a flattering portrait of General MacArthur.  This fact certainly served Hersey well, for the general was a consummate egotist.  Apparently as a consequence, Hersey received authorization to visit Japan.

En route there in the spring of 1946, Hersey spent some time in China, where, on board a U.S. warship, he came down with the flu.  While convalescing, he read Thornton Wilder’s Pulitzer Prize-winning novel, The Bridge of San Luis Rey, which tracked the different lives of five people in Peru who were killed when a bridge upon which they stood collapsed.  Hersey and Shawn had already decided that he should tell the story of the Hiroshima bombing from the victims’ point of view.  But Hersey now realized that Wilder’s book had given him a particularly poignant, engrossing way of telling a complicated story.  Practically everyone could identify with a group of regular people going about their daily routines as catastrophe suddenly struck them.

Hersey arrived in Tokyo on May 24, 1946, and two days later, received permission to travel to Hiroshima, with his time in that city limited to 14 days.

Entering Hiroshima, Hersey was stunned by the damage he saw.  In Blume’s words, there were “miles of jagged misery and three-dimensional evidence that humans—after centuries of contriving increasingly efficient ways to exterminate masses of other humans—had finally invented the means with which to decimate their entire civilization.”  Now there existed what one reporter called “teeming jungles of dwelling places . . . in a welter of ashes and rubble.”  As residents attempted to clear the ground to build new homes, they uncovered masses of bodies and severed limbs.  A cleanup campaign in one district of the city alone at about that time unearthed a thousand corpses.  Meanwhile, the city’s surviving population was starving, with constant new deaths from burns, other dreadful wounds, and radiation poisoning.

Given the time limitations of his permit, Hersey had to work fast.  And he did, interviewing dozens of survivors, although he eventually narrowed down his cast of characters to six of them.

Departing from Hiroshima’s nightmare of destruction, Hersey returned to the United States to prepare the story that was to run in the New Yorker to commemorate the atomic bombing.  He decided that the article would have to read like a novel.  “Journalism allows its readers to witness history,” he later remarked.  “Fiction gives readers the opportunity to live it.”  His goal was “to have the reader enter into the characters, become the characters, and suffer with them.”

When Hersey produced a sprawling 30,000-word draft, the New Yorker’s editors at first planned to publish it in serialized form.  But Shawn decided that running it this way wouldn’t do, for the story would lose its pace and impact.  Rather than have Hersey reduce the article to a short report, Shawn had a daring idea.  Why not run the entire article in one issue of the magazine, with everything else—the “Talk of the Town” pieces, the fiction, the other articles and profiles, and the urbane cartoons—banished from the issue?

Ross, Shawn, and Hersey now sequestered themselves in a small room at the New Yorker’s headquarters, furiously editing Hersey’s massive article.  Ross and Shawn decided to keep the explosive forthcoming issue a top secret from the magazine’s staff.  Indeed, the staff were kept busy working on a “dummy” issue that they thought would be going to press.  Contributors to that issue were baffled when they didn’t receive proofs for their articles and accompanying artwork.  Nor were the New Yorker’s advertisers told what was about to happen.  As Blume remarks:  “The makers of Chesterfield cigarettes, Perma-Lift brassieres, Lux toilet soap, and Old Overholt rye whiskey would just have to find out along with everyone else in the world that their ads would be run alongside Hersey’s grisly story of nuclear apocalypse.”

However, things don’t always proceed as smoothly as planned.  On August 1, 1946, President Truman signed into law the Atomic Energy Act, which established a “restricted” standard for “all data concerning the manufacture or utilization of atomic weapons.”  Anyone who disseminated that data “with any reason to believe that such data” could be used to harm the United States could face substantial fines and imprisonment.  Furthermore, if it could be proved that the individual was attempting to “injure the United States,” he or she could “be punished by death or imprisonment for life.”

In these new circumstances, what should Ross, Shawn, and Hersey do?  They could kill the story, water it down, or run it and risk severe legal action against them.  After agonizing over their options, they decided to submit Hersey’s article to the War Department – and, specifically, to General Groves – for clearance.

Why did they take that approach?  Blume speculates that the New Yorker team thought that Groves might insist upon removing any technical information from the article while leaving the account of the sufferings of the Japanese intact.  After all, Groves believed that the Japanese deserved what had happened to them, and could not imagine that other Americans might disagree.  Furthermore, the article, by underscoring the effectiveness of the atomic bombing of Japan, bolstered his case that the war had come to an end because of his weapon.  Finally, Groves was keenly committed to maintaining U.S. nuclear supremacy in the world, and he believed that an article that led Americans to fear nuclear attacks by other nations would foster support for a U.S. nuclear buildup.

The gamble paid off.  Although Groves did demand changes, these were minor and did not affect the accounts by the survivors.

On August 29, 1946, copies of the “Hiroshima” edition of the New Yorker arrived on newsstands and in mailboxes across the United States, and it quickly created an enormous sensation, particularly in the mass media.  Editors from more than thirty states applied to excerpt portions of the article, and newspapers from across the nation ran front-page banner stories and urgent editorials about its revelations.  Correspondence from every region of the United States poured into the New Yorker’s office.  A large number of readers expressed pity for the victims of the bombing.  But an even greater number expressed deep fear about what the advent of nuclear war meant for the survival of the human race.

Of course, not all readers approved of Hersey’s report on the atomic bombing.  Some reacted by canceling their subscriptions to the New Yorker.  Others assailed the article as antipatriotic, Communist propaganda, designed to undermine the United States.  Still others dismissed it as pro-Japanese propaganda or, as one reader remarked, written “in very bad taste.”

Some newspapers denounced it.  The New York Daily News derided it as a stunt and “propaganda aimed at persuading us to stop making atom bombs . . . and to give our technical bomb secrets away . . . to Russia.”  Not surprisingly, Henry Luce was infuriated that his former star journalist had achieved such an enormous success writing for a rival publication, and had Hersey’s portrait removed from Time Inc.’s gallery of honor.

Despite the criticism, “Hiroshima” continued to attract enormous attention in the mass media.  The ABC Radio Network did a reading of the lengthy article over four nights, with no acting, no music, no special effects, and no commercials.  “This chronicle of suffering and destruction,” it announced, was being “broadcast as a warning that what happened to the people of Hiroshima could next happen anywhere.”  After the broadcasts, the network’s telephone switchboards were swamped by callers, and the program was judged to have received the highest rating of any public interest broadcast that had ever occurred.  The BBC also broadcast an adaptation of “Hiroshima,” while some 500 U.S. radio stations reported on the article in the days following its release.

In the United States, the Alfred Knopf publishing house came out with the article in book form, which was quickly promoted by the Book-of-the-Month Club as “destined to be the most widely read book of our generation.”  Ultimately, Hiroshima sold millions of copies in nations around the world.  By the late fall of 1946, the rather modest and retiring Hersey, who had gone into hiding after the article’s publication to avoid interviews, was rated as one of the “Ten Outstanding Celebrities of 1946,” along with General Dwight Eisenhower and singer Bing Crosby.

U.S. government officials, who until then had been reasonably content with public support for the atomic bombing and a nuclear-armed future, now faced a genuine challenge in Hersey’s success in reaching the public with his disturbing account of nuclear war.  For the most part, they recognized that they had what Blume calls “a serious post-‘Hiroshima’ image problem.”

Behind the scenes, James B. Conant, the top scientist in the Manhattan Project, joined President Truman in badgering Henry Stimson, the former U.S. Secretary of War, to produce a defense of the atomic bombing.  Provided with an advance copy of Stimson’s article, to be published in Harper’s, Conant told Stimson that it was just what was needed, for they could not have allowed “the propaganda against the use of the atomic bomb . . . to go unchecked.”

Although the New Yorker’s editors sought to arrange for publication of the book version of “Hiroshima” in the Soviet Union, this proved impossible.  Instead, Soviet authorities banned the book in their nation.  Pravda fiercely assailed Hersey, claiming that “Hiroshima” was nothing more than an American scare tactic, a fiction that “relishes the torments of six people after the explosion of the atomic bomb.”  Another Soviet publication called Hersey an American spy who embodied his country’s militarism and had helped to inflict upon the world a “propaganda of aggression, strongly reminiscent of similar manifestations in Nazi Germany.”

Ironically, the Soviet attack upon Hersey didn’t make him any more acceptable to the U.S. government.  In 1950, FBI director J. Edgar Hoover assigned FBI field agents to research, monitor, and interview Hersey, on whom the Bureau had already opened a file.  During the FBI interview with Hersey, agents questioned him closely about his trip to Hiroshima.

Not surprisingly, U.S. occupation authorities did their best to ban the appearance of “Hiroshima” in Japan.  Hersey’s six protagonists had to wait months before they could finally read the article, which was smuggled to them.  In fact, some of Hersey’s characters were not aware that they had been included in the story or that the article had even been written until they received the contraband copies.  MacArthur managed to block publication of the book in Japan for years until, after intervention by the Authors’ League of America, he finally relented.  It appeared in April 1949, and immediately became a best-seller.

Hersey, still a young man at the time, lived on for decades thereafter, writing numerous books, mostly works of fiction, and teaching at Yale.  He continued to be deeply concerned about the fate of a nuclear-armed world—proud of his part in stirring up resistance to nuclear war and, thereby, helping to prevent it.

The conclusion drawn by Blume in this book is much like Hersey’s.  As she writes, “Graphically showing what nuclear warfare does to humans, `Hiroshima’ has played a major role in preventing nuclear war since the end of World War II.”

A secondary theme in the book is the role of a free press.  Blume observes that “Hersey and his New Yorker editors created `Hiroshima’ in the belief that journalists must hold accountable those in power.  They saw a free press as essential to the survival of democracy.”  She does, too.

Overall, Blume’s book would provide the basis for a very inspiring movie, for at its core is something many Americans admire:  action taken by a few people who triumph against all odds.

But the actual history is somewhat more complicated.  Even before the publication of “Hiroshima,” a significant number of people were deeply disturbed by the atomic bombing of Japan.  For some, especially pacifists, the bombing was a moral atrocity.  An even larger group feared that the advent of nuclear weapons portended the destruction of the world.  Traditional pacifist organizations, newly-formed atomic scientist groups, and a rapidly-growing world government movement launched a dramatic antinuclear campaign in the late 1940s around the slogan, “One World or None.”  Curiously, this uprising against nuclear weapons is almost entirely absent from Blume’s book.

Even so, Blume has written a very illuminating, interesting, and important work—one that reminds us that daring, committed individuals can help to create a better world.

https://historynewsnetwork.org/article/180389
Preserving the Stories of the Second World War

My interview methods were rudimentary and virtually nonexistent when I first met some of the men I would later interview. I had luckily written to a military historian whose books I devoured as a child, the late Colonel Raymond F. Toliver. He had been the first to release books full of information gathered from his many years of friendships and interviews with the German fighter pilots of World War II. He was also friends with, and had interviewed, many American and a few British pilots.

It was due to Ray Toliver’s guidance that I learned about these men, and I studied the best ways to speak with them as I began making contacts. I took his advice to heart: “Read everything you can and know as much about your subject as possible before you meet them. Write down the first twenty primary questions you want answered, and the rest will fall into place as the interview goes along.”

As I went along, I began to develop my own methods of weaving my questions into the history when speaking to these men. From my first contacts at symposiums and gatherings with men such as James H. Doolittle, James Gavin, Matthew Ridgway and Omar N. Bradley (just to name a few), I read all I could. I took copious notes and cross-referenced data.

As I began my first serious sit-down interviews in Germany, I learned something more of the process. Very seldom do subjects give their story in perfect chronological order, and they seldom give you everything that they experienced. There is always something they later remember. Hence, many of these interviews lasted from a few hours to several years. For example, I first interviewed Erich Rudorffer (224 kills) in 1984, and the last time we clarified his information was in 2009.

Regardless of nationality or language, I almost always found these subjects willing to discuss their wartime service, though of course some were more open to discussion than others. I also learned that, even when the research was done well before the actual encounter, I would have to tailor the interview to the individual. Many subjects, before they came to know me, wanted to see the list of basic questions in advance. Others (especially the SS) wanted to know why I wanted the interview, and what my motivations were. They did not want to speak to a young man with a politically motivated approach to their stories.

Another thing I learned was that, as I transcribed the interviews and sent them back to the subjects to check for accuracy, I developed a trust with them. They saw that I had no hidden agenda. That led them to tell me about others I should interview, who might have an interest in telling their stories. Most of these names I had never heard of, so I researched them before I made contact.

For anyone wanting to embark upon a career as a historian specializing in interviews, World War II is pretty much closed out, the same is true of Korea, and Vietnam veterans will not be around much longer. However, regardless of which era of conflict you wish to focus upon, I would say that you must maintain perspective. Accessibility and availability are key to a successful interview, and with modern technology the in-person method is not always required. Still, it was always good to have a few drinks with the older guys once I was made welcome inside their inner circle.

The American and British sources were far more easily approached when compared to most of the Germans, and for obvious reasons. After many years (when I had the time) of writing and responding to letters, making telephone calls and then meeting in person when I could afford it, their stories began adding up to a great collection.

The greatest difference between former Allied veterans and the Germans was that many of the Americans and British were rather well known in their own countries. The Germans were virtually unheard of, as if the German people wanted to simply forget the Third Reich and the men who fought for it.

What I learned, and tried to pass on to my readers and, when I was a professor, to my students, was that if you research history, let alone war, and access only one side of the story, you have an incomplete research project. As a result, I tried to be open-minded and give others their voice. The Germans fought a longer and harder war than most, and most of these men had no love for their leadership.

However, as U-boat commander Reinhard Hardegen said: “Few of us were in a position to tell Hitler, Goering or anyone else that they had made great mistakes, and that we were paying for them. In a dictatorship like ours, there was only one voice. All others were silenced.”

The men interviewed for this book (my first collection of Americans) were very welcoming. They were eager to speak with a historian of a younger generation. As Robert Johnson said, “I travel to schools talking to the kids about World War II, why we fought it, and how important it was that we win it. I will not be around much longer. It’s good that guys like you can carry it on and educate the next generations. They should never forget that history.”

The American veterans were all rightly proud of their service, of our nation’s place in the history of the war, and of the fact that I wanted to join the pantheon of people who had already interviewed them over several decades. Few of them tired of receiving historians, all of us eager to collect their memories. When I started the process, it was a hobby. Only later did I realize it would be a career.

Later on I read some magazines such as World War II, Military History and Aviation History. I saw that every now and then there were question-and-answer interviews with veterans, a few well-known, most just average people who served during the war. I also found out they paid for the articles.

I wrote for these magazines with the mentoring of senior editor Jon Guttman and managing editor Carl von Wodtke. Jon was already an established historian and an interview specialist who had gathered the stories of the most obscure airmen from the war, and some quite famous. During the long road to eventual authorship, I made friends with the late pilot and historian Jeffrey L. Ethell, a man I thought the world of. Jeff then gave me more contacts, even in Japan.

I would say that the purpose of a historian, especially in military history, is to secure the information from both sides of a conflict. It is fine to get an interview with a person who fought a war. It becomes a full-circle event when you get the story from his enemy. On that note, I was able to help other historians, and myself, connect various enemy pilots who had fought each other. After examining the interviews, comparing dates, and researching records, you sometimes get lucky.

One example came when Adam Makos was writing his book A Higher Call, about the encounter between Luftwaffe ace Franz Stigler and the crew of a B-17, Ye Olde Pub, and its pilot, Charles Brown. Another came when my research connected the dots for the surviving Allied airmen who were held under sentence of death at Buchenwald concentration camp. They had been saved by a then-unidentified Luftwaffe colonel. That mystery was solved in the last few years when I dusted off a couple of old interviews: I knew the German officer who had saved the 164 surviving airmen. His name was Hannes Trautloft. Such is the satisfaction of seeing your historical research solve long-standing mysteries.

I know that public schools no longer teach history, not as I learned it, and that is a great intellectual tragedy and a disservice to the memories of those who came before us. Another mission I have embarked upon is to dispel the myths and rumors regarding the total sainthood of all of the Allied soldiers and the complete evil of every German or Japanese serviceman. Good and evil exist everywhere. Recognizing the reality behind the postwar propaganda, and assigning blame where warranted, preserves the truth.

https://historynewsnetwork.org/article/180355
My Great Grandfather, Stephen Douglas, and the Seductions of Non-intervention

Also among my belongings is a gold-framed tintype photo, of the kind made by itinerant photographers around the time of the Civil War.  Etsy offered one for sale recently for $18, and in the days before Etsy, when I used to frequent funky antique shops, the tintype I have is just the sort of thing I might have bought as an item of décor.  Only I inherited it. And now I am finally taking in that the man in the photo is my own great-grandfather, Edwin Alexander Banks.   

I recognize him: take away the Confederate uniform, add a white Hemingway-esque beard, and I can see he is a twin image of my dad, Col. Richard Griffin Banks, USA, Retired.  Edwin Banks was married to Eliza Ward Pickett, the woman I think of as “the other Eliza,” memorialized on my silver serving spoon inscribed “Corrine to Eliza Pickett.”  

I knew nothing about Edwin until recently, when I learned from the 1860 census that he claimed his profession as “editor.”  He was only 21 but had already partnered with 45-year-old Col. J.J. Seibels in publishing a Montgomery newspaper called The Confederation.  They took a strong stand on the most pressing issue of the day: whether the South should stay in the Union or secede.

The Confederation’s stance was ultra-Unionist, equating secession with treason.  This did not mean it was anti-slavery. According to the 1860 Montgomery slave census, Edwin and Eliza enslaved two people: a 17-year-old mulatto and a 20-year-old mulatto.  Their names are not listed.  

With the 1860 Presidential election looming, Col. Seibels wrote to Stephen Douglas, the Democratic candidate from Illinois who was running against Abraham Lincoln, urging him to make a campaign swing through the South.  Trailing Lincoln in the Northern states and John Breckinridge, a breakaway pro-slavery Democrat, in the Southern states, Douglas had no hope of winning.

But Seibels argued he could still make a last-ditch effort to promote the cause of the Union in the South.  Douglas agreed and, endorsed by The Confederation, appeared in Montgomery just days before the election.  He held forth on the steps of the Alabama statehouse for four hours, reassuring his listeners that slaveholders had nothing to fear from the Federal government.  It is not a stretch to imagine that in the audience that day was John Wilkes Booth, in town to play Richard III, his first leading part as a Shakespearean actor. (For more on this subject, see “Maybe the White Abolitionist Should Have Listened to the Black Abolitionist” and “How to Change History.”)

So close to the election, Stephen Douglas’s speech got little attention outside Montgomery, although The Confederation published a transcript of it.   I summarize it below because I believe it persuasively illustrates that the Civil War was fought not over self-determination; not over states’ rights; not over heritage. It was fought over slavery.  Others can parse whether the moral or the economic aspects of slavery were paramount; I will leave this as an unqualified declarative sentence: The Civil War was about slavery.   So successful have the Lost Cause apologists been at clouding this truth that it is actively contested even now.

Douglas hoped to find a compromise between North and South, and it was brave of him to venture into the heart of secession country. But throughout the speech, he pays homage to states’ rights and self-determination as a way to reassure his audience that in the United States they will be able to keep their slaves, no matter what. 

 

 

Let me sum up his main points:

1. Yes, of course you can keep your slaves. 

“ . . . your title to your slave property is expressly recognized by the Federal Constitution as existing under your own laws, where no power on earth but yourselves can interfere with.” 

And

“The true doctrine of the Constitution, the great fundamental principle of free government … is that every people on earth shall be allowed to make their own laws, establish their own institutions, manage their own affairs, take care of their own negroes and mind their own business.”

2.  If your slaves run away and are caught – no matter where – of course they will be returned to you:  

“ . . . the Fugitive Slave Act of 1793 . . . declares that any person held to service or labor in either of the States of this Union, or in any organized Territory, under the laws thereof, escaping, shall be delivered up.”

3. No need to treat your slaves humanely.  Just because the workday in some Eastern factories may now be limited to 10 hours, there is no reason for that to prevail in the slave states.  

“ . . .Get up a protective law for your property and what is your property worth?  Whenever you permit Congress to touch your slave property you have lost its value.” 

4.  Don’t worry about the Territories becoming free states.

“As the law now stands . . . slaves are to be held in the Territories the same as in the States . . .under the laws thereof, beyond the reach of Congress to interfere.”

5.  Abolitionists are terrible people and they are only making things worse for slaves by pushing slaveholders toward greater cruelty.

“ . . . have they not forced the master to draw the cord tighter, and to observe a degree of rigor in the treatment of their slaves which their own feelings would like to ameliorate, if the Abolitionists would permit them to live in safety, under a milder rule?” 

Douglas’s pandering in Montgomery was to no avail.  Lincoln won the election and within months, the Deep South states started seceding from the Union just as Douglas had feared.  

But that wasn’t the last of his Montgomery statehouse address.  It had a rebirth in November 1939, some 80 years after it first ran in my great-grandfather’s newspaper, when the Journal of Southern History found occasion to republish it.  The text was accompanied by an introduction that included a rather startling observation by authors David R. Barbee and Milledge L. Bonham:  “Today the reader may find the speech very convincing . . .”  

Why would Douglas’s 1860 speech be germane to readers in 1939?  And why very convincing?   Because, as Stephen Douglas had made clear on that distant November afternoon, he believed strongly in the idea of non-interference between sovereign states.  Such views were a comfortable fit with American isolationist policy in the year when Nazi Germany invaded Poland, and Britain and France declared war on Germany.  Douglas’s words provided great cover:  If bad things are happening elsewhere and it isn’t your business, look away.

There is a postscript to my great grandfather’s story.  Edwin Banks was a Unionist but when the Civil War began, he joined the Confederate Army.  He served in various postings around the South, finally being sent to New Orleans.  There he remained until 1867, and there he died of yellow fever at age 29.  

The War had been over for two years by then, but in a sense it killed Edwin Banks.  During the War, Union forces occupying New Orleans had instituted strict sanitary regulations that had kept yellow fever at bay.  From 1860 to 1865 there was a total of 20 yellow fever deaths in New Orleans. But in 1866 all-white governments were in power in the South, and local health authorities had regained control of the Mississippi riverfront and relaxed these precautions.  

The following year brought an epidemic in which yellow fever killed more than 3,000 residents.  As a local physician remarked, “We … have occasion to mingle some thanks among the many curses” that New Orleanians had heaped on the Union occupation.*   

* See Yellow Fever, Race, and Ecology in Nineteenth-Century New Orleans, by Urmi Engineer Willoughby

 

Read more about Ann's Confederates In My Closet on her website. 

 

https://historynewsnetwork.org/blog/154504
The Settler Colonialist Frame Helps Clarify What's at Stake in the Middle East for Israelis, Palestinians, and Peace

Demonstrators protest Palestinian Evictions from Sheikh Jarrah, East Jerusalem, 2010.

Photo Amir Bitan, CC BY-SA 3.0

 

 

 

Colonialism has done immense damage in the world, and its damage is ongoing. Look around the world and consider the problems we face. In almost every case, you will see the hand of colonialism causing the problem or making it worse.

Today, we frequently hear progressive voices argue that “settler colonialism” is a helpful frame for understanding the nature of the deadly and intractable conflict in Israel and Palestine. My sense is that few people understand colonialism, let alone settler colonialism. I write here to provide some context.

For centuries, nations have crossed borders and claimed the lands of others as their own. With notable exceptions, most of this colonial activity started before the 20th century, but much of Africa, the Middle East, and Asia remained colonized until the mid to late 20th century. In our time, newer forms of international domination have emerged that are more efficient and less cumbersome than taking lands and the people on them.

Historians, and other scholars who have studied colonialism, have distinguished between multiple forms of colonialism.  All types of colonialism have in common a scheme for the colonizing nation to prosper at the expense of the colonized.  Another common characteristic is that colonizing nations rarely, if ever, consider the perspectives of indigenous or long-established populations in making their decisions.  Almost always, local people are understood by the colonizer as inferior and dangerous, and subject to economic and sexual exploitation, religious conversion, and cultural destruction.  All too frequently, violent suppression, sometimes leading to ethnic cleansing and genocide, results from colonialism.

It is common to see the words “exploitation colonialism” and “settler colonialism” as shorthand for explaining the different kinds of colonialism. The former is more common. It seeks to exploit a resource or resources from the colonized nation (commodities, a port, a canal, land, labor, etc.). In this case, few from the colonizing nation move to the colony and those that do are there primarily to carry out military or administrative functions. Frequently, privileged elements of the local population carry out much of the work of sustaining the colony. There are many classic cases of exploitation colonialism, from India to much of Africa and Asia.

Settler colonialism is less common, but its long-term impacts on the nations that experience it are profoundly felt in today’s world. Settler colonialism seeks to take the land from the original inhabitants and remove them from the area of settlement. This has almost always led to ethnic cleansing and wars of genocide. The classic cases of settler colonialism are the United States, Canada, New Zealand, Australia, South Africa, and most recently, Israel.

The Jewish people do not claim to be the original inhabitants of what is now Israel, but stake their claim on biblical accounts and the desperate need for a homeland following multiple expulsions, centuries of discrimination, decades of pogroms and anti-Semitic violence in Europe, and eventually, the Holocaust.

My own ancestors migrated to the United States and Canada in the early 20th century because Eastern Europe offered neither safety nor opportunity to them as Jews. For my grandparents and parents, the creation of the State of Israel represented a critical refuge, but just as importantly, they connected it to a significant diminishment in the anti-Semitism they experienced in their daily lives. 

The development of modern Zionism, the idea of creating a Jewish homeland, has its origins in nineteenth century Europe as a response to persistent discrimination that resisted even sincere attempts at assimilation by Jewish people in Western Europe. In Eastern Europe, where many Jews lived, legal barriers to equality existed well into the 20th century and anti-Semitic violence was a constant threat.

As a result of this discrimination many Jews moved, some from Eastern Europe to Western Europe, many more to the Americas, with the US receiving the vast majority, and a small number to Palestine.[i]

It is worth noting that anti-Semitism has not gone away. Indeed, we are witnessing a resurgence in violent anti-Semitism in the US and around the world. And the recent war between Palestinians and Israel has led to a significant increase in anti-Semitic attacks in the US.

For some four hundred years before World War I, Palestine was part of the Ottoman Empire, whose breakup after the war led to a period of British rule between 1917 and 1948. At that time, the British maintained support for a Jewish homeland in Palestine. When Israel was created in 1948, with United Nations approval, there was no Arab support, and the Palestinians never consented to the creation of Israel.

British and UN support for a Jewish State in Palestine did not seriously consider the interests or desires of Palestinian Muslims or Christians and set a foundation for a colonial conflict over the creation of the new nation. Unless this is acknowledged, the current conflict cannot be understood or resolved.

A war for independence was fought between 1948 and 1949, and hundreds of thousands of Palestinians fled in the context of war and then were denied reentry. Within a year Israel passed a series of laws claiming the property of those who left and seizing the lands and buildings of some 400-600 Palestinian villages that existed prior to the creation of Israel.

In 1967, after another war, the Israelis expanded their borders to occupy the Sinai and Gaza, the West Bank and East Jerusalem, and the Golan Heights. At first, Israeli settlements on the West Bank and in East Jerusalem were limited. Settlements were never welcomed by Palestinians, in part because they came at the expense of land and sovereignty. But since 1980, the settlements have expanded rapidly and have left the West Bank ungovernable and not viable as part of a Palestinian state. In Gaza, the Israelis abandoned their settlements in 2005, and recreated them in the West Bank. Gaza has been under blockade ever since and is now an isolated, densely settled Palestinian enclave with neither control of its own borders nor access to political rights in any nation.

Since 1948, Israel has been practicing settler colonialism. Granted lands without the consent of those living on them, Israel has been a militarized nation from the start. It has expanded its borders through war and conflict since its founding, and many Israelis have demonized the people who lived on the lands they took.

No people surrender lands without a struggle. In North America, armed conflict over the lands that would become the United States lasted from 1610 until the 1920s. 

The decision to occupy the West Bank and blockade Gaza is an act of provocation. The expulsion of Palestinians from lands and homes on the West Bank and in East Jerusalem will always be met with resistance, sometimes violent. Israelis know this.

Over and over we hear that Israel has the right to defend itself. But the reality is that violent conflict is intrinsic to the settler-colonial enterprise. Settler societies will thus feel compelled to defend themselves against the resistance they create by forcibly taking land and removing people from their homes.  

The track record of settler societies has not been good regarding justice for those who came before the settlers. Israel is going down a dangerous road with increasing speed. The chance to pull back, to establish two viable and thriving democratic states, is dwindling.

I urge all Americans to pressure our government to insist on a halt to Palestinian expulsions from any lands in East Jerusalem and the West Bank and the implementation of a plan to repatriate the lands of the occupied territories leading to the development of an independent state of Palestine.

 

[i] Simona Sharoni and Mohammed Abu-Nimer, “The Israeli-Palestinian Conflict,” in Jillian Schwedler, ed., Understanding the Contemporary Middle East, 4th ed. (Boulder, CO: Lynne Rienner Publishers, 2013).

 

https://historynewsnetwork.org/article/180391
The Post-Trump GOP: Rebirth or Stillbirth?

US Reps. Matt Gaetz (FL) and Marjorie Taylor Greene (GA) hope that Trumpism without Trump will make the Republican coalition a winner.

 

 

 

The Republican Party is clearly going through some sort of transformation. Despite losing the presidency, the House, and the Senate in 2020, the party is not having its expected self-analysis and autopsy. Rather than take a cold, hard look at what went wrong, the Party is circling the wagons, and betting it all on Donald Trump as its future. Post-election, Trump quickly put a stop to all questioning and solidified his place as titular head of the party. And any and all doubters had to be banished or destroyed. The Republican Party is now the Trump Party.

Liz Cheney found out just how firm a hold Trump has on the party as, despite a 93% pro-Trump voting record, she was stripped of her position in the Party leadership, and now faces threats of a divisive primary and banishment from the party. Lesson learned?  This warning to all who defy Trump registered loud and clear: unless you publicly adhere to the Trump line (the 2020 election was stolen, January 6 was not an insurrection, up is down), you run the risk of being run out of the party. Take on Trump at your peril.

To outsiders, this total control exercised by Trump is more than a mystery; it defies logic. How, one might ask, can so many be so immune to so much evidence? Donald Trump’s takeover of the party is a stunning political feat; others have tried before him, but very few have succeeded. Give Trump credit.  

Should the Trump takeover continue, what impact might it have on the political fortunes of America’s party system?  Are we on the verge of a political realignment?

THE POLITICS OF REALIGNMENT

Traditionally in the United States, party realignments occur every forty years or so. The last major realignment occurred after the Great Depression of 1929. Realignments need a predicate, and the Depression so shook up the world of politics that the glue that held the old party system together lost its adhesive quality and opened the door to a new party system.  In the presidential election of 1932 and then the midterm election of 1934, Franklin D. Roosevelt and the Democrats drew in many new voters and those disaffected by the Republicans to form the New Deal coalition that came to dominate politics for roughly thirty years. It was a big tent, with conservative segregationist Southerners aligning with northern liberals to form a governing party. Republicans became the party of less: less government, less welfare, less of most things (except defense).

The FDR coalition began to break up in the 1960s. It had grown old and tired and when Lyndon Johnson pushed for massive civil rights changes, the Democratic Party split, with Southern segregationist whites abandoning the Democrats for the Republican Party. LBJ knew this would be the result of passage of progressive civil rights legislation, but felt it was the right thing to do and knowingly ceded the South to the Republicans. From that point on, the two parties were more ideologically cohesive, but also began to split further and further apart, with the Democrats moving to the left, and the Republicans moving far to the right. Richard Nixon’s “Southern Strategy” was an effort to draw southerners into the Republican Party, and had it not been for Watergate, might have accelerated the pace of the shift over to the Republicans.

THE REPUBLICANS MAKE A BID FOR REALIGNMENT

Arizona Republican Senator Barry Goldwater railed against the old ideology of the party as a “less than” party and wanted the Party to truly stand up for conservative values. In the short term, this backfired as Goldwater lost the 1964 presidential election to Johnson in a landslide. But Goldwater’s views began to permeate the party. By 1980, the Republicans had a more attractive face for their conservatism: Ronald Reagan. Reagan blasted government programs (“Government is not the solution to our problems, government is the problem”) and the old FDR coalition, by then running on fumes, began to collapse.

But Reagan did not realign the parties. He proved more popular personally than were his ideas. The cult of Reagan would dominate the party for three decades, but not lead to a partisan realignment. Enter Newt Gingrich, whose “fight club” mentality made politics a blood sport. Democrats were not to be defeated at the ballot box; they were to be vanquished as wartime enemies. Gingrich’s politics of personal destruction took the party of Reagan, and gave it a brutal, vicious edge.

As a result of the Democrats’ failure to reinvigorate their base, and the Republicans’ narrowing of message and efforts at political warfare, the United States seemed headed less for a political realignment and more toward a de-alignment. A pox on both your houses! Elections were won by a Democrat, then a Republican, then a Democrat, then again a Republican. There was no governing party, nor was there a governing ideology.

THE UNLIKELY RISE OF DONALD TRUMP

Enter Donald J. Trump.  His election was as surprising as was his ability to take over the Republican Party. In fact, the party just seemed to lie down and let Trump take over. It put up little resistance and, despite Trump’s manifest weaknesses as a person and as a leader, his base became almost fanatically loyal to him. Soon it became the Party of Trump, a cult of personality in which Trump the person was worshiped, while Trump’s ideas could hop from one position to another. The Republicans ceased to be a party of ideas and became a party of personality.

Is Trump the stuff of which party realignments are made? The Republican Party of 2021 is banking on it. Is there method to this madness? (Spoiler Alert: NO).  Is Trump the Party’s Savior or Destroyer? Is loyalty to one person sustainable and enough to build a new majority or a new Republican Party?

Historically, events (the Depression, for example) or ideas (an end to slavery) have animated the birth of new political realignments. What we are witnessing in the rise of Donald Trump as leader of the Republican Party is merely a cult of personality. When he goes, it goes. What will be left behind in the ashes of Donald Trump’s departure from the political scene? Is he developing a cadre of young leaders who were inspired by his rhetoric and accomplishments, as was the case with John F. Kennedy and Ronald W. Reagan?  Did he leave behind inspiring rhetorical flourishes such as “Ask not what your country can do for you, ask what you can do for your country” or memorable calls to action such as “Mr. Gorbachev, tear down this wall!”? Did he bequeath a governing philosophy (“America First”?) that will live on long after he departs?  Just what is there to cling to, post-Trump?

More likely than a partisan realignment, Donald Trump’s harnessing of anger and resentment will be a virus that remains in the body politic for years to come. It will be hard to rid society of the toxic materials he has spewed into our bloodstream.

Healthy political systems have avenues of rebirth and reinvigoration. Our system is ripe for a realignment and hungry for a governing coalition. It may yet appear. But history suggests that anger and resentment are not the building blocks of a movement that will attract a majority of citizens to march behind its banner. From the Know-Nothings to McCarthyism, from the anti-Chinese laws to the anti-Muslim bans, negative appeals have a short shelf life. As a country, our better angels seem somehow to save us from ourselves. 

https://historynewsnetwork.org/article/180392
Disregard for the Electoral Process is New and Alarming

Congressional Aides with the Electoral College Ballots

 

 

A Reuters/Ipsos poll released in April 2021 indicates that a majority of Republicans feel that the presidential election was stolen from Donald Trump. On January 6, when Congress convened to count and certify the electoral votes, 147 Republican members of Congress voted against certification even after a mob had taken over the U.S. Capitol. This is unprecedented. Never before has a major political party rejected the results of a presidential election. What caused this phenomenon? When and how did forces come together resulting in an attack on democracy by a major political party?

American history is replete with presidential elections that could have been justifiably challenged. Many times the results have been controversial and less than clear-cut. Before the 12th Amendment, each elector would cast two votes. The candidate with the most votes became president and the runner-up vice-president. In the 1800 election, Jefferson and Burr, the Democratic Republicans, tied for first. It was left to the House of Representatives, controlled by the Federalists, to decide whether Jefferson or Burr would be president. They chose Jefferson, who was then accepted by all sides as our third president. Today it would be inconceivable for a Republican Congress to decide which Democrat is elected president. But that happened in 1800, as the Federalists accepted the Electoral College system as prescribed by the Founding Fathers.

In 1824, Andrew Jackson got the most popular votes but nobody won a majority of electoral votes. The House of Representatives then elected John Quincy Adams president with the support of failed candidate Henry Clay. Jacksonians complained of a “corrupt bargain,” but Adams was accepted as president.

The 1860 election of Lincoln is the one time that democracy did not work. But Lincoln was rejected by a regional faction, not by the action of a national party. That is different from 2020, when the legitimacy of an elected president was rejected by the leadership of a major national party.

The 1876 election became extremely complex and controversial. Samuel Tilden, the Democrat, outpolled Republican Rutherford B. Hayes. The vote count in three southern states (Florida, Louisiana and South Carolina) was disputed, with substantial evidence indicating that Democratic forces stole the election from the Republicans in the three disputed states. Hayes supporters in each state sent alternate slates of electors. The ongoing presence of federal troops in the South to enforce Reconstruction heightened the partisan stakes of the election, and no vote counting agreement could be made. With Congress unable to decide, a 15-member election commission was appointed, and awarded Hayes the presidency. The Democrats agreed on the condition that federal troops would be removed from the South. Tilden was disappointed but accepted the results. Ultimately, both parties accepted the election, though at great cost to Black southerners.

In 1888, Democratic President Grover Cleveland received the most popular votes, but Republican Benjamin Harrison became president by winning a majority of electoral votes. No controversy occurred.

In 1960, Americans witnessed the closest presidential election of the 20th century as John Kennedy narrowly won over Richard Nixon in the popular and electoral votes, but with evidence of fraud in Illinois and Texas that could have changed the election in a recount. Vice President Nixon presided over the Congressional counting and certification of the electoral votes. Nixon said that it was his “honor” to declare John Kennedy the new president.

In 2000, the close election came down to the state of Florida in the contest between Bush and Gore. The Florida Supreme Court authorized a hand recount of undervotes, which were ballots showing no presidential preference. Gore’s lawyers contended that machines reading punch card ballots missed presidential votes. By a margin of one vote, the U.S. Supreme Court stopped the recount, awarding Bush Florida by 537 votes and the presidency. Gore conceded the election on the night of the Supreme Court ruling and a majority of Democrats accepted the results.

In 2016, Hillary Clinton won nearly three million more popular votes than Donald Trump, but Trump won the electoral vote majority. She conceded on election night. A majority of Democrats accepted the results with no credible attempt at a recount.

This history makes it clear that America has had controversial presidential election results that could have been contested but were largely accepted by both major parties and the losing candidate until 2020. How did this break in tradition happen? In the 1990s a more virulent partisanship arose. When Bill Clinton was elected, Senate Republican leader Bob Dole stated that Clinton, elected with only 43 percent of the vote, should have only limited power. Republican members of Congress began implying and even stating that Clinton was “not my president.” After the Republicans took over both Houses of Congress in the 1994 elections, Clinton became the “inconsequential” president to them. This occurred as a media revolution was happening.

Talk radio in the 1990s gained unprecedented power. A survey of Congressional staffers in 1994 indicated 46 percent credited Rush Limbaugh as their greatest influence. About 25 million Americans a day were tuning in to hear Limbaugh de-legitimize Clinton. From abortion to policies toward gays to healthcare issues, this man is un-American, Limbaugh declared. To complete the media revolution, Fox News, with no pretense of objectivity, became the highest-rated cable news network.

For eight years of George W. Bush’s administration, virulent rhetoric about the presidency ceased. But then came Barack Obama and more de-legitimizing. On Obama’s inauguration day, Republican lawmakers and strategists held a meeting to discuss ways to completely stop the Obama Administration. When Obama proposed healthcare reform, demonstrators carried signs with a racist caricature of the president to the U.S. Capitol. Republican Rep. Joe Wilson of South Carolina yelled, “You lie!” to the President as he addressed a joint session of Congress regarding his healthcare proposal. Wilson later apologized, but Limbaugh said that there was nothing to apologize for. As Obama campaigned for president, a birther/conspiracy movement arose claiming that he was not a natural-born American citizen.

With no proof, Donald Trump gained national attention as a birther in 2011. He went on to win the presidency in 2016, captivating those who had become virulent partisans. His base became the culture warriors, the talk radio and cable news acolytes, and people tinged with white racism. These people had seen disrespect for the presidency become commonplace and acceptable since the early 1990s. When Trump created a cult of personality, they were not only ready to join it but followed Trump in the destruction of democracy. They still support the ex-president, who is under criminal investigation.

In January 1965, after President Lyndon Johnson gave his State of the Union address outlining his Great Society, Republican House leader Gerald Ford said that his party also envisioned that kind of America, but disagreed as to the way it should be achieved. That was the respectful partisanship of its day toward a president. Due to a process of disrespect begun in the 1990s, that kind of partisanship is gone.

https://historynewsnetwork.org/article/180393
The Roundup Top Ten for May 28, 2021

Will We Ever Get Beyond "The Fire Next Time"?

by Elizabeth Hinton

"What we witnessed in 2020 was the latest manifestation of an ongoing crisis that could have been solved if elected officials had properly understood the root causes the first time around."

 

The Housing Market is Booming but Remains Deeply Unequal

by LaDale Winling

The standards and practices of real estate appraisal were developed in the context of white supremacy in the 1920s and since then have worked to make home ownership a path toward building wealth that has favored white Americans. 

 

 

I Am A Veteran History Teacher. Let Me Teach History

by Valencia Ann Abbott

Culture wars and legislative battles over things like the 1619 Project and "critical race theory" have made classrooms "a minefield of political dos and don'ts," where facts and truths are treated as dangerous, says a veteran secondary school history instructor. 

 

 

We are Told America is Living Through a ‘Racial Reckoning’. Is it Really?

by Simon Balto

Policing historian Simon Balto argues that White America today seems most interested in defending a personal sense of innocence over racism rather than working for justice. The new culture war over "critical race theory" is evidence.

 

 

The Feminist Past History Can't Give Us

by Paula Findlen

"What it really meant to be a woman of science three centuries ago is not so easily conscripted into contemporary narratives of feminist liberation."

 

 

The History the Japanese Government Is Trying to Erase

by Chelsea Szendi Schieder

An academic involved in the recent "comfort women" controversy while teaching in Japan warns "In failing to teach what the wartime state did, the Japanese government only emboldens the forces of misogyny and racism and cultivates new generations of violence."

 

 

The Lies Cops Tell and the Lies We Tell About Cops

by Stuart Schrader

"The core of policing is not safety. It is social control. All the other lies obfuscate this function."

 

 

Billie Jean King, Foremother

by Robert Lipsyte

Billie Jean King's legacy runs far beyond tennis, and has become even more relevant with the passage of time as challenges to the doctrine of amateurism and the NCAA make clear. 

 

 

Why We Should Abolish the Campus Police

by Davarian L. Baldwin

University police forces are a major factor in strained relationships between colleges and their communities. Abolish them, says the author. 

 

 

What Scaremongering About Inflation Gets Wrong

by Rebecca L. Spang

Inflation has become a subject of political dread as Americans have shifted from seeing themselves as producers to seeing themselves as consumers. But historical perspective shows that policy picks winners and losers and is dependent on choices about what to measure and how. 

 

https://historynewsnetwork.org/article/180385