History News Network - Front Page
Sat, 27 Nov 2021 21:22:14 +0000
https://hnn.us/site/feed

Despite What You Hear, Most Americans Look Favorably on Internationalism



Amid all the flag-waving, chants of “USA, USA,” and other nationalist hoopla that characterize mainstream politics in the United States, it’s easy to miss the fact that most Americans favor global governance.  Although a great many Americans do feel a sense of identification with the U.S. government, a majority also supports the exercise of transnational authority.

This approval of global governance is especially striking in the case of the United Nations.  A February 2020 Gallup poll reported that 64 percent of U.S. respondents wanted the UN to play a leading or a major role in world affairs.  Similarly, a Pew Research Center poll that summer found that 62 percent of Americans had a positive view of the world organization, compared to 31 percent with a negative one.  Respondents gave the UN particularly high ratings for promoting peace (72 percent) and promoting human rights (70 percent), while also according it positive ratings for promoting economic development, taking action on climate change and infectious diseases, and caring about the needs of ordinary people.

The strong support for the United Nations has continued during 2021.  A survey done in early September by Public Opinion Strategies and Hart Research Associates found that 84 percent of U.S. respondents believed it important for the United States to maintain an active role in the UN, that 69 percent viewed the UN as a relevant organization needed in the world today, and that 63 percent favored resuming payment of U.S. dues to the UN (which the Trump administration had halted).  Although the favorable rating of the UN dropped somewhat (to 56 percent) from the poll’s finding the preceding year, the unfavorable rating also dropped (to 26 percent), leaving the global body with an approval ratio among Americans of more than two to one.

Furthermore, despite an almost complete absence of recent polls on strengthening global governance, there are indications that most Americans like the idea.  In late June and early July 2020, a survey conducted by the Swedish company Novus for the Global Challenges Foundation reported strong support among Americans for enhancing international solidarity.  Asked if a “global supranational organization should be created to make binding global decisions on how to manage global risks,” 54 percent of U.S. respondents supported the idea, while only 27 percent opposed it.

Not surprisingly, Americans have formed organizations that, working with comparable groups in other countries, seek to move beyond the traditional nation state system by fostering support for global governance.

The United Nations Association of the USA (UNA-USA) comprises over 20,000 members (60 percent of whom are under the age of 26) and more than 200 chapters across the country.  According to UNA-USA, it is “committed to strengthening the United Nations system, promoting constructive United States leadership in that system, and achieving the goals set forth in the UN Charter.”

The UN, however, is merely a confederation of nations, and some U.S. organizations, though smaller than UNA-USA, champion the establishment of a more unified entity: a world federation.  The largest of these is Citizens for Global Solutions (previously the World Federalist Association).  In its own words, it works to “educate and advocate for a democratic federation of nations with enforceable world law to abolish war and global violence in the resolution of disputes, protect universal human rights and freedoms, and restore and sustain our global environment.”

In fact, Americans have exhibited considerable support for some form of global governance ever since 1945, when World War II and the use of nuclear weapons shook up traditional thinking about international relations.  But recent global disasters have given it a new urgency.  Among these disasters are the onrushing climate catastrophe, the Covid-19 disease pandemic, growing economic inequality, and the massive flight of refugees from their homelands. 

Admittedly, these global crises have sometimes led to xenophobia, intolerance, and other forms of nationalist backlash.  Indeed, it’s hard to imagine the popularity of Donald Trump and his election to the presidency without them. 

But worldwide crises have also exposed the limitations of the nation state system, in which nearly 200 nations jealously guard their own “national interest,” and suggest the need for enhanced global governance.

For this reason, most Americans favor supplementing their U.S. citizenship with world citizenship.

Sat, 27 Nov 2021 21:22:14 +0000 https://historynewsnetwork.org/article/181838
For 2022, the Democrats Don't Have an Alternative to Embracing Left Populist Energy



With the help of business-oriented Republicans, Democrats have passed an infrastructure bill.  If that does not sound exciting to you, that is because nothing is more boring than infrastructure bills.  Little wonder the Democrats have an enthusiasm gap. They fear that they will lose control of Congress in 2022.  They probably will.  In recent history, incumbent parties have lost in midterm elections.  One need only think of Donald Trump in 2018, Barack Obama in 2010, or Bill Clinton in 1994.  To avoid such a fate, progressives need to do something different. They need to organize. But how?

The Populist movement offers one answer.  Lawrence Goodwyn’s classic, The Populist Moment, emphasized how Populists in the 1890s created a “movement culture” to earn a chance for fundamental change.  Rural people, especially before cars and telephones became widespread, were isolated from their neighbors. The Populists responded with picnics, campouts, and speakers who helped explain their plight and its remedies.

By creating a movement culture, they found a way to link rural communities to then-radical political goals such as adopting greenback currency, controlling the high shipping rates set by railroads, and building farmer-owned agricultural cooperatives. The Populists lost the key election of 1896, in part because of an ill-considered fusion with the Democrats. With Democrat William Jennings Bryan at the top of the Populist ticket, they lost sight of their origins as a radical movement.

Today’s Republicans understand the importance of building movements.  For one thing, they have issues to galvanize their core supporters. They have retained their sense of grievance over the supposed chicanery of the 2020 election.  They remain preoccupied with seemingly disparate issues:  mask and vaccination mandates, the 1619 Project, critical race theory.  Republicans have effectively linked this basket of fears together under one umbrella: defending the family against an intrusive government.

And now they are building a movement. Republicans are energized with the kind of activism that one might associate with the Tea Party.  They have disrupted school board meetings and treat Trump rallies like free rock concerts.  The Republicans aren’t waiting for the midterm elections—their base is active now.

For progressives, the Populist moment should continue to inspire. Progressive politics re-emerged during the 1930s and, like the Populist movement, arose almost spontaneously.  Its mood was embodied most notably by the sit-down strikes, beginning in Flint and Akron, as unions spread across the country.  The impact of workers on the ground was buttressed by the Wagner Act, which recognized the right to organize unions.

In the 1930s, Communist Party activists had an extraordinary amount of enthusiasm.  Their energy allowed them to have an impact beyond their numbers. They led sit-down strikes, fought evictions, and denounced racism.

In the 1960s, the Democrats successfully tied themselves to the Civil Rights Movement. Its vibrant activist culture riveted African Americans and trade unionists across the North.  The movement pushed Johnson to sign the Voting Rights Act, the Civil Rights Act, and legislation creating Medicare.

Like the Civil Rights Movement, the movement to end the Vietnam War was led by ordinary people.  The phenomenon was new enough that the media used a new term: “the grass roots.” Can progressives build the grass roots again?

In some ways, they already have. Over the last decade, we have seen the growth of Occupy Wall Street, which protested corporate control of the economy and gave the rich a new name: the 1%.  MeToo and Black Lives Matter have changed popular discussion of gender and race, issues crucial to solidarity among workers.

Lest we forget, in 2016, the Democrats nearly nominated a democratic socialist, Bernie Sanders.  He was backed by small donors and fueled by the energy of volunteers.  Highlighting his connection to movements, he channeled the ideas of Occupy Wall Street, which were pitch-perfect for reaching young voters.  Even more effective was Alexandria Ocasio-Cortez, our first truly social-media-savvy politician. All of this ferment has been led by the grassroots, this time mobilized both in person and on social media. Now, workers are striking against their employers in numbers unmatched since the 1940s.

Still, Trump Republicans have built a seemingly formidable movement culture. They meet online, and attend rallies religiously. Their biggest strength is Donald Trump; he is the glue holding them together.  But he’s not super-glue.  Instead, he resembles Elmer’s school glue. He can stick glitter onto construction paper, but it might only stick for a couple of hours.

Notably, Trump is a one-term president.  He ran as an incumbent, and incumbent presidents have advantages: framing debates, cutting taxes, signing executive orders, and cultivating donors. Those advantages help most of them win a second term. Barack Obama did that. So did George W. Bush and Bill Clinton.

Trump tried to win reelection, and in spite of his incumbent status, he failed. If Trump could pull off winning another term, he would be the first to win non-consecutive terms since Grover Cleveland. MAGA could actually mean Make America Grover Again.  It does not sound that inspiring.

Trump dominates the opposition party, but he is still playing with a weak hand.  Democrats cannot take things for granted.  They need the kind of energy that comes from movements, from people working together. It means taking risks.   After all, volunteers animated by climate change or the student loan crisis will not always follow a poll-tested script. But they could do something better:  surprise the pundits and win the midterms.

Sat, 27 Nov 2021 21:22:14 +0000 https://historynewsnetwork.org/article/181837
Adulation for Today's Space Race is Misplaced. So is Nostalgia for the First One



“Dude space is freaking awesome!” Jeff Bezos (played by Owen Wilson) drawled in a recent Saturday Night Live skit. The comedy sketch, which parodied Bezos’ July trip to space, also featured comical portrayals of Virgin Group founder and fellow space traveler Richard Branson and Tesla CEO and commercial space flight developer Elon Musk. SNL dismissed these three men’s celestial competition as a “midlife crisis of cosmic proportions.”


And the show is not alone. The billionaire space race has been met with a storm of criticism. But some maintain that there are benefits to making space an arena of competition for the mega-rich. After all, the 1960s space race spurred the economy, inspired STEM education, and is remembered as one of the United States’ crowning achievements. But this gets the history wrong. During the Cold War space race, the United States government ignored the wishes of the American people, as well as domestic racial and social issues, in its quest for political supremacy.


At the time, the now-lauded Cold War space race raised the same question confronting modern-day billionaires: who asked for this and what are the benefits? Then, and now, the lunar race was about the triumph of capitalism.


The space race began in the midst of the Cold War as a proxy conflict between the United States and the Soviet Union, as both nations sought to prove the superiority of their competing systems. The two world powers saw their lunar competition as indicative of their earthly status. Lyndon Johnson, Vice President and Chairman of the National Space Council, said in 1962 that the United States’ very “survival as a free and first-rate nation” was linked to its “leadership in space.” Also in 1962, the New York Herald described the race’s prize, saying:


“At stake is the world’s recognition as to which nation - the Soviet Communist dictatorship or the democratic free society of the United States - has the vision, the resolution, the scientific and engineering skill, and the stamina to do it first.” The two nations were not just fighting to prove who could build the better spaceship, they were fighting to validate their respective economic systems.


The problem? The American public never bought into the idea. Polls taken throughout the 1960s show that the majority of Americans did not believe that Project Apollo was worth the cost. In 1965, just 39% of Americans thought that the United States should spend whatever it took to get to the moon, and Americans consistently ranked spaceflight as one of the main programs to be cut in the federal budget.


Why the unpopularity? First of all, space is dangerous. A 1963 newspaper article declared the “prestige of being first not worth casualties.” Second, the money could be put to better use elsewhere. Congressman Chet Holifield (D-CA) spoke for many Americans when he argued that the money spent by NASA would be better used on schools, housing, or hospitals. It seemed absurd to spend so much in space when there were so many problems at home.


Gil Scott-Heron agreed. His 1970 poem “Whitey on the Moon” illustrated the dichotomy between space adventures and the reality of American urban life. In the poem, Scott-Heron highlights the enormous quantity of resources poured into the space program by repeating the poem’s title in between lines that recount a growing litany of woes, notably the struggle to pay rent and medical bills for his sister who has been bitten by a rat in their unhygienic urban living space. Indeed, Scott-Heron even makes the connection explicit: rent increased “’cause Whitey’s on the moon.” The government’s big spending on space missions directly hurt the forgotten taxpayers who bore the brunt.


Scott-Heron specifies the astronauts’ race for a reason. The money spent on the space program seemed especially misdirected to black Americans, who continued to suffer economic discrimination. Martin Luther King explained the disconnect when he spoke to the Southern Christian Leadership Conference (SCLC) in 1967, pointing out that as NASA launched rockets into space, “the Negro still lives in the basement of the Great Society.” He did not deny “the value of scientific endeavor,” but simply acknowledged the “striking absurdity in committing billions to reach the moon where no people live, while only a fraction of that amount is appropriated to service the densely populated slums.” People of color continued to question the space program beyond the initial moon landing. In 1971, two hundred black protesters marched to the Kennedy Space Center to decry the launch of Apollo 14. Like King, SCLC leader Hosea Williams explained that the march was “not protesting America’s achievements in outer space,” but rather “protesting our country’s inability to choose humane priorities.” NASA had spent billions looking upwards when there was much to be done on earth.


So why did the United States government forge ahead? Pride, power, and money. Today, American capitalism continues to fuel the billionaire space race, even as earthly conditions remain tense. Jeff Bezos enjoyed what news outlets have called a “joyride” to space just over a year after Amazon workers walked out to protest dangerous working conditions during the COVID-19 pandemic. Complaints against space exploration come from environmental advocates as well. Back in the 1960s, Americans rated air and water pollution as issues of greater concern than making it to the moon. Today, the stakes are higher. With global climate change already topping UNESCO’s list of challenges for the next decade, the emissions produced by space missions are no cheap price. Former U.S. Labor Secretary Robert Reich encapsulated the fears of many when he tweeted about the “record-breaking heatwaves” scorching the earth as billionaires had “their own private space race.”


The billionaires’ space companies are not without technological merit. Sirisha Bandla, a Virgin Galactic employee, conducted NASA-supported experiments during Branson’s flight, and NASA picked Musk’s SpaceX to develop the first commercial human lunar lander. The media attention, however, has focused far more on the individual billionaire personalities. In the 1960s, capitalism was portrayed as necessary for all. Today, it is clear who it has really helped.


The Cold War space race was not a uniformly praised scientific achievement. It was a flexing of political muscles in the name of capitalism while millions of Americans – notably those of color – struggled to meet their basic needs. The billionaire space race follows in its footsteps. How history remembers today’s space race is up to us. If we allow it to be painted in the same rosy image as the last one, we might be in for continued galactic adventures and even more suffering here on earth.

Sat, 27 Nov 2021 21:22:14 +0000 https://historynewsnetwork.org/article/181797
The Accident that Almost Decapitated the US Government

Currier and Ives lithograph of the USS Princeton explosion, 1844



It is the stuff of political thriller novels and a recent television series (Designated Survivor). What if the government of the United States were decapitated by a single event? In reality, one man tried, and failed. In 1865, John Wilkes Booth’s conspiracy succeeded in killing President Abraham Lincoln, but two other targets, Vice President Andrew Johnson and Secretary of State William Seward, survived. Fiction and Booth aside, such a calamity almost happened—on February 28, 1844, on the calm waters of the Potomac River.

President John Tyler, the first vice president to take the reins of power after a president’s death, wanted a better Navy. William Henry Harrison had died in 1841 after only a month in office. Immediately after taking office, Tyler, a former Democrat, feuded over economic policy with the Whig Party, on whose ticket he had been elected. The Whigs tossed him out of the party. The Democrats, miffed at his earlier desertion from their ranks, held him at arm's length. Unpopular, especially in the North, Tyler was a president without a party.

His plan to win a term of his own in 1844 focused on the annexation of the Republic of Texas into the Union. Texas, Tyler told his cabinet and friends, was the “great object of my ambition.” But there were two potential foreign problems—Mexico and Great Britain. Mexico, still officially at war with its former province, and the British, still assertive of their power, especially on the high seas, were not expected to quietly stand by and let Texas be absorbed by the United States.

Part of Tyler’s strategy to defend against these potential foreign threats to Texas annexation was a new state-of-the-art warship, a steamship named the USS Princeton. The product of years of lobbying by Captain Robert Stockton, a friend and political supporter of Tyler, the Princeton, launched in September 1843, was a prototype for the modernization of the Navy. In an era when vessels of war were still sailing ships, steamships, with clunky paddlewheels, exposed engines, and limited range due to fuel requirements, were viewed with disdain by Navy Department officials. Incorporating the design and innovations of John Ericsson, a Swedish inventor whom he had befriended, Stockton’s Princeton was a steamship with a difference. Instead of a paddlewheel, it had a screw propeller system that connected to engines hidden within the hull, where they were protected from enemy fire. Its boilers burned efficient anthracite coal, increasing its range. It was a hybrid, also able to run under sail, with its smokestack collapsible, reducing wind friction and increasing its speed.

Moreover, Stockton determined to make his ship the most powerful warship afloat. He had two twelve-inch bore wrought iron cannons placed on her deck. Named the Peacemaker and the Oregon, each was designed to hurl a cannonball weighing more than two hundred pounds downrange, with accuracy, for almost five miles. His ship, bragged Stockton, was virtually “invincible against any foe” and her innovations and deadly force “may be productive of more important results than anything that has occurred since the invention of gunpowder.” After touting the ship in Philadelphia and New York, Stockton brought the Princeton to Washington in February 1844 to show it off to the nation’s political elite, and to get funding for a fleet of warships like her.

In order for those in power to see his ship up close, Stockton planned three afternoon excursions down the Potomac River, starting at Alexandria. The most important of these was set for February 28, when President Tyler, most of his cabinet, members of Congress, their wives, and other important guests would be aboard. The ship’s massive cannons would be fired three times en route for all to see, a mid-nineteenth-century version of shock and awe by the American Navy. The National Intelligencer proclaimed that the cream of Washington society would see firsthand “this splendid and unequalled specimen of our naval ingenuity, and to witness something of the performance of her formidable battery.”

February 28 dawned a cloudless late winter day in the nation’s capital. Once the ship was underway, Captain Stockton, a wealthy man, spared no expense in entertaining his guests, serving a meal of the finest delicacies, along with plenty of wine and champagne. Stockton decided that the Peacemaker would be the cannon fired on the cruise. The first two firings went smoothly and all were suitably impressed. The third, to honor George Washington as the ship was on the Potomac near Mount Vernon, was scheduled for just after the guests finished eating. At the table, a beaming Tyler had offered a toast, “To the three big guns—the Peacemaker, the Oregon, and Captain Stockton.”

Topside, on the Princeton’s bow, most of the cabinet, and several senators and congressmen, then gathered behind Stockton for the third firing. Word was received that the president was detained below and to proceed without him. As the Peacemaker was fired, it exploded, hurling chunks of iron, some weighing more than a ton, across the deck. When the smoke cleared, eight were dead and almost thirty wounded. Among the dead were two cabinet members, Secretary of State Abel Upshur and Secretary of the Navy Thomas Gilmer. Had Tyler been present, he would have been standing with them and likely would have met the same fate. As he was headed up the steps for the third firing of the cannon, several people began singing an old song that was one of his favorites from his youth, and he stopped to listen. Flirtation may also have been on his mind. The widowed president had earlier been sipping champagne below deck with a young New York socialite that he had his eye on, Julia Gardiner, who was on board with her father and sister. Her father was among the fatalities in the explosion. Within months, Tyler would make Julia his bride.

In early 1845, in the waning days of his administration, President Tyler, after many political battles over the issue, accomplished his goal and signed a law formally annexing Texas into the Union.

The office of vice president had been vacant since 1841, when Tyler became president upon Harrison’s death. There was no procedure for filling a vacancy in that position until the enactment of the Twenty-Fifth Amendment to the Constitution in 1967. Had Tyler been killed in the explosion on the Princeton, the president pro tem of the Senate, Senator Willie Mangum of North Carolina, would have assumed the presidency. Mangum was a Whig, a loyal member of the party that had kicked out Tyler, and was an opponent of the annexation of Texas.  But for a song, history would likely have been changed.

Sat, 27 Nov 2021 21:22:14 +0000 https://historynewsnetwork.org/article/181834 https://historynewsnetwork.org/article/181834 0
The 1950s Dog Dads Responsible for Voter Fraud Claims Today

Good morning, HNN!

In the past six months, over a dozen states have passed new restrictive voting laws that favor Republicans. If you follow the money, the original source of these efforts to undermine democracy becomes clear. Cat lovers, rejoice: it all dates back to a couple of pooch-lovers...

You can also listen to the episode on Spotify and Apple Podcasts, and watch it on Instagram.

Today’s story comes from Dark Money by Jane Mayer, as well as a recent investigation of hers called “The Big Money Behind the Big Lie.”

Next season on Skipped History, beginning in March:

There’ll be Indigenous history, banking history, and the 1619 Project may make an appearance! We’ll try to get to the bottom of the question, “To what extent has the US ever really been a democracy?” while inevitably pivoting to explore the escapades of more poodles. Thanks so much for tuning in this season!

Cheers, Ben

Sat, 27 Nov 2021 21:22:14 +0000 https://historynewsnetwork.org/blog/154565
Perlstein's "Reaganland" Sheds New Light on a Familiar Story




In his latest book, Reaganland: America’s Right Turn 1976-1980, Rick Perlstein, America’s preeminent historian of modern conservatism, takes us more deeply than anyone writing today into the minds of those who, with Ronald Reagan as their willing front man, started us on our current political path.


Perlstein began his long-running study of conservatism with the insightful Before the Storm: Barry Goldwater and the Unmaking of the American Consensus, a detailed chronicle of the beginnings of modern conservatism. He continued with Nixonland, and now comes his latest important work, a devastating, demanding and encyclopedic account (the book is more than 900 dense pages long, with 144 pages of source notes) of the two antagonists at the center of the book, Jimmy Carter and Ronald Reagan. It is a tale of the political party each belonged to, and of the handlers and manipulators who surrounded them.


Because of Reagan and the cult that enveloped him, conservatism as we know it became a unified, carefully crafted movement that eventually took strong hold in the United States. Though not always in power, its right-wing ethos grew stronger every year. If we had kept our eyes open, we would have seen what was taking place. There had been a strong conservative bent in America since the days of the Revolution, and it picked up steam without abating to where it is today. But no one knew in the early 1960s that conservatism would turn into a movement that threatens democracy as we know it. Goldwater as a presidential aspirant crashed and burned. Eventually, as Perlstein sees it, Jimmy Carter, encircled by a staff not up to the demands of the presidency, failed miserably. In doing so, he and the Democratic Party allowed Reagan and his dedicated managers to remake American politics, and to bring into the open many Americans’ deep-seated intolerant beliefs.


It is that story that Perlstein tells so well in Reaganland, with great zeal, detail and context. He makes the interesting point that were it not for his editors and their active red pencils, his study could have run many volumes longer. It is a credit to him that he was able to compress what he discovered into a book as rich as this one.


As a producer for NBC News during 1976-1980, the years Perlstein writes about, I covered both men and their minions many times, though I spent more hours observing Carter than I did Reagan, including trips to Carter’s home in Plains, Georgia. As a news producer, I met, worked with and reported on many of the key players from that era. I was never close to any of the people on either side. Still, I thought I knew pretty much everything that was going on at that time. Perlstein showed me what I did not know by opening my eyes to behind-the-scenes activities, including intimate conversations gleaned from diaries, letters and interviews. His research is prodigious.


In covering politics, it is important to understand how each campaign operates in its quest for the White House. A surprise for me was learning from Perlstein that, when campaigning, Reagan sometimes wrote his own columns and speeches. In the press, we believed that Reagan was a mere actor, a puppet rather than the policy maker. Though he gave the same speech at every stop (the cards with his speech were soon falling apart), it hardly mattered to his audiences, which adored him. Reagan helped set the lighting at every stop. He knew what he wanted. Covering the campaign, I saw him express his vision and marveled at how he had the pulse of his audience at every stop. Rarely did he hold a press conference after a speech. His handlers were very careful to limit his unscripted appearances.


Once, when I was working with Douglas Kiker, a well-informed political reporter for NBC News, we were on the campaign trail covering Reagan. One night after a Reagan speech, Kiker and I hustled to the back entrance of the hotel where Reagan had appeared. We caught up with him on a very humid night in front of a steamy pool for a quick, difficult, rare, and not very revealing interview. But it was a real interview, one on one, not the usual speech template. It showed Reagan to be as adept speaking spontaneously as he was delivering a prepared speech. Though his handlers worried about what he would say and how he would say it, our short interview showed they had little to worry about. His advisors were calculating and secretive, even sinister in the eyes of some, but their success with Reagan’s followers wiped those concerns away. Reagan, though at times headstrong, after eight years as governor of California and long experience as an actor and spokesman on television, understood that his staff knew his strengths and weaknesses. He understood they were guiding him in the right direction. He usually did what he was told, and he did it very well.


There are critics who believe Perlstein is too tough on Carter. Perlstein gives Carter credit for his success with the Camp David accords, but that victory was not enough to define his presidency. Carter had no one except himself to blame for his failure to reach the American people. It was a time when the country was starting to devolve into deep partisanship, more divided than anyone thought; witness the partisan divide today. Perlstein thoroughly details how Carter, during his presidency, to his lasting detriment, refused at times to listen to his equally inexperienced advisors. Carter wanted to make America better. I do not doubt his desire, but in pushing the idea that the nation had to suffer to succeed, Perlstein says, he never made a convincing case for his imprecise, muddled vision. Carter’s speeches and folksy ways (wearing a cardigan sweater, sitting before a fireplace) failed to connect with a nervous public that could only see difficult days ahead. Americans did not want to suffer. The public decided life was much better than Carter said, and assumed it would get better over time, so why make it worse than it was? In the end, voters chose Reagan because he offered a sweeter vision for America.


Perlstein does not give Carter any breathing room. He makes a very good case that Carter, Hamilton Jordan, Jody Powell, Gerald Rafshoon and sometimes Pat Caddell rarely understood how to run the country. He also makes the strong case that Reagan and his handlers, Richard Wirthlin, Richard Viguerie (with his direct mail genius), John Sears and Lyn Nofziger, though not always in agreement as a group, had a purpose. They wanted to change America for all time, which they did with Reagan’s eight years in the Oval Office.


A great strength of Perlstein’s is his use of context, never letting the reader forget what else was going on in the world at the time of Reagan’s emergence. In chapter after chapter Perlstein outlines where society was in movies, on television, in books, in popular music and more. It gives us a chance to pause in the central narrative, take a breath, and remember what the world was like between 1976 and 1980. Context also explains how Carter and Reagan were reacting to the world around them. Two major social currents, homophobia and anti-feminism, were turned into ugly strategies by Ronald Reagan’s advisors. Republicans understood how effective those ideas were in turning people toward Reagan instead of against him. Conservatism was on the move.


In 1980 we had a peanut farmer and former governor of Georgia as president, with very little real experience in governance, and a B actor who was also a TV host, working at opposite ends of the political spectrum. Though at the time they were unaware of the consequences of their battle, they helped create a new idea of America, a clear juxtaposition of ideas framed by the handlers and advisors each man had. Then, by whittling away at who we thought we were, Reagan and his inner circle eventually achieved their goal. Despite the victories of Bill Clinton, Barack Obama and Joe Biden, the conservative movement now seems a permanent fixture on our political landscape.

Sat, 27 Nov 2021 21:22:14 +0000 https://historynewsnetwork.org/article/181835
HNN Will Be On Thanksgiving Break This Week

HNN will be off Thanksgiving Week. We will return with news aggregation on November 29, and publish new op-eds on December 5. We wish all of our readers a safe, restful, and happy holiday.

Sat, 27 Nov 2021 21:22:14 +0000 https://historynewsnetwork.org/article/181839
The Roundup Top Ten for November 19, 2021

As A White Student in a Mostly Black School After Brown, I Learned Not to Fear History

by Woody Holton

"My three and a half years as a racial minority convinced me that one of the biggest beneficiaries of school desegregation was me."


The Elephant Who Could Be a Person

by Jill Lepore

A petition challenging the keeping of Happy, an Asian Elephant, by the Bronx Zoo raises questions about the legal status of personhood. If it applies to protect the property and civil rights of corporations, can it be extended for the protection of the natural world? 



Today's Educator Shortages are a Product of Decades of Bullying and Ignoring Teachers

by Diana D'Amico Pawlewicz

Again and again, politicians and administrators have implemented policies that make teaching unattractive work, then acted surprised when people chose other options. 



We're Still Living with the Consequences of Letting Iran-Contra Perpetrators Get Away with It

by Zeb Larson

An explainer on the Iran-Contra affair, and why the failure to hold wrongdoers accountable shapes our politics today. 



Are We Witnessing a General Strike Today?

by Nelson Lichtenstein

DuBois's insight that enslaved people abandoning plantations during the Civil War was a form of general strike helps us understand the seemingly unorganized trend of workers quitting their jobs today as a meaningful labor action that points in the direction of economic freedom.



SNCC's Unruly Internationalism

by Dan Berger

SNCC activists' global understanding of the problem of racism, expressed at the height of the Cold War, cost the organization external support, but left a vital legacy for international movements for justice. 



How to Ensure a New Redlining Initiative Succeeds

by Robert Henderson and Rebecca Marchiel

Ensuring equity in mortgage lending requires understanding why the Community Reinvestment Act failed to achieve the same goal decades ago, through a better awareness of the ongoing problems in mortgage lending. 



Why Mislead Readers about Milton Friedman and Segregation?

by Nancy MacLean

"One would think that today the facts about the long struggle of southern white leaders to preserve segregation are so well known that simple fact-checking would suffice to rule out attempts to whitewash their efforts."



Al Levy's Court Martial: An American Dreyfus Affair

by Jeannette Gabriel

Al Levy's court martial exposed the discrimination embedded in American military culture during World War II, and the way that antisemitism informed the way his accusers questioned his loyalty. 



Can Universities Counter the Global Tide of Nationalism?

by Emily J. Levine

Nationalism and a growing rivalry with China have pushed some politicians to reconsider the openness of American universities to foreign students. The history of academic exchange suggests this may be a mistake. 


Sat, 27 Nov 2021 21:22:14 +0000 https://historynewsnetwork.org/article/181833
Kyle Rittenhouse's Trial Will End in a Verdict. The Nation's Trial By Ordeal Won't



The judge is never meant to be the center of news coverage of a trial—and certainly not of a trial that has become a microcosm of American politics. Yet as the trial of Kyle Rittenhouse winds down, with closing arguments set to begin on Monday and people all around the nation watching to see what the verdict will be, the judge in charge of the trial, Bruce Schroeder, has managed to become a central figure in the reporting.

When he led the jury selection on November 1, Schroeder opened his comments with a brief discourse on the history of the law. While preparing to discuss the importance of the jury system in America, he decided to give the long history, starting back: “A thousand years ago, a case like this would have been tried. Well, let’s go back even further. Let’s go back to the—I’m sure you remember it. The fall of Rome in 476 and when Rome fell the world changed dramatically.” He said that the fall of Rome led to the end of the jury system, replaced by two things—trial by combat, subject of the recent movie The Last Duel (which he referenced)—and, more interestingly, trial by ordeal.


See transcript of Judge Schroeder's remarks below. 

Trial by ordeal is exactly the kind of thing that makes calling something “medieval” a casual pejorative. Schroeder certainly presented it as such, saying, “the concept was that a person would be made to walk across for example a bed of coals, burning coals, or stick her hand or his hand into a boiling water, and depending on the healing period, if the person recovered sufficiently and this was blessed you know as so it would be a message from God, that was determined if he didn’t come out too badly he was innocent. If he did his punishments had just begun.” The idea, of course, is that people in the Middle Ages were superstitious, illogical, and ignorant. The trial by ordeal, the idea that walking across coals or putting a hand in boiling water was a message from God about innocence, seems absurd.

And it is. Because much like those notions of the Middle Ages, that idea of the trial by ordeal is fundamentally untrue. A trial by ordeal was about the court of public opinion, about the community and their decisions about innocence and guilt. And whatever happens next week, the Rittenhouse trial has indeed become a trial by ordeal—one where the public is making decisions, regardless of what is happening in the court room, and not in a way that is bringing consensus and closure. And it is one in a long train of American legal battles wherein competing interpretations of guilt reveal more about the society than the case itself.

The social nature of trial by ordeal can be illustrated with an example from that most violent of episodes, the First Crusade. A preacher who had been leading the march from Antioch to Jerusalem in 1098 was accused, essentially, of fabricating celestial visions. He was put on trial; witnesses came forth to support his cause, and he won. But in the aftermath, so much ill will was generated that he volunteered for a trial by ordeal. He would walk between two great piles of fire, promising to come out unhurt. And here is where it all gets tricky—we do not know what happened next. Much like in the aftermath of any political trial, the outcome of the trial and the outcome of how we feel about it diverge dramatically. The most sympathetic account, which also has the most complete description of the trial, says that he passed between the fires unharmed—that the community had sided with him, and therefore placed the fires far enough apart that he could make it in safety. This, after all, is the secret of the ordeal: the fire, the water, the placement, the decisions on healing, all depend on how the community feels, and on sufficient communal faith in the shared experience of suffering. But in that account, the crowd in the aftermath seized him, pulled him along the ground, and injured him severely enough that he died a week later. This was from the most sympathetic account, and an eyewitness one at that. One of the later accounts, the most hostile, changed the story, saying he had been burned passing through and died the next day, clearly guilty.

A trial by ordeal was not about miracles or superstition. It was, in effect, about the community making a decision on the innocence or guilt of the party, and then bringing it about. This is an irrepressibly recurring element of the rudest conception of justice across societies and time, from Socrates to John Brown. And certainly we prefer our legal system to try and be above it, as much as any system made up of human beings with all their biases, presided over by a judge with all of theirs, can be. But the Rittenhouse trial will not end next week with a verdict. It will not end when it exits the courtroom. It is a trial that was already part not only of public discourse but political discourse, something that is only ramping up. This time the mounds of fire will not signify our common understanding and consensus.


Judge Schroeder's remarks transcribed:

This is about our constitution. A thousand years ago, a case like this would have been tried. Well, let’s go back even further. Let’s go back to the—I’m sure you remember it. The fall of Rome in 476 and when Rome fell the world changed dramatically. And the ancient law which had developed from Greece and Rome which involved jury trials—not the kind of juries we have today but they were the resolution of legal disputes by the use of reason—that fell with Rome. And some very primitive methods came into use to decide legal disputes including criminal cases. Sometimes I—I saw somebody at one of my grandkids games a month or so ago and he was wearing a shirt that said demand trial by ordeal. No, demand trial by combat. And I thought what an unusual shirt. But because I talk about this once in a while I thought it was kind of interesting. I guess it turns out that there’s a program on television that deals with that does anybody familiar with that? Well there’s some program on TV that has dealt with the concept of trial by combat and I think there may be a movie coming out about that. Actually it was true a thousand years ago, that people who were having with a legal dispute including in criminal cases could either themselves or hire someone to fight for them and they’d fight with the other side physically and there’d be an ecclesiastical blessing, a priest would bless it and it was decided that God would guide the hand of the innocent and that the innocent person would win that fight and that would be the outcome of the case. They also had something called trial by ordeal. And the concept was that a person would be made to walk across for example a bed of coals, burning coals, or stick her hand or his hand into a boiling water, and depending on the healing period, if the person recovered sufficiently and this was blessed you know as so it would be a message from God, that was determined if he didn’t come out too badly he was innocent. 
If he did his punishments had just begun. So that’s how they actually decided cases. And then in 1215 Pope Innocent III prohibited priests from participating in these ceremonies. So um they had to turn to some other way to determine these issues and they moved back to the jury system and the initial juries were actually made up of people who knew the people involved. People who had either been witnesses who saw what happened or people who were acquainted with the people and knew whether they were honorable people whether they were violent people, whatever the case may be. They had some knowledge of the people and they’d get together and they’d talk it over and they’d come up with a verdict. Well what‘s the number one problem that you would see with a system like that? Bias.


Sat, 27 Nov 2021 21:22:14 +0000 https://historynewsnetwork.org/article/181758
Fashion and Freedom from Suffrage to AOC



This September, after a one-year hiatus because of the pandemic, the rich and famous once again gathered in New York City for the annual Met Gala. Celebrating the Museum’s new exhibition “In America: A Lexicon of Fashion,” the evening was themed around the idea of “American Independence,” as guests (and their sponsors) demonstrated their interpretations of this concept through their clothes.


While some chose to highlight American designers, or to reference Hollywood’s iconic stars, others brought their politics to the red carpet, demonstrating that politics are a huge part of what defines American fashion. Indeed, although Alexandria Ocasio-Cortez’s “Tax the Rich” dress received most of the attention, she was not the only one who used her outfit to make a political statement. Representative Carolyn Maloney showed her support for women’s rights by wearing a dress in suffrage colors, bearing slogans like “ERA Yes” and “Rights for Women” modeled on the sashes suffragists donned.


Poet and activist Amanda Gorman drew attention to immigration when she wore a blue dress reminiscent of Lady Liberty, while holding a book-shaped clutch with a golden emblem of the phrase “Give Us Your Tired” on it. Soccer athlete Megan Rapinoe and actor and writer Dan Levy used their appearance to show support for LGBTQ rights, while model Cara Delevingne literally wore her message on her chest with an armor-like vest with the words “Peg the Patriarchy.”


Given these examples, it is somewhat surprising that in the exhibit’s new fashion lexicon the word “politics” doesn’t appear. The exhibit showcases 12 themes that comprise what its curators envision as the modern vocabulary of American fashion: Nostalgia, Belonging, Delight, Joy, Wonder, Affinity, Confidence, Strength, Desire, Assurance, Comfort, and Consciousness. Yet, especially in the American context, politics and fashion are intrinsically linked to one another, and this connection is important to the understanding of both.


Another word that is missing from the lexicon and is also associated with America is “Freedom.” Clothes—and fashion—were often at the center of social struggles to gain freedom and equality, not only as symbolic gestures but also as ends in themselves. Particularly in the feminist movement, sartorial freedom was part and parcel of a broader agenda for women’s rights.


Although fashion and feminism are often positioned as opposing forces, the idea that clothing and adornment practices hinder political activism is very far from the truth. Throughout the 20th century, feminists did not view fashion as a frivolous or marginal issue, but as a useful means to redefine gendered, classed, and racialized notions of femininity, as well as to promote feminist agendas. As they employed mainstream fashion styles, they expanded the spaces of feminist activism beyond formal organizations and movements, reclaiming fashion as a realm of pleasure, power, and feminist consciousness.


Much of the feminist effort to achieve sartorial freedom focused on implementing styles that allowed for comfort, mobility, and freedom. Women sought to relieve themselves from restrictive undergarments and long trailing skirts, while pushing for loose silhouettes and utilitarian elements in clothing such as pockets and zippers. In the 1920s, women equated their choice to wear knee-length skirts with their right to vote, while suffragists argued that pockets were the first step in gaining equality between the sexes. Pockets, just like voting, became a feminist rallying cry for freedom and equality. By the 1940s, sportswear, a style that extolled comfort and mobility, and was most identified with its female designers, became, according to Dorothy Shaver, the president of the Lord & Taylor department store empire, the epitome of the “American Look.”


Yet, the goal was not just to seek clothes that would provide these freedoms, but that such outfits would also be popular, beautiful, and part of the mainstream – or in other words, fashionable. Especially for women who were barred from positions of power due to their class or race, the ability to claim “fashionability” was sometimes more important than the comfort and mobility that these styles provided their bodies. Immigrant women, like early 20th century labor activist Clara Lemlich, combined their demand for labor rights with the respect they deserved as “fashionable ladies.” Suffragists argued that their fashionable taste was evidence of their worthiness to be voters. And young Black migrants to cities in the 1920s adopted flapper styles as part of their demand for racial inclusion.


These women faced criticism. However, they did not think that fashion was a distraction. On the contrary, it was claiming their right to participate as equal members in the fashion world that helped them to imagine their role as equal members in society. Fashion, as they understood it, and as many in the Met Gala also understand, has power, and as such it can be used to convey and promote political agendas in creative ways.


While most of us are not writing manifestos to justify what we wear, our clothes sometimes can carry a loud message. From the pink “pussy hats” of the 2017 Women’s March to face masks with BLM written on them, fashion can be a useful tool in the activist’s box as well as a personal form of individual expression. And especially in an event like the Met Gala, when the eyes of the media and the attention of the world are more open to hear this message – fashion can speak volumes.


Rather than treat the Met Gala’s red carpet as a space of escapism, Ocasio-Cortez, Maloney and the others turned it into a protest ground. And as they wore their politics on their sleeves, or in the case of Ocasio-Cortez—on their backs—they also showed us what “American Independence” is all about. Advancing policies and political agendas is not restricted to the halls of Congress; it can be manifested in the clothes we wear, whether it is a white suffragist dress from 1917 or a white “Tax the Rich” dress from 2021. Even if the Costume Institute chose to omit “politics” and “freedom” from its new lexicon of American fashion, it was the guests of the Gala who put these words back in.


Yet even more importantly, fashion, freedom, and American independence have been and are still connected to ideas of women’s rights and equality. Ocasio-Cortez did not just protest against capitalism, she also made sure the dress had pockets.

Sat, 27 Nov 2021 21:22:14 +0000 https://historynewsnetwork.org/article/181760
Will SCOTUS Force Us All to Find Out How Polite an Armed Society Will Be?



It appears that the Wild West will soon return to America. At least, that’s what the Supreme Court indicated at oral argument recently.


The right to bear arms is enshrined in the Second Amendment to the Constitution. In District of Columbia v. Heller, a 5-4 decision in 2008, the Supreme Court held there was a constitutional right to have a gun for self-defense in the home. Rock-ribbed conservative Justice Antonin Scalia, writing for the Court, left a marker:

“Nothing in our opinion should be taken to cast doubt on longstanding prohibitions on the possession of firearms by felons and the mentally ill, or laws forbidding the carrying of firearms in sensitive places such as schools and government buildings….”


Beware the opening wedge. The rule may soon swallow the exception.


The Supreme Court, newly minted with Trump-appointed conservative justices, is considering a case called New York State Rifle & Pistol Association v. Bruen, involving New York’s stringent “Sullivan Law,” on the books since 1911. The law, a model for similar laws in other states, was enacted on Jan. 23, 1911, after a “blueblood murder-suicide” in which Fitzhugh Coyle Goldsborough shot dead the novelist David Graham Phillips in a brazen early afternoon attack near Gramercy Park. After firing six shots, Goldsborough put the gun to his temple, killing himself. The city coroner reasoned that “the time had come for legislation that would prevent the sale of pistols to irresponsible persons.” As they say in New York, “There oughta be a law.” Yes, and there oughta be a law against chaos, too.


Under the New York regime, a citizen must show “proper cause” to get a license. “Proper cause” requires a showing of a special need to defend oneself rather than a speculative wish to protect person or property. What counts as proper cause is left to the discretion of the authorities.


Actually, if you are an originalist, and believe that the Constitution should be interpreted in accordance with the original understanding of the society in 1791 when the Second Amendment was enacted, you might want to know that we have had more than 700 years of legal history supporting the proposition that, as one conservative judge recently put it, “Government has the power to regulate arms in the public square.”

Not all conservatives agree with the challenge to New York’s law. Former federal appeals court judge J. Michael Luttig argued in an amicus brief to the Supreme Court that “the original understanding of the Second Amendment was that there is not an absolute, unfettered right to carry loaded guns in public.”

Judge Luttig also authored a guest essay in The New York Times explaining that “at stake… is whether the Supreme Court will claim for itself the power to decide where and when Americans can carry loaded firearms in public-a power that the Constitution reserves for the people and their elected representatives.”

Fordham Professor Saul Cornell, one of the leading authorities on early American constitutional thought, led 16 professors of history and law in a brief, arguing that “One of the longest continuous traditions in Anglo-American law are limits on the public carry of arms in populous areas.”

At the recent oral argument, though, conservatives on the Court seemed to ignore the original understanding and history of public carry laws, and challenged the lawyers for New York state who defended the Sullivan law.

Justice Amy Coney Barrett wondered whether New York wanted the Court to overrule Heller. As the corrupt police commissioner said of gambling in Casablanca, “Shocking!” Of course, Barrett is apparently prepared to overrule Roe v. Wade, the 7-2 decision in 1973 establishing a woman’s constitutional right to an abortion.

Justices Alito and Kavanaugh grilled New York Solicitor General Barbara Underwood on whether it was appropriate for New York to second-guess the assertions of a citizen that he or she is in danger. Justice Thomas and Chief Justice Roberts pressed Underwood on whether the restrictions should be different in rural and urban areas, with Roberts wondering what a ruling in favor of open carry might mean for “sensitive areas” like courthouses, schools, airports, stadiums or places where alcohol is served.

Roberts, as well as other justices, has said that the Court loses public support when it is perceived as partisan, and nothing could be more partisan than its approach to gun rights. When it comes to abortion, conservatives on the Court are willing to overrule settled law and invalidate Roe v. Wade, even to the extent of upholding a Texas statute that grants a bounty to someone who wants to sue providers (or other enablers) of an abortion beyond the first six weeks of pregnancy. When it comes to gun rights, however, they are ready to ignore precedent and history and assume a power that the Constitution explicitly reserves to the people and the states.

My father and mother, lifelong New Yorkers, visited Taos, New Mexico in the 1930s. They went to a diner for breakfast. A cowboy, resembling the gunslingers of the old West, sporting a Stetson hat, boots and jangling spurs, and two six-guns at his waist strode into the place. Before he sat down, he unbuckled his gun belts, and hung his brace of deadly weapons on the coat rack at the end of the booth. The open carry of weapons was legal then under New Mexico law, and it is legal now in New Mexico, as well as 43 other states.


Other states have been more skeptical when it comes to the legal carry of guns. California, Hawaii, Maryland, Massachusetts, New Jersey, and Rhode Island--and New York--require that persons obtain a license to carry concealed weapons in public, wherever they believe they need their guns for self-defense. As a practical matter, that means everywhere 24/7.

Can it be that the Supreme Court will guarantee the constitutional right of someone to go to a sporting event or get on an airplane, bus, subway or train packing heat? Maybe the proprietors of businesses can bar them, but can the government? That will be up to the Supreme Court. As Alice said in Wonderland, “Curiouser and curiouser!”

A decision is expected by next summer.

Sat, 27 Nov 2021 21:22:14 +0000 https://historynewsnetwork.org/article/181759
Look to Lincoln's Interactions with Black Americans to Understand His Racial Attitudes

"Frederick Douglass appealing to President Lincoln and his cabinet to enlist Negroes," mural by William Edouard Scott, at the Recorder of Deeds building, built in 1943, Washington.



Six weeks after Lincoln’s death, Frederick Douglass described him as “emphatically the black man’s president,” the first chief executive “to show any respect for the rights of a black man, or to acknowledge that he had any rights the white man ought to respect,” the first one to rise “above the prejudices of his times and country.” Lincoln treated each African American “not as a patron, but as an equal.” Speaking of his own experience in 1864 when the president summoned him to the White House to discuss public affairs, Douglass concluded: “In daring to admit, nay in daring to invite a Negro to an audience at the White House, Mr. Lincoln did that which he knew would be offensive to the crowd and excite their ribaldry. It was saying to the country, I am President of the black people as well as the white, and I mean to respect their rights and feelings as men and as citizens.” Echoing Douglass, historian David Reynolds has recently deemed the sixteenth president a “radical antiracist” and a “leftist abolitionist who loathed racism.”


In The Black Man’s President: Abraham Lincoln, African Americans, and the Pursuit of Racial Equality, I attempt to illustrate Lincoln’s racial egalitarianism by describing his interactions with African Americans, both during his presidency and his Illinois years. As Reynolds recently observed, it is only by studying Lincoln’s “personal interchange with black people” that we can see “the complete falsity of the charges of innate racism that some have levelled at him over the years.”


Lincoln’s unfailing cordiality to African Americans in general, both in Springfield and Washington; his willingness to meet with them in the White House; to honor their requests; to invite them to consult on public policy; to treat them with respect and kindness whether they were kitchen servants or leaders of the Black community; to invite them to attend receptions and tea; to sing and pray with them on their turf; to authorize them to hold events on the White House grounds—all those manifestations of an egalitarian spirit fully justified the tribute paid to him by Frederick Douglass and other African Americans who met with him. Among them was Sojourner Truth, who called at the White House in 1864 and shortly thereafter wrote:  “I never was treated by any one with more kindness and cordiality than were shown to me by that great and good man.”


Black workers in the Executive Mansion reacted similarly. Elizabeth Keckly, a former slave who became the First Lady’s dressmaker and confidante, observed Lincoln closely throughout the Civil War and told a journalist: “I loved him for his kind manner towards me.” He “was as kind and considerate in his treatment of me as he was of any of the white people about the white house.” Other African Americans with whom Lincoln interacted included Peter Brown, a waiter whose young son spent much time at the Executive Mansion during the Civil War and remembered that Lincoln “was kind to everybody” and “sympathized with us colored folks, and we loved him.” The daughter of William Slade, the African American head butler at the Executive Mansion, reported that the president “never treated [the White House employees] as servants, but always was polite and requested service, rather than demand it of them.” 


Many Black people visited the Lincoln White House. As Clarence Lusane, a political scientist at Howard University, noted: “political access to the White House had been extended to the black community for the first time in U.S. history” during Lincoln’s presidency, and that included not only prominent people like Frederick Douglass and Sojourner Truth, but also “many lesser known activists and ordinary African Americans” who “met with him there as well. The significance of these encounters cannot be overstated.” The “multiracial space that Lincoln opened would be a critical new element in the ongoing struggle for black freedom and equality.”   


Those callers included Black Washingtonians who attended receptions at the Executive Mansion, much to the consternation of Democrats, who denounced the president for thereby promoting social equality and miscegenation. Other African Americans came to ask favors, to urge emancipation and the recruitment of Black troops, to lobby on behalf of colonization abroad, to express thanks, and on some notable occasions to press for Black suffrage rights. Arguably the most influential such meeting occurred in March 1864, when two African Americans from New Orleans presented a petition signed by a thousand Black residents of the Crescent City appealing for the right to vote. Lincoln received them cordially, expressed sympathy for their cause, but noted that states, not the federal government, determined suffrage qualifications. A few days later, the president acted on their request by writing to the newly elected governor of Louisiana, suggesting that it would be desirable if the state constitutional convention that was scheduled to meet soon would enfranchise at least some black men, like those serving in the army and the very intelligent, by which he evidently meant literate. The governor used that letter to effect, and the convention voted to authorize the legislature, if it saw fit, to enfranchise some Black men. That helped pave the way for the eventual enfranchisement of Black Louisianans in 1868.

On April 11, 1865, thirteen months after receiving the New Orleans petition, Lincoln for the first time publicly endorsed Black suffrage. He told a huge crowd gathered at the White House that he favored extending voting rights to Black soldiers and sailors as well as the very intelligent. Upon hearing that speech, John Wilkes Booth turned to his companions and said: “That means [n-word] citizenship. Now by God I’ll put him through!” He added: “That is the last speech he will ever make.” Three days later he carried out his threat by assassinating the president. Thus Lincoln should be regarded as a martyr to Black civil rights, as much as Martin Luther King, Medgar Evers, Viola Liuzzo, James Chaney, Michael Schwerner, Andrew Goodman, and the other champions of that cause who were murdered during the civil rights movement of the 20th century.

Sat, 27 Nov 2021 21:22:14 +0000 https://historynewsnetwork.org/article/181761 https://historynewsnetwork.org/article/181761 0
Professor, Novelist, and MacArthur "Genius" Charles Johnson on His First Career: Cartoonist

We think in pictures. Like music, the content of a drawing can be universally recognized; it cuts across language barriers and can be ‘worth a thousand words.’ –Charles Johnson

Drawing is not what one sees but what one can make others see. – Edgar Degas


Professor Charles Johnson (Photo by Mary Randlett)


Most readers probably know Seattle’s Charles Johnson, the retired UW English professor, as a celebrated scholar, beloved teacher, and literary icon. He has written four acclaimed novels including Dreamer and the National Book Award-winning Middle Passage, as well as numerous essays, short stories, screenplays, and studies of race, culture and eastern religion such as his book Taming the Ox. He is also the recipient of a MacArthur “Genius” Fellowship and is recognized as an influential public intellectual.

After earning a doctorate in philosophy (emphasizing phenomenology and literary aesthetics), he served for more than 30 years as a professor of English at the University of Washington, where he taught literature and creative writing, directed the creative writing program, and held an endowed chair, the S. Wilson and Grace M. Pollack Professorship for Excellence in English. Students remember him for his encouragement, academic rigor, attention to individual needs, and devotion to excellence as a teacher.

Charles Johnson’s reputation in the world of writing and books thus is well established. And the prolific retired professor continues to write and speak for audiences around the globe from his home in Seattle’s Wedgwood neighborhood.

However, many readers may not be aware that Johnson loves visual art and that his first career was as a comic artist. As he puts it, he’s been “addicted” to drawing since childhood.

Johnson’s early cartoon work recently gained renewed attention. A selection of his cartoons from 50 years ago is now on exhibit at Chicago’s Museum of Contemporary Art along with the work of other Black Chicago cartoonists from 1940-1980. He wrote the introduction for It’s Life as I See It, the exhibit’s catalog. That title comes from the caption of one of his vintage cartoons, which depicts a Black artist describing his painting of a pure black rectangle to a white observer. And next year, the New York Review of Books will publish a collection of more than 200 of Johnson’s cartoons in a volume called All Your Racial Problems Will Soon End: The Comic Art of Charles Johnson.


Professor Johnson recently sat down at a northeast Seattle café and generously recounted his fondness for visual art and his lifelong, near obsession with making cartoons, a passion since childhood that he continues to indulge at age 73—among his many other interests.

Thinking in Pictures. Since childhood, the cerebral Johnson has always first thought in pictures. “Images and ideas fill my head where nobody can see them. I have to externalize them on the page. It could be a drawing. It could be a short story. It could be an essay. . .In these [creations], you can see through my eyes.” His universally admired vivid description and convincing characterization in his novels and short stories attest to his strong visual sense. And this ability to picture scenes in his mind enhances his artistic talent.

Professor Johnson’s First Love: Art. Johnson recalled that he began drawing in elementary school in his hometown of Evanston, Illinois, and never stopped. Drawing “was something I loved to do.” He described the sensual delights he still enjoys: the texture of paper, the smell of ink or paint, experimenting with various pens, pencils and brushes. He especially appreciates the process of coming up with ideas, then creating images and sharing them with others. And he likes the opportunity to play as he draws. He said, “Playfulness is an element of all art.”

His mother, a voracious reader who loved music, admired his drawings. His father also enjoyed his son’s creations but was skeptical about Johnson’s dreams of becoming an artist. At the time, his father worked two or three jobs at a time to support the family. Nonetheless, he bought art supplies for his son and even gave him a drawing table one Christmas. At twenty-five dollars, the table was a big splurge for his family and Johnson is still grateful for his father’s generosity. The drawing table “became my place of worship,” Johnson said, where he spent hours exercising his imagination as he sketched and drew almost daily.

Distance Learning. “Drawing was my passion all the way through middle school and high school,” Johnson said. “I drew everything I possibly could.”  

At the nationally renowned Evanston High—a temple for youthful scholarship—Johnson made cartoons and comic strips for the student newspaper and gave cards and other drawings to friends.

When he was just 15, about the time he began studying Buddhism and meditation, Johnson came upon an ad in Writer’s Digest for a correspondence course in drawing offered by prolific cartoonist and writer Lawrence Lariar, then the cartoon editor for Parade magazine and editor of the series Best Cartoons of the Year, as well as a former idea generator at Disney.

Johnson told his father that he'd decided on what profession he wanted to pursue, that of an artist, and his dad said, “Chuck, they don't let Black people like us do that. You need to think of something else.” But his father eventually gave Johnson the money for the two-year course. Johnson noted that his hardworking father wasn’t familiar with the art world and also had grown up under unrelenting segregation and limited opportunity in the Jim Crow South.

Before starting the correspondence course, Johnson wrote to Lariar about his father's contention that Black people were precluded from careers in art. To Johnson’s surprise, Lariar wrote back within a week, and he told Johnson that “your father is wrong.” He added, “You can do whatever you want with your life. All you need is a good teacher.”

Lariar became an important mentor for the aspiring cartoonist. Over the next two years, Johnson religiously submitted drawings for each correspondence lesson and Lariar responded with thoughtful critiques and encouragement.

Bus Trips East. In the summer breaks during high school, Johnson took a Greyhound bus from Chicago to visit his relatives in Brooklyn. He also made a point of seeing Lariar at his home on Long Island, where Lariar would treat him to lunch, discuss art, and share stories of many of his friends, well-known cartoonists and artists whom Johnson admired. Lariar, a liberal Jewish American, was quite open-minded with a special sense of humor. Johnson said, “I think he delighted in surprising his white neighbors by having Black guests.”

Pounding the Pavement in the Big Apple. On those summer trips to New York, young Johnson made appointments and visited publishing houses throughout Manhattan where he’d share samples of his illustrations and cartoons.  

Along the way, he met beloved comic artist Charles Barsotti, who eventually became a regular cartoonist for The New Yorker. Barsotti offered support and praised Johnson’s work. He also mentioned the need for more young African American cartoonists to share their perspectives. He believed Johnson could tackle issues about race that white cartoonists were reluctant to even consider. Later, Johnson took that comment to heart.

The First Art Sale. Johnson made his first art sale to a Chicago magic company in 1965 at age 17. He illustrated a half dozen magic tricks for one of the company's catalogs. Johnson proudly framed a dollar from that initial sale and it still hangs in his home study.

As a high school senior, Johnson won two second-place awards for his work, a comic strip he called "Wonder Wildkit" he co-authored with a friend, and a sports cartoon, in a national contest for high school cartoonists sponsored by the Columbia University School of Journalism. He only learned of the awards, however, during his first year of college when he returned home and saw a news story about it in his hometown newspaper, The Evanston Review.

College: Majoring in Journalism. As high school graduation approached, Johnson was set on studying art in college. “All I wanted to do was get out of high school and go to art school. And I got accepted in art school but, at the very last minute, I bailed out. I wondered then, in the spring of 1966, whether or not this would be a marketable degree.”

Thanks to the advice of a practical high school counselor, Johnson decided against art school and instead chose to major in journalism and, as it turned out, journalism was a “good fit,” he said. “In journalism school I could draw and write at the same time.”

Johnson started college at Southern Illinois University in Carbondale in 1966, a tumultuous time. During that year, tens of thousands of US troops headed to Vietnam, the Black Power movement gained momentum, Bobby Seale and Huey P. Newton founded the Black Panther Party, James Meredith was shot and wounded at the outset of his March Against Fear, and racist thugs threw rocks at Dr. Martin Luther King Jr. as he led civil rights marches in the Chicago area.

In his first year of college, Johnson illustrated for the college paper and other publications. In his courses, he studied and wrote about the great cartoonists of America and Europe such as Nast, Daumier, Hogarth, Cruikshank, and Rowlandson.

During his undergraduate years, Johnson said, “I made every kind of drawing you can imagine. Editorial cartoons, single panel gag cartoons, comic strips, design work, illustrations, and even a commemorative stamp.” He also published cartoons in newspapers such as the Chicago Tribune and Southern Illinoisan, and magazines including Jet, Ebony, Players, Negro Digest, and others.

Intern at the Chicago Tribune. Johnson eventually worked as a summer intern in 1969 at the Chicago Tribune as a cartoonist and writer for that newspaper's "Action Express" public service column. After returning to college, he was a stringer for the paper at SIU. “I didn't really file any news stories at all until the following spring of 1970.”

The one big news story Johnson covered for the Tribune concerned peaceful demonstrations in May 1970 at SIU to protest Nixon’s expansion of the Vietnam War with the invasion of Cambodia. An editor inserted into the story his own opinion that the protest was prompted by “outside agitators.” Johnson objected strenuously. That wasn’t true but, to Johnson’s chagrin, the paper printed the editor’s misinformation in the published story.

First Book of Cartoons: Black Humor. Also in 1970, Johnson published his first book, Black Humor, a collection of 89 cartoons that drew on Black history and culture as they targeted bigotry, hypocrisy, cruelty, liberal guilt, hate from any source, and more.

Johnson skipped classes to create Black Humor in just one almost sleepless week, spurred by Amiri Baraka. At a reading Johnson attended, Baraka, by then a leading Black Nationalist figure, urged Black students to bring their talents back to their communities. Charles Barsotti had offered similar advice to Johnson a few years earlier. The result was this first book of biting and revelatory cartoons.

Thanks to a suggestion from acclaimed writer and book editor Bob Cromie at the Chicago Tribune, Johnson brought his Black Humor manuscript to Johnson Publishing (no relation), where Ebony and other periodicals of Black interest were produced. The publisher accepted the book and brought out Johnson’s debut humor collection.

Johnson finished other books of cartoons, including Half Past Nation Time in 1972. Unfortunately, a fly-by-night publisher never got this book to market. He also created a collection of cartoons on slavery, I Can Get Her for You Wholesale, and another on Buddhism entitled It’s Lonely at the Top.

 You sure you got the right retreat?


Johnson’s cartoons reveal the breadth of his interests including Black culture and racial reckoning and far beyond to philosophy, religion, science, the academic world, art, and every aspect of the human comedy. In a recent profile of Johnson in The Chicago Tribune, reporter Christopher Borrelli described Johnson’s wide range of literary work as “unclassifiable.” The same applies to his visual art.

A PBS Drawing Show: Charlie’s Pad. During his college years, Johnson even worked as an admired television performer.

In 1969, he called his local Public Broadcasting Station on campus at SIU and asked if they would like a program where he would teach drawing on the air. The local producers were enthusiastic because they needed content and the two-camera show with a host stuck at a drawing board would be cheap to produce.

Over the next year, Johnson taped fifty-two 15-minute episodes of Charlie’s Pad. The program was broadcast nationally and in Canada and rebroadcast for a decade. Often, Johnson would tape three shows back-to-back on days when he wasn't attending his classes.

Charlie’s Pad received wide acclaim. Viewers sent him their drawings and notes of gratitude for his lessons. Johnson recalled hearing from one viewer in recent years who said he learned to draw from the program and, since then, he had taught his child to draw.

Charles Johnson hosting his PBS drawing show, Charlie’s Pad.


Despite favorable response to the show, Johnson tired of his performance role as a TV drawing teacher. “It had totally exhausted any interest in being on TV. I didn’t care about it and I wasn’t interested in looking at myself on television. Being in front of the camera wasn’t creative to me. It wasn’t fulfilling.” But, he added, he has enjoyed working behind the camera and he has written screenplays for television and film projects such as the award-winning PBS film Booker.

On to a Doctorate in Philosophy. In addition to journalism and literature classes, Johnson attended “lots of philosophy courses” as an undergraduate at SIU. By his senior year, he said, “I was interested in writing and philosophy. I wanted to finish my undergraduate journalism degree and study philosophy.”

He was admitted in 1973 to the philosophy graduate program at the State University of New York at Stony Brook and received his PhD in philosophy in 1988. His emphasis evolved from the study of the philosophy of Marx and his adherents to the abstruse realms of existentialism, aesthetics and phenomenology.

Becoming a Novelist. Johnson also began more creative writing in undergraduate school with his mentor, the legendary SIU Professor John Gardner, an acclaimed novelist (Grendel and The Sunlight Dialogues) and critic (On Moral Fiction). Gardner worked closely with Johnson on his fiction writing and the elements of storytelling. The two became friends and kept in close touch until Gardner’s untimely death in a motorcycle accident in 1982.

In his years working with Gardner, Johnson produced six apprentice novels, all of which he discarded. His first published novel, Faith and the Good Thing, came out in 1974 to favorable reviews for its inventive storytelling and evocative prose. On reading Faith, his cartooning mentor and accomplished writer Lawrence Lariar wrote to Johnson and told him: “You have the touch.”

Critic Arthur P. Davis described Johnson’s groundbreaking novel Faith as “a fascinating mélange of classic philosophy, scholasticism, occult writings, folklore (including Southern superstition and Negro tall tales), surrealistic dreams, flashbacks, and down-to-earth realism.”

A Job at the UW. The University of Washington Department of English hired Johnson to teach creative writing and literature in 1976. He taught legions of grateful students at the UW for more than three decades.

As Johnson focused intently on academic affairs and on his students who admired his wide knowledge and caring instruction, his drawing also continued. He published cartoons widely in literary journals and magazines with themes from philosophy and current events to eastern religion and culture. Several of the cartoons on Buddhism appeared in a collection from Tricycle Press, Buddha Laughing.

Which comes first in cartooning—the idea or the image? In response to this question, Johnson immediately mentioned his admiration for Seattle-based editorial cartoonist and journalist David Horsey. “He's one of the best draftsmen I've ever seen and one of the best caricaturists. I compare his work to Mort Drucker who was spot on with every caricature he did for Mad magazine.”

Johnson and Horsey discussed cartooning, and both agreed that, “The most important thing for any editorial cartoon is a good idea to start with.”  Johnson added, “And the same thing goes for other forms of cartooning and commercial illustration. Now that idea might come to you as an image, but the idea is the starting point.”           

Beauty, Wonder and Mystery. In his fiction writing, Johnson has strived to create works of wonder and mystery, and he has encouraged his students to do the same. (See his guide to writing, The Way of the Writer: Reflections on the Art and Craft of Storytelling.)

Johnson still expresses some dismay that many of the writers and artists he met in his career have been “some of the most unhappy people I knew. If a person can create beauty, that means the person should be the happiest person in the world.”

 He added, “To me, if you can create beauty as a gift to others, what more do you want? No riches can surpass that gift.”

The Ongoing Work. Johnson emanates serenity and contentment in his busy retirement as he continues to write and juggle projects, speaking engagements, and requests for essays, interviews, and lectures. And he inevitably returns to his deluxe, glass-topped drawing table in his book-lined study to create more art.

After retirement from the UW, Johnson with his artist daughter Elisheba created and published three books in their series, The Adventures of Emery Jones, Boy Science Wonder. The books are illustrated by Johnson and recount the adventures of a curious and science-enthralled African American school kid. Beyond sharing compelling stories featuring an inquisitive and daring young boy, the books are an effort to spark young Black students to consider STEM (science, technology, engineering, and mathematics) studies.

In his 2020 book Grand: A Grandparent’s Wisdom for a Happy Life, Johnson offers a series of heartfelt essays to inform and inspire parents and grandparents alike, as well as the children in their lives. He dedicated the book to his grandson Emery, now age nine. He wrote, "I feel hopeful that Emery will come to appreciate the unpredictable serendipity of life when it goes against our plans and delivers delightful surprises."

Recent projects keep Johnson occupied. In the past few months, he has written a preface for the late Ralph Ellison’s novel Juneteenth; edited an anthology of stories and essays by Black Americans for the Chicago Quarterly Review; worked with author Steve Barnes on a graphic novel, The Eightfold Path, due out in January 2022; sold his papers, or literary archive, to Washington University in St. Louis; and is at work on his first collection of cartoons in forty-nine years, the new book for next September, All Your Racial Problems Will Soon End.

Striving to Unite, to Enlighten. Johnson observed that “we’re a very divided country now. We live in dangerous times.” He cited recent reports on an increase in hate crimes against African Americans and Asian Americans, now at the highest level in the last ten years. 

He lamented that many of his cartoons from the early seventies on race and bigotry are still timely. “Some people think that these are historic problems and that all have been settled. But we haven’t solved them. We haven't evolved in certain ways.” He suggested that America is far from a post-racial society.

Johnson sees the extreme polarization in our nation as harmful to everyone.  “For moral reasons and because I’m a Buddhist, I’m not going to use art to stoke division or feed hatred.” Instead, his work will advance a more just and compassionate society and celebrate the interconnectedness of all humans.

As the late poet and UW Professor Theodore Roethke wrote: “In a dark time, the eye begins to see.” Now, in our troubled time of division and unrest, Charles Johnson offers art to restore us, to illuminate the dark.


Robin Lindley is a Seattle attorney and writer, and features editor of the History News Network (historynewsnetwork.org).

Sat, 27 Nov 2021 21:22:14 +0000 https://historynewsnetwork.org/blog/154561 https://historynewsnetwork.org/blog/154561 0
What "Forget the Alamo" Forgets

Davy Crockett depicted in The Fall of the Alamo (1903), Robert Jenkins Onderdonk.



Each year before the pandemic, over two and one-half million people visited the Alamo in San Antonio, Texas. Widely portrayed as a shrine to freedom, it is the site of an early battle of the Texas War of Independence where Mexican troops, led by President and General Antonio López de Santa Anna, defeated and killed some 189—the exact figure is in dispute—independence fighters, including frontier icons Davy Crockett, Jim Bowie, and William Travis. “Remember the Alamo” became the revenge slogan that presumably drove other independence fighters led by Sam Houston six weeks later to definitively defeat Santa Anna’s troops at the Battle of San Jacinto and secure Texas independence.


The Alamo is at the center of the Texas creation story, purportedly a symbol of courage in the willingness to fight to the end for a noble cause, to secure freedom from Mexican tyranny.


Left unsaid, though, according to many historians, is that the real goal of Texas independence was to protect a slave system that was under threat of abolition by the Mexican authorities. They had abolished slavery in the rest of the republic in 1829 under President Vicente Guerrero, only reluctantly agreeing to an exemption for Texas that Stephen Austin had traveled to Mexico City to secure. If the Alamo is a shrine to freedom, it was freedom for the Texas slave owners to continue and expand their system.


Along comes Forget the Alamo by three Texas writers: Bryan Burrough, Chris Tomlinson, and Jason Stanford. They seek to set the record straight by lampooning the Texas foundation myth—hence the book title that mocks one of the central patriotic slogans in not only Texas but also American history. They show the centrality of slavery to the conflict and Mexico's opposition to it. Santa Anna spared the lives of the three slaves at the Alamo. They deflate the images of Bowie and Travis, who were slave owners, and Crockett, who surrendered rather than going down fighting. Most importantly, they show how the myth was embellished over the years until Hollywood got into the act with Walt Disney’s Davy Crockett television series in the 1950s and a movie starring John Wayne as Crockett in 1960.


They make the connection between a generation of faux coonskin cap-wearing boys, raised on the Alamo myth via Disney’s Crockett, who would later become American troops in Vietnam, thinking, at least initially, that they were somehow remembering the Alamo in fighting against a communist-led foe.


In some respects, their study of what Hollywood did with the Alamo story is reminiscent of the work of sociologist Jerry Lembcke, who has written about the distorted ways in which Hollywood films portrayed the Vietnam War. In both cases—the Alamo and Vietnam—Hollywood on balance embellished a patriotic myth which motivated American troops for future wars.


As with Lembcke’s work, the authors distinguish between the war and the memory of the war, a point also made by Viet Thanh Nguyen when he states that every war is fought twice, first on the battlefield and then in memory.


In the case of Texas, there is an ongoing memory struggle between those on the right who wish to keep the Anglo hero narrative myth intact and revisionist reinterpretations. Cross-cutting that is a division between how Anglos and Mexicans who live in Texas view the event.


The authors show how the prevailing Anglo heroic narrative has been used to oppress Mexican-American school children, who were taught to see themselves as the enemies of the freedom-fighting Texans. Their “revisionist” solution is problematic, though. They point out that there were Mexican residents of Texas, called Tejanos, who sought independence alongside the Anglos and that nine of them died at the Alamo. Ergo, Mexican schoolchildren now have their own Texas independence heroes. The problem with that liberal formulation though is that, while it is true that some Mexicans did side with the Anglos, it was a very small proportion of them.



Table. Anglo and Tejano Distributions in Texas in 1836

                 Population                Alamo Defender Deaths
               Number    Percent             Number    Percent
Anglos         30,000      89.6                 180      95.2
Tejanos         3,470      10.4                   9       4.8
Total          33,470     100.0                 189     100.0

Note: In addition to Anglos and Tejanos, there were 5,000 Black slaves and 14,500 Indians in the Texas population, none of whom died at the Alamo.

Source: Randolph B. Campbell, An Empire for Slavery: The Peculiar Institution in Texas, 1821-1865 (Baton Rouge: Louisiana State University Press, 1989).


The Tejano casualties at the Alamo were less than half the proportion of Tejanos in the Texas population. The disproportion is even greater when considering that Bexar, the former name of San Antonio, had the largest concentration of Tejanos in Texas. Quite clearly, the majority of them stayed away from the Alamo those fateful days.     


Tejanos who sided with the slave holders are given a pass from criticism in the book. Instead, the authors seem to celebrate them as Texas minorities to look up to. That seems to be for the authors a main corrective to the Anglo hero narrative.


To be fair, the authors do state that the Tejano independence fighters were motivated by wanting autonomy from Mexico City and not the preservation of the Anglo slave system. But does that really get them off the hook from being accomplices of slavery? It’s similar to maintaining that many Southerners were fighting for independence rather than slavery in the Civil War when the two were inextricably bound together in consequence.


Were the Tejanos who supported Texas independence admirable role models for Mexican-American schoolchildren today, as implied by the book, or a minority of Tejanos who collaborated with the Anglos and abetted their plan to establish a slave republic and, ultimately, the growth of American imperial power?


The authors do not cover other Tejanos who resisted independence, nor those who participated, along with escaped slaves and Indians, in guerrilla skirmishes afterwards against the new authorities.


Nor do the authors question Texas being taken from Mexico and becoming the stepping stone to taking the whole Southwest twelve years later when U.S. troops invaded Mexico in the Mexican-American War.


On balance, Forget the Alamo is well worth reading for what it gets right as well as what it adds about the continuing struggle over the Alamo narrative in Texas. But it is, ultimately, a book written within the constraints of an American unwillingness to fully deal with the historical reality that the United States forcibly stole Texas and the rest of the Southwest from Mexico in two wars that have had great continuing consequences for the futures of both countries.

Sat, 27 Nov 2021 21:22:14 +0000 https://historynewsnetwork.org/article/181762 https://historynewsnetwork.org/article/181762 0
The Roundup Top Ten for November 12, 2021

The Changing Same of U.S. History

by David Waldstreicher

Historians have returned to the question of whether the Constitution is the problem or the solution with renewed vigor and high stakes. Those accusing ideological rivals of "doing politics, not history" are not innocent of the same charge. 


White Supremacists Attacked Democracy and Have Thus Far Faced No Consequences

by Carol Anderson

Threatening to demolish the structure of government to preserve white supremacy is a time-honored American tradition, as is escaping consequences for doing so.



Have the University of Austin Founders Been in a Classroom Lately?

by Aaron R. Hanlon

Proposing a new college to fight campus illiberalism is a solution to a problem that doesn't really exist. Really. 



White Backlash is America's Most Destructive Habit

by John S. Huntington and Lawrence Glickman

The authors endorse the term "counterrevolution" for a repeated pattern of political mobilization among White Americans combining distrust of democracy, apocalyptic rhetoric about the effects of racial equality, and the endorsement of antidemocratic and violent means to halt change.



In Rittenhouse Trial, Language Matters

by Felicia Angeja Viator

Kyle Rittenhouse's trial evokes the 1943 "Zoot Suit Riots," when white vigilantes, including uniformed servicemen, beat Mexican American youth in Los Angeles and other cities. The courts contributed to exonerating the vigilantes by repeating the language of a moral panic that characterized the victims as "gangsters" and hoodlums.



UC Churns Through a Quarter of its Lecturers a Year. Like Me.

by Diane Mendoza Nevárez

"When UC treats lecturers as gig workers, they deny students access to the mentorship crucial for student retention and success."



When Politicians Like JD Vance Call Professors Like Me the Enemy, What's Really Going On?

by Benjamin Carter Hett

It's no longer politically expedient to attack minorities or immigrants; professors make a great substitute because their work is often a mystery to the public and they don't have the power to fight back. 



Black Veterans of the First World War are Often Overlooked

by Michelle Moyd

Nearly 638,000 African men fought in Africa and Europe. Some were conscripted by colonial powers and forced to fight or labor, and others hoped through service to stake claims to political rights. More global attention to their service and its relationship to colonialism is needed.



The Academy Museum Ignores Hollywood Labor History

by Andy Lewis

The Academy of Motion Picture Arts and Sciences was originally established to help studios negotiate contracts with the studio unions. Today, the on-set tragedy in New Mexico reminds that film production is an industry and workers make it run. The Academy Museum misses that part of the story.



Extremism Didn't Begin with Trump, and Won't End with Him Either

by Joseph Lowndes

Pat Buchanan never succeeded in winning the Republican nomination, but he did as much as anyone to shape the politics of grievance and the image of besieged white America that drives the party's base today. 


Sat, 27 Nov 2021 21:22:14 +0000 https://historynewsnetwork.org/article/181756 https://historynewsnetwork.org/article/181756 0
The Black Women Veterans of World War II Fought for More than the "Double V"

Members of the 6888th Central Postal Directory Battalion march in a parade honoring Jeanne d'Arc in the square where she was burned at the stake, Rouen, May 1945.



During World War II, Black newspapers rallied African-Americans behind the “Double Victory” campaign to fight the war against fascist oppression abroad as well as racial oppression at home. But the African-American women who served during this time also had a third enemy – the one that held them back because of their gender.


The Double Victory campaign was inspired by a letter published in the Pittsburgh Courier on January 31, 1942, entitled “Should I Sacrifice To Live ‘Half-American’?” In the letter, James G. Thompson explained that the first V was “for victory over our enemies from without” and the second V was “for victory over our enemies from within.” The paper would later proclaim “this slogan as the true battle cry of colored America.” The Pittsburgh Courier debuted its Double V logo the next week in the February 7 edition, and would continue to print it as part of its masthead for the remainder of the war.


Meanwhile, Massachusetts Congresswoman Edith Nourse Rogers was hard at work on a bill to create a women’s branch of the U.S. military. Many women had served as volunteers during World War I but were not eligible for veterans’ benefits since they had not been official members of the U.S. military. Rogers wanted to make sure that this did not happen again during this new war.


Once the Women’s Army Auxiliary Corps (WAAC, later WAC) was established, Mary McLeod Bethune went into action. Bethune was the head of the Negro division of the National Youth Administration as well as founder of what is now Bethune-Cookman University. She was also an advisor to four presidents, including President Franklin Roosevelt. She had seen how Black young people, especially Black women, struggled to find good jobs during the Great Depression of the 1930s. With her access to the highest reaches of the U.S. government, she used her influence to ensure that Black women would have meaningful opportunities within the WAAC, ones that would also position them to be eligible for those educational and professional veterans’ benefits.


In general, the women who joined the WAC were subjected to sexist assumptions about their virtues and ability to make meaningful contributions to the war effort as a part of the military. Male soldiers and officers were vocal about their beliefs that women had no place in the service. Inside and outside of the military, a common assumption was that these women had enlisted so that they could “service” the male service members. They were not issued weapons nor given any weapons training – even when ordered into hostile areas – because brandishing a gun was considered “unladylike.” However, the Black women who served had to deal with a unique set of challenges.


These challenges included being subjected to double segregation. Black men in the military were segregated only on the basis of their race. Black women were separated by their race and gender. To get into the WAC, a woman had to meet high standards of morality and femininity; white women might be able to loosen up a little once they were officially in. But Black women had to meet the highest of these standards at all times. Even when they continued to maintain those standards, they were more often than not perceived as only capable of performing domestic duties despite their educational and professional backgrounds.


The stripes that the Black female officers wore made them even more of a target. It enraged some members of the American public that a white soldier who held a lower rank was expected to salute these women and follow their orders. As a result, Black female officers were confronted by police who assumed they were imposters, or worse. One was hospitalized after being beaten on a train platform in Tennessee.


However, despite these unique challenges, Black female soldiers fought back in creative ways. They used the power of their personal connections and of the Black press to overturn discriminatory policies and practices, such as having the “Colored” signs removed from the mess hall tables during the first WAAC officer training class. They studied Army policies until they knew them backwards and forwards. So when Major Charity Adams responded “Over my dead body, sir” to a general’s threat to have her replaced after refusing his frivolous order, she had the documentation to back her up when he attempted to have her court-martialed. They even went as far as to stop work, effectively going on strike, when pigeonholed into intolerable working conditions at the Fort Devens Hospital in Massachusetts in 1945.


Despite these extra burdens, Black women who served in the WAAC/WAC during World War II went on to distinguish themselves. The 6888th Central Postal Directory Battalion was given the “impossible” task of getting the backlog of mail moving within six months. They cleared the backlog in three. Black female officers trained all of the Black women who enlisted in the corps. They showed the top brass what Black women were capable of achieving, if only given a chance.

Sat, 27 Nov 2021 21:22:14 +0000 https://historynewsnetwork.org/article/181689 https://historynewsnetwork.org/article/181689 0
Remember the Army's Role in the Pacific War: Important Then, Influential Afterward

Troops of the 185th Infantry, 40th Division, advance toward Japanese positions on Panay Island, Philippines, March 1945



In the Pacific theater during World War II, the American land war was fought primarily by the Army, though popular memory has focused almost exclusively on the comparatively smaller Marine Corps effort. The Corps made fifteen amphibious combat landings over the course of the entire war. In the spring of 1945, Lieutenant General Robert Eichelberger’s Eighth Army alone carried out thirty-five amphibious landings over a five-week period in the Philippines. At full strength, and at its largest size ever, the Marine Corps mobilized six combat divisions, comprising about a quarter million troops in theater, all of which were fully dependent on the Navy and the Army for logistical support since the Corps was designed to function as an expeditionary fighting force, not a self-sustaining military organization. The Army deployed twenty-one infantry and airborne divisions, plus several more regimental combat teams and tank battalions whose manpower equated to three or four more divisions. In addition, the Army handled enormous logistical, transportation, intelligence, medical and engineering responsibilities, not to mention aviation, since the Air Force was part of the Army in those days.

By the summer of 1945, 1,804,408 ground soldiers were serving somewhere in the Pacific or Asia. They were part of the third largest land force ever fielded in American history, behind only the European theater armies in World Wars I and II. Though soldiers comprised the main American ground force in the war against Japan, they sometimes felt like junior partners to the more famous Marines. “Out here, mention is seldom seen of the achievements of the Army ground troops,” Major General Oscar Griswold, a corps commander, wrote from the South Pacific to a colleague in the fall of 1943, “whereas the Marines are blown up to the skies.” To a generation of Americans reared in a mass media culture and for whom the notion of recognition, or credit, for a job well done was of crucial importance, this perception could be corrosive. Paul Fussell, a keenly insightful commentator on World War II American culture, and a combat veteran of the ground war in Europe, opined that for soldiers the ultimate purpose of their dangerous efforts was often the value they ascribed to “the distant, credulous home-town audience for whom one performs by means of letterpress rather than by one’s nearby equals who know what the real criteria are. That all-important home-town audience the troops never forgot.” 

With this ephemeral goal as the measuring stick of worth, the Army often came up short during the war with Japan, and certainly in posterity’s view of that war. For some this led to anger and resentment against Marines as glory hounds with an overactive publicity machine. “The Marines are so hopped up with their publicity–someone has to fight them and I’m the guy,” one junior officer huffed in a 1944 letter to his family. Another wrote wearily from Saipan, “Our men are getting awfully tired of reading about the exploits of the Marines out here. We have been able on many occasions to identify pictures of ‘Marines’ in action as being pictures of army troops. The standing joke now is that the Marines’ secret weapon is the Army.” This anger against a sister service–Marines and soldiers had much more in common than otherwise–was ultimately pointless and counterproductive. The Marine Corps comprised only five percent of the U.S. armed forces in World War II and yet Marines suffered ten percent of all American battle casualties, including over 19,000 fatalities, so the Corps more than earned its vaunted reputation for valor, a fact that most soldiers recognized. If history is to assign something like credit for the American ground victory in the Pacific/Asia theater, it properly belongs to both services, with the Army nonetheless playing a significantly larger role. By design, the Army did the vast majority of the planning, the supplying, the transporting, the engineering, the fighting and the dying to win a war whose end represented a tremendous American triumph as well as a disquieting harbinger with ominous undertones for the American future.

The unglamorous locales, from the jungles of New Guinea, Guadalcanal and Mindanao to the frozen valleys of Attu, the rocky caves of Biak and Peleliu, the ruined metropolitan blocks of Manila and Cebu City, the grassy hills of Guam and the mind-numbing ridges and peaks of Burma, carried with them a troubling whiff of clairvoyance. “This is the Pacific, WWII, all over again,” Stanley “Swede” Larsen, who served in the Second World War with the 25th Infantry Division, wrote from Vietnam in 1965 to one of his former World War II commanders. Now a general, Larsen saw “the same shortages, same malaria problems, personnel headaches, transportation bottlenecks etc. Little. . . could I have guessed exactly 20 years ago that we would go full circle and be back at the same game, in the same part of the world.”

Indeed, as Larsen indicated, the battlegrounds of the Pacific, and the war itself, hinted strongly at the patterns of succeeding history, especially for the Army which, as an institution, shaped much of that history. The fact that the Army bore the brunt of the ground fighting against Japan was by no means a singular occurrence. In fact, it was a true indicator of what was to come. Since World War II, the Army has not only done most of the ground fighting–with a strong assist from the Marine Corps–but it has done the vast majority of America’s fighting altogether. From Korea to Afghanistan, over 90 percent of American wartime casualties have been suffered by ground troops, most of whom were Army soldiers and most of whom were killed, wounded or captured somewhere in Asia. With the exception of brief expeditions in Grenada, Panama and Somalia, every subsequent war involving Army conventional forces has been fought somewhere on or near the Asian land mass. It is also worth noting that all of this happened at a time when the advent of nuclear weapons (another harbinger unleashed by the Pacific War) and enormous advances in technology were supposed to make the average ground soldier obsolete.

In the Pacific, the Army glimpsed so many more of its future challenges and trends. Examples abound. Soldiers sometimes fought as guerrilla warriors. In many instances, they mobilized local insurgents to fight the Japanese, just as Special Forces would later do in many places around the globe. In this shadow war fought largely in the Philippines and Burma, Americans often had to immerse themselves into exotic cultures, gain the trust of local leaders and think as a local, not a westerner, might. Similarly, Army soldiers, from New Caledonia to Okinawa, had to learn to develop productive relationships with a dizzying assortment of ethnic groups, tribes, and small countries, not dissimilar to the cross-cultural diplomacy of the Cold War era and the twenty first century.

Winston Churchill once famously quipped, “There is at least one thing worse than fighting with allies and that is to fight without them.” The last time the United States fought a war with no allies was in 1898 against Spain. In the Pacific, the Army fought alongside a variety of partners, most notably Australia, Britain and China. The Army’s relationship with the Australians and the British was at times surprisingly contentious, at least for such culturally similar, longstanding allies. The alliance with Chiang Kai-shek’s China stands as perhaps the most flawed and ill-fated in all of American history. The Americans, mainly through the person of Lieutenant General Joseph Stilwell, experienced the enormous frustration of latching themselves to a corrupt, repressive and maddeningly inefficient regime, one with a voracious appetite for Lend-Lease support but little inclination to fight the way the Americans wanted them to against the common enemy. For the Army, China would turn into a witch’s brew of cultural misunderstandings, byzantine politics, command-inspired backbiting, and willful self-deception (primarily on the part of the American government and general public). It was a frightening foreshadowing of future experiences with many other defective, and corruptive, American alliances, most notably the partnerships with South Vietnam and Afghanistan.

So, given the seminal importance of the Pacific War and the crucial role the Army played in fighting it, why does the Army’s leading role remain so overlooked and relatively obscure all these decades later? Why is the World War II Army known much better for helping take down Hitler rather than Hirohito? Cole Kingseed once argued persuasively that five factors contributed to the greater prominence of the Army’s war in Europe over the Pacific: the Germany-first strategy that dictated Allied priority of resources and thus the entire course of World War II; the maritime nature of the Pacific struggle, leading historians to a naval-dominated narrative; the cult of personality surrounding General Douglas MacArthur who, by his own design, absorbed almost all accolades to himself rather than to the soldiers who did the real fighting; unbalanced press coverage by correspondents who found Europe a far easier, and more hospitable, place from which to report than the wilds of the Pacific; and the troubling racial savagery that characterized the war from start to finish. I believe there were two other factors as well. The chaotic and tragic debacles of multiple early Allied defeats undoubtedly contributed to this obscurity. After all, Americans never suffered any defeat at the hands of the Germans to equate with the Japanese conquest of the Philippines in 1942. Moreover, the brutal, unfettered manner in which both sides fought the Pacific/Asia war hardly lends itself to a popular good versus evil narrative of the sort to capture posterity’s imagination, though unquestionably the United States attempted to adhere to some semblance of humane war-making far more than did its adversary. 
Perhaps when we look back at the Pacific War, and the unglamorous battles the Army fought over an area that spanned nearly one-third of the earth’s surface, we are actually looking into a mirror of sorts, and in that mirror we see ourselves a little too clearly, maybe even too close for our ultimate comfort.

Sat, 27 Nov 2021 21:22:14 +0000 https://historynewsnetwork.org/article/181687 https://historynewsnetwork.org/article/181687 0
Historically, Black Distrust of Police is About More than Acts of Violence

Protesters on 125th Street carry posters of NYPD Lt. Thomas Gilligan, who was not indicted after killing 15 year old James Powell in 1964. 

Police accounts claimed Powell was armed with a knife, an account believed by few Harlem residents. 



We’re supposed to be living in a time of racial reckoning. After the world watched George Floyd’s excruciating street execution, over and over, there emerged a groundswell of support for antiracist work throughout society in the spring and summer of 2020. Reforming policing and reining in its obvious excesses became one of the most immediate goals.

In the last place that George Floyd called home, a majority of Minneapolis City Council members initially pledged, loudly, to dismantle the Minneapolis Police Department, a pledge that ultimately shrank into a modest budget cut. Disappointed activists put the question of replacing the police department with a department of public safety to voters this Election Day, and it failed. A yearlong, bipartisan police reform effort in Congress collapsed, producing nothing at all. Various states and localities have implemented reforms, such as limiting chokeholds or making police complaints publicly available, almost singularly focused on reducing violence and death at the hands of police.

This makes perfect sense, especially given the prevalence of these deaths caught on video in recent years, and the stark horror of watching people lose their lives, often in circumstances that are questionable at best. However, policing’s racial problem is age-old and goes well beyond physical violence, as I document in my new study of structural racism in New York in the 1950s and 1960s, The Harlem Uprising: Segregation and Inequality in Postwar New York City. If we don’t understand the history, we miss the long legacy of distrust that people of color, particularly Black people, feel toward the police. And beginning and ending at beatings and shootings leaves us with a severely incomplete understanding of what is and has been wrong.

In July, a white NYPD lieutenant with nearly two decades on the job and well over a dozen commendations for outstanding police work, including four for disarming men with guns, shot and killed a fifteen-year-old Black boy. He said the youngster came at him with a knife, even after the officer announced himself, even after he shot the boy the first time. Witnesses disputed the officer’s account, and there was no video. A grand jury declined to indict the lieutenant, and the department ruled the shooting justified.

You may have heard about this one, but probably not, because it was in 1964. The boy was James Powell, Thomas Gilligan his killer. As for who believed this story, the New York Amsterdam News, the city’s oldest Black newspaper, wrote that “Nobody in Harlem does.” In 1964, that was understood to mean Black New Yorkers did not trust the NYPD’s version of events. They had little reason to.

The NYPD in the mid-1960s was thoroughly corrupt, racist and violent. Officers, individually and collectively, reserved their worst behavior for the city’s segregated Black neighborhoods, like Harlem, Bedford-Stuyvesant and Brownsville. Taking bribes from building contractors, tow truck drivers, funeral directors and defense lawyers was as much a part of the job as clocking in. They ran protection rackets on sex traffickers and heroin dealers, generating substantial weekly payments that would be split among the men working these patrols, and required local businesses to pay smaller amounts, which would be divvied up and spread around the local precinct. Anyone skeptical should read the Knapp Commission’s report.

Police in the city treated Black neighborhoods, especially Harlem, as crime reservations, containment areas for drugs, gambling and sex work. It’s not simply that the NYPD, from the rank and file up into the executive corps, permitted these illicit industries to exist. Instead, they worked hard to make sure these outlets of misery and ruin flourished, both out of selfish financial motives and racial spite, given that the NYPD was 95 percent white. Multiple memoirs from men on the force during this time establish that they worked with “a deep sea of racism and bitterness, poison and untamed cruelty in our souls.”

And yes, the police were also violent, ranging from arbitrary roughness on the street, to sadistic squad room beatings, to shootings. Of course Black New Yorkers opposed this. But they also wanted safer communities, which the NYPD assiduously denied them. They had to live with all manner of random street crime, especially that which is concomitant with an impoverished neighborhood rife with addiction. More policing and less police violence and discourtesy are not mutually exclusive, or at least they should not be.

Some of this behavior has changed. It is much more difficult for police departments today to be so outwardly corrupt that they effectively act as untouchable criminal organizations. But many aspects remain. Police departments across the country have long used drivers as ATMs to fund local budgets, frequently targeting Black motorists for heightened enforcement of minor traffic laws. Not only do these stops engender bitterness, but the resulting fines also threaten to financially ruin people already living on the edge, and too often escalate into totally unnecessary shows of force. Ticket and arrest quotas demand that people be brought into the criminal justice system without good reason.

The racist assumption that Black and Brown men are more likely to be up to no good led to the explosion of stop-and-frisk in New York City, a practice the state approved in 1965. For decades, police practiced on-street searches of anyone who aroused an officer’s suspicion, whether through “furtive movements” or simply by being in a high-crime location. The number of these searches peaked at 685,724 in 2011. Since 2014, the department has instructed officers to detain people only when they have a strong suspicion of criminal activity, and the numbers have dropped into the five-figure range.

For the last two decades, ninety percent of those the officers detained were Black or Latinx, though the city is about 45 percent white. Ninety percent of those stopped were released, having been found neither engaging in nor possessing anything illegal. Put another way, the NYPD was stopping and frisking tens to hundreds of thousands of people, every year, who were doing nothing illegal. There is no evidence the practice measurably reduces crime.

The police also lie, from the smallest to the direst of matters, and courts side disproportionately with an officer’s word. Officer Michael Slager of North Charleston, South Carolina, said he feared for his life because Walter Scott had taken his Taser and he “felt threatened.” In reality, Slager had tased Scott after a physical altercation; Scott then ran away, unarmed, and Slager shot him in the back five times, killing him. But again, this is just a recent example of a very old practice, and perhaps one of the worst examples of a “cover charge,” or a false accusation after the fact that permits officers to brutalize and arrest or kill someone. If the victim survives and complains, who believes an alleged felon?

Policing has been in need of reform since its inception. Changes have come, but they’ve been slow, and significant numbers of people, disproportionately Black and Brown, have experienced and continue to experience discourtesy, bigotry, false allegations, and violence. The outrage we saw explode in 2020 shouldn’t have surprised anyone. If anything, we should be surprised the streets of America had been so quiet before.

People of color, and Black people in particular, have always experienced justice differently in this country. Distrust toward the police is multigenerational, and not the result of one violent act, or Marxist agitation, or exposure to rap lyrics. Until we fundamentally reimagine what policing looks like in this country, this situation seems unlikely to change.

Sat, 27 Nov 2021 21:22:14 +0000 https://historynewsnetwork.org/article/181688 https://historynewsnetwork.org/article/181688 0
Citizen's Arrest Law at Center of Trial of Arbery's Killers Originated in Slavery

A mural painted on the African American Cultural Center of Brunswick, Georgia honors Ahmaud Arbery.



Jury selection has begun for the trial of Gregory and Travis McMichael and William "Roddie" Bryan Jr., the three white men accused of murdering Ahmaud Arbery, an unarmed twenty-five-year-old African American man, in February 2020 outside of Brunswick, Georgia. The three face multiple charges, including malice murder and felony murder. According to the Georgia State Legal Code, a conviction for malice murder requires murderous intent or forethought. A conviction on the charge of felony murder means the murder was committed, whether intentionally or not, during the course of another crime. The penalty for both crimes can be death, imprisonment for life without parole, or imprisonment for life with the possibility of parole.


The defense is expected to argue that the assault on Ahmaud Arbery was legitimate under Georgia’s Citizens Arrest law, which was applicable at the time, and that Arbery’s death was caused by his physical resistance to a legal action and was therefore self-defense on the part of the three white men.


Arbery was jogging on a rural road when the armed white men approached him in two vehicles. They claim his behavior was suspicious, but they had not witnessed any criminal behavior on his part, nor had they notified the police prior to stopping him, which should negate a Citizen’s Arrest defense, unless, of course, jogging while Black is itself legal grounds for a Black man to be stopped by white vigilantes. Without the claim of Citizen’s Arrest, the three men were simply assaulting Arbery, and he was the one with a legitimate claim of self-defense, not them. A thousand potential jurors from Glynn County in coastal Georgia were summoned for questioning by defense and prosecution attorneys. It remains to be seen whether a Georgia panel that includes white jurors will convict three white men of murdering an African American.


The Georgia Citizen’s Arrest law, which was repealed in May 2021, dated to 1863 during the American Civil War. Its last iteration, passed by the Georgia legislature in 2010 and signed into law, stated “A private person may arrest an offender if the offense is committed in his presence or within his immediate knowledge. If the offense is a felony and the offender is escaping or attempting to escape, a private person may arrest him upon reasonable and probable grounds of suspicion.” A private individual who makes a “citizen’s arrest” is instructed “without any unnecessary delay” to “take the person arrested before a judicial officer . . . or deliver the person and all effects removed from him to a peace officer of this state.” Nearly every state in the United States currently permits citizen arrests in one form or another.


Georgia’s state laws were formally codified in 1861 by Thomas Cobb, a lawyer and slaveholder. It was the first formal codification of state common law in the United States. In the original code, African Americans were assumed to be enslaved unless they could prove their free status. Citizen’s Arrest statutes were added to the Law Code of Georgia in 1863. The Law Code of Georgia has been revised a number of times during the last 150-plus years; however, the Civil War-era Citizen’s Arrest provision remained in effect until after Arbery’s death.


The idea of citizen’s arrest was imported to the British North American colonies along with English common law and probably dates back to the European Middle Ages and feudalism. According to the 1285 Statute of Winchester, private citizens were permitted to arrest someone who committed a crime and were, when alerted, required to assist in the apprehension of someone suspected of criminal behavior. Starting at the end of the 18th century, British and American courts and legislatures established that a private citizen who carried out a citizen’s arrest was legally at fault and liable for damages if it was demonstrated that the person seized was not actually engaged in a criminal act.

Citizen’s arrest in the United States has a long history and a deep connection to the enslavement of African Americans. In the thirteen British colonies that became the United States, the idea of citizen’s arrest frequently accompanied mob violence and the murder of the accused, particularly enslaved Africans. In 1712 and 1741 in New York City, enslaved Africans suspected of plotting rebellion were publicly and gruesomely executed. After the 1739 Stono Rebellion by between 50 and 100 enslaved Africans in coastal South Carolina, the rebels were summarily executed and South Carolina passed new restrictions to control the Black population. Earlier, fearing the possibility of slave uprisings, the South Carolina assembly had approved legislation requiring all white men to carry firearms when they attended church on Sundays. Any adult white male who did not comply with the law was fined. White vigilantism and restrictive laws generally increased after suspected or actual slave plots.

In a precursor to citizen’s arrest laws, fugitive slave notices in newspapers across the British colonies offered rewards to private bounty hunters for the capture and return of escaped enslaved Africans. In 1769, Thomas Jefferson, the principal author of the Declaration of Independence and a future President of the United States, advertised in the Virginia Gazette for the capture and return of a runaway from Albemarle, “a Mulatto slave called Sandy, about 35 years of age.” Jefferson offered a reward of 40 shillings if Sandy was captured locally, 4 pounds if he was taken elsewhere in Virginia, and 10 pounds if he had made it to another colony.


In 1776, one of the accusations made against the King of England in the Declaration of Independence was that he had “excited domestic insurrections amongst us,” a reference that included the November 1775 decree by Virginia Governor Lord Dunmore promising freedom to any enslaved African who supported British efforts to suppress the rebellion.


Article IV, Section 2, Clause 3 of the Constitution of the United States established the legal basis for anti-African, anti-slave vigilantism and violence, declaring that “No Person held to Service or Labour in one State, under the Laws thereof, escaping into another, shall, in Consequence of any Law or Regulation therein, be discharged from such Service or Labour, but shall be delivered up on Claim of the Party to whom such Service or Labour may be due.” The first federal Fugitive Slave Law, in 1793, authorized local governments, but not federal authorities, to capture and return to slaveholders any self-emancipated freedom seekers who had escaped from bondage. Local authorities were also empowered to fine anyone who assisted escapees. Under this act, Frederick Kitt, acting as the agent for President George Washington, posted fugitive slave notices in a number of newspapers, offering a ten dollar reward for the capture and return of “ONEY JUDGE, a light Mulatto girl, much freckled,” who had “absconded from the household of the President of the United States.”

In 1842 the United States Supreme Court unanimously upheld the constitutionality of the fugitive slave act and overturned charges in Pennsylvania against Edward Prigg, a slavecatcher convicted of kidnapping and returning to slavery a woman who had earlier escaped from enslavement in Maryland. Because enforcement of the 1793 law was left to local authorities, it was rarely implemented in Northern states that had abolished slavery. The 1850 Fugitive Slave Act required Northerners, ostensibly including free Blacks, to aid in the recapture of escaped slaves or face heavy fines, and denied freedom seekers the right to due process. Both laws remained in effect until they were repealed by Congress in 1864 as a Civil War measure.

When the citizen’s arrest provision was added to the Georgia Law Code in 1863, slavery and law enforcement in Georgia were in serious disarray. Georgia units in the Confederate army were primarily stationed in Virginia and enslaved Africans were fleeing plantations to join approaching Union forces. The code revision essentially empowered any white Georgian to take steps to keep the enslaved Black population under control.

Southern citizen’s arrest vigilantism grew even worse after the end of the Civil War as former slaveholders and the white population in general tried to terrorize African Americans into accepting a new form of subservience. Citizen’s arrest supported Ku Klux Klan violence against Black Georgians, and in 1868 there were over 300 reported cases of the Klan murdering or attempting to murder Georgia’s Black citizens. The Fulton County Remembrance Coalition lists almost 600 documented lynchings in the state of Georgia between 1877 and 1950.

Ashley M. Jones, the new poet laureate of Alabama, has a couplet in her poem, All Y’all Really from Alabama, that powerfully captures the racism embedded in citizen’s arrest and much of the American legal system: “We hold these truths like dark snuff in our jaw, Black oppression’s not happenstance; it’s law.”

Sat, 27 Nov 2021 21:22:14 +0000 https://historynewsnetwork.org/article/181690 https://historynewsnetwork.org/article/181690 0
Writing a Classic: Richard Tregaskis and "Guadalcanal Diary"




For two months in the summer of 1942, Richard Tregaskis, a young correspondent with the International News Service, had toiled away in the Southwest Pacific to report on the news from a little-known island in the Solomons named Guadalcanal. Tregaskis had joined the approximately 11,000 men of the First Marine Division who stormed the beaches on August 7, 1942, to seize the island from the Japanese. 


Tregaskis was no stranger to combat at this point in his career and had always been eager to be close to where American forces were fighting, serving as an embedded reporter long before the term came into use. He watched from the deck of a U.S. Navy cruiser as Lieutenant Colonel James Doolittle’s B-25B Mitchell bombers took off from the carrier USS Hornet to bomb Tokyo. Later, he was on the Hornet to witness its dive-bombers and torpedo planes, several of which did not return, hurtle off the ship’s flight deck on their way to attack the Japanese fleet during the critical Battle of Midway.


The Guadalcanal landing marked America’s first use of ground troops in a major offensive against the Japanese Empire. Tregaskis’s dedication to his job during his time on Guadalcanal impressed the marines’ commander, General Alexander Vandegrift. The general recalled that Tregaskis, one of only two reporters with the marines during their first uncertain weeks on the island, seemed to be everywhere, and the information he acquired was “factual and not a canned hand-out.” 


Vandegrift especially remembered that during the height of the fighting for what came to be known as Edson’s Ridge, he could hear through the darkness the sound of a typewriter clacking away. “I asked who could be writing at this time when he could not possibly see the paper,” noted Vandegrift. “Dick spoke up, ‘It’s me, General, I want to get this down while I am still able. Don’t worry about my seeing, I am using the touch system.’”


Tregaskis turned his experiences of the often-hellish fighting on the island into a best-selling book, Guadalcanal Diary. Still in print today, the book is one of the best of its kind by modern war reporters for its ability to capture in print a ground’s-eye view of combat and its debilitating effect on the marines. Tregaskis endured the same dangers faced by the troops, including withstanding bombing by Japanese aircraft during the day and shelling from their navy—dubbed the “Tokyo Express”—most nights. The marines also had to deal with inadequate supplies of food and equipment, and the constant fear of being overrun by a single-minded foe. 


All these hardships were matched by the difficulties of fighting on the island itself—an often impenetrable jungle that limited vision to just a few yards, jagged mountains climbing to a height of 8,000 feet above sea level, sharp-bladed kunai grass, pesky and venomous insects, dangerous crocodiles, screaming birds, swarms of mosquitos that brought with them tropical maladies that could incapacitate a man for weeks or months, nauseating odors, and hot, humid conditions that bred all sorts of funguses and infections.


Tregaskis’s manuscript outlining his time on Guadalcanal in an easily understood diary format arrived at INS offices at 235 East Forty-Fifth Street, New York, without fanfare in early November 1942. It had made quite a journey. The pages had been sent via airmail from Fleet Headquarters in Pearl Harbor, Hawaii.


Barry Faris, INS editor-in-chief, wrote Tregaskis that he had turned the manuscript over to Ward Greene, executive editor of King Features, owned and operated, as was INS, by newspaper publisher William Randolph Hearst. Faris told his reporter that Greene would make every attempt to get Tregaskis’s manuscript accepted by a book publisher and subsequently serialized in magazines. “I did not have a chance to read it thoroughly as I would have liked,” Faris informed Tregaskis, who would be splitting the revenue from the book fifty-fifty with his employer, “but from what I did see I think you did a magnificent job on it.” 


One person who did take the time to read Tregaskis’s writing from beginning to end was Bennett Cerf, cofounder with his friend Donald Klopfer of the New York publishing firm Random House. Greene had distributed copies of the manuscript to nine publishers and asked them to bid for the opportunity to publish the book, a method “that had never been done before,” Cerf noted. 


Just the day before he received Tregaskis’s text, Cerf had been telling his colleagues that the first book to come out about Guadalcanal would “be a knockout because Guadalcanal marked the turning of the tide” in the war in the Pacific, which had been going badly for the Allies since the Japanese had bombed the American fleet at Pearl Harbor in the Hawaiian Islands on December 7, 1941. As the publisher noted, “the dictators were ready and the liberty-loving people were caught unprepared.”


Cerf received the manuscript from King Features on November 11, took it home with him, read it that night, called Greene at nine the next morning, and told him: “I’ve got to have this book.” A pleased Cerf related years later that Random House had signed up to publish the young reporter’s work before “any of the other eight publishers had even started reading it.” 


Cerf’s premonition that the American public would be interested in learning more about the marines and their pitched battles with the enemy on a remote island thousands of miles away turned out to be accurate. Rushed into print on January 18, 1943, Guadalcanal Diary became a best seller and the first Random House book to sell more than a hundred thousand copies. Critic John Chamberlain of the New York Times wrote that Tregaskis’s book served as “a tonic for the war-weary on the homefront,” showing, as it did, to the Japanese and those who doubted America’s resolve, that a country “doesn’t necessarily have to love war in order to fight it.”


During his time with the marines on Guadalcanal, Tregaskis carried pocket notebooks in which he wrote down what he had seen and experienced. Once he had filled a notebook, he would transfer the information nightly into a black, gilt-edged diary. “The theory and practice was that I could get all the details I needed by referring to the notebook number, 1, or 3, or 4 when and if I could later get to writing a book from my notes,” Tregaskis recalled.


After leaving Guadalcanal via B-17 Flying Fortress on September 25, Tregaskis started writing his book in Noumea, New Caledonia, while waiting for a military transport plane to take him on to Honolulu. Once he landed in Honolulu, he had to do his writing in the navy offices at Pearl Harbor, going there every morning, working under the censor’s gaze, and watching as his diary was locked in a safe every night. “And as fast as I could write my manuscript, a naval intelligence officer took my efforts and hacked away with a pencil and a pair of scissors,” Tregaskis reported. “That was the way it was with sharp-eyed military censorship in those days.” 


Although a likeable fellow personally, the censor Tregaskis worked with was “stiff as a porcupine when it came to his official duties. He even chopped out a mention of the fact that the Japanese camps usually had a sweetish smell. He apparently felt that if they read my story the enemy might start using a deodorant as a kind of camouflage.”


Unfortunately, Tregaskis never got back the black, gilt-edged diary from naval authorities, and he could not find out what happened to it. He did manage to keep some of the pocket notebooks. “They are vastly detailed and any comparison of them with the final text of Guadalcanal Diary will show that there are 20 or 40 or 50 facts in this kind of notebook for one which survives into print,” Tregaskis noted. (For his later works, including his book Vietnam Diary, published in 1963, Tregaskis evolved a simpler system—all his notes were written into one large diary book. One benefit of doing so, he said, was that, because of its size, it was hard to misplace.)


Following its publication, Guadalcanal Diary made a steady climb up the best-seller charts, reaching, the publishing company’s advertisements were quick to report, the number-one position on lists compiled by the New York Times and New York Herald Tribune. Sales of the book, which cost $2.50, were boosted by positive reviews from critics across the country, who praised Tregaskis not for his literary flair, but for his factual and honest reporting about what the marines faced in the Solomons.


Years after the war ended, Tregaskis could boast to a friend that his classic book of war reporting had sold more than three million copies, counting all editions, and had been translated into twelve languages, including Japanese, Chinese, Spanish, French, and Danish. Its continued popularity bolsters Tregaskis’s belief that among the American ideals, “courage remains the most valuable of all.”

Sat, 27 Nov 2021 21:22:14 +0000 https://historynewsnetwork.org/article/181691 https://historynewsnetwork.org/article/181691 0
The Genesis of US Corporations’ Political Dominance

Good morning, HNN!

Have you ever wondered how elected officials became so beholden to money? Well, look no further than the Powell Memo, a confidential document from 1971 that inspired corporations to take over US politics.

You can also listen to the episode on Spotify and Apple Podcasts, and watch it on Instagram.

Today’s story comes from We, the Corporations by Adam Winkler, and Dark Money by Jane Mayer. Have you read them? I find both positively stunning.

Next time, on the Season 3 finale of Skipped History...

We’ll examine the family foundation primarily responsible for undermining election integrity today. (I actually wrote about the foundation last week for paying subscribers to the Skipped History newsletter, which inspired me to make our finale on the same subject.)

See you then!


Sat, 27 Nov 2021 21:22:14 +0000 https://historynewsnetwork.org/blog/154559 https://historynewsnetwork.org/blog/154559 0
Tom Standage on his Brief History of Motion



Tom Standage is the author of six history books, including A History of the World in 6 Glasses, The Victorian Internet (a history of telegraphy) and An Edible History of Humanity. His latest book, A Brief History of Motion, charts the rise of personal transportation, from the introduction of the wheel (around 3500 BCE) to the 21st century developments of electric cars, autonomous vehicles and services like Uber.

In addition to his book writing, Standage, who lives in London, serves as a deputy editor of The Economist. HNN discussed with Standage his views on technology, the writing of history and the global push to switch to electric vehicles.


Q. Your previous books, including The Victorian Internet and A History of the World in 6 Glasses, have focused on new technologies as drivers of societal change. Your histories differ from more traditional historical accounts which place greater emphasis on political change (e.g. a new prime minister or president).  Do you think that technological change is undervalued in most traditional histories of European and North American nations?

Yes, I think it is undervalued. My specific interest is the social impact of technology, and how people react to it, so this is a form of social history, with a focus on how the adoption of technology can cause bottom-up change. In that sense it’s an antidote of sorts to the top-down, “great man” view of history. But there’s then the danger of falling into the trap of technological determinism: the notion that technology is the only thing that drives societal change. In my work I try to show the interaction between “technology push” and “societal pull” — between what new technologies can offer, and what people actually want — because new things only take hold if they meet a genuine need or align with a wider societal shift. So I think you need to consider both the technology, and the prevailing social and political environment, rather than focusing on just one or the other.

Q. Your new book, A Brief History of Motion, begins with the development of the wheel around 3500 BCE and continues through contemporary automobiles. You limit your account to “personal transportation,” including the horse, carriages and the rise of the automobile. You ignored sea and air transportation. Why did you narrow your focus?  

I’m mostly interested in the impact of the automobile, and the way it reshaped the world during the 20th century. To understand why the automobile was embraced so rapidly, particularly in the United States, it’s important to understand the context in which it appeared, and the earlier development of wheeled vehicles such as carriages, steam trains and bicycles. The car promised to combine their best elements: it offered the speed of a train, could travel on existing roads like a horse-drawn vehicle, and granted the personal freedom of spontaneous travel, like a bicycle. We can only understand what happened with the car, and what might happen next, if we examine those earlier technologies. Hence my focus on wheeled vehicles, and why I ignored sea and air transportation.

Q. You are currently a deputy editor of The Economist. How do you balance this with writing books? Are your journalistic duties complementary to your book research?  

I find they fit together very well. In my work as a journalist I focus on future trends and edge cases where we can see aspects of the future in the present. As an author I focus on patterns in history, and what they can tell us about the present and the future. So the two approaches are very complementary. I have less time than I used to for writing books, though, so the gap between my books keeps getting longer. It’s up to eight years now, whereas my first three books appeared within the space of five years.

Q.  Unlike many academic historians who confine their writing to past events, you are not afraid to make predictions about the future.  In a chapter titled “The Road Ahead,” you discuss the concept of “peak car” and you state “the enthusiasm for (personal ownership) of cars is finally waning.” You go on to state that the Coronavirus pandemic has discouraged public transit use but that it is “unlikely to herald a global boom in car sales.” This has not proven to be the case. In New York State in 2020, car registrations rose 18% over 2019. Why did this happen and do you still think the public’s desire to own cars will shrink in the future?

Yes, I think the pandemic-era rise in driving, which is driven in large part by an understandable reluctance to use public transport, will prove to be temporary, and in the medium term people will drive less, not more, as more people spend more time working from home. The longer-term trends are very clear: the number of miles driven per vehicle, and per person, each year is declining, even in the US, where both peaked in 2004. The fraction of people with driving licenses is declining in all age groups. Car ownership is becoming more expensive and less convenient, and the alternatives to car ownership, at least for people who live in cities, are becoming steadily more attractive as smartphones allow us to combine public transport, ride-hailing, bike rental and so on into an “internet of motion”. Smartphones can also directly replace the use of cars for shopping, meeting friends, going out to get food, and so on. Cars will not go away, but car ownership will, I think, make less sense for many people in future.

Q. In the U.S., President Biden has announced a goal of having 50% of all new cars sold by 2030 be electric vehicles. In the U.K. Prime Minister Boris Johnson has set a more ambitious goal: to ban the sales of all gasoline and diesel cars by 2030. Is this a worthwhile goal and do you think the U.K. can meet it?

Yes, absolutely. EV sales have gone from nowhere five years ago to about 10% of vehicles sold in Britain in 2021, and 16% if you include plug-in hybrids. I myself bought a plug-in hybrid last year, with a battery-only range of 40 miles, so it’s essentially an electric car except on longer trips out of town. The big challenge, both in Britain and America, and indeed elsewhere, is improving the infrastructure so that there are enough charging points. This is why I didn’t buy a pure EV: for longer trips finding a charging point is still very hit-and-miss. I always look for one when I’m in another city, and often they’re occupied, or don’t work, or aren’t compatible with the charging networks I belong to. This area needs massive investment, and it will create jobs, so it’s something politicians are looking at on both sides of the Atlantic.

Q. Can you share with HNN readers any new projects that you are working on? 

I always like to have something cooking, as it were, and lately I’ve been digging into another area of interest, which is the Scientific Revolution of the 1660s, and its relationship to the history of medicine. At the time medicine was extremely unscientific, based on the ancient theory of humors, and germ theory did not take hold for another two centuries. So I’ve been digging into that and suspect there may be a book in there somewhere. But, at this rate, not until about 2031!

Sat, 27 Nov 2021 21:22:14 +0000 https://historynewsnetwork.org/article/181692 https://historynewsnetwork.org/article/181692 0
The Roundup Top Ten for November 5, 2021

Another Buffalo Was Possible

by Keeanga-Yamahtta Taylor

India Walton seemed on track to become the first Black woman mayor in Buffalo, and the first socialist to lead a major city in decades. The sitting mayor rallied to defeat her, but we should still consider the possibility of more liberatory politics. 


The Untold Story of the World's Biggest Nuclear Bomb

by Alex Wellerstein

Read a detailed account of the moment in the Cold War when the United States and Soviet Union contemplated, then developed and tested, nuclear weapons of horrifying power. 



The Attack on University of Florida Professors is Totalitarian

by Silke-Maria Weineck

Even beyond academic freedom, events in Florida signal the effort to tie the interests of the state's universities to the agenda of the state's ruling party. 



Right-Wing Ideologues Turn Aggressors Into Victims

by Waitman Wade Beorn

"Allowing the right to weave pernicious counternarratives and to create saints from sinners will only embolden future Ashli Babbitts and spawn more violence. "



Why are Medieval Weapons at the Center of a Supreme Court Case?

by Jennifer Tucker

The history and traditions of English law inform American judicial interpretation today, including efforts to discern the functional meaning of the Second Amendment. A group of historians has briefed the Court that restricting dangerous weapons in public is long-established. 



Work Requirements Would Undo A Signature Biden Accomplishment

by Molly Michelmore

An expanded Child Tax Credit would potentially reduce child poverty by 40%. Placing work requirements on the credit would harm children for the sake of the historic pattern of policing the line between the deserving and undeserving poor. 



Fannie Lou Hamer's Leadership Shows We Can't Separate Civil Rights and Economic Justice

by Keisha N. Blain

The author of a new biography of the Mississippi Freedom Democrat argues that Hamer's legacy shows that inequality erodes both civil rights and democracy. 



Laugh at Parodies of School Board Meetings, but Take Local Politics Seriously

by Lily Geismer and Eitan D. Hersh

Local politics – if it involves a wide spectrum of community opinion – can help override partisan polarization, create new coalitions, and empower citizens to make meaningful change.



It's not Just the Missionaries: Haiti had 782 Kidnappings This Year

by Cécile Accilien

"The kidnapping business is in fact supported by the convergence of interests of the political and business elite and the international community, while the interests of the vast majority of Haitians are obviously not taken into account."



How Academia Laid the Groundwork for Redlining

by Todd Michney and LaDale Winling

Richard T. Ely and his student Ernest McKinley Fisher pushed the National Association of Real Estate Boards to adopt "the unsupported hypothesis that Black people's very presence inexorably lowered property values," tying the private real estate industry to racial segregation. 


Sat, 27 Nov 2021 21:22:14 +0000 https://historynewsnetwork.org/article/181684 https://historynewsnetwork.org/article/181684 0