A. Philip Randolph Lambasts the Old Crowd

“The New Negro” is one of the oldest, longest serving, and most fascinating concepts in the history of African American culture. Expansive and elastic, capable of morphing and absorbing new content as circumstances demand, contested and fraught, it assumes an astonishingly broad array of ideological guises, some diametrically opposed to others. The New Negro functions as a trope of declaration, proclamation, conjuration, and desperation, a figure of speech reflecting deep anguish and despair, a cry of the disheartened for salvation, for renewal, for equal rights.

While most of us first encounter the “New Negro” as the title of the seminal anthology that Alain Locke published in 1925, at the height of the Harlem Renaissance, we find the origin of the term — so closely associated by scholars and students alike with the multihued cacophony of the Jazz Age — actually goes back to 1887, four years after the U.S. Supreme Court had voided the liberal and forward-looking Civil Rights Act of 1875, thereby declaring with ultimate finality the end of Reconstruction. This was a time of great despair in the African American community, especially among the elite, educated, middle and upper middle classes. How to fight back? How to regain the race’s footing on the path to full and equal citizenship? This is how and when the “New Negro” was born, in an attempt to find a way around the mountain of racist stereotypes being drawn upon to justify the deprivation of Black civil rights, the disenfranchisement of Black men, and the formalization of Jim Crow segregation, all leading to the onset of a period of “second-class citizenship” that would last for many decades to come — far longer than any of the first New Negroes could have imagined.

One member of the New Negro coalition was A. Philip Randolph (1889-1979), a Black socialist and labor organizer hailed by Martin Luther King Jr. as “truly the Dean of Negro Leaders.” Immediately attracted to socialism as the best means of addressing the systemic exploitation of Black workers, he joined the Socialist Party with Columbia University student Chandler Owen. The two started giving soapbox speeches and founded the socialist Messenger magazine in 1917, which they proclaimed offered readers “the only magazine of scientific radicalism in the world published by Negroes!” Even before founding the Brotherhood of Sleeping Car Porters and Maids in 1925 and making the Messenger the union’s official organ, Randolph used the idea of the New Negro repeatedly as a call to action. Indeed, the Messenger cast out the previous era’s New Negroes as old and unable to address the crisis of the Red Summer. 

In the September 1919 issue, the Messenger included a half-page satirical political cartoon, “Following the Advice of the ‘Old Crowd’ Negroes,” that featured Du Bois, Robert Russa Moton (the successor to Booker T. Washington at the Tuskegee Institute after Washington died in 1915), and Emmett Jay Scott, secretary to Moton and a close adviser of Washington. On the left side of the cartoon a white man in military uniform leads a jeering, torch-carrying mob and wields a club to attack an already bloodied Black person who struggles to rise from the ground. Another bloodied Black victim sits propped up at the base of the Statue of Liberty. On the night of July 18, 1919, in Washington, DC, over one hundred white servicemen, a “mob in uniform,” wielded pipes, clubs, rocks in handkerchiefs, and pistols as they attacked Black people they saw on the street. The Black historian Carter G. Woodson, in fact, fled the mob on foot.

In the cartoon, three august “Old Negroes” propose accommodationist responses to the violence. A seated Du Bois implores, “Close ranks; let us forget our grievances,” a reference to his famous Crisis editorial the previous year urging Black readers to support World War I. Beside him, with hands clasped, stands Moton, who urges, “Be modest and unassuming!” Scott, reaching back to Moton, says, “When they smite thee on one cheek—turn the other.”

A cartoon from The Messenger, 1919. 

“The ‘New Crowd Negro’ Making America Safe for Himself” features a clearly younger New Negro veteran in a speeding roadster — labeled “The New Negro” — equipped with guns firing in the front and sides, and displaying a banner commemorating infamous 1919 sites of race riots: “Longview, Texas, Washington, D.C., Chicago, ILL.—?” As he fires at the fleeing white mob, a fallen member of which is in uniform, he declares, “Since the government won’t stop mob violence I’ll take a hand.” In the clouds of smoke appears the caption “Giving the ‘Hun’ a dose of his own medicine.” Above, the editors quote Woodrow Wilson’s April 1918 Great War rallying cry against Germany: “Force, force to the utmost—force without stint or limit!” Clearly, socialism, for Randolph, offered New Negroes the organizational fighting power Black people needed to fend off the most symbolically treacherous of all white mob attacks — those by U.S. military servicemen in uniform.

A cartoon from The Messenger, 1919.

In “Who’s Who: A New Crowd—A New Negro,” published in the May-June 1919 issue of the Messenger, in the wake of the Great War, Randolph urges Black socialists to join forces with white radicals and labor organizers to usher in a new era of social justice. 

—Henry Louis Gates, Jr. and Martha H. Patterson

A. Philip Randolph, from The Messenger, 1917. [Wikimedia Commons]

Throughout the world among all peoples and classes, the clock of social progress is striking the high noon of the Old Crowd. And why?

The reason lies in the inability of the old crowd to adapt itself to the changed conditions, to recognize and accept the consequences of the sudden, rapid and violent social changes that are shaking the world. In wild desperation, consternation and despair, the proud scions of regal pomp and authority, the prophets and high priests of the old order, view the steady and menacing rise of the great working class. Yes, the Old Crowd is passing, and with it, its false, corrupt and wicked institutions of oppression and cruelty; its ancient prejudices and beliefs and its pious, hypocritical and venerated idols.

It’s all like a dream! In Russia, one-hundred and eighty million of peasants and workmen—disinherited, writhing under the ruthless heel of the Czar for over three hundred years, awoke and revolted and drove their hateful oppressors from power. Here a New Crowd arose—the Bolsheviki, and expropriated their expropriators. They fashioned and established a new social machinery, the Soviet—to express the growing class consciousness of teeming millions, disillusioned and disenchanted. They also chose new leaders—Lenin and Trotsky—to invent and adopt scientific methods of social control; to marshal, organize and direct the revolutionary forces in constructive channels to build a New Russia.

The “iron battalions of the proletariat” are shaking age-long and historic thrones of Europe. The Hohenzollerns of Europe no longer hold mastery over the destinies of the German people. The Kaiser, once proud, irresponsible and powerful; wielding his sceptre in the name of the “divine right of kings,” has fallen, his throne has crumbled and he now sulks in ignominy and shame—expelled from his native land, a man without a country. And Nietzsche, Treitschke, Bismarck, and Bernhardi, his philosophic mentors are scrapped, discredited, and discarded, while the shadow of Marx looms in the distance. The revolution in Germany is still unfinished. The Eberts and Scheidemanns rule for the nonce; but a New Crowd is rising. The hand of the Sparticans must raise a New Germany out of the ashes of the old.

Already, Karolyi of the old regime of Hungary, abdicates to Bela Kun, who wirelessed greetings to the Russian Federated Socialist Soviet Republic. Meanwhile the triple alliance consisting of the National Union of Railwaymen, the National Transport Workers’ Federation and the Miners’ Federation, threaten to paralyze England with a general strike. The imminence of industrial disaster hangs like a pall over the Lloyd George government. The shop stewards’ committee or the rank and file in the works, challenge the sincerity and methods of the old pure and simple union leaders. British labor would build a New England. The Sein Feiners are the New Crowd in Ireland fighting for self-determination. France and Italy, too, bid soon to pass from the control of scheming and intriguing diplomats into the hands of a New Crowd. Even Egypt, raped for decades prostrate under the juggernaut of financial imperialism, rises in revolution to expel a foreign foe.

And the natural question arises: What does it all mean to the Negro?

First it means that he, too, must scrap the Old Crowd. For not only is the Old Crowd useless, but like the vermiform appendix, it is decidedly injurious, it prevents all real progress.

Before it is possible for the Negro to prosecute successfully a formidable offensive for justice and fair play, he must tear down his false leaders, just as the people of Europe are tearing down their false leaders. Of course, some of the Old Crowd mean well. But what matter is [it] though poison be administered to the sick intentionally or out of ignorance. The result is the same—death. And our indictment of the Old Crowd is that: it lacks the knowledge of methods for the attainment of ends which it desires to achieve. For instance the Old Crowd never counsels the Negro to organize and strike against low wages and long hours. It cannot see the advisability of the Negro, who is the most exploited of the American workers, supporting a workingman’s political party.

The Old Crowd enjoins the Negro to be conservative, when he has nothing to conserve. Neither his life nor his property receives the protection of the government which conscripts his life to “make the world safe for democracy.” The conservatives in all lands are the wealthy and the ruling class. The Negro is in dire poverty and he is no part of the ruling class.

But the question naturally arises: who is the Old Crowd?

In the Negro schools and colleges the most typical reactionaries are Kelly Miller, Moton and William Pickens. In the press Du Bois, James Weldon Johnson, Fred R. Moore, T. Thomas Fortune, Roscoe Conkling Simmons and George Harris are compromising the case of the Negro. In politics Chas. W. Anderson, W. H. Lewis, Ralph Tyler, Emmet Scott, George E. Haynes, and the entire old line palliating, me-too-boss gang of Negro Republican politicians, are hopelessly ignorant and distressingly unwitting of their way.

In the church the old crowd still preaches that “the meek will inherit the earth,” “if the enemy strikes you on one side of the face, turn the other,” and “you may take all this world but give me Jesus.” “Dry Bones,” “The Three Hebrew Children in the Fiery Furnace” and “Jonah in the Belly of the Whale,” constitute the subjects of the Old Crowd, for black men and women who are overworked and under-paid, lynched, jim-crowed and disfranchised—a people who are yet languishing in the dungeons of ignorance and superstition. Such then is the Old Crowd. And this is not strange to the student of history, economics, and sociology.

A man will not oppose his benefactor. The Old Crowd of Negro leaders has been and is subsidized by the Old Crowd of White Americans—a group which viciously opposes every demand made by organized labor for an opportunity to live a better life. Now, if the Old Crowd of white people opposes every demand of white labor for economic justice, how can the Negro expect to get that which is denied the white working class? And it is well nigh axiomatic that economic justice is at the basis of social and political equality.

For instance, there is no organization of national prominence which ostensibly is working in the interest of the Negro which is not dominated by the Old Crowd of white people. And they are controlled by the white people because they receive their funds—their revenue—from it. It is, of course, a matter of common knowledge that Du Bois does not determine the policy of the National Association for the Advancement of Colored People; nor does Kinckle Jones or George E. Haynes control the National Urban League. The organizations are not responsible to Negroes because Negroes do not maintain them.

This brings us to the question as to who shall assume the reins of leadership when the Old Crowd falls.

As among all other peoples, the New Crowd must be composed of young men who are educated, radical and fearless. Young Negro radicals must control the press, church, schools, politics and labor. The conditions for joining the New Crowd are: equality, radicalism and sincerity. The New Crowd views with much expectancy the revolutions ushering in a New World. The New Crowd is uncompromising. Its tactics are not defensive, but offensive. It would not send notes after a Negro is lynched. It would not appeal to white leaders. It would appeal to the plain working people everywhere. The New Crowd sees that the war came, that the Negro fought, bled and died; that the war has ended, and he is not yet free.

The New Crowd would have no armistice with lynch-law; no truce with jim-crowism, and disfranchisement; no peace until the Negro receives complete social, economic and political justice. To this end the New Crowd would form an alliance with white radicals such as the I.W.W., the Socialists and the Non-Partisan League, to build a new society—a society of equals, without class, race, caste or religious distinctions.

Excerpted from The New Negro: A History in Documents, 1887-1937. Copyright © 2025 by Martha H. Patterson and Henry Louis Gates, Jr. Reprinted by permission of Princeton University Press.

A Powerful Influence on American Democracy

In May 1951, Kwame Nkrumah received an invitation to Lincoln University. The news that his alma mater had plans to confer upon him an honorary doctorate the very next month came as a total surprise. As Nkrumah wrote:

It was just over six years since I had left America and I could not believe that such an honour could be bestowed upon me in so short a space of time. I felt that I had not done enough to merit it and my first inclination was to decline it.

The Lincoln invitation had been the doing of Horace Mann Bond, the first Black man to lead the university and its president since 1945. Bond, a precocious African American student from Nashville, had graduated with honors from Lincoln in 1923 at nineteen and then earned advanced degrees from the University of Chicago. He had made his academic reputation with original research on the education of Blacks in the American South. In his first book, The Education of the Negro in the American Social Order, he questioned the use of IQ tests by the army to assess the intelligence of African American recruits.

This anticipated by decades a scholarly consensus that would eventually find that standardized tests were anything but culturally neutral. Subsequent work by Bond reappraised the history of the American Reconstruction Era and refuted an idea long held dear to champions of the myth of the “Lost Cause” and of the so-called Redemption, the period of resumed white supremacy across the South that followed Reconstruction: that profligacy caused by the entry of Blacks into government after the Civil War had driven the South into economic ruin.

In addition to being an original thinker, influential scholar, and part of what was still a very small cohort of academically trained Black historians in the United States, Bond was also a classic “race man.” This once-common term was used for African Americans who wore pride in their identity openly and believed that their social duty was to do whatever they could to advance the prospects of Black Americans as a group. In many of the black and white photographs of Bond from this era, there’s a hint of a scowl, and in that expression, I have often been tempted to read not just the flinty combativeness he was known for, but also smoldering resentment over the wages that racism in his society exacted from him and from Black people in general.

Although descended from enslaved great-grandparents, Bond was born into the Black middle class as the son of two college-going parents, a mother who became a schoolteacher, and a father who was a Congregational minister who preached throughout the South. As a boy, he was regaled with memories of Africa by his aunt Mamie, who had worked as a medical missionary on the continent. Then, as a young man, he had avidly read stories about Africa in the pages of Du Bois’s NAACP journal, The Crisis, which often emphasized the existence of kingdoms and accounts of African achievement. Du Bois wrote much of this content himself, beginning with the story of his first voyage to the continent, in 1923, when he visited Liberia, one of only two Black-ruled countries in the world at the time (although Haiti was then under American military occupation). Du Bois often lapsed into what one historian has called “a hyper-lyricism brought on by the sheer euphoria of having slipped the surly bonds of American racism.” “Africa is vegetation. It is the riotous, unbridled bursting life of leaf and limb,” Du Bois gushed in one typical column. It was also “sunlight in great gold globules,” and “soft, heavy-scented heat,” that produced a “divine, eternal languor.”

In 1949, Bond took the first of his own eventual ten trips to Africa, and it utterly reshaped his life. It wouldn’t be an exaggeration to say that it also powerfully altered the historical trajectory of Black people on both sides of the Atlantic for the next two decades. Bond’s interest in Nkrumah, and the bridge he helped build for him with African Americans, threw a precious lifeline to the emerging Gold Coast leader at a time when he had few other cards at his disposal. And it pointed to a possible future of deep and mutually strengthening ties between two parallel movements, one for civil rights in America, and the other for independence for Africa’s colonies. Both were in dire need of allies as the world entered the Cold War. Bond’s early trips to Africa placed him at the forefront of an ideologically diverse group of African American intellectuals and political activists that would swell dramatically throughout this period — all of them fired up with the idea that the liberation of Africa and the battle for full citizenship rights for Black Americans were so fundamentally linked that if they were to advance at all, they would have to proceed in tandem.

In its first phase, this group included African Americans who had become familiar to the broad public: the novelist Richard Wright, the diplomat Ralph Bunche, the nationally prominent labor leader and elder statesman A. Philip Randolph, and, just slightly later, a young Baptist minister named Martin Luther King Jr. Behind big names like these stood a panoply of others who also played crucial roles in building bonds between Black America and Africa but who mostly labored in relative anonymity. These included people such as William Alphaeus Hunton, a professor of English, and the historians Rayford Logan and William Leo Hansberry, all of whom taught at Howard University. The latter, an uncle of the playwright Lorraine Hansberry, had begun teaching African history at Howard in 1922. Four years later, with the appointment of Mordecai Wyatt Johnson, Howard got its first Black president, but it wasn’t until nearly three decades after that, in 1954, at Hansberry’s initiative, that the university introduced the nation’s first African Studies curriculum.

The Second World War and its aftermath saw a recentering of pan-Africanist energy in Africa itself.

Following the example of Lincoln’s leadership by educating more and more students from Africa and the Caribbean, Black colleges and universities in the United States became a catalyst for this, spurring the development of a global Black consciousness movement. Not only did thinkers from different continents come together on these campuses, but with a critical mass came much more militancy. Here, although Lincoln had been the undeniable pathbreaker, it was Howard University that, starting even before the Second World War had ended, surged ahead to become the most important locus of ideas and activism linking Blacks from Africa and the diaspora in profound new ways.

Nnamdi Azikiwe of Nigeria has been called a “student zero” of African nationalism on American campuses for the way he had helped recruit African students, including Nkrumah, to historically Black colleges in the United States. Although Azikiwe eventually graduated from Lincoln, he had transferred there from Howard, where he had been unable to pay the bills for his studies. It was at Howard, he later wrote, where “the idea of a new Negro evolved into the crusade for a new Africa.” This resulted from the intense stimulation he experienced on a campus that had been assembling a deepening bench of intellectual stars since Alain Locke, a Rhodes Scholar, was hired in the 1920s. In Azikiwe’s case, it came from studying there under people like Leo Hansberry and Ralph Bunche.

At Howard, and wherever else a critical mass of students from Africa and the Black diaspora outside of the United States gathered, something else important began to occur: a sharing of experiences of exploitation and suffering under imperial rule. This also juiced campus progressivism. Learning from each other bred a bolder self-confidence, and as it did so, colonized and recently emancipated peoples began to lose whatever lingering patience they had with the temporizing of Western nations based on the supposed need for tutelage and gradual preparation for the responsibilities of self-government.

 

From the moment of his appointment as the first Black president of Lincoln University in 1945, Bond faced persistent pressure from trustees and others to change the school’s vocation. For decades, its official mission had been “the education of Colored youth.” Bond acceded to the removal of that phrase from Lincoln’s charter, but he pushed back against demands that the university actively recruit white students in order to significantly dilute its Black student body. These calls became even more insistent in the early 1950s when desegregation cases were working their way through the federal courts, making it seem increasingly likely that racial separation in American schools was doomed to fade.

True race man that he was, Bond was furious over the board’s pressure and responded defiantly. At most northern colleges and universities, Black students and faculty still numbered few to none. Lincoln, by contrast, had long welcomed white students and even recruited small numbers of them from nearby communities. “Having done this, have we not done enough?” Bond asked. “Our self-respect will not permit us to do more.” In 1949, the Lincoln alumnus Thurgood Marshall, then legal counsel of the NAACP, gave a speech on campus in favor of integrating his alma mater. But Bond, who had personally led the desegregation of local schools in the community surrounding Lincoln by suing to force them to accept Black students, pushed back. According to a biographer, he criticized Marshall and the NAACP for praising white colleges that had two or three Black undergrads while maintaining all-white boards and faculties. “Let those white colleges with token Black students hire Black faculty and choose Black board members; then they might merit being called interracial, as Lincoln did.”

Resentment over such double standards fueled Bond’s determination to intensify his school’s relations with Africa, both in terms of supporting applicants from the continent, as it had long done, and through a new kind of personal diplomacy toward Africa. Through Bond, the politics of these two issues — integration at home and the pull of Africa abroad, on the surface seemingly unrelated — would become increasingly and explicitly joined. As they did so, they set him at odds with Lincoln’s board and ultimately contributed to his firing in 1957, ironically the year that Nkrumah led Ghana to independence.

 

Bond’s first visit to Africa in 1949 was on a trip partially paid for by a Lincoln alumnus from Nigeria. His first inkling of what Africa could mean for Lincoln and what Lincoln could mean for the continent had likely occurred two years earlier. That was when Nnamdi Azikiwe had returned to the campus to receive an honorary degree. Around that time, Bond began to argue that his university’s longstanding connections to the continent constituted a major competitive advantage that Lincoln had done little to exploit. Africa was clearly moving into a new age of eventual independence, and with alumni like Azikiwe and Nkrumah, the school had a special role to play. Bond even wrote that these two had “learned Democracy — with a capital D” at Lincoln, where they were made “good Americans — with an immense admiration for American inventiveness, enterprise and industry.”

By the time of his 1949 tour of West Africa, Bond’s thinking had evolved from vague and boosterish notions about the public relations gains to be won by Lincoln to a political vision about synergies to be developed between currents of Black nationalism on opposing sides of the Atlantic. Writing from Africa to the editor of the Baltimore Afro-American, then a leading Black newspaper, Bond affirmed: “Here is Black nationalism — the more astonishing to an American because of the low esteem in which the African American is held. But the American Negro enjoys that same tremendous prestige here that America does.” This was the germ of a robust and sophisticated later argument that the exercise of sovereignty and self-rule by new African leaders could serve as powerful sources of pride and inspiration for African Americans, while also helping to undermine the worst sorts of racist stereotypes held by whites against them.

“The key point for realizing the aspirations of the American Negro, lie[s] in Africa, and not in the United States,” Bond remarked in a “Letter from Africa” column dated October 17, 1949. “It is the African who, I think, will dissipate forever the theories of racial inferiority that now prejudice the position of the American Negro.” Of all the colonies in sub-Saharan Africa, the Gold Coast seemed closest to achieving independence from a European power peacefully. Bond became one of the first African American thinkers to seize on its importance as a lodestar for African American liberation as well. If the Gold Coast, soon Ghana, could bring to vivid life images of Black people successfully conducting their affairs in a reasoned and orderly manner, he believed, it would deliver a serious blow to white supremacy everywhere.

The acerbic, chip-on-his-shoulder Bond may have been among the first to think this way, but he was by no means alone. Indeed, one of the most remarkable things about this forgotten epiphanic moment is how widespread such thinking became across the African American political spectrum. By the standards of the early Cold War, Bond, stripped of his pan-Africanism, was a run-of-the-mill, pro-business, anticommunist figure. Thoughts like his about the importance of Ghana’s example to African Americans found their neat echo, though, in 1950 in the words of Alphaeus Hunton. This Harvard-educated grandson of Virginia slaves, Howard University English professor, and Communist Party member became a leader of a pioneering anti-imperialist group called the Council on African Affairs (CAA). The CAA’s members were fiercely hounded by the McCarthy era’s hysterically anticommunist House Un-American Activities Committee. In 1951, Hunton was imprisoned for his refusal to testify before the committee. He emigrated to Africa in 1960, first to Ahmed Sékou Touré’s Guinea, then to Nkrumah’s Ghana, and finally to Zambia, where he died of cancer in 1970. In one letter, he wrote:

It is not a matter of helping the African people achieve freedom simply out of a spirit of humanitarian concern for their welfare. It is a matter of helping the African people because in doing this we further the possibility of their being able to help us in our struggles here in the United States. Can you not envision what a powerful influence a free West Indies or a free West Africa would be upon American democracy?

Bond’s writings and conversations from this time reveal still more complexity about the ways in which racial identity questions for Black Americans were evolving in relation to a changing Africa. From that first trip to the continent, at a time when “Negro” or “colored” were the standard appellations for Blacks, Bond had already begun to anticipate the shift, still at least a quarter century away, toward the term African American. “Sincerely — (and with a great new pride that I am an American of African descent…)” he wrote at the close of one letter.

Excerpted from The Second Emancipation: Nkrumah, Pan-Africanism, and Global Blackness at High Tide by Howard W. French. Copyright © 2025 by Howard W. French. Used with permission of the publisher, Liveright Publishing Corporation, a division of W.W. Norton & Company, Inc. All rights reserved.

Life in the Firestorm

The fallen brick sat at the edge of an abandoned lot, staring up at Roberto Ramirez like a question mark. Ramirez, a sixth grader in the Bronx, had been instructed by his art teacher to search for “found objects,” and his eyes gravitated toward this small chunk of a crumbling tenement. The assignment was to envision the objects “as something else,” so Ramirez pictured the brick as a tiny building that was still standing. He took a paintbrush to its rough exterior, and after carefully outlining the building’s matchbox windows, he filled their frames with fire. Not all of them, though — only the windows on the upper floors. He knew that in the Bronx, fires started at the top.

Ramirez’s technique quickly caught on among the other kids in his class, and before long the eleven- and twelve-year-olds had produced a series of fifty flaming miniatures. It was 1982, and they drew what they knew: life in a firestorm. Another student, John Mendoza, and his family had been burned out of their apartment three times by arsonists, an experience as routine as it was calamitous. Yet what the students didn’t know, beyond the rumors children sometimes absorb, was why fire was so prevalent in their neighborhood. For an explanation, they looked to their new art teacher, Tim Rollins, a white conceptual artist with ties to the downtown art scene. As an outsider in the Bronx, Rollins had no satisfying answers, so he decided on a field trip. “We go down to the Fire Department,” he recalled, and “the firemen see these ten crazy kids and me come stomping in and asking, ‘Why are there so many fires?’ ” The firefighters offered only vague replies. The students left the station dejected, but in the mystery Rollins spotted a teaching tool. He asked the sixth graders to inscribe an explanation on the bricks, and marveled, “We got 70 different reasons.” Ramirez, for one, blamed tenants who were behind on rent: he wrote “rent late” on the building’s roof. His classmate claimed that “junkies burned the buildings down,” while another wrote, “no heat.”

The students were left with concrete bricks in lieu of concrete answers. What they were attempting to do was give the bricks a history. The project became known as the Bricks series, and it was the first in a decades-long, intergenerational collaboration called Kids of Survival (or K.O.S.), so named because “we were broke but not broken.” One of the Bricks now sits in the permanent collection of the Whitney Museum of American Art.

The question haunting the Bricks series to this day is why the students suggested “70 different reasons” for the conflagration that upended their lives and engulfed their neighborhoods. How could the toll from the fires have been so colossal and their source so opaque?

 

Seven years earlier, in April 1975, a different act of painting offered some insight. Smearing black pigment onto their hands and faces, landlord Imre Oberlander and his associate Yishai Webber prepared to torch one of the former’s six buildings in the South Bronx. The white incendiaries believed blackface would offer them cover, like a perverse kind of safety gear. At four a.m. on a Friday morning, the two men cruised down Southern Boulevard en route to the targeted building. They hoped the twilight would provide further protection, but when they drove past a police car, their broken taillight caught the attention of the officers on patrol. Pulling them over on the wide thoroughfare, the policemen saw two Hasidic men from Williamsburg made up in blackface, one wearing a wig, and proceeded to search the car. They found two “firebombs” — crude incendiary devices made out of gasoline, gunpowder, and a timing device.

Oberlander became one of the first landlords charged in connection with the decade’s “epidemic of arson,” as the New York Times had begun to call it. Though the Bronx, in particular, had been burning for years by this point, authorities remained so oblivious to the root causes that they initially suspected Oberlander and Webber of being spies en route to the Soviet diplomatic compound, ten miles away, at the opposite end of the borough. It is true that the firestorm involved vast conspiracies, transnational dealings, and a doctrine of containment, but it all had little to do with the Cold War. What drove the arson wave was profit. Oberlander had collected $125,000 (nearly $750,000 in 2024 dollars) in insurance payouts from twenty-one separate fires between 1970 and 1975. All his claims were “paid off without a murmur from the insurance company,” railed Bronx District Attorney Mario Merola, warning without hyperbole that this was just “a drop in the bucket.”

Between 1968 and the early 1980s, a wave of landlord arson coursed through cities across the United States, destroying large portions of neighborhoods home to poor communities of color. From Boston to Seattle, tens of thousands of housing units burned (this is a conservative estimate); the most affected neighborhoods lost up to 80 percent of their housing stock. Yet historians have largely neglected the burning of the nation’s cities, and popular memory has commonly confused the 1970s arson wave with the well-documented but far less destructive urban uprisings of the previous decade. The 1960s rebellions — most famously Watts in 1965, Newark and Detroit in 1967, and everywhere after the assassination of Martin Luther King Jr. in 1968 — were born of Black (and in some instances Puerto Rican) outrage over the persistence of white supremacy despite the tangible gains of the civil rights movement. In most cases set off by an incident of police violence, the rebellions represented a collective revolt against not just overpolicing but the daily persecution of Black communities in the form of unequal employment, housing, education, and more. Though these events were often deemed “senseless riots” devoid of a coherent politics, they were formidable and far-reaching — though fledgling — insurgencies. Historians typically describe this era as stretching from Birmingham in 1963 to the nationwide uproar following MLK’s murder in 1968, although important recent work has tracked the rebellions into the 1970s.

Whether measured in dollars or lives lost, the destruction caused by the uprisings of the 1960s pales in comparison to the arson wave of the 1970s. In 1967, the most violent year of the decade, the dead were counted in the tens and the insurable losses totaled $75 million. By contrast, at least five hundred people died of arson annually across the United States during the 1970s, and by 1980 the New York Times was estimating that arson caused $15 billion in total annual losses. Admittedly, these are crude and fraught barometers of historical significance. The rebellions had immense political implications on a national scale, and they justifiably loom large within the popular imagination. The 1970s blazes were perhaps too common, too consistent with existing iniquities, to draw the same kind of attention.

The latter decade was defined not by insurrection but by indemnification, though the two were connected, as we will see. The 1970s conflagrations bring into view the untold history of the racially stratified property insurance market, a key force in the making and remaking of American cities. Although fire usually requires only oxygen, heat, and fuel, the crucial ingredient during that decade was state-sponsored fire insurance, initiated by federal fiat in response to the 1960s uprisings. The reform effort was supposed to put an end to insurance redlining, which had left entire swaths of the American city uninsured or underinsured due to the race and class of their residents. Yet increased access to second-rate fire insurance, when paired with state cutbacks and ongoing mortgage redlining, incentivized landlord arson on a vast scale.

The Bronx lost approximately 20 percent of its total housing stock to fire or abandonment between 1970 and 1981—around 100,000 units, nearly the equivalent of the number of housing units in today’s Richmond, Virginia, or Reno, Nevada. Destruction on this scale, unfathomable as it may be, should not be seen as evidence of the Bronx’s exceptionality. The arson wave hit cities across the country, in every region. Coast to coast, Black and Brown tenants were blamed for the fires. Yet the evidence is unequivocal: the hand that torched the Bronx and scores of other cities was that of a landlord impelled by the market and guided by the state.

That hand was also, in the case of Imre Oberlander, covered with dark pigment. Who was the audience for this 4 a.m. racial masquerade? Was it the building’s tenants, the block’s bystanders, the beat cops? Whomever they imagined as potential witnesses, Oberlander and Webber were performing a well-rehearsed script of Black and Brown incendiarism. The specter of the Black firesetter, in particular, is older than the United States itself. For the two white arsonists, the racist trope was something to exploit. The landlord and his accomplice believed it could deflect blame and prevent them from being identified. They applied blackface as though it, too, were a form of insurance.

Oberlander and Webber may have also seen blackface as a shield against a different bigotry — that of “Jewish lightning.” The stereotype of the arsonist Jew was a vestige of medieval anti-Semitism that was modernized by fire insurers in the mid-19th century, when underwriters at Aetna, the Hartford, and other notable firms warned against issuing policies to “Jew risks,” in part because of Jews’ supposed proclivity for fraud. The stereotypical arsonist, whether in its anti-Semitic or anti-Black variant, fulfilled a similar function: distracting from the larger power structures at work. In the 1970s, the Jewish slumlord became a potent symbol of Black exploitation, but in fact the redlining banks and insurance companies had, to different degrees, discriminated against both Black and Jewish communities.

The irony in Oberlander and Webber’s blackface gambit was that the two men ended up getting caught precisely because their performance of Blackness was both too convincing and too implausible. That is, their apparent Blackness may well have played a role in the police officers’ decision to pull them over, and their thinly veiled whiteness — upon closer inspection — almost definitely prompted the search of their car.

Few landlord arsonists actually made a habit of wearing blackface, because few had a need for it. The arson wave was made possible by financial masquerade—an array of insurance and real estate practices that obscured accountability and diffused risk — combined with official neglect and the presumed criminality of the Black and Brown tenants held culpable for the fires. Instead of blackface, landlords often chose more cunning disguises, such as hiring paid “torches,” usually neighborhood teenagers, to do the burning for them. But that was just the opening scene of a multi-act white-collar revue, one that featured Hollywood studios dishing out Bronxploitation films, journalists vilifying the supposed welfare arsonist, underwriters flooding cities with subpar coverage, insurance executives feigning impotence, real estate players attacking rent control, criminologists theorizing about broken windows, lawmakers gutting the fire service, and pundits yammering on about riots and pyromaniacs. All sang the same chorus, drowning out dissenting voices as well as the true origins of the arson wave. Blackface was not necessary when there was such a vibrant tradition of briefcase minstrelsy.

 

The torching of wide swaths of the American metropolis may strike some as a bizarre event in the distant past. Yet it is very much part of how our cities came to be. Long neglected by historians, the 1970s arson wave vividly reveals late-20th-century shifts in political economy that still shape our lives. Out of its embers was forged the metropolis we know today: one defined by volcanic real estate booms, economy-cratering busts, and an ongoing decline in housing stability. The world in which a solidly built home could generate more value by ruination than habitation is the same world in which homelessness, eviction, and foreclosure have become defining aspects of urban life.

The story of landlord arson is not a cautionary tale of capitalism gone awry, of a few bad apples, of uncaring policymakers, of government overreach, or of a grittier bygone era. To frame it as a singular, sensational episode of the past is to gloss over its continuities with — and its role in creating — the structures of the present. Warning against such “spectacularization of black pain,” Saidiya Hartman counsels that “shocking displays too easily obfuscate the more mundane and socially endurable forms of terror.” The arson wave renders visible much that is hidden in plain sight, historically and to this day.

Over the last 50 years, housing insecurity and real estate volatility have come to define our cities, and though there are many causes, none is more significant than financialization, which surged in the years after 1968. Financialization is the process by which an economy that was once organized around the making and trading of physical commodities becomes increasingly oriented around the profits from financial activity. The high finance we know from the nightly news and the silver screen is found on the fiftieth floor of a glass-encased skyscraper and in the cacophonous pits of a stock exchange, fueled by adrenaline, greed, and cocaine. The image we have is set a thousand feet in the air, its spoils and scandals a world apart, even if they eventually touch the rest of us. But this is not the only face of financialization, nor necessarily the one that sheds the most light on the crises of the present. The arson wave opens up a view of financialization from the ground up and far from the fray of Wall Street.

The stock image of the 1970s American city features an urban economy in decline. It is rarely acknowledged that there were profits to squeeze from the destruction of the metropolis, particularly in neighborhoods of color. The ascendance of the FIRE (finance, insurance, and real estate) industries on the heels of the civil rights movement created conditions primed for plunder, especially in cities suffering from the flight of white residents and well-paying jobs. “Instant liquidity,” as one arsonist for hire described it in his testimony before Congress, was the real estate equivalent of Wall Street’s liquidity preference: the priority placed on an asset’s ready convertibility into cash. What made buildings liquid was property insurance expansion, presented as a means of racial justice and redress. Which is to say that race underwrote the gains enjoyed by landlords.

For those looking to make a quick buck, the Bronx and other communities of color possessed a peculiar asset: the powerful alibi of racial pathology. The presumption of Black and Brown criminality blotted out the fact of dispossession so completely that, all these decades later, the vague impression that Bronxites burned down their own borough endures, while the vast fortunes made were forgotten. The peril of getting caught perpetrating fraud thus transferred to its victims, where it has long remained.

Excerpted from Born in Flames: The Business of Arson and the Remaking of the American City by Bench Ansfield. Copyright © 2025 by Bench Ansfield. Used with permission of the publisher, W. W. Norton & Company, Inc. All rights reserved.

Textiles as Historical Texts

In Weaving the Word, Kathryn Sullivan Kruger, a professor of English, examines the link between written texts and woven textiles. Kruger asserts that before stories were recorded through written text, cloth preserved and communicated these important social messages. Kruger argues for expanding the idea of literary history to include women’s role in transmitting traditions, stories, and myths via fabric. By including textiles in our study of literature and history, we will find many female authors. She also maintains that during times when weaving was analogous to storytelling, “women’s endeavors were equal to culture and were not considered beneath culture or marginal to it.” Cloth tells stories, records histories, and shapes culture in a synergistic interaction that makes it impossible to disentangle the effect of one on the other.

The Bayeux Tapestry, an 11th-century embroidered account of the Norman conquest of England in 1066 by William the Conqueror, is a clear example of textiles as historical texts. While the events of this epic battle are enshrined in woolen thread on linen, no one knows who stitched it. An 18th-century legend has it that Queen Matilda, William the Conqueror’s wife, carried out the embroidery with her ladies in waiting. While a romantic notion, this was certainly not the case. Most scholars believe that a group of Anglo-Saxon embroiderers stitched it near Canterbury, England. All the surviving evidence indicates that only women in early medieval England embroidered and that it was a highly regarded female occupation. However, there is no known convention of women embroidering on such a large scale — the tapestry is 70 meters long — or for such an important political purpose. This has led some to speculate that perhaps it was not the work of women, and Bayeux Tapestry Museum curator Antoine Verney has suggested that men could have been trained in embroidery to execute this important royal commission, potentially in Normandy, since the tapestry resided in the Bayeux Cathedral for centuries.

Textile archaeologist Alexandra Lester-Makin, an expert in early medieval embroidery and the Bayeux Tapestry, disputes this idea, noting that the needlework on the tapestry is highly skilled. She thinks it unlikely that it was the work of a team who had just recently learned to embroider. There is evidence of female embroidery workshops in England in the 11th century, indicating the likelihood that the tapestry was created by women. A skilled embroiderer would have organized and overseen the production process to maintain consistency and coordinate the many embroiderers working on the piece at the same time. Many women’s hands would have also been involved in spinning and weaving the linen and spinning and dyeing the wool embroidery thread. Notably, the style of embroidery used on the tapestry is meant to conserve thread, likely because the embroiderers, having spun wool themselves, knew firsthand how labor-intensive thread was to produce. The thread wraps around the back of the work only in short couching stitches, so the majority of the wool is laid down in long stitches on just the front surface of the work. This style of embroidery is a relatively quick way to fill in large spaces, much like painting, and evokes the brush strokes of illuminated manuscripts.

The women doing the embroidery work may have had some creative license over the messages that were communicated and immortalized in the tapestry. Some experts on the tapestry suggest that while the main story running horizontally across the center was dictated and likely sketched by men to record the details of the conquest, the borders were left to the discretion of the embroiderers, who included animals — often dragons, lions, and griffins — and scenes from Aesop’s fables alluding to ideals of medieval morality. Certain fables are embroidered more than once, like “The Fox and the Crow” and “The Wolf and the Crane.” They are drawn differently and appear to be embroidered by different hands, suggesting that each embroiderer was likely unaware that another had chosen to stitch the same image or scene. The fables in the borders can be read as commentary on the main action of the tapestry — perhaps a way for the Anglo-Saxons to tell their version of the story in the margins of the Norman tale.

Drawing of three of Aesop’s fables found on the Bayeux Tapestry, 1889. [Wikimedia Commons]

Lester-Makin said that as much as she would like to believe this was the case, she is not sure that women would have been given such freedom over the border content. However, that doesn’t mean that their experiences and perspectives were not included. “I think that even if they didn’t necessarily have free rein, there are still areas of expression that can be witnessed. This is a witness to what they have gone through or know that somebody went through … there are other ways … to read the tapestry and of seeing the embroiderers within it.” She called attention to a scene where an Anglo-Saxon woman is holding a child’s hand as Norman soldiers set fire to her home. “Whether that was chosen freely by the embroiderers or not, that is still a commentary and if you think of women embroidering that, and you never know what they may have witnessed or had done to them. That’s a harrowing scene.” Similarly, the borders show the bodies of dead Anglo-Saxon soldiers having their armor pulled off or being devoured by animals. “That kind of thing happened and … you can imagine someone stitching that and going, ‘oh my god, that happened to my brother, my cousin, my dad, my husband.’ ” Whether or not the women chose any of the tapestry’s content, they stitched it, and prior to that, they may have lived it. The tapestry is a testament to their experience preserved in a language they spoke.

Bayeux Tapestry scene. [Wikimedia Commons]

In our interview at the Bayeux Tapestry Museum, Verney stated that the genius of the tapestry was that it was the first known graphic representation of a current event in northern Europe, adding that had it not been captured in this object, the history might be lost today. He said that the technique of embroidering wool yarn on linen cloth was likely chosen because it was a relatively quick method and made it easy to share the story of the event on both sides of the English Channel to a largely illiterate public. It may have also served a political purpose. It was a way to integrate the Anglo-Saxon tradition of needlework into the story of the Norman conquest of England and assure the English that their traditions were valued and would be preserved under this new rule.

French historian R. Howard Bloch calls the embroidery of the Bayeux Tapestry “a powerful vehicle for cultural memory at a time when even the most powerful lords were illiterate.” Janet Catherine Berlo wrote in response to Bloch’s statement, “I position it as ‘a powerful vehicle for cultural memory’ of a different sort — a cultural memory for those of us who seek to understand the long history of the poetics of embroidery, and our places in it.” It is clear which history was thought valuable to preserve at the time — the content of the tapestry — and which was not — the process of its creation. Women looking to find their place in the “long history of the poetics of embroidery” often discover that it is a game of hide and seek. Even when the work remains, the hands that made it are so often invisible. Like so many stories of women throughout history, the creation story of the Bayeux Tapestry seems indelibly lost.

 

Eight hundred years after the original Bayeux Tapestry was finished, a group of women in Victorian England created a full-scale replica of it, now on display at Britain’s Reading Museum. The effort was spearheaded by Elizabeth Wardle, who in 1885 organized 39 members of the Leek Embroidery Society so that Britain could have its own copy of this important historic artifact. It took just one year for the women to re-create the entire tapestry, working from pictures that had been hand-colored by archivists at what is now the Victoria and Albert Museum in London. It seems that these women were working to find their own place in embroidery history grounded in the Victorian-era “medieval revival,” which spurred a renaissance of medieval art and architecture. Their focused effort reflects an interest in their British heritage, the tradition of English needlework, and a wish to meaningfully contribute to those legacies. Unlike the anonymous stitchers of the original tapestry, these women added their names below the sections they worked on, escaping the obscurity of their medieval counterparts. Their signatures show that some women worked alone for long stretches of the tapestry, while others worked closely together on a section. Seeing three women’s names running the length of a four-foot section, we can imagine them huddled together talking and stitching.

Another difference between the Victorian re-creation and the original tapestry reflects the cultural mores of the time. In the original tapestry, there are several naked men, and male horses are depicted with anatomical accuracy. The Leek embroiderers omitted these “racy” details, though through no fault of their own. The men working in the museum archives felt it was improper to send such images to a group of British ladies. They “cleaned up” the photos that the women then faithfully copied.

More recently, a community project on the island of Alderney, in the English Channel, took inspiration from the tapestry but had a different aim: to finish it. The last panel of the original work is famously missing, its story lost to time. Historically, what naturally follows the Battle of Hastings, where the tapestry currently ends, is the coronation of William the Conqueror as William I of England. Kate Russell, the librarian on Alderney, spearheaded the project and together with artist Pauline Black imagined the ending and created the plan for the tapestry in 2012. Four hundred and sixteen people ranging in age from 4 to 100 contributed stitches to the final piece. Along with a large contingent of Alderney islanders and notably King Charles III, then Prince of Wales, stitchers came from nearly every continent of the world. Russell told me not a day went by that there wasn’t at least one person stitching while the library was open and often several people working together: “During that entire year, there was never any rancor, tension, disagreement, squabbling or any other sort of discord. Lots of stitching; no bitching. I imagine it must have been similar for the original stitchers, too, though the trauma they were living through in that torn-up country that England had become must have meant an entirely different atmosphere.”

Fran Harvey, a local resident and principal stitcher, said: “England was never the same after the Norman invasion. And I don’t think Alderney, as a community, will ever be the same again after so many people came forward and put their stitches into this amazing work. It is a landmark in Alderney’s modern history, and I feel sure that everybody involved in it, just like us, is very proud … The Tapestry … is like a thread that runs between Normandy and Alderney. It is almost a thousand years long, and today it brings us closer together.” Russell was awarded a British Empire Medal by Queen Elizabeth II for services to history and culture. Now, as a tourist destination on Alderney, the tapestry illustrates the cultural heritage of the community and carries the legacy forward.

Today, Mia Hansson, a Swedish seamstress living in England, is working to single-handedly re-create the Bayeux Tapestry. While most Bayeux Tapestry projects reflect a connection to British and French culture, Hansson’s embroidery pieces are motivated by her connection to a culture of needlework — an answer to Berlo’s call “to understand the long history of the poetics of embroidery” and her place in it. She plans to finish her Bayeux Tapestry replica just in time for a major restoration of the original tapestry, which the French Ministry of Culture has scheduled to begin in 2028. The restoration effort has been led thus far by a team of seven female textile conservationists who have assessed the areas in need of repair. A one-thousand-year-old tapestry presents unique challenges. Because no one has worked on anything like this before, restorers will have to learn as they go. Hansson is helping to keep this object of cultural memory alive and in circulation even if the original can no longer be displayed for a time.

While stitching the tapestry, she was “forced to learn the history, almost against [her] will,” noting that history is the only subject she ever fell asleep in. But her real connection to the work is with the original stitchers and her grandmother, who, though deceased, is always looking over her shoulder to make sure the back side of the work is neat. She has come to know the original stitchers of the tapestry quite well through her close study and faithful re-creation of their work: “Although I often get frustrated with them and the way they chose to stitch, which I now have to replicate, I feel strangely protective over them. There were reasons why they did things in a certain way and I don’t always understand … I can complain and want to put my veto in, ask questions and want to suggest other ways of doing things, but … I want to give the women the benefit of doubt.”

Hansson said she can feel the tensions between the embroiderers who worked closely together and likely for long hours with poor lighting, as though there are ghosts in the fabric. Their varying skill levels are clear from the stitching. Some appear less patient than others: “There are places where stitches overlap, where none of the women wanted to give in. In other places, there is a gap, where the women have failed to connect their work. Why? Was there an argument? Was it a simple oversight?” Unlike the harmonious working environment depicted in the stitching of the Alderney panel or the Victorian re-creation, Hansson imagines “the air being thick with emotion at times” while stitching the original tapestry. 

Choosing to re-create the Bayeux Tapestry has connected Hansson to a community of people interested in the tapestry and given her a role in a broader cultural and historical conversation. She gives talks to schoolchildren, women’s groups, historical reenactors, and embroidery guilds. She has a designated dress for many of these talks; the material was handwoven by a friend, and she sewed the garment with her mother. She added a 17th-century pocket to wear on top of the dress, which she embroidered with images from the tapestry. During these talks, Hansson said, “I step into a role and kind of become part of the tapestry. I live and breathe it with every ounce of my body and soul. It’s quite magical.” She jokes that her gravestone will read, “The woman who became the Bayeux Tapestry,” as though she herself had become a carrier of cultural memory, an embodiment of the original embroiderers’ hands and minds a thousand years later.

Excerpted from With Her Own Hands: Women Weaving Their Stories. Copyright © 2025 by Nicole Nehrig. Used with permission of the publisher, W. W. Norton & Company, Inc. All rights reserved.

The Founders’ Family Research

George Washington’s fellow founders reveled in genealogy as a means to explain themselves, their situation, and to some degree their new and important positions. Benjamin Franklin, John and Abigail Adams, Thomas Jefferson, and others undertook research, wrote and sketched out their family relationships, and discussed the meaning of these connections. It is more challenging to locate a founding father who was not interested in his own family’s founding than one who was. Family history research, correspondence about genealogy, the exercise of that information in court, and the public display of it were a matter of course for John and Abigail Adams, Benjamin Franklin, Thomas Jefferson, Alexander Hamilton, James Madison, James Monroe, and many more.

Sampler, by Abigail Adams, 1789. [Cooper Hewitt, Smithsonian Design Museum]

By the late 18th century it was common to express ambivalence about genealogy: to spend considerable time and energy on family history research while disavowing the significance of family history. Abigail Adams explained her own interest as purely academic, albeit intense. As she wrote about the document that would demonstrate the Massachusetts Quincys’ descent from Saer de Quincy, she explained that “I do not expect either titles or estate from the Recovery of the Geneoligical Table … yet if I was in possession of it, money should not purchase it from me.” “Can it be wondered at,” she asked her sister, “that I should wish to Trace an Ancesstor amongst the signers of Magna Carta[?]” The signers of that document, contrary to the narrative of monarchy, could claim a lineage of expanding political rights and participation, or so one assumes.

But surely a key aspect of the founding generation’s ambivalence about genealogy was associated with family roots abroad, usually in England. That interest came with a whole host of associations, some problematic given the recent revolution and the geopolitics of the 1780s and beyond, and some appealing, as Adams illustrated, in terms of longstanding ideas about authority and authenticity.

The founders all appreciated that a deeper knowledge of their family’s past required genealogical research. They all took care to explain the character of their interest in their family’s past. And they all made claims to the significance of family, both in general terms and in terms of their particular family’s background. The timing of these elites’ genealogical interests in the post-revolutionary period and the evidence of extensive pre-revolutionary interest in genealogy among their ancestors also illustrate the deep tradition of genealogy — of which they were well aware — that had developed by the early Republic. For Franklin, the Adamses, Jefferson, Hamilton, Madison, Monroe, and many others, riding the wave of political position often brought with it both opportunities for and an impetus to genealogical research, reflection, and articulation.

When the founding generation turned to autobiographical reflection on the storied lives they had led, they began just as the previous generations did: rooted in family. Benjamin Franklin’s famous, posthumously published Autobiography began with his genealogical reflections and travels. When he turned to memoir, Thomas Jefferson began more casually, though his would remain a manuscript. “At the age of 77,” he wrote, “I begin to make some memoranda and state some recollection of dates and facts concerning myself.”

Yet after this sentence of introduction, Jefferson spent the next passages describing his father’s family (“the tradition in my father’s family was that their ancestor came to this country from Wales … the first particular information I have of any ancestor was my grandfather who lived at a place in Chesterfield called Ozborne’s and ownd the land afterwards the glebe of the parish”) and his mother’s family (“They trace their pedigree far back in England & Scotland, to which let every one ascribe the faith & merit he chooses”). When John Adams turned to autobiography he, too, began with family because, as he noted, “the Customs of Biography require that something should be said of my origin.” His relation of his paternal and maternal relatives was considerably longer than Jefferson’s, allowing him to fully root his own life in a long New England tradition. James Madison’s autobiographical manuscript treated his family the most briefly, but mirrored his father’s family record in recalling his birth when his parents were visiting relatives elsewhere in Virginia, thus echoing the family history.

Though he did not write an autobiography or leave anything like a memoir, in the last years of his life George Washington evinced perhaps the most revealing investment in genealogy as a pursuit of continuing importance for American elites on the cusp of a new century, hand on the tiller of a new nation. In 1791, Sir Isaac Heard, the Garter King of Arms at the College of Arms in London, wrote to President Washington with extensive information about the Washington genealogy and heraldry in England, as well as a request for more details about the family in America.

In addition to being England’s foremost genealogical authority, Heard was married to a Bostonian and had traveled in North America as a young man. His interest in the Washington family, Heard wrote, proceeded “from a sincere respect for the distinguished Character of Your Excellency” but also originated in his own American connections, “[c]ircumstances which have constantly excited my anxious Attention to the Scenes of that country & fervent wishes for the welfare of many families with which I had the happiness to be acquainted.” The materials included were, as one might expect from an expert genealogist, very detailed. There was a sketch of the arms and crest of the Washington family; the latter includes a raven rising, with wings poised, from a coronet. There was an abstract of the will of Lawrence Washington, George Washington’s paternal grandfather, and two items of estate administration that formed part of Heard’s research into Washington family connections. And there was an annotated family tree. All in all, it was an impressive package.

In his response (nearly half a year later) to Heard’s interest in learning yet more information about the Washington family from American sources, George Washington first wanted to be clear about his own and his country’s use for genealogy. He noted, “This is a subject to which I confess I have paid very little attention. My time has been so much occupied … that but a small portion of it could have been devoted to researches of this nature, even if my inclination or particular circumstances could have prompted the enquiry.” Further, “[w]e have no Office of Record in this Country in which exact genealogical documents are preserved; and very few cases, I believe occur where a recurrence to pedigree for any considerable distance back has been found necessary to establish such points as may frequently arise in older Countries.”

The president dissembled. Washington had long been interested in the history of his family, and deeply invested in the symbols of his paternal lineage. The coat of arms that Sir Isaac Heard sent from England was familiar from its long-standing and regular use by the Washington family. George Washington first commissioned silver with the Washington shield on it when he was in his twenties and had just taken full possession of Mount Vernon in 1757. Ready to furnish his home, he ordered a “Neat cruit stand & casters”; it was beautifully crafted and in the latest style. Two years later, he would marry Martha Dandridge Custis, who brought with her to the household at Mount Vernon — and her children would add to these — items adorned with each of those coats of arms.

Common to all of these founders’ founding stories is not only a cognizance of such things as heraldry, but also an appreciation for and willingness to undertake family history research, usually involving communication with other family members, sometimes involving travel, always relying on the same kind of work done by previous generations. Second, they all framed their family history pursuits in ambivalent terms by the later 18th century. And third, none eschewed family history because of the potential taint of aristocracy. For these sons and daughters of mostly middling means who had become elite by virtue of leveraging property and politics, surrounded by plenty of other families who were celebrating centuries of elite status in the British colonies, and then in the American nation, family history was still an obvious privilege and one that they embraced rather than eschewed.

From Lineage: Genealogy and the Power of Connection in Early America by Karin Wulf. Copyright © 2025 by Oxford University Press and published by Oxford University Press. All rights reserved.

A Ghost from Kitchens Across the Nation

While most Americans today would likely be hard put to name a modern-day conjure woman if asked, a caricature of one smiled warily at them from their kitchen cupboards for over a century: Aunt Jemima, Pearl Milling Company’s cherished pancake mix mascot. Introduced in 1889, Aunt Jemima is a fictitious character based on Negro Mammies, enslaved women who held a central place on plantations. They were women like Harriet Collins and Harriet Jacobs, women who played a major role in the development of American food traditions and medicine. Negro Mammies were conjure women who used local flora to heal minor ailments; nursed all the children on the grounds, both Black and white; cooked and organized food in the Big House; provided advice to younger enslaved women; and offered spiritual comfort, often by way of mojos, sacred amulets, to the enslaved.

Mojos were a staple of hoodoo, a conjure tradition that developed in the Deep South and lower East Coast. Mojos often gave the oppressed confidence to rebel against their oppressors — slave against master, wife against husband. This use of mojos would live on well into the twentieth century, becoming one of the defining aspects of African American culture, especially our music.

While newly freed African Americans were busy telling stories about the mojos of Negro Mammies in their early blues songs, the American public began to wax nostalgic over plantation life. In the eyes of the American public, the Negro Mammy was a docile slave who championed the institution of slavery. The national worship of Negro Mammies reached a fever pitch in 1923. At the start of that year, a bill was put forward in the Senate to erect a million-dollar marble and granite statue of their beloved Negro Mammy in Washington, DC.

Most white Americans had never even owned slaves, much less been raised by a Negro Mammy. So how did the Negro Mammy, a figure who was tucked away on rural southern plantations, a figure who was relatively obscure in the nation before the Civil War, become a wildly popular national icon and lightning rod of racial conflict?

It began with a party.

Memory Jug, c. 1890. [Smithsonian American Art Museum]

In 1893, the United States government decided to throw a grand party in Chicago: the World’s Columbian Exposition, an international fair to celebrate the four hundredth anniversary of the “discovery” of America by Christopher Columbus in 1492. (The complications in setting up the fair, a daunting task in the then largely industrial and hardly picturesque Chicago, made the fair a year late for the anniversary.) Several countries, from neighbors like Mexico to Far Eastern nations like Japan, were invited to set up exhibits at the World’s Fair. The World’s Fair organizers wanted to display to the nations — especially Europe — how far the United States of America had come in four hundred years; they wanted to stress that our wild democratic experiment had been a success. The evidence of that success was our rapid technological innovation at the turn of the twentieth century.

And if you were one of the twenty-seven million people who purchased a ticket to the fair for fifty cents between May and October, you would indeed have been privy to grand feats of innovation that showcased American ingenuity: the world’s first Ferris wheel, a 264-foot-tall wheel that spun on a seventy-one-ton axle, carried thirty-six cars that could fit sixty people at a time, and rivaled the heights of the Eiffel Tower, which had been featured at the 1889 World’s Fair in Paris; electric lights whose colors danced to music and whirled in fountains at a time when most Americans were still using oil lamps to light their homes; one of the first electric train lines, ferrying visitors on a loop in the air over the fair’s 663 acres; and Thomas Edison’s kinetoscope, which displayed a mesmerizing precursor to movies.

The contributions of African Americans were noticeably missing from this grand celebration. The World’s Fair organizers refused to include African Americans in the fair’s planning, actively barring our proposals for booths that showcased our extraordinary cultural and economic progress achieved merely thirty years after enslavement. The proposed booths would have been astounding to an American public who believed we would never rise above the status of lowly, ignorant servants. By the 1890s, we had doubled our literacy rate, providing a robust education to thousands of Black people who, under slavery, had been violently prohibited from learning to read or write. We had tripled the number of books written by African Americans. And there was a significant increase in the number of African Americans who took up the professions of teaching, ministry, medicine, and law.

In the end, it was Haiti, not America, that gave African Americans a place at the Chicago World’s Fair. Haiti, like many other countries, was represented at the World’s Fair with its own dedicated building. The Haitians opened their doors to African Americans, giving them a place to voice their complaints about the nation as well as their contributions to it. Ida B. Wells, at the time an investigative journalist, partnered with other leading Black intellectuals — Irvine Garland Penn, Ferdinand Lee Barnett, and Frederick Douglass — to produce a pamphlet called “The Reason Why the Colored American Is Not in the World’s Columbian Exposition.” Wells stood on the steps of the building dedicated to Haiti at the World’s Fair, passing out copies of this pamphlet to the visitors from all over the globe who stopped to gaze upon and consider the first and only free Black republic in the New World.

In the pamphlet, Wells pointed out that the wealth created by African Americans’ industry “has afforded to the white people of this country the leisure essential to their great progress in education, art, science, industry and invention.” Wells understood that to try to tell the story of America without African Americans was as foolish as building a house upon shifting sands — which was exactly the physical construction of the fair.

At the center of the fair was a gleaming “White City” that swayed on stilts. Workers had cleared forlorn-looking oak and gum trees in the large muddy swamp of Jackson Park, which sat on the shore of Lake Michigan. They drove large stilts deep into the sandy marsh to support six large buildings of stucco — a low-cost plaster. They painted these cheap buildings bright white to look as if they were marble. Styled after Greek and Roman architecture, these six buildings formed a square called the Court of Honor, showcasing the major areas of innovation in America: liberal arts, agriculture, anthropology, electricity, machinery, and mining. These hastily built, faux-marbled buildings on shoddy foundations were to be the symbols of American progress. And so, the Court of Honor, a make-believe city, held all the tensions of the American dream: buildings with a gleam so white, so bright, they detracted from the muck below that upheld them.

It was in the Court of Honor’s agriculture building that you would find the exhibit of Aunt Jemima’s pancakes. Many of the products that were sampled in this building are still found in our grocery stores today, over a hundred years later, such as Quaker Oats, Cracker Jacks, and Wrigley’s Chewing Gum.

In 1890, R.T. Davis, the president of Davis Milling Company, bought Aunt Jemima’s pancake mix from Chris Rutt and Charles Underwood, who first developed the product in 1889. The first self-rising flour mix on the market, Aunt Jemima’s pancake mix was made of wheat, rice, and corn. This was a striking departure from the pancakes of the South — called hoecakes, ashcakes, johnny-cakes, or pone — which were typically made from cornmeal. To market this unfamiliar product to the American public, Chris Rutt decided to draw upon the preeminent form of entertainment in the late nineteenth and early twentieth century: minstrel shows, where white men darkened their skin with burnt cork to imitate the songs and dances of the enslaved.

Fairgoers who visited Aunt Jemima’s pancake exhibit would have recognized her name from “Old Aunt Jemima,” a staple song and skit of the minstrel circuit. When Rutt heard the song performed in an 1889 minstrel show and saw how popular it was among the crowd, it struck him that Aunt Jemima would be the perfect “face” for his product. After the show, Rutt plastered a grotesque painting of Aunt Jemima on every newspaper, magazine, and paper box advertising their new pancake mix. R. T. Davis took the branding further by casting Nancy Green, a formerly enslaved Black woman, to play the role of Aunt Jemima at the World’s Fair.

At the pancake exhibit, an enormous barrel of pancake flour loomed behind Green. At 16 feet high, 12 feet wide, and 24 feet long, this barrel was bigger than the average SUV. Draped in an apron and wearing the Negro Mammy’s customary red bandanna, Green flipped more than one million pancakes over the six months the fair was in operation. While she made pancakes, she sang spirituals and relayed stories about the “good old days” of slavery. It was said that the exhibit drew a crowd so large the police had to step in to keep the walkways clear. The live advertisement was an incredible success, fetching over 50,000 orders of Aunt Jemima’s pancake mix from fair visitors hailing from all over the country. Due to her laudatory reception, the officials at the fair named Aunt Jemima “Queen of the Pancakes.”

Left: Aunt Jemima advertisement, 1894. [Wikimedia Commons] Right: Trademark registration for Aunt Jemima, 1903. [Library of Congress]

Consider the souvenir gift of the exhibit: a button pin featuring a smiling Aunt Jemima with the phrase “I’se in town, honey!” scrawled across the top. Aunt Jemima’s successful move from the rural, southern plantation to the bustling, urban “town” of the North (like Chicago) is predicated upon her willingness to remain a servant, a Negro Mammy to whites. This sentiment captures the attitudes of both northern whites who were agitated by the influx of African Americans and southern whites who were dismayed at the loss of their workforce during the mass migration. Through Aunt Jemima’s pancake mix, the longed-for Negro Mammy could return to white kitchens once again. 

But while white Americans saw in Aunt Jemima the docility and domesticity of the Old South, African Americans took something entirely different from the exhibit. African Americans, too, were familiar with the song “Old Aunt Jemima” — but it held a radically different meaning. African American minstrel performer and former slave Billy Kersands originally came up with the “Old Aunt Jemima” song and dance routine in 1875. And when African Americans came onto the minstrel stage, they often added layers of nuance to the routine that whites did not pick up on. After all, it was an incredibly ironic performance: African Americans were mocking white Americans, who had built careers out of mocking Black people’s days of enslavement.

As scholar M.M. Manring observes, Kersands’s “Old Aunt Jemima,” which impersonates a Negro Mammy, is based upon a slave song called “Promises of Freedom.” This song added a great irony to his act, subtly making fun of the very whites who enjoyed minstrel shows. In the song, the enslaved ridicule slave masters who made vacuous pledges of future manumission. One verse features a mistress who promised to free the enslaved upon her death. Rather than fulfill her promise, she simply refuses to die, going plumb bald in her old age.

Whites in America did not realize that by welcoming Aunt Jemima into their homes, they were not only gaining access to her delicious pancakes; they were also partaking of the conjure African American women have been wielding for centuries. Beneath the light melody and playful dance that accompanied the song “Old Aunt Jemima,” the lyrics issued a deadly serious threat: a mojo used by the enslaved to get back at their masters who failed to uphold past promises of freedom.

When Rutt’s Aunt Jemima and Kersands’s “Old Aunt Jemima” are laid side by side, they tell a story of two different Americas. At the 1893 Chicago World’s Fair, Aunt Jemima’s pancakes gave whites the America they longed for — one where newly freed African Americans embraced the docility and domesticity posed by an imaginary Negro Mammy. But “Old Aunt Jemima” describes an America that has fallen short of providing the freedom it guarantees to all its citizens — an America that newly freed Blacks wanted to hold accountable for such failings. In Black America, Aunt Jemima rises like a ghost from kitchens across the nation, wielding the mojos of past Negro Mammies. Behind this popular pancake mix stands a secret history where Aunt Jemima is no longer a slave but a Black revolutionary.

Excerpted from the book The Conjuring of America: Mojos, Mermaids, Medicine, and 400 Years of Black Women’s Magic by Lindsey Stewart. Copyright © 2025 by Lindsey Stewart. Reprinted with permission of Legacy Lit, an imprint of Grand Central Publishing. All rights reserved.

A Mere Mass of Error

On October 22, 1880, the front page of Truth, a tiny and previously obscure New York City newspaper, was dominated by a story that threatened to doom the presidential hopes of Republican candidate James Garfield. Splashed across the front page was a large photograph (still a rarity in newspapers then) of a handwritten letter in which Garfield appeared to secretly promise to oppose efforts to ban Chinese immigration in order to protect the supply of cheap labor for industrialists. Never mind that Garfield’s Republicans had, like their Democratic rivals, already adopted Chinese exclusion in their campaign platform. A quickly convened court hearing provided expert and investigative evidence that the “Chinese letter” was a forgery and the whole affair a ginned-up illusion, only to be countered with competing experts and alternative facts extending the hearings and keeping the controversy in the news. Word of this muddied debunking chased the lie down the channels of the 19th-century information networks — telegraph and railroad lines — but initially did little to quench outrage among the nearly all-white electorate of the Western states upon which the election now hinged.

Before Garfield and the Republicans could mount an effective response, the photographic image of the “Chinese letter” had spread across the entire nation. Flyers and posters of the images, often labeled as “Garfield’s Political Death Warrant,” were “being scattered throughout every [New York] county and school district”; being handed out to Chicago children “at the doors of the public schools” to bring home to their parents; becoming “the sole topic of conversation” in Toledo, Ohio, and in Nevada mining towns; and, as one member of the Democratic National Committee gloated, being “scattered all over the Pacific slope,” making “the Chinese problem” all at once “the foremost argument in the campaign.” The Los Angeles Herald declared: “The election of Garfield would be the signal for the discharge of all white men from employment by manufacturers and corporations and substitution of Chinese coolies.” (“Coolie” was a derogatory term for Asian laborers adopted from British colonial culture).

Nowhere did the arrival of this lie cause more mayhem and misery than in Colorado, where news of the Garfield letter set the match to an explosive anti-Chinese climate stoked for months by the local Democratic Party-aligned press. News of the letter’s claim was being flogged in Denver papers within a day of its publication in New York City, followed within days by photo-lithographic printing plates shipped by train that brought the photographic proof to Denver whites. Soon enough, on Sunday, October 31, a barroom assault on a handful of Chinese pool players erupted into a racial pogrom against the city’s Chinese population. Dozens of Chinese homes and businesses were burned, scores of Chinese immigrants were badly beaten, and 28-year-old Lu Yang (Look Young) was dead.

This devastating “October Surprise” was rendered all the more potent by Garfield’s five-day delay in issuing an official denial. He privately assured Republican Party leaders that the letter was “a base forgery,” but, refusing their increasingly desperate pleas, told them that he “hoped to answer all my accusers by silence.” In accordance with the contemporary norm that it was unseemly for candidates to campaign for themselves, Garfield would agree only to have a surrogate, Republican National Committee chairman Marshall Jewell, denounce the letter as a forgery. There was more to Garfield’s delay than propriety, however. Without yet having seen a photograph of the letter, the candidate wasn’t entirely sure that he hadn’t written it, or rather that a member of his staff hadn’t perhaps done so and signed it on Garfield’s behalf, as was sometimes the practice with minor correspondence. Without sharing his uncertainties with his party leadership, Garfield, away from Washington, DC, quietly sent his secretary “to search our files which had been carefully indexed to see if they contained any such letter.”

In the meantime, the “Chinese letter” scandal metastasized, feeding on the uncertainty created by Garfield’s silence. Republicans responded first with moral outrage. “That there has been a most deliberate conspiracy, carried out in all its parts with foresight, with malign and infamous intent to destroy the name of James A. Garfield,” thundered celebrity preacher Henry Ward Beecher from his Brooklyn pulpit, denouncing the unseen wirepullers “who undertook, by lies, by forgery … to blight a fair fame,” and predicting that “the people [will] be the voice of God, come to judge such” men.

In the end, James Garfield won the 1880 presidential election, if just barely. The “Chinese letter” hoax seems to have cost him California and Nevada, and resulted in the slimmest popular vote margin in U.S. history (two thousand ballots out of nine million). While the hoax failed in its immediate aim of winning the White House for Democrats in 1880, it arguably contributed more to a successful and tragically consequential sleight-of-hand: convincing white workers to focus on nonwhite immigrants as the greatest threat to their prosperity rather than the white businessmen who set wages, hours, and working conditions.

Forty years before the “Chinese letter” hoax rocked the 1880 presidential election, a hoax involving the 1840 U.S. Census used the nascent authority of the new science of statistics to promulgate false evidence that the mental health of African Americans collapsed outside of slavery. Secretary of State John C. Calhoun, the infamous advocate of slavery responsible for the census, argued that emancipation “would indeed, to [the enslaved], be a curse rather than a blessing.” Calhoun deployed convenient census errors to inhibit abolitionist efforts to stop the spread of slavery to new U.S. states. The false conclusions drawn from the 1840 census became the first major dataset in what would become the massive edifice of American scientific racism that propped up U.S. white supremacy into the second half of the 20th century.

“Who would believe without the fact black and white before his eyes,” marveled a letter in the New York Observer, that “there is an awful prevalence of idiocy and insanity among the free blacks [and] … slaves?” Startling as it was, this conclusion was “obvious,” the writer explained, “from the following schedule,” referring to columns of data reproduced from the 1840 U.S. Census. This letter and its accompanying excerpt from the census were themselves quickly reproduced without analysis or comment in the American Journal of Insanity and other medical journals around the country, perpetuating the “fact” that, as one appalled white Northerner observed, “lunacy was … about eleven times more frequent for the African in freedom as in slavery” and that “more strange than this,” the mental health of free African Americans worsened still further the farther north from the Slave South they lived. The unexpected conclusion that freedom was unhealthy for “Africans” delighted slavery’s defenders and confounded their opponents in the antislavery movement. The conclusion was seemingly irrefutable, however, bearing as it did the authority of both the federal government and the new science of statistics.

Calhoun’s longtime adversary John Quincy Adams — the 77-year-old former president (upon whom Calhoun had been disagreeably foisted as vice president 20 years before), then a Massachusetts congressman, and, three years earlier, defender of the Amistad slave ship rebels before the U.S. Supreme Court — was leading a call for the results of the 1840 census to be publicly retracted. According to Adams, some Massachusetts country doctor had reportedly “discovered that the whole of the [census] statements in reference to the disorders of the colored race were a mere mass of error, and totally unworthy of credit,” rendering the 1840 census “worthless, at best utterly botched and at worst maliciously falsified.” Adams would later claim that he had already convinced Calhoun’s predecessor of the falseness of the census data a week before the man’s unfortunate and dramatic demise. Calhoun now found it politically impossible to completely ignore Adams’ repeated accusations in Congress that “atrocious misrepresentations had been made” by the census, of which existed “such proof as no man would be able to contradict,” and that the nation had, thanks to Calhoun, been “placed in a condition very short of war with Great Britain as well as Mexico on the foundation of these errors.” Adams demanded that the secretary of state reveal “whether any gross errors have been discovered in the printed Sixth Census … and if so, how those errors originated, what they are, and what, if any, measures have been taken to rectify them.”

Calhoun agreed to “give the subject a thorough and impartial investigation.” Adams savored watching Calhoun “writhe … like a trodden rattlesnake on the exposure of his false” assurances regarding the accuracy of the census and grumble that “there were so many errors they balanced one another, and led to the same conclusion as if they were correct,” imagining (naïvely, it turned out) that the exposure of the errors would end the spread of the census’ false conclusions.

Ultimately, while Congress conceded that “in nearly every department of the late census errors have crept in, which go very far to destroy confidence in the accuracy of its results,” it declined to incur the “great expense” of commencing a new corrective census, and shrugged off the offending inaccuracies, concluding of the 1840 census that “it’s near approximation to the truth is all that can be hoped for.” The false claims about African American intelligence and sanity would stand. It was at this moment that the 1840 U.S. Census data became a species of hoax rather than simply a fiasco.

Indeed, the 1840 census was still a potent enough cultural force over a decade later that it was the only element of American racism that Harriet Beecher Stowe thought merited its own extended appendix in A Key to Uncle Tom’s Cabin (1853), her documentation of the truth behind Uncle Tom’s Cabin, her blockbuster 1852 antislavery novel. “In order to gain capital for the extension of slave territory,” Stowe fumed, “the most important statistical document of the United States has been boldly, grossly, and perseveringly falsified, and stands falsified to this day. Query: If state documents are falsified in support of slavery, what confidence can be placed in any representations that are made upon the subject?”

What did accrue significant public confidence in the United States after the 1840 census, however, was the notion that science could be used to confirm racial inequality and defend racist institutions and laws while evading accusations of racial bias. American culture threw itself into the production of scientific racism with gusto for the next hundred years, justifying everything from slavery and segregation to racist immigration, marriage, citizenship, and sterilization laws.

This excerpt originally appeared in The Great White Hoax: Two Centuries of Selling Racism in America, published by The New Press. Reprinted here with permission.

Slave Hunts as “Normal Policing”

In May 1752 the French minister of the navy, Antoine de Rouillé, wrote to the governor of Saint-Domingue about the new problem of slaves in France. Slaves were “multiplying every day, more and more, in almost all the towns of the kingdom.” The minister’s disquiet followed a controversy that centered on an African man, age 22, whom I shall call Jean, though he also appears under other names (Charles-Auguste and Adonis) in the police archives. He was enslaved to Guy Coustard, a sugar planter in Saint-Domingue. Jean had the Coustard family’s monogram (CO) branded on his left breast.

Documents about Jean’s brief sojourn in France come from two slender files at the Bastille Archives, which contain letters to the lieutenant-general of police from the minister of the navy and from Jean’s would-be benefactor, the Dowager Princess of Nassau-Siegen, born Charlotte de Mailly de Nesle, who tried and failed to protect Jean from Coustard. Her staff and Coustard lodged in the same hotel, near the Luxembourg Palace. Through her servants, she learned of Jean’s physical abuse and despair.

From Mailly de Nesle we learn that Jean arrived in Paris during the spring of 1751 and fled from the city twice. On both occasions he tried to escape by joining the army. In March 1752 the French constabulary arrested him in Sedan, a frontier garrison town, and escorted him back to Paris in chains. He wound up in the dungeon of For l’Évêque, a former ecclesiastical prison. Many of the other inmates at that time were soldiers. Unlike Jean, who had hoped to become free by joining the army, those men were draftees, who had sought freedom from the army through desertion. On April 8, someone other than Coustard claimed Jean from prison. Port records in La Rochelle note that a slave named Jean sailed for Saint-Domingue in July.

The capture and imprisonment of Jean resulted from an order of the king, popularly known as a lettre de cachet. Masters paid a fee to police for these roundups and paid for the maintenance of their slaves in prison. In March 1752, Jean-Jacques Coustard, an elderly Parisian judge, lobbied the Crown to arrest Jean by royal writ. The judge did not own slaves himself and had probably never set foot in the colonies. He came from a clan of Angevin drapers who bought their way into the Paris legal establishment in the 17th century. The Paris Coustards abandoned trade for the law, to become a judging dynasty, just as a more intrepid, piratical sprig of the family settled in Saint-Domingue. The judge and Guy Coustard, Jean’s master, were cousins, not brothers. The capture of Jean resulted from the maneuvering of Crown officials to oblige both a sugar magnate and a member of the city’s judicial elite.

Jean’s failed bid for liberty offers a glimpse of how elusive freedom became for many slaves in Paris after the mid-18th century. His removal from the army and deportation back to Saint-Domingue resulted from new policing practices that crystallized around the time of his brief stay in France. Despite fleeing Paris, Jean became one of the first victims of an emerging system, based in France’s capital, by which slave owners, or their proxies, caused freedom-seeking domestics to disappear.

The rising importance of the slave trade, and of colonial slave plantations, to Parisian social and economic life led the city’s elites to adopt a new attitude toward people of African and South Asian descent, whom they increasingly viewed as potentially saleable belongings. Resplendent sojourners from Saint-Domingue played a role in diffusing new racial concepts in Paris, but their influence should not be overstated. Ideas of race did not waft into the capital as a foreign essence. By 1750, slave plantations and the slave trade out of East and West Africa had become economically vital to Parisian institutions, including the Company of the Indies, which enjoyed direct support from the Crown and strong ties to Parisian high finance. There was nothing distantly managerial about the activities of Paris-based officials in the Africa trade. Consider this document from 1750, written one year before Jean arrived in Paris. Signed by all directors of the Company of the Indies, it sets forth a new scale of value for slave sales in Senegal.

RÉGULATION DES NOIRS, NÉGRESSES, NÉGRILLONS ET NÉGRITTES [Regulation of negroes, negresses, boys, and girls]

21. Every negro between 14 and 40 will be reputed as one Indian piece so long as he has none of the defects indicated below.

22. One négrillon (boy) of 14 equals one Indian piece.

23. Four négrillons (boys) or négrittes (girls) from the age of 8 to 13 equal three Indian pieces.

24. Six négrillons (boys) or négrittes (girls) from the age of 4 to the age of 8 equal three Indian pieces.

25. Four négrillons (boys) or négrittes (girls) who are 4 years of age or younger equal one Indian piece so long as they are not nursing.

26. One negress who is between 14 and 35 years of age equals one Indian piece.

27. One negress who is between 13 and 14 years of age equals one Indian piece.

28. Men between 40 and 50 years of age, and women between 35 and 40 years of age, equal one-half Indian piece and cannot compose more than 3 percent of the cargo.

29. All nursing children will follow their mothers and not be counted.

30. All negroes, negresses, négrillons (boys), and négrittes (girls) will be considered valid Indian pieces so long as they are not epileptic, maimed, blind, or suffering from formal disease.

31. Some missing teeth, and negroes with enlarged testicles who do not have hernias, cannot be refused by captains and surgeons, or excepted from the above regulation.

32. Negroes with one bad eye who are not over 30 years, others of the same age who are missing one or two fingers, however robust their bodies, will only be counted as one-half an Indian piece.

33. A negro who is lacking two toes will be estimated as two-thirds of a piece; a negress in the same case will be evaluated similarly; and négrillons (boys) and négrittes (girls) by the same proportion.

To pin down the novelty of this document requires that we identify what is not new. At direct points of sale among slave buyers in Africa or the Americas, this meticulously commodified view of the human body was familiar. It was normal for company agents to haggle over people with missing toes and enlarged testicles. There is also nothing new about the term pièce d’Inde (Indian piece), from the Portuguese peça das Indias, which originally referred to the value of a piece of cloth exchanged for slaves in Africa by 15th-century traders. French merchants began to employ this term in the early 18th century.

What seems new is this bald enactment by Paris-based officials of a common system of meaning that binds together the capital and trading posts in Senegal in which Africans about 30 years old are whole units, Africans about 40 years old are half-units, and nursing babies, the blind, and ailing people literally have no value. This is not merely a blunt statement of adhesion to the language of the slave captain by the city’s most eminent merchants; it is the other way around. It is Paris scripting the dialogue at the point of sale.

Police sources about slaves in Paris might seem worlds away from plantation inventories, or Indies Company contracts, yet they convey the same matter-of-fact view of black people as property. Stakeouts and arrests could not have occurred otherwise. Urban slave hunts, far from chafing against local values, reaffirmed them. The property that officials in Paris were willing to defend changed in step with the kind of property that Parisians believed in. By the mid-century, policemen accepted that property could take the form of people.

Slave hunts brought the ideology of the slave owner into the streets of Paris, raising the question of what neighbors thought. At least for bystanders, the arrest of slaves looked just like regular police raids. The question is not how neighbors reacted to the spectacle of capture so much as how they understood the status of their neighbors’ domestics, whether they reported fugitives to the police, and whether they hid people. It is impossible to venture a single answer to this question. Police files offer many clues to friendship, love, and complicity between Parisians and enslaved people. There were, nonetheless, some residents of the city who described their neighbors’ domestics in the crudest possible terms. In 1751, la Dame Mallecot, the wife of an administrator in Cayenne, sought help from the police with the removal of Esther, an African (Igbo) domestic. Mallecot plotted the woman’s arrest, sent Esther to the home of an elderly neighbor, and left town. The neighbor’s son complained to the lieutenant-general of police. “I beg you sir to order that Mallecot come for her negress, whom I will return. It is her property, she will do with it what she wants.” Esther was “a deposit” (un dépôt) for his neighbor to reclaim.

There did not need to be a slave master in the picture. Police agents presumed black and brown people to be stolen goods even when no one reported them missing. The arrest of a man called Mustapha in 1755 offers a revealing instance of this. Mustapha, newly arrived from Marseille, was doubly jinxed. The police had doubts about the fancy napkins Mustapha was hawking on a bridge, and they were just as suspicious about the provenance of Mustapha himself. He deepened their concern by refusing to answer questions (although he was believed to know French) and spent four weeks in For l’Évêque. “We did not find anything in his pockets indicating to whom he belonged.”

During the reign of Louis XIV, royal officials began to theorize policing as a vast, tentacular cleansing project by an all-knowing state. As Michel Foucault observes, the rise of new policing ideas would change the structure of government as people began to reimagine its purpose. Policing became a boom topic for publishers and Crown officials, especially after the death of Louis XIV in 1715. The end of Louis’s long reign heightened the reforming zeal of police enthusiasts, to inspire dictionaries, treatises, proclamations, and experiments in repression and surveillance. In Paris, the word police encompassed just about everything. It meant ridding the city of moral filth, actual filth, crime and delinquency, crooked houses, illegal workers, badly lighted streets, family embarrassments, and riotous effervescence among the laboring poor. In the service of this billowing project, the lieutenant-general of police in Paris could issue his own royal writs for the arrest of undesirables, who entered dungeons without passing through the courts.

The practical ability of municipal authorities in Paris to police evolved over time. The invention of inspectors in 1708, with an amplified role after 1740, altered the relationship between police and city dwellers. Through their webs of spies and informants, twenty police inspectors maintained an unrelenting, round-the-clock surveillance of lodging houses and rented rooms frequented by étrangers (strangers). The French word étranger, imbued with a sense of danger and suspicion, referred to outsiders in general, including people from elsewhere in France.

Changes to the policing of Paris responded to dearth, social unrest, and an increase in human mobility. Migration expanded both the city, as a physical space, and its population. The new brutal efficacy of police inspectors around the mid-century also came on the heels of war — the War of the Austrian Succession — and should be read in light of that conflict. As Arlette Farge notes, resistance to troop levies, together with mass desertion, spurred social upheaval in Paris. This may help to account for the menacing force of police in Paris after the war in confrontations with strangers and crowds.

Once agents of the Paris police put themselves in the service of slave owners, it became perilous for fugitives to hide in the city. Jean needed to escape from Paris and not into it. Enslaved domestics who accompanied masters to Paris in the 1740s tended to disappear after a couple of weeks.

Admiralty records provide numerous examples of flight by teenage Africans between 1742 and 1747. The police did not catch these people and there is no evidence they tried to. (They may have been focusing on deserters.) On the rare, documented occasions before 1750 when masters sought help from the police to recover enslaved domestics, nothing happened. In 1742 Anne-Marie-Josephe de Sorel, from Léogane, reported the flight of her slave Pierrot to the Admiralty. To find the boy, she summoned “Sir Genesty, exempt, and she charged him with conducting searches for the said negro, which he assures her of having done for several days and nights” to no effect. In August 1749 a Parisian solicitor reported the flight of his slave Jeanne, who remained at large despite “investigations and house searches that her master caused to be done” — which suggests another failed police hunt.

Masters in the 1750s who appealed to the police framed their demands by emphasizing the moral threat posed by escapees. At the time, the police and most of French society viewed the whole serving class as degenerate scoundrels. Through their depiction of runaways as urban contaminants, masters recast slave hunts as normal policing. In 1751 the Portuguese bishop of Noronha, governor of São Tomé, reported the flight of Figueret, “about 4 foot 3, black, dressed in black, in a curly wig gathered at the back, age 16 or 17, from Goa in the Indies.” Figueret was known to be spending his days at the Saint-Germain fair. Noronha explained that the boy “who belonged to him, has been extremely deranged for five or six months, since arriving in Paris, and it being important to oversee his conduct, to prevent him from committing some disorder, he would be very grateful for him to be put in the prison of For l’Évêque until he departs Paris for the Orient.” When informing the police about the flight of his slave, Louis Aubin, the Chevalier de Nolivos noted “how much pleasure (his arrest) would give me, because, independent of the real loss caused by this domestic, he swindled me.” Masters in the 1750s emphasized the resemblance between runaways and other delinquents. They did so to enable the extrajudicial arrest of people they regarded as valuable assets.

Excerpted from Slaves in Paris: Hidden Lives and Fugitive Histories by Miranda Spieler, published by Harvard University Press. Copyright © 2025 by the President and Fellows of Harvard College. All rights reserved.

Irrelevant at Best, or Else Complicit

It was not an optimistic time. In the United States, President John F. Kennedy and civil rights activist Medgar Evers had been shot dead in 1963, Malcolm X in 1965, and Dr. Martin Luther King Jr. and Robert F. Kennedy in 1968. Bodies piled up, too, in Vietnam. The year 1968 had brought a global surge of energy and solidarity: the growth of social movements, of struggles against dictatorships and authoritarian rule, of resistance even in the face of violent repression. But 1969 saw a massive global let-down. Coalitional hopes sagged nearly worldwide, replaced by feelings of chaos, dread, and hopelessness.

“Design,” whatever that might be, no longer looked to anyone like the answer to any of the world’s problems. At the 1969 International Design Conference in Aspen (IDCA) — the same conference that in 1961 had been themed “Man / Problem Solver,” that had emphasized the designer’s “great social responsibility” to help build “a new society with new institutions,” that had celebrated design’s capacity to “‘blast off’ for richer worlds” — the atmosphere had turned somber. The 1969 conference was titled “The Rest of Our Lives.” The industrial designer George Nelson bemoaned, in his conference talk, the difficulty of escape from “the perverted offspring of the American dream” — the dream itself having been brought about, Nelson said, in part by blind faith in technology. The conference’s overall mood, one commentator observed later, reflected “the despair the participants felt at the crumbling of American ideals.” 

The 1970 conference, titled “Environment by Design,” was even darker. Three days in, the American architect Carl Koch declared from the podium that “Our national leadership is unspeakable. The government’s sense of priorities is criminally askew. Our cities are rotting visibly before our eyes.” By a few days later, the program of organized talks had disintegrated. 

People gathered ad hoc in the conference tent to connect with one another and express ideas about the current crisis. A group of French participants read a screed against design itself, written for the occasion by Jean Baudrillard. Baudrillard’s statement lambasted the conference’s environmentalist theme as disingenuous (“Nothing better than a touch of ecology and catastrophe to unite the social classes”), even as it acknowledged, “The real problem is far beyond Aspen — it is the entire theory of Design and Environment itself, which constitutes a generalized Utopia; Utopia produced by a Capitalist system.” (Utopia, here, seems to imply the most self-delusional kind of fantasy.) 

The final hours of the conference, IDCA president Eliot Noyes wrote afterward, underlined “the relative irrelevance of the design subject in the minds of many who were attending.” At the subsequent board meeting, Noyes resigned as president, and the board resolved to search for a radically new form for the 1971 conference, if the conference were to be held again at all. Both the conferees and the board, Noyes reflected, now harbored “serious doubt as to whether at this moment in our national history and our state of emotional disrepair a conference on design can or should be held at all.” Focusing on design seemed irrelevant at best, or else complicit, deplorable, malign.

The whole concept of design was also under attack from those outside design’s professional bounds. In 1971, the German philosopher Wolfgang Fritz Haug published Kritik der Warenästhetik (later translated into English as A Critique of Commodity Aesthetics), a Marxist-cum-Freudian manifesto that described designers as the “handmaidens” of capitalism. Design, Haug contended, was an engine of the appetite-generating “illusion industry” of media and advertising, as well as of the broader consumer capitalist system behind them, all of which were organized around driving consumption and thereby producing profits. 

Haug, like the Frankfurt School before him, charged the modern culture industries and the commodities they produced with the manipulation of human beings. But Haug added a meaningful nuance to Theodor Adorno and Max Horkheimer’s thesis: he showed that manipulating people was only possible because design and its peer disciplines colluded with those people’s pursuit of self-interest, which was continuous, intelligent, and fully intentional. Even “manipulative phenomena” like design, as Haug put it elsewhere, still spoke “the language of real needs.” 

So what to make of design? Was it a necessary evil, or a poison to be eradicated? Neither: it was that poison’s dangerously sweet taste. Or, to use Haug’s own metaphor, design was like the Red Cross in wartime. “It tends some wounds, but not the worst, inflicted by capitalism,” Haug wrote. “Its function is cosmetic, and thus prolongs the life of capitalism by making it occasionally somewhat more attractive and by boosting morale, just as the Red Cross prolongs war. Thus design, by its particular artifice, supports the general disfigurement.”

1971 was also the year the Austrian American designer Victor Papanek published Design for the Real World. It has since become one of the most widely read design books in history; it has been published all over the world, has been translated into over twenty languages, and (as of 2024) has never fallen out of print. It’s a manifesto against what design had become. And it’s a passionate brief for what Papanek believed design could be.

 

As of 1971, Victor Papanek was dean of the newly formed School of Design at the California Institute of the Arts (CalArts). And he had begun to develop his own methodology for a design practice focused, he believed, on solving for real human beings’ real needs. 

Papanek preached design’s “unique capacity for addressing human issues,” as he put it in the magazine Industrial Design, and its “value beyond the purely commercial imperative.” His philosophy of “DESIGN FOR THE NEEDS OF MAN” was a set of seven “main areas for creative attack”:

1. Design for Backward and Underdeveloped Areas of the World.

2. Design for Poverty Areas such as: Northern Big City Ghettos & Slums, White Southern Appalachia, Indian Reservations in the Southwest and Migratory Farm Workers.

3. Design for Medicine, Surgery, Dentistry, Psychiatry & Hospitals.

4. Design for Scientific Research and Biological Work.

5. Design of Teaching, Training and Exercising Devices for the Disabled, the Retarded, the Handicapped and the Subnormal, the Disadvantaged.

6. Design for Non-Terran and Deep Space Environments, Design for Sub-Oceanic Environments.

7. Design for “Breakthrough,” through new concepts.

That designers should organize their work around addressing human beings’ real-world needs, however clumsily taxonomized—rather than around aesthetics, or function, or the profit imperative—was the message of Design for the Real World. First published in Swedish in 1970, it found global success when published in English in 1971, taking its place among other leftist English-language jeremiads of the time: Jane Jacobs’s The Death and Life of Great American Cities (1961), Rachel Carson’s Silent Spring (1962), James Baldwin’s The Fire Next Time (1963), Kate Millett’s Sexual Politics (1970), E. F. Schumacher’s Small Is Beautiful: Economics as if People Mattered (1973). 

Papanek’s book attributes a lot of agency to design: “In an age of mass production when everything must be planned and designed,” he writes, “design has become the most powerful tool with which man shapes his tools and environments (and, by extension, society and himself).” But the book doesn’t celebrate that agency. Instead, it charges designers, and the broader economies of production within which they operate, with wasting and abusing their power.

Take the process of creating and distributing a new secretarial chair. In a “market-oriented, profit-directed system such as that in the United States,” such a new chair almost invariably “is designed because a furniture manufacturer feels that there may be a profit in putting a new chair on the market,” rather than because there is any empirical evidence that a particular population’s sitting needs are not being met. The design team is simply “told that a new chair is needed, and what particular price structure it should fit into.” The team may consult resources in ergonomics or human factors, but inevitably they will find that the information available about their potential “users” is sorely lacking. So they design another generic chair, made neither to fit a specific population nor to solve a new problem. After some perfunctory testing, the chair hits the market, where, invariably, someone other than the secretary decides whether to buy it for her use. Some money is made. No one’s life improves. But the manufacturer is satisfied: “If it sells, swell.”

Young man paints the back of a wooden chair, by Arnold Eagle, c. 1940. [The J. Paul Getty Museum]

What should designers do instead? “A great deal of research,” Papanek replied. Designers should ask “big,” “transnational” questions: “What is an ideal human social system? … What are optimal conditions for human society on earth?” They should inquire into their potential users’ “living patterns, sexual mores, world mobility, codes of behavior, primitive and sophisticated religions and philosophies, and much more.” And they should learn about other cultures’ ways of prioritizing and addressing needs. They should undertake “in-depth study” of such “diverse social organizations” as the “American Plains Indians, the Mundugumor of the Lower Sepik River basin; the priest-cultures of the Inca, Maya, Toltec, and Aztec; the Pueblo cultures of the Hopi; the social structuring surrounding the priest-goddess in Crete; the mountain-dwelling Arapesh; child care in Periclean Greece; Samoa of the late 19th century, Nazi Germany, and modern-day Sweden”; et cetera, et cetera.

Papanek’s commitment to identifying needs by learning about the lives of specific users—largely those from non-Western cultures—might be called an “ethnographic” impulse: a drive to study groups of people (usually groups other than one’s own) and to document their cultures, customs, habits, and differences from an assumed norm. The ethnographic impulse played out not only in Papanek’s blockbuster book but also in his self-curation and self-presentation. He built a personal library, his biographer notes, containing hundreds of volumes of anthropological research and writing. Beginning in the 1960s, Papanek invited reporters into his home to photograph or draw him and his wife (whoever she was at the time) and their decor: Navajo weavings, Buddhist figures, Inuit masks and ritual artifacts, Balinese masks, other objects of vernacular culture.

Papanek also endeavored, through this period, to document his alleged ethnographic capital as a set of professional credentials. In the “biographical data” sheet (something like a curriculum vitae) that he presented to CalArts in 1970, Papanek wrote that he 

a. had traveled widely throughout Europe, Thailand, Bali, Java, Cambodia, Japan, etc.

b. spent nearly 6 months (with the Governor’s permission) living in a Hopi Indian pueblo

c. spent several months with an Alaskan Eskimo tribe and nearly five years in Canada

d. spent part of 5 summers in an art-and-craft centered milieu in the Southern Appalachians

e. received various grants that took me to Lapland, Sweden and Finland during the summer of 1966; Finland and Russia during the summer of 1967; and will take me to Russia, Finland, Sweden, and Norway during the summer of 1968

His biographer calls several of these items—particularly those suggesting that Papanek had carried out fieldwork with Hopi and Alaskan Eskimo tribes—“fallacious.” But that didn’t stop Papanek from repeating them across documents and forums. 

Excerpt adapted from The Invention of Design: A Twentieth-Century History by Maggie Gram. Copyright © 2025 by Maggie Gram. Available from Basic Books, an imprint of Hachette Book Group, Inc.

An Attempt to Defeat Constitutional Order

Conservatives in South Carolina first attempted to defeat the state’s new post-Civil War constitution by appealing to the federal government they had fought three years prior. A petition was submitted to Congress, describing the new constitution as “the work of Northern adventurers, Southern renegades, and ignorant negroes” and claiming that “not one percentum of the white population of the State approves it, and not two percentum of the negroes who voted for its adoption know any more than a dog, horse, or cat, what his act of voting implied.” Conservatives complained that “there seems to be a studied desire throughout all the provisions of this most infamous Constitution, to degrade the white race and elevate the black race, to force upon us social as well as political equality, and bring about an amalgamation of the races.” They ended the petition with a warning: “The white people of our State will never quietly submit to negro rule.”

Congress refused conservative entreaties. But conservatives persisted in their fight. To prevent Black people and Republicans from prevailing in the first elections under the new constitution, many turned to coercion, intimidation, and violence. In testimony before a state legislative committee investigating a disputed election, one South Carolinian said that employers had resolved not to employ any man who voted Republican. This was a shrewd strategy, as many former slaves still relied on contracts with their former masters to earn a living. Slaveholders had exploited Black labor to build their wealth, and then used that wealth to build white political power.

Conservatives also used the legal system. One former slave was arrested and held without trial. Authorities released him when he agreed to vote Democrat. Sometimes, conservatives resorted to even more direct methods. In the spring of 1868, the Ku Klux Klan appeared in South Carolina for the first time and worked to win the 1868 election for conservatives. After years of being denied a voice in the political process, Richard Johnson was excited to vote. But the night before the election, “the Ku Klux came through our plantation, and said if any of the colored people went to the polls the next day to vote, that they would kill the last one of them.” Some Black men on the plantation were so determined to vote that they still turned up at the polls. But several decided not to vote at the last minute because “the Democrats had liquor at the box upstairs and were drinking and going on in such a manner that the colored people were afraid to go up.” Eli Moragne was among those determined to vote. The day before the election, the Klan broke into his home, dragged him outside, stripped him naked, and whipped him. He showed up despite the experience but was told that if he “voted the Radical ticket [he] would vote it over a dead body.” Armed white men stood between him and the ballot box.

Union Republican Ticket for Constitution, 1868. [University of South Carolina]

Sometimes, Democrats engaged in violence without bothering to wear their Klan robes. William Tolbert, a Democrat who helped murder a Black Republican, observed that “committees were appointed, which met in secret, and they appointed men to patrol in each different neighborhood.” This was done “to find out where the negroes were holding Union leagues.” They had instructions to “break them up, kill the leaders, fire into them, and kill the leaders if they could.” Committees were supposed to take ballots from Republicans and kill those who resisted. Republicans evidently did resist: Tolbert described a scene in which one Republican had been shot dead and others had fled. The violence was effective. At one precinct, Tolbert would ordinarily have expected between four and five hundred Black men to vote, but Democratic committee members in the area only allowed two Black men to vote before they started shooting. There were similar drops in Black turnout across the state. For example, in Abbeville County, around 4,200 Black men were registered voters, but only 800 actually voted in 1868’s fall elections.

Republicans won the governorship and control of the legislature. But Democrats and conservatives saw that violence could be effective. 

Carte-de-visite of Republican members of the South Carolina State Legislature, 1868. [Wikimedia Commons]

State authorities did try to respond. Amid Klan violence sweeping the state, Governor Robert Scott signed a bill authorizing a state militia. However, most whites refused to serve, a trend that became especially pronounced when Governor Scott rejected all-white militia companies offered by former rebels. In the end, as many as 100,000 men, mostly Black, joined by the fall of 1870. They often wielded state-of-the-art weapons such as Winchester rifles. White newspapers spread conspiracy theories about the militia. For example, after describing the militia sent to Edgefield as “the Corps d’Afrique,” the Charleston Daily Courier claimed that it had come to the town to commence “the arrest of citizens on trumped up charges of being ‘rebel bushwackers,’” and “‘members of the Ku Klux Klan.’” It then suggested that the militia had tortured an innocent white man into admitting that he was a “bushwacker.” Two things about the Black militia units appear to have particularly offended whites. First, the units inspired pride among Black people. The paper complained that when a Black militia unit went to Edgefield, “the negroes of Edgefield became exceedingly jubilant, and determined to congratulate the colored soldiers on their great victory.” Second, the militia gave Black men another economic option besides relying on their former masters. As the paper lamented, “Among the numerous evils which have resulted to the people of Edgefield from this invasion of the county by the negro militia, has been the desertion of the fields by the negro laborers.”

Violence between Black militia units and white people erupted in Laurens County right after the 1870 election. After a gun discharged during a fight between a police officer and a citizen, a white mob began shooting at militia in the town. Several Black men and a few white men died during the fighting and in the subsequent upheaval. One of them was Wade Perrin, a Black legislator. White men caught up to him, ordered him to dance, sing, pray, and then run away. While he was running, they shot him in the back. Between 2,000 and 2,500 armed white men occupied the town. They had confiscated militia weapons from the armory. Two different stories developed about what had caused the violence. The Daily Phoenix blamed Black people. In the months before the 1870 election, the paper reported, “the white people had been subjected to an organized system of disparagement, abuse, and threats of violence to person and property, which had produced that feverish state of feeling incident to a deep sense of outrage and injustice.” Black people had allegedly become so unruly that “for weeks, whole families had not undressed for bed, so great was the apprehension of midnight negro risings, burnings and butcheries.”

The South Carolina Republican, however, claimed that a white man deliberately attacked a policeman to provoke him into firing, so that white men would have an excuse to shoot. This must have been a premeditated plot because “it was not three minutes after the first shot was fired before a line of white men had formed across the public square … The white men came from every direction, out of the stores, the courthouse, and every other place, and what appears very singular is that every one was fully armed.” After the white men had fired on the militia, the paper reported that “white couriers were dispatched on every road, to rouse the people, so that by night at least one thousand men were scouring the countryside on horseback, and in little squads hunting up Radicals.” The incident attracted national media coverage. The New York Herald observed that “‘The War of the Races’ in South Carolina did not end with the rebellion, but occasionally bursts forth with its wonted fury.”

Governor Scott declared martial law in four South Carolina counties. But he also ordered remaining militia weapons in Laurens County transferred to Columbia. Removing the weapons ensured that the militia couldn’t be a serious fighting force and made the martial law proclamation meaningless. A wave of Klan violence swept the state after Laurens. The violence diminished temporarily in late 1871, though there is disagreement about why. Some have suggested that aggressive federal measures were responsible.

In 1871, the federal government stationed more troops in the state and engaged in a thorough intelligence-gathering operation to learn more about the Klan. Federal legislation authorized President Ulysses S. Grant to use the military to enforce the law and placed congressional elections under federal supervision. What became known as the Ku Klux Klan Act allowed Grant to suspend the writ of habeas corpus when he deemed it necessary. After considerable debate, Grant suspended the writ in nine South Carolina counties on October 17, 1871. Over the next months, federal authorities arrested thousands of men for allegedly participating in the Klan and secured dozens of convictions and guilty pleas. These efforts were enough for one historian to claim that “the limited steps taken by the Federal government were adequate to destroy” the Klan.

Indeed, Klan violence was lower at the end of 1871 and for part of 1872 than it had been earlier. At the time, however, law enforcement officials themselves were skeptical about whether their efforts had been effective. One prosecutor even suggested that “orders were given” from unknown persons to end the violence “for the present” and that the Klan would simply “wait until the storms blew over” to “resume operations.” By the summer of 1872, Klan activity had intensified again, indicating that any benefits from federal intervention were limited.

Left: Jonathan Jasper Wright, 1870. [Wikimedia Commons] Right: William Whipper, c. 1879. [Wikimedia Commons]

Given the immense opposition it faced, South Carolina’s government accomplished a great deal. The state greatly extended educational opportunities. In 1868, 400 schools served only 30,000 students. But by 1876, 2,776 schools served 123,035 students. The state also expanded the University of South Carolina, even providing 124 scholarships to help poor students with tuition.

Perhaps most importantly, South Carolina saw unparalleled Black involvement in politics during Reconstruction. During these years, 315 Black men served in political office. Six served in Congress. Two Black men served as lieutenant governor. South Carolina was a place where a parent could bring to the legislature a son who had experienced chattel slavery just three years previously, point to a majority of the members, and say, “That could be you one day.” The state that was the first to plunge the nation into Civil War because of its commitment to Black slavery was also the first to raise a Black man up to its supreme court. Jonathan Jasper Wright was born in Pennsylvania to free Black parents and managed to save enough money to attend college, a rare feat for both white and Black people in the era. He read law in his spare time while teaching to support himself. Upon passing the bar, he became the first Black lawyer in Pennsylvania. After the Civil War, he came to South Carolina to organize schools for freedmen. Wright had a neatly trimmed beard and mustache, and his somber eyes betrayed a young man in a hurry or a man weighed down with cares, or perhaps both.

Corruption marred all of the progress. In 1870, the Charleston Daily News wrote that “the South Carolina Legislature enjoys the reputation, both at home and abroad, of being one of the most corrupt legislative bodies in existence.” Corruption was so bad, the paper claimed, that “a remark frequently made among the white men in Columbia, Radicals and Democrats, was that two hundred thousand dollars, judiciously circulated among the legislators, would secure the passage of a bill repealing the Emancipation act, and putting all but colored legislators back in slavery.” The paper then asserted that there was an organization known as the “Forty Thieves” pillaging the treasury. The organization allegedly had a captain, three lieutenants, four sergeants, and twenty-eight privates. The group conspired to prevent the legislature from passing any “measure unless money was paid to the members of the organization.”

Although conservatives may have exaggerated corruption, it did plague South Carolina during Reconstruction. After John Patterson won election to the U.S. Senate, authorities arrested him when a legislator said he had voted for Patterson after receiving a bribe. Critics called Patterson “Honest John,” supposedly because he always made good on his promises to pay bribes. The legislature attempted to impeach Governor Scott for his conduct in issuing bonds. At the end of 1871, a Republican newspaper lamented that “1872 finds South Carolina financially in a bad way, with no one to blame but officials of our own party. This is a disagreeable statement to make, but it is the truth.” William Whipper, who had argued for enfranchising women at the 1868 constitutional convention, asserted that Scott had bribed legislators to escape impeachment.

All the corruption caused schisms in the Republican Party. Eventually Whipper, who would himself be accused of corruption, asserted, “It is my duty to dissolve my connection, not with the Republican Party, but with the men, who by dishonesty, demagogism and intrigue have defamed the name of Republicanism, and brought financial ruin upon the State.” Disgruntled Republicans joined the new Union Reform Party along with some Democrats. In the 1870 campaign, the party’s platform was “honesty against dishonesty — cheap, economical government against exorbitant taxation — reduction of public expenses against extravagant expenditure of the people’s money — responsibility of officials for the faithful discharge of their duties against irresponsibility, selfishness and greedy absorption of power.” The Reform Party failed to win the fall elections, though members alleged fraud and intimidation at the polls. Corruption deprived the Republican Party of unity precisely when unity was most needed to overcome the massive resistance it faced.

Some observers even claimed that corruption led to the Klan violence against Black people and Republicans. But whatever else is true about the corruption in the South Carolina Republican Party, it does not explain the attempt to overthrow the constitutional order. We know this because conservatives and Democrats never gave the 1868 constitution or the Republican Party a chance. They schemed to prevent a constitutional convention in the first place, protested to federal authorities, and used terrorism, cold-blooded murder, and economic coercion to prevail in the 1868 general election. The reality is that, given their hostility to Black political advancement, they would have engaged in violence and attempted to defeat the new constitutional order even if every Republican official had been honest and efficient.

Excerpt adapted from Sedition: How America’s Constitutional Order Emerged from Violent Crisis by Marcus Alexander Gadson. Copyright © 2025 by New York University. Published by NYU Press.

Lethal Injection Is Not Based on Science

We know how to euthanize beloved pets — veterinarians do it every day. And we know how physician-assisted suicide works — it is legal in several states. If drugs can be used to humanely end life in these other contexts, why is it so difficult in the death penalty context? The answer is one of the best-kept secrets of the killing state: lethal injection is not based on science. It is based on the illusion of science, the assumption of science. “What we have here is a masquerade,” one lab scientist says. “Something that pretends to be science and pretends to be medicine but isn’t.” Consider first the birth of lethal injection.

In 1976, the Supreme Court gave states the green light to resume executions after a decade of legal wrangling over the constitutionality of the death penalty, and Oklahoma was eager to get started. The only hitch was how to do it. Oklahoma’s electric chair was dilapidated and in need of repair, but more importantly, it was widely viewed as barbaric and inhumane. The state was looking to try something new. A state legislator approached several physicians about the possibility of death by drugs — a lethal injection. They wanted nothing to do with it, but the state’s medical examiner, Dr. Jay Chapman, was game. “To hell with them,” the legislator remembered Chapman saying. “Let’s do this.”

Chapman had no expertise in drugs or executions. As Chapman himself would later say, he was an “expert in dead bodies but not an expert in getting them that way.” Still, he said he would help, and so he did, dictating a drug combination to the legislator during a meeting in the legislator’s office. Chapman first proposed two drugs, then later added a third. Voilà. In 1977, the three-drug protocol that states would use for the next 30 years was born.

The idea was triple toxicity — a megadose of three drugs, any one of which was enough to kill on its own. The first drug, sodium thiopental, would kill by barbiturate overdose, slowing respiration until it stopped entirely. The second drug, pancuronium bromide, would kill by paralyzing the diaphragm, preventing it from drawing air into the lungs. And the third drug, potassium chloride, would kill by triggering cardiac arrest. The effects of the second and third drugs would be excruciatingly painful, so the first drug did double duty by blocking pain as well.

How did Chapman come up with his three-drug combo? “I didn’t do any research,” he later confided in an interview. “I just knew from having been placed under anesthesia myself, what was needed. I wanted to have at least two drugs in doses that would each kill the prisoner, to make sure if one didn’t kill him, the other would.” As to why he added a third drug, Chapman answered, “Why not? … You wanted to make sure the prisoner was dead at the end, so why not add a third drug,” he said, asking: “Why does it matter why I chose it?”

This is how the original three-drug lethal injection protocol came to be: a man working outside his area of expertise and who had done no research just came up with it. “There was no science,” says law professor Deborah Denno, one of the leading experts in the field. “It was basically concocted in an afternoon.” As another lethal injection expert, law professor Ty Alper, put the point, Chapman “gave the matter about as much thought as you might put in developing a protocol for stacking dishes in a dishwasher.” For the careful dish stackers among us, it’s fair to say he gave it less.

But that was good enough for Oklahoma, which adopted the new execution method without subjecting it to a shred of scientific scrutiny. No committee hearings. No expert testimony. No review of clinical, veterinary, or medical literature. The state was embarking upon an entirely new way to kill its prisoners, and did none of the most basic things.

Texas followed Oklahoma’s lead the next day, and then other states did too, carelessly copying a protocol that had been carelessly designed in the first place. “There is scant evidence that ensuing States’ adoption of lethal injection was supported by any additional medical or scientific studies,” a court reviewing the historical record wrote. “Rather, it is this Court’s impression that the various States simply fell in line relying solely on Oklahoma’s protocol.” As Deborah Denno observes, the result was an optical illusion — states touted a “seemingly modern, scientific method of execution” without an iota of science to back it up. Jay Chapman was as surprised as anyone by other states’ adoption of his protocol. “I guess they just blindly followed it,” he later stated, adding, “Not in my wildest flight of fancy would I have ever thought that it would’ve mushroomed into what it did.” “I was young at the time,” he explained. “I had no idea that it would ever amount to anything except for Oklahoma.”

Over time, every death penalty state in the country would adopt Chapman’s three-drug lethal injection protocol — not because they had studied it, but because in the absence of studying it, there was nothing to do but follow the lead of other states. “I didn’t have the knowledge to question the chemicals,” one warden explained, saying that he had “no reason to because other states were doing it.” “It wasn’t a medical decision,” an official from another state explained. “It was based on the other states.”

Sociologists have a name for this, a term of art for fads based on a faulty assumption. They call it a “cascade to a mistaken consensus,” and lethal injection is a textbook example. States had come to a consensus in adopting the three-drug protocol, but it was based on the assumption that other states knew what they were doing. They did not.

 

The fact that the three-drug protocol wasn’t based on science is not to say that science on the drugs didn’t exist. All three drugs were FDA approved, so there were studies and FDA warning labels saying what each drug did. The problem was that none of that science could predict what would happen when the drugs were used in lethal injection. Lethal injection is an “off-label” use of a drug, and although doctors use drugs for off-label purposes all the time, they aren’t trying to kill people, so their off-label use doesn’t come anywhere close to the use of those drugs as poison in lethal injection. Lethal injection uses drugs in amounts that no one has ever prescribed, let alone studied in a research setting. It delivers the entire dose of a drug at once — a practice known as “bolus dosing” — rather than delivering the drug in an IV drip, as is typical for large doses in the clinical setting. And it uses combinations of drugs that are simply unfathomable in the practice of medicine, giving rise to the possibility of “profound physiological derangements” (science-speak for freakishly weird results), as overdoses of different drugs affect the body in different ways.

Who knew what was going to happen when all three of these perversions came together? No one did, and the studies to find out had not even begun. In the biomedical research setting, a baseline showing of scientific support is required for testing on animals, and the three-drug protocol didn’t even meet that threshold. As one lab scientist quipped, “You wouldn’t be able to use this protocol to kill a pig.”

But states weren’t killing pigs. They were killing people, so they forged ahead, undaunted by the unknowns. Yet over time, the executions that followed created data points of their own, and those data points drew scientists. If states would not go to the science, science would come to them.

Granted, the data was thin. In some states, the problem was secrecy. “There is an enormous amount of information from executions (autopsies, toxicology, ECG recordings, EEG recordings, execution logs, and photographs),” one expert explained, “but most of it has been kept secret.” In other states, the problem was poor record-keeping. In still others, it was a state’s decision to stop keeping records altogether. For example, Texas — which conducts more executions per year than any other state — stopped conducting post-execution autopsies in 1989. “We know how they died,” a state spokesperson stated when asked about the reason for the no-autopsy policy.

That said, the raw data that scientists did manage to get was enough to raise serious concerns about the three-drug protocol. State officials were making “scientifically unsupportable” claims about lethal injection, researchers stated, so they decided to look at the data to see what it showed. In 2005 and 2007, researchers published two peer-reviewed studies on lethal injection, the first major studies of their kind.

In the first study, researchers obtained toxicology reports from forty-nine executions in Arizona, Georgia, North Carolina, and South Carolina. (Texas and Virginia, the two states with the most executions in the country at the time, refused to share their data.) Because they had no other way to determine whether prisoners were anesthetized when they were injected with the second and third drugs, researchers measured the postmortem amounts of sodium thiopental (the first drug) in the blood, finding that most prisoners had amounts lower than what was necessary for anesthesia, and some had only trace amounts in their system.

“Extrapolation of ante-mortem depth of anesthesia from post-mortem thiopental concentrations is admittedly problematic,” the researchers conceded. Still, the wide range of sodium thiopental amounts in prisoners’ blood suggested gross disparities during their executions as well. “It is possible that some of these inmates were fully aware during their executions,” the researchers stated, but their conclusion was more modest: “We certainly cannot conclude that these inmates were unconscious and insensate.”

Vigorous debate ensued. “You can’t take these post-mortem drug levels at face value,” one forensic pathologist stated, explaining that the amount of a drug in the blood dissipates after death, just as it does in life, and most autopsies in the study were conducted around twelve hours after death, so the postmortem measurements didn’t say much about the sodium thiopental in a prisoner’s blood during the execution. The study’s authors shot back with point-by-point responses to the criticism, but the damage was done. The so-called “Lancet study,” named for its publication in one of the most prestigious medical journals in the world, would forever be tainted by skepticism.

Had the first study been the only study of the three-drug protocol, one might have said that the science was inconclusive. But a second study was published two years later, and its findings were far less subject to dispute. In the second study, researchers examined execution logs in California. California’s expert had testified that the effects of sodium thiopental were well understood. Within sixty seconds of receiving the overdose, “over 99.999999999999% of the population would be unconscious,” the state’s expert stated, and “virtually all persons [would] stop breathing within a minute.” But when researchers examined the logs from California’s eleven executions by lethal injection, they found that this was not the case. In six of the eleven cases — 54% — the logs showed that the prisoner “continued to breathe for up to nine minutes after thiopental was injected.”

This was alarming not only because it showed that the state’s expert was wrong, but also because it suggested that the prisoners had died torturous deaths. In the absence of a trained professional assessing anesthetic depth, the cessation of breathing provides a rough proxy for adequate anesthesia. Thus, the fact that over half the prisoners continued breathing was an ominous sign that they had not been fully anesthetized prior to injection of the drugs that would cause slow suffocation and cardiac arrest. Executioners had recorded prisoners’ vital signs, but had not understood what they meant.

California’s execution logs revealed another problem as well: the same six prisoners who continued to breathe did not go into cardiac arrest after injection of the third drug, potassium chloride, which the state’s expert had said would kill within two minutes. Given the massive dose of potassium chloride, how could this possibly be? The answer was one of the “profound physiological derangements” that no one saw coming, at least not until researchers documented it: the bolus dose of sodium thiopental had depressed circulation so dramatically that it blunted the bolus dose of potassium chloride. Prisoners’ hearts raced in response to the potassium chloride, but not enough to induce cardiac arrest, leaving them to die by slow suffocation from the paralytic instead.

The findings from California’s execution logs led a federal court to invalidate the state’s lethal injection protocol in 2006. “The evidence is more than adequate to establish a constitutional violation,” the court stated, noting that it was “impossible to determine with any degree of certainty whether one or more inmates may have been conscious during previous executions or whether there is any reasonable assurance going forward that a given inmate will be adequately anesthetized.” The governor has since declared a moratorium on executions in the state, and it remains in place today.

Looking back, it’s fair to say that for the first 30 years of lethal injection, states used a three-drug protocol without understanding how it actually worked. State experts made claims and stated them with confidence, but what they said didn’t turn out to be true. Sodium thiopental didn’t do what states said it would do, and potassium chloride didn’t do what states said either — largely because no one accounted for the possibility that a bolus dose of the first drug would blunt the bolus dose of the third. States had no idea what their toxic drug combinations would actually do. They were slowly suffocating prisoners to death, and they didn’t have a clue.

Excerpt adapted from Secrets of the Killing State: The Untold Story of Lethal Injection by Corinna Barrett Lain. Copyright © 2025 by New York University. Published by NYU Press.

Are You Not Large and Unwieldy Enough Already?

On May 25, 1836, John Quincy Adams addressed the U.S. House of Representatives in an hour-long oration. Eight years earlier, when Adams was still president of the United States, an address of such length by the erudite Harvard graduate would have been unremarkable. But by 1836, Adams was no longer president. He had been defeated for reelection by Andrew Jackson in 1828; left the White House in 1829 without attending his successor’s inauguration; quickly grown restless in retirement as he observed with dismay Jackson’s populist, expansionist, and proslavery policies; and returned to Washington in 1831 as a member of the House. The nominal issue that inspired Adams’ sprawling speech in 1836 was a resolution authorizing the distribution of relief to settlers who had fled their homes in Alabama and Georgia following a series of violent altercations with Indigenous people. Adams used that conflict as an opportunity to embark on a wide-ranging discourse. As a Congressional Globe journalist archly put it, the ex-president addressed the chamber “on the state of the Union.”

Although Adams expounded on numerous subjects, he focused on the most pressing issue of the moment: the rebellion in the Mexican province of Coahuila y Tejas (or, as Americans called the northern part of the province, Texas). Beginning in October 1835, “Texians,” as expatriate American settlers in Texas were known, had revolted against Mexican rule. By April 1836, the Texians had unexpectedly defeated the Mexican force sent to subdue them, achieved a fragile independence, and appealed to the United States for annexation. Jackson plainly favored annexation, and Adams accused numerous House members of “thirsting” to annex Texas as well.

In dire terms, Adams warned against expanding the boundaries of the United States to include Texas. His opposition to annexation may have surprised some of his colleagues in the House. As a U.S. senator from Massachusetts in 1803, he had been the only Federalist to vote in favor of Thomas Jefferson’s acquisition of the Louisiana Territory. In 1818, as secretary of state during the administration of James Monroe, he had defended Andrew Jackson when Jackson, then an army general, had invaded Spanish Florida. In 1821, Adams acquired Florida for the United States from Spain in return for setting the southwestern boundary of the United States at the Sabine River — the border between the modern states of Louisiana and Texas. With that agreement in place, Adams believed that U.S. expansion had gone far enough.

Before the House in 1836, he argued that to extend the already “over-distended dominions” of the United States beyond the Sabine would be an untenable overreach. “Are you not large and unwieldy enough already?” he asked proponents of annexation. “Is your southern and southwestern frontier not sufficiently extensive? Not sufficiently feeble? Not sufficiently defenceless?” Annexation, he predicted, would precipitate a war with Mexico that the United States might well lose. Adams warned that Mexico had “the more recent experience of war” and “the greatest number of veteran warriors.” He reminded the House of ongoing U.S. military stumbles in Florida, where the United States had struggled to establish its control since acquiring the peninsula from Spain: “Is the success of your whole army, and all your veteran generals, and all your militia-calls, and all your mutinous volunteers against a miserable band of 500 or 600 invisible Seminole Indians, in your late campaign, an earnest of the energy and vigor with which you are ready to carry on that far otherwise formidable and complicated war?” Not least of all, he warned that if Mexico were to carry the war into the United States, the invader would find numerous allies among slaves and especially among the Indigenous people whom the United States was in the process of removing to the Indian Territory on the border with Texas. “How far will it spread,” Adams asked, should Mexico invade the United States, “proclaiming emancipation to the slave and revenge to the native Indian”? In such an instance, “Where will be your negroes? Where will be that combined and concentrated mass of Indian tribes, whom, by an inconsiderate policy, you have expelled from their widely distant habitations, to embody them within a small compass on the very borders of Mexico, as if on purpose to give that country a nation of natural allies in their hostilities against you? Sir, you have a Mexican, an Indian, and a negro war upon your hands, and you are plunging yourself into it blindfold.”

Adams’ speech sparked a debate that consumed five hours, causing the House to stay in session long into the evening. That night, Adams, in his inimitably cramped handwriting, recorded the day’s events in his diary. He congratulated himself that he had succeeded in sapping the House’s enthusiasm for annexation. Indeed, Adams and his like-minded colleagues in Congress managed to deter annexation for nine more years.

Ornamental map of the United States and Mexico, 1846. [David Rumsey Historical Map Collection]

In Adams’ view, the United States, which between 1783 and 1836 had expanded its territory northwest into the Great Lakes region, west into the Great Plains, and south to the Gulf of Mexico, had swollen beyond its capacity either to exercise effective sovereignty over border regions or to defend its extended borders against imperial competitors. The U.S. presence in the borderlands, a multilateral and multiethnic region, was tenuous: until the 1840s, Britain dominated the region between the western Great Lakes and Oregon, while Spain and, later, Mexico controlled the region between Texas and California. The success of the Seminoles and their escaped-slave allies in resisting U.S. forces in Florida was hardly exceptional. In the western Great Lakes region, the Ojibwe dominated. The British liberally supported the Ojibwe and other Indigenous nations in the Great Lakes region. In the event of another war with Britain, the natives were likely once again to side with the British, as they had in the War of 1812. As for the horse-mounted natives of the Great Plains such as the Comanches and the Lakota, the United States in 1836 could not even begin to imagine challenging their control of the grasslands. Likewise, the fear that an invasion by a foreign power on the southwestern border might spur a slave revolt was quite real; by promising freedom, the British had encouraged thousands of enslaved people to join them in fighting against the United States in both the Revolutionary War and the War of 1812. In the first decades of the 19th century, numerous slaves fled from Georgia and Louisiana to Florida and New Spain; once in Spanish territory, maroon communities encouraged further flight and, slaveholders feared, rebellion. In short, Adams was entirely correct that in the first decades of the 19th century, the United States maintained a relatively weak presence on its borders where it had to contend with powerful, autonomous native groups, fugitive slaves, and competing imperial powers.

Leaders such as Adams who in the first decades of the 19th century pondered the weaknesses of the United States in its border regions were in many respects confronting a new problem. Before 1800, the most profitable imperial holdings in the Americas were of two types: sugar plantations in the Caribbean and coastal Brazil; and Spain’s silver mines at Potosí in the Andes and the Bajío in Mexico. Almost everywhere else, until the end of the 18th century, the British, French, Spanish, and Portuguese empires in continental North and South America were primarily commercial and tributary rather than territorial. European imperial settlements on the American mainland, with the notable exceptions of the Spanish silver mines and a few other places such as Mexico’s Central Valley, hugged the coastlines. European empires primarily claimed sovereignty over vast interiors of the Americas based on the reciprocal exchange of gifts and tribute with native leaders and by virtue of commerce in animal products and slaves that European merchants carried on with the Indigenous people of continental interiors.

Thus, throughout much of British, French, and Spanish North America, European imperial claims to territory depended on the commercial and diplomatic loyalties of Indigenous people. European military forces occasionally launched punitive expeditions into the interior against natives who resisted these commercial and diplomatic arrangements but rarely managed, or even tried, to establish an enduring military presence. Imperial boundaries, in this scheme, remained only loosely defined.

This system, in which Indigenous people held considerable influence, began to change in the late 18th and early 19th centuries, as European empires shifted away from defining sovereignty in terms of relationships with Indigenous people and toward negotiating imperial boundaries with each other. In 1777, for instance, Spain and Portugal agreed in the First Treaty of San Ildefonso to create a joint boundary commission to survey the border between their South American empires, marginalizing the Indigenous nations who lived in those lands. When the United States and Spain agreed to a border between Georgia and Spanish Florida in 1795, they did not consult with the Seminoles who inhabited the territory. Indigenous people were similarly excluded in 1818, when the United States agreed to a treaty with Britain establishing the northern boundary of the United States and providing for joint Anglo-American occupation of Oregon. They were likewise left out in 1821, when Adams negotiated the agreement with the Spanish minister Luis de Onís that established the border between the United States and New Spain at the Sabine River. All these agreements belonged to a larger European and American effort to sideline Indigenous people and negotiate imperial boundaries among themselves. European- and American-made maps reflected the shift in imperial mentalities: in the 17th and 18th centuries, when imperial claims depended on alliances with Indigenous people, maps of the North American interior abounded with the names of Indigenous nations. By the 19th century, similar maps had erased references to Indigenous nations and showed only empty space.

Yet while European powers and the United States could erase Indigenous nations from their maps, they could not so easily dispense with the necessity of dealing with autonomous and powerful Indigenous nations on the outskirts of their territories. In the first decades of the 19th century, the old, somewhat unpredictable system of imperial sovereignty contingent upon diplomatic and commercial relations with Indigenous people persisted even as the new territorial system based on diplomacy (and sometimes war) between empires was ascending. For example, when the United States achieved its independence from Britain in 1783, it acquired — on paper at least — an extensive territory between the Appalachians and the Mississippi River. In 1783, however, the borders spelled out in treaties remained less meaningful than commercial and diplomatic relations with Indigenous people. While the British formally ceded the trans-Appalachian region to the United States, they maintained for decades merchant outposts in what was nominally U.S. territory. The U.S. explorer Zebulon Pike encountered one such outpost on the Upper Mississippi River in January 1806: a North West Company trading post. Seeing “the flag of Great Britain” over the post, Pike wrote, “I felt indignant.” But there was little he could do to assert U.S. authority.

More than just flying their flag in U.S. territory, the British, through their trade, retained the commercial and diplomatic allegiance of Indigenous people in the new U.S. Northwest Territory. When the United States and Britain went to war six years after Pike stumbled across the British trading post, most of the Indigenous people in the Northwest Territory sided with the British. To the south, the Spanish had seized Florida from Britain during the American Revolution; the Florida peninsula almost immediately became a haven for fugitive slaves from the United States. The Spanish, who also controlled New Orleans, periodically inconvenienced American merchants by closing the mouth of the Mississippi River to commercial travel.

Between 1803 and 1821, the United States acquired both Florida and New Orleans by treaty. The United States thus removed those territories from the control of an imperial competitor but in so doing took on an extensive territory where it struggled to establish its sovereignty. Understanding the early 19th-century United States as weak relative to Indigenous people, escaped slaves, and imperial competitors contradicts both the popular and the scholarly view of the United States in this period. Most historians of what Arthur M. Schlesinger Jr. once called “the age of Jackson” depict U.S. expansion not only as inexorable but as one of the defining characteristics of the period. According to this view, the United States in the first half of the 19th century was like a seething boiler that could barely contain the outward economic and cultural pressures within it: a virulent, racist hatred of Indigenous people; an all-but-insatiable desire for land; a dynamic, profitable, and expanding slave-based plantation system; an explosive market economy; and a self-righteous American missionary Protestantism that saw itself as a reforming beacon to the world.

Pictorial map of the Great West, 1848. [David Rumsey Historical Map Collection]

Expansion was not a national consensus, and the expansionism that Andrew Jackson advocated was always a politically divisive and contested issue. In 1819, by a vote of 107–100, Jackson only narrowly escaped censure in the House of Representatives for his unauthorized attacks against Spanish outposts and British subjects during an invasion of Spanish Florida the previous year; in 1830, Jackson’s Indian Removal Act barely passed the House of Representatives, 101–97; in 1832, an anti-Jackson coalition won a majority of the Senate; and beginning in 1836 and lasting for the next nine years, Adams and his congressional allies successfully deterred Texas annexation. Adams was one of numerous elected leaders — many of them Northeasterners who eventually coalesced into the Whig Party — who advocated strengthening U.S. commerce, manufacturing, and infrastructure within existing U.S. boundaries rather than overstretching U.S. power by sprawling across the continent. Adams understood a reality about the U.S. position in North America that “manifest destiny” obscures: a relatively weak United States found itself engaged with powerful European imperial competitors, and even more powerful Indigenous nations, in a complicated struggle for sovereignty in several regions on its borders. Unable to simply impose its will, the U.S. often reached out into the borderlands through diplomacy or commerce. Manifest destiny was just one of many narrative visions for the borderlands; in the first decades of the 19th century, it was neither the dominant vision nor the most plausible.

Excerpt adapted from The Age of the Borderlands: Indians, Slaves, and the Limits of Manifest Destiny, 1790–1850 by Andrew C. Isenberg. Copyright © 2025 by the University of North Carolina Press.

Mutant Capitalism

In Neal Stephenson’s Snow Crash (1992), a novel that channeled perfectly the libertarian imagination of the post–Cold War moment, the territory once known as the United States has been shattered into privatized spaces: franchise nations, apartheid burbclaves, and franchulets, a world of what I have called “crack-up capitalism.” The threat in the plot is the Raft, a maritime assemblage several miles across: a decommissioned aircraft carrier lashed to an oil tanker and countless container ships, freight carriers, “pleasure craft, sampans, junks, dhows, dinghies, life rafts, houseboats, makeshift structures built on air-filled oil drums and slabs of styrofoam.” The Raft “orbits the Pacific clockwise” bearing a cargo of “Refus” or refugees, welcomed aboard by an entrepreneurial tech evangelist who has just cornered the global fiber optic grid and has schemes to subjugate the population through a computer virus administered as a bitmap narcotic. The Raft’s passengers are dehumanized and anonymized: a mass of insects “dipping its myriad oars into the Pacific, like ant legs” at whose arrival the coastal residents of California live in terror, subscribing to a “twenty-four-hour Raft Report” to know when the “latest contingent of 25,000 starving Eurasians has cut itself loose” to swim ashore.

Stephenson’s descriptions are stomach-turning, indulging in a grotesque racist imagery of nonwhite danger. The Raft was fodder for, as he wrote, “a hundred Hong Kong B-movies and blood-soaked Nipponese comic books.” As the race scientist and former National Review journalist Steve Sailer noted, the Raft also had an obvious antecedent: the “Last Chance Armada” of Jean Raspail’s 1973 novel The Camp of the Saints, first published in French. In that book, a disabled messianic leader from the Calcutta slums loads millions of indigent Indians onto a lashed-together fleet of old ships to travel West “in a welter of dung and debauch.” The novel revels in what one scholar calls “pornographic prose” in its depiction of coprophagy, incest, and pedophilia aboard the armada. The plot ends in an orgy of violence after what the author sees as the suicidal embrace of the armada by the foreigner-friendly French population.

The first English translation of The Camp of the Saints was published by Scribner’s in 1975 to many positive reviews. The cover image showed a single Caucasian hand holding a globe up and away from grasping brown hands, with a catch line reading: “a chilling novel about the end of the white world.” The book returned to public discussion during the first successful presidential campaign of Donald Trump as an alleged inspiration to his advisers Steve Bannon and Stephen Miller, but it was already a common touchstone decades earlier. It was reissued in 1986 by the white supremacist Noontide Press and in 1987 by the American Immigration Control Foundation (AICF), which, along with the Federation for American Immigration Reform (FAIR), helped mainstream anti-immigrant arguments, in part by piggybacking on the mailing lists of right-wing magazines to seed a national movement.

In 1991, John Randolph Club (JRC) founding member Sam Francis described the book as “a kind of science fiction novel” that had become American reality. “The future is now,” he wrote. The vision of the maritime refugee matched the evening news of the early 1990s. There were more than 30,000 interceptions of Haitians at sea in 1992 and nearly 40,000 of Cubans in 1994; in 1993, the Golden Venture ran aground in Rockaway Beach, carrying 300 Chinese would-be migrants. Raspail’s novel “forecasts the recent landing of the Golden Venture,” as one letter to the Washington Times put it in 1993. The Social Contract Press reissue featured a photo of Chinese men wrapped in blankets after disembarking, with the vessel in the background. Introducing the novel, the nativist ideological entrepreneur and FAIR director John Tanton wrote that “the future has arrived,” citing the Golden Venture and other instances of maritime flight that had taken Raspail’s plot “out of a theorist’s realm and transposed it into real life.” “Fiction can be more powerful than fact,” wrote JRC member and American Renaissance founder Jared Taylor in a review of The Camp of the Saints. “The novel,” he wrote, “is a call to all whites to rekindle their sense of race, love of culture, and pride in history for he knows that without them we will disappear.”

The Camp of the Saints had a special place in the paleo imagination. Ahead of the first JRC meeting, the Ludwig von Mises Institute’s Lew Rockwell claimed partial credit for the book’s 1975 circulation in the United States. In his 1993 talk “Decomposing the Nation-State” at the Mont Pelerin Society, Rothbard wrote that he had previously dismissed the novel’s vision, but “as cultural and welfare-state problems have intensified, it became impossible to dismiss Raspail’s concerns any longer.” He referred to his proposal, discussed in the last chapter, of privatizing all land and infrastructure as a solution to the “Camp of the Saints problem.” When the JRC met in Chicago in December 1992, the conference was titled “Bosnia, USA,” and Hans-Hermann Hoppe spoke in the lead-off session, named after The Camp of the Saints.

The year between the first and second meetings of the JRC had been momentous. The Los Angeles riots in April, Buchanan’s run for president, and Rothbard’s proposal of a strategy of right-wing populism made 1992 look like, in the words of author John Ganz, “the year the clock broke.” Another notable event was the publication of an article in National Review by the scheduled keynote speaker at the club: the journalist Peter Brimelow, a naturalized U.S. citizen born in England in 1947. When the article was expanded into a book published by Random House in 1995, with thanks given to Rockwell and Jeffrey Tucker at the Ludwig von Mises Institute (as well as his agent Andrew Wylie), Alien Nation was described as a “non-fiction horror story of a nation that is willfully but blindly pursuing a course of suicide.” Historian Aristide Zolberg writes that the book “marked the ascent to respectability of an explicitly white supremacist position … that had hitherto been confined in the United States to shadowy groups.” Alien Nation came in the immediate wake of the passage of Proposition 187 in California, which blocked access to education and health services for undocumented immigrants, one of the earliest instances of local governments “trying to retake immigration control into their own hands.” “No writer has argued more effectively for this change of policy than Peter Brimelow,” wrote Brimelow’s former colleague at Forbes, David Frum. “No reformer can avoid grappling with [his] formidable work.”

In 1999, Brimelow took his project online — “fortunately the Internet came along,” as he put it later — founding the website VDARE.com, named after Virginia Dare, the first English child born in North America. Serving as what the Washington Post called a “platform for white nationalism,” the website has hosted prominent advocates of scientific racism like Jared Taylor, J. Philippe Rushton, and Steve Sailer as well as alt-right activists Richard Spencer and Jason Kessler.

The website is an amplifier for themes and tropes of the Far Right: a search yields more than 20,000 posts with the term “white genocide,” more than 13,000 with “race realism,” and 6,000 with “Great Replacement.” Brimelow is also close to more mainstream figures in the United States. He was hosted at the home of Larry Kudlow, then-President Donald Trump’s economic adviser, in 2018, while at the same time holding a role at Fox in which he reported directly to Rupert Murdoch. Brimelow became, in effect, Jean Raspail’s spokesperson for the 1990s and 2000s.

 

Where does the resurgence of the Far Right come from? Scholars attempting to explain how apparently fringe political ideologies have moved to center stage since the election of Trump in 2016 have split into two camps. The first locates the origins of the Far Right in culture: racism, chauvinism, xenophobia, the “tribalism” of “white identity politics,” or a longing for “eternity.” As a group, these commentators seem to ignore the admonition from Frankfurt School sociologist Max Horkheimer, repeated so often that it threatens to become a cliché, that “whoever is not willing to talk about capitalism should also keep quiet about fascism.”

Capitalism can be hard to find in this literature. A recent book on “the far right today” does not mention the term once. Three other books on the alt-right and white power movement barely mention it, and a fourth does so only to say that the alt-right is “skeptical of global capitalism.” References to “identity” outnumber “capitalism” at a ratio of several dozen to one. The assumption seems to be that Far Right ideology is either post- or pre-material: it inhabits a space of culture detached from issues of production and distribution. This is startling given that the radical Right’s central issue is nonwhite immigration, an eminently economic issue with a vast specialized literature.

By contrast, the second school of interpretation finds the origins of the Far Right in the spirit of capitalism itself. Rather than a rejection of neoliberalism, these scholars see the Far Right as a mutant form of it, shedding certain features, like a commitment to multilateral trade governance or the virtues of outsourcing, while doubling down on Social Darwinist principles of struggle in the market, translated through hierarchical categories of race, nationality, and gender. Brimelow’s work helps us see how the Far Right understands the nation as both a racial and an economic asset.

Brimelow is described variously as a “white nationalist,” “restrictionist,” or “Alt Right figurehead.” Yet he is almost never described the way he described himself: as a libertarian conservative or even a “libertarian ideologue.” It is rarely, if ever, noted that he was a fixture in the standard networks of neoliberal intellectuals seeking to rebuild the foundations of postwar capitalism. He spoke at a Mont Pelerin Society (MPS) regional meeting in Vancouver in 1983 alongside Margaret Thatcher’s speechwriter and later National Review editor John O’Sullivan. Brimelow’s interviews and lengthier features in Forbes in the late 1980s and 1990s drew almost exclusively from the MPS roster. These included profiles and interviews with Thomas Sowell (twice), Peter Bauer, Milton Friedman (twice for Forbes and twice for Fortune), and Murray Rothbard. His longer features were built around the research of Gordon Tullock, Hayek, Friedman, and MPS member Lawrence White. He wrote a glowing review of Milton and Rose Friedman’s memoirs, recounting Milton’s first trip overseas to the inaugural MPS meeting and praising the couple’s contributions to “the free-market revolution in economics that has overthrown the statist-Keynesian-socialist consensus.”

To describe Brimelow as nativist and white nationalist may be correct, but it threatens to banish his concerns from the domain of the rational and the economic. In fact, he was a typical member of a transnational milieu linking Thatcherite intellectuals, who were taking their own version of a cultural turn around the Institute of Economic Affairs’ Social Affairs Unit, with social scientists like Charles Murray and Richard J. Herrnstein, who concocted theories linking race, intelligence, and economic capacity, as well as with neoconservatives from the United States to Singapore to Japan who were rediscovering the relevance of “Asian values” for capitalist success. For the new fusionists of the free-market Right, the economic was not a pristine space quarantined from matters of biology, culture, tradition, and race. Rather, these thought worlds overlapped and melded with one another.

Brimelow’s first book was not about politics or race. It was called The Wall Street Gurus: How You Can Profit from Investment Newsletters, marketed alongside books like The Warning: The Coming Great Crash in the Stock Market and Wall Street Insiders: How You Can Watch Them and Profit. Like the authors of those newsletters, Brimelow understood investment as simultaneously a strategy for making money and a means of leveraging symbolism and accruing influence. We can understand his turn to whiteness as the outcome of a portfolio analysis. The nation was a safe asset. The pro-white play looked like a payday.

Excerpt adapted from Hayek’s Bastards: Race, Gold, IQ, and the Capitalism of the Far Right by Quinn Slobodian. Copyright © 2025 by Quinn Slobodian. Published by Zone Books.
“The End Is Coming! The End Is Coming!”

If you were a child during the late 1990s, there’s a good chance you either owned Beanie Babies or your parents went crazy over them. Beanies featured simple designs that inspired complex madness. A mixture of polyester, synthetic plush, and plastic pellets, Beanies were ordinary-looking children’s playthings that took the world by storm. Throughout the mid- to late 1990s, they became ubiquitous in American homes and fostered a community of collectors. By the end of the decade, the craze went haywire. In April 1998, a police department near Chicago removed weapons from the streets by offering a buyback program in which people could exchange their guns for Beanie Babies. A few months later, in a foreshadowing of the inanity of contemporary politics, U.S. trade representative Charlene Barshefsky sparked controversy when Customs forced her to turn over a few dozen Beanie Babies she had purchased while traveling to China with President Bill Clinton. “Instead of trying to reduce our $50 billion trade deficit with China,” stated Republican National Committee chairman Jim Nicholson, “our trade representative was scouring the street markets of Beijing grabbing up every illegal, black market ‘Beanie Baby’ she could get her hands on.” Citing “a source close to the White House delegation,” the Washington Post reported that Barshefsky turned over 40 Chinese Beanies.

Beanie Babies came with a red heart-shaped tag with the word “Ty” printed in large white letters. Ty is an homage to Ty Warner, who created the Beanies in 1993. His company, Ty Inc., grew from a modest upstart near Chicago with a handful of employees into a business running a 370,000-square-foot warehouse and generating $1.4 billion in sales by 1998.

Looking at a Beanie Baby would give the impression that it was a child’s toy. But what drove the revenue of Ty Inc. wasn’t parents buying toys for their children to play with. It was adults buying Beanies for themselves and treating them like financial instruments. The CD-ROM Ultimate Collector for Beanie Babies made a splash at the video game industry’s largest trade event in 1999. For $25, consumers received software that helped them organize their collections and track price changes. Ultimate Collector featured tax summaries and insurance reports. “It was no longer a child’s toy,” said one collector when recalling why she accumulated Beanies throughout the late ’90s. “It was the hunt for me.”

Despite selling millions of stuffed animals, the Ty company convinced consumers that Beanies were in short supply. In the late ’90s, every few months a particular Beanie Baby would be “retired,” which drove prices up. Online Beanie reselling became so common that when eBay filed paperwork with the U.S. Securities and Exchange Commission to become a publicly traded company in 1998, it cited the volatility of Beanie Baby sales as a risk factor to the company’s financial health. During the second fiscal quarter in 1998, eBay had “over 30,000 simultaneous auctions listed in its ‘Beanie Babies’ category,” the company stated. “A decline in the popularity of, or demand for, certain collectibles or other items sold through the eBay service could reduce the overall volume of transactions on the eBay service, resulting in reduced revenues. In addition, certain consumer ‘fads’ may temporarily inflate the volume of certain types of items listed on the eBay service, placing a significant strain upon the Company’s infrastructure and transaction capacity.” eBay was correct to be cautious about a fad. Beanie sales accounted for a tenth of its total revenues, but the gravy train would not last.

There was enough interest in this market that Beanie Baby price guides were produced to forecast Beanie resale values the same way investment analysts make stock predictions. At the peak, there were more than a hundred Beanie Baby price guides, often published by collectors who had large investments in Beanie Babies. Unlike an impartial observer, such a collector had a personal interest in hyping prices rather than counseling caution. At the time that Beanies were reselling for thousands of dollars, one price guide predicted they would appreciate another 8,000% over the following decade. Enough people acted on price guide recommendations that, for a brief time, their outrageous predictions became self-fulfilling, until the bubble popped.

Self Portrait with Toys, by Viola Frey, 1981. [Smithsonian American Art Museum]

Retiring Beanies initially drove up resales. Then Ty Inc. tried a more aggressive tactic. “For years, nothing has been hotter than those cuddly little animals with cute little names,” stated CBS Evening News correspondent Anthony Mason during a September 1999 segment. “But abruptly this week, the makers of Beanie Babies, Ty Incorporated, announced over the internet that it was over … By the turn of the century, Beanie Babies will become has beans.” The ushering in of the new millennium would include Y2K and Beanie Baby production stoppages. Beanie Baby web forums were full of apocalyptic posts with titles like “The End Is Coming! The End Is Coming!”

Prices did not rise as Ty hoped. More people came to the realization that stuffed animals that originally sold for $5 should not be reselling for $5,000. That people who banked their life savings on children’s collectibles were playing a fool’s game. That there was no value in investing in a mass-produced product whose entire worth is premised on speculation. Beanie collectors were enamored with old toys that sold for high prices. But old toys become valuable because most get beaten up when children play with them, making a toy in mint condition 30 years after its release a rarity that drives prices up.

Beanies were collected by thousands of adults and stored in glass cases. They were produced in such high volume that the supply outstripped the demand. Finding an old mint-condition Beanie Baby is about as rare as finding a pothole on a city street. 

People who spent thousands of dollars chasing the Beanie fad had closets full of merchandise that was worth less than its original retail price. A soap opera actor who spent $100,000 on Beanies as an investment in his children’s college education saw his investment evaporate. That might sound laughable, but the actor’s son made a haunting documentary about the saga that displayed the humanity driving the craze and its unfortunate consequences. Collectors were left with credit card debt they couldn’t pay off and regret they couldn’t shake.

By 2004, Ty Warner claimed a loss of nearly $40 million on his tax return. In 2014, Warner was convicted of tax evasion. (Warner loathed all taxes, including road tolls. According to Zac Bissonnette, Warner instructed people he was riding with to “just throw pennies and keep driving! It’s an illegal tax!”) Like so many other rich men found guilty of serious crimes, he avoided jail time and remains wealthy. 

Excerpt adapted from 1999: The Year Low Culture Conquered America and Kickstarted Our Bizarre Times, available for preorder from University Press of Kansas. Copyright © 2025 by Ross Benes.
Telling Chestnut Stories

At one time, more than 4 billion American chestnut trees spread from southern Canada all the way to Mississippi and Alabama. While it was rarely the dominant tree in the forest, this giant of the eastern woodlands was hard to miss. It could stand over one hundred feet tall, its trunk straight and true.

To those who lived in the eastern United States, especially Appalachia, the tree was invaluable. It provided food for both people and animals and wood for cabins and fence posts. In cash-poor regions, the tree could even put some money in people’s pockets when they sold nuts to brokers to take to the cities of the Northeast. Some joked that the tree could take you from the cradle to the grave, as the wood was used to make both furniture for babies and caskets. It was certainly “the most useful tree.” In the early 20th century, however, the chestnut came under serious threat. A fungus, first identified in New York in 1904, began killing chestnut trees. It quickly spread south across Appalachia, resulting in the almost complete loss of the species by the mid-20th century. This loss had enormous ecological impacts on the forests of the eastern United States and contributed to a decline in the population of many species of wildlife, including turkeys, bears, and squirrels.

Today, while millions of American chestnut sprouts remain in the forests of the East, almost all the large trees, as well as most of the people who remember the trees’ dominant place in the forest ecosystem, are gone.

Since 1983, the American Chestnut Foundation (TACF) has taken the lead in restoring the American chestnut. While scientists coordinate the project, volunteers play an important part in planting and caring for trees in TACF test orchards. One of the tools TACF uses to connect with volunteers is chestnut stories. In TACF’s publication, originally published as the Journal of the American Chestnut Foundation and known since 2015 as Chestnut: The New Journal of the American Chestnut Foundation, chestnut stories can take many forms — oral histories, essays, poems — but they all document the relationship between humankind and the American chestnut tree. Chestnut stories serve an important purpose: reminding people of the value of the species and the many ways people used the tree before its decline. In documenting the story of the American chestnut through the journal, in sharing and interpreting this story, and in using it to mobilize volunteers and resources, TACF has demonstrated the value that practices rooted in the field of public history and the study of memory can bring to the realm of environmental science. Public historians are well aware of the power that narrative has in prompting action and encouraging people to rethink the status quo. The chestnut stories documented by TACF help create a historical narrative and also serve as a justification for the reintroduction of the species into the modern landscape. As we deal with the long-term consequences of climate change, the emergence of new diseases, and the loss of habitat, the work of TACF can, perhaps, provide a road map for other organizations to employ science, technology, public history practices, and memories to mobilize people to solve environmental challenges.

 

While it is difficult to pinpoint the exact moment when the fungus that devastated the American chestnut arrived in North America, it is possible to date when and where someone first noticed its effects. In 1904, in the Bronx Zoological Park in New York City, employee H.W. Merkel noticed that large sections of bark on the park’s chestnut trees were dying and that the overall health of the trees appeared to be deteriorating. Dr. W.A. Murrill, who worked for the New York Botanical Garden, was called in to study the affected trees. He identified the cause: a new fungus, which he called Diaporthe parasitica (the name was changed in 1978 to Cryphonectria parasitica). The trees in the Bronx Zoo were highly unlikely to have been the first infected by the fungus, which had come into the port of New York from Asia; the zoo was simply the first place where someone paid enough attention to notice.

The “chestnut blight,” as it was commonly called, spread quickly, infecting trees in other locations in New York, as well as in New Jersey and Connecticut. Scientists studying the blight, such as Dr. Haven Metcalf and Dr. J.F. Collins, published bulletins about the disease, which contained recommendations for how to slow the spread. These recommendations included checking nurseries for blighted trees and quarantining those suffering from the blight, creating blight-free zones where chestnuts were removed in the hopes that the blight’s progress would be stopped if there were no chestnut trees, and performing surgery to remove cankers from infected trees. Unfortunately, the advice they gave did not stop the blight, and it began pushing farther south. In Pennsylvania, the Chestnut Blight Commission had permission to enter private property and remove trees infected with or threatened by the blight. In all, the commission spent over $500,000 to fight the blight. But, again, its efforts did not halt the spread. The blight reached West Virginia by 1911, pushing into the heart of Appalachia, where the tree had an important place in the lives of mountain communities. Combined with ink disease, which had been attacking chestnut trees in the southern end of the tree’s range since the 19th century, the blight caused widespread devastation. By 1950, 80% of the American chestnut trees were gone. In all, the upper portions of over 3.5 billion trees died, the equivalent of approximately 9 million acres of forest. The root structure of many trees, however, did not die, and stump sprouts continue to emerge from root systems today, well over a century later. Unfortunately, before they are able to grow very large, these stump sprouts become infected by the blight and die back. So today, while millions of stump sprouts do exist, few mature trees are left.

Chestnut blight, 2009. Photograph by Daderot. [Wikimedia Commons]

TACF eventually took the lead in chestnut restoration efforts. TACF began formally documenting its progress in 1985 with the publication of the Journal of the American Chestnut Foundation, published as Chestnut: The New Journal of the American Chestnut Foundation since 2015. In the first edition, then-editor Donald Willeke lays out the mission of the journal: “We hope that it will be both a scientific journal and a means of communicating news and developments about the American Chestnut to dedicated non-scientists (such as the lawyer who is writing this Introduction) who care about trees in general, and the American Chestnut in particular, and wish to see it resume its place as the crowning glory of the American deciduous woodlands.” Over the years, the journal has moved from a volunteer publication released once or twice a year (depending on the year and on capacity) to a glossy, professional magazine released three times a year.

In the journal, the progress of the backcross breeding program is broken down into terms nonscientists can understand. The journal, however, is not only about the science behind the restoration effort. One of the most significant sections of the journal in its early years was the “Memories” section, which documented “chestnut stories.” While many of the memories included in the journal came to TACF unsolicited, TACF also recognized the importance of documenting people’s chestnut stories in a more organized fashion. In 2003, then-membership director Elizabeth Daniels wrote about the American Chestnut Oral History project, which aimed to preserve chestnut stories for future generations. In the spring of 2006, then-editor Jeanne Coleman let readers know she was interested in gathering chestnut stories. Stories came pouring in, and as Coleman says in the fall 2006 issue, “These stories are heartwarming [and] often funny.” Today, in essence, the journal itself acts as the archive of TACF’s chestnut stories, preserving and sharing them simultaneously.

Untitled (Squirrels in a Chestnut Tree), by Susan Catherine Moore Waters, c. 1875. [The Metropolitan Museum of Art]

A review of all 79 issues of TACF’s journal available online as of January 2022, along with other sources, makes the significance of the chestnut stories quite clear. The work of the scientists engaged in backcross breeding and genetic modification is essential to the restoration of the chestnut. But the success of TACF also has come from thousands of members and volunteers who have supported the work of the scientists. From the beginning, TACF understood the importance of engaging with people outside traditional scientific research circles to accomplish restoration.

People who mourned the past also supported work to bring about a future where the chestnut once again plays an important role in the ecosystems of the eastern woodlands. TACF members have been, per scientist Philip Rutter from the University of Minnesota, “trying to do something about the problem rather than just lament the loss,” which certainly challenges the argument that nostalgia can reduce the ability to act in the present. 

While maybe not quite as tall or as wide as remembered in chestnut stories, the American chestnut tree occupied a significant place in the forest and in the lives of those who lived under its spreading branches—and it is certainly worthy of the work to restore it. Chestnut stories document this significant place chestnuts held in the forest ecosystem, and the sharing of the stories reminds people of the value the tree brought to Americans before the blight destroyed it. In an interview with Charles A. Miller, William Good remembers how farmers fattened their hogs off chestnuts: “In the fall, because people didn’t raise corn to feed their hogs, farmers would let them run in the mountain where they fattened up on chestnuts. The hogs would have to eat through the big burs on the outside to get the nut out. . . . The hogs must have liked the nuts so much they would chew through them until their mouths were bleeding.”

In an interview that appears in the 1980 folklore collection Foxfire 6 and is reprinted in TACF’s journal, Noel Moore recollects that people in Appalachia did not build fences to keep their stock in; they instead fenced around their homes and fields to keep out the free-range stock wandering the woods. Each fall, farmers would round up and butcher their hogs that had grown fat on acorns and chestnuts. Chestnuts also served as an important food source for wild animals, including turkeys, black bears, white-tailed deer, gray and fox squirrels, and the now extinct chestnut moth. These animals, in turn, fed those who hunted them. Chestnuts also were an important food source for people. Myra McAllister, who grew up in Virginia, recalls that she liked chestnuts “raw, boiled, and roasted by an open fire.” Cecil Clink, who grew up in North Bridgewater, Pennsylvania, remembers filling old flour sacks with the nuts, which his mother would store in the family’s “butry,” or buttery, “with the smoked hams. . . . [They ate] the nuts boiled, or put them on the cook stove and roast them.” Other people made stuffing and traditional Cherokee bread out of the nuts, though they could not grind the nuts into flour because they were too soft; the nuts had to be pounded into flour by hand instead. And it was not just the nuts themselves that people loved. Noel Moore remembers the taste of the honey that bees made from the chestnut blossoms. The leaves also had medicinal uses: the Mohegans made tea to treat rheumatism and colds, and the Cherokee made cough syrup.

Some stories recall the act of selling chestnuts, which gave many families cash that they might otherwise not have had—likely making it a memorable moment. In Where There Are Mountains, Donald Davis shares the chestnut story of John McCaulley, who as a young man had gathered chestnuts for sale. The nuts he gathered sold for four dollars a bushel in Knoxville, Tennessee—and McCaulley could gather up to seven bushels a day. Jake Waldroop remembers seeing wagons loaded with chestnuts in his northeast Georgia mountain community. The wagons headed to “Tocca, Lavonia, Athens, all the way to Atlanta, Georgia.” Noel Moore recalls seeing families coming from the mountains and heading to the store in the fall, laden with bags of chestnuts. They traded the bags for “coffee and sugar and flour and things that they had to buy to live on through the winter.” Exchanging chestnuts for supplies or cash was “much less risky than moonshining.”

Chestnutting, by Winslow Homer, 1870. [Wikimedia Commons]

To the north in Vittoria, Ontario, Donald McCall, whose father owned a store in the village, recollects that “farmers counted on the money from their chestnuts to pay taxes on the farm.” The trees themselves also had value. William B. Wood recalls that his father tried to save the family farm during the Great Depression by felling and selling the wood of a huge chestnut tree dead from the blight that Wood calls the “Chestnut Ghost.” Unfortunately, the plan did not work, and the family lost their farm.

Other stories connect to the “usefulness” of the tree. Because chestnut wood is “even grained and durable . . . light in weight and easily worked,” the tree was used for a wide variety of purposes. Georgia Miller, who was 101 when she wrote “Chestnuts before the Blight,” recalls the split rail fences that lined the edges of pastures. The chestnut wood split easily and lasted longer than that of other species, making it a good material for what some called “snake fences.” Daniel Hallett, born in New Jersey in 1911, says his family used chestnut to trim doors and windows and also for chair rails in the houses they built. Dr. Edwin Flinn’s story (told by Dr. William Lord in 2014), which focuses on the extraction of tannins from dead chestnut trees, shows that the tree remained valuable even after the blight struck.

Chestnut stories recount more than memories of the tree’s usefulness or the role it played in Indigenous and rural economies. Many stories have documented how an encounter with the tree mobilized someone toward engagement with the restoration effort, demonstrating that chestnut stories can provide a pathway to a wider recognition of the natural world. Patrick Chamberlin describes such an experience in “A Practical Way for the Layman to Participate in Breeding Resistance into the American Chestnut.” Chamberlin tells readers how his grandmother used to reminisce about the tree when he was a young boy and then how he came across a burr from an American chestnut while in high school. He started looking for trees as he explored the woods on his parents’ farm. Eventually, while wandering near the old homestead site where his grandmother grew up, he came across two flowering chestnut trees. Returning later in the season, he found nuts from the trees. Through this experience, Chamberlin became involved in the backcross breeding program—and, at the end of his essay, he encourages others to do the same. A chestnut story from the distant reaches of his youth started him on his journey, and science helped him continue his work into the present. Fred Hebard, who directed TACF’s Meadowview Research Farms for twenty-six years, saw his first American chestnut sprout while helping round up an escaped cow with a farmer he worked for. When the farmer told him the story of the American chestnut, Hebard changed his college major, earned a PhD in plant pathology, and began researching the chestnut. It became his life’s work.

Reprinted from Branching Out: The Public History of Trees. Copyright © 2025 by University of Massachusetts Press. Published by the University of Massachusetts Press.
Scared Out of the Community

Our featured weekly excerpts usually spotlight new history titles, but sometimes the news of the day makes returning to past scholarship, responding to different circumstances and looking at the past from different contexts, a useful endeavor. This is the first entry in a series we hope to revisit from time to time, excerpting books from previous decades in order to bring the history they document to new audiences. Below is an excerpt adapted from Unwanted Mexican Americans in the Great Depression: Repatriation Pressures, 1929–1939, by Abraham Hoffman, published in 1974 by the University of Arizona Press. You can read the entire book online.

The old man entered the circular park, looked around, and sat down on one of the many benches placed there for the use of the town’s citizens. Several hundred people, mostly men, were also in the park, enjoying the afternoon sun. Sitting in the park enabled the old man to forget the reason that had brought him there. The deepening economic depression had cost him his job, and work was hard to find.

A sudden commotion startled the old man out of his reverie. Without warning, uniformed policemen surrounded the park, blocking all exits. A voice filled with authority ordered everyone to remain where he was. While the policemen guarded the exits, government agents methodically quizzed each of the frightened people, demanding identification papers, documents, or passports. With shaking hands the old man produced a dog-eared, yellowed visa. Only the other day, he had considered throwing it away. After all, he had entered the country so long ago…

The agent inspected the papers and barked several questions at the old man. Haltingly, he answered as best he could, for despite his years of residence in the country he had learned the language only imperfectly. With a nod of approval, the officer returned the papers. The old man sat down again; a sense of relief washed over him.

The agents continued their interrogation, and after about an hour everyone in the park had been checked and cleared. Or almost everyone. Seventeen men were placed in cars and taken away. The inspection over, the policemen left the park to the people. Few cared to remain, however, and in a few moments the place was deserted.

 

The time was 1931; the place, Los Angeles, California, in the city’s downtown plaza. The government agents were officers in the Department of Labor’s Bureau of Immigration, assisted by local policemen. Their goal was the apprehension of aliens who had entered the United States illegally.

Unlike many post-World War II aliens who knowingly entered in violation of immigration laws, immigrants prior to the Great Depression entered the United States at a time when the government’s views on immigration were in flux, moving from unrestricted entry to severe restriction. Many aliens found themselves confused by the tightening noose of regulations; one immigrant might enter with one law in effect, but his younger brother, coming to the country a few years later, might find new rules — or new interpretations of old rules — impeding his entrance.

With the onset of the depression, pressure mounted to remove aliens from the relief rolls and, almost paradoxically, from the jobs they were said to hold at the expense of American citizens. In the Southwest, immigration service officers searched for Mexican immigrants, while local welfare agencies sought to lighten their relief load by urging Mexican indigents to volunteer for repatriation. The most ambitious of these repatriation programs was organized in Los Angeles County, an area with the largest concentration of Mexicans outside of Mexico City.

Not all of the repatriates, however, departed solely under pressure from the Anglos. Many Mexicans who had achieved varying degrees of financial success decided on their own to return to Mexico, taking with them the automobiles, clothing, radios, and other material possessions they had accumulated. The Mexican government, vacillating between the desire to lure these people home and the fear that their arrival would add to an already existing labor surplus, sporadically launched land reform programs designed for repatriados. Between 1929 and 1939 approximately half a million Mexicans left the United States. Many of the departing families included American-born children to whom Mexico, not the United States, was the foreign land.

The peak month in which Mexicans recrossed the border was November 1931, and in all subsequent months the figures generally declined. Yet it was after this date that the number of cities shipping out Mexican families increased. Even after the massive federal relief programs of the New Deal were begun in 1933, cities such as Los Angeles, Chicago, and Detroit still attempted to persuade indigent Mexicans to leave.

With the start of World War II, Mexican immigration was renewed, when the United States and Mexico concluded an agreement to permit braceros to enter the United States. A system of permits and visas for varying periods testifies to the evolution of border regulations; their abuse and misuse bear witness to the difficulties of making such a system coherent. 

No other locality matched the county of Los Angeles in its ambitious efforts to rid itself of the Mexican immigrant during the depression years. By defining people along cultural instead of national lines, county officials deprived American children of Mexican descent of rights guaranteed them by the Constitution. On the federal level, no other region in the country received as much attention from immigration officials as Southern California. Because of the tremendous growth of this region after World War II, Southern California’s service as a locus for deportation and repatriation of Mexican immigrants is little remembered. To the Mexican-American community, however, repatriation is a painful memory. 

 

In 1931, various elements in Los Angeles had indicated support for the idea of restricting jobs on public works projects to American citizens. Motions were presented and passed by the Los Angeles city council and the county board of supervisors, while the Independent Order of Veterans of Los Angeles called for the deportation of illegal aliens as a means of aiding jobless relief.

The board of supervisors went so far as to endorse legislation pending in Congress and in the state legislature, which would bar aliens who had entered the country illegally from “establishing a residence, holding a position, or engaging in any form of business.” Supervisor John R. Quinn believed that such legislation would provide a cure-all for the problems generated by illegal aliens, whom he estimated to number “between 200,000 and 400,000 in California alone.” Said Quinn in two remarkably all-inclusive sentences:

If we were rid of the aliens who have entered this country illegally since 1931 ... our present unemployment problem would shrink to the proportions of a relatively unimportant flat spot in business. In ridding ourselves of the criminally undesirable alien we will put an end to a large part of our crime and law enforcement problem, probably saving many good American lives and certainly millions of dollars for law enforcement against people who have no business in this country.

Quinn also believed the “Red problem” would disappear with the deportation of these aliens. 

It was in this atmosphere that Charles P. Visel, head of the Los Angeles Citizens Committee on Coordination of Unemployment Relief, published a press release in city newspapers. The statement announced a deportation campaign and stressed that help from adjoining districts would be given the local office of the Bureau of Immigration. Each newspaper printed the text as it saw fit, so that while one newspaper printed sections of it verbatim, another summarized and paraphrased. Certain embellishments were added. “Aliens who are deportable will save themselves trouble and expense,” suggested the Los Angeles Illustrated Daily News on January 26, 1931, “by arranging their departure at once.” On that same day, the Examiner, a Hearst paper, announced, without going into any qualifying details, that “Deportable aliens include Mexicans, Japanese, Chinese, and others.”

As the days passed, follow-up stories and editorials kept the public aware of the project. Two days later, the Express editorially endorsed restrictionist legislation and called for compulsory alien registration. On January 29, the Times quoted Visel, who urged “all nondeportable aliens who are without credentials or who have not registered to register at once, as those having papers will save themselves a great deal of annoyance and trouble in the very near future. This is a constructive suggestion.” Visel himself gave the newspapers word of the impending arrival of the special agents from Washington, DC, and other immigration districts.

La Opinión, the leading Spanish-language newspaper in Los Angeles, published an extensive article on January 29. With a major headline spread across page one, the newspaper quoted from Visel’s release and from the versions of it given by the Times and the Illustrated Daily News. La Opinión’s article pointedly stressed that the deportation campaign was being aimed primarily at those of Mexican nationality.

 

Commencing February 3, Supervisor William F. Watkins of the Bureau of Immigration and his men, with the assistance of police and deputies, began ferreting out aliens in Los Angeles. By Saturday 35 deportable aliens had been apprehended. Of this number, eight were immediately returned to Mexico by the “voluntary departure” method, while an additional number chose to take the same procedure in preference to undergoing a formal hearing. Several aliens were held for formal deportation on charges of violating the criminal, immoral, or undesirable class provisions of the immigration laws. Five additional immigration inspectors arrived to provide assistance, and five more were shortly expected. 

On Friday the 13th, with the assistance of 13 sheriff’s deputies led by Captain William J. Bright of the sheriff’s homicide detail, the immigration agents staged a raid in the El Monte area. This action was given prominence in the Sunday editions of the Times and the Examiner. Watkins wrote to Robe Carl White, assistant labor secretary, that such coverage was “unfortunate from our standpoint,” because the impression was given by the articles that every district in Los Angeles County known to have aliens living there would be investigated. “Our attitude in regard to publicity was made known to the authorities working with us in this matter,” Watkins complained, “but somehow the information found its way into the papers.”

Considering the announcements from Walter E. Carr, the Los Angeles district director of immigration, that no ethnic group was being singled out and that only aliens with criminal records were the primary interest of the Bureau of Immigration, the aliens captured in the Friday the 13th raid could only have made the Mexican community wary of official statements. Three hundred people were stopped and questioned: from this number, the immigration agents jailed 13, and 12 of them were Mexicans. The Examiner conveniently supplied the public with the names, ages, occupations, birth places, years in the United States, and years or months in Los Angeles County, while the Times was content just to supply the names.

While generalizations are impossible about the people stopped, questioned, and occasionally detained, the assertions that all the aliens were either holding jobs that could only be held by citizens or living as criminals in the county did not apply to these arrested suspects. Of the twelve Mexicans arrested, the most recent arrival in the United States had come eight months earlier, while three had been in the United States at least seven years, one for thirteen years, and another was classified as an “American-born Mexican,” a term which carried no clear meaning, inasmuch as the charge against the suspects was illegal entry. Eleven of the twelve gave their occupation as laborer; the twelfth said he was a waiter.

 

As Watkins pursued the search for deportable aliens, he observed that the job became progressively more difficult:

After the first few roundups of aliens ... there was noticeable falling off in the number of contrabands apprehended. The newspaper publicity which attended our efforts and the word which passed between the alien groups undoubtedly caused great numbers of them to seek concealment.

After several forays into East Los Angeles, the agents found the streets deserted, with local merchants complaining that the investigations were bad for business. In the rural sections of the county surveyed by Watkins’ men, whole families disappeared from sight. Watkins also began to appreciate the extent of Southern California’s residential sprawl. He observed that the Belvedere section, according to the 1930 census, might hold as many as 60,000 Mexicans.

The Mexican and other ethnic communities were not about to take the investigations passively. La Opinión railed at officials for the raids, while ethnic brotherhood associations gave advice and assistance. A meeting of over one hundred Mexican and Mexican American businessmen on the evening of February 16 resulted in the organization of the Mexican Chamber of Commerce in Los Angeles and a pledge to carry their complaints about the treatment of Mexican nationals to both Mexico City and Washington, DC. Mexican merchants in Los Angeles, who catered to the trade of their ethnic group, felt that their business had been adversely affected, since Mexicans living in outlying areas now hesitated to come into Los Angeles and risk harassment. Sheriff William Traeger’s deputies in particular were criticized for rounding up Mexicans in large groups and taking them to jail without checking whether anyone in the group had a passport or proof of entry.

The Mexican consul in Los Angeles, Rafael de la Colina, had been working tirelessly on behalf of destitute Mexicans in need of aid or desiring repatriation. Much of his time was occupied with meeting immigration officials who kept assuring him that the Mexicans were not being singled out for deportation. He also warned against unscrupulous individuals who were taking advantage of Mexican nationals by soliciting funds for charity and issuing bogus affidavits to Mexicans who had lost their papers.

The Japanese community also expressed its hostility to the immigration officials. When several agents stopped to investigate some suspected Japanese aliens, the owner of the ranch employing the aliens threatened to shoot the inspector “if he had a gun.” Japanese people obstinately refused to answer any questions, and Watkins believed that an attorney had been retained by the Japanese for the purpose of circumventing the immigration laws. 

Despite the adverse reaction to and public knowledge of the drive on aliens, Watkins persisted. “I am fully convinced that there is an extensive field here for deportation work and as we can gradually absorb same it is expected [sic] to ask for additional help,” he stated. Responding to the charges of dragnet methods, he notified his superiors in Washington:

I have tried to be extremely careful to avoid the holding of aliens by or for this Service who are not deportable and to this end it is our endeavor to immediately release at the local point of investigation any alien who is not found to be deportable as soon as his examination is completed.

 

On February 21, 1931, Watkins wrote to White, and the following month to Visel, that 230 aliens had been deported in formal proceedings, of whom 110 were Mexican nationals, and that 159 additional Mexican aliens had chosen the voluntary departure option to return to Mexico.

These figures revealed that seven out of ten persons deported in the Southern California antialien drive were Mexicans. By the supervisor’s own admission, in order to capture the 389 aliens successfully prosecuted during this period, Watkins and his men had to round up and question somewhere between 3,000 and 4,000 people — truly a monumental task.

The effect of the drive on the Mexican community was traumatic. Many of the aliens apprehended had never regularized an illegal entry that might have been made years before. Beyond that technicality, to call them criminals is to misapply the term. The pressure on the Mexican community from the deportation campaign contributed significantly to the huge repatriation from Los Angeles that followed the antialien drive. But this seemed of little concern to the head of the Citizens Committee on Coordination of Unemployment Relief. By the third week in March, an exuberant Visel could write to Labor Secretary William N. Doak:

Six weeks have elapsed since we have received ... Mr. Watkins, in reply to our request for deportable alien relief in this district. We wish to compliment your department for his efficiency, aggressiveness, resourcefulness, and the altogether sane way in which he is endeavoring and is getting concrete results.

The exodus of aliens deportable and otherwise who have been scared out of the community has undoubtedly left many jobs which have been taken up by other persons (not deportable) and citizens of the United States and our municipality. The exodus still continues.

We are very much impressed by the methods used and the constructive results steadily being accomplished.

Our compliments to you, Sir, and to this branch of your service.

However much Visel’s interpretation of the benefits derived from the campaign squared with reality, the Department of Labor was no longer as eager to endorse the Los Angeles coordinator, or even to imply the existence of an endorsement. Perhaps the department feared any such reply might be converted into another publicity release. At any rate, with the Nation and the New Republic lambasting the department, Doak shied away from making a personal reply. Visel’s letter was answered by Assistant Secretary W.W. Husband, who acknowledged Visel’s message and then circumspectly stated:

It is the purpose of this Department that the deportation provisions of our immigration laws shall be carried out to the fullest possible extent but the Department is equally desirous that such activities shall be carried out strictly in accordance with law.

Excerpt adapted from Unwanted Mexican Americans in the Great Depression: Repatriation Pressures, 1929–1939 by Abraham Hoffman. Copyright © 1974 by The Arizona Board of Regents. Used with permission of the publisher, the University of Arizona Press. All rights reserved.
Creating the “Senior Citizen” Political Identity

A mass social movement of the elderly was arguably the major force behind Social Security old-age pensions: the Townsend movement, named after its founder, Francis Townsend, M.D.

The Townsend movement was by far the largest social movement of the 1930s — larger than the protests of the unemployed and labor union organizing taken together. Never before had the elderly organized to advance their interests as old people. Yet despite its size and originality, it is the least studied social movement in American history. Scholars and policy wonks of its time focused almost exclusively on debunking the Townsend plan, pointing out its economic fallacies and labeling its supporters naïve and crackpot. Very few contemporary scholars have studied it. I find myself wondering: is this a sign of disdain for old people, precisely what the movement was challenging?

The Townsend movement’s influence continued even after the 1935 passage of the Social Security Act. It created a “senior citizen” political identity, now a powerful voting bloc. Townsenders insisted that the elderly should become a “guiding force in all things, political, social and moral.” This was a claim to wisdom and relevance at a time when the culture was increasingly disrespectful of old people. That claim allowed a single-issue social movement to consider itself a patriotic cause, one that could improve the well-being of the entire nation. Part of the Townsend movement’s appeal was that it could be both narrow and wide: its narrowness made it recognizable and straightforward, its message stickier; its breadth made it selfless. Its genius lay in joining its members’ material interests with altruism. Townsenders saw themselves as simultaneously beneficiaries of the nation and contributors to it, givers as well as takers.

Car in Columbus, Kansas, with sign supporting the plan of Dr. Francis Townsend to create a nationwide pension plan for the elderly, 1936. Photograph by Arthur Rothstein. [Wikimedia Commons]

Dr. Townsend was aware that the United States was alone among developed countries in providing no government old-age pensions. He published eight letters on the subject in the Long Beach Press Telegram between September 30, 1933, and February 20, 1934, and in those five months found he had jump-started a social movement. He soon sketched out an actual pension plan. Recruiting two partners — his brother Walter L. Townsend and real estate broker Robert Earle Clements — he created Old Age Revolving Pensions, Limited (OARP). The organization thus acquired two sorts of leaders: the doctor who led the social movement and the shrewd commission-earning businessmen who built the organization. This division of labor made the movement a juggernaut, but it also produced conflicts and allegations of corruption.

From early on it was clear that a social movement was arising. OARP’s newsletter, The Modern Crusader, soon sold 100,000 copies — granted, it cost only two cents. The proposal generated excitement throughout California, and OARP chapters appeared so fast that the headquarters could not keep track.

A committee, allegedly including statisticians, began drafting legislation, and within a year John McGroarty, the 73-year-old poet laureate of California, got himself elected to Congress, where he introduced the first Townsend bill, HR 3977. (The plan would be revised several times.) Its major provisions were: Every American citizen aged 60 or older would receive a monthly pension of $200, provided that they retired and refrained from wage-earning. Younger people who were “hopelessly invalided or crippled” would receive the same pension. Originally the plan proposed funding the pensions through a 2 percent sales tax — a regressive tax that would have disproportionately burdened low-income people. Later the plan substituted a tax on transactions, which continued to be regressive, would have raised commodity prices exponentially (since the tax would be levied at every stage of production and sale), and would have provided an incentive for vertical integration, as firms could escape the tax by merging stages of production under one roof.

When McGroarty first introduced the bill, he presented it not as a pension proposal but as a plan for economic recovery, a claim often repeated by its supporters, one of whom called it a “big business” plan. It would work because the legislation would require each stipend to be spent within a month. The plan called itself “revolving” pensions on the theory that, after the first month, what was paid out would be recompensed by taxes, as if the same money would be cycling through the economy. The pensions would thus stimulate an economy in deep depression by boosting consumer spending.

Even better, Representative McGroarty argued that freeing up jobs held by older people would open jobs for younger people — 4 million jobs would allegedly become available. Retirement would then allow elders to become a “service class” of volunteers doing charitable work; this would enable government to operate at a “high standard,” as a supportive newspaper put it. To the criticism that government stipends would encourage passivity and dependence, Dr. Townsend responded that volunteerism should be a fundamental aspect of active citizenship. As he put it, the “secondary purpose of Townsend clubs is a desperate fight to continue the democratic spirit and form of government in these United States.” He argued that his plan would end child labor and reduce or even do away with crime, which resulted, in Townsend ideology, from poverty and unemployment. It would end war, which was also the result of poverty and inequality. Dr. Townsend’s arguments became ever more utopian and less realistic — an unusual trajectory, as over time most social movements make compromises, and their goals become more modest.

 

Experts in groups such as the American Association for Old Age Security, the American Association for Labor Legislation, and the National Conference of Social Work had been discussing possible welfare programs since the 1920s. Some of them would participate in writing the Social Security Act of 1935, including lions of social reform Edwin Witte, John R. Commons, and Arthur Altmeyer, and social democratic feminists Grace Abbott, Sophonisba Breckinridge, Florence Kelley, Julia Lathrop, and Mary van Kleeck (many of them part of the Hull-House network). They had promoted the 1933 Dill-Connery bill, which would have provided federal grants-in-aid for the elderly, to be matched by the states, in amounts up to $15 a month per person. It passed the House in 1933 and 1934 but failed both times in the Senate. President Roosevelt did not support it, and Massachusetts sued the Treasury Department, arguing that such a program was an attack on the constitutionally reserved powers of the states. Though that argument was rejected by the conservative Supreme Court, the bill’s failure suggests the strength of resistance to such welfare expenditure — and the difficulty of overcoming opposition to the Social Security Act a few years later.

Dill-Connery’s stipends would have been too minuscule to help low-income people, and they would have been controlled by state governments, which almost guaranteed that nonwhites would be excluded. The Social Security Act to come would have equally great limitations. It excluded the majority of Americans — farmworkers, domestic workers, and most other employed women — who worked mainly for small employers who were not required to participate. Unemployed women were expected to share a husband’s stipend. Divorced women would have no entitlement to an ex-husband’s pension, and other unmarried women would get nothing.

The Townsend plan was better. True, it relied on discriminatory funding, like Social Security: Townsend proposed funding by sales taxes, while Social Security was funded by a percentage of earnings, so the poor who needed help most would get least. But the universality and the size of Townsend plan pensions would have mitigated inequality by providing the same level of support to all Americans. Moreover, it would have included people of both sexes and all races, a nondiscriminatory policy that might have set a precedent for future programs. The Townsend plan might also have had ideological influence, contributing to a positive view of government responsibility for the public welfare. Put simply, the Townsend plan advanced a democratic understanding of citizens’ entitlements. By contrast, most New Deal programs were discriminatory, offering more to those who needed less and excluding the great majority of people of color and white women.

Although the Townsend plan would have been redistributive across class, sex, and race lines and could thus be categorized as a left or progressive plan, it was also redistributive along age lines, and this was problematic. It called for transferring massive resources from young and middle-aged adults to older ones. One opponent calculated that it would give half the national income to one-eleventh of the population. A historian recently estimated that a quarter of U.S. GDP would move from those under 60 to the elderly. Either figure confirms the plan’s unfairness toward younger people and their needs. Townsenders countered with a moral argument: “We supported our children in youth, is it not right and just that in old age we shall be taken care of by youth?” This sentiment, consistent with traditional family values, brought in socially conservative supporters.

Dr. Townsend’s Socialist Party history, dating from a period of the party’s strength, must surely have influenced his concern to help the needy. Nevertheless he took care to dissociate his plan from socialism. He frequently insisted that the plan did not undercut the profit system “or any part of the present administration of business.” This political ambiguity made the plan seem inconsistent, even incoherent, to its opponents. Yet it was a “masterly synthesis of conservatism and radicalism,” in the words of one scholar. Townsend supporters were not naïvely “falling for” this political fusion of left and right, as their opponents charged. Though Townsenders backed an impossible means of financing the pensions, as opponents pointed out repeatedly in every conceivable medium, they might be classified as intuitive social democrats: believing, and hoping, that a rich capitalist country could become a welfare state. For some, that belief was more emotional than political, and few of them had a broad conception of a welfare state. But the critics’ disdainful appraisal of Townsenders as fools was itself foolish. They were as educated and informed as any middle-class Americans.

While anyone could join and everyone would be entitled to a pension in the Townsend plan, the movement’s racial and religious composition was extremely homogeneous and almost identical with that of the Klan — white and Protestant, particularly evangelical. One writer commented that “one sees no foreign-looking faces.” There were a few exceptions. In one upstate New York industrial county, most votes for the Townsend/Democratic Party candidate came from immigrant voters, and one organizer pleaded for literature in Yiddish, Polish, and Italian. But the national Townsend organization never crafted appeals to immigrants, “ethnics,” or Black Americans. There were a few African American clubs and a few integrated clubs, mostly in California — Los Angeles, Long Beach, Oakland, Stockton — but the overwhelming majority of clubs were 100% white.

Townsend national leaders probably had little concern for elderly Black people or people of color in general. The demographics of southern California may have played a role here: in 1930 African Americans constituted only 1.4% of the state population. On the other hand, the state’s population included tens of thousands of people of color who rarely appear in material by or about Townsend: 415,000 people of Hispanic origin, about 7%, and 169,000 of Asian origin, just under 3%. No doubt the whole movement had not only a white but a Protestant evangelical appearance and discourse, and most nonwhite people in the western states had learned caution about entering unknown “white” spaces. Certainly the Townsend movement and its clubs did not attempt to recruit them.

As with the Klan, many Townsend movement members were businessmen and white-collar workers, with some professionals. Its demographics contrasted with the Klan’s in several ways, however: it had more big-city dwellers, more women, of course more gray hair, and fewer young and middle-aged people. But while most active Townsenders were middle class, broadly conceived, that label meant something very different in the midst of an economic depression: the majority had probably experienced a sudden economic collapse rather than chronic deprivation. Some West Coast members, especially the poorest ones, were refugees from the “dust bowl.” But regions of chronic poverty, such as the southern states, did not produce many Townsend clubs. The universality of the proposed pensions, which threatened to include African Americans, no doubt repelled many white southerners. As a Jackson, Mississippi, newspaper editorialized, “The average Mississippian can’t imagine himself chipping in to pay for able-bodied Negroes to sit around in idleness.”

 

The Townsend movement was a business. Millions flocked to it because of the pension plan and the doctor’s hokey charisma, but also because it offered them a chance to make a bit of money. Clements introduced the same recruitment-by-commission arrangement that had so ballooned the KKK. Previously an organizer for the Anti-Saloon League (like quite a few Klan organizers), he “hired” some three hundred organizers, aka recruiters, many of them also former Anti-Saloon League employees, some of them ministers. There were no wages, only commissions: they earned 20%, or 2.5 cents, from every 12½ cents that new members paid. One early organizer claimed that the doctor promised him “handfuls” of money from the work.

At first the doctor worried about the opportunities for embezzlement created by the commission system, but his staff clung to the system because it was so cheap. Understandably, Townsenders appreciated the opportunity to earn in the midst of the still worsening Depression. Members felt even better because they were earning by bringing people into a just cause. Dr. Townsend defended the system by arguing that it freed the organization from having to solicit large donations from the rich. Recruitment by commission was more democratic, he said — it meant that the needy supported the movement themselves and would not be beholden to big money. Yet the doctor also defended this approach with an argument that justified and flattered his personal leadership: “Townsend … is a Program of Proxy … Thousands of the world’s best people do not possess the high qualifications for personal leadership … yet they can partake in the program by letting their money become proxy for them. … Your dollars can become you.”

Excerpted from Seven Social Movements That Changed America. Copyright © 2025 by Linda Gordon. Used with permission of the publisher, Liveright Publishing Corporation, a division of W. W. Norton & Company, Inc. All rights reserved.

There’s Some Spirit Left Yet

On December 8, 1876, Bristol police arrested the bookseller Henry Cook for selling the American birth control booklet Fruits of Philosophy. (Victorian readers knew the latter noun was a byword for “science.”) The edition’s title page bore the name of the National Reformer publisher Charles Watts, who had purchased the plates years before and printed it alongside dozens of National Secular Society pamphlets.

Unbeknownst to the London-based Watts, Henry Cook had previously served two years in prison for selling pornography from under his shop’s counter. Between the Fruits’ pages of plain text, Cook had inserted images from his old trade, illustrating its explanations of human anatomy and recommendations for safely preventing pregnancy.

On December 14, Annie Besant — the National Secular Society’s second-in-command, a 29-year-old single mother who had defiantly left her sexually abusive Anglican vicar husband in an era of strict coverture laws — arrived at the National Reformer’s smart new Fleet Street office to find a nervous Charles Watts. Upon hearing of the Bristol arrest, he had telegraphed to Cook, “Fear not, nothing can come of it.” But until now, Watts had never actually read the pages that he printed.

Handing a copy of Fruits of Philosophy to Besant, he asked her opinion. She read it on the train en route to a lecture on the emancipation of women. The pamphlet advocated parental responsibility and the restriction of family size within the means of its existence. While the 1830s American medical English lacked her flair — George Bernard Shaw had recently proclaimed her among Britain’s best orators — Annie concluded that she would have been proud to author such a work.

Unlike the Victorian Dr. Acton — then instructing England that masturbation led to blindness — the American Dr. Charles Knowlton wrote in his Fruits about sex as a natural enterprise, nothing to be ashamed of. Nor should it be limited to the purpose of procreation. “A temperate gratification,” Knowlton wrote, “promotes the secretions, and the appetite for food; calms the restless passions; induces pleasant sleep; awakens social feeling, and adds a zest to life which makes one conscious that life is worth preserving.”

From the train station, Annie telegraphed Watts, “Book defensible as medical work.” 

Her confidant Charles Bradlaugh, on the other hand, felt it was indictable. The National Secular Society president, a disciple of Richard Carlile and John Stuart Mill who for the past decade had attacked the Church and Crown over their hold on free speech, knew that open discussion of sex remained taboo. Previously he had urged Watts to pull the title from the press. Now that the horses had bolted, Bradlaugh instructed his long-serving coworker to appear in Bristol and admit to the magistrates that he was the pamphlet’s publisher. The hearing did not go well. Charles Watts gave the court 13 copies of the book. Embarrassed when select passages were read aloud, he denounced the pamphlet’s “vile purpose” and withdrew his support for the arrested bookseller. After Boxing Day, Henry Cook would be sentenced to two years of hard labor. Before returning to London, Watts promised a judge that he would cease to print Fruits of Philosophy. The matter seemed closed. The incident barely made a ripple in the great paper ocean of Victorian newspapers.

Annie Besant, c. 1870. [National Portrait Gallery, London]

Meanwhile, news of the publisher Charles Watts’ impunity made its way back to London. On January 8, 1877, police arrested him without warning in Fleet Street. He was arraigned at Guildhall for publishing an obscene book, released on bail and committed for a February trial at the Old Bailey.

Understandably, Watts panicked. Charles Bradlaugh promised to hire a skilled lawyer, with the aim of convincing a grand jury to return a “no bill,” or recommendation to drop the indictment. “The case is looking rather serious,” Bradlaugh admitted to Watts, “but we must face it. I would the prosecution had been against any other book, for this one places me in a very awkward position.”

Annie Besant argued with both men that the case absolutely must go to trial, as the publicity would shine a needed light on a woman’s right to sex education and the power to make decisions about her own body and health. Knowlton’s Fruits might be bruised, but it was all they had. At length both men agreed with her, even if they remained unenthused. “I have the right and the duty,” Bradlaugh said, “to refuse to associate my name with a submission which is utterly repugnant to my nature and inconsistent with my whole career.” However, “The struggle for a free press has been one of the marks of the Freethought party throughout its history, and as long as the Party permits me to hold its flag, I will never voluntarily lower it.”

Galvanized, Annie organized a defense fund, collecting over £8 at a talk that weekend in Plymouth. Concurrently, Charles Watts had a change of heart. He was switching his plea to Guilty, and planned to throw himself at the mercy of the Central Criminal Court.

Bradlaugh called him a coward, and, after 15 years of working and campaigning together, fired him from the National Reformer. The two would engage in an exchange of public recriminations that forced freethinkers to choose a side. Annie learned of this turn of events upon her return to London. She had been prepared, she wrote, to stand by her colleague Watts in battle, “but not in surrender.” She returned the donations to her Plymouth brethren, read Fruits of Philosophy once again, and planned on a course of action that no British woman had ever undertaken before.

 

Sharing the same roof as the notorious Newgate Prison, the stone blockhouse of the Old Bailey squatted stolidly in the center of the City of London. The courthouse was a five-minute walk from the National Reformer office, via Limeburner Lane. In the February 5, 1877, volume of its proceedings, under the heading Sexual Offences, a clerk’s hand recorded:

CHARLES WATTS (41), PLEADED GUILTY to unlawfully printing and publishing thirteen indecent and obscene books – To appear and receive judgment when called upon.

In the end, his admission brought the leniency he hoped for: no jail time, only a steep £25 in costs.

Annie Besant proposed that she and Bradlaugh form their own publishing company, taking the National Reformer pamphlet plates away from the pigeon-hearted printer Charles Watts. That they had no experience in business did not deter Annie, who held that “all things are possible to those who are resolute.” The pair cobbled together funds to rent a dilapidated shop on Stonecutter Street, a passage linking Shoe Lane to Farringdon Street. The shop was even closer to the Old Bailey than their old one. If her scheme went as planned, she would have a shorter walk to her trial.

By the end of February, the partners had opened the Freethought Publishing Company. As he sniped at and attempted to scoop Charles Watts’ new rival publication, the Secular Review, she directed her partner to a more important fight: printing an updated edition of the prosecuted Fruits of Philosophy and challenging Britain’s obscenity law.

With his eye on standing a fourth time for Parliament, Bradlaugh did not share Besant’s enthusiasm for martyrdom. He did not even like the book. With the Church and Crown arrayed against them, he doubted they could win. They could be sentenced to prison. Mrs. Besant said she would publish the pamphlet herself. 

Readers of March’s final edition of the National Reformer found the announcement, topping the page of advertisements for tailored trousers and Bordeaux burgundies, of a new edition of Fruits of Philosophy: “The Pamphlet will be republished on Saturday, March 24, with some additional Medical Notes by a London Doctor of Medicine. It will be on sale at 28 Stonecutter Street after 4 pm until close of shop. Mr. Charles Bradlaugh and Mrs. Annie Besant will be in attendance from that hour, and will sell personally the first hundred copies.”

On the day of the actual printing, Bradlaugh was in Scotland to give a talk. His daughter Hypatia described Annie’s fear of a police raid and seizure of the stock before the sale. With her sister Alice’s help, the women “hid parcels of the pamphlet in every conceivable place. We buried some by night in [Annie’s] garden, concealed some under the floor, and others behind the cistern. When my father came home again the process began of finding as quickly as possible these well-hidden treasures — some indeed so well hidden that they were not found till some time afterwards.”

On the Saturday, Besant and Bradlaugh found a crowd waiting outside their printshop. In twenty minutes the first print run of 500 copies sold out. Despite her hand delivery of the National Reformer to magistrates’ postboxes in Guildhall, the police never showed. 

The following day, a Sunday, Besant and Bradlaugh hand-sold 800 copies of Knowlton and mailed parcels of the pamphlet to fulfil orders across England and Scotland. Letters of support flowed in. The feminist journalist Florence Fenwick Miller admired Annie’s noble stand against “this attempt to keep the people in enforced ignorance upon the most important of subjects.” Miller included a donation for the defense fund she promised they would be needing. She wished she had “fifty times as much to give.”

A week passed, the Freethought press kept printing, and the Fruits kept spilling out the door. “The Vice Society has plenty of spies and informers on its books,” Besant wrote. “One wise sentence only will I recommend to that sapient body; it is from the cookery book of Mrs. Glasse, dealing with the cooking of hares — Men and brethren, ‘first, catch your hare.’”

Annie decided to help them. To the police she offered to be at Stonecutter Street daily from 10 to 11 am. At last, on April’s first Thursday, she and Bradlaugh arrived to find “three gentlemen regarding us affectionately.” They looked to her then “as the unsubstantial shadow of a dream.”

The trio followed them into the shop. Detective Sergeant Robert Outram produced a search warrant. Bradlaugh said he could look around all he wanted; the last of the first print run of 5,000 copies had been sold the previous day. Outram nonetheless played his part as planned, placed the pair under arrest, and marched them down to Bridewell for booking.

If Annie Besant had any illusions that she would be treated differently from those arrested for street crimes, they were shattered when she was told to empty her pockets and hand over her purse, and was led by a matron into a cell to be searched.

“The woman was as civil as she could be,” she wrote, but “it is extremely unpleasant to be handled, and on such a charge as that against myself a search was an absurdity.”

To Annie’s surprise, she and Bradlaugh were led to the Guildhall basement. For two and a half hours (“very dull,” she wrote, “and very cold”) she simmered as, in a neighboring cell, “Mr. Bradlaugh paced up and down his limited kingdom.” Together they listened to the names of prisoners being summoned, until theirs were the day’s last names called to “go up higher.” Annie entered the dock, and measured up the magistrate. He appeared to her “a nice, kindly old gentleman, robed in marvellous, but not uncomely garments of black velvet, purple, and dark fur.” As the proceedings began, clerks handed her a succession of little tan envelopes holding telegrams from admirers, pledging their support.

A detective constable testified that on March 24 he had purchased a copy of Fruits of Philosophy from Annie Besant, who took his one shilling and returned sixpence change. “Bradlaugh saw her take the money,” William Simmonds added matter-of-factly. “I believe that a large amount of books,” the policeman concluded, “are now kept upon those premises for the purpose of sale.”

That suspicion was what compelled Detective Sergeant Robert Outram’s visit to the shop on the day when the pamphlet’s print run had already sold out. In the courtroom DS Outram, too, had seemed kind, as she watched him find seats for Bradlaugh’s daughters. Still, “It amused me,” Annie wrote, “to see the broad grin which ran round when the detective was asked whether he had executed the seizure warrant, and he answered sadly that there was ‘nothing to seize.’” Bail was set for the next hearing, “to which adjuration I only replied with a polite little bow.”

Walking into the waning spring sunlight, she was surprised to see a small crowd cheering. One voice called, “Bravo! There’s some of the old English spirit left yet!” The criminals had missed luncheon, and so set off to have a meal. Supporters straggled behind them like the tails of a soaring kite. Dining in the gathering dusk, Annie experienced the intoxicating thrill of reading about herself in the newspaper.

“The evening papers all contained reports of the proceedings,” she wrote with satisfaction, mentioning the Daily Telegraph and Evening Standard, “as did also the papers of the following morning.” They included, she especially noted, the hallowed Times, where her name appeared for the first time. Victoria’s favorite publication, the Pall Mall Gazette, placed news of Annie Besant’s arrest — “on a charge of publishing a book alleged to be immoral” — immediately after the lines detailing Her Majesty’s daily engagements. The queen’s activities necessitated two lines of type. Annie’s warranted 33.

Adapted excerpt reprinted with permission from A Dirty, Filthy Book: Annie Besant’s Fight for Reproductive Rights, by Michael Meyer, now available in paperback from WH Allen. © 2024 by Michael Meyer. All rights reserved.

“A Party for the White Man”

When an April 1964 Gallup poll asked Republicans whom they would most like to see nominated as president, only 15% named Barry Goldwater; 52% named Nelson Rockefeller, Henry Cabot Lodge Jr., George Romney, or William W. Scranton, all liberals on civil rights with support among African Americans. Outraged by Goldwater’s opposition to the Civil Rights Act, Pennsylvania Governor Scranton stepped into the race and launched a “stop-Goldwater” bid with Rockefeller’s endorsement just five weeks before the national convention. Though too late to win any primaries, he hoped that his campaign could sway delegates who agreed with him that Goldwater’s extremism represented “a weird parody of Republicanism.”

Conservatives, however, had a three-year head start packing state delegations with Goldwater supporters. Eliminating Black-and-Tan remnants from Southern delegations was essential to this strategy. South Carolina’s state party, which had been open to black inclusion in the 1950s, issued a report in the early 1960s boasting that “not a single Negro showed any interest” in the party, which “was welcomed by new Party leaders as victory in the South at any level could never be achieved by a Negro dominated party.” Georgia’s Republican Party continued to welcome black participation through the early 1960s, and an African American served as vice chairman of the state party. One white official bragged that the GOP was one of only two “integrated public organization[s] in the state.” At the 1963 state party convention, the Fulton County Republican Committee proposed a platform endorsing black equality. Not only was the statement rejected, but the delegation from Atlanta was not prepared for the onslaught of conservatives who had only recently become interested in the mechanics of party gatherings. Whereas previous state conventions had averaged fewer than 400 participants, conservatives, including many former Democrats, filled the convention with more than 1,500 delegates. By the final day, they had removed every single black leader from power, including John H. Calhoun, the man who delivered Atlanta’s black vote to Nixon in 1960. For the first time in 40 years, Georgia’s delegation to the Republican National Convention was entirely white. One of the party’s new officials proclaimed, “The Negro has been read out of the Republican Party of Georgia here today,” and members celebrated with an all-white banquet.

 

Finally in control of the party’s most powerful committees and inspired by the GOP’s first staunchly conservative presidential nominee in decades, Goldwater delegates sought to humiliate their establishment enemies at the Republican National Convention in San Francisco’s Cow Palace. Conservatives intentionally delayed proceedings so that Nelson Rockefeller could not deliver his convention address until midnight, or 3:00 am on the East Coast. When the New York governor finally stepped on stage, a steady stream of boos interrupted him for a solid three minutes. Black delegates faced similarly disrespectful treatment from members of their own party. The vice chairman of the DC Republican Committee, Elaine Jenkins, recalled, “There was no inclusion of black Republicans as a group at the convention. White staffers treated the few of us present as truly non-existent or invisible.” On one occasion, Goldwater’s “Junior Sergeant at Arms” blocked four black men from entering the convention floor, including Edward Brooke, one of the most powerful public officials in Massachusetts. It was not until a Nixon associate, John Ehrlichman, intervened on their behalf that they were granted entry.

African Americans in attendance also faced verbal and physical abuse. Memphis Black-and-Tan leader George W. Lee had to be escorted from Scranton headquarters to an undisclosed motel after receiving death threats during his contest against Memphis conservatives. When Clarence Townes, the only African American in Virginia’s delegation, cast his vote for Rockefeller, he “was forced to flee from the convention hall in company with television newscasters to escape angry conservatives.” In one of the most shocking events of the convention, William P. Young, Pennsylvania secretary of labor and industry, noticed smoke coming from his clothes after a heated exchange with a group of Goldwater delegates. After burning his hand to extinguish the flames, he discovered four holes burned into his suit jacket from a lit cigar placed in his pocket by an unknown assailant. The event was witnessed live by television cameras and reporters on scene. Shortly thereafter, one Southern entrepreneur began selling “Goldwater Cigars,” which included a card that read, “These cigars can be used in many ways … Some Republican People at the San Francisco Convention Slipped a Lighted Cigar into a Negro Delegate’s Pocket! They Say He Seemed to Get the Idea That He Wasn’t Wanted. And He Left the Room in a Hurry!”

The events at Cow Palace confirmed many black Republicans’ worst fears about their party. Edward Brooke described the convention as “an exercise in manipulation by a zealous organization which operated with a militancy, a lack of toleration, at times a ruthlessness totally alien to Republican practices.” In a convention hall filled with Confederate flags waved by southern delegations, one African American remarked, “it’s clear to me … that this taking over of our party is based on resentment of civil rights advances.” Sandy Ray of New York lamented that his party had become home to “extremists, racists, crackpots, and rightists. What we experienced at the convention television onlookers could not believe.” Jackie Robinson declared, “as I watched this steamroller operation in San Francisco, I had a better understanding of how it must have felt to be a Jew in Hitler’s Germany.”

Scranton’s last-ditch campaign failed, and Goldwater easily secured the GOP nomination. Although a July poll of registered Republicans found that 60% favored Scranton, conventions are not democratic proceedings. Southern delegations, under their newfound all-white leadership, cast over 97% of their votes for Goldwater. By refusing to slate African Americans as delegates even from diverse states like California and replacing black leaders in Georgia and Tennessee with conservative whites, Goldwater’s forces had reduced black representation at the national convention to its lowest level in over 50 years. Especially disconcerting to moderates was that 7% of the convention’s 1,300 delegates were members of the anti-civil rights John Birch Society, while only 1%, or 14 individuals, were African American. Conversely, the 1964 Democratic Convention featured a record 65 black delegates.

In his acceptance speech, Goldwater rejected another opportunity to reconcile with moderates and liberals. Although civil rights had been the nation’s most pressing domestic issue for the past four years, the nominee did not make a single reference to the movement. He identified communism and an ever-expanding federal government as the primary threats to American “liberty,” but conspicuously left Jim Crow off the list. He used the words “free” and “freedom” 26 times, though none referred to the ongoing struggle for black equality that raged in the South. In the same summer that Goldwater delivered his convention speech, four civil rights activists were murdered, 80 were beaten, and 67 black churches, homes, and businesses were burned or bombed in Mississippi alone. Goldwater said nothing of this wave of southern violence in his convention speech, yet railed against violence in the “streets” of Northern cities. He also expressed his disdain for moderate Republicans, who had so often dismissed him as an extremist, and famously proclaimed, “extremism in the defense of liberty is no vice! and … moderation in the pursuit of justice is no virtue!”

Civil rights activists dressed up as Ku Klux Klan members to protest racists supporting the presidential campaign of Barry Goldwater at the Republican National Convention, San Francisco. Photograph by Warren K. Leffler. [Library of Congress]

Despite the convention’s efforts to alienate them, African Americans refused to withdraw quietly. According to Jet, black Republicans “poured out in numbers” to attend an anti-Goldwater rally led by Jackie Robinson and the Congress of Racial Equality. The rally, whose participants also included Nelson Rockefeller, Henry Cabot Lodge, George Romney, Jacob Javits, and Kenneth Keating, culminated in a 40,000-person march from Market Street to Cow Palace. On July 15, African Americans assembled at the Fairmont Hotel to discuss protest strategies. Temporarily naming themselves the Negro Republican Organization (NRO), the group issued a statement read by William Young to the press. “We have no confidence” in Goldwater’s “ability to enforce” the civil rights bill, they announced, and pledged to “defeat that minority segment” of the GOP, “which appears determined to establish a lily-white Republican Party.” Jackie Robinson called for a coordinated NRO walkout from the convention floor, but George Parker of Washington, DC, cautioned that because of their small numbers, “it would look as if they were just going out to lunch.” They ultimately agreed to stage a “walk around” instead of a walkout, hoping that television cameras would broadcast their protest. The demonstration occurred as the convention began counting verbal votes for the presidential nominee. A counter-protest soon eclipsed the demonstration, and journalists found it difficult to see the marchers amid “a tunnel of Goldwater banners, signs, pennants, streamers, and flying hats.”

To the black delegates who formed the NRO, leaving the party was not an option. The hostile national convention provided motivation to continue their fight against an uncompromising conservative movement. As Sandy Ray declared after the convention, “if we sit quietly and allow this band of racists to take over the party, we not only signal the end of the party of freedom, we also help to bring about the total destruction of America through racism.” When asked in 1968 if he had ever considered leaving the GOP, George W. Lee somberly replied that “during my Goldwater fight in San Francisco … I was a lone individual down there,” but he never thought of leaving, because “somebody had to stay there in the Republican Party and fight, and fight, and fight with the hope that the Republican Party wouldn’t be made a party of ultra-conservatism and further than that, a party for the white man.”

Adapted excerpt reprinted with permission from Black Republicans and the Transformation of the GOP, by Joshua D. Farrington, now available in paperback. © 2016 by the University of Pennsylvania Press. All rights reserved.

Lacking a Demonstrable Source of Authority

In the early 1880s Crow Dog, a member of the Sicangu Lakota Oyate (or Brulé Lakota or Brulé Sioux in older literature), ambushed and killed Spotted Tail, who was also Sicangu Lakota. The event took place on tribal land, which was increasingly surrounded by colonizers yet still very remote from what then constituted much of the United States. The motivation for the killing was assuredly political in nature — both men vied for leadership authority within the community — but it may have had personal elements as well. Nonetheless, the heinous act created a rift within the community, one that the community sought to repair. Shortly after the killing, a tribal council sent peacemakers who helped the families negotiate a settlement. The issue was resolved within the community to its satisfaction when Crow Dog and his family agreed to pay Spotted Tail’s family $600, eight horses, and a blanket, an astounding sum that testified to the significance of the family’s and community’s loss.

Although the matter was settled to the community’s satisfaction under Sicangu Lakota Oyate law, federal officials seized on what they regarded as an opportunity. At the time, in the 1880s, the federal government was in the early stages of what is regularly referred to as the Allotment Era of federal policy. The Allotment Era, lasting from approximately 1871 to 1934, was defined by efforts to destroy tribal nations and tribalism, waged by the federal government, by philanthropists who believed they were doing right, and by those who simply sought tribal lands and resources. The Allotment Era included, among other things, the process of allotment, which fundamentally changed and reduced tribal land holdings; boarding schools, which sought to eradicate tribal ways of life by forcing Native children into a Western mode of life; and Indian police and Indian courts, which enforced Western law and norms in Native spaces. It would be difficult to overstate the amount of time, energy, and resources that were directed toward eradicating tribal ways of life during the Allotment Era, or the lasting harm that the era’s efforts continue to cause.

Shortly after the matter was settled among the Sicangu Lakota Oyate, the federal government arrested Crow Dog under the pretense that a “public outcry” demanded that the killer be brought to justice. The true purpose for arresting Crow Dog, however, had little to do with public opinion. At the time, federal officials tasked with engaging with Native peoples wanted to exercise criminal jurisdiction over Native peoples on Native lands. In one respect, the sovereignty and nationhood of Native peoples made this seem absurd — much like it would be absurd if the United States tried to extend its criminal law over peoples living in Canada or Mexico. Yet, tribal nations were increasingly surrounded and imposed upon by the growing colonial force that was the United States. Under these circumstances and within the spirit of the Allotment Era, forcing American criminal law on Native peoples on Native lands felt less like an absurdity to many federal officials and more like a necessity.

Crow Dog’s arrest and trial were intended to produce a test case that would provoke American courts to decide whether the federal government had jurisdiction to exercise American criminal law over Native peoples on Native lands. The trial was swift, Crow Dog was convicted in a territorial court and sentenced to hang, and federal officials had their test case that was soon to be heard by the Supreme Court. According to legend, Crow Dog managed to convince a federal marshal to let him go free for a period of time to arrange his affairs. The day Crow Dog promised to return was cold and snowy, and few if any expected him to keep his promise. Nonetheless, he showed up on time, making him a local hero.

Crow Dog’s situation allows us to recognize that the American, or Western, system of justice is focused on punishing the offender. Crow Dog, under this vision of criminal justice, needed to suffer roughly as much harm as he had caused. Federal officials sought the death penalty and were incensed when he “went free” under tribal law. However, for the Sicangu Lakota, and for many tribal nations, the focus of the criminal justice system was not on punishing the offender but rather on making the victim (or the victim’s family) as whole as possible. Restoring a sense of balance and harmony within the community was the foremost goal, best accomplished through restitution rather than punishment. Consequently, under the Sicangu Lakota system, Crow Dog was not buying his way out of or otherwise avoiding justice but fully and meaningfully participating in effectuating it.

 

Every single court case, from the biggest to the smallest, is just a question in search of an answer: Did the accused commit the crime for which she or he is on trial? Did the company breach its contractual obligations? Is a tomato a fruit or a vegetable? Consequently, the key to reading and understanding court opinions is to discern the question that the court is trying to answer. When the test case that emerged from Crow Dog’s situation reached the Supreme Court, the question to be considered was blissfully uncomplicated and likely obvious: Did the federal government have jurisdiction to enforce American criminal law over Native peoples on Native lands?

The answer, according to the Supreme Court in its 1883 decision Ex parte Crow Dog, was an equally simple “no,” even if the methodology for arriving at that answer was somewhat convoluted and the language employed by Justice Stanley Matthews in the majority opinion was replete with the types of rhetorical unnecessities that Strunk and White sought to kill off. Put most simply, the federal government had already given itself jurisdiction over crimes committed in “Indian Country” through two statutes. Yet in those statutes the federal government specifically exempted from its jurisdiction crimes committed by one Native person against another Native person, as well as crimes by Native people that had already been punished by the tribal nation. Since Crow Dog clearly fell within both exceptions, the lawyers for the federal government sought alternative justifications for federal jurisdiction and settled on tribal cessions made in an 1868 treaty and an 1877 agreement. The Supreme Court rejected this line of reasoning, stating among other things, “It is quite clear from the context that this does not cover the present case of an alleged wrong committed by one Indian upon the person of another of the same tribe.” Without jurisdiction, the federal government was forced to free Crow Dog, at which point he returned to his community, lived to an old age, and remained a thorn in the side of federal officials.

The Supreme Court’s decision in Crow Dog was unquestionably a victory for Crow Dog and the Sicangu Lakota Oyate particularly, and for tribal sovereignty and Native America more generally. It was an acknowledgment by the courts of the United States that the federal government, in what might be understood as a commitment to its foundational principles, could not simply assert its authority without a basis for that authority. The decision is rightfully celebrated for what it stands for.

Unfortunately, victories for tribal interests in American courts are rarely complete or without some corresponding aspect or aspects that diminish, limit, or completely negate the positive impact of the case for Native America. This is so with Crow Dog. Two distinguishing features significantly dull the shine of this particular outcome. The first is the rationale upon which the decision was made. While the final result of the case supported tribal sovereignty, Justice Matthews’ opinion makes clear that this was more an unintended consequence than a purposeful goal or statement of principle. The main focus in Matthews’ opinion was on federal claims to authority and their sources, or lack thereof. There is no discussion whatsoever of tribal criminal procedures or that the matter was handled within the community to the community’s satisfaction.

The limited discussion of tribal peoples and methods in the opinion centers not on Sicangu Lakota Oyate structures or law but on the supposed deficiencies of Native America. In language that echoed earlier decisions and portended future ones, Matthews described Native peoples as “wards subject to a guardian” and “a dependent community who were in a state of pupilage.” Consequently, Matthews would later argue, it was unfair to measure Native peoples against American law. As part of the most famous passage in the case, Matthews wrote that the application of American law to Native peoples “tries them, not by their peers, nor by the customs of their people, nor the law of their land, but by superiors of a different race, according to the law of a social state of which they have an imperfect conception, and which is opposed to the traditions of their history, to the habits of their lives, to the strongest prejudices of their savage nature; one which measures the red man’s revenge by the maxims of the white man’s morality.”

Native peoples, inferior to their American counterparts according to Matthews, were merely the lens through which to view American jurisdiction and process. Although the opinion happened upon such an end, Matthews clearly did not intend to foster or support tribal sovereignty or methodologies. On the contrary, Matthews’ opinion demonstrates a low regard for Native peoples. Even though it was a win for tribal interests, the case has limited usefulness as a building block or intellectual basis for subsequent arguments in favor of Native rights and authority.

The second prominent feature of Crow Dog that mitigated its benefit for Native America was how Justice Matthews opened the door to a reconsideration of the result. Near the end of his opinion, Matthews wrote that to find American jurisdiction over Crow Dog’s actions on tribal land was “to reverse in this instance the general policy of the government towards the Indians, as declared in many statutes and treaties, and recognized in many decisions of this court, from the beginning to the present time.” Had Matthews ended here, he would merely have made the type of general observation that is found in countless court opinions and that may be more or less accurate but is often ephemeral and mostly harmless. However, he did not stop with this bit of fluff. Instead, he continued by stating, “To justify such a departure, in such a case, requires a clear expression of the intention of Congress, and that we have not been able to find.”

 

Particularly at its highest levels, we often conceptualize the three branches of the American government as sometimes “talking” to each other. Since authority is divided among the president, Congress, and the courts, none of the three can exercise its will without limitation. To that end, when one branch runs against the boundaries of its authority, it will sometimes signal to another branch what it would like to see done, propose an alternative path to a goal that cannot be accomplished as things currently stand, or otherwise offer guidance, advice, or requests.

Understood within this framework, Justice Matthews was very much “talking” to Congress through this opinion. The Court was unable to find American jurisdiction over Crow Dog and tribal lands under the circumstances with which it was presented. Consequently, it is difficult to understand Matthews’ assertion that it would take a “clear expression of the intention of Congress” for the Supreme Court to find jurisdiction as anything other than an open invitation to Congress to change the circumstances. Matthews offered his brief description of the “general policy of the government towards the Indians” and then explained how Congress might alter that general policy with a “clear expression.” Matthews deliberately neutered the opinion’s capacity to protect and acknowledge tribal sovereignty by describing to Congress how to overcome the ruling in future cases.

Two years later, Congress accepted Matthews’ invitation, passing the Major Crimes Act in 1885. As originally constituted, the new law gave the federal government jurisdiction over seven “major” crimes committed by a Native person against another Native person in Indian Country, including murder. Federal officials and others seeking to radically transform Native peoples and ways of life had another weapon in their arsenal, just as they had hoped when they initiated the action against Crow Dog.

Of course, just because Congress passes a law doesn’t mean that it has the authority to do so. As many of us learn in our tenth-grade civics class, our government is one of limited and enumerated powers. Years ago, after I finally looked up the word “enumerated,” I better understood the basic premise that the phrase “limited and enumerated powers” is intended to invoke: governmental authority extends only as far as is spelled out in the U.S. Constitution. Put differently, unless the power to act is articulated in the U.S. Constitution, the government doesn’t have that power. This is how we assess the constitutionality, or validity, of laws — those laws that are made under demonstrable grants of authority are constitutional and valid, and those laws that lack a demonstrable source of authority are unconstitutional and invalid. When an assessment of the constitutionality of a law occurs in a court, we refer to the process as judicial review.

Congress passed the Major Crimes Act, but this in and of itself did not settle the question of American jurisdiction over Native individuals on tribal lands. Eventually the constitutionality of the Major Crimes Act would be tested.

Adapted excerpt reprinted with permission from The Worst Trickster Story Ever Told: Native America, the Supreme Court, and the U.S. Constitution, by Keith Richotte Jr., published by Stanford University Press. © 2025 by Keith Richotte, Jr. All rights reserved.
