The Most Integrated Institution in West Texas

In May 1960, Canyon, Texas, was a bona fide, used-to-have-the-city-limits-sign sundown town. It had six thousand residents, and not a single one of them was African American. In Randall County, Black residents made up only 0.0016 percent of the population. West Texas State College (WT) was still segregated, still mostly a finishing school for aspiring teachers, and still a college that drew its student body from lily-white Panhandle towns. And the school was under a court order to desegregate.

Desegregating WT was always going to be a very different cultural experience than in other parts of Texas or the larger South. For the overwhelming majority of its students, having regular encounters with a single African American would be a novel experience. The population of West Texas in 1960 was still 98 percent white and native-born. A third of West Texas’ 107 counties had a total Black population of less than 1 percent. Even in the handful of counties where the Black population approached 10 percent, strict residential segregation kept the populations separate. This homochromatic demography helps explain why many West Texas public school systems were among the first in the state to desegregate after the Brown v. Board of Education decision. Places like Friona, the very first school district in Texas to integrate, were relieved to be free of the financial and organizational burden of maintaining a separate school system for six students.

In the six years since the Supreme Court’s decision in Brown, WT’s strategy had been to ignore or turn down without comment applications from African American students. But after its attempts to deny admission to a local African American man failed in February 1960, WT admitted its first Black students that fall. Helen Neal, an Amarillo elementary school teacher just a few credits shy of her degree, became the school’s first African American graduate the next year.

Looking forward, WT’s administrators, somehow, arrived at the conclusion that they could perhaps offset the discomfort that parents, locals, and students felt about integration by recruiting talented Black athletes and building a winning football program. Which sort of makes sense in football-mad Texas. A 1957 study on attitudes among West Texas high school students toward integration revealed that 95 percent of white high school seniors believed that Black students should be allowed to play sports.

WT administrators found an eager accomplice in new head football coach Joe Kerbel. A barrel-shaped, three-hundred-pound offensive genius, Kerbel was already a Texas high school coaching legend and an assistant coach at Texas Tech when WT hired him in 1960. In just a decade of high school coaching, Kerbel had returned both the Breckenridge Broncos and the Amarillo Sandies to state-powerhouse status. He had a record of 74-17-1 with five district titles; his teams had played in the state title game four times and had won twice. He also picked up two state track championships. He was considered one of the best young coaches in the state.

Kerbel was a popular choice; the press loved him; he was honest, funny, and quotable. Boosters loved him; he was accessible and enthusiastic. And WT had gone 1-9 its last two seasons. Kerbel surrounded himself with some of the best high school assistant coaches around. After a mediocre first season, the Buffaloes’ new coach launched a recruiting blitz that heavily featured Black players. He focused on recruiting junior college athletes, good football players looking to play big-time ball. He signed fullback Ollie Ross and end Bobby Drake out of California and the blazingly fast Pete Pedro from Trinidad College in Colorado. When the 1961 season opened, he had the fastest running back in the country.

The Buffaloes went 6-4 that season, defeating Brigham Young University and handing the 8-1-1 Arizona Wildcats their only loss. Pedro was an immediate sensation; he led the NCAA in scoring, finished second in rushing yards, and averaged 7.1 yards a carry. Coaching one of the only integrated college teams in Texas, Kerbel demanded that Pedro and the other Black players be treated with nothing less than equality and respect. And he ordered all his players to be responsible for one another, Black or white. Still, Black players were obvious novelties in an all-white school and town, where they were subject to drive-by epithets and discrimination. Sportswriters and boosters, for example, insisted that Pedro was Puerto Rican, not African American. For the most part, however, the players were accepted on campus and embraced by the team.

Kerbel recruited even more Black athletes, enjoying a pipeline of Black high school players in Texas who were still not allowed to play in the Southwest Conference, along with white players in the state considered too raw or wild to be a Longhorn or an Aggie or a Horned Frog. He promised, “If black players will win for us, we’re gonna keep recruiting ’em.” By 1967, almost every Black man on campus was a football player.

Recognizing the nature of Kerbel’s racial broad-mindedness is important. There was no larger social project at play here, nor was his use of Black players steeped in some transactional cynicism. It was simpler than that. Joe Kerbel was consumed by winning. Black players helped him win games. So, he recruited Black players. For the first time in his coaching career, he was in control of choosing his players and program, and he took full advantage. He pushed West Texas State’s athletic budget and the administration’s patience to every limit. His spending was out of control; he ran up enormous long-distance phone bills and put coaches on planes to chase recruiting rumors. His assistants spent weeks on the road recruiting, living out of motels, eating at drive-ins, and constantly wiring for more cash.

He took good care of his players once they were on the team. On the road, players enjoyed first-class treatment, staying in nice hotels and eating good meals. They took airplanes to faraway games, and Kerbel always scheduled at least one game a year far out of state—Ohio, Tennessee, California, Michigan, Montana. (For most of his players it was the first time they had flown on a plane and, for many, their first time out of Texas.) He insisted that his players have the same equipment as professional teams.

That treatment came at a price. Kerbel’s practices were brutal hours-long exercises in working toward perfection and building an endurance few teams could match. Former WT running back Mercury Morris compared the program to the Marine Corps’s Parris Island (where Kerbel had done basic training in WWII). Kerbel roamed the field, watching his teams through his ever-present blacked-out Wayfarers, yelling at players (“Hey, stupid, if you put your brains in a gnat’s ass it would fly backwards”), punching them in the chest, pulling their ears. A go-to Kerbel move was to angrily waddle onto the field, kicking players in the butt as he went. There were curfews and mandatory meetings and study halls. Players could not miss class or church on Sunday. Most of his players were convinced they hated him.

By mid-decade, the Buffs were one of the most feared teams in the country. With the collapse of the Border Conference in 1962, West Texas State became an independent program, and Kerbel wasn’t afraid of any school. He scheduled home and away games with teams like Memphis State, Colorado State, Bowling Green, Northern Illinois, East Carolina, Montana State, and Arizona State. His teams won the Sun Bowl and the Junior Rose Bowl. He ran a pro-style Split-T no-huddle offense, and his teams were high-scoring, high-yardage juggernauts. Over three seasons, the Buffs averaged almost thirty points and 400 offensive yards a game. Hank Washington, a rangy six-foot-four signal caller with a bullet arm, was one of the few Black quarterbacks playing Division I football in 1966 when he led the nation in scoring, completing 261 passes for over 2,000 yards and 17 touchdowns. (Many predicted he would be the first Black quarterback to start in the NFL.)

WT had a reputation for more than just big offensive numbers. It was an “outlaw” program, one of the most unusual teams in college football, “a home for those who just didn’t fit anywhere else.” Forty percent of the team was Black, an unheard-of number for a college football team in Texas. It was the most integrated institution anywhere in West Texas. When on the road, Kerbel assigned rooms by position and had Black and white teammates room together, an unheard-of gesture in 1960s Texas. To the shock of some parents, Kerbel had both Black and white players helping him with his summer high school clinics. The players, both the good ol’ boys from Texas and the Black players from across the country, got along. The team reveled in its reputation for being tough and a little crazy. And Kerbel’s sole focus on winning gave his players the freedom to exercise a degree of individuality that few college coaches in those years could possibly comprehend. It’s perhaps one of the reasons that WT was considered not only a pipeline to the NFL—at one point in the early 1970s, more than fifty former Kerbel players were in professional football, rivaling the numbers of the big schools like Ohio State and Alabama—but also a wellspring of professional wrestling. Across the 1970s and 1980s, the top bills of wrestling cards were chock-full of ex-Buffalo football players. So many pro wrestlers showed up at the spring alumni game one year that they were able to sneak Dick Murdoch (Dusty Rhodes’ tag-team partner) into uniform. Legend has it he even scored a touchdown.

Kerbel was untouchable. He once told star running back Duane Thomas, “As long as I keep winning these sons of bitches can’t say anything to me.” He was wrong.

Postcard of West Texas State College, 1946. [Library of Congress]

In spring 1968 the racial climate at WT began to change. That semester, Black students, led by football players, stood up to challenge the mechanisms, symbols, and practices of white supremacy on campus and in the community. While most Black men at WT were on athletic scholarships, the African American student population had grown beyond just athletes. There were plenty of other students of color who had been attracted to the school’s business and education programs. Two Black fraternities had been chartered, Omega Psi Phi and Kappa Alpha Psi.

The African American community grew close over that year, drawn together by the countless micro and macro aggressions its members faced each day. WT administrators’ scheme to change the local racial climate with a winning football team had largely failed. Canyon was still very much an unwelcoming environment for Black students. Local rednecks screamed racial slurs as they drove through campus. Black students were not welcome at local restaurants and were barely tolerated in stores. The drugstore kept its “black” makeup under lock and key. The only larger African American community nearby was in Amarillo, and sometimes WT students were not welcome there either.

The assassination of Martin Luther King Jr. was the catalyst for change. As Mercury Morris remembered: “We raised our fists. And for the first time, Whites began to fear us … and listen.” There was no official recognition of King’s murder in either Amarillo or Canyon, where the flags still flew at full mast. As native son Buck Ramsey reported, “When news came to the Panhandle that Dr. King had been assassinated, the general reaction seemed to be one of rejoicing and celebration, a feeling of relief and, undoubtedly, of vicarious vengeance.” But it was different at WT, where local ministers and faculty, including a sociology professor who had gone to graduate school with King at Boston University, organized an informal march from downtown Canyon to the Episcopal Student Center. It began with twenty parishioners from the Presbyterian Church and grew as it passed other churches: First Christian, First United Methodist, First Baptist, St. Paul Lutheran; congregants from some of the smaller chancels off 4th Avenue joined. Canyon police had denied a permit request to use the street and warned of arrests, so marchers kept to the sidewalks. Local police watched the procession carefully, as did military intelligence and the FBI, whose agents took photos of every participant. A dozen Black WT students joined, each wearing a black armband. The march ended on the lawn of the Episcopal Student Center, where a short tribute service to King was held.

Grieving and angry, WT’s African American community met informally and with great frequency over the next few weeks to talk about how race worked at WT and in Canyon. It was all ad hoc: there was no Black Student Alliance, no NAACP chapter on campus or even in Canyon, few obvious allies among the faculty or administration. Other than the athletes, most Black students didn’t even live on campus. There was, however, one solid bloc of students interested in forcing change at WT: football players, including several white players.

The big issue that spring was West Texas State’s official participation in “Old South Day,” an annual celebration of the Confederacy put on by the Kappa Alpha fraternity. The Kappa Alpha Order, founded immediately after the Civil War by a former Confederate soldier, was among the most important Lost Cause organizations in the South. Through its rituals, publications, and celebrations it pushed a narrative of southern history that painted the antebellum South and plantation slavery in a positive light and portrayed the Civil War as a fight for the noble and glorious cause of states’ rights, Reconstruction as a corrupt failure, and Ku Klux Klan terrorists as heroes who restored white supremacy to the South. West Texas, unlike most of the rest of the state, had largely avoided organizing its history around the Lost Cause, preferring to emphasize its own frontier mythology as a narrative structure and ideological framework. The first Confederate monument in West Texas didn’t go up until 1931 (in Amarillo’s Ellwood Park), and while there were plenty of counties named for Confederate officers, those counties had been named long before white settlers moved to the region. There were no schools named for Confederates until after World War II. Compared to the rest of Texas, the United Daughters of the Confederacy and Sons of Confederate Veterans were weak in the region.

But that began to change in the 1950s. In direct response to the Civil Rights Movement, Lost Cause mythology surged again; this time it reached West Texas. New schools were named for Robert E. Lee and other Confederate heroes. New monuments to Confederate dead appeared in front of county courthouses in towns that hadn’t existed in 1865. When all-white Tascosa High School in Amarillo opened in 1958, it adopted the “Rebel” as its mascot. A student dressed as a Confederate general roamed its sidelines at football games and took center court at basketball games. He slashed the air with his saber, leading students in a Rebel Yell as they waved the Stars and Bars. The school’s choir called themselves the “Dixieland Singers,” and “Dixie” was its fight song. The cutest and most popular girl in school was named “Southern Belle.” It was also in 1958 that the Kappa Alpha Order organized at West Texas State. 

Old South Day was started in 1949 at the University of Alabama as a celebration of the Confederacy. It kicked off every year with the fraternity “seceding” from the university and parading in Confederate uniforms through campus. After that came the white-supremacist cosplay dance balls with hoop skirts and elaborate Confederate regalia. The event was, as historian Anthony James described it, a place where white male students “performed the drama [of the horrors of the Civil War and Reconstruction] from a position of dominance. They could glory in past defeat because, ultimately, they emerged victorious. White supremacy, albeit in a different form, was restored.”

Led by football players, WT’s protest of Old South Day began just a month after Martin Luther King’s assassination. A meeting on May 2 created a petition demanding that the administration refuse to participate in a grotesque celebration of the Slave South. The meeting also produced special committees that would articulate other grievances and plan demonstrations. Gary Puckrein, a freshman football player from New York, emerged as the group’s spokesman. He appeared before a student court and not only demanded that the administration not take part in Old South Day but also called attention to the “Whites Only” signs at fraternity lodges, the fact that there were no faculty of color at WT, and the fact that WT did not offer courses in either African or African American history. “Black people need to know about their history,” he said, and summed up the attitudes of the administration this way: “They don’t care about the Negroes; they are racists.” He concluded his remarks by comparing the Confederate flag to the swastika and promised the court that he would burn any Confederate flag he saw flown on campus.

The administration ignored the demand for an injunction, and student petitions gained more signatures. The Kappa Alphas kept preparing for their big weekend and seemed genuinely confused about the uproar (even though other chapters had discontinued Old South Day after facing similar protests). Fraternity president Dick Flynn promised that the Kappa Alphas meant no offense and tried to reassure the campus community by saying that the fraternity had no interest in the “revival of slavery.” As the hour of the parade approached, a nervous tension gripped campus. Canyon police had placed sharpshooters on rooftops and brought in state Department of Public Safety officers to help them secure the route. Campus administrators feared a riot. Protesters lined the streets, waving signs: “The war is over—You Lost!” and “To Hell with the Old South.” But they were peaceful. The Kappa Alphas nervously made their way to President Cornette’s door, where Dick Flynn read a greatly abbreviated version of his speech. Cornette snatched the order of secession and quickly shut the door. Flynn looked at the closed door for a moment before remounting his horse and leading the KAs away from campus. WT’s official involvement in Old South Day ended there.

Mercury Morris recalled the event as the day the fraternity exercised “its constitutional right to make a fool of itself” and he and his teammates exercised “their constitutional right to protest their tasteless, not-so-hidden message.”

Local conservatives freaked. The Randall County Republican Party, led by longtime right-wing activist J. Evetts Haley, who had played football for WT half a century earlier, prepared a special resolution for the upcoming county convention. Among its various WHEREASes were scattered complaints about Kerbel and his players. In its official document that kicked off that campaign season, it decried the school’s “indiscriminate recruitment of Negro athletes” and “the moral principles involved.” The very presence of Black players, the resolution insisted, had “accentuated” social unrest, placed an undue burden on local police, and had offended the “moral sense of this community.” The Republicans labeled the Old South Day protests as “communistically-oriented agitation and anarchy” and an affront to “the exercise of a traditional wholesome rite on the West Texas State University campus.” They demanded that WT administrators recognize that the continued threat of “violence and bloodshed” was more important “than the winning of ball games.”

After the 1970 season, in which the Buffs went 7-3, with wins over East Carolina, Bowling Green, and Southern Miss, and had three players (all African American) selected in the NFL draft, WT declined to renew Joe Kerbel’s contract. Every single one of his assistant coaches quit on the spot. Kerbel died of a heart attack two years later; he was fifty-one years old. In 2020, less than five percent of WT’s student body was African American. The Black population of Canyon was less than three percent. Kappa Alpha banned Old South celebrations in 2016.

This excerpt is adapted from The Conservative Frontier: Texas and the Origins of the New Right, by Jeff Roche, published by the University of Texas Press. © 2025 by the University of Texas Press.

The Women Who Ran Senegambia

When it came to the relations between men and women, 17th century Cacheu — a port town in today’s Guinea-Bissau — was a very different place to the world of Counter-Reformation Portugal, where young girls were dressed in nun’s wimples in the streets of Lisbon to prepare them for a life of seclusion. In the halls of the Inquisition of Lisbon it was men who commanded — as Inquisitors, notaries, scribes, and translators — and women appeared only in the guise of those to be condemned; in 17th-century Europe more broadly, women’s activities were hampered by (male) administrative power. Cacheu was different. The most powerful trader in the town was a woman called Crispina Peres, whose success and challenges to the patriarchy of empire eventually saw her arrested in 1665 by the Portuguese Inquisition on trumped-up charges of “witchcraft.”

There are several ways in which the power of women emerges through Peres’ inquisition trial. The first is that the documents exist at all. This was the sole Inquisitorial trial taken throughout the 17th century from Guinea-Bissau. Given the acknowledged fact that Inquisitorial procedures were often a means of gaining revenge on one’s enemies, this sole extant trial could only have been taken against an exceptionally powerful individual. The number of enemies that Crispina had is clear from the trial records. And the more powerful the person, the larger the number of their enemies.

Moreover, many of the witnesses pointed the finger not only at Crispina’s “heretical” activities, but also at those of many women who were friends of hers. These women were also said to make sacrifices in Cacheu’s religious shrines known as chinas. When the original accusation was formulated in the pages of the trial, it was made not only against Crispina but also against “Genebra Lopes and Isabel Lopes, single, Black women resident in [Cacheu].” In Cacheu, women ran their own households, supported one another in commercial business, and were the clear commercial, religious, and social rivals of the upstart (male) Atlantic traffickers in the town.

The evidence of Sebastião Vaz, a boatswain and resident of Cacheu, can stand as representative of this:

He knows as a matter of certainty that when there is need, or something is lost, Genebra Lopes, a single Black woman, resident of this settlement of Cacheu, buys palm wine and chickens and sends them and sometimes goes to the shrine … And he heard her say many times that it was [done] with that intention because … when they buy these things to make sacrifices in order for the stolen or lost [goods] to appear, they will scream, saying in their loud voices that they want to sacrifice it.

This gives a startling image of what often happened in Cacheu. When thefts took place, successful trading women whose goods had been taken would buy palm wine and chickens and then roam the streets shouting loudly that they were going to make a sacrifice to the china — with the known implication that bad things would happen to the thief if they didn’t return what they had taken. According to Vaz, all of this happened in Vila Quente, a neighborhood run by women.

Women’s power and agency in Cacheu were not therefore limited to the likes of Crispina Peres — to those who could rise to commercial dominance and the autonomy which went with it. There was also a whole neighborhood — Vila Quente — in which women ran the households and the religious life. What was it, then, about daily life in Cacheu which meant that many women had such autonomy?

 

Women’s capacity to occupy powerful roles in Cacheu mirrored the world around them. In the micro-kingdoms near Cacheu, female rulership was also a factor in political life. In her book on the micro-state of Pachesi, Gambian historian Ralphina Phillott de Almeida describes how there were at least three female rulers in the earlier phase of Pachesi prior to 1750. And 50 years or so before Crispina’s trial, the trader João Bautista Pérez described offering a flagon of wine to the queen of Cassão on the Gambia river.

Crispina’s influential position wasn’t therefore an anomaly for Senegambia. What, then, were the roots of this influence which she held? A key aspect was fluency in multiple languages. Multilingualism in West African languages gave women like Crispina a huge advantage in business dealings, where local rulers trading with Cacheu might speak Mandinka, Pepel, and/or Bainunk. These language ties also connected to her heritage, since the kinship ties she had through her Bainunk mother Domingas were important in making initial contacts with rulers.

This makes the roots of female empowerment in Cacheu clearer. These Atlantic towns were characterized by the intermarriage of Atlantic traffickers with Senegambian women. However, unlike in the Americas, the Atlantic heritage of the male traders was not an advantage. In fact, it was the reverse. In Cacheu, these men’s spouses had much greater reach: women’s autonomy was cemented by their linguistic abilities, their kinship networks, and the greater access that all this gave them to healers — and thus to the better health that empowered them.

Trade was at the heart of Cacheu’s life, and the ways in which it was conducted by the town’s men and women were vital in shaping the roles each had, and the experiences that they had of life.

 

If women dominated the marketplace in Cacheu, their commercial role there was driven by these patterns of trade. Prominent women such as Peres commissioned ships which then plied trading routes from the Gambia river in the north to Sierra Leone in the south. Most of the trading voyages themselves, however, were conducted and crewed by men, and these journeys all required long absences from Cacheu.

Meantime, while the men lived it up and aged fast, things were different for the women of Cacheu. Surviving account books provide precious information which shows how they had their own independent lines of work. They tended to work in a range of small-scale trades, as well as running small businesses. They made their living selling goods on credit to itinerant male traders, while at the same time developing their own independence.

Women of Cacheu often made small purchases from traders, either for themselves or to sell on in some small business deal. One woman, Guiomar, bought some wax and wine from the trader João Bautista Peres in 1619. Domingas Lopes bought two measures of kola nuts, chickpeas, Rouen cloth, and some handkerchiefs, while Esperança Vaz bought the same items in addition to wine. These women ran their households, sorted out the provisioning — and it seems also often sold small bits of stuff here and there to their friends and neighbors who came passing by.

This detail adds another layer to the sense of what this town was like. The busiest area was down by the port, where much of the haggling for provisions, wax and cloth was done. These women traders in Cacheu thronged around the place. They were known in Kriolu as regateiras (hawkers, or peddlers). Then they would go back to Vila Quente, where little bits of business here and there also went on as people came in and out of one another’s houses for what they needed. Meanwhile, other women who were not yet living in their own independent households, but still worked as servants or household slaves of Atlantic traffickers, also did business on their own account: one, Maria Rodrigues, the servant (criada) of António Vaz, bought raw cotton from Manoel Bautista Pérez in 1614.

This last piece of evidence on raw cotton also indicates a core element of the economic life of women in Cacheu. This was in spinning and supplying cotton for the looms that were so important to weavers of the cloths that dominated town fashions. This pattern was in keeping with general Senegambian practices, where women tended to spin cotton while men did the weaving. In Cacheu, not only did women run the marketplace, therefore, but there were other trades open to them through which they could achieve economic independence — and through which a skilful negotiator like Crispina might eventually rise to prominence.

 

Economic and social independence usually go together. Crispina was the most successful trader of her era in Cacheu, but she was also just the tip of the iceberg. Most successful women traders in the town also acted as household heads. Far from being an anomaly, Crispina’s life and household embodied the world she came from, something which tells us how different this world was from that of the Inquisitors who arraigned her before them. It was small wonder that they felt threatened by it.

Just as, in patriarchal societies, older men often marry younger women, so in Cacheu the reverse was the case. In fact, women appear often to have married younger men, as was the case with Crispina and her bedbound husband Jorge: as he put it to the cotton-spinner Maria Mendes, “men marrying older women was a laborious matter,” which according to Mendes “impl[ied] that he could not do anything with her” — and which also suggests that such marriages were not all that rare in Cacheu.

Older women might well marry younger men in a setting like Cacheu, where ambitious men from Portugal and Cabo Verde arrived, and where economic networks were likely the established domain of women who were older. This was upside down to the usual way of things in the Portuguese empire. But beyond the inverted relationship of age and marriage, Cacheu upended the gender norms of the Portuguese empire in countless ways. Many economically independent women lived in all-female households in Vila Quente. Here, they ran their small-time trading businesses. Their homes were often also the destinations of the itinerant men who roamed about the town when they returned from their trading voyages, looking for a relationship of some sort.

That women ran their own households is made clear by a piece of testimony from Crispina’s trial, in which the Friar Sebastião de São Vicente said that he had heard that the priest and vicar of Cacheu, Antonio Vaz de Pontes, was “having an affair with another parda [mixed-heritage] girl whose name he cannot recall, who lives in the house of Isabel Lopes, a Black woman, and single, resident in [Cacheu].” In other words, Lopes ran the house, in which other women also lived.

The priest Vaz de Pontes had clearly assimilated well into the ways in which the town worked. Other witnesses confirmed that he was a frequent visitor to Lopes’ house, some saying that his partner Catalina, who lived there, was expecting a child by him, and others that he had visited her straight after mass on Ash Wednesday (when he should have been more spiritually engaged). Meanwhile, when Vaz de Pontes gave his own evidence, he mentioned “the white and parda women, who headed their households.” Such households were clearly the norm, not the exception, in Cacheu.

With such strong economic, social, and emotional autonomy, women in Cacheu had sexual independence. Even a priest like Vaz de Pontes thought nothing of having a relationship with his girlfriend Catalina in Isabel Lopes’ house. When, a few years before the trial of Crispina Peres, the reprobate Capeverdean priest Luis Rodrigues was also tried by the Portuguese Inquisition, evidence from Farim (100 miles or so upriver from Cacheu) made it clear that in these communities women were free to choose their sexual partners as they wanted to. It was, after all, Rodrigues who was placed in the Inquisitorial dock as a priest, for soliciting women in the confessional. The women of Farim, on the other hand, clearly felt they had the social freedom to participate in raunchy parties at his house. As one witness put it, 

the said priest always went around with a shirt half-undone, and full of wine, ordering dances to take place at his house where he brought together both pagan and Christian women, and all this with a great fanfare and producing a huge scandal in everyone since he committed sins with them.

This remarkable and rare evidence gives a new perspective to gender relations in 17th-century West Africa. Women’s sexual freedom was complementary to their lives in Cacheu. It was they who kept the town’s economy and social life going while their men were absent, ill, raging drunkenly at their social impotence, or dying. Their economic dominance and cultural capital gave them the freedom to run their own households, welcoming in other women who needed a place to live. There they sought to care for the health of those they loved as best they could, whether children, lovers, girlfriends, or spouses in decline.

 

For Crispina, her dominance in this setting proved her downfall: for her own Inquisition trial stemmed from these roles which she filled in her community. The women of Cacheu were able to move about with freedom from one place to another, as the cotton-spinner Lianor Ferreira did from Cabo Verde to Farim, and as Crispina herself had done in her life. This kind of freedom of movement and autonomy was just a workaday aspect of daily life. Yet such independence was shocking to arrivals from Portugal, and so became the source of many rivalries. These tensions then spilled over into these Inquisitorial trials — as the constant references to quarrels, drunkenness, and disputes make clear.

All this can begin to bring together a sense of what these women’s lives were like. They moved around, traded and often shouted loudly in the streets of the town if they needed to go and make a sacrifice to the china in Vila Quente. Their voices and inclinations shaped Cacheu. As the work of Ferreira and of the other women we have discussed here shows, cotton and textiles were important to them. Fashion counted, and women had an important place in supplying cotton to weavers and in selecting fabrics. They often wore blue-and-white cotton shifts, made of the same cotton as the men wore, with another hanging loose from their shoulders.

On the other hand, it would be a mistake to think that these relations were somehow static. There is strong evidence that the increasing violence associated with the transatlantic traffic in enslaved Africans affected gender relations. With men lost in warfare and captured for enslavement (with the strong preference by traffickers for male captives), women in rural settings and away from the urban spaces of the ports and other commercial towns had to take on a much greater labor burden. 

Thus, while there may have been freedom and autonomy for successful women in Cacheu and Vila Quente, they were also very much the lucky ones. The autonomy of their lives was an important and little-known aspect of gender relations in West Africa in the 17th century, but it was part of a larger global transformation that would have immense consequences for West Africa’s relationships with the world and for those caught up in its tragedies.

Excerpt adapted from The Heretic of Cacheu: Crispina Peres and the Struggle over Life in Seventeenth-Century West Africa by Toby Green, published by The University of Chicago Press. Copyright © 2025 by Toby Green. All rights reserved.

Reactionary Revolutionaries

The Confederate States of America and Mexico’s Conservative and Second Imperial governments failed spectacularly. In retrospect, it is easy to shelve them as the product of strange times and particularly asinine politics. Nevertheless, understanding these regimes clues us into the possibilities of politics at a time of profound crisis. Secessionists, Conservatives, and imperialists forcefully rejected the republic as it had become.

Their breakaway projects were meant to restore principles and ethical structures they believed were essential to the commonwealth. The Confederate States of America, in the words of their brand-new vice president, Alexander H. Stephens, were the first nation “in the history of the world” to be founded upon a “great physical, philosophical and moral truth”: that of “negro” inferiority. In Mexico, Conservatives believed the true faith should be the foundation of the political edifice, and neither its principles nor its practices should be sacrificed to modernization.

These reactionary revolutionaries took it upon themselves to recast North America’s political landscape. Southern secessionists created a nation from scratch. In Mexico, the enemies of the 1857 constitution first tinkered with a military dictatorship they hoped would be able to reorganize a society that was falling apart. Defeated in war by the end of 1860, “cast down but not destroyed,” their fears of U.S. intervention conveniently mitigated by the stranglehold that the Civil War had placed on U.S. foreign policy, Conservatives swallowed their qualms about entangling foreign alliances and restored a monarchist regime, led by an Austrian archduke and sponsored by an invading foreign army.

In their efforts to begin the nation anew, North America’s conservative revolutionists knew they trod on dangerous ground. To conceal the untested nature of their political formulas, they stressed continuities with the past — real or idealized — and transcendent principle. The Confederacy presented itself as the true heir of the Spirit of 1776, which had been perverted by fanatical Northerners. Its seal was graced by George Washington riding horseback over the Latin inscription “Deo Vindice [God Our Defender].” Confederate claims to a glorious past were inevitably controversial. But even as Southerners rejected the “bare-faced and transparent fallacies” of the Declaration of Independence — the pretense that all men were created equal — they saw July 4 celebrations as an opportunity to “expose the North’s failure to live up to the Revolution’s ideals.” Their own insurgency was construed as both an echo of and an “improvement” on their ancestors’ feats.

Maximilian’s empire also tapped into Mexico’s nationalist traditions and historical imagination. Unlike the divided, distracted Conservative governments that preceded it, the empire put forth an ambitious, comprehensive conception of the past, not as the object of angry disputes but as a source of unity. It put together a conciliatory version of historia patria, in which the recent troubles were but an accident in a long and venerable history, in which everyone was a hero: it glorified the extraordinary scientific and artistic achievements of precontact Indigenous civilizations and exalted both the popular revolt that set off the War for Independence in 1810 and the arbitrations of the former royalist officer — and merciless foe of the first insurgents — who had brought it to an end in 1821.

General Tomás Mejía, Emperor Maximilian, General Miramón, by François Aubert, 1867. [The Metropolitan Museum of Art]

At mid-century, disunionists in the United States and opponents of Mexico’s 1857 constitution drew from different philosophical, religious, and legal traditions to deal with particular challenges. Both faced the Conservatives’ classic dilemma: a desire to appropriate modernity — “progress,” “civilization” — and the need to purge it of its most unseemly, disquieting features, on both the practical and ideological levels. These “retrogrades” feared the political principles and population dynamics that constituted them as minorities. They despaired at the arrogance of men intent on interfering with Providence and the “natural order of things.” They bridled against an expansive, universalist language that jumbled the relationship between words and things and turned the heads of the poor, the ignorant, and the dependent. In rebuilding the political edifice, they hoped to separate the wheat from the chaff and discipline modernity through constitutional and legal engineering.

Southerners equated their rebellion to that of the 13 colonies against British tyranny and to the birth of the nation in 1787–89. To sever their relationship with the Union, most seceding states relied on the same device they had used in the late 1780s to create a federal republic. Opting for flight rather than fighting over the character of state and nation with stubborn Northerners, secessionists upheld their legitimate claim to the founding — reloaded — and to the founders’ constitution — perfected because rightly interpreted.

In Mexico, those who disapproved of the 1857 constitution drew from a more eclectic but shallower constitutional arsenal. While secessionists were leaving at least part of their troubles behind as they left the Union, Mexican Conservatives and imperialists were convinced that to save the nation they had to wrest it from the Liberals’ hands. The Conservative military government (1858–60) pulled together a “provisional organic statute” to serve as fundamental law, but they dared not enact it. In 1863, a more ideologically diverse group of politicians aspired to a more profound transformation: it gave up on both constitutions and republican rule.

As they sorted out the characteristics of the new order, conservative advocates of revolution professed to be righting all sorts of wrongs, doing God’s work and fulfilling His will. They argued that they were also heeding the people’s mandate. Mexico’s devout Catholics and the vigorous white citizenry of the South surely condemned the disturbing designs of fanatical demagogues. On these issues, radical conservatives were, nevertheless, often as emphatic as they were unsure: on both sides of the border, “the people” had done things that belied their reasonable assumptions. In their enthusiasm for political meetings and electioneering that was far from sober, citizens had recently shown less than ideal judgment: in Mexico, many supported liberal radicalism with both votes and arms. In the United States, zealots had turned the heads of Northern voters and set the Union on the course of disaster, under the leadership of an untried, irresponsible political party. Even in the South, although voters had thoroughly rebuffed the Republican ticket, many had still backed moderate candidates who insisted on preserving the Union.

Still, North America’s reactionary revolutionaries insisted that they spoke for the people — or at least for their true interests and the common welfare. They feared that the people’s voice would prove them wrong. Wary of popular intervention, the Confederacy’s constitutional congress held its debates behind closed doors, as did the assembly that voted to restore a monarchical regime in Mexico City. Mexican authorities went further still: in order to prevent “bad passions” from “sowing seeds of discord among good Mexicans,” they suspended newspaper publication in the capital during the “most solemn” days during which delegates discussed the fate of the nation. Subsequently, the Confederate legislature frequently went into secret sessions; Mexico’s military and imperial governments abolished almost all representative bodies and regulated the press. Legislation exalted the freedom of public writers but forcibly curtailed their “abuses” if they probed public figures’ “private life” or endangered “morality” or the “public order.”

The architects of confederacy and empire wanted to ground the new order on both “truth” — eternal, immutable — and the people’s will, inevitably volatile and unreliable, unless one assumed that the people could never be wrong. Conservative revolutionaries were under no such delusion. Their skepticism notwithstanding, they could not step away from the essential fiction of modern politics. The voice of the people seemed at once essential and detrimental to good government.

Impress of the seal of the Confederate States, 1862, by P.A. Foley and Joseph Shepherd Wyon, 1864. [The Metropolitan Museum of Art]

In February 1861, “having dissolved their political connection” to the United States, seven newly independent republics sent some of their most proficient politicians — mostly large slaveholders — to Montgomery, Alabama, to give birth to a Confederate nation by endowing it with leadership and a fundamental law. The provisional Congress, stricken by what Emory M. Thomas described as a “mania for unanimity,” elected two experienced statesmen from the booming cotton states to preside over the new government. Both the president, Mississippi’s former senator Jefferson Davis, who had served the Union as secretary of war, and the vice president, Georgia congressman Alexander H. Stephens, had formerly steered a moderate course by Southern standards. In November, in line with the Confederates’ antiparty stance, Davis and Stephens ran for office unopposed.

At the ballot box, the people endorsed — having been presented with no other choice — the wisdom of their representatives at the constitutional convention. Delegates to Montgomery hammered out, mostly behind closed doors, a provisional and then a permanent constitution. A radical minority considered fundamental law to be useless. Like some Mexican Conservatives, they found constitutions “not worth the paper they were written on.” “God and nature” made states; the Confederacy needed little more than a simple, provisional treaty of alliance to bind them together. This small and cantankerous group, however, did most of its criticizing from the sidelines and to little effect. Most Southern politicians considered a constitution essential. State representatives, then, faced a momentous, dangerous challenge.

In the words of Georgia’s Benjamin Hill, the “formation of a new constitution” had to serve as “a very powerful agency for good,” even as uncertainty and division proliferated. Some “anticipated a more radical democracy,” “a fearful anarchy,” “an aristocracy,” “a slave-trade oligarchy,” and “even a limited monarchy.” Yet framing the new nation’s constitution took less than a month: congressmen reviewed and selectively amended, line by line, the fundamental law whose rule the states had just thrown off, as they insisted on the defense of its spirit. They introduced only “necessary and proper” changes to the 1787 document. The Confederacy’s fundamental law was unanimously adopted on March 11 and then rapidly ratified. Harper’s Weekly hailed the reformed constitution, and asserted that its “principal alterations … would receive hearty support” from most people in the North.

 

The most consequential changes to the 1787 document did not entail renovating a constitutional tradition but dealing, concretely, with the issues that had driven polarization and secession. Ambivalent silences on slavery were swept away: the Confederate fundamental law asserted that “no slave” could be discharged from service or labor and that “the institution of negro slavery” was protected by Congress in all of the Confederacy, including the territories it might expand into. As one of Alabama’s representatives contended, “no euphony,” dangerously open to interpretation, hid the real name of things: by calling “our negroes slaves,” the constitution “recognized and protected them as persons and our rights to them as property.”

These essential innovations were, nonetheless, not pushed to their last consequences. The document did not normalize the presence of the enslaved in the republic, even in terms of how they should be counted for the purpose of apportionment. It preserved the three-fifths clause — a figure which outsiders found “most perplexing.” Politicians who had complained about it for years did not insist on counting the enslaved as inhabitants in order to bolster the political influence of large slaveholders or states with large enslaved populations. There was, perhaps, a powerful political reason for keeping things as they were. Confederates emphatically insisted on the political equality of white men: counting, for purposes of political representation, the “moral chattel” deemed radically unfit for citizenship proved too uncomfortable.

In another gesture of strategic comity, dressed up as paternalistic concern for the South’s “four million improved, civilized, hardy and happy laborers,” the Confederate constitution banned the Atlantic slave trade.

This was meant, on the one hand, to reassure reluctant slave states of their place in the Confederacy, but it dangled over their heads the possibility of shutting down a profitable market if they refused to join the slaveholders’ bold nation-building experiment. On the other hand, as they prepared to launch the nation onto the world stage, the Confederacy’s architects felt it was best not to provoke Great Britain and its abolitionists.

It is in some ways surprising that, having taken the solemn and dreadful step of secession, the builders of a republic premised on slavery and white supremacy enacted no constitutional provisions to shore up its most compelling elements. A majority of delegates either tabled or voted against proposals to explicitly exclude from the franchise those who had African blood in their veins; to forbid Congress from passing any law “denying the right of property in negro slaves”; and to keep free states out of the Confederacy, as long as they recognized the legitimacy of property in man. The framers’ restraint stemmed, in part, from their expectation that free border states would join the CSA; their more fanciful hope of luring some Midwestern states into the would-be powerful agricultural empire; and a concern for the newborn nation’s international reputation. They trusted, perhaps, that occasions for disagreement would diminish significantly, since Confederate legislative debates would be purged of the Union’s most disturbing issues: protective tariffs and, intermittently but fatally (in 1820 and after 1846), the status of slavery in the territories.

Their discretion also spoke, perhaps, of peculiar conceptions of constitutionalism, federalism, and the right of property. According to their parameters, slavery was subject only to municipal regulation and property rights were something more than the product of convention upheld by law. Slavery, they contended, was a natural, “original institution”: slaves were property and their title of ownership was not the creature “of human law,” subject to debate, regulation, or restriction. Property — in man as in anything else — was beyond the purview of legislators, state or federal: it would never again become the “hobby of the political demagogue.”

Experience, convictions, and an entrenched political culture, as well as particular understandings of key concepts and principles, kept Confederate constitutional innovation to a minimum. Moreover, fear of politics seemed to weigh heavier on the minds of members of the constitutional convention than their concerns for state sovereignty, the preservation of slavery, or their exaltation of white citizenship. In their effort to restore the founders’ vision of a republic idealized as virtuous, harmonious, and honest, the framers took steps to curb the factiousness and corruption which, in their opinion, had so marred Union politics. Their constitution sought to temper and elevate politics and limit their pollution by money and selfish interest. Only “the common defense” and the costs of government were worthy objects of public funding; the “general welfare” was rejected as a legitimate object of spending. To further check politicking, lobbying, and pork-barrel legislation, Congress was designed to be smaller and less representative: each congressman would stand for 50,000 inhabitants, instead of the 30,000 established in the U.S. constitution.

Consequently, the Confederate government would grant no bounties or impose protective tariffs to “foster any branch of industry.” Despite Southerners’ interest in free trade, the Confederacy would not fund infrastructure: any “internal improvement intended to facilitate commerce” would have to be paid for by the states, unless it was required for the river and coastal navigation vital to the cotton economy. The Post Office Department had to be “paid out of its own revenues.” To prevent the “extravagance and corruption” of congressional logrolling, the constitution required a two-thirds majority in both houses to authorize appropriations not requested by department heads. It also granted the president a line-item veto on budget provisions.

In the face of the excitable, garrulous, perhaps corrupt people’s representatives, the Confederate constitution set up the president as a stabilizing element. This more powerful chief executive saw his hand strengthened by a longer period in office: six years. His informal influence was, nevertheless, abridged. The constitution curbed the president’s — instrumental and much maligned — role as the great dispenser of political patronage. Although he could remove civil servants for bad behavior, unsatisfactory performance, redundancy, or obsolescence, he had to report and justify any dismissal to the Senate. Furthermore, amassing political capital would be useless, since the president could not be reelected.

For all the Confederates’ vehement condemnation of federal despotism, parties, job-seeking, and populist pandering, they too had to rely on a defense of federal supremacy, politicking, concessions, and compromise. As long as the sovereign people remained at the center of public discourse, elections proved contentious even in the absence of old party politics, and the power of government was mobilized to protect “class interests.” Even without the crushing pressures of war, then, Confederate design failed to do what it set out to do: secure states’ rights by assertively reining in federal authority, defuse majority rule if it threatened minority rights, and protect, in James L. Huston’s words, a “legal definition that turned people into property.” Ironically, the Confederate constitution secured slavery by both naming it and setting it beyond the legislator’s reach. The requirements of state-building and republican principles drew the limits of constitutional innovation.

From Torn Asunder: Republican Crises and Civil Wars in the United States and Mexico, 1848–1867 by Erika Pani. Copyright © 2025 by The University of North Carolina Press. Used by permission of The University of North Carolina Press.

Freedoms Lost in Translation

Standing before Spanish officials in Havana in 1668, a woman of African descent from Brazil named Madalena explained how she and her daughter arrived on the island of Cuba in a stolen fishing vessel along with eleven other people. The group, she recounted, had escaped captivity in Port Royal, Jamaica, with the aid of a Cuban fisherman whom they had liberated from the island’s prison. Among the captives, Madalena had spent the most time in unfreedom. In 1658, she had been taken captive off a boat in the Bay of All Saints in Brazil by French traffickers who transported her to the island of Tortuga, where they sold her into slavery. She spent two years enslaved on Tortuga, during which time she gave birth to her daughter, before she was sold again to a French merchant residing in Port Royal. Madalena lived and labored in the bustling English port city for the next seven years before her escape. Now, a decade later, Madalena defended her free status before Spanish officials, declaring that she “was born free and, as such, still is.” Crucially, this meant that her daughter inherited her mother’s free status, regardless of the circumstances surrounding her birth. Captivity by French traffickers and the experience of being commodified twice in less than ten years, as she argued, had not made slaves of Madalena or her daughter.

At least one man in the room had reason to disagree with Madalena’s articulation of her status. Listening to her testimony was Nicolás Castellón y Sánchez Pereira, the resident judge who represented Domingo Grillo and Ambrosio Lomelín in issues surrounding the newly created monopoly on the slave trade to Spanish America. Officials in Spain had responded to the increasingly chaotic nature of the transatlantic and intra-Caribbean slave trades in 1662 by issuing a new monopoly slave-trading contract, called an asiento, to a pair of merchants from Genoa: Grillo and Lomelín. The new asiento for the slave trade to Spanish America differed in important ways from previous contracts. Rather than conduct the transatlantic slave trade themselves, Grillo and Lomelín negotiated for permission to purchase enslaved Africans at English, French, or Dutch islands in the Caribbean and transport those enslaved peoples to specific Spanish American port cities. 

Castellón claimed legal ownership over the Afro-descended captives who arrived from neighboring Jamaica on behalf of the new asiento holders. To Castellón, the Cuban fisherman who piloted the stolen boat was undoubtedly attempting to smuggle the Afro-descended passengers into slavery rather than free them from captivity. And, he argued, by virtue of the asiento contract, Grillo and Lomelín had the right to “all of the Blacks that are introduced [in Cuba] and the value of them,” including captives trafficked by smugglers.

Madalena’s claims to freedom also fell on deaf ears among the other men in the room, including those who argued that she, her daughter, and her ten companions should be granted their freedom — an argument that assumed they had become slaves in the course of being taken captive in the Americas and trafficked by northern Europeans. Rather than acknowledge her free status prior to captivity, Governor Francisco Oregón y Gascón argued that Madalena should be granted her freedom because she escaped from enemy heretics and sought to live among Catholics in Spanish territory. In making this argument, Oregón articulated a new Spanish Caribbean policy intended to weaken northern European colonies in the region by encouraging individuals enslaved by Protestants to escape. In the face of those officials, Madalena continued to articulate her free status, rejecting outright the logic that captivity and trafficking had ever made her a slave.

Madalena’s testimony in 1668, and the competing interpretations of her status, lay bare the racialized vulnerability experienced by captives of African descent who were trafficked across imperial jurisdictions. Madalena’s removal from the community of her birth — where family, neighbors, and community members would have attested to her free status — led officials in Havana to treat her like a slave. Nor was Madalena’s case unique. 

Territorial contestation in the form of the 1655 English invasion of Jamaica exposed different notions of race and status between the Spanish and English, while changes to the nature of the transatlantic slave trade to Spanish America meant that people who escaped captivity, like Madalena, became vulnerable to racialized interpretations of their legal status. The confluence of warfare and new slave-trading methods, in other words, had important consequences for free people of African descent trafficked away from the communities where their free status was recognized.

An analysis of the experiences of people taken captive and trafficked away from communities of belonging intersects with a growing body of scholarship on the movement of people of African descent across imperial borders in the 17th and 18th centuries. Historians have shown, for example, that self-liberated Africans and their descendants drove the creation of a sanctuary policy in Spanish America in which enslaved people who escaped from English, French, and Dutch colonies and sought baptism in the Catholic church were granted their freedom. By the late 17th century, the Spanish sanctuary policy provided important avenues for freedom for enslaved people who could escape and travel to Spanish territory.

Some scholars have interpreted the experience of Madalena and her companions through the lens of this sanctuary policy, arguing that everyone in the group save one received their freedom. The experiences of the escapees from Jamaica, however, were more complicated and their eventual fate more ambiguous because of the profound changes to the intra-Caribbean and transatlantic slave trades during the 1660s. Religion and vassalage, in other words, were not the only factors that influenced what happened to individuals of African descent who sought sanctuary in the second half of the 17th century. Rather, people of African descent encountered a Spanish Atlantic profoundly shaped by the unprecedented authority that the Spanish Crown granted to representative factors of the Grillo and Lomelín asiento. The formalization of the slave trade under the new asiento, as scholars Tatiana Seijas and Alejandro García-Montón have argued, “Africanized” slavery in Spanish America as other forms of licensed trade in bound labor were subsumed by the Grillo and Lomelín monopoly. These wider changes meant that the statuses of Madalena and her companions were contested by representatives of the asiento in Cuba who viewed the group not as potential converts or vassals but as Blacks and slaves.

Territorial contestation between the English and the Spanish and subsequent changes to the intra-Caribbean slave trade unsettled many of the recognized social and corporate statuses that undergirded the position of Africans and their descendants in Spanish Caribbean society. This was particularly true for individuals who moved between Spanish and English spheres by coercion or by choice.

 

In 1661, the French merchant who held Madalena and her daughter, Dominga, captive in Tortuga sold the mother and daughter to James Martine in Port Royal. Martine’s purchase of Madalena and Dominga from another man who claimed ownership over them created a documentary record of their enslavement and covered the illicit provenance of the two captives; should officials in Port Royal ask, Madalena and Dominga were already enslaved when Martine purchased them. It is likely, however, that no one asked about the provenance of the young mother and her child. The English town that Madalena and her daughter resided in for the next seven years was in the process of a rapid transition from a line of warehouses with a poorly built fort to a well-defended and thriving commercial hub. When Madalena arrived in 1661, Port Royal had just under 700 free residents and around 50 enslaved people. A decade later the population had grown to 1,669 people who held 312 individuals as slaves in the town.

Port Royal quickly emerged as a bustling economic and maritime center for English Jamaica. The brick homes and patios built by Port Royal’s mercantile community might have been disorienting for Spanish American captives, and the town’s streets, which were designed around a triangular shape, beguiling for travelers accustomed to Spanish America’s gridded urban centers. Unlike Spanish Caribbean towns, which tended to be built inland to secure against foreigners attacking by sea, Port Royal’s orientation was thoroughly maritime. Even fresh water had to be rowed across the harbor from Passage Fort near the Rio Cobre and stored in cisterns in order to maintain the population of Port Royal. Captives in Port Royal labored both for English Jamaica’s maritime trades and for English administrators in building fortifications to defend against a feared Spanish or Dutch invasion. Port Royal’s growth during the 1660s relied heavily on the traffickers who brought specie, trade goods, and captives to the town.

When a Cuban fisherman named Simon Rodriguez contracted a fever while imprisoned in Port Royal, his English captors called upon Ignacio Hernández — one of the eleven people who later escaped with Madalena — to nurse him back to health. The Cuban fisherman had been taken captive when he sailed from Havana in 1666 for a tangle of sandy islands and reefs off Florida’s southern coast. Following the seasonal migrations of sea turtles to the Bahamas, Rodriguez joined a multiethnic assortment of sea workers who hunted the marine creatures to provision regional markets. But in the shallow waters off the coast of Abaco, an English crew took him captive and forced him to labor aboard their vessel, likely in hunting the same turtles Rodriguez sought before being taken captive himself. After a month, Rodriguez’s captors left Bahamian waters to sell their catch to the rapidly growing population of Jamaica.

No longer useful to them, though still a possible source of profit, Rodriguez was put up for sale by his captors in Jamaica “to work for the space of seven years in the countryside.” The intervention of Jamaica’s governor prevented Rodriguez from being sold as an indentured servant, but he remained a captive in the island’s jail, likely so that he could serve in the same kind of prisoner of war exchange that Breton had used to gain entrance to Portobelo. At the Cuban fisherman’s bedside, Ignacio recognized the utility of an experienced mariner for escaping the English island. In Rodriguez’s testimony about their escape, he referred to the other ten individuals as “companions” of Ignacio’s, indicating that it was Ignacio who maintained social ties with the other Spanish American captives and that the idea of escape by sea might have originated with the initial encounter between Simon and Ignacio.

For the Afro-descended captives, deciding to escape Jamaica with Simon Rodriguez meant weighing the risk of being caught alongside their particular vulnerability in a Caribbean context where their racial backgrounds made them a potential source of profit. Could they trust the Cuban fisherman to pilot them to freedom and not to slavery elsewhere? At least one member of the group had reason to fear being betrayed. According to Simon Rodriguez’s testimony, Ignacio made the Cuban fisherman promise not to reveal their plans to escape and told him that the rest of his companions were “people in whom he could trust.”

On the night of their escape, each member of the group made their way to the beach, where they seized a boat and rowed away from Port Royal under the cover of darkness. True to his word, Simon Rodriguez piloted the stolen boat over four hundred nautical miles between Port Royal, Jamaica, and Puerto de Batabano on the southwestern coast of Cuba. Rather than delivering freedom, the arrival of a Cuban fisherman with eleven people of African descent raised questions, especially from the island’s representative of the Grillo and Lomelín asiento, an elite Cuban named Nicolás Castellón y Sánchez Pereira. With a shadow of suspicion over them, Madalena and her companions were marched to Havana, where they were confined by Castellón. Their situation was made worse by the fact that their escape to Cuba coincided with a deadly epidemic on the island, during which official business ceased. The disease environment of the port city meant that Ignacio and his companions languished for a year under Castellón’s authority before the governor of Cuba, Francisco Oregón y Gascón, received a petition from members of the group and called for their interrogations in October 1668.

The petition, penned in the same notarial hand as the subsequent testimonies and signed by three of the eleven captives, contested the claims of the island’s asiento representative. The three signers — Ignacio Hernández, Gregorio Rodriguez, and Leonisio Rodriguez — explained that they had been held captive by the English for many years but, “as loyal vassals” of the Spanish Crown, they “risked their lives to escape tyranny and servitude among English heretics.” In contrast to the claims of Castellón, they argued that they were “free as any children of Adam,” and that it was simply circumstance that brought them before the governor of Havana to plead for a recognition of their freedom. Forcefully rejecting Castellón’s claim that they were enslaved Africans smuggled into Cuba by Simon Rodriguez, the petitioners demanded that the governor adjudicate their case.

Governor Oregón responded by ordering the captives to explain the situation. For Oregón, Ignacio and his companions deserved freedom because they were “negros” who arrived in Cuba seeking to live among Catholics. Castellón and Oregón presented different arguments, but the fundamental assumptions remained the same. The growth of the intra-­Caribbean slave trade and the expansive jurisdiction of the Grillo and Lomelín asiento tethered the status of people of African descent to slavery in moments of maritime mobility. 

Excerpted from The Predatory Sea: Human Trafficking and Captivity in the Seventeenth-Century Caribbean by Casey Schmitt. Copyright © 2025 by the University of Pennsylvania Press. Reprinted by permission of the University of Pennsylvania Press.

A. Philip Randolph Lambasts the Old Crowd

“The New Negro” is one of the oldest, longest-serving, and most fascinating concepts in the history of African American culture. Expansive and elastic, capable of morphing and absorbing new content as circumstances demand, contested and fraught, it assumes an astonishingly broad array of ideological guises, some diametrically opposed to others. The New Negro functions as a trope of declaration, proclamation, conjuration, and desperation, a figure of speech reflecting deep anguish and despair, a cry of the disheartened for salvation, for renewal, for equal rights.

While most of us first encounter the “New Negro” as the title of the seminal anthology that Alain Locke published in 1925, at the height of the Harlem Renaissance, we find the origin of the term — so closely associated by scholars and students alike with the multihued cacophony of the Jazz Age — actually goes back to 1887, four years after the U.S. Supreme Court had voided the liberal and forward-looking Civil Rights Act of 1875, thereby declaring with ultimate finality the end of Reconstruction. This was a time of great despair in the African American community, especially among the elite, educated, middle and upper middle classes. How to fight back? How to regain the race’s footing on the path to full and equal citizenship? This is how and when the “New Negro” was born, in an attempt to find a way around the mountain of racist stereotypes being drawn upon to justify the deprivation of Black civil rights, the disenfranchisement of Black men, and the formalization of Jim Crow segregation, all leading to the onset of a period of “second-class citizenship” that would last for many decades to come — far longer than any of the first New Negroes could have imagined.

One member of the New Negro coalition was A. Philip Randolph (1889-1979), a Black socialist and labor organizer hailed by Martin Luther King Jr. as “truly the Dean of Negro Leaders.” Immediately attracted to socialism as the best means of addressing the systemic exploitation of Black workers, he joined the Socialist Party with Columbia University student Chandler Owen. The two started giving soapbox speeches and founded the socialist Messenger magazine in 1917, which they proclaimed offered readers “the only magazine of scientific radicalism in the world published by Negroes!” Even before founding the Brotherhood of Sleeping Car Porters and Maids in 1925 and making the Messenger the union’s official organ, Randolph used the idea of the New Negro repeatedly as a call to action. Indeed, the Messenger dismissed the previous era’s New Negroes as old and unable to address the crisis of the Red Summer.

In the September 1919 issue, the Messenger included a half-page satirical political cartoon, “Following the Advice of the ‘Old Crowd’ Negroes,” that featured Du Bois, Robert Russa Moton (the successor to Booker T. Washington at the Tuskegee Institute after Washington died in 1915), and Emmett Jay Scott, secretary to Moton and a close adviser of Washington. On the left side of the cartoon a white man in military uniform leads a jeering, torch-carrying mob and wields a club to attack an already bloodied Black person who struggles to rise from the ground. Another bloodied Black victim sits propped up at the base of the Statue of Liberty. On the night of July 18, 1919, in Washington, DC, over one hundred white servicemen, a “mob in uniform,” wielded pipes, clubs, rocks in handkerchiefs, and pistols, attacking Black people they saw on the street. The Black historian Carter G. Woodson, in fact, fled the mob on foot.

In the cartoon, three august “Old Negroes” propose accommodationist responses to the violence. A seated Du Bois implores, “close ranks let us forget our grievances,” a reference to his famous Crisis editorial the previous year urging Black readers to support World War I. Beside him, with hands clasped, stands Moton, who urges, “be modest and unassuming!” Scott, reaching back to Moton, says, “when they smite thee on one cheek—turn the other.”

A cartoon from The Messenger, 1919. 

“The ‘New Crowd Negro’ Making America Safe for Himself” features a clearly younger New Negro veteran in a speeding roadster — labeled “the new negro,” equipped with guns firing in the front and sides, and displaying a banner commemorating infamous 1919 sites of race riots: “Longview, Texas, Washington, D.C., Chicago, ILL.—?” As he fires at the fleeing white mob, a fallen member of which is in uniform, he declares, “since the government won’t stop mob violence ill take a hand.” In the clouds of smoke appears the caption “giving the ‘hun’ a dose of his own medicine.” Above, the editors quote Woodrow Wilson’s April 1918 Great War rallying cry against Germany: “force, force to the utmost—force without stint or limit!” Clearly, socialism, for Randolph, offered New Negroes the organizational fighting power Black people needed to fend off the most symbolically treacherous of all white mob attacks — those by U.S. military servicemen in uniform.

A cartoon from The Messenger, 1919.

In “Who’s Who: A New Crowd—A New Negro,” published in the May-June 1919 issue of the Messenger, in the wake of the Great War, Randolph urges Black socialists to join forces with white radicals and labor organizers to usher in a new era of social justice. 

—Henry Louis Gates, Jr. and Martha H. Patterson

A. Philip Randolph, from The Messenger, 1917. [Wikimedia Commons]

Throughout the world among all peoples and classes, the clock of social progress is striking the high noon of the Old Crowd. And why?

The reason lies in the inability of the old crowd to adapt itself to the changed conditions, to recognize and accept the consequences of the sudden, rapid and violent social changes that are shaking the world. In wild desperation, consternation and despair, the proud scions of regal pomp and authority, the prophets and high priests of the old order, view the steady and menacing rise of the great working class. Yes, the Old Crowd is passing, and with it, its false, corrupt and wicked institutions of oppression and cruelty; its ancient prejudices and beliefs and its pious, hypocritical and venerated idols.

It’s all like a dream! In Russia, one-hundred and eighty million of peasants and workmen—disinherited, writhing under the ruthless heel of the Czar for over three hundred years, awoke and revolted and drove their hateful oppressors from power. Here a New Crowd arose—the Bolsheviki, and expropriated their expropriators. They fashioned and established a new social machinery, the Soviet—to express the growing class consciousness of teaming millions, disillusioned and disenchanted. They also chose new leaders—Lenin and Trotsky—to invent and adopt scientific methods of social control; to marshal, organize and direct the revolutionary forces in constructive channels to build a New Russia.

The “iron battalions of the proletariat” are shaking age-long and historic thrones of Europe. The Hohenzollerns of Europe no longer hold mastery over the destinies of the German people. The Kaiser, once proud, irresponsible and powerful; wielding his sceptre in the name of the “divine right of kings,” has fallen, his throne has crumbled and he now sulks in ignominy and shame—expelled from his native land, a man without a country. And Nietzsche, Treitschke, Bismarck, and Bernhardi, his philosophic mentors are scrapped, discredited, and discarded, while the shadow of Marx looms in the distance. The revolution in Germany is still unfinished. The Eberts and Scheidemanns rule for the nonce; but a New Crowd is rising. The hand of the Sparticans must raise a New Germany out of the ashes of the old.

Already, Karolyi of the old regime of Hungary, abdicates to Bela Kun, who wirelessed greetings to the Russian Federated Socialist Soviet Republic. Meanwhile the triple alliance consisting of the National Union of Railwaymen, the National Transport Workers’ Federation and the Miners’ Federation, threaten to paralyze England with a general strike. The imminence of industrial disaster hangs like a pall over the Lloyd George government. The shop stewards’ committee or the rank and file in the works, challenge the sincerity and methods of the old pure and simple union leaders. British labor would build a New England. The Sein Feiners are the New Crowd in Ireland fighting for self-determination. France and Italy, too, bid soon to pass from the control of scheming and intriguing diplomats into the hands of a New Crowd. Even Egypt, raped for decades prostrate under the juggernaut of financial imperialism, rises in revolution to expel a foreign foe.

And the natural question arises: What does it all mean to the Negro?

First it means that he, too, must scrap the Old Crowd. For not only is the Old Crowd useless, but like the vermiform appendix, it is decidedly injurious, it prevents all real progress.

Before it is possible for the Negro to prosecute successfully a formidable offense for justice and fair play, he must tear down his false leaders, just as the people of Europe are tearing down their false leaders. Of course, some of the Old Crowd mean well. But what matter is [it] though poison be administered to the sick intentionally or out of ignorance. The result is the same—death. And our indictment of the Old Crowd is that: it lacks the knowledge of methods for the attainment of ends which is desires to achieve. For instance the Old Crowd never counsels the Negro to organize and strike against low wages and long hours. It cannot see the advisability of the Negro, who is the most exploited of the American workers, supporting a workingman’s political party.

The Old Crowd enjoins the Negro to be conservative, when he has nothing to conserve. Neither his life nor his property receives the protection of the government which conscripts his life to “make the world safe for democracy.” The conservative in all lands are the wealthy and the ruling class. The Negro is in dire poverty and he is no part of the ruling class.

But the question naturally arises: who is the Old Crowd?

In the Negro schools and colleges the most typical reactionaries are Kelly Miller, Moton and William Pickens. In the press Du Bois, James Weldon Johnson, Fred R. Moore, T. Thomas Fortune, Roscoe Conkling Simmons and George Harris are compromising the case of the Negro. In politics Chas. W. Anderson, W. H. Lewis, Ralph Tyler, Emmet Scott, George E. Haynes, and the entire old line palliating, me-to-boss gang of Negro Republican politicians, are hopelessly ignorant and distressingly unwitting of their way.

In the church the old crowd still preaches that “the meek will inherit the earth,” “if the enemy strikes you on one side of the face, turn the other,” and “you may take all this world but give me Jesus.” “Dry Bones,” “The Three Hebrew Children in the Fiery Furnace” and “Jonah in the Belly of the Whale,” constitute the subjects of the Old Crowd, for black men and women who are overworked and under-paid, lynched, jim-crowed and disfranchised—a people who are yet languishing in the dungeons of ignorance and superstition. Such then is the Old Crowd. And this is not strange to the student of history, economics, and sociology.

A man will not oppose his benefactor. The Old Crowd of Negro leaders has been and is subsidized by the Old Crowd of White Americans—a group which viciously opposes every demand made by organized labor for an opportunity to live a better life. Now, if the Old Crowd of white people opposes every demand of white labor for economic justice, how can the Negro expect to get that which is denied the white working class? And it is well nigh that economic justice is at the basis of social and political equality.

For instance, there is no organization of national prominence which ostensibly is working in the interest of the Negro which is not dominated by the Old Crowd of white people. And they are controlled by the white people because they receive their funds—their revenue—from it. It is, of course, a matter of common knowledge that Du Bois does not determine the policy of the National Association for the Advancement of Colored People; nor does Kinckle Jones or George E. Haynes control the National Urban League. The organizations are not responsible to Negroes because Negroes do not maintain them.

This brings us to the question as to who shall assume the reins of leadership when the Old Crowd falls.

As among all other peoples, the New Crowd must be composed of young men who are educated, radical and fearless. Young Negro radicals must control the press, church, schools, politics and labor. The condition for joining the New Crowd are: equality, radicalism and sincerity. The New Crowd views with much expectancy the revolutions ushering in a New World. The New Crowd is uncompromising. Its tactics are not defensive, but offensive. It would not send notes after a Negro is lynched. It would not appeal to white leaders. It would appeal to the plain working people everywhere. The New Crowd sees that the war came, that the Negro fought, bled and died; that the war has ended, and he is not yet free.

The New Crowd would have no armistice with lynch-law; no truce with jim-crowism, and disfranchisement; no peace until the Negro receives complete social, economic and political justice. To this end the New Crowd would form an alliance with white radicals such as the I.W.W., the Socialists and the Non-Partisan League, to build a new society—a society of equals, without class, race, caste or religious distinctions.

Excerpted from The New Negro: A History in Documents, 1887-1937. Copyright © 2025 by Martha H. Patterson and Henry Louis Gates, Jr. Reprinted by permission of Princeton University Press.

A Powerful Influence on American Democracy

In May 1951, Kwame Nkrumah received an invitation to Lincoln University. The news that his alma mater had plans to confer upon him an honorary doctorate the very next month landed with total surprise. As Nkrumah wrote:

It was just over six years since I had left America and I could not believe that such an honour could be bestowed upon me in so short a space of time. I felt that I had not done enough to merit it and my first inclination was to decline it.

The Lincoln invitation had been the doing of Horace Mann Bond, the first Black man to lead the university, its president since 1949. Bond, a precocious African American student from Nashville, had graduated with honors from Lincoln in 1923 at nineteen and then earned advanced degrees from the University of Chicago. He had made his academic reputation with original research on the education of Blacks in the American South. In his first book, The Education of the Negro in the American Social Order, he questioned the use of IQ tests by the army to assess the intelligence of African American recruits.

This anticipated by decades a scholarly consensus that standardized tests were anything but culturally neutral. Subsequent work by Bond reappraised the history of the American Reconstruction Era and refuted an idea long held dear to champions of the myth of the “Lost Cause” and of the so-called Redemption, the period of resumed white supremacy across the South that followed Reconstruction: that profligacy caused by the entry of Blacks into government after the Civil War had driven the South into economic ruin.

In addition to being an original thinker, influential scholar, and part of what was still a very small cohort of academically trained Black historians in the United States, Bond was also a classic “race man.” This once-common term was used for African Americans who wore pride in their identity openly and believed that their social duty was to do whatever they could to advance the prospects of Black Americans as a group. In many of the black-and-white photographs of Bond from this era, there’s a hint of a scowl, and in that expression, I have often been tempted to read not just the flinty combativeness he was known for, but also smoldering resentment over the wages that racism in his society exacted from him and from Black people in general.

Although descended from enslaved great-grandparents, Bond was born into the Black middle class as the son of two college-going parents, a mother who became a schoolteacher and a father, a Congregational minister who preached throughout the South. As a boy, he was regaled with memories of Africa by his aunt Mamie, who had worked as a medical missionary on the continent. Then, as a young man, he had avidly read stories about Africa in the pages of Du Bois’s NAACP journal, The Crisis, which often emphasized the existence of kingdoms and accounts of African achievement. Du Bois wrote much of this content himself, beginning with the story of his first voyage to the continent, in 1923, when he visited Liberia, one of only two Black-ruled countries in the world at the time (although Haiti was then under American military occupation). Du Bois often lapsed into what one historian has called “a hyper-lyricism brought on by the sheer euphoria of having slipped the surly bonds of American racism.” “Africa is vegetation. It is the riotous, unbridled bursting life of leaf and limb,” Du Bois gushed in one typical column. It was also “sunlight in great gold globules,” and “soft, heavy-scented heat,” that produced a “divine, eternal languor.”

In 1949, Bond took the first of his own eventual ten trips to Africa, and it utterly reshaped his life. It wouldn’t be an exaggeration to say that it also powerfully altered the historical trajectory of Black people on both sides of the Atlantic for the next two decades. Bond’s interest in Nkrumah, and the bridge he helped build for him with African Americans, threw a precious lifeline to the emerging Gold Coast leader at a time when he had few other cards at his disposal. And it pointed to a possible future of deep and mutually strengthening ties between two parallel movements, one for civil rights in America, and the other for independence for Africa’s colonies. Both were in dire need of allies as the world entered the Cold War. Bond’s early trips to Africa placed him at the forefront of an ideologically diverse group of African American intellectuals and political activists that would swell dramatically throughout this period — all of them fired up with the idea that the liberation of Africa and the battle for full citizenship rights for Black Americans were so fundamentally linked that if they were to advance at all, they would have to proceed in tandem.

In its first phase, this group included African Americans who had become familiar to the broad public: the novelist Richard Wright, the diplomat Ralph Bunche, the nationally prominent labor leader and elder statesman A. Philip Randolph, and, just slightly later, a young Baptist minister named Martin Luther King Jr. Behind big names like these stood a panoply of others who also played crucial roles in building bonds between Black America and Africa but who mostly labored in relative anonymity. These included people such as William Alphaeus Hunton, a professor of English, and the historians Rayford Logan and William Leo Hansberry, all of whom taught at Howard University. The latter, an uncle of the playwright Lorraine Hansberry, had begun teaching African history at Howard in 1922. Four years later, with the appointment of Mordecai Wyatt Johnson, Howard got its first Black president, but it wasn’t until two decades after that, in 1954, at Hansberry’s initiative, that the university introduced the nation’s first African Studies curriculum.

The Second World War and its aftermath saw a recentering of pan-Africanist energy in Africa itself.

By following Lincoln’s example and educating more and more students from Africa and the Caribbean, Black colleges and universities in the United States became a catalyst for this recentering, spurring the development of a global Black consciousness movement. Not only did thinkers from different continents come together on these campuses, but with a critical mass came much more militancy. Here, although Lincoln had been the undeniable pathbreaker, it was Howard University that, starting even before the Second World War had ended, surged ahead to become the most important locus of ideas and activism linking Blacks from Africa and the diaspora in profound new ways.

Nnamdi Azikiwe of Nigeria has been called a “student zero” of African nationalism on American campuses for the way he had helped recruit African students, including Nkrumah, to historically Black colleges in the United States. Although Azikiwe eventually graduated from Lincoln, he had transferred there from Howard, where he had been unable to pay the bills for his studies. It was at Howard, he later wrote, that “the idea of a new Negro evolved into the crusade for a new Africa.” This resulted from the intense stimulation he experienced on a campus that had been assembling a deepening bench of intellectual stars since Alain Locke, a Rhodes Scholar, was hired in the 1920s. In Azikiwe’s case, it came from studying there under people like Leo Hansberry and Ralph Bunche.

At Howard, and wherever else a critical mass of students from Africa and the Black diaspora outside of the United States gathered, something else important began to occur: a sharing of experiences of exploitation and suffering under imperial rule. This also juiced campus progressivism. Learning from each other bred a bolder self-confidence, and as it did so, colonized and recently emancipated peoples began to lose whatever lingering patience they had with the temporizing of Western nations based on the supposed need for tutelage and gradual preparation for the responsibilities of self-government.

 

From the moment of his appointment as the first Black president of Lincoln University in 1945, Bond faced persistent pressure from trustees and others to change the school’s vocation. For decades, its official mission had been “the education of Colored youth.” Bond acceded to the removal of that phrase from Lincoln’s charter, but he pushed back against demands that the university actively recruit white students in order to significantly dilute its Black student body. These calls became even more insistent in the early 1950s when desegregation cases were working their way through the federal courts, making it seem increasingly likely that racial separation in American schools was doomed to fade.

True race man that he was, Bond was furious over the board’s pressure and responded defiantly. At most northern colleges and universities, Black students and faculty still numbered few to none. Lincoln, by contrast, had long welcomed white students and even recruited small numbers of them from nearby communities. “Having done this have we not done enough?” Bond asked. “Our self-respect will not permit us to do more.” In 1949, the Lincoln alumnus Thurgood Marshall, then legal counsel of the NAACP, gave a speech on campus in favor of integrating his alma mater. But Bond, who had personally led the desegregation of local schools in the community surrounding Lincoln by suing to force them to accept Black students, pushed back. According to a biographer, he criticized Marshall and the NAACP for praising white colleges that had two or three Black undergraduates while maintaining all-white boards and faculties. “Let those white colleges with token Black students hire Black faculty and choose Black board members; then they might merit being called interracial, as Lincoln did.”

Resentment over such double standards fueled Bond’s determination to intensify his school’s relations with Africa, both in terms of supporting applicants from the continent, as it had long done, and through a new kind of personal diplomacy toward Africa. Through Bond, the politics of these two issues — integration at home and the pull of Africa abroad — seemingly unrelated on the surface, would become increasingly and explicitly joined. As they did so, they set him at odds with Lincoln’s board and ultimately contributed to his firing in 1957, ironically the year that Nkrumah led Ghana to independence.

 

Bond’s first visit to Africa in 1949 was on a trip partially paid for by a Lincoln alumnus from Nigeria. His first inkling of what Africa could mean for Lincoln and what Lincoln could mean for the continent had likely occurred two years earlier. That was when Nnamdi Azikiwe had returned to the campus to receive an honorary degree. Around that time, Bond began to argue that his university’s longstanding connections to the continent constituted a major competitive advantage that Lincoln had done little to exploit. Africa was clearly moving into a new age of eventual independence, and with alumni like Azikiwe and Nkrumah, the school had a special role to play. Bond even wrote that these two had “learned Democracy — with a capital D” at Lincoln, where they were made “good Americans — with an immense admiration for American inventiveness, enterprise and industry.”

By the time of his 1949 tour of West Africa, Bond’s thinking had evolved from vague and boosterish notions about the public relations gains to be won by Lincoln to a political vision about synergies to be developed between currents of Black nationalism on opposing sides of the Atlantic. Writing from Africa to the editor of the Baltimore Afro-American, then a leading Black newspaper, Bond affirmed: “Here is Black nationalism — the more astonishing to an American because of the low esteem in which the African American is held. But the American Negro enjoys that same tremendous prestige here that America does.” This was the germ of a robust and sophisticated later argument that the exercise of sovereignty and self-rule by new African leaders could serve as powerful sources of pride and inspiration for African Americans, while also helping to undermine the worst sorts of racist stereotypes held by whites against them.

“The key point for realizing the aspirations of the American Negro, lie[s] in Africa, and not in the United States,” Bond remarked in a “Letter from Africa” column dated October 17, 1949. “It is the African who, I think, will dissipate forever the theories of racial inferiority that now prejudice the position of the American Negro.” Of all the colonies in sub-Saharan Africa, the Gold Coast seemed closest to achieving independence from a European power peacefully. Bond became one of the first African American thinkers to seize on its importance as a lodestar for African American liberation as well. If the Gold Coast, soon Ghana, could bring to vivid life images of Black people successfully conducting their affairs in a reasoned and orderly manner, he believed, it would deliver a serious blow to white supremacy everywhere.

The acerbic, chip-on-his-shoulder Bond may have been among the first to think this way, but he was by no means alone. Indeed, one of the most remarkable things about this forgotten epiphanic moment is how widespread such thinking became across the African American political spectrum. By the standards of the early Cold War, Bond, stripped of his pan-Africanism, was a run-of-the-mill, pro-business, anticommunist figure. Thoughts like his about the importance of Ghana’s example to African Americans found their neat echo, though, in 1950 in the words of Alphaeus Hunton. This Harvard-educated grandson of Virginia slaves, Howard University English professor, and Communist Party member became a leader of a pioneering anti-imperialist group called the Council on African Affairs (CAA). The CAA’s members were fiercely hounded by the McCarthy era’s hysterically anticommunist House Un-American Activities Committee. In 1951, Hunton was imprisoned for his refusal to testify before the committee. He emigrated to Africa in 1960, first to Ahmed Sékou Touré’s Guinea, then to Nkrumah’s Ghana, and finally to Zambia, where he died of cancer in 1970. In one letter, he wrote:

It is not a matter of helping the African people achieve freedom simply out of a spirit of humanitarian concern for their welfare. It is a matter of helping the African people because in doing this we further the possibility of their being able to help us in our struggles here in the United States. Can you not envision what a powerful influence a free West Indies or a free West Africa would be upon American democracy?

Bond’s writings and conversations from this time reveal still more complexity about the ways in which racial identity questions for Black Americans were evolving in relation to a changing Africa. From that first trip to the continent, at a time when “Negro” or “colored” were the standard appellations for Blacks, Bond had already begun to anticipate the shift, still at least a quarter century away, toward the term African American. “Sincerely — (and with a great new pride that I am an American of African descent…)” he wrote at the close of one letter.

Excerpted from The Second Emancipation: Nkrumah, Pan-Africanism, and Global Blackness at High Tide by Howard W. French. Copyright © 2025 by Howard W. French. Used with permission of the publisher, Liveright Publishing Corporation, a division of W.W. Norton & Company, Inc. All rights reserved.

Life in the Firestorm

The fallen brick sat at the edge of an abandoned lot, staring up at Roberto Ramirez like a question mark. Ramirez, a sixth grader in the Bronx, had been instructed by his art teacher to search for “found objects,” and his eyes gravitated toward this small chunk of a crumbling tenement. The assignment was to envision the objects “as something else,” so Ramirez pictured the brick as a tiny building that was still standing. He took a paintbrush to its rough exterior, and after carefully outlining the building’s matchbox windows, he filled their frames with fire. Not all of them, though — only the windows on the upper floors. He knew that in the Bronx, fires started at the top.

Ramirez’s technique quickly caught on among the other kids in his class, and before long the eleven- and twelve-year-olds had produced a series of fifty flaming miniatures. It was 1982, and they drew what they knew: life in a firestorm. Another student, John Mendoza, and his family had been burned out of their apartment three times by arsonists, an experience as routine as it was calamitous. Yet what the students didn’t know, beyond the rumors children sometimes absorb, was why fire was so prevalent in their neighborhood. For an explanation, they looked to their new art teacher, Tim Rollins, a white conceptual artist with ties to the downtown art scene. As an outsider in the Bronx, Rollins had no satisfying answers, so he decided on a field trip. “We go down to the Fire Department,” he recalled, and “the firemen see these ten crazy kids and me come stomping in and asking, ‘Why are there so many fires?’ ” The firefighters offered only vague replies. The students left the station dejected, but in the mystery Rollins spotted a teaching tool. He asked the sixth graders to inscribe an explanation on the bricks, and marveled, “We got 70 different reasons.” Ramirez, for one, blamed tenants who were behind on rent: he wrote “rent late” on the building’s roof. His classmate claimed that “junkies burned the buildings down,” while another wrote, “no heat.”

The students were left with concrete bricks in lieu of concrete answers. What they were attempting to do was give the bricks a history. The project became known as the Bricks series, and it was the first in a decades-long, intergenerational collaboration called Kids of Survival (or K.O.S.), so named because “we were broke but not broken.” One of the Bricks now sits in the permanent collection of the Whitney Museum of American Art.

The question haunting the Bricks series to this day is why the students suggested “70 different reasons” for the conflagration that upended their lives and engulfed their neighborhoods. How could the toll from the fires have been so colossal and its source so opaque?

 

Seven years earlier, in April 1975, a different act of painting offered some insight. Smearing black pigment onto their hands and faces, landlord Imre Oberlander and his associate Yishai Webber prepared to torch one of the former’s six buildings in the South Bronx. The white incendiaries believed blackface would offer them cover, like a perverse kind of safety gear. At four a.m. on a Friday, the two men cruised down Southern Boulevard en route to the targeted building. They hoped the predawn darkness would provide further protection, but when they drove past a police car, their broken taillight caught the attention of the officers on patrol. Pulling them over on the wide thoroughfare, the policemen saw two Hasidic men from Williamsburg made up in blackface, one wearing a wig, and proceeded to search the car. They found two “firebombs” — crude incendiary devices made out of gasoline, gunpowder, and a timing device.

Oberlander became one of the first landlords charged in connection with the decade’s “epidemic of arson,” as the New York Times had begun to call it. Though the Bronx, in particular, had been burning for years by this point, authorities remained so oblivious to the root causes that they initially suspected Oberlander and Webber of being spies en route to the Soviet diplomatic compound, ten miles away, at the opposite end of the borough. It is true that the firestorm involved vast conspiracies, transnational dealings, and a doctrine of containment, but it all had little to do with the Cold War. What drove the arson wave was profit. Oberlander had collected $125,000 (nearly $750,000 in 2024 dollars) in insurance payouts from twenty-one separate fires between 1970 and 1975. All his claims were “paid off without a murmur from the insurance company,” railed Bronx District Attorney Mario Merola, warning without hyperbole that this was just “a drop in the bucket.”

Between 1968 and the early 1980s, a wave of landlord arson coursed through cities across the United States, destroying large portions of neighborhoods home to poor communities of color. From Boston to Seattle, tens of thousands of housing units burned (this is a conservative estimate); the most affected neighborhoods lost up to 80 percent of their housing stock. Yet historians have largely neglected the burning of the nation’s cities, and popular memory has commonly confused the 1970s arson wave with the well-documented but far less destructive urban uprisings of the previous decade. The 1960s rebellions — most famously Watts in 1965, Newark and Detroit in 1967, and everywhere after the assassination of Martin Luther King Jr. in 1968 — were born of Black (and in some instances Puerto Rican) outrage over the persistence of white supremacy despite the tangible gains of the civil rights movement. In most cases set off by an incident of police violence, the rebellions represented a collective revolt against not just overpolicing but the daily persecution of Black communities in the form of unequal employment, housing, education, and more. Though these events were often deemed “senseless riots” devoid of a coherent politics, they were formidable and far-reaching — though fledgling — insurgencies. Historians typically describe this era as stretching from Birmingham in 1963 to the nationwide uproar following MLK’s murder in 1968, although important recent work has tracked the rebellions into the 1970s.

Whether measured in dollars or lives lost, the destruction caused by the uprisings of the 1960s pales in comparison to the arson wave of the 1970s. In 1967, the most violent year of the decade, the number of dead was counted in the tens and the insurable losses totaled $75 million. By contrast, at least five hundred people died of arson annually across the United States during the 1970s, and by 1980 the New York Times was estimating that arson caused $15 billion in total annual losses. Admittedly, these are crude and fraught barometers of historical significance. The rebellions had immense political implications on a national scale, and they justifiably loom large within the popular imagination. The 1970s blazes were perhaps too common, too consistent with existing iniquities, to draw the same kind of attention.

The latter decade was defined not by insurrection but by indemnification, though the two were connected, as we will see. The 1970s conflagrations bring into view the untold history of the racially stratified property insurance market, a key force in the making and remaking of American cities. Although fire usually requires only oxygen, heat, and fuel, the crucial ingredient during that decade was state-sponsored fire insurance, initiated by federal fiat in response to the 1960s uprisings. The reform effort was supposed to put an end to insurance redlining, which had left entire swaths of the American city uninsured or underinsured due to the race and class of their residents. Yet increased access to second-rate fire insurance, when paired with state cutbacks and ongoing mortgage redlining, incentivized landlord arson on a vast scale.

The Bronx lost approximately 20 percent of its total housing stock to fire or abandonment between 1970 and 1981 — around 100,000 units, nearly the equivalent of the number of housing units in today’s Richmond, Virginia, or Reno, Nevada. Destruction on this scale, unfathomable as it may be, should not be seen as evidence of the Bronx’s exceptionality. The arson wave hit cities across the country, in every region. Coast to coast, Black and Brown tenants were blamed for the fires. Yet the evidence is unequivocal: the hand that torched the Bronx and scores of other cities was that of a landlord impelled by the market and guided by the state.

That hand was also, in the case of Imre Oberlander, covered with dark pigment. Who was the audience for this four a.m. racial masquerade? Was it the building’s tenants, the block’s bystanders, the beat cops? Whomever they imagined as potential witnesses, Oberlander and Webber were performing a well-rehearsed script of Black and Brown incendiarism. The specter of the Black firesetter, in particular, is older than the United States itself. For the two white arsonists, the racist trope was something to exploit. The landlord and his accomplice believed it could deflect blame and prevent them from being identified. They applied blackface as though it, too, were a form of insurance.

Oberlander and Webber may have also seen blackface as a shield against a different bigotry — that of “Jewish lightning.” The stereotype of the arsonist Jew was a vestige of medieval anti-Semitism that was modernized by fire insurers in the mid-19th century, when underwriters at Aetna, the Hartford, and other notable firms warned against issuing policies to “Jew risks,” in part because of Jews’ supposed proclivity for fraud. The stereotypical arsonist, whether in its anti-Semitic or anti-Black variant, fulfilled a similar function: distracting from the larger power structures at work. In the 1970s, the Jewish slumlord became a potent symbol of Black exploitation, but in fact the redlining banks and insurance companies had, to different degrees, discriminated against both Black and Jewish communities.

The irony in Oberlander and Webber’s blackface gambit was that the two men ended up getting caught precisely because their performance of Blackness was both too convincing and too implausible. That is, their apparent Blackness may well have played a role in the police officers’ decision to pull them over, and their thinly veiled whiteness — upon closer inspection — almost definitely prompted the search of their car.

Few landlord arsonists actually made a habit of wearing blackface, because few had a need for it. The arson wave was made possible by financial masquerade — an array of insurance and real estate practices that obscured accountability and diffused risk — combined with official neglect and the presumed criminality of the Black and Brown tenants held culpable for the fires. Instead of blackface, landlords often chose more cunning disguises, such as hiring paid “torches,” usually neighborhood teenagers, to do the burning for them. But that was just the opening scene of a multi-act white-collar revue, one that featured Hollywood studios dishing out Bronxploitation films, journalists vilifying the supposed welfare arsonist, underwriters flooding cities with subpar coverage, insurance executives feigning impotence, real estate players attacking rent control, criminologists theorizing about broken windows, lawmakers gutting the fire service, and pundits yammering on about riots and pyromaniacs. All sang the same chorus, drowning out dissenting voices as well as the true origins of the arson wave. Blackface was not necessary when there was such a vibrant tradition of briefcase minstrelsy.

 

The torching of wide swaths of the American metropolis may strike some as a bizarre event in the distant past. Yet it is very much part of how our cities came to be. Long neglected by historians, the 1970s arson wave vividly reveals late-20th-century shifts in political economy that still shape our lives. Out of its embers was forged the metropolis we know today: one defined by volcanic real estate booms, economy-cratering busts, and an ongoing decline in housing stability. The world in which a solidly built home could generate more value by ruination than habitation is the same world in which homelessness, eviction, and foreclosure have become defining aspects of urban life.

The story of landlord arson is not a cautionary tale of capitalism gone awry, of a few bad apples, of uncaring policymakers, of government overreach, or of a grittier bygone era. To frame it as a singular, sensational episode of the past is to gloss over its continuities with — and its role in creating — the structures of the present. Warning against such “spectacularization of black pain,” Saidiya Hartman counsels that “shocking displays too easily obfuscate the more mundane and socially endurable forms of terror.” The arson wave renders visible much that is hidden in plain sight, historically and to this day.

Over the last 50 years, housing insecurity and real estate volatility have come to define our cities, and though there are many causes, none is more significant than financialization, which surged in the years after 1968. Financialization is the process by which an economy that was once organized around the making and trading of physical commodities becomes increasingly oriented around the profits from financial activity. The high finance we know from the nightly news and the silver screen is found on the fiftieth floor of a glass-encased skyscraper and in the cacophonous pits of a stock exchange, fueled by adrenaline, greed, and cocaine. The image we have is set a thousand feet in the air, its spoils and scandals a world apart, even if they eventually touch the rest of us. But this is not the only face of financialization, nor necessarily the one that sheds the most light on the crises of the present. The arson wave opens up a view of financialization from the ground up and far from the fray of Wall Street.

The stock image of the 1970s American city features an urban economy in decline. It is rarely acknowledged that there were profits to squeeze from the destruction of the metropolis, particularly in neighborhoods of color. The ascendance of the FIRE industries (finance, insurance, and real estate) on the heels of the civil rights movement created conditions primed for plunder, especially in cities suffering from the flight of white residents and well-paying jobs. “Instant liquidity,” as one arsonist for hire described it in his testimony before Congress, was the real estate equivalent of Wall Street’s liquidity preference: the priority placed on an asset’s ready convertibility into cash. What made buildings liquid was property insurance expansion, presented as a means of racial justice and redress. Which is to say that race underwrote the gains enjoyed by landlords.

For those looking to make a quick buck, the Bronx and other communities of color possessed a peculiar asset: the powerful alibi of racial pathology. The presumption of Black and Brown criminality blotted out the fact of dispossession so completely that, all these decades later, the vague impression that Bronxites burned down their own borough endures, while the vast fortunes made were forgotten. The peril of getting caught perpetrating fraud thus transferred to its victims, where it has long remained.

Excerpted from Born in Flames: The Business of Arson and the Remaking of the American City by Bench Ansfield. Copyright © 2025 by Bench Ansfield. Used with permission of the publisher, W. W. Norton & Company, Inc. All rights reserved.

Textiles as Historical Texts

In Weaving the Word, Kathryn Sullivan Kruger, a professor of English, examines the link between written texts and woven textiles. Kruger asserts that before stories were recorded through written text, cloth preserved and communicated these important social messages. Kruger argues for expanding the idea of literary history to include women’s role in transmitting traditions, stories, and myths via fabric. By including textiles in our study of literature and history, we will find many female authors. She also maintains that during times when weaving was analogous to storytelling, “women’s endeavors were equal to culture and were not considered beneath culture or marginal to it.” Cloth tells stories, records histories, and shapes culture in a synergistic interaction that makes it impossible to disentangle the effect of one on the other.

The Bayeux Tapestry, an 11th-century embroidered account of the Norman conquest of England in 1066 by William the Conqueror, is a clear example of textiles as historical texts. While the events of this epic battle are enshrined in woolen thread on linen, no one knows who stitched it. An 18th-century legend has it that Queen Matilda, William the Conqueror’s wife, carried out the embroidery with her ladies in waiting. While a romantic notion, this was certainly not the case. Most scholars believe that a group of Anglo-Saxon embroiderers stitched it near Canterbury, England. All the surviving evidence indicates that only women in early medieval England embroidered and that it was a highly regarded female occupation. However, there is no known convention of women embroidering on such a large scale — the tapestry is 70 meters long — or for such an important political purpose. This has led some to speculate that perhaps it was not the work of women, and Bayeux Tapestry Museum curator Antoine Verney has suggested that men could have been trained in embroidery to execute this important royal commission, potentially in Normandy, since the tapestry resided in the Bayeux Cathedral for centuries.

Textile archaeologist Alexandra Lester-Makin, an expert in early medieval embroidery and the Bayeux Tapestry, disputes this idea, noting that the needlework on the tapestry is highly skilled. She thinks it unlikely that it was the work of a team who had just recently learned to embroider. There is evidence of female embroidery workshops in England in the 11th century, indicating the likelihood that the tapestry was created by women. A skilled embroiderer would have organized and overseen the production process to maintain consistency and coordinate the many embroiderers working on the piece at the same time. Many women’s hands would have also been involved in spinning and weaving the linen and spinning and dyeing the wool embroidery thread. Notably, the style of embroidery used on the tapestry is meant to conserve thread, likely because the embroiderers, having spun wool themselves, knew firsthand how labor intensive the thread was to produce. The thread wraps around the back of the work only in short couching stitches, so the majority of the wool is laid down in long stitches on just the front surface of the work. This style of embroidery is a relatively quick way to fill in large spaces, much like painting, and evokes the brush strokes of illuminated manuscripts.

The women doing the embroidery work may have had some creative license over the messages that were communicated and immortalized in the tapestry. Some experts on the tapestry suggest that while the main story running horizontally across the center was dictated and likely sketched by men to record the details of the conquest, the borders were left to the discretion of the embroiderers, who included animals — often dragons, lions, and griffins — and scenes from Aesop’s fables alluding to ideals of medieval morality. Certain fables are embroidered more than once, like “The Fox and the Crow” and “The Wolf and the Crane.” They are drawn differently and appear to be embroidered by different hands, suggesting that each embroiderer was likely unaware that another had chosen to stitch the same image or scene. The fables in the borders can be read as commentary on the main action of the tapestry — perhaps a way for the Anglo-Saxons to tell their version of the story in the margins of the Norman tale.

Drawing of three of Aesop’s fables found on the Bayeux Tapestry, 1889. [Wikimedia Commons]

Lester-Makin said that as much as she would like to believe this was the case, she is not sure that women would have been given such freedom over the border content. However, that doesn’t mean that their experiences and perspectives were not included. “I think that even if they didn’t necessarily have free rein, there are still areas of expression that can be witnessed. This is a witness to what they have gone through or know that somebody went through … there are other ways … to read the tapestry and of seeing the embroiderers within it.” She called attention to a scene where an Anglo-Saxon woman is holding a child’s hand as Norman soldiers set fire to her home. “Whether that was chosen freely by the embroiderers or not, that is still a commentary and if you think of women embroidering that, and you never know what they may have witnessed or had done to them. That’s a harrowing scene.” Similarly, the borders show the bodies of dead Anglo-Saxon soldiers having their armor pulled off or being devoured by animals. “That kind of thing happened and … you can imagine someone stitching that and going, ‘oh my god, that happened to my brother, my cousin, my dad, my husband.’ ” Whether or not the women chose any of the tapestry’s content, they stitched it, and prior to that, they may have lived it. The tapestry is a testament to their experience preserved in a language they spoke.

Bayeux Tapestry scene. [Wikimedia Commons]

In our interview at the Bayeux Tapestry Museum, Verney stated that the genius of the tapestry was that it was the first known graphic representation of a current event in northern Europe, adding that had it not been captured in this object, the history might be lost today. He said that the technique of embroidering wool yarn on linen cloth was likely chosen because it was a relatively quick method and made it easy to share the story of the event on both sides of the English Channel to a largely illiterate public. It may have also served a political purpose. It was a way to integrate the Anglo-Saxon tradition of needlework into the story of the Norman conquest of England and assure the English that their traditions were valued and would be preserved under this new rule.

R. Howard Bloch, a scholar of medieval French literature, calls the embroidery of the Bayeux Tapestry “a powerful vehicle for cultural memory at a time when even the most powerful lords were illiterate.” Janet Catherine Berlo wrote in response to Bloch’s statement, “I position it as ‘a powerful vehicle for cultural memory’ of a different sort — a cultural memory for those of us who seek to understand the long history of the poetics of embroidery, and our places in it.” It is clear which history was thought valuable to preserve at the time — the content of the tapestry — and which was not — the process of its creation. Women looking to find their place in the “long history of the poetics of embroidery” often discover that it is a game of hide and seek. Even when the work remains, the hands that made it are so often invisible. Like so many stories of women throughout history, the creation story of the Bayeux Tapestry seems indelibly lost.

 

Eight hundred years after the original Bayeux Tapestry was finished, a group of women in Victorian England created a full-scale replica of it, now on display at Britain’s Reading Museum. The effort was spearheaded by Elizabeth Wardle, who in 1885 organized 39 members of the Leek Embroidery Society so that Britain could have its own copy of this important historic artifact. It took just one year for the women to re-create the entire tapestry, working from pictures that had been hand-colored by archivists at what is now the Victoria and Albert Museum in London. It seems that these women were working to find their own place in embroidery history, their effort grounded in the Victorian-era “medieval revival,” which spurred a renaissance of medieval art and architecture. Their focused effort reflects an interest in their British heritage, the tradition of English needlework, and a wish to meaningfully contribute to those legacies. Unlike the anonymous stitchers of the original tapestry, these women added their names below the sections they worked on, escaping the obscurity of their medieval counterparts. Their signatures show that some women worked alone for long stretches of the tapestry, while others worked closely together on a section. Seeing three women’s names running the length of a four-foot section, we can imagine them huddled together talking and stitching.

Another difference between the Victorian re-creation and the original tapestry reflects the cultural mores of the time. In the original tapestry, there are several naked men, and male horses are depicted with anatomical accuracy. The Leek embroiderers omitted these “racy” details, though through no fault of their own. The men working in the museum archives felt it was improper to send such images to a group of British ladies. They “cleaned up” the photos that the women then faithfully copied.

More recently, a community project on the island of Alderney, in the English Channel, took inspiration from the tapestry but had a different aim: to finish it. The last panel of the original work is famously missing, its story lost to time. Historically, what naturally follows the Battle of Hastings, where the tapestry currently ends, is the coronation of William the Conqueror as William I of England. Kate Russell, the librarian on Alderney, spearheaded the project and together with artist Pauline Black imagined the ending and created the plan for the tapestry in 2012. Four hundred and sixteen people ranging in age from 4 to 100 contributed stitches to the final piece. Along with a large contingent of Alderney islanders and notably King Charles III, then Prince of Wales, stitchers came from nearly every continent of the world. Russell told me not a day went by that there wasn’t at least one person stitching while the library was open and often several people working together: “During that entire year, there was never any rancor, tension, disagreement, squabbling or any other sort of discord. Lots of stitching; no bitching. I imagine it must have been similar for the original stitchers, too, though the trauma they were living through in that torn-up country that England had become must have meant an entirely different atmosphere.”

Fran Harvey, a local resident and principal stitcher, said: “England was never the same after the Norman invasion. And I don’t think Alderney, as a community, will ever be the same again after so many people came forward and put their stitches into this amazing work. It is a landmark in Alderney’s modern history, and I feel sure that everybody involved in it, just like us, is very proud … The Tapestry … is like a thread that runs between Normandy and Alderney. It is almost a thousand years long, and today it brings us closer together.” Russell was awarded a British Empire Medal by Queen Elizabeth II for services to history and culture. Now, as a tourist destination on Alderney, the tapestry illustrates the cultural heritage of the community and carries the legacy forward.

Today, Mia Hansson, a Swedish seamstress living in England, is working to single-handedly re-create the Bayeux Tapestry. While most Bayeux Tapestry projects reflect a connection to British and French culture, Hansson’s embroidery pieces are motivated by her connection to a culture of needlework — an answer to Berlo’s call “to understand the long history of the poetics of embroidery” and her place in it. She plans to finish her Bayeux Tapestry replica just in time for a major restoration of the original tapestry, which the French Ministry of Culture has scheduled to begin in 2028. The restoration effort has been led thus far by a team of seven female textile conservationists who have assessed the areas in need of repair. A one-thousand-year-old tapestry presents unique challenges. Because no one has worked on anything like this before, restorers will have to learn as they go. Hansson is helping to keep this object of cultural memory alive and in circulation even if the original can no longer be displayed for a time.

While stitching the tapestry, she was “forced to learn the history, almost against [her] will,” noting that history is the only subject she ever fell asleep in. But her real connection to the work is with the original stitchers and her grandmother, who, though deceased, is always looking over her shoulder to make sure the back side of the work is neat. She has come to know the original stitchers of the tapestry quite well through her close study and faithful re-creation of their work: “Although I often get frustrated with them and the way they chose to stitch, which I now have to replicate, I feel strangely protective over them. There were reasons why they did things in a certain way and I don’t always understand … I can complain and want to put my veto in, ask questions and want to suggest other ways of doing things, but … I want to give the women the benefit of the doubt.”

Hansson said she can feel the tensions between the embroiderers who worked closely together and likely for long hours with poor lighting, as though there are ghosts in the fabric. Their varying skill levels are clear from the stitching. Some appear less patient than others: “There are places where stitches overlap, where none of the women wanted to give in. In other places, there is a gap, where the women have failed to connect their work. Why? Was there an argument? Was it a simple oversight?” Unlike the harmonious working environment reported by the stitchers of the Alderney panel and the Victorian re-creation, Hansson imagines “the air being thick with emotion at times” during the stitching of the original tapestry.

Choosing to re-create the Bayeux Tapestry has connected Hansson to a community of people interested in the tapestry and given her a role in a broader cultural and historical conversation. She gives talks to schoolchildren, women’s groups, historical reenactors, and embroidery guilds. She has a designated dress for many of these talks; the material was handwoven by a friend, and she sewed the garment with her mother. She added a 17th-century pocket to wear on top of the dress, which she embroidered with images from the tapestry. During these talks, Hansson said, “I step into a role and kind of become part of the tapestry. I live and breathe it with every ounce of my body and soul. It’s quite magical.” She jokes that her gravestone will read, “The woman who became the Bayeux Tapestry,” as though she herself had become a carrier of cultural memory, an embodiment of the original embroiderers’ hands and minds a thousand years later.

Excerpted from With Her Own Hands: Women Weaving Their Stories. Copyright © 2025 by Nicole Nehrig. Used with permission of the publisher, W. W. Norton & Company, Inc. All rights reserved.

The Founders’ Family Research

George Washington’s fellow founders reveled in genealogy as a means to explain themselves, their situation, and to some degree their new and important positions. Benjamin Franklin, John and Abigail Adams, Thomas Jefferson, and others undertook research, wrote and sketched out their family relationships, and discussed the meaning of these connections. It is more challenging to locate a founding father who was not interested in his own family’s founding than one who was. Family history research, correspondence about genealogy, the exercise of that information in court, and the public display of it were a matter of course for John and Abigail Adams, Benjamin Franklin, Thomas Jefferson, Alexander Hamilton, James Madison, James Monroe, and many more.

Sampler, by Abigail Adams, 1789. [Cooper Hewitt, Smithsonian Design Museum]

By the late 18th century, it was common to express ambivalence about genealogy: to expend time and energy on family history research while disavowing the significance of family history. Abigail Adams explained her own interest as purely academic, albeit intense. As she wrote about the document that would demonstrate the lineage of the Massachusetts Quincys’ descent from Saer de Quincy, she explained that “I do not expect either titles or estate from the Recovery of the Geneoligical Table … yet if I was in possession of it, money should not purchase it from me.” “Can it be wondered at,” she asked her sister, “that I should wish to Trace an Ancesstor amongst the signers of Magna Carta[?]” The signers of that document, contrary to the narrative of monarchy, could claim a lineage of expanding political rights and participation, one assumes.

But surely a key aspect of the founding generation’s ambivalence about genealogy was associated with family roots abroad, usually in England. That interest came with a whole host of associations, some problematic given the recent revolution and the geopolitics of the 1780s and beyond, and some appealing, as Adams illustrated, in terms of longstanding ideas about authority and authenticity.

The founders all appreciated that a deeper knowledge of their family’s past required genealogical research. They all took care to explain the character of their interest in their family’s past. And they all made claims to the significance of family, both in general terms and in terms of their particular family’s background. The timing of these elites’ genealogical interests in the post-revolutionary period and the evidence of extensive pre-revolutionary family interest in genealogy among their ancestors also illustrate the deep tradition of genealogy — of which they were well aware — that had developed by the early Republic. For Franklin, the Adamses, Jefferson, Hamilton, Madison, Monroe, and many others, rising political position often brought with it opportunities for, and an impetus to, genealogical research, reflection, and articulation.

When the founding generation turned to autobiographical reflection on the storied lives they had led, they began just as the previous generations did: rooted in family. Benjamin Franklin’s famous, posthumously published Autobiography began with his genealogical reflections and travels. When he turned to memoir, Thomas Jefferson began more casually, though his would remain in manuscript. “At the age of 77,” he wrote, “I begin to make some memoranda and state some recollection of dates and facts concerning myself.”

Yet after this sentence of introduction, Jefferson spent the next passages describing his father’s family (“the tradition in my father’s family was that their ancestor came to this country from Wales … the first particular information I have of any ancestor was my grandfather who lived at a place in Chesterfield called Ozborne’s and ownd the land afterwards the glebe of the parish”) and his mother’s family (“They trace their pedigree far back in England & Scotland, to which let every one ascribe the faith & merit he chooses”). When John Adams turned to autobiography he, too, began with family because, as he noted, “the Customs of Biography require that something should be said of my origin.” His relation of his paternal and maternal relatives was considerably longer than Jefferson’s, allowing him to fully root his own life in a long New England tradition. James Madison’s autobiographical manuscript treated his family the most briefly, but mirrored his father’s family record in recalling his birth when his parents were visiting relatives elsewhere in Virginia, thus echoing the family history.

Though he did not write an autobiography, or leave anything like a memoir, in the last years of his life George Washington evinced probably the most important and revealing investment in genealogy as a form of continuing importance for American elites on the cusp of a new century, hand on the tiller of a new nation. In 1791, Sir Isaac Heard, the Garter King of Arms at the College of Arms in London, wrote to President Washington with extensive information about the Washington genealogy and heraldry in England, as well as a request for more details about the family in America.

In addition to his expertise as England’s foremost genealogical authority, Heard was married to a Bostonian and had traveled in North America as a young man. His interest in the Washington family, Heard wrote, proceeded “from a sincere respect for the distinguished Character of Your Excellency” but also originated in his own American connections, “[c]ircumstances which have constantly excited my anxious Attention to the Scenes of that country & fervent wishes for the welfare of many families with which I had the happiness to be acquainted.” The materials included were, as one might expect from an expert genealogist, very detailed. There was a sketch of the arms and crest of the Washington family; the latter includes a raven rising, with wings poised, from a coronet. There was an abstract of the will of Lawrence Washington, George Washington’s paternal grandfather, and two items of estate administration that formed part of Heard’s research into Washington family connections. And there was an annotated family tree. All in all, it was an impressive package.

In his response (nearly half a year later) to Heard’s interest in learning yet more information about the Washington family from American sources, George Washington first wanted to be clear about his own and his country’s use for genealogy. He noted, “This is a subject to which I confess I have paid very little attention. My time has been so much occupied … that but a small portion of it could have been devoted to researches of this nature, even if my inclination or particular circumstances could have prompted the enquiry.” Further, “[w]e have no Office of Record in this Country in which exact genealogical documents are preserved; and very few cases, I believe occur where a recurrence to pedigree for any considerable distance back has been found necessary to establish such points as may frequently arise in older Countries.”

The president dissembled. Washington had long been interested in the history of his family, and deeply invested in the symbols of his paternal lineage. The coat of arms that Sir Isaac Heard sent from England was familiar from its long-standing and regular use by the Washington family. George Washington first commissioned silver with the Washington shield on it when he was in his twenties and had just taken full possession of Mount Vernon in 1757. Ready to furnish his home, he ordered a “Neat cruit stand & casters”; it was beautifully crafted and in the latest style. Two years later, he would marry Martha Dandridge Custis, who brought with her to the household at Mount Vernon — and her children would add to these — items adorned with each of those coats of arms.

Common to all of these founders’ founding stories is, first, a cognizance of such things as heraldry, along with an appreciation for and willingness to undertake family history research, usually involving communication with other family members, sometimes involving travel, always relying on the same kind of work done by previous generations. Second, they all framed their family history pursuits in ambivalent terms by the later 18th century. And third, none eschewed family history because of the potential taint of aristocracy. For these sons and daughters of mostly middling means who had become elite by virtue of leveraging property and politics, surrounded by plenty of other families who were celebrating centuries of elite status in the British colonies, and then in the American nation, family history was still an obvious privilege and one that they embraced rather than eschewed.

From Lineage: Genealogy and the Power of Connection in Early America by Karin Wulf. Copyright © 2025 by Oxford University Press and published by Oxford University Press. All rights reserved.

A Ghost from Kitchens Across the Nation

While most Americans today would likely be hard put to name a modern-day conjure woman if asked, a caricature of one smiled warily at them from their kitchen cupboards for over a century: Aunt Jemima, Pearl Milling Company’s cherished pancake mix mascot. Introduced in 1889, Aunt Jemima is a fictitious character based on Negro Mammies, enslaved women who held a central place on plantations. They were women like Harriet Collins and Harriet Jacobs, women who played a major role in the development of American food traditions and medicine. Negro Mammies were conjure women who used local flora to heal minor ailments; nursed all the children on the grounds, both Black and white; cooked and organized food in the Big House; provided advice to younger enslaved women; and offered spiritual comfort, often by way of mojos, sacred amulets, to the enslaved.

Mojos were a staple of hoodoo, a conjure tradition that developed in the Deep South and lower East Coast. Mojos often gave the oppressed confidence to rebel against their oppressors — slave against master, wife against husband. This use of mojos would live on well into the twentieth century, becoming one of the defining aspects of African American culture, especially our music.

While newly freed African Americans were busy telling stories about the mojos of Negro Mammies in their early blues songs, the American public began to wax nostalgic over plantation life. In the eyes of the American public, the Negro Mammy was a docile slave who championed the institution of slavery. The national worship of Negro Mammies reached a fever pitch in 1923. At the start of that year, a bill was put forward in the Senate to erect a million-dollar marble and granite statue of their beloved Negro Mammy in Washington, DC.

Most white Americans had never even owned slaves, much less been raised by a Negro Mammy. So how did the Negro Mammy, a figure who was tucked away on rural southern plantations, a figure who was relatively obscure in the nation before the Civil War, become a wildly popular national icon and lightning rod of racial conflict?

It began with a party.

Memory Jug, c. 1890. [Smithsonian American Art Museum]

In 1893, the United States government decided to throw a grand party in Chicago: the World’s Columbian Exposition, an international fair to celebrate the four hundredth anniversary of the “discovery” of America by Christopher Columbus in 1492. (The complications in setting up the fair, a daunting task in the then largely industrial and hardly picturesque Chicago, made the fair a year late for the anniversary.) At the World’s Fair, several countries, from neighboring Mexico to Far Eastern nations like Japan, were invited to set up exhibits. The World’s Fair organizers wanted to display to the nations — especially Europe — how far the United States of America had come in four hundred years; they wanted to stress that our wild democratic experiment had been a success. The evidence of that success was our rapid technological innovation at the turn of the twentieth century.

And if you were one of the twenty-seven million people who purchased a ticket to the fair for fifty cents between May and October, you would have indeed been privy to grand feats of innovation that showcased American ingenuity: the world’s first Ferris wheel, a 264-foot-tall wheel that spun on a seventy-one-ton axle, carried thirty-six cars that could each fit sixty people, and rivaled the heights of the Eiffel Tower, which had been featured at the 1889 World’s Fair in Paris; electric lights whose colors danced to music and whirled in fountains at a time when most Americans were still using oil lamps to light their homes; one of the first electric train lines, ferrying visitors on a loop in the air over the fair’s 663 acres; and Thomas Edison’s kinetoscope, which displayed a mesmerizing precursor to movies.

The contributions of African Americans were noticeably missing from this grand celebration. The World’s Fair organizers refused to include African Americans in the fair’s planning, actively barring our proposals for booths that showcased our extraordinary cultural and economic progress achieved merely thirty years after enslavement. The proposed booths would have been astounding to an American public who believed we would never rise above the status of lowly, ignorant servants. By the 1890s, we had doubled our literacy rate, providing a robust education to thousands of Black people who, under slavery, had been violently prohibited from learning to read or write. We had tripled the number of books written by African Americans. And significantly more African Americans had taken up the professions of teaching, ministry, medicine, and law.

In the end, it was Haiti, not America, that gave African Americans a place at the Chicago World’s Fair. Haiti, like many other countries, was represented at the World’s Fair with its own dedicated building. The Haitians opened their doors to African Americans, giving them a place to voice their complaints about the nation and to showcase their contributions to it. Ida B. Wells, at the time an investigative journalist, partnered with other leading Black intellectuals — Irvine Garland Penn, Ferdinand Lee Barnett, and Frederick Douglass — to produce a pamphlet called “The Reason Why the Colored American Is Not in the World’s Columbian Exposition.” Wells stood on the steps of the building dedicated to Haiti at the World’s Fair, passing out copies of this pamphlet to the visitors from all over the globe who stopped to gaze upon and consider the first and only free Black republic in the New World.

In the pamphlet, Wells pointed out that the wealth created by African Americans’ industry had afforded to the white people of this country the leisure essential to their great progress in education, art, science, industry, and invention. Wells understood that to try to tell the story of America without African Americans is as foolish as building a house upon shifting sands — which, as it happens, was exactly how the fair itself was built.

At the center of the fair was a gleaming “White City” that swayed on stilts. Workers had cleared forlorn-looking oak and gum trees in the large muddy swamp of Jackson Park, which sat on the shore of Lake Michigan. They drove large stilts deep into the sandy marsh to support six large buildings of stucco — a low-cost plaster. They painted these cheap buildings bright white to look as if they were marble. Styled after Greek and Roman architecture, these six buildings formed a square called the Court of Honor, showcasing the major areas of innovation in America: liberal arts, agriculture, anthropology, electricity, machinery, and mining. These hastily built, faux marbled buildings on shoddy foundations were to be the symbols of American progress. And so, the Court of Honor, a make-believe city, held all the tensions of the American dream: buildings with a gleam so white, so bright, they detracted from the muck below that upheld them.

It was in the Court of Honor’s agriculture building that you would find the exhibit of Aunt Jemima’s pancakes. Many of the products that were sampled in this building are still found in our grocery stores today, over a hundred years later, such as Quaker Oats, Cracker Jacks, and Wrigley’s Chewing Gum.

In 1890, R.T. Davis, the president of Davis Milling Company, bought Aunt Jemima’s pancake mix from Chris Rutt and Charles Underwood, who first developed the product in 1889. The first self-rising flour mix on the market, Aunt Jemima’s pancake mix was made of wheat, rice, and corn. This was a striking departure from the pancakes of the South — called hoecakes, ashcakes, johnny-cakes, or pone — which were typically made from cornmeal. To market this unfamiliar product to the American public, Chris Rutt decided to draw upon the preeminent form of entertainment in the late nineteenth and early twentieth century: minstrel shows, where white men darkened their skin with burnt cork to imitate the songs and dances of the enslaved.

Fairgoers who visited Aunt Jemima’s pancake exhibit would have recognized her name from “Old Aunt Jemima,” a staple song and skit of the minstrel circuit. When Rutt heard the song performed in an 1889 minstrel show and saw how popular it was among the crowd, it struck him that Aunt Jemima would be the perfect “face” for his product. After the show, Rutt plastered a grotesque painting of Aunt Jemima on every newspaper, magazine, and paper box advertising their new pancake mix. R. T. Davis took the branding further by casting Nancy Green, a formerly enslaved Black woman, to play the role of Aunt Jemima at the World’s Fair.

At the pancake exhibit, an enormous barrel of pancake flour loomed behind Green. At 16 feet high, 12 feet wide, and 24 feet long, this barrel was bigger than the average SUV. Draped in an apron and wearing the Negro Mammy’s customary red bandanna, Green flipped more than one million pancakes over the six months the fair was in operation. While she made pancakes, she sang spirituals and relayed stories about the “good old days” of slavery. It was said that the exhibit drew a crowd so large the police had to step in to keep the walkways clear. The live advertisement was an incredible success, fetching over 50,000 orders of Aunt Jemima’s pancake mix from fair visitors hailing from all over the country. Owing to her enthusiastic reception, the officials at the fair named Aunt Jemima “Queen of the Pancakes.”

Left: Aunt Jemima advertisement, 1894. [Wikimedia Commons] Right: Trademark registration for Aunt Jemima, 1903. [Library of Congress]

Consider the souvenir gift of the exhibit: a button pin featuring a smiling Aunt Jemima with the phrase “I’se in town, honey!” scrawled across the top. Aunt Jemima’s successful move from the rural, southern plantation to the bustling, urban “town” of the North (like Chicago) is predicated upon her willingness to remain a servant, a Negro Mammy to whites. This sentiment captures the attitudes of both northern whites who were agitated by the influx of African Americans and southern whites who were dismayed at the loss of their workforce during the mass migration. Through Aunt Jemima’s pancake mix, the longed-for Negro Mammy could return to white kitchens once again. 

But while white Americans saw in Aunt Jemima the docility and domesticity of the Old South, African Americans took something entirely different from the exhibit. African Americans, too, were familiar with the song “Old Aunt Jemima” — but it held a radically different meaning. African American minstrel performer and former slave Billy Kersands originally came up with the “Old Aunt Jemima” song and dance routine in 1875. And when African Americans came onto the minstrel stage, they often added layers of nuance to the routine that whites did not pick up on. After all, it was an incredibly ironic performance: African Americans were mocking white Americans, who had built a career out of mocking their castmates’ days of enslavement.

As scholar M.M. Manring observes, Kersands’s “Old Aunt Jemima,” which impersonates a Negro Mammy, is based upon a slave song called “Promises of Freedom.” This song added a great irony to his act, subtly making fun of the very whites who enjoyed minstrel shows. In the song, the enslaved ridicule slave masters who made vacuous pledges of future manumission. One verse features a mistress who promised to free the enslaved upon her death. Rather than fulfill her promise, she simply refuses to die, going plumb bald in her old age.

Whites in America did not realize that by welcoming Aunt Jemima into their homes, they were not only gaining access to her delicious pancakes; they were also partaking of the conjure African American women have been wielding for centuries. Beneath the light melody and playful dance that accompanied the song “Old Aunt Jemima,” the lyrics issued a deadly serious threat: a mojo used by the enslaved to get back at their masters who failed to uphold past promises of freedom.

When Rutt’s Aunt Jemima and Kersands’s “Old Aunt Jemima” are laid side by side, they tell a story of two different Americas. At the 1893 Chicago World’s Fair, Aunt Jemima’s pancakes gave whites the America they longed for — one where newly freed African Americans embraced the docility and domesticity posed by an imaginary Negro Mammy. But “Old Aunt Jemima” describes an America that has fallen short of providing the freedom it guarantees to all its citizens — an America that newly freed Blacks wanted to hold accountable for such failings. In Black America, Aunt Jemima rises like a ghost from kitchens across the nation, wielding the mojos of past Negro Mammies. Behind this popular pancake mix stands a secret history where Aunt Jemima is no longer a slave but a Black revolutionary.

Excerpted from the book The Conjuring of America: Mojos, Mermaids, Medicine, and 400 Years of Black Women’s Magic by Lindsey Stewart. Copyright © 2025 by Lindsey Stewart. Reprinted with permission of Legacy Lit, an imprint of Grand Central Publishing. All rights reserved.

A Mere Mass of Error

On October 22, 1880, the front page of Truth, a tiny and previously obscure New York City newspaper, was dominated by a story that threatened to doom the presidential hopes of Republican candidate James Garfield. Splashed across the front page was a large photograph (still a rarity in newspapers then) of a handwritten letter in which Garfield appeared to secretly promise to oppose efforts to ban Chinese immigration in order to protect the supply of cheap labor for industrialists. Never mind that Garfield’s Republicans had, like their Democratic rivals, already adopted Chinese exclusion in their campaign platform. A quickly convened court hearing provided expert and investigative evidence that the “Chinese letter” was a forgery and the whole affair a ginned-up illusion, only to be countered with competing experts and alternative facts that extended the hearings and kept the controversy in the news. Word of this muddied debunking chased the lie down the channels of the 19th-century information networks — telegraph and railroad lines — but initially did little to quell the outrage among the nearly all-white electorate of the Western states upon which the election now hinged.

Before Garfield and the Republicans could mount an effective response, the photographic image of the “Chinese letter” had spread across the entire nation. Flyers and posters of the images, often labeled as “Garfield’s Political Death Warrant,” were “being scattered throughout every [New York] county and school district”; being handed out to Chicago children “at the doors of the public schools” to bring home to their parents; becoming “the sole topic of conversation” in Toledo, Ohio, and in Nevada mining towns; and, as one member of the Democratic National Committee gloated, being “scattered all over the Pacific slope,” making “the Chinese problem” all at once “the foremost argument in the campaign.” The Los Angeles Herald declared: “The election of Garfield would be the signal for the discharge of all white men from employment by manufacturers and corporations and substitution of Chinese coolies.” (“Coolie” was a derogatory term for Asian laborers adopted from British colonial culture).

Nowhere did the arrival of this lie cause more mayhem and misery than in Colorado, where news of the Garfield letter set the match to an explosive anti-Chinese climate stoked for months by the local Democratic Party-aligned press. News of the letter’s claim was being flogged in Denver papers within a day of its publication in New York City, followed within days by photo-lithographic printing plates shipped by train that brought the photographic proof to Denver whites. Soon enough, on Sunday, October 31, a barroom assault on a handful of Chinese pool players erupted into a racial pogrom against the city’s Chinese population. Dozens of Chinese homes and businesses were burned, scores of Chinese immigrants badly beaten, and 28-year-old Lu Yang (Look Young) was dead.

This devastating “October Surprise” was rendered all the more potent by Garfield’s five-day delay in issuing an official denial. He privately assured Republican Party leaders that the letter was “a base forgery,” but, refusing their increasingly desperate pleas, told them that he “hoped to answer all my accusers by silence.” In accordance with the contemporary norm that it was unseemly for candidates to campaign for themselves, Garfield would agree only to have a surrogate, Republican National Committee chairman Marshall Jewell, denounce the letter as a forgery. There was more to Garfield’s delay than propriety, however. Without yet having seen a photograph of the letter, the candidate wasn’t entirely sure that he hadn’t written it, or rather that a member of his staff hadn’t perhaps done so and signed it on Garfield’s behalf, as was sometimes the practice with minor correspondence. Without sharing his uncertainties with his party leadership, Garfield, away from Washington, DC, quietly sent his secretary “to search our files which had been carefully indexed to see if they contained any such letter.”

In the meantime, the “Chinese letter” scandal metastasized, feeding on the uncertainty created by Garfield’s silence. Republicans responded first with moral outrage. “That there has been a most deliberate conspiracy, carried out in all its parts with foresight, with malign and infamous intent to destroy the name of James A. Garfield,” thundered celebrity preacher Henry Ward Beecher from his Brooklyn pulpit, denouncing the unseen wirepullers “who undertook, by lies, by forgery … to blight a fair fame,” and predicting that “the people [will] be the voice of God, come to judge such” men.

In the end, James Garfield won the 1880 presidential election, if just barely. The “Chinese letter” hoax seems to have cost him California and Nevada, and resulted in the slimmest popular vote margin in U.S. history (two thousand ballots out of nine million). While the hoax failed in its immediate aim of winning the White House for Democrats in 1880, it arguably contributed more to a successful and tragically consequential sleight-of-hand: convincing white workers to focus on nonwhite immigrants as the greatest threat to their prosperity rather than the white businessmen who set wages, hours, and working conditions.

Forty years before the “Chinese letter” hoax rocked the 1880 presidential election, a hoax involving the 1840 U.S. Census used the nascent authority of the new science of statistics to promulgate false evidence that the mental health of African Americans collapsed outside of slavery. Secretary of State John C. Calhoun, the infamous advocate of slavery responsible for the census, argued that emancipation “would indeed, to [the enslaved], be a curse rather than a blessing.” Calhoun deployed convenient census errors to inhibit abolitionist efforts to stop the spread of slavery to new U.S. states. The false conclusions drawn from the 1840 census became the first major dataset in what would become the massive edifice of American scientific racism that propped up U.S. white supremacy into the second half of the 20th century.

“Who would believe without the fact black and white before his eyes,” marveled a letter in the New York Observer, that “there is an awful prevalence of idiocy and insanity among the free blacks [and] … slaves?” Startling as it was, this conclusion was “obvious,” the writer explained, “from the following schedule,” referring to columns of data reproduced from the 1840 U.S. Census. This letter and its accompanying excerpt from the census were themselves quickly reproduced without analysis or comment in the American Journal of Insanity and other medical journals around the country, perpetuating the “fact” that, as one appalled white Northerner observed, “lunacy was … about eleven times more frequent for the African in freedom as in slavery” and that “more strange than this,” the mental health of free African Americans worsened still further the farther north from the Slave South they lived. The unexpected conclusion that freedom was unhealthy for “Africans” delighted slavery’s defenders and confounded their opponents in the antislavery movement. The conclusion was seemingly irrefutable, however, bearing as it did the authority of both the federal government and the new science of statistics.

Calhoun’s longtime adversary John Quincy Adams — the 77-year-old former president (upon whom Calhoun had been disagreeably foisted as vice president 20 years before), by then a Massachusetts congressman, and, three years earlier, defender of the Amistad slave ship rebels before the U.S. Supreme Court — was leading a call for the results of the 1840 census to be publicly retracted. According to Adams, some Massachusetts country doctor had reportedly “discovered that the whole of the [census] statements in reference to the disorders of the colored race were a mere mass of error, and totally unworthy of credit,” rendering the 1840 census “worthless, at best utterly botched and at worst maliciously falsified.” Adams would later claim that he had already convinced Calhoun’s predecessor of the falseness of the census data a week before the man’s unfortunate and dramatic demise. Calhoun now found it politically impossible to completely ignore Adams’ repeated accusations in Congress that “atrocious misrepresentations had been made” by the census of which existed “such proof as no man would be able to contradict,” and that the nation had, thanks to Calhoun, been “placed in a condition very short of war with Great Britain as well as Mexico on the foundation of these errors.” Adams demanded that the secretary of state reveal “whether any gross errors have been discovered in the printed Sixth Census … and if so, how those errors originated, what they are, and what, if any, measures have been taken to rectify them.”

Calhoun agreed to “give the subject a thorough and impartial investigation.” Adams savored watching Calhoun “writhe … like a trodden rattle on the exposure of his false” assurances regarding the accuracy of the census and grumble that “there were so many errors they balanced one another, and led to the same conclusion as if they were correct,” imagining (naïvely, it turned out) that the exposure of the errors would end the spread of the census’ false conclusions.

Ultimately, while Congress conceded that “in nearly every department of the late census errors have crept in, which go very far to destroy confidence in the accuracy of its results,” they declined to incur the “great expense” of commencing a new corrective census, and shrugged off the offending inaccuracies, concluding regarding the 1840 census that “its near approximation to the truth is all that can be hoped for.” The false claims about African American intelligence and sanity would stand. It was at this moment that the 1840 U.S. Census data became a species of hoax rather than simply a fiasco.

Indeed, the 1840 census was still a potent enough cultural force over a decade later that it was the only element of American racism that Harriet Beecher Stowe thought merited its own extended appendix in A Key to Uncle Tom’s Cabin (1854), her documentation of the truth behind Uncle Tom’s Cabin, her blockbuster 1852 antislavery novel. “In order to gain capital for the extension of slave territory,” Stowe fumed, “the most important statistical document of the United States has been boldly, grossly, and perseveringly falsified, and stands falsified to this day. Query: If state documents are falsified in support of slavery, what confidence can be placed in any representations that are made upon the subject?”            

What did accrue significant public confidence in the United States after the 1840 census, however, was the notion that science could be used to confirm racial inequality and defend racist institutions and laws while evading accusations of racial bias. American culture threw itself into the production of scientific racism with gusto for the next hundred years, justifying everything from slavery and segregation to racist immigration, marriage, citizenship, and sterilization laws.

This excerpt originally appeared in The Great White Hoax: Two Centuries of Selling Racism in America, published by The New Press. Reprinted here with permission.

Slave Hunts as “Normal Policing”

In May 1752 the French minister of the navy, Antoine de Rouillé, wrote to the governor of Saint-Domingue about the new problem of slaves in France. Slaves were “multiplying every day, more and more, in almost all the towns of the kingdom.” The minister’s disquiet followed a controversy that centered on an African man, age 22, whom I shall call Jean, though he also appears under other names (Charles-Auguste and Adonis) in the police archives. He was enslaved to Guy Coustard, a sugar planter in Saint-Domingue. Jean had the Coustard family’s monogram (CO) branded on his left breast.

Documents about Jean’s brief sojourn in France come from two slender files at the Bastille Archives, which contain letters to the lieutenant-general of police from the minister of the navy and from Jean’s would-be benefactor, the Dowager Princess of Nassau-Siegen, born Charlotte de Mailly de Nesle, who tried and failed to protect Jean from Coustard. Her staff and Coustard lodged in the same hotel, near the Luxembourg Palace. Through her servants, she learned of Jean’s physical abuse and despair.

From Mailly de Nesle we learn that Jean arrived in Paris during the spring of 1751 and fled from the city twice. On both occasions he tried to escape by joining the army. In March 1752 the French constabulary arrested him in Sedan, a frontier garrison town, and escorted him back to Paris in chains. He wound up in the dungeon of For l’Évêque, a former ecclesiastical prison. Many of the other inmates at that time were soldiers. Unlike Jean, who had hoped to become free by joining the army, those men were draftees, who had sought freedom from the army through desertion. On April 8, someone other than Coustard claimed Jean from prison. Port records in La Rochelle note that a slave named Jean sailed for Saint-Domingue in July.

The capture and imprisonment of Jean resulted from an order of the king, popularly known as a lettre de cachet. Masters paid a fee to police for these roundups and paid for the maintenance of their slaves in prison. In March 1752, Jean-Jacques Coustard, an elderly Parisian judge, lobbied the Crown to arrest Jean by royal writ. The judge did not own slaves himself and had probably never set foot in the colonies. He came from a clan of Angevine drapers who bought their way into the Paris legal establishment in the 17th century. The Paris Coustards abandoned trade for the law, to become a judging dynasty, just as a more intrepid, piratical sprig of the family settled in Saint-Domingue. The judge and Guy Coustard, Jean’s master, were cousins, not brothers. The capture of Jean resulted from the maneuvering of Crown officials to oblige both a sugar magnate and a member of the city’s judicial elite.

Jean’s failed bid for liberty offers a glimpse of how elusive freedom became for many slaves in Paris after the mid-18th century. His removal from the army and deportation back to Saint-Domingue resulted from new policing practices that crystallized around the time of his brief stay in France. Despite fleeing Paris, Jean became one of the first victims of an emerging system, based in France’s capital, by which slave owners, or their proxies, caused freedom-seeking domestics to disappear. 

 

The rising importance of the slave trade, and of colonial slave plantations, to Parisian social and economic life led the city’s elites to adopt a new attitude toward people of African and South Asian descent, whom they increasingly viewed as potentially saleable belongings. Resplendent sojourners from Saint-Domingue played a role in diffusing new racial concepts in Paris, but their influence should not be overstated. Ideas of race did not waft into the capital as a foreign essence. By 1750, slave plantations and the slave trade out of East and West Africa had become economically vital to Parisian institutions, including the Company of the Indies, which enjoyed direct support from the Crown and strong ties to Parisian high finance. There was nothing distantly managerial about the activities of Paris-based officials in the Africa trade. Consider this document from 1750, written one year before Jean arrived in Paris. Signed by all directors of the Company of the Indies, it sets forth a new scale of value for slave sales in Senegal.

RÉGULATION DES NOIRS, NÉGRESSES, NÉGRILLONS ET NÉGRITTES [Regulation of negroes, negresses, négrillons (boys), and négrittes (girls)]

21. Every negro between 14 and 40 will be reputed as one Indian piece so long as he has none of the defects indicated below.

22. One négrillon (boy) of 14 equals one Indian piece.

23. Four négrillons (boys) or négrittes (girls) from the age of 8 to 13 equal three Indian pieces.

24. Six négrillons (boys) or négrittes (girls) from the age of 4 to the age of 8 equal three Indian pieces.

25. Four négrillons (boys) or négrittes (girls) who are 4 years of age or younger equal one Indian piece so long as they are not nursing.

26. One negress who is between 14 and 35 years of age equals one Indian piece.

27. One negress who is between 13 and 14 years of age equals one Indian piece.

28. Men between 40 and 50 years of age, and women between 35 and 40 years of age, equal one-half Indian piece and cannot compose more than 3 percent of the cargo.

29. All nursing children will follow their mothers and not be counted.

30. All negroes, negresses, négrillons (boys), and négrittes (girls) will be considered valid Indian pieces so long as they are not epileptic, maimed, blind, or suffering from formal disease.

31. Some missing teeth, and negroes with enlarged testicles who do not have hernias, cannot be refused by captains and surgeons, or excepted from the above regulation.

32. Negroes with one bad eye who are not over 30 years, others of the same age who are missing one or two fingers, however robust their bodies, will only be counted as one-half an Indian piece.

33. A negro who is lacking two toes will be estimated as two-thirds of a piece; a negress in the same case will be evaluated similarly; and négrillons (boys) and négrittes (girls) by the same proportion.

To pin down the novelty of this document requires that we identify what is not new. At direct points of sale among slave buyers in Africa or the Americas, this meticulously commodified view of the human body was familiar. It was normal for company agents to haggle over people with missing toes and enlarged testicles. There is also nothing new about the term pièce d’Inde (Indian piece), from the Portuguese peça das Indias, which originally referred to the value of a piece of cloth exchanged for slaves in Africa by 15th-century traders. French merchants began to employ this term in the early 18th century.

What seems new is this bald enactment by Paris-based officials of a common system of meaning that binds together the capital and trading posts in Senegal in which Africans about 30 years old are whole units, Africans about 40 years old are half-units, and nursing babies, the blind, and ailing people literally have no value. This is not merely a blunt statement of adhesion to the language of the slave captain by the city’s most eminent merchants; it is the other way around. It is Paris scripting the dialogue at the point of sale.

Police sources about slaves in Paris might seem worlds away from plantation inventories, or Indies Company contracts, yet they convey the same matter-of-fact view of black people as property. Stakeouts and arrests could not have occurred otherwise. Urban slave hunts, far from chafing against local values, reaffirmed them. The property that officials in Paris were willing to defend changed in step with the kind of property that Parisians believed in. By the mid-century, policemen accepted that property could take the form of people.

Slave hunts brought the ideology of the slave owner into the streets of Paris, raising the question of what neighbors thought. At least for bystanders, the arrest of slaves looked just like regular police raids. The question is not how neighbors reacted to the spectacle of capture so much as how they understood the status of their neighbors’ domestics, whether they reported fugitives to the police, and whether they hid people. It is impossible to venture a single answer to this question. Police files offer many clues to friendship, love, and complicity between Parisians and enslaved people. There were, nonetheless, some residents of the city who described their neighbors’ domestics in the crudest possible terms. In 1751, la Dame Mallecot, the wife of an administrator in Cayenne, sought help from the police with the removal of Esther, an African (Igbo) domestic. Mallecot plotted the woman’s arrest, sent Esther to the home of an elderly neighbor, and left town. The neighbor’s son complained to the lieutenant-general of police. “I beg you sir to order that Mallecot come for her negress, whom I will return. It is her property, she will do with it what she wants.” Esther was “a deposit” (un dépôt) for his neighbor to reclaim.

There did not need to be a slave master in the picture. Police agents presumed black and brown people to be stolen goods even when no one reported them missing. The arrest of a man called Mustapha in 1755 offers a revealing instance of this. Mustapha, newly arrived from Marseille, was doubly jinxed. The police had doubts about the fancy napkins Mustapha was hawking on a bridge, and they were just as suspicious about the provenance of Mustapha himself. He deepened their concern by refusing to answer questions (although he was believed to know French) and spent four weeks in For l’Évêque. “We did not find anything in his pockets indicating to whom he belonged.”

 

During the reign of Louis XIV, royal officials began to theorize policing as a vast, tentacular cleansing project by an all-knowing state. As Michel Foucault observes, the rise of new policing ideas would change the structure of government as people began to reimagine its purpose. Policing became a boom topic for publishers and Crown officials, especially after the death of Louis XIV in 1715. The end of Louis’s long reign heightened the reforming zeal of police enthusiasts, inspiring dictionaries, treatises, proclamations, and experiments in repression and surveillance. In Paris, the word police encompassed just about everything. It meant ridding the city of moral filth, actual filth, crime and delinquency, crooked houses, illegal workers, badly lighted streets, family embarrassments, and riotous effervescence among the laboring poor. In the service of this billowing project, the lieutenant-general of police in Paris could issue his own royal writs for the arrest of undesirables, who entered dungeons without passing through the courts.

The practical ability of municipal authorities in Paris to police evolved over time. The creation of police inspectors in 1708, and the amplification of their role after 1740, altered the relationship between police and city dwellers. Through their webs of spies and informants, twenty police inspectors maintained an unrelenting, round-the-clock surveillance of lodging houses and rented rooms frequented by étrangers (strangers). The French word étranger, imbued with a sense of danger and suspicion, referred to outsiders in general, including people from elsewhere in France.

Changes to the policing of Paris responded to dearth, social unrest, and an increase in human mobility. Migration expanded both the city, as a physical space, and its population. The new brutal efficacy of police inspectors around the mid-century also came on the heels of war — the War of the Austrian Succession — and should be read in light of that conflict. As Arlette Farge notes, resistance to troop levies, together with mass desertion, spurred social upheaval in Paris. This may help to account for the menacing force of police in Paris after the war in confrontations with strangers and crowds.

Once agents of the Paris police put themselves in the service of slave owners, it became perilous for fugitives to hide in the city. Jean needed to escape from Paris and not into it. Enslaved domestics who accompanied masters to Paris in the 1740s tended to disappear after a couple of weeks.

Admiralty records provide numerous examples of flight by teenage Africans between 1742 and 1747. The police did not catch these people and there is no evidence they tried to. (They may have been focusing on deserters.) On the rare, documented occasions before 1750 when masters sought help from the police to recover enslaved domestics, nothing happened. In 1742 Anne-Marie-Josephe de Sorel, from Léogane, reported the flight of her slave Pierrot to the Admiralty. To find the boy, she summoned “Sir Genesty, exempt, and she charged him with conducting searches for the said negro, which he assures her of having done for several days and nights” to no effect. In August 1749 a Parisian solicitor reported the flight of his slave Jeanne, who remained at large despite “investigations and house searches that her master caused to be done” — which suggests another failed police hunt.

Masters in the 1750s who appealed to the police framed their demands by emphasizing the moral threat posed by escapees. At the time, the police and most of French society viewed the whole serving class as degenerate scoundrels. Through their depiction of runaways as urban contaminants, masters recast slave hunts as normal policing. In 1751 the Portuguese bishop of Noronha, governor of São Tomé, reported the flight of Figueret, “about 4 foot 3, black, dressed in black, in a curly wig gathered at the back, age 16 or 17, from Goa in the Indies.” Figueret was known to be spending his days at the Saint-Germain fair. Noronha explained that the boy “who belonged to him, has been extremely deranged for five or six months, since arriving in Paris, and it being important to oversee his conduct, to prevent him from committing some disorder, he would be very grateful for him to be put in the prison of For l’Évêque until he departs Paris for the Orient.” When informing the police about the flight of his slave, Louis Aubin, the Chevalier de Nolivos noted “how much pleasure (his arrest) would give me, because, independent of the real loss caused by this domestic, he swindled me.” Masters in the 1750s emphasized the resemblance between runaways and other delinquents. They did so to enable the extrajudicial arrest of people they regarded as valuable assets.

Excerpted from Slaves in Paris: Hidden Lives and Fugitive Histories by Miranda Spieler, published by Harvard University Press. Copyright © 2025 by the President and Fellows of Harvard College. All rights reserved.

Irrelevant at Best, or Else Complicit

It was not an optimistic time. In the United States, President John F. Kennedy and civil rights activist Medgar Evers had been shot dead in 1963, Malcolm X in 1965, and Dr. Martin Luther King Jr. and Robert F. Kennedy in 1968. Bodies piled up, too, in Vietnam. The year 1968 had brought a global surge of energy and solidarity: the growth of social movements, of struggles against dictatorships and authoritarian rule, of resistance even in the face of violent repression. But 1969 saw a massive global let-down. Coalitional hopes sagged nearly worldwide, replaced by feelings of chaos, dread, and hopelessness.

“Design,” whatever that might be, no longer looked to anyone like the answer to any of the world’s problems. At the 1969 International Design Conference in Aspen (IDCA) — the same conference that in 1961 had been themed “Man / Problem Solver,” that had emphasized the designer’s “great social responsibility” to help build “a new society with new institutions,” that had celebrated design’s capacity to “‘blast off’ for richer worlds” — the atmosphere had turned somber. The 1969 conference was titled “The Rest of Our Lives.” The industrial designer George Nelson bemoaned, in his conference talk, the difficulty of escape from “the perverted offspring of the American dream” — the dream itself having been brought about, Nelson said, in part by blind faith in technology. The conference’s overall mood, one commentator observed later, reflected “the despair the participants felt at the crumbling of American ideals.” 

The 1970 conference, titled “Environment by Design,” was even darker. Three days in, the American architect Carl Koch declared from the podium that “Our national leadership is unspeakable. The government’s sense of priorities is criminally askew. Our cities are rotting visibly before our eyes.” By a few days later, the program of organized talks had disintegrated. 

People gathered ad hoc in the conference tent to connect with one another and express ideas about the current crisis. A group of French participants read a screed against design itself, written for the occasion by Jean Baudrillard. Baudrillard’s statement lambasted the conference’s environmentalist theme as disingenuous (“Nothing better than a touch of ecology and catastrophe to unite the social classes”), even as it acknowledged, “The real problem is far beyond Aspen — it is the entire theory of Design and Environment itself, which constitutes a generalized Utopia; Utopia produced by a Capitalist system.” (Utopia, here, seems to imply the most self-delusional kind of fantasy.) 

The final hours of the conference, IDCA president Eliot Noyes wrote afterward, underlined “the relative irrelevance of the design subject in the minds of many who were attending.” At the subsequent board meeting, Noyes resigned as president, and the board resolved to search for a radically new form for the 1971 conference, if the conference were to be held again at all. Both the conferees and the board, Noyes reflected, now harbored “serious doubt as to whether at this moment in our national history and our state of emotional disrepair a conference on design can or should be held at all.” Focusing on design seemed irrelevant at best, or else complicit, deplorable, malign.

The whole concept of design was also under attack from those outside design’s professional bounds. In 1971, the German philosopher Wolfgang Fritz Haug published Kritik der Warenästhetik (later translated into English as A Critique of Commodity Aesthetics), a Marxist-cum-Freudian manifesto that described designers as the “handmaidens” of capitalism. Design, Haug contended, was an engine of the appetite-generating “illusion industry” of media and advertising, as well as of the broader consumer capitalist system behind them, all of which were organized around driving consumption and thereby producing profits. 

Haug, like the Frankfurt School before him, charged the modern culture industries and the commodities they produced with the manipulation of human beings. But Haug added a meaningful nuance to Theodor Adorno and Max Horkheimer’s thesis: he showed that manipulating people was only possible because design and its peer disciplines colluded with those people’s pursuit of self-interest, which was continuous, intelligent, and fully intentional. Even “manipulative phenomena” like design, as Haug put it elsewhere, still spoke “the language of real needs.” 

So what to make of design? Was it a necessary evil, or a poison to be eradicated? Neither: it was that poison’s dangerously sweet taste. Or, to use Haug’s own metaphor, design was like the Red Cross in wartime. “It tends some wounds, but not the worst, inflicted by capitalism,” Haug wrote. “Its function is cosmetic, and thus prolongs the life of capitalism by making it occasionally somewhat more attractive and by boosting morale, just as the Red Cross prolongs war. Thus design, by its particular artifice, supports the general disfigurement.”

1971 was also the year the Austrian American designer Victor Papanek published Design for the Real World. It has since become one of the most widely read design books in history; it has been published all over the world, has been translated into over twenty languages, and (as of 2024) has never fallen out of print. It’s a manifesto against what design had become. And it’s a passionate brief for what Papanek believed design could be.

 

As of 1971, Victor Papanek was dean of the newly formed School of Design at the California Institute of the Arts (CalArts). And he had begun to develop his own methodology for a design practice focused, he believed, on solving for real human beings’ real needs. 

Papanek preached design’s “unique capacity for addressing human issues,” as he put it in the magazine Industrial Design, and its “value beyond the purely commercial imperative.” His philosophy of “DESIGN FOR THE NEEDS OF MAN” was a set of seven “main areas for creative attack”:

1. Design for Backward and Underdeveloped Areas of the World.

2. Design for Poverty Areas such as: Northern Big City Ghettos & Slums, White Southern Appalachia, Indian Reservations in the Southwest and Migratory Farm Workers.

3. Design for Medicine, Surgery, Dentistry, Psychiatry & Hospitals.

4. Design for Scientific Research and Biological Work.

5. Design of Teaching, Training and Exercising Devices for the Disabled, the Retarded, the Handicapped and the Subnormal, the Disadvantaged.

6. Design for Non-Terran and Deep Space Environments, Design for Sub-Oceanic Environments.

7. Design for “Breakthrough,” through new concepts.

That designers should organize their work around addressing human beings’ real-world needs, however clumsily taxonomized—rather than around aesthetics, or function, or the profit imperative—was the message of Design for the Real World. First published in Swedish in 1970, it found global success when published in English in 1971, taking its place among other leftist English-language jeremiads of the time: Jane Jacobs’s The Death and Life of Great American Cities (1961), Rachel Carson’s Silent Spring (1962), James Baldwin’s The Fire Next Time (1963), Kate Millett’s Sexual Politics (1970), E. F. Schumacher’s Small Is Beautiful: Economics as if People Mattered (1973). 

Papanek’s book attributes a lot of agency to design: “In an age of mass production when everything must be planned and designed,” he writes, “design has become the most powerful tool with which man shapes his tools and environments (and, by extension, society and himself).” But the book doesn’t celebrate that agency. Instead, it charges designers, and the broader economies of production within which they operate, with wasting and abusing their power.

Take the process of creating and distributing a new secretarial chair. In a “market-oriented, profit-directed system such as that in the United States,” such a new chair almost invariably “is designed because a furniture manufacturer feels that there may be a profit in putting a new chair on the market,” rather than because there is any empirical evidence that a particular population’s sitting needs are not being met. The design team is simply “told that a new chair is needed, and what particular price structure it should fit into.” The team may consult resources in ergonomics or human factors, but inevitably they will find that the information available about their potential “users” is sorely lacking. So they design another generic chair, made neither to fit a specific population nor to solve a new problem. After some perfunctory testing, the chair hits the market, where, invariably, someone other than the secretary decides whether to buy it for her use. Some money is made. No one’s life improves. But the manufacturer is satisfied: “If it sells, swell.”

Young man paints the back of a wooden chair, by Arnold Eagle, c. 1940. [The J. Paul Getty Museum]

What should designers do instead? “A great deal of research,” Papanek replied. Designers should ask “big,” “transnational” questions: “What is an ideal human social system? … What are optimal conditions for human society on earth?” They should inquire into their potential users’ “living patterns, sexual mores, world mobility, codes of behavior, primitive and sophisticated religions and philosophies, and much more.” And they should learn about other cultures’ ways of prioritizing and addressing needs. They should undertake “in-depth study” of such “diverse social organizations” as the “American Plains Indians, the Mundugumor of the Lower Sepik River basin; the priest-cultures of the Inca, Maya, Toltec, and Aztec; the Pueblo cultures of the Hopi; the social structuring surrounding the priest-goddess in Crete; the mountain-dwelling Arapesh; child care in Periclean Greece; Samoa of the late 19th century, Nazi Germany, and modern-day Sweden”; et cetera, et cetera.

Papanek’s commitment to identifying needs by learning about the lives of specific users—largely those from non-Western cultures—might be called an “ethnographic” impulse: a drive to study groups of people (usually groups other than one’s own) and to document their cultures, customs, habits, and differences from an assumed norm. The ethnographic impulse played out not only in Papanek’s blockbuster book but also in his self-curation and self-presentation. He built a personal library, his biographer notes, containing hundreds of volumes of anthropological research and writing. Beginning in the 1960s, Papanek invited reporters into his home to photograph or draw him and his wife (whoever she was at the time) and their decor: Navajo weavings, Buddhist figures, Inuit masks and ritual artifacts, Balinese masks, other objects of vernacular culture.

Papanek also endeavored, through this period, to document his alleged ethnographic capital as a set of professional credentials. In the “biographical data” sheet (something like a curriculum vitae) that he presented to CalArts in 1970, Papanek wrote that he 

a. had traveled widely throughout Europe, Thailand, Bali, Java, Cambodia, Japan, etc.

b. spent nearly 6 months (with the Governor’s permission) living in a Hopi Indian pueblo

c. spent several months with an Alaskan Eskimo tribe and nearly five years in Canada

d. spent part of 5 summers in an art-and-craft centered milieu in the Southern Appalachians

e. received various grants that took me to Lapland, Sweden and Finland during the summer of 1966; Finland and Russia during the summer of 1967; and will take me to Russia, Finland, Sweden, and Norway during the summer of 1968

His biographer calls several of these items—particularly those suggesting that Papanek had carried out fieldwork with Hopi and Alaskan Eskimo tribes—“fallacious.” But that didn’t stop Papanek from repeating them across documents and forums. 

Excerpt adapted from The Invention of Design: A Twentieth-Century History by Maggie Gram. Copyright © 2025 by Maggie Gram. Available from Basic Books, an imprint of Hachette Book Group, Inc.

An Attempt to Defeat Constitutional Order

Conservatives in South Carolina first attempted to defeat the state’s new post-Civil War constitution by appealing to the federal government they had fought three years prior. A petition was submitted to Congress, describing the new constitution as “the work of Northern adventurers, Southern renegades, and ignorant negroes” and claiming that “not one percentum of the white population of the State approves it, and not two percentum of the negroes who voted for its adoption know any more than a dog, horse, or cat, what his act of voting implied.” Conservatives complained that “there seems to be a studied desire throughout all the provisions of this most infamous Constitution, to degrade the white race and elevate the black race, to force upon us social as well as political equality, and bring about an amalgamation of the races.” They ended the petition with a warning: “The white people of our State will never quietly submit to negro rule.”

Congress refused conservative entreaties. But conservatives persisted in their fight. To prevent Black people and Republicans from prevailing in the first elections after the constitution, many turned to coercion, intimidation, and violence. In testimony before a state legislative committee investigating a disputed election, one South Carolinian said that employers had resolved not to employ any man who voted Republican. The strategy was effective because many former slaves still relied on contracts with their former masters to earn a living. Slaveholders had exploited Black labor to build their wealth, and then used that wealth to build white political power.

Conservatives also used the legal system. One former slave was arrested and held without trial. Authorities released him when he agreed to vote Democrat. Sometimes, conservatives resorted to even more direct methods. In the spring of 1868, the Ku Klux Klan appeared in South Carolina for the first time, and worked to win the 1868 election for conservatives. After years of being denied a voice in the political process, Richard Johnson was excited to vote. But the night before the election, “the Ku Klux came through our plantation, and said if any of the colored people went to the polls the next day to vote, that they would kill the last one of them.” Some Black men on the plantation were so determined to vote that they still turned up at the polls. But several decided not to vote at the last minute because “the Democrats had liquor at the box upstairs and were drinking and going on in such a manner that the colored people were afraid to go up.” Eli Moragne was one of them. The day before the election, the Klan broke into his home, dragged him outside, stripped him naked, and then whipped him. He showed up despite the experience but was told that if he “voted the Radical ticket [he] would vote it over a dead body.” Armed white men stood between him and the ballot box.

Union Republican Ticket for Constitution, 1868. [University of South Carolina]

Sometimes, Democrats engaged in violence without bothering to wear their Klan robes. William Tolbert, a Democrat who helped murder a Black Republican, observed that “committees were appointed, which met in secret, and they appointed men to patrol in each different neighborhood.” This was done “to find out where the negroes were holding Union leagues.” They had instructions to “break them up, kill the leaders, fire into them, and kill the leaders if they could.” Committees were supposed to take ballots from Republicans and kill those who resisted. Republicans did resist; Tolbert described a scene in which one Republican had been shot dead and others had fled. The violence was effective. At one precinct, Tolbert would ordinarily have expected between four and five hundred Black men to vote, but Democratic committee members in the area only allowed two Black men to vote before they started shooting. There were similar drops in Black turnout across the state. For example, in Abbeville County, around 4,200 Black men were registered voters, but only 800 actually voted in 1868’s fall elections.

Republicans won the governorship and control of the legislature. But Democrats and conservatives saw that violence could be effective. 

Carte-de-visite of members of Republicans in the South Carolina State Legislature, 1868. [Wikimedia Commons]

State authorities did try to respond. Amid Klan violence sweeping the state, Governor Robert Scott signed a bill authorizing a state militia. However, most whites refused to serve, a trend that became especially pronounced when Governor Scott rejected all-white militia companies offered by former rebels. In the end, as many as 100,000 men, mostly Black, joined by the fall of 1870. They often wielded state-of-the-art weapons such as Winchester rifles. White newspapers spread conspiracy theories about the militia. For example, after describing the militia sent to Edgefield as “the Corps d’Afrique,” the Charleston Daily Courier claimed that it had come to the town to commence “the arrest of citizens on trumped up charges of being ‘rebel bushwackers,’” and “‘members of the Ku Klux Klan.’” It then suggested that the militia had tortured an innocent white man into admitting that he was a “bushwacker.” Two things appear to have been truly offensive about Black militia units. First, they inspired pride among Black people. The paper complained that when a Black militia unit went to Edgefield, “the negroes of Edgefield became exceedingly jubilant, and determined to congratulate the colored soldiers on their great victory.” Second, the militia gave Black men another economic option besides relying on their former masters. As the paper lamented, “Among the numerous evils which have resulted to the people of Edgefield from this invasion of the county by the negro militia, has been the desertion of the fields by the negro laborers.”

Violence between Black militia units and white people erupted in Laurens County right after the 1870 election. After a gun discharged during a fight between a police officer and a citizen, a white mob began shooting at militia in the town. Several Black men and a few white men died during the fighting and in the subsequent upheaval. One of them was Wade Perrin, a Black legislator. White men caught up to him, ordered him to dance, sing, pray, and then run away. While he was running, they shot him in the back. Between 2,000 and 2,500 armed white men occupied the town. They had confiscated militia weapons from the armory. Two different stories developed about what had caused the violence. The Daily Phoenix blamed Black people. In the months before the 1870 election, the paper reported, “the white people had been subjected to an organized system of disparagement, abuse, and threats of violence to person and property, which had produced that feverish state of feeling incident to a deep sense of outrage and injustice.” Black people had allegedly become so unruly that “for weeks, whole families had not undressed for bed, so great was the apprehension of midnight negro risings, burnings and butcheries.”

The South Carolina Republican, however, claimed that a white man deliberately attacked a policeman to provoke him into firing so they would have an excuse to shoot. This must have been a premeditated plot because “it was not three minutes after the first shot was fired before a line of white men had formed across the public square … The white men came from every direction, out of the stores, the courthouse, and every other place, and what appears very singular is that every one was fully armed.” After the white men had fired on the militia, the paper reported that “white couriers were dispatched on every road, to rouse the people, so that by night at least one thousand men were scouring the countryside on horseback, and in little squads hunting up Radicals.” The incident attracted national media coverage. The New York Herald observed that “‘The War of the Races’ in South Carolina did not end with the rebellion, but occasionally bursts forth with its wonted fury.”

Governor Scott declared martial law in four South Carolina counties. But he also ordered remaining militia weapons in Laurens County transferred to Columbia. Removing the weapons ensured that the militia couldn’t be a serious fighting force and made the martial law proclamation meaningless. A wave of Klan violence swept the state after Laurens. The violence diminished temporarily later in 1871, though there is disagreement about why. Some have suggested that aggressive federal measures were responsible. 

In 1871, the federal government stationed more troops in the state and engaged in a thorough intelligence gathering operation to learn more about the Klan. Federal legislation authorized President Ulysses S. Grant to use the military to enforce the law and placed congressional elections under federal supervision. What became known as the Ku Klux Klan Act allowed Grant to suspend the writ of habeas corpus when he deemed it necessary. After considerable debate, Grant suspended the writ in nine South Carolina counties on October 17, 1871. Over the next months, federal authorities arrested thousands of men for allegedly participating in the Klan and secured dozens of convictions and guilty pleas. These efforts were enough for one historian to claim that “the limited steps taken by the Federal government were adequate to destroy” the Klan.

Indeed, Klan violence was lower for the end of 1871 and some of 1872 than it had been earlier. At the time, however, law enforcement officials themselves were skeptical about whether their efforts had been effective. One prosecutor even suggested that “orders were given” from unknown persons to end the violence “for the present” and that the Klan would simply “wait until the storms blew over” to “resume operations.” By the summer of 1872, Klan activity intensified, indicating that any benefits from federal intervention were limited.

Left: Jonathan Jasper Wright, 1870. [Wikimedia Commons] Right: William Whipper, c. 1879. [Wikimedia Commons]

Given the immense opposition it faced, South Carolina’s government recorded important achievements. The state greatly extended educational opportunities. In 1868, 400 schools served only 30,000 students. But by 1876, 2,776 schools served 123,035 students. The state also expanded the University of South Carolina, even providing 124 scholarships to help poor students with tuition.

Perhaps most importantly, South Carolina saw unparalleled Black involvement in politics during Reconstruction. During these years, 315 Black men served in political office. Six served in Congress. Two Black men served as lieutenant governor. South Carolina was a place where a parent could take to the legislature a son who had experienced chattel slavery just three years earlier, point to a majority of the members, and say, “that could be you one day.” The state that was the first to plunge the nation into Civil War because of its commitment to Black slavery was also the first to raise a Black man up to its supreme court. Jonathan Jasper Wright was born in Pennsylvania to free Black parents and managed to save enough money to attend college, a rare feat for both white and Black people in the era. He read law in his spare time while teaching to support himself. Upon passing the bar, he became the first Black lawyer in Pennsylvania. After the Civil War, he came to South Carolina to organize schools for freedmen. Wright had a neatly trimmed beard and mustache, and his somber eyes betrayed a young man in a hurry, or a man weighed down with cares, or perhaps both.

Corruption marred all of the progress. In 1870, the Charleston Daily News wrote that “the South Carolina Legislature enjoys the reputation, both at home and abroad, of being one of the most corrupt legislative bodies in existence.” Corruption was so bad, the paper claimed, that “a remark frequently made among the white men in Columbia, Radicals and Democrats, was that two hundred thousand dollars, judiciously circulated among the legislators, would secure the passage of a bill repealing the Emancipation act, and putting all but colored legislators back in slavery.” The paper then asserted that there was an organization known as the forty thieves pillaging the treasury. The organization allegedly had a captain, three lieutenants, four sergeants, and twenty-eight privates. The group conspired to prevent the legislature from passing any “measure unless money was paid to the members of the organization.”

Although conservatives may have exaggerated corruption, it did plague South Carolina during Reconstruction. After John Patterson won election to the U.S. Senate, authorities arrested him when a legislator said he had voted for Patterson after receiving a bribe. Critics called Patterson “Honest John,” supposedly because he always made good on his promises to pay bribes. The legislature attempted to impeach Governor Scott for his behavior in issuing bonds. At the end of 1871, a Republican newspaper lamented that “1872 finds South Carolina financially in a bad way, with no one to blame but officials of our own party. This is a disagreeable statement to make, but it is the truth.” William Whipper, who had argued for enfranchising women at the 1868 constitutional convention, asserted that Scott had bribed legislators to escape impeachment.

All the corruption caused schisms in the Republican Party. Eventually Whipper, who would himself be accused of corruption, asserted, “It is my duty to dissolve my connection, not with the Republican Party, but with the men, who by dishonesty, demagogism and intrigue have defamed the name of Republicanism, and brought financial ruin upon the State.” Disgruntled Republicans joined the new Union Reform Party along with some Democrats. In the 1870 campaign, the party’s platform was “honesty against dishonesty — cheap, economical government against exorbitant taxation — reduction of public expenses against extravagant expenditure of the people’s money — responsibility of officials for the faithful discharge of their duties against irresponsibility, selfishness and greedy absorption of power.” The Reform Party failed to win the fall elections, though members alleged fraud and intimidation at the polls. Corruption in the Republican Party deprived it of unity precisely when it was most needed to overcome the massive resistance it faced.

Some observers even claimed that corruption led to the Klan violence against Black people and Republicans. But whatever else is true about the corruption in the South Carolina Republican Party, it does not explain the attempt to overthrow the constitutional order. We know this because conservatives and Democrats never gave the 1868 constitution or the Republican Party a chance. They schemed to prevent a constitutional convention in the first place, protested to federal authorities, and used terrorism, cold-blooded murder, and economic coercion to prevail in the 1868 general election. The reality is that, given their hostility to Black political advancement, they would have engaged in violence and attempted to defeat the new constitutional order even if every Republican official had been honest and efficient.

Excerpt adapted from Sedition: How America’s Constitutional Order Emerged from Violent Crisis by Marcus Alexander Gadson. Copyright © 2025 by New York University. Published by NYU Press.

Lethal Injection Is Not Based on Science

We know how to euthanize beloved pets — veterinarians do it every day. And we know how physician-assisted suicide works — it is legal in several states. If drugs can be used to humanely end life in these other contexts, why is it so difficult in the death penalty context? The answer is one of the best-kept secrets of the killing state: lethal injection is not based on science. It is based on the illusion of science, the assumption of science. “What we have here is a masquerade,” one lab scientist says. “Something that pretends to be science and pretends to be medicine but isn’t.” Consider first the birth of lethal injection.

In 1976, the Supreme Court gave states the green light to resume executions after a decade of legal wrangling over the constitutionality of the death penalty, and Oklahoma was eager to get started. The only hitch was how to do it. Oklahoma’s electric chair was dilapidated and in need of repair, but more importantly, it was widely viewed as barbaric and inhumane. The state was looking to try something new. A state legislator approached several physicians about the possibility of death by drugs — a lethal injection. They wanted nothing to do with it, but the state’s medical examiner, Dr. Jay Chapman, was game. “To hell with them,” the legislator remembered Chapman saying. “Let’s do this.”

Chapman had no expertise in drugs or executions. As Chapman himself would later say, he was an “expert in dead bodies but not an expert in getting them that way.” Still, he said he would help and so he did, dictating a drug combination to the legislator during a meeting in the legislator’s office. Chapman first proposed two drugs, then later added a third. Voilà. In 1977, the three-drug protocol that states would use for the next 30 years was born.

The idea was triple toxicity — a megadose of three drugs, any one of which was lethal enough to kill. The first drug, sodium thiopental, would kill by barbiturate overdose, slowing respiration until it stopped entirely. The second drug, pancuronium bromide, would kill by paralyzing the diaphragm, preventing it from pumping air into the lungs. And the third drug, potassium chloride, would kill by triggering a cardiac arrest. The effects of the second and third drugs would be excruciatingly painful, so the first drug did double duty by blocking pain as well.

How did Chapman come up with his three-drug combo? “I didn’t do any research,” he later confided in an interview. “I just knew from having been placed under anesthesia myself, what was needed. I wanted to have at least two drugs in doses that would each kill the prisoner, to make sure if one didn’t kill him, the other would.” As to why he added a third drug, Chapman answered, “Why not? … You wanted to make sure the prisoner was dead at the end, so why not add a third drug,” he said, asking: “Why does it matter why I chose it?”

This is how the original three-drug lethal injection protocol came to be: a man working outside his area of expertise and who had done no research just came up with it. “There was no science,” says law professor Deborah Denno, one of the leading experts in the field. “It was basically concocted in an afternoon.” As another lethal injection expert, law professor Ty Alper, put the point, Chapman “gave the matter about as much thought as you might put in developing a protocol for stacking dishes in a dishwasher.” For the careful dish stackers among us, it’s fair to say he gave it less.

But that was good enough for Oklahoma, which adopted the new execution method without subjecting it to a shred of scientific scrutiny. No committee hearings. No expert testimony. No review of clinical, veterinary, or medical literature. The state was embarking upon an entirely new way to kill its prisoners, and did none of the most basic things.

Texas followed Oklahoma’s lead the next day, and then other states did too, carelessly copying a protocol that had been carelessly designed in the first place. “There is scant evidence that ensuing States’ adoption of lethal injection was supported by any additional medical or scientific studies,” a court reviewing the historical record wrote. “Rather, it is this Court’s impression that the various States simply fell in line relying solely on Oklahoma’s protocol.” As Deborah Denno observes, the result was an optical illusion — states touted a “seemingly modern, scientific method of execution” without an iota of science to back it up. Jay Chapman was as surprised as anyone by other states’ adoption of his protocol. “I guess they just blindly followed it,” he later stated, adding, “Not in my wildest flight of fancy would I have ever thought that it would’ve mushroomed into what it did.” “I was young at the time,” he explained. “I had no idea that it would ever amount to anything except for Oklahoma.”

Over time, every death penalty state in the country would adopt Chapman’s three-drug lethal injection protocol — not because they had studied it, but because in the absence of studying it, there was nothing to do but follow the lead of other states. “I didn’t have the knowledge to question the chemicals,” one warden explained, saying that he had “no reason to because other states were doing it.” “It wasn’t a medical decision,” an official from another state explained. “It was based on the other states.”

Sociologists have a name for this, a term of art for fads based on a faulty assumption. They call it a “cascade to a mistaken consensus,” and lethal injection is a textbook example. States had come to a consensus in adopting the three-drug protocol, but it was based on the assumption that other states knew what they were doing. They did not.

 

The fact that the three-drug protocol wasn’t based on science is not to say that science on the drugs didn’t exist. All three drugs were FDA approved, so there were studies and FDA warning labels saying what each drug did. The problem was that none of that science could predict what would happen when the drugs were used in lethal injection. Lethal injection is an “off-label” use of a drug, and although doctors use drugs for off-label purposes all the time, they aren’t trying to kill people, so their off-label use doesn’t come anywhere close to the use of those drugs as poison in lethal injection. Lethal injection uses drugs in amounts that no one has ever prescribed, let alone studied in a research setting. It delivers the entire dose of a drug at once — a practice known as “bolus dosing” — rather than delivering the drug in an IV drip, as is typical for large doses in the clinical setting. And it uses combinations of drugs that are simply unfathomable in the practice of medicine, giving rise to the possibility of “profound physiological derangements” (science-speak for freakishly weird results), as overdoses of different drugs affect the body in different ways.

Who knew what was going to happen when all three of these perversions came together? No one did, and the studies to find out had not even begun. In the biomedical research setting, a baseline showing of scientific support is required for testing on animals, and the three-drug protocol didn’t even meet that threshold. As one lab scientist quipped, “You wouldn’t be able to use this protocol to kill a pig.”

But states weren’t killing pigs. They were killing people, so they forged ahead, undaunted by the unknowns. Yet over time, the executions that followed created data points of their own, and those data points drew scientists. If states would not go to the science, science would come to them.

Granted, the data was thin. In some states, the problem was secrecy. “There is an enormous amount of information from executions (autopsies, toxicology, ECG recordings, EEG recordings, execution logs, and photographs),” one expert explained, “but most of it has been kept secret.” In other states, the problem was poor record-keeping. In still others, it was a state’s decision to stop keeping records altogether. For example, Texas — which conducts more executions per year than any other state — stopped conducting post-execution autopsies in 1989. “We know how they died,” a state spokesperson stated when asked about the reason for the no-autopsy policy.

That said, the raw data that scientists did manage to get was enough to raise serious concerns about the three-drug protocol. State officials were making “scientifically unsupportable” claims about lethal injection, researchers stated, so they decided to look at the data to see what it showed. In 2005 and 2007, researchers published two peer-reviewed studies on lethal injection, the first major studies of their kind.

In the first study, researchers obtained toxicology reports from forty-nine executions in Arizona, Georgia, North Carolina, and South Carolina. (Texas and Virginia, the two states with the most executions in the country at the time, refused to share their data.) Because they had no other way to determine whether prisoners were anesthetized when they were injected with the second and third drugs, researchers measured the postmortem amounts of sodium thiopental (the first drug) in the blood, finding that most prisoners had amounts lower than what was necessary for anesthesia, and some had only trace amounts in their system.

“Extrapolation of ante-mortem depth of anesthesia from post-mortem thiopental concentrations is admittedly problematic,” the researchers conceded. Still, the wide range of sodium thiopental amounts in prisoners’ blood suggested gross disparities during their executions as well. “It is possible that some of these inmates were fully aware during their executions,” the researchers stated, but their conclusion was more modest: “We certainly cannot conclude that these inmates were unconscious and insensate.”

Vigorous debate ensued. “You can’t take these post-mortem drug levels at face value,” one forensic pathologist stated, explaining that the amount of a drug in the blood dissipates after death, just as it does in life, and most autopsies in the study were conducted around twelve hours after death, so the postmortem measurements didn’t say much about the sodium thiopental in a prisoner’s blood during the execution. The study’s authors shot back with point-by-point responses to the criticism, but the damage was done. The so-called “Lancet study,” named for its publication in one of the most prestigious medical journals in the world, would forever be tainted by skepticism.

Had the first study been the only study of the three-drug protocol, one might have said that the science was inconclusive. But a second study was published two years later, and its findings were far less subject to dispute. In the second study, researchers examined execution logs in California. California’s expert had testified that the effects of sodium thiopental were well understood. Within sixty seconds of receiving the overdose, “over 99.999999999999% of the population would be unconscious,” the state’s expert stated, and “virtually all persons [would] stop breathing within a minute.” But when researchers examined the logs from California’s eleven executions by lethal injection, they found that this was not the case. In six of the eleven cases — 54% — the logs showed that the prisoner “continued to breathe for up to nine minutes after thiopental was injected.”

This was alarming not only because it showed that the state’s expert was wrong, but also because it suggested that the prisoners had died torturous deaths. In the absence of a trained professional assessing anesthetic depth, the cessation of breathing provides a rough proxy for adequate anesthesia. Thus, the fact that over half the prisoners continued breathing was an ominous sign that they had not been fully anesthetized prior to injection of the drugs that would cause slow suffocation and cardiac arrest. Executioners had recorded prisoners’ vital signs, but had not understood what they meant.

California’s execution logs revealed another problem as well: the same six prisoners who continued to breathe did not go into cardiac arrest after injection of the third drug, potassium chloride, which the state’s expert had said would kill within two minutes. Given the massive dose of potassium chloride, how could this possibly be? The answer was one of the “profound physiological derangements” that no one saw coming, at least not until researchers documented it: the bolus dose of sodium thiopental had depressed circulation so dramatically that it blunted the bolus dose of potassium chloride. Prisoners’ hearts raced in response to the potassium chloride, but not enough to induce cardiac arrest, leaving them to die by slow suffocation from the paralytic instead.

The findings from California’s execution logs led a federal court to invalidate the state’s lethal injection protocol in 2006. “The evidence is more than adequate to establish a constitutional violation,” the court stated, noting that it was “impossible to determine with any degree of certainty whether one or more inmates may have been conscious during previous executions or whether there is any reasonable assurance going forward that a given inmate will be adequately anesthetized.” The governor has since declared a moratorium on executions in the state, and it remains in place today.

Looking back, it’s fair to say that for the first 30 years of lethal injection, states used a three-drug protocol without understanding how it actually worked. State experts made claims and stated them with confidence, but what they said didn’t turn out to be true. Sodium thiopental didn’t do what states said it would do, and potassium chloride didn’t do what states said either — largely because no one accounted for the possibility that a bolus dose of the first drug would blunt the bolus dose of the third. States had no idea what their toxic drug combinations would actually do. They were slowly suffocating prisoners to death, and they didn’t have a clue.

Excerpt adapted from Secrets of the Killing State: The Untold Story of Lethal Injection by Corinna Barrett Lain. Copyright © 2025 by New York University. Published by NYU Press.

Are You Not Large and Unwieldy Enough Already?

On May 25, 1836, John Quincy Adams addressed the U.S. House of Representatives in an hour-long oration. Eight years earlier, when Adams was still president of the United States, an address of such length by the erudite Harvard graduate would have been unremarkable. But by 1836, Adams was no longer president. He had been defeated for reelection by Andrew Jackson in 1828; left the White House in 1829 without attending his successor’s inauguration; quickly grown restless in retirement as he observed with dismay Jackson’s populist, expansionist, and proslavery policies; and returned to Washington in 1831 as a member of the House. The nominal issue that inspired Adams’ sprawling speech in 1836 was a resolution authorizing the distribution of relief to settlers who had fled their homes in Alabama and Georgia following a series of violent altercations with Indigenous people. Adams used that conflict as an opportunity to embark on a wide-ranging discourse. As a Congressional Globe journalist archly put it, the ex-president addressed the chamber “on the state of the Union.”

Although Adams expounded on numerous subjects, he focused on the most pressing issue of the moment: the rebellion in the Mexican province of Coahuila y Tejas (or, as Americans called the northern part of the province, Texas). Beginning in October 1835, “Texians,” as expatriate American settlers in Texas were known, had revolted against Mexican rule. By April 1836, the Texians had unexpectedly defeated the Mexican force sent to subdue them, achieved a fragile independence, and appealed to the United States for annexation. Jackson plainly favored annexation, and Adams accused numerous House members of “thirsting” to annex Texas as well.

In dire terms, Adams warned against expanding the boundaries of the United States to include Texas. His opposition to annexation may have surprised some of his colleagues in the House. As a U.S. senator from Massachusetts in 1803, he had been the only Federalist to vote in favor of Thomas Jefferson’s acquisition of the Louisiana Territory. In 1818, as secretary of state during the administration of James Monroe, he had defended Andrew Jackson when Jackson, then an army general, had invaded Spanish Florida. In 1821, Adams acquired Florida for the United States from Spain in return for setting the southwestern boundary of the United States at the Sabine River — the border between the modern states of Louisiana and Texas. With that agreement in place, Adams believed that U.S. expansion had gone far enough.

Before the House in 1836, he argued that to extend the already “over-distended dominions” of the United States beyond the Sabine would be an untenable overreach. “Are you not large and unwieldy enough already?” he asked proponents of annexation. “Is your southern and southwestern frontier not sufficiently extensive? Not sufficiently feeble? Not sufficiently defenceless?” Annexation, he predicted, would precipitate a war with Mexico that the United States might well lose. Adams warned that Mexico had “the more recent experience of war” and “the greatest number of veteran warriors.” He reminded the House of ongoing U.S. military stumbles in Florida, where the United States had struggled to establish its control since acquiring the peninsula from Spain: “Is the success of your whole army, and all your veteran generals, and all your militia-calls, and all your mutinous volunteers against a miserable band of 500 or 600 invisible Seminole Indians, in your late campaign, an earnest of the energy and vigor with which you are ready to carry on that far otherwise formidable and complicated war?” Not least of all, he warned that if Mexico were to carry the war into the United States, the invader would find numerous allies among slaves and especially among the Indigenous people whom the United States was in the process of removing to the Indian Territory on the border with Texas. “How far will it spread,” Adams asked, should Mexico invade the United States, “proclaiming emancipation to the slave and revenge to the native Indian”? In such an instance, “Where will be your negroes? Where will be that combined and concentrated mass of Indian tribes, whom, by an inconsiderate policy, you have expelled from their widely distant habitations, to embody them within a small compass on the very borders of Mexico, as if on purpose to give that country a nation of natural allies in their hostilities against you? Sir, you have a Mexican, an Indian, and a negro war upon your hands, and you are plunging yourself into it blindfold.”

Adams’ speech sparked a debate that consumed five hours, causing the House to stay in session long into the evening. That night, Adams, in his inimitably cramped handwriting, recorded the day’s events in his diary. He congratulated himself that he had succeeded in sapping the House’s enthusiasm for annexation. Indeed, Adams and his like-minded colleagues in Congress managed to deter annexation for nine more years.

Ornamental map of the United States and Mexico, 1846. [David Rumsey Historical Map Collection]

In Adams’ view, the United States, which between 1783 and 1836 had expanded its territory northwest into the Great Lakes region, west into the Great Plains, and south to the Gulf of Mexico, had swollen beyond its capacity either to exercise effective sovereignty over border regions or to defend its extended borders against imperial competitors. The U.S. presence in the borderlands, a multilateral and multiethnic region, was tenuous: until the 1840s, Britain dominated the region between the western Great Lakes and Oregon, while Spain and, later, Mexico controlled the region between Texas and California. The success of the Seminoles together with the escaped slaves who were allied with them in resisting U.S. forces in Florida was hardly exceptional. In the western Great Lakes region, the Ojibwe dominated. The British liberally supported the Ojibwe and other Indigenous nations in the Great Lakes region. In the event of another war with Britain, the natives were likely to once again be British allies as they had been in the War of 1812. As for the horse-mounted natives of the Great Plains such as the Comanches and the Lakota, the United States in 1836 could not even begin to imagine challenging their control of the grasslands. Likewise, the fear that an invasion by a foreign power on the southwestern border might spur a slave revolt was quite real; by promising freedom, the British had encouraged thousands of enslaved people to join them in fighting against the United States in both the Revolutionary War and the War of 1812. In the first decades of the 19th century, numerous slaves fled from Georgia and Louisiana to Florida and New Spain; once in Spanish territory, maroon communities encouraged further flight and, slaveholders feared, rebellion. In short, Adams was entirely correct that in the first decades of the 19th century, the United States maintained a relatively weak presence on its borders where it had to contend with powerful, autonomous native groups, fugitive slaves, and competing imperial powers.

Leaders such as Adams who in the first decades of the 19th century pondered the weaknesses of the United States in its border regions were in many respects confronting a new problem. Before 1800, the most profitable imperial holdings in the Americas were of two types: sugar plantations in the Caribbean and coastal Brazil; and Spain’s silver mines at Potosí in the Andes and the Bajío in Mexico. Almost everywhere else, until the end of the 18th century, the British, French, Spanish, and Portuguese empires in continental North and South America were primarily commercial and tributary rather than territorial. European imperial settlements on the American mainland, with the notable exceptions of the Spanish silver mines and a few other places such as Mexico’s Central Valley, hugged the coastlines. European empires claimed sovereignty over the vast interiors of the Americas chiefly through the reciprocal exchange of gifts and tribute with native leaders and by virtue of the commerce in animal products and slaves that European merchants carried on with Indigenous people.

Thus, throughout much of British, French, and Spanish North America, European imperial claims to territory depended on the commercial and diplomatic loyalties of Indigenous people. European military forces occasionally launched punitive expeditions into the interior against natives who resisted these commercial and diplomatic arrangements but rarely managed, or even tried, to establish an enduring military presence. Imperial boundaries, in this scheme, remained only loosely defined.

This system, in which Indigenous people held considerable influence, began to change in the late 18th and early 19th centuries, as European empires shifted away from defining sovereignty in terms of relationships with Indigenous people and toward negotiating imperial boundaries with each other. In 1777, for instance, Spain and Portugal agreed in the first Treaty of San Ildefonso to create a joint boundary commission to survey the border between their South American empires, marginalizing the Indigenous nations who lived in those lands. When the United States and Spain agreed to a border between Georgia and Spanish Florida in 1795, they did not consult with the Seminoles who inhabited the territory. Indigenous people were similarly excluded in 1818, when the United States agreed to a treaty with Britain establishing the northern boundary of the United States and providing for joint Anglo-American occupation of Oregon. They were likewise left out in 1821, when Adams negotiated the agreement with the Spanish minister Luis de Onís that established the border between the United States and New Spain at the Sabine River. All these agreements belonged to a larger effort by European powers and the United States to sideline Indigenous people and negotiate imperial boundaries among themselves. European- and American-made maps reflected the shift in imperial mentalities: in the 17th and 18th centuries, when imperial claims depended on alliances with Indigenous people, maps of the North American interior abounded with the names of Indigenous nations. By the 19th century, similar maps had erased references to Indigenous nations and showed only empty space.

Yet while European powers and the United States could erase Indigenous nations from their maps, they could not so easily dispense with the necessity of dealing with autonomous and powerful Indigenous nations on the outskirts of their territories. In the first decades of the 19th century, the old, somewhat unpredictable system of imperial sovereignty contingent upon diplomatic and commercial relations with Indigenous people persisted even as the new territorial system based on diplomacy (and sometimes war) between empires was ascending. For example, when the United States achieved its independence from Britain in 1783, it acquired — on paper at least — an extensive territory between the Appalachians and the Mississippi River. In 1783, however, the borders spelled out in treaties remained less meaningful than commercial and diplomatic relations with Indigenous people. While the British formally ceded the trans-Appalachian region to the United States, they maintained for decades merchant outposts in what was nominally U.S. territory. The U.S. explorer Zebulon Pike encountered one such outpost on the Upper Mississippi River in January 1806: a North West Company trading post. Seeing “the flag of Great Britain” over the post in what was nominally U.S. territory, Pike wrote, “I felt indignant.” But there was little he could do to assert U.S. authority. 

More than just flying their flag in U.S. territory, the British, through their trade, retained the commercial and diplomatic allegiance of Indigenous people in the new U.S. Northwest Territory. When the United States and Britain went to war six years after Pike stumbled across the British trading post, most of the Indigenous people in the Northwest Territory sided with the British. To the south, the Spanish had seized Florida from Britain during the American Revolution; the Florida peninsula almost immediately became a haven for fugitive slaves from the United States. The Spanish, who also controlled New Orleans, periodically inconvenienced American merchants by closing the mouth of the Mississippi River to commercial travel.

Between 1803 and 1821, the United States acquired both Florida and New Orleans by treaty. The United States thus removed those territories from the control of an imperial competitor but in so doing took on an extensive territory where it struggled to establish its sovereignty. Understanding the early 19th-century United States as weak relative to Indigenous people, escaped slaves, and imperial competitors contradicts both the popular and the scholarly view of the United States in this period. Most historians of what the historian Arthur M. Schlesinger Jr. once called “the age of Jackson” depict U.S. expansion not only as inexorable but as one of the defining characteristics of the period. According to this view, the United States in the first half of the 19th century was like a seething boiler that could barely contain the outward economic and cultural pressures within it: a virulent, racist hatred of Indigenous people; an all-but-insatiable desire for land; a dynamic, profitable, and expanding slave-based plantation system; an explosive market economy; and a self-righteous American missionary Protestantism that saw itself as a reforming beacon to the world.

Pictorial map of the Great West, 1848. [David Rumsey Historical Map Collection]

Expansion was not a national consensus, and the expansionism that Andrew Jackson advocated was always a politically divisive and contested issue. In 1819, by a vote of 107–100, Jackson only narrowly escaped censure in the House of Representatives for his unauthorized attacks against Spanish outposts and British subjects during an invasion of Spanish Florida the previous year; in 1830, Jackson’s Indian Removal Act barely passed the House of Representatives, 101–97; in 1832, an anti-Jackson coalition won a majority of the Senate; and beginning in 1836 and lasting for the next nine years, Adams and his congressional allies successfully deterred Texas annexation. Adams was one of numerous elected leaders — many of them Northeasterners who eventually coalesced into the Whig Party — who advocated strengthening U.S. commerce, manufacturing, and infrastructure within existing U.S. boundaries rather than overstretching U.S. power by sprawling across the continent. Adams understood a reality about the U.S. position in North America that “manifest destiny” obscures: a relatively weak United States found itself engaged with powerful European imperial competitors, and even more powerful Indigenous nations, in a complicated struggle for sovereignty in several regions on its borders. Unable to simply impose its will, the U.S. often reached out into the borderlands through diplomacy or commerce. Manifest destiny was just one of many narrative visions for the borderlands; in the first decades of the 19th century, it was neither the dominant vision nor the most plausible.

Excerpt adapted from The Age of the Borderlands: Indians, Slaves, and the Limits of Manifest Destiny, 1790–1850 by Andrew C. Isenberg. Copyright © 2025 by the University of North Carolina Press.

Mutant Capitalism

In Neal Stephenson’s Snow Crash (1992), a novel that channeled perfectly the libertarian imagination of the post–Cold War moment, the territory once known as the United States has been shattered into privatized spaces: franchise nations, apartheid burbclaves, and franchulets, a world of what I have called “crack-up capitalism.” The threat in the plot is the Raft, a maritime assemblage several miles across: a decommissioned aircraft carrier lashed to an oil tanker and countless container ships, freight carriers, “pleasure craft, sampans, junks, dhows, dinghies, life rafts, houseboats, makeshift structures built on air-filled oil drums and slabs of styrofoam.” The Raft “orbits the Pacific clockwise” bearing a cargo of “Refus” or refugees, welcomed aboard by an entrepreneurial tech evangelist who has just cornered the global fiber optic grid and has schemes to subjugate the population through a computer virus administered as a bitmap narcotic. The Raft’s passengers are dehumanized and anonymized: a mass of insects “dipping its myriad oars into the Pacific, like ant legs” at whose arrival the coastal residents of California live in terror, subscribing to a “twenty-four-hour Raft Report” to know when the “latest contingent of 25,000 starving Eurasians has cut itself loose” to swim ashore.

Stephenson’s descriptions are stomach-turning, indulging in a grotesque racist imagery of nonwhite danger. The Raft was fodder for, as he wrote, “a hundred Hong Kong B-movies and blood-soaked Nipponese comic books.” As the race scientist and former National Review journalist Steve Sailer noted, the Raft also had an obvious antecedent: the “Last Chance Armada” of Jean Raspail’s 1973 novel, first published in French, The Camp of the Saints. In that book, a disabled messianic leader from the Calcutta slums loads millions of indigent Indians onto a lashed-together fleet of old ships to travel West “in a welter of dung and debauch.” The novel revels in what one scholar calls “pornographic prose” in its depiction of coprophagy, incest, and pedophilia aboard the armada. The plot ends in an orgy of violence after what the author sees as the suicidal embrace of the armada by the foreigner-friendly French population.

The first English translation of The Camp of the Saints was published by Scribner’s in 1975 to many positive reviews. The cover image showed a single Caucasian hand holding a globe up and away from grasping brown hands, with a catch line reading: “a chilling novel about the end of the white world.” The book returned to public discussion during the first successful presidential campaign of Donald Trump as an alleged inspiration to his advisers Steve Bannon and Stephen Miller, but it was already a common touchstone decades earlier. It was reissued in 1986 by the white supremacist Noontide Press and in 1987 by the American Immigration Control Foundation (AICF), which, along with the Federation for American Immigration Reform (FAIR), helped mainstream anti-immigrant arguments in part by piggybacking on the mailing lists of right-wing magazines to help seed a national movement.

In 1991, John Randolph Club (JRC) founding member Sam Francis described the book as “a kind of science fiction novel” that had become American reality. “The future is now,” he wrote. The vision of the maritime refugee tracked the evening news of the early 1990s. There were more than 30,000 interceptions of Haitians at sea in 1992 and nearly 40,000 Cubans in 1994; in 1993, the Golden Venture ran aground in Rockaway Beach carrying nearly 300 Chinese would-be migrants. Raspail’s novel “forecasts the recent landing of the Golden Venture,” as one letter to the Washington Times put it in 1993. The Social Contract Press reissue featured a photo of Chinese men wrapped in blankets after disembarking, with the vessel in the background. Introducing the novel, the nativist ideological entrepreneur and FAIR director John Tanton wrote that “the future has arrived,” citing the Golden Venture and other instances of maritime flight that had taken Raspail’s plot “out of a theorist’s realm and transposed it into real life.” “Fiction can be more powerful than fact,” wrote JRC member and American Renaissance founder Jared Taylor in a review of The Camp of the Saints. “The novel,” he wrote, “is a call to all whites to rekindle their sense of race, love of culture, and pride in history for he knows that without them we will disappear.”

The Camp of the Saints had a special place in the paleo imagination. Ahead of the first JRC meeting, the Ludwig von Mises Institute’s Lew Rockwell claimed partial credit for the book’s original 1975 circulation in the United States. In his talk “Decomposing the Nation-State” at the Mont Pelerin Society in 1993, Rothbard wrote that he had previously dismissed the novel’s vision, but “as cultural and welfare-state problems have intensified, it became impossible to dismiss Raspail’s concerns any longer.” He referred to his proposal of privatizing all land and infrastructure discussed in the last chapter as a solution to the “Camp of the Saints problem.” When the JRC met in Chicago in December 1992, the conference was titled “Bosnia, USA” and Hans-Hermann Hoppe spoke in the lead-off session named after The Camp of the Saints.

The year between the first and second meeting of the JRC had been momentous. The Los Angeles riots in April, Buchanan’s run for president, and Rothbard’s proposal of a strategy of right-wing populism made 1992 look like, in the words of author John Ganz, “the year the clock broke.” Another notable event was the publication of an article in National Review by the scheduled keynote speaker at the club: the journalist Peter Brimelow, a naturalized U.S. citizen born in England in 1947. When the article was published as a book by Random House in 1995 with thanks given to Rockwell and Jeffrey Tucker at the Ludwig von Mises Institute (as well as his agent Andrew Wylie), Alien Nation was described as a “non-fiction horror story of a nation that is willfully but blindly pursuing a course of suicide.” Historian Aristide Zolberg writes that the book “marked the ascent to respectability of an explicitly white supremacist position … that had hitherto been confined in the United States to shadowy groups.” Alien Nation came in the immediate wake of the passage of Proposition 187 in California, blocking access to education and health services for undocumented immigrants, one of the earliest instances of local governments “trying to retake immigration control into their own hands.” “No writer has argued more effectively for this change of policy than Peter Brimelow,” wrote Brimelow’s former colleague at Forbes, David Frum. “No reformer can avoid grappling with [his] formidable work.”

In 1999, Brimelow took his project online — “fortunately the Internet came along,” as he put it later — founding the website VDARE.com, named after Virginia Dare, the first English child born in North America. Serving as what the Washington Post called a “platform for white nationalism,” the website has hosted prominent advocates of scientific racism like Jared Taylor, J. Philippe Rushton, and Steve Sailer as well as alt-right activists Richard Spencer and Jason Kessler.

The website is an amplifier for the themes and tropes of the Far Right: a search yields more than 20,000 posts with the term “white genocide,” more than 13,000 with “race realism,” and 6,000 with “Great Replacement.” Brimelow also has ties to more mainstream figures in the United States. In 2018 he was hosted at the home of Larry Kudlow, then-President Donald Trump’s economic adviser, while holding a role at Fox that reported directly to Rupert Murdoch. Brimelow became, in effect, Jean Raspail’s spokesperson for the 1990s and 2000s.

 

Where does the resurgence of the Far Right come from? Scholars attempting to explain how apparently fringe political ideologies have moved to center stage since the election of Trump in 2016 have split into two camps. The first locates the origins of the Far Right in culture: racism, chauvinism, xenophobia, the “tribalism” of “white identity politics,” or a longing for “eternity.” As a group, these commentators seem to ignore the admonition from Frankfurt School sociologist Max Horkheimer, repeated so often that it threatens to become a cliché: “whoever is not willing to talk about capitalism should also keep quiet about fascism.”

Capitalism can be hard to find in this literature. A recent book on “the far right today” does not mention the term once. Other books on the alt-right and white power movement barely mention it; one does so only to say that the alt-right is “skeptical of global capitalism.” References to “identity” outnumber “capitalism” at a ratio of several dozen to one. The assumption seems to be that Far Right ideology is either post- or pre-material: it inhabits a space of culture detached from issues of production and distribution. This is startling given that the radical Right’s central issue is nonwhite immigration, an eminently economic issue with a vast specialized literature.

By contrast, the second school of interpretation finds the origins of the Far Right in the spirit of capitalism itself. These scholars see the Far Right not as a rejection of neoliberalism but as a mutant form of it, shedding certain features, like a commitment to multilateral trade governance or the virtues of outsourcing, while doubling down on Social Darwinist principles of struggle in the market translated through hierarchical categories of race, nationality, and gender. Brimelow’s work helps us see how the Far Right understands the nation as both a racial and an economic asset.

Brimelow is described variously as a “white nationalist,” “restrictionist,” or “Alt Right figurehead.” Yet he is almost never described the way he described himself: as a libertarian conservative or even a “libertarian ideologue.” It is rarely, if ever, noted that he was a fixture in the standard networks of neoliberal intellectuals seeking to rebuild the foundations of postwar capitalism. He spoke at a Mont Pelerin Society (MPS) regional meeting in Vancouver in 1983 alongside Margaret Thatcher’s speechwriter and later National Review editor John O’Sullivan. Brimelow’s interviews and lengthier features in Forbes in the late 1980s and 1990s drew almost exclusively from the MPS roster. This included profiles and interviews with Thomas Sowell (twice), Peter Bauer, Milton Friedman (twice for Forbes and twice for Fortune), and Murray Rothbard. His longer features were built around the research of Gordon Tullock, Hayek, Friedman, and MPS member Lawrence White. He wrote a glowing review of Milton and Rose Friedman’s memoirs, recounting Milton’s first trip overseas, to the inaugural MPS meeting, and praising the couple’s contributions to “the free-market revolution in economics that has overthrown the statist-Keynesian-socialist consensus.”

To describe Brimelow as nativist and white nationalist may be correct, but it threatens to banish his concerns from the domain of the rational and the economic. In fact, he was a typical member of a transnational milieu linking Thatcherite intellectuals taking their own version of a cultural turn around the Institute of Economic Affairs’ Social Affairs Unit with social scientists like Charles Murray and Richard J. Herrnstein concocting theories linking race, intelligence, and economic capacity as well as neoconservatives from the United States to Singapore to Japan rediscovering the relevance of “Asian values” for capitalist success. For the new fusionists of the free-market Right, the economic was not a pristine space quarantined from matters of biology, culture, tradition, and race. Rather, these thought worlds overlapped and melded with one another.

Brimelow’s first book was not about politics or race. It was called The Wall Street Gurus: How You Can Profit from Investment Newsletters, marketed alongside books like The Warning: The Coming Great Crash in the Stock Market and Wall Street Insiders: How You Can Watch Them and Profit. For Brimelow, as for the authors of those newsletters, investment was simultaneously a strategy for making money, leveraging symbolism, and accruing influence. We can understand his turn to whiteness as the outcome of a portfolio analysis. The nation was a safe asset. The pro-white play looked like a payday.

Excerpt adapted from Hayek’s Bastards: Race, Gold, IQ, and the Capitalism of the Far Right by Quinn Slobodian. Copyright © 2025 by Quinn Slobodian. Published by Zone Books.

“The End Is Coming! The End Is Coming!”

If you were a child during the late 1990s, there’s a good chance you either owned Beanie Babies or your parents went crazy over them. Beanies featured simple designs that inspired complex madness. A mixture of polyester, synthetic plush, and plastic pellets, Beanies were ordinary-looking children’s playthings that took the world by storm. Throughout the mid- to late 1990s, they became ubiquitous in American homes and fostered a community of collectors. By the end of the decade, the craze went haywire. In April 1998, a police department near Chicago removed weapons from the streets by offering a buyback program where people could exchange their guns for Beanie Babies. A few months later, in a foreshadowing of the inanity of contemporary politics, U.S. trade representative Charlene Barshefsky sparked controversy when Customs forced her to turn over a few dozen Beanie Babies she purchased while traveling to China with President Bill Clinton. “Instead of trying to reduce our $50 billion trade deficit with China,” stated Republican National Committee chairman Jim Nicholson, “our trade representative was scouring the street markets of Beijing grabbing up every illegal, black market ‘Beanie Baby’ she could get her hands on.” Citing “a source close to the White House delegation,” the Washington Post reported that Barshefsky turned over 40 Chinese Beanies.

Beanie Babies came with a red heart-shaped tag with the word “Ty” printed in large white letters. Ty is an homage to Ty Warner, who created the Beanies in 1993. His company Ty Inc. grew from a modest upstart near Chicago with a handful of employees into a company running a 370,000-square-foot warehouse and generating $1.4 billion in sales by 1998.

Looking at a Beanie Baby would give the impression that it was a child’s toy. But what drove the revenue of Ty Inc. wasn’t parents buying toys for their children to play with. It was adults buying Beanies for themselves and treating them like financial instruments. The CD-ROM Ultimate Collector for Beanie Babies made a splash at the video game industry’s largest trade event in 1999. For $25, consumers received software helping them organize their collection and track price changes. Ultimate Collector featured tax summaries and insurance reports. “It was no longer a child’s toy,” said one collector when recalling why she accumulated Beanies throughout the late ’90s. “It was the hunt for me.”

Despite selling millions of stuffed animals, the Ty company convinced consumers that Beanies were in short supply. In the late ’90s, every few months a particular Beanie Baby would be “retired,” which drove prices up. Online Beanie reselling became so common that when eBay filed paperwork with the U.S. Securities and Exchange Commission to become a publicly traded company in 1998, it cited the volatility of Beanie Baby sales as a risk factor to the company’s financial health. During the second fiscal quarter in 1998, eBay had “over 30,000 simultaneous auctions listed in its ‘Beanie Babies’ category,” the company stated. “A decline in the popularity of, or demand for, certain collectibles or other items sold through the eBay service could reduce the overall volume of transactions on the eBay service, resulting in reduced revenues. In addition, certain consumer ‘fads’ may temporarily inflate the volume of certain types of items listed on the eBay service, placing a significant strain upon the Company’s infrastructure and transaction capacity.” eBay was correct to be cautious about a fad. Beanie sales accounted for a tenth of its total revenues, but the gravy train would not last.

There was enough interest in this market that Beanie Baby price guides were produced to forecast Beanie resale values the same way investment banking analysts make stock predictions. At the peak of the craze, there were more than a hundred Beanie Baby price guides, often published by collectors who had large investments in Beanie Babies. Unlike impartial observers, these collectors had a personal interest in hyping prices rather than urging caution. At a time when Beanies were reselling for thousands of dollars, one price guide predicted they would appreciate another 8,000% over the following decade. Enough people acted on price guide recommendations that, for a brief time, their outrageous predictions became self-fulfilling until the bubble popped.

Self Portrait with Toys, by Viola Frey, 1981. [Smithsonian American Art Museum]

Retiring Beanies initially drove up resales. Then Ty Inc. tried a more aggressive tactic. “For years, nothing has been hotter than those cuddly little animals with cute little names,” stated CBS Evening News correspondent Anthony Mason during a September 1999 segment. “But abruptly this week, the makers of Beanie Babies, Ty Incorporated, announced over the internet that it was over … By the turn of the century, Beanie Babies will become has beans.” The ushering in of the new millennium would include Y2K and Beanie Baby production stoppages. Beanie Baby web forums were full of apocalyptic posts with titles like “The End Is Coming! The End Is Coming!”

Prices did not rise as Ty hoped. More people came to the realization that stuffed animals that originally sold for $5 should not be reselling for $5,000. That people who banked their life savings on children’s collectibles were playing a fool’s game. That there was no value in investing in a mass-produced product whose entire worth is premised on speculation. Beanie collectors were enamored with old toys that sold for high prices. But when old toys become valuable it is because most toys get beat up when children play with them, so finding a toy in mint condition 30 years after it came out is a rarity that drives prices up. 

Beanies were collected by thousands of adults and stored in glass cases. They were produced in such high volume that the supply outstripped the demand. Finding an old mint-condition Beanie Baby is about as rare as finding a pothole on a city street. 

People who spent thousands of dollars chasing the Beanie fad had closets full of merchandise that was worth less than its original retail price. A soap opera actor who spent $100,000 on Beanies as an investment in his children’s college education saw his investment evaporate. That might sound laughable, but the actor’s son made a haunting documentary about the saga that displayed the humanity driving the craze and its unfortunate consequences. Collectors were left with credit card debt they couldn’t pay off and regret they couldn’t shake.

By 2004, Ty Warner claimed a loss of nearly $40 million on his tax return. In 2014, Warner was convicted of tax evasion. (Warner loathed all taxes, including road tolls. According to Zac Bissonnette, Warner instructed people he was riding with to “just throw pennies and keep driving! It’s an illegal tax!”) Like so many other rich men found guilty of serious crimes, he avoided jail time and remains wealthy. 

Excerpt adapted from 1999: The Year Low Culture Conquered America and Kickstarted Our Bizarre Times, available for preorder from University Press of Kansas. Copyright © 2025 by Ross Benes.

Telling Chestnut Stories

At one time, more than 4 billion American chestnut trees spread from southern Canada all the way to Mississippi and Alabama. While it was rarely the dominant tree in the forests, this giant of the eastern woodlands was hard to miss. It could stand over one hundred feet tall, the trunks straight and true.

To those who lived in the eastern United States, especially Appalachia, the tree was invaluable. It provided food for both people and animals and wood for cabins and fence posts. In cash-poor regions, the tree could even put some money in people’s pockets when they sold nuts to brokers to take to the cities of the Northeast. Some joked that the tree could take you from the cradle to the grave, as the wood was used to make both furniture for babies and caskets. It was certainly “the most useful tree.”

In the early 20th century, however, the chestnut came under serious threat. A fungus, first identified in New York in 1904, began killing chestnut trees. It quickly spread south across Appalachia, resulting in the almost complete loss of the species by the mid-20th century. This loss had enormous ecological impacts on the forests of the eastern United States and contributed to a decline in the population of many species of wildlife, including turkeys, bears, and squirrels.

Today, while millions of American chestnut sprouts remain in the forests of the East, almost all the large trees, as well as most of the people who remember the trees’ dominant place in the forest ecosystem, are gone.

Since 1983, the American Chestnut Foundation (TACF) has taken the lead in restoring the American chestnut. While scientists coordinate the project, volunteers play an important part in planting and caring for trees in TACF test orchards. One of the tools TACF uses to connect with volunteers is chestnut stories. In TACF’s publication, originally published as the Journal of the American Chestnut Foundation and known since 2015 as Chestnut: The New Journal of the American Chestnut Foundation, chestnut stories can take many forms — oral histories, essays, poems — but they all document the relationship between humankind and the American chestnut tree. Chestnut stories serve an important purpose: reminding people of the value of the species and the many ways people used the tree before its decline. In documenting the story of the American chestnut through the journal, in sharing and interpreting this story, and in using it to mobilize volunteers and resources, TACF has demonstrated the value that practices rooted in the field of public history and the study of memory can bring to the realm of environmental science. Public historians are well aware of the power that narrative has in prompting action and encouraging people to rethink the status quo. The chestnut stories documented by TACF help create a historical narrative and also serve as a justification for the reintroduction of the species into the modern landscape.

As we deal with the long-term consequences of climate change, the emergence of new diseases, and the loss of habitat, the work of TACF can, perhaps, provide a road map for other organizations to employ science, technology, public history practices, and memories to mobilize people to solve environmental challenges.

 

While it is difficult to pinpoint the exact moment when the fungus that devastated the American chestnut arrived in North America, it is possible to date when and where someone first noticed its effects. In 1904, in the Bronx Zoological Park in New York City, employee H.W. Merkel noticed that large patches of bark on the park’s chestnut trees were dying and that the overall health of the trees appeared to be deteriorating. Dr. W.A. Murrill, who worked for the New York Botanical Garden, was called in to study the affected trees. He identified the cause: a new fungus, which he called Diaporthe parasitica (the name was changed in 1978 to Cryphonectria parasitica). The trees in the Bronx Zoo were highly unlikely to have been the first infected by the fungus, which had entered the port of New York from Asia; more likely, the zoo was simply the first place where someone paid enough attention to notice.

The “chestnut blight,” as it was commonly called, spread quickly, infecting trees in other locations in New York, as well as in New Jersey and Connecticut. Scientists studying the blight, such as Dr. Haven Metcalf and Dr. J.F. Collins, published bulletins about the disease, which contained recommendations for how to slow the spread. These recommendations included checking nurseries for blighted trees and quarantining those suffering from the blight, creating blight-free zones where chestnuts were removed in the hope that the blight’s progress would be stopped if there were no chestnut trees, and performing surgery to remove cankers from infected trees. Unfortunately, the advice they gave did not stop the blight, and it began pushing farther south. In Pennsylvania, the Chestnut Blight Commission had permission to enter private property and remove trees infected with or threatened by the blight. In all, the commission spent over $500,000 to fight the blight. But, again, its efforts did not halt the spread. The blight reached West Virginia by 1911, pushing into the heart of Appalachia, where the tree had an important place in the lives of mountain communities. Combined with ink disease, which had been attacking chestnut trees in the southern end of the tree’s range since the 19th century, the blight caused widespread devastation. By 1950, 80% of the American chestnut trees were gone. In all, the upper portion of over 3.5 billion trees died, the equivalent of approximately 9 million acres of forest. The root structure of many trees, however, did not die, and stump sprouts continue to emerge from root systems today, well over a century later. Unfortunately, before they are able to grow very large, these stump sprouts become infected by the blight and die back. So today, while millions of stump sprouts do exist, few mature trees are left.

Chestnut blight, 2009. Photograph by Daderot. [Wikimedia Commons]

TACF eventually took the lead in chestnut restoration efforts. TACF began formally documenting its progress in 1985 with the publication of the Journal of the American Chestnut Foundation, published as Chestnut: The New Journal of the American Chestnut Foundation since 2015. In the first edition, then-editor Donald Willeke lays out the mission of the journal: “We hope that it will be both a scientific journal and a means of communicating news and developments about the American Chestnut to dedicated non-scientists (such as the lawyer who is writing this Introduction) who care about trees in general, and the American Chestnut in particular, and wish to see it resume its place as the crowning glory of the American deciduous woodlands.” Over the years, the journal has moved from a volunteer publication released once or twice a year (depending on the year and on capacity) to a glossy, professional magazine released three times a year.

In the journal, the progress of the backcross breeding program is broken down into terms nonscientists can understand. The journal, however, is not only about the science behind the restoration effort. One of the most significant sections of the journal in its early years was the “Memories” section, which documented “chestnut stories.” While many of the memories included in the journal came to TACF unsolicited, TACF also recognized the importance of documenting people’s chestnut stories in a more organized fashion. In 2003, Elizabeth Daniels, then the membership director, wrote about the American Chestnut Oral History project, which aimed to preserve chestnut stories for future generations. In the spring of 2006, Jeanne Coleman, then the editor, let readers know she was interested in gathering chestnut stories. Stories came pouring in, and as Coleman says in the fall 2006 issue, “These stories are heartwarming [and] often funny.” Today, in essence, the journal itself acts as the archive of TACF’s chestnut stories, preserving and sharing them simultaneously.

Untitled (Squirrels in a Chestnut Tree), by Susan Catherine Moore Waters, c. 1875. [The Metropolitan Museum of Art]

A review of all 79 issues of TACF’s journal available online as of January 2022, along with other sources, makes the significance of the chestnut stories quite clear. The work of the scientists engaged in backcross breeding and genetic modification is essential to the restoration of the chestnut. But the success of TACF also has come from thousands of members and volunteers who have supported the work of the scientists. From the beginning, TACF understood the importance of engaging with people outside traditional scientific research circles to accomplish restoration.

People who mourned the past also supported work to bring about a future where the chestnut once again plays an important role in the ecosystems of the eastern woodlands. TACF members have been, in the words of scientist Philip Rutter from the University of Minnesota, “trying to do something about the problem rather than just lament the loss,” which certainly challenges the argument that nostalgia can reduce the ability to act in the present.

While maybe not quite as tall or as wide as remembered in chestnut stories, the American chestnut tree occupied a significant place in the forest and in the lives of those who lived under its spreading branches—and it is certainly worthy of the work to restore it. Chestnut stories document this significant place chestnuts held in the forest ecosystem, and the sharing of the stories reminds people of the value the tree brought to Americans before the blight destroyed it. In an interview with Charles A. Miller, William Good remembers how farmers fattened their hogs off chestnuts: “In the fall, because people didn’t raise corn to feed their hogs, farmers would let them run in the mountain where they fattened up on chestnuts. The hogs would have to eat through the big burs on the outside to get the nut out. . . . The hogs must have liked the nuts so much they would chew through them until their mouths were bleeding.”

In an interview that appears in the 1980 folklore collection Foxfire 6 and is reprinted in TACF’s journal, Noel Moore recollects that people in the Appalachians did not build fences to keep their stock in; they instead fenced around their homes and fields to keep out the free-range stock wandering the woods. Each fall, farmers would round up and butcher their hogs that had grown fat on acorns and chestnuts. Chestnuts also served as an important food source for wild animals, including turkeys, black bears, white-tailed deer, gray and fox squirrels, and the now extinct chestnut moth. These animals, in turn, fed those who hunted them. Chestnuts also were an important food source for people. Myra McAllister, who grew up in Virginia, recalls that she liked chestnuts “raw, boiled, and roasted by an open fire.” Cecil Clink, who grew up in North Bridgewater, Pennsylvania, remembers filling old flour sacks with the nuts, which his mother would store in the family’s “butry,” or buttery, “with the smoked hams. . . . [They ate] the nuts boiled, or put them on the cook stove and roast them.” Other people made stuffing and traditional Cherokee bread out of the nuts, though they could not grind the nuts into flour because they were too soft; the nuts had to be pounded by hand into flour instead. And it was not just the nuts themselves that people loved. Noel Moore remembers the taste of the honey that bees made from the chestnut blossoms. The leaves also had medicinal uses: the Mohegans made tea to treat rheumatism and colds, and the Cherokee made cough syrup.

Some stories recall the act of selling chestnuts, which gave many families cash that they might otherwise not have had—likely making it a memorable moment. In Where There Are Mountains, Donald Davis shares the chestnut story of John McCaulley, who as a young man had gathered chestnuts for sale. The nuts he gathered sold for four dollars a bushel in Knoxville, Tennessee—and McCaulley could gather up to seven bushels a day. Jake Waldroop remembers seeing wagons loaded with chestnuts in his northeast Georgia mountain community. The wagons headed to “Tocca, Lavonia, Athens, all the way to Atlanta, Georgia.” Noel Moore recalls seeing families coming from the mountains and heading to the store in the fall, laden with bags of chestnuts. They traded the bags for “coffee and sugar and flour and things that they had to buy to live on through the winter.” Exchanging chestnuts for supplies or cash was “much less risky than moonshining.”

Chestnutting, by Winslow Homer, 1870. [Wikimedia Commons]

To the north in Vittoria, Ontario, Donald McCall, whose father owned a store in the village, recollects that “farmers counted on the money from their chestnuts to pay taxes on the farm.” The trees themselves also had value. William B. Wood recalls that his father tried to save the family farm during the Great Depression by felling and selling the wood of a huge chestnut tree dead from the blight that Wood calls the “Chestnut Ghost.” Unfortunately, the plan did not work, and the family lost their farm.

Other stories connect to the “usefulness” of the tree. Because chestnut wood is “even grained and durable . . . light in weight and easily worked,” the tree was used for a wide variety of purposes. Georgia Miller, who was 101 when she wrote “Chestnuts before the Blight,” recalls the split rail fences that lined the edges of pastures. The chestnut wood split easily and lasted longer than that of other species, making it a good material for what some called “snake fences.” Daniel Hallett, born in New Jersey in 1911, says his family used chestnut to trim doors and windows and also for chair rails in the houses they built. Dr. Edwin Flinn’s story (told by Dr. William Lord in 2014), which focuses on the extraction of tannins from dead chestnut trees, shows that the tree remained valuable even after the blight struck.

Chestnut stories recount more than memories of the tree’s usefulness or the role it played in Indigenous and rural economies. Many stories have documented how an encounter with the tree mobilized someone toward engagement with the restoration effort, demonstrating that chestnut stories can provide a pathway to a wider recognition of the natural world. Patrick Chamberlin describes such an experience in “A Practical Way for the Layman to Participate in Breeding Resistance into the American Chestnut.” Chamberlin tells readers how his grandmother used to reminisce about the tree when he was a young boy and then how he came across a burr from an American chestnut while in high school. He started looking for trees as he explored the woods on his parents’ farm. Eventually, while wandering near the old homestead site where his grandmother grew up, he came across two flowering chestnut trees. Returning later in the season, he found nuts from the trees. Through this experience, Chamberlin became involved in the backcross breeding program—and, at the end of his essay, encourages others to do the same. A chestnut story from the distant reaches of his youth started him on his journey, and science helped him continue his work into the present. Fred Hebard, who directed TACF’s Meadowview Research Farms for twenty-six years, saw his first American chestnut sprout while helping round up an escaped cow with a farmer he worked for. After the farmer told him the story of the American chestnut, Hebard changed his college major, earned a PhD in plant pathology, and began researching the chestnut. It became his life’s work.

Reprinted from Branching Out: The Public History of Trees, published by the University of Massachusetts Press. Copyright © 2025 by the University of Massachusetts Press.

Scared Out of the Community

Our featured weekly excerpts usually spotlight new history titles, but sometimes the news of the day makes returning to past scholarship, responding to different times and looking at the past from different contexts, a useful endeavor. This is the first entry in a series we hope to revisit from time to time, excerpting books from previous decades in order to bring the history they document to new audiences. Below is an excerpt adapted from Unwanted Mexican Americans in the Great Depression: Repatriation Pressures, 1929–1939, by Abraham Hoffman, published in 1974 by the University of Arizona Press. You can read the entire book online here.

The old man entered the circular park, looked around, and sat down on one of the many benches placed there for the use of the town’s citizens. Several hundred people, mostly men, were also in the park, enjoying the afternoon sun. Sitting in the park enabled the old man to forget the reason that had brought him there. The deepening economic depression had cost him his job, and work was hard to find.

A sudden commotion startled the old man out of his reverie. Without warning, uniformed policemen surrounded the park, blocking all exits. A voice filled with authority ordered everyone to remain where he was. While the policemen guarded the exits, government agents methodically quizzed each of the frightened people, demanding identification papers, documents, or passports. With shaking hands the old man produced a dog-eared, yellowed visa. Only the other day, he had considered throwing it away. After all, he had entered the country so long ago…

The agent inspected the papers and barked several questions at the old man. Haltingly, he answered as best he could, for despite his years of residence in the country he had learned the language only imperfectly. With a nod of approval, the officer returned the papers. The old man sat down again; a sense of relief washed over him.

The agents continued their interrogation, and after about an hour everyone in the park had been checked and cleared. Or almost everyone. Seventeen men were placed in cars and taken away. The inspection over, the policemen left the park to the people. Few cared to remain, however, and in a few moments the place was deserted.

 

The time was 1931; the place, Los Angeles, California, in the city’s downtown plaza. The government agents were officers in the Department of Labor’s Bureau of Immigration, assisted by local policemen. Their goal was the apprehension of aliens who had entered the United States illegally.

Unlike many post-World War II aliens who knowingly entered in violation of immigration laws, immigrants prior to the Great Depression entered the United States at a time when the government’s views on immigration were in flux, moving from unrestricted entry to severe restriction. Many aliens found themselves confused by the tightening noose of regulations; one immigrant might enter with one law in effect, but his younger brother, coming to the country a few years later, might find new rules — or new interpretations of old rules — impeding his entrance.

With the onset of the depression, pressure mounted to remove aliens from the relief rolls and, almost paradoxically, from the jobs they were said to hold at the expense of American citizens. In the Southwest, immigration service officers searched for Mexican immigrants, while local welfare agencies sought to lighten their relief load by urging Mexican indigents to volunteer for repatriation. The most ambitious of these repatriation programs was organized in Los Angeles County, an area with the largest concentration of Mexicans outside of Mexico City.

Not all of the repatriates, however, departed solely under pressure from the Anglos. Many Mexicans who had achieved varying degrees of financial success decided on their own to return to Mexico, taking with them the automobiles, clothing, radios, and other material possessions they had accumulated. The Mexican government, vacillating between the desire to lure these people home and the fear that their arrival would add to an already existing labor surplus, sporadically launched land reform programs designed for repatriados. Between 1929 and 1939 approximately half a million Mexicans left the United States. Many of the departing families included American-born children to whom Mexico, not the United States, was the foreign land.

The peak month in which Mexicans recrossed the border was November 1931, and in all subsequent months the figures generally declined. Yet it was after this date that the number of cities shipping out Mexican families increased. Even after the massive federal relief programs of the New Deal were begun in 1933, cities such as Los Angeles, Chicago, and Detroit still attempted to persuade indigent Mexicans to leave.

With the start of World War II, Mexican immigration was renewed, when the United States and Mexico concluded an agreement to permit braceros to enter the United States. A system of permits and visas for varying periods testifies to the evolution of border regulations; their abuse and misuse bear witness to the difficulties of making such a system coherent. 

No other locality matched the county of Los Angeles in its ambitious efforts to rid itself of the Mexican immigrant during the depression years. By defining people along cultural instead of national lines, county officials deprived American children of Mexican descent of rights guaranteed them by the Constitution. On the federal level, no other region in the country received as much attention from immigration officials as Southern California. Because of the tremendous growth of this region after World War II, Southern California’s service as a locus for deportation and repatriation of Mexican immigrants is little remembered. To the Mexican-American community, however, repatriation is a painful memory. 

 

In 1931, various elements in Los Angeles had indicated support for the idea of restricting jobs on public works projects to American citizens. Motions were presented and passed by the Los Angeles city council and the county board of supervisors, while the Independent Order of Veterans of Los Angeles called for the deportation of illegal aliens as a means of aiding jobless relief.

The board of supervisors went so far as to endorse legislation pending in Congress and in the state legislature, which would bar aliens who had entered the country illegally from “establishing a residence, holding a position, or engaging in any form of business.” Supervisor John R. Quinn believed that such legislation would provide a sort of cure-all for all problems generated by illegal aliens, whom he believed numbered “between 200,000 and 400,000 in California alone.” Said Quinn in two remarkably all-inclusive sentences:

If we were rid of the aliens who have entered this country illegally since 1931 ... our present unemployment problem would shrink to the proportions of a relatively unimportant flat spot in business. In ridding ourselves of the criminally undesirable alien we will put an end to a large part of our crime and law enforcement problem, probably saving many good American lives and certainly millions of dollars for law enforcement against people who have no business in this country.

Quinn also believed the “Red problem” would disappear with the deportation of these aliens. 

It was in this atmosphere that Charles P. Visel, head of the Los Angeles Citizen’s Committee on Coordination of Unemployment Relief, published a press release in city newspapers. The statement announced a deportation campaign and stressed that help from adjoining districts would be given the local office of the Bureau of Immigration. Each newspaper printed the text as it saw fit, so that while one newspaper printed sections of it verbatim, another summarized and paraphrased. Certain embellishments were added. “Aliens who are deportable will save themselves trouble and expense,” suggested the Los Angeles Illustrated Daily News on January 26, 1931, “by arranging their departure at once.” On that same day, the Examiner, a Hearst paper, announced, without going into any qualifying details, that “Deportable aliens include Mexicans, Japanese, Chinese, and others.”

As the days passed, follow-up stories and editorials kept the public aware of the project. The Express two days later editorially endorsed restrictionist legislation and called for compulsory alien registration. On January 29, the Times quoted Visel, who urged “all nondeportable aliens who are without credentials or who have not registered to register at once, as those having papers will save themselves a great deal of annoyance and trouble in the very near future. This is a constructive suggestion.” Visel himself gave the newspapers word of the impending arrival of the special agents from Washington, DC, and other immigration districts.

La Opinión, the leading Spanish-language newspaper in Los Angeles, published an extensive article on January 29. With a major headline spread across page one, the newspaper quoted from Visel’s release and from the versions of it given by the Times and the Illustrated Daily News. La Opinión’s article pointedly stressed that the deportation campaign was being aimed primarily at those of Mexican nationality.

 

Commencing February 3, Supervisor William F. Watkins of the Bureau of Immigration and his men, with the assistance of police and deputies, began ferreting out aliens in Los Angeles. By Saturday, 35 deportable aliens had been apprehended. Of this number, eight were immediately returned to Mexico by the “voluntary departure” method, while others chose the same procedure in preference to undergoing a formal hearing. Several aliens were held for formal deportation on charges of violating the criminal, immoral, or undesirable class provisions of the immigration laws. Five additional immigration inspectors arrived to provide assistance, and five more were shortly expected.

On Friday the 13th, with the assistance of 13 sheriff’s deputies led by Captain William J. Bright of the sheriff’s homicide detail, the immigration agents staged a raid in the El Monte area. This action was given prominence in the Sunday editions of the Times and the Examiner. Watkins wrote to Robe Carl White, assistant labor secretary, that such coverage was “unfortunate from our standpoint,” because the impression was given by the articles that every district in Los Angeles County known to have aliens living there would be investigated. “Our attitude in regard to publicity was made known to the authorities working with us in this matter,” Watkins complained, “but somehow the information found its way into the papers.”

Considering the announcements from Walter E. Carr, the Los Angeles district director of immigration, that no ethnic group was being singled out and that only aliens with criminal records were the primary interest of the Bureau of Immigration, the aliens captured in the Friday the 13th raid could only have made the Mexican community wary of official statements. Three hundred people were stopped and questioned: from this number, the immigration agents jailed 13, and 12 of them were Mexicans. The Examiner conveniently supplied the public with the names, ages, occupations, birth places, years in the United States, and years or months in Los Angeles County, while the Times was content just to supply the names.

While generalizations about the people stopped, questioned, and occasionally detained are impossible, the assertions that all the aliens either held jobs that belonged to citizens or were criminals did not apply to these arrested suspects. Of the twelve Mexicans arrested, the most recent arrival in the United States had come eight months earlier, while three had been in the United States at least seven years, one for thirteen years, and another was classified as an “American-born Mexican,” a term which carried no clear meaning, inasmuch as the charge against the suspects was illegal entry. Eleven of the twelve gave their occupation as laborer; the twelfth said he was a waiter.

 

As Watkins pursued the search for deportable aliens, he observed that the job became progressively more difficult:

After the first few roundups of aliens ... there was noticeable falling off in the number of contrabands apprehended. The newspaper publicity which attended our efforts and the word which passed between the alien groups undoubtedly caused great numbers of them to seek concealment.

After several forays into East Los Angeles, the agents found the streets deserted, with local merchants complaining that the investigations were bad for business. In the rural sections of the county surveyed by Watkins’ men, whole families disappeared from sight. Watkins also began to appreciate the extent of Southern California’s residential sprawl. He observed that the Belvedere section, according to the 1930 census, might hold as many as 60,000 Mexicans.

The Mexican and other ethnic communities were not about to take the investigations passively. La Opinión railed at officials for the raids, while ethnic brotherhood associations gave advice and assistance. A meeting of over one hundred Mexican and Mexican American businessmen on the evening of February 16 resulted in the organization of the Mexican Chamber of Commerce in Los Angeles and a pledge to carry complaints about the treatment of Mexican nationals to both Mexico City and Washington, DC. Mexican merchants in Los Angeles, who catered to the trade of their ethnic group, felt that their business had been adversely affected, since Mexicans living in outlying areas now hesitated to come into Los Angeles for fear of harassment. Sheriff William Traeger's deputies in particular were criticized for rounding up Mexicans in large groups and taking them to jail without checking whether anyone in the group had a passport or proof of entry.

Mexican consul Rafael de la Colina had been working tirelessly on behalf of destitute Mexicans in need of aid or desiring repatriation. Much of his time was occupied with meeting immigration officials, who kept assuring him that Mexicans were not being singled out for deportation. He also warned against unscrupulous individuals who were taking advantage of Mexican nationals by soliciting funds for charity and issuing bogus affidavits to Mexicans who had lost their papers.

The Japanese community also expressed its hostility to the immigration officials. When several agents stopped to investigate some suspected Japanese aliens, the owner of the ranch employing them threatened to shoot one inspector "if he had a gun." Japanese residents obstinately refused to answer any questions, and Watkins believed that the Japanese had retained an attorney for the purpose of circumventing the immigration laws.

Despite the adverse reaction to and public knowledge of the drive on aliens, Watkins persisted. “I am fully convinced that there is an extensive field here for deportation work and as we can gradually absorb same it is expected [sic] to ask for additional help,” he stated. Responding to the charges of dragnet methods, he notified his superiors in Washington:

I have tried to be extremely careful to avoid the holding of aliens by or for this Service who are not deportable and to this end it is our endeavor to immediately release at the local point of investigation any alien who is not found to be deportable as soon as his examination is completed.

 

On February 21, 1931, Watkins wrote to White, and the following month to Visel, that 230 aliens had been deported in formal proceedings, of whom 110 were Mexican nationals, and that 159 additional Mexican aliens had chosen the voluntary departure option to return to Mexico.

These figures revealed that roughly seven out of ten persons removed in the Southern California antialien drive were Mexicans: of the 389 aliens expelled, 269 (110 formal deportations plus 159 voluntary departures) were Mexican nationals. By the supervisor's own admission, capturing those 389 aliens had required rounding up and questioning somewhere between 3,000 and 4,000 people, truly a monumental task.

The effect of the drive on the Mexican community was traumatic. Many of the aliens apprehended had done nothing more than fail to regularize an illegal entry made years before; to call them criminals is to misapply the term. The pressure that the deportation campaign put on the Mexican community contributed significantly to the huge repatriation from Los Angeles that followed the antialien drive. But this seemed of little concern to the head of the Citizens Committee on Coordination of Unemployment Relief. By the third week in March, an exuberant Visel could write to Labor Secretary William N. Doak:

Six weeks have elapsed since we have received ... Mr. Watkins, in reply to our request for deportable alien relief in this district. We wish to compliment your department for his efficiency, aggressiveness, resourcefulness, and the altogether sane way in which he is endeavoring and is getting concrete results.

The exodus of aliens deportable and otherwise who have been scared out of the community has undoubtedly left many jobs which have been taken up by other persons (not deportable) and citizens of the United States and our municipality. The exodus still continues.

We are very much impressed by the methods used and the constructive results steadily being accomplished.

Our compliments to you, Sir, and to this branch of your service.

However much Visel's interpretation of the benefits derived from the campaign squared with reality, the Department of Labor was no longer as eager to endorse the Los Angeles coordinator, or even to imply the existence of an endorsement. Perhaps the department feared that any such reply might be converted into another publicity release. At any rate, with the Nation and the New Republic lambasting the department, Doak shied away from making a personal reply. Visel's letter was answered by Assistant Secretary W. W. Husband, who acknowledged Visel's message and then circumspectly stated:

It is the purpose of this Department that the deportation provisions of our immigration laws shall be carried out to the fullest possible extent but the Department is equally desirous that such activities shall be carried out strictly in accordance with law.

Excerpt adapted from Unwanted Mexican Americans in the Great Depression: Repatriation Pressures, 1929–1939 by Abraham Hoffman. Copyright © 1974 by The Arizona Board of Regents. Used with permission of the publisher, the University of Arizona Press. All rights reserved.