Why Historians Have a Stake in the Debate Over Evolution
As any newspaper reader knows, the American public has little regard for the theory of evolution. Despite its nearly universal acceptance among scientists worldwide, both the president of the United States and the Senate majority leader—an MD—favor the teaching of a competing, unscientific (indeed, antiscientific) theory known as “Intelligent Design” as a counterpoint to evolution in the public schools. Even before such powerful politicians openly supported “Intelligent Design,” a 2005 poll conducted by the Pew Research Center found that 64 percent of Americans believe that high school science courses should teach both evolution and creationism. A CBS News poll in 2004 found that a robust 37 percent insist that schools teach creationism only.
Not only do its adherents offer “Intelligent Design” as a competing explanation for the diversity of life, but some prominent Americans even portray evolution as a threat to society. The argument goes: by displacing God as the author of all things, evolution leaves people no reason to restrain their conduct with ethical norms; the result is the breakdown of civilized morality. House majority leader Tom DeLay exemplified this attitude when he cited “the teaching of evolution in the schools” as a proximate cause of the horrific 1999 shootings at Colorado’s Columbine High School.
Historians have reacted to the unfair disparaging of evolution by placing the debate in the context of American religious belief, most often drawing the appropriate (if inevitable) comparison to the Scopes trial. Debates over evolution are treated as part of the cultural and intellectual history of the American populace, and of Protestant fundamentalism in particular. Historians generally avoid entering the controversy itself, preferring the calmer role of spectator. Being accused of undermining Western morality, after all, is nasty business.
But I believe such a response is grossly inadequate to the current debate. Most importantly, historians often act as if evolution were an unnecessary tool for understanding history. Yet it is my contention that historians have a great stake in this debate: not only is evolution the central organizing principle of modern biology—and a fine example of historical thinking—but it is also an essential concept for understanding human history.
Take a subject as central to American history as the European colonization of the Americas. Historians have long described the motives of European settlers—their desire to make money, convert those they viewed as heathens, worship freely, or expand empire—but only in the last generation have we come to understand the vital importance of the biological factors that made colonization possible.
To understand colonization we must begin by acknowledging the vast difference in the flora and fauna of the North American and European continents at the time of contact. When European settlers came to the Americas, they encountered bison and rattlesnakes and thousands of other strange and wondrous plants and animals. Moreover, they brought to American soil such important biological agents as wheat, barley, cattle, and—perhaps most fatefully—disease.
Prior to the fifteenth century, Native Americans had no acquired immunities to a host of diseases endemic to continental Europe. Smallpox, measles, whooping cough, chicken pox, typhoid fever, cholera, and yellow fever, among others, spread ferociously in the “virgin soil” of Native American populations, who had little biological resistance or experience in dealing with such menaces. Cultural factors exacerbated the biological ones. As disease spread, village life collapsed, economies faltered, and health care became inadequate or nonexistent. Faced with devastating losses, some people reacted fatalistically, convinced of the inevitability of their untimely death. These Old World microbes caused one of the most dramatic demographic collapses in human history, a record of devastating population losses that stretches from the arrival of European settlers into the twentieth century.
Much of this history must be explained by the ecological happenstance that the North American continent proved suitable for Old World pathogens. Parasites have definite environmental limits beyond which they cannot survive, of which the most obvious is climate. Climate matters most for diseases spread by aerial transmission—such as colds, influenza, measles, and smallpox—and for vector-borne diseases such as malaria and yellow fever. The environmental requirements of those vectors—the carriers of disease—likewise demand an ecological explanation of their New World success. Without ecological conditions hospitable to European diseases, the demographic collapse of Native America would not have occurred, and American history would have developed in a radically different manner.
Delving into the reasons why Native Americans had not developed resistance to these Old World diseases takes us further into the ecological factors that help drive the course of human events. Native American peoples were not only isolated from European diseases for hundreds of generations; the New World also contained fewer cities—in whose dense settings diseases and antibodies proliferate—which helps account for the lack of immunities among native populations. In 1600 the average population density of France and England was from fifteen to nine hundred times greater than that of native North America.
More important was the relative lack of domesticated herd animals in America. Herd animals are our richest source of disease microorganisms; we share diseases such as influenza with pigs and other barnyard animals. Because Native Americans had few domestic animals—dogs, llamas, guinea pigs, and a couple of species of fowl—they had relatively few immunities compared to those Europeans gained from their extensive interactions with dogs, barnyard fowl, horses, donkeys, pigs, cattle, goats, and sheep.
Moreover, it is possible that the relative lack of large domesticated animals—and of the diseases such animals bring to their human neighbors—is related to the mass extinctions of large North American animals during the late Pleistocene. Some researchers contend that the extinctions coincided with the arrival of human settlers, who traversed the Bering land bridge and quickly hunted many of the new species they encountered to extinction. Other researchers cite climate change as a contributing factor. Whatever the cause, the Pleistocene extinctions help explain the paucity of large domesticated animals among Native Americans. It is possible that by hunting dozens of species of large mammals to extinction, these Paleoindians helped set in motion a course of ecological events that would contribute to the demographic catastrophe of their descendants.
Though the question of the human role in the Pleistocene extinctions is fraught with controversy, I raise it because it emphasizes the importance of ecological contingency to human history. And such considerations are not confined to the colonial past. The terrible influenza epidemic of 1918–1919 killed more Americans—approximately 675,000—than did the Civil War. Known as the “Spanish Flu” or “La Grippe,” the epidemic not only infected 28 percent of all Americans but, curiously, was most deadly for people ages 20 to 40. The AIDS pandemic and worries over the so-called “Avian Flu” are contemporary reminders of the dramatic importance of diseases to world events.
Diseases are just one example of the pervasive influence of ecological factors upon human history. Yet their profound influence upon American history makes little sense without an understanding of evolution, which explains how diseases develop and susceptibilities arise. Diseases did not spread evenly across the North American continent but spottily, with some populations developing partial immunity to Old World diseases while others suffered successive waves of infection. Unlike “Intelligent Design,” the blind workings of natural selection explain this history. Indeed, natural selection is an incredibly powerful tool for deciphering the interplay of ecological and cultural factors that makes human history. Recourse to “Intelligent Design,” by contrast, offers nothing to help us understand the course of colonial events, while evolution elucidates the absolutely consequential interactions of Old World and New World organisms.
Historians thus have a great deal at stake in debates over whether high school students should be taught evolution. One cannot understand the course of American (or world) history without evolution’s power to explain the ecological factors that underlie fundamental aspects of the human past. We need to understand evolution to understand much of our past and its influence on our present.
Historians, then, should not be mere silent observers of the controversies over the teaching of evolution. Reducing the debate to one more chapter in the history of popular reactions to science neglects the explanatory power that ideas from the natural sciences offer historians. Perhaps the very sensitivity of historians to the cultural and religious values of the American populace makes us uniquely suited to show the public how evolution solves intellectual mysteries across many disciplines. But more than anything else, we should take up the task of explaining evolution to the public because evolution is necessary for understanding some of the central findings of our own discipline.