
To Understand Antivaxers, Consider Aristotle

Rejections of scientific advances are found throughout the history of medicine. There have been four great advances in medicine over the past 200 years: anesthesia, antisepsis, antibiotics, and immunization. Not every advance was met with resistance. When an advance's benefits were obvious, there tended to be little hesitation. Anesthesia and its cousin, analgesia, for instance, were rapidly accepted; they relieved pain, and the advantages were readily appreciated.

Antisepsis had a stormier path to public acceptance. In the 19th century, English and Irish physicians recognized that puerperal sepsis (a dangerous infection in a mother after delivery of a baby) was likely a contagious condition that was spread from patient to patient either by the medical staff or the local environment. They suggested that improving hygiene would reduce the high rates of mortality that puerperal sepsis caused. In 1843, Oliver Wendell Holmes Sr., a physician (and one of The Atlantic’s founders), presented a paper to the Boston Society for Medical Improvement titled “The Contagiousness of Puerperal Fever.” Holmes suggested that unwashed hands among the medical and nursing staff were responsible for transmitting puerperal fever. This did not sit well with the establishment. A prestigious Philadelphia obstetrician, Charles D. Meigs, declared Holmes’s findings to be nonsense and suggested that an increased number of cases associated with any physician was just bad luck.

The physician most frequently credited with establishing the contagious nature of this infection is the Hungarian obstetrician Ignaz Semmelweis. He noted that patients in the Vienna General Hospital who were cared for by physicians had a higher incidence of postpartum sepsis than those cared for by midwives. Semmelweis realized that physicians performed autopsies, whereas midwives did not, and that physicians did not wash their hands or change their clothing before moving from an autopsy to a delivery. (It was routine for them to attend deliveries in their bloodstained clothing, having come directly from the autopsy suite.) When he suggested simple hygiene measures such as handwashing, he was derided and eventually run out of town. The medical establishment was unwilling to accept that physicians, rather than bad air or host weaknesses, were responsible for spreading infections and harming patients.

Science denialism can work in the other direction too. When antibiotics, especially penicillin, were first introduced, they were rightly appreciated as miracle drugs. In the pre-antibiotic era, infectious diseases were the leading cause of death among children. Antibiotics proved astoundingly successful against many, but not all, childhood diseases. The downside of this enthusiasm for treatment came when patients demanded antibiotics for conditions, such as viral infections, that the drugs could not help. Fifty years ago, telling a patient that they had a virus and that penicillin was therefore of no use led to disappointment, disbelief, and even arguments from patients requesting antibiotics for simple colds. Many doctors gave in because it was simpler than spending time fighting with a patient. A consequence of this more indiscriminate use of antibiotics, itself a mini-genre of science denialism, has been increased bacterial resistance.

But of the four great advances, none has helped humanity more broadly, or suffered more from science denialism, than immunization.
