The One Good Thing That Could Come from the Explosion in Peanut Allergies
What is a food allergy? Today this question would likely make people think of allergies to peanuts, and the frightening, potentially fatal anaphylactic reactions they can cause. Peanut allergy has not been far from the headlines in recent times. Whether it be banning peanuts from schools, airplanes, and sports facilities, such as Edmonton, Alberta’s Commonwealth Stadium, debating whether pregnant and breastfeeding mothers should eat peanuts, or simply speculating about the cause of the explosion in peanut allergy diagnoses, peanut allergies have dominated discussions of food allergy for 25 years.
In many ways, this is understandable. Peanut allergies are bizarre, confounding, and alarming reactions that raise important questions about how the immune system responds to environmental change, and how best to ensure the safety of the severely allergic. The fact that peanut allergies tend to affect children and young adults most (partly because of rising rates among the young, but also because of their higher risk of accidental exposure) adds to the poignancy of such discussions. Peanut allergies certainly merit all the discussion they attract. But there is – and has been – much more to food allergies than just peanuts.
For a start, peanut allergies have only confounded allergists for about a quarter of a century, with the first fatality reported in the medical literature in 1988. People were clearly allergic to peanuts before that, but their allergies were not depicted as particularly dangerous, common, or controversial. This raises two important historical questions: 1) how was food allergy understood prior to the peanut allergy epidemic? and 2) why have peanut allergies become so important in the last three decades?
In order to answer the second question, it is essential to address the first. Whereas we associate food allergy today with sudden, dangerous, and unmistakable reactions to peanuts, for most of the twentieth century, food allergy was seen as something very different. Anaphylactic reactions to foods certainly occurred, but allergists did not tend to focus on them. One reason for this was that, since anaphylactic reactions tend to follow immediately after the ingestion of an allergenic food, many people simply diagnosed themselves, rather than spending the time and money involved in consulting an allergist. As the Virginia food allergist Warren Vaughan declared in the 1930s, these were the “fortunate” sufferers, people allergic to potent but easy-to-identify allergens, such as seafood and nuts, as well as seasonal fruits and vegetables, such as strawberries and watermelon.
The “unfortunate” sufferers, who were much more common, had greater difficulty identifying the foods to which they were allergic and, therefore, were more likely to consult an allergist. These were people allergic to much more ubiquitous foods, such as wheat, eggs, milk, and corn, foods that were not only dietary staples but were also used in food processing. Although allergies to such foods could cause anaphylactic reactions, they were more likely to trigger delayed, chronic reactions, such as gastrointestinal distress, asthma, dermatological problems, or even psychiatric disturbance. Since such symptoms were also associated with other causes, it was difficult to attribute them unequivocally to specific foods.
Yet that is precisely what food allergists did for much of the twentieth century. Working closely with their patients to identify their allergens, food allergists came to believe that food allergy was the cause of much undiagnosed, chronic suffering. While Vaughan estimated that 60% of Americans were allergic to various foods, others claimed that this figure was “somewhat conservative.” When synthetic food dyes, flavors, and preservatives began flooding the food supply after the Second World War, food allergists blamed these substances, too, for countless symptoms, including hyperactivity in children.
To many patients, a food allergy diagnosis represented salvation, a long-sought explanation for their chronic suffering. But to those who doubted the claims of food allergists and their patients, food allergy was “witchcraft, a fad, or a racket.” For so-called “orthodox allergists,” who were more hesitant to diagnose patients with food allergy, food allergists brought allergy as a whole into disrepute with their extravagant claims. During the middle decades of the twentieth century, many orthodox allergists went further, claiming that the symptoms presented by food allergists’ patients were merely psychosomatic, or “all in the mind.” Such patients, they argued, were better served by a psychiatrist than an allergist.
To a considerable degree, the dispute between food allergists and orthodox allergists revolved around a basic point: the definition of allergy. In 1906, the Austrian pediatrician Clemens von Pirquet defined allergy as “any form of altered biological reactivity,” a decidedly broad definition that food allergists took to heart. Orthodox allergists, in contrast, limited their definition of allergy to instances, best exemplified by anaphylactic shock, where there was clear involvement of the immune system. Unless evidence could be provided through a diagnostic test, such as a skin test, the symptoms could not be deemed allergic.
The problem, however, was that many allergists – orthodox or not – judged skin tests to be unreliable for diagnosing food allergy, producing too many false positives and false negatives. Instead, food allergists took exhaustive histories of their patients’ dietary experiences and prescribed elimination diets, whereby patients would embark upon a reduced, bland diet, adding test foods at intervals and noting any reactions. Since patients themselves were responsible for providing the evidence upon which diagnosis rested, orthodox allergists doubted its value, particularly if they also doubted the mental state of such patients. The situation deteriorated during the 1960s and 1970s, until many food allergists abandoned allergy altogether and formed the rival discipline of clinical ecology. By the 1980s, the last thing an orthodox allergist wanted to see in the office was a patient complaining of food allergies.
But that all changed with the emergence of peanut allergy in the late 1980s. With peanut allergy, orthodox allergists could reclaim food allergy for themselves. Unlike the chronic food allergies that dominated the clinical practice of food allergists, peanut allergies caused acute, anaphylactic, immediate, and potentially lethal reactions that left no one guessing.
On the one hand, the legitimacy of peanut allergy in the eyes of the orthodox allergy community has been a godsend to peanut allergy patients and parents, allowing for public health measures that have prevented accidental exposures and ensured that emergency measures, such as the stockpiling of epinephrine injectors in schools, are taken seriously. But on the other hand, the focus on peanut allergy has left those suffering from chronic allergies – or whatever one wishes to call them – more marginalized than ever, and often vulnerable to quack remedies on the internet.
In addition, because the clinical strategies of food allergists – most of which relied upon close cooperation with patients and the gradual accumulation of clinical knowledge over time – were undermined, allergists today are at a loss to explain what caused the peanut allergy epidemic in the first place. If the puzzle that is peanut allergy is to be resolved, I would suggest that allergists will need to recapture some of the curiosity, open-mindedness, flexibility, and empathy that characterized the work of food allergists in decades past.