
Nov 12, 2018

One Really Big Difference Between Science and History




David P. Barash is an evolutionary biologist and professor of psychology emeritus at the University of Washington; his most recent book is Through a Glass Brightly: Using Science to See Our Species as We Really Are (Oxford University Press, 2018), from which this article is drawn.

Science differs from theology and the humanities in that it is made to be improved on and corrected over time. Hence, the new paradigms that follow are approximations at best, not rigid formulations of Truth and Reality; they are open to revision— indeed, much of their value resides precisely in their openness to modification in the light of additional evidence. By contrast, few theologians, or fundamentalist religious believers of any stripe, are likely to look kindly on “revelations” that suggest corrections to their sacred texts. In 1768, Baron d’Holbach, a major figure in the French Enlightenment, had great fun with this. In his satire Portable Theology (written under the pen-name Abbé Bernier, to hide from the censors), d’Holbach defined Religious Doctrine as “what every good Christian must believe or else be burned, be it in this world or the next. The dogmas of the Christian religion are immutable decrees of God, who cannot change His mind except when the Church does.”

Fundamentalist doctrines of Judaism and Islam are no different. Such rigidity is not, however, limited to religion. To take a purely secular case, most people today agree that it would be absurd to improve on Shakespeare, as Nahum Tate attempted when he rewrote the ending of King Lear in 1682, a “correction” approved by none other than Samuel Johnson, who agreed with Tate that the death of Cordelia was simply unbearable.

By contrast, science not only is open to improvements and modifications but also is to a large extent defined by this openness. Whereas religious practitioners who deviate from their traditions are liable to be derogated— and sometimes killed— for their apostasy, and even among secularists rewriting Shakespeare or playing Bach on an electric guitar is likely to be treated as indefensible, science thrives on correction and adjustments, aiming not to enshrine received wisdom and tradition but to move its insights ever closer to correspondence with reality as found in the natural world. This is not to claim that scientists are less vulnerable to peer pressure and the expectations of intellectual conformity than are others. We are, all of us, only human. But an important part of the peer pressure experienced by scientists involves openness to revision and reformulation. The Nobel Prize–winning ethologist Konrad Lorenz once wrote that every scientist should discard at least one cherished notion every day before breakfast. Although I don’t recall Dr. Lorenz conspicuously following his own advice, it nonetheless captures a worthwhile aspiration.

Well-established scientific concepts aren’t discarded lightly. Nor should they be. As Carl Sagan emphasized, extraordinary claims require extraordinary evidence, and so it is reasonable that any new findings that go against received scientific wisdom— especially in proportion as those claims are dramatic and paradigm-shifting— should receive special attention, which includes being held to a high standard. But closed-mindedness is among the worst sins of an enterprise devoted to seeking truth rather than to validating old texts and anointed wisdom (religion) or passing along something that has been woven out of whole cloth (the creative arts).

I employ the word “truth” without quotation marks, because despite the protestations of some postmodernists and outright data-deniers, the natural world is undeniably real, and it is the noble endeavor of science to describe and understand this reality, which leads to the following proclamation— which shouldn’t be controversial, but has become so, at least in some quarters: science is our best, perhaps our only way of comprehending the natural world, including ourselves. Moreover, we are approaching an ever more accurate perception of that world and of our place in it. This argument, by the way, is not intended to glorify science or scientists at the expense of the creative arts and their practitioners. In fact, a case can be made that an act of artistic creativity is actually more valuable than a scientific discovery, not least because humanistic creativity yields results that, in the absence of their creators, would likely never have seen the light of day. If Shakespeare had not lived, for example, it is almost certain that we would not have Hamlet, Othello, King Lear, Macbeth, the hilarious comedies, those magnificent sonnets, and so forth. Without Bach, no Goldberg Variations or Well-Tempered Clavier, and none of his toccatas and fugues. No Leonardo? No Mona Lisa. The list goes on, and is as extensive as human creativity itself.

By contrast, although counterfactual history is necessarily speculative, an intuitive case can be made that if there had never been an Isaac Newton, someone else— perhaps his rival Robert Hooke or the Dutch polymath Christiaan Huygens— would have come up with the laws of gravity and of motion, which, unlike the blank pages on which Hamlet was written, or Leonardo’s empty easel, were out there in the world, waiting in a sense to be discovered. Gottfried Leibniz, after all, has a strong claim to being at least the co-discoverer with Newton of calculus, which was, equally, a truth waiting to be found. Unlike a Shakespeare or Leonardo, who literally created things that had not existed before and that owe their existence entirely to the imaginative genius of their creators, Newton’s contributions were discoveries, scientific insights based on preexisting realities of the world (e.g., the laws of motion) or logical constructs (e.g., calculus).

Similarly, if there had been no Darwin, we can be confident that someone else would have come up with evolution by natural selection. Indeed, someone did: Alfred Russel Wallace, independently and at about the same time. A comparable argument can be made for every scientific discovery, insofar as they were— and will be— true discoveries, irrevocably keyed to the world as it is and thus, one might say, lingering just off-stage for someone to reveal them, like the “New World” waiting for Columbus. If not Columbus, then some other (equally rapacious and glory-seeking) European would have sailed due west from the Old World, just as if not Copernicus then someone else— maybe Kepler or Galileo— would have perceived that the solar system is sun- and not Earth-centered.

Mendelian genetics, interestingly, was rediscovered in 1900 by Hugo de Vries, Carl Correns, and Erich von Tschermak, three botanists working independently, more than a generation after Gregor Mendel published his little-noticed papers outlining the basic laws of inheritance. For us, the key point isn’t so much the near simultaneity of this scientific awakening, but the fact that Mendel based his own findings on something genuine and preexisting (genes and chromosomes, even though they weren’t identified as such until a half-century later), which, once again, were populating the natural world and waiting to be uncovered by someone with the necessary passion, patience, and insight.

Ditto for every scientific discovery, as contrasted to every imaginative creation. No matter how brilliant the former or the latter, practitioners of both enterprises operate under fundamentally different circumstances. It does not disparage Einstein’s intellect to note that in his absence, there were many brilliant physicists (Mach, Maxwell, Planck, Bohr, Heisenberg, etc.) who might well have seen the reality of relativity, because no matter how abstruse and even unreal it may appear to the uninitiated, relativity is, after all, based on reality and not entirely the product of a human brain, no matter how brainy.

In some cases, the distinction is less clear between products of the creative imagination and those of equally creative scientific researchers. Take Freud, for example. His major scientific contribution— identifying the unconscious— is undoubtedly a genuine discovery; that is, the elucidation of something that actually exists, and had Freud never existed, someone else would have found and initiated study of those patterns of human mental activity that lie beneath, or are otherwise distinct from, our conscious minds. On the other hand, it isn’t at all obvious that absent Freud, someone else would have dreamed up infantile sexuality, the Oedipus complex, penis envy, and so forth. And this is precisely my point: insofar as alleged discoveries meld into imaginative creativity, they veer away from science and into something else.

Among some historians, there has been a tendency to assume that history proceeds toward a fixed point, so that the past can be seen (only in retrospect, of course) as aiming at a superior or in some sense more “valid” outcome. Such a perspective has been called “Whig history,” and it is generally out of favor among contemporary scholars, as it should be. A notable example is the announcement by Francis Fukuyama that with the dissolution of the Soviet Union in 1991, we had reached the “end of history” and the final triumph of capitalist democracy. Subsequent events have made the inadequacy of this claim painfully obvious. Despite the importance of democracy and its widespread appeal (at least in the West), it is important to be wary of the seductiveness of such “Whiggery,” and to appreciate that changes in sociopolitics do not necessarily exemplify movement from primitive to advanced forms of governance, or from malevolent to benevolent, and so forth. Nor are they final.

On the other hand, such a Whiggish approach is mostly valid in the domain of science. Readers at this point may well be tired of hearing about the Copernican, sun-centered solar system, but it deserves repetition, not only because it contributed mightily to some much-needed anthropodiminution but also because it is quite simply a better model than the Ptolemaic, Earth-centered version, just as evolution by natural selection is superior to special creation. It seems likely, by the same token, that the new paradigms of human nature described in Part II of this book are superior— more accurate, more useful, more supported by existing facts, and more likely to lead to yet more insights—than are the old paradigms they are in the process of displacing. Paul Valéry wrote that a work of art is never completed but rather abandoned. This sense of unfinishedness is even more characteristic of science, not just because of the scientific “creator’s” persistent personal dissatisfaction with her product but also because reality is tricky, sometimes leaving false trails. As a result, our pursuit and even our successes can always be improved on. This is not to say (as the more extreme postmodernists claim) that reality is socially constructed and that, as a result, no scientific findings can be described as true. The physicist David Deutsch, making a bow to the philosopher of science Karl Popper, refers to the accumulation of scientific knowledge as “fallibilism,” which acknowledges that no attempt to create knowledge can avoid being subject to error. His point is that although we may never know for sure that we are correct in an assertion (after all, tomorrow’s discovery may reveal its falsity), we can be increasingly clear that we aren’t completely wrong. Moreover, science has never been superseded by any better way of making sense of nature.

Newton’s laws of motion are as close to truth as anyone can imagine, with respect to medium-sized objects moving at middling speeds, but even these genuine laws have been improved on when it comes to very small things (quantum physics), very fast movement (special relativity), or very large things (general relativity). Even in Newton’s day, his work on gravity was vigorously resisted— by well-informed, scientifically minded colleagues— because it posited pushing and pulling by a mysterious, seemingly occult “action at a distance” that could not be otherwise identified or explained. Scientific insights, in short, can be as tricky as reality. Observing the disorienting aspects of quantum physics and how they often depart from both common sense and certain rarely questioned philosophical assumptions (such as cause and effect), Steven Weinberg wryly noted, “Evidently it is a mistake to demand too strictly that new physical theories should fit some preconceived philosophical standard.” He might have included new biological theories as well.

Biology, in fact, offers materialist insights that, in a sense, exceed even those of physics, the presumed zenith of hard-headed objective knowledge. Thus, although physics can tell us a lot about how matter behaves, it is so far stumped when it comes to the question of what matter is (wave functions, clouds of probability, gluons and muons and quarks and the like: but what are they made of?). By contrast, biology can and does tell us both what we are made of, at least at the organismic level— organs, tissues, cells, proteins, carbohydrates, fats, and so forth— and, increasingly, how and why we behave as we do: some of the deepest and hitherto unplumbed aspects of human nature. By an interesting coincidence, in 1543, the same year that Copernicus’s book on the solar system appeared, Vesalius published his magnum opus on human anatomy. Anatomies large and small, far and near, planetary and personal: all give way to a more accurate, and typically more modest, view of reality, including ourselves. Such a perspective on human behavior— mostly due to Darwin and his interpreters and extenders— can be seen as yet another continuation of this trend, dissecting the anatomy of human nature itself.


