Imagine that you are a teacher of Roman history and the Latin language, anxious to impart your enthusiasm for the ancient world—for the elegiacs of Ovid and the odes of Horace, the sinewy economy of Latin grammar as exhibited in the oratory of Cicero, the strategic niceties of the Punic Wars, the generalship of Julius Caesar, and the voluptuous excesses of the later emperors. That’s a big undertaking, and it takes time, concentration, dedication. Yet you find your precious time continually preyed upon, and your class’s attention distracted, by a baying pack of ignoramuses (as a Latin scholar you would know better than to say ignorami) who, with strong political and especially financial support, scurry about tirelessly attempting to persuade your unfortunate pupils that the Romans never existed. There never was a Roman Empire. The entire world came into existence only just beyond living memory. Spanish, Italian, French, Portuguese, Catalan, Occitan, Romansh: all these languages and their constituent dialects sprang spontaneously and separately into being and owe nothing to any predecessor such as Latin. Instead of devoting your full attention to the noble vocation of classical scholar and teacher, you are forced to divert your time and energy to a rearguard defense of the proposition that the Romans existed at all: a defense against an exhibition of ignorant prejudice that would make you weep if you weren’t too busy fighting it.
If my fantasy of the Latin teacher seems too wayward, here’s a more realistic example. Imagine you are a teacher of more recent history, and your lessons on twentieth-century Europe are boycotted, heckled, or otherwise disrupted by well-organized, well-financed, and politically muscular groups of Holocaust-deniers. Unlike my hypothetical Rome-deniers, Holocaust-deniers really exist. They are vocal, superficially plausible, and adept at seeming learned. They are supported by the president of at least one currently powerful state, and they include at least one bishop of the Roman Catholic Church. Imagine that, as a teacher of European history, you are continually faced with belligerent demands to “teach the controversy” and to give “equal time” to the “alternative theory” that the Holocaust never happened but was invented by a bunch of Zionist fabricators. Fashionably relativist intellectuals chime in to insist that there is no absolute truth: whether the Holocaust happened is a matter of personal belief; all points of view are equally valid and should be equally “respected.”
The plight of many science teachers today is no less dire. When they attempt to expound the central and guiding principle of biology; when they honestly place the living world in its historical context—which means evolution; when they explore and explain the very nature of life itself, they are harried and stymied, hassled and bullied, even threatened with the loss of their jobs. At the very least, their time is wasted at every turn. They are likely to receive menacing letters from parents and have to endure the sarcastic smirks and close-folded arms of brainwashed children. They are supplied with state-approved textbooks that have had the word evolution systematically expunged or bowdlerized into “change over time.” Once, we were tempted to laugh off this kind of thing as a peculiarly American phenomenon. Teachers in Britain and Europe now face the same problems, partly because of American influence but more significantly because of the growing Islamic presence in the classroom—abetted by the official commitment to “multiculturalism” and the terror of being thought racist.
It is frequently, and rightly, said that senior churchmen and theologians have no problem with evolution and, in many cases, actively support scientists in this respect. . . . The Archbishop of Canterbury has no problem with evolution, nor does the pope (give or take the odd wobble over the precise paleontological juncture when the human soul was injected), nor do educated priests and professors of theology. . . . Bishops and theologians who have attended to the evidence for evolution have given up the struggle against it. Some may do so reluctantly, and some, like Richard Harries, the former Bishop of Oxford, enthusiastically, but all except the woefully uninformed are forced to accept the fact of evolution. They may think God had a hand in starting the process and perhaps didn’t stay his hand in guiding its subsequent progress. They probably think God cranked up the universe in the first place and solemnized its birth with a harmonious set of laws and physical constants calculated to fulfil some inscrutable purpose in which we were eventually to play a role. But, grudgingly in some cases, happily in others, thoughtful and rational churchmen and women accept the evidence for evolution.
What we must not do is complacently assume that because bishops and educated clergymen accept evolution, so do their congregations. Alas, . . . there is ample evidence to the contrary from opinion polls. More than 40 percent of Americans deny that humans evolved from other animals and think that we—and by implication all of life—were created by God within the last ten thousand years. The figure is not quite so high in Britain, but it is still worryingly large. And it should be as worrying to the churches as it is to scientists. . . . I shall be using the name “history-deniers” for those people who deny evolution: who believe the world’s age is measured in thousands of years rather than thousands of millions of years, and who believe humans walked with dinosaurs. They constitute more than 40 percent of the American population. The equivalent figure is higher in some countries, lower in others, but 40 percent is a good average, and I shall from time to time refer to the history-deniers as the “40-percenters.”
To return to the enlightened bishops and theologians, it would be nice if they’d put a bit more effort into combating the anti-scientific nonsense that they deplore. All too many preachers, while agreeing that evolution is true and Adam and Eve never existed, will then blithely go into the pulpit and make some moral or theological point about Adam and Eve in their sermons, without once mentioning that, of course, Adam and Eve never actually existed! If challenged, they will protest that they intended a purely “symbolic” meaning, perhaps something to do with “original sin” or the virtues of innocence. They may add witheringly that, obviously, nobody would be so foolish as to take their words literally. But do their congregations know that? How is the person in the pew, or on the prayer-mat, supposed to know which bits of scripture to take literally, which symbolically? Is it really so easy for an uneducated churchgoer to guess? In all too many cases the answer is clearly no, and anybody could be forgiven for feeling confused. . . .
Evolution is a fact. Beyond reasonable doubt, beyond serious doubt, beyond sane, informed, intelligent doubt, beyond doubt evolution is a fact. The evidence for evolution is at least as strong as the evidence for the Holocaust, even allowing for eyewitnesses to the Holocaust. It is the plain truth that we are cousins of chimpanzees, somewhat more distant cousins of monkeys, more distant cousins still of aardvarks and manatees, yet more distant cousins of bananas and turnips . . . continue the list as long as desired. That didn’t have to be true. It is not self-evidently, tautologically, obviously true, and there was a time when most people, even educated people, thought it wasn’t. But it is true. We know this because a rising flood of evidence supports it. Evolution is a fact, and the book this article is drawn from will demonstrate it. No reputable scientist disputes it, and no unbiased reader will close the book doubting it.
Why, then, do we speak of “Darwin’s Theory of Evolution,” thereby, it seems, giving spurious comfort to those of a creationist persuasion—the history-deniers, the 40-percenters—who think the word theory is a concession, handing them some kind of gift or victory?
What Is a Theory? What Is a Fact?
Only a theory? Let’s look at what theory means. The Oxford English Dictionary gives two meanings (actually more, but these are the two that matter here).
- Theory, Sense 1: A scheme or system of ideas or statements held as an explanation or account of a group of facts or phenomena; a hypothesis that has been confirmed or established by observation or experiment, and is propounded or accepted as accounting for the known facts; a statement of what are held to be the general laws, principles, or causes of something known or observed.
- Theory, Sense 2: A hypothesis proposed as an explanation; hence, a mere hypothesis, speculation, conjecture; an idea or set of ideas about something; an individual view or notion.
Obviously the two meanings are quite different from one another. And the short answer to my question about the theory of evolution is that the scientists are using Sense 1 while the creationists are—perhaps mischievously, perhaps sincerely—opting for Sense 2. A good example of Sense 1 is the Heliocentric Theory of the Solar System, the theory that Earth and the other planets orbit the sun. Evolution fits Sense 1 perfectly. Darwin’s theory of evolution is indeed a “scheme or system of ideas or statements.” It does account for a massive “group of facts or phenomena.” It is “a hypothesis that has been confirmed or established by observation or experiment” and, by generally informed consent, it is “a statement of what are held to be the general laws, principles, or causes of something known or observed.” It is certainly very far from “a mere hypothesis, speculation, conjecture.” Scientists and creationists are understanding the word theory in two very different senses. Evolution is a theory in the same sense as the heliocentric theory. In neither case should the word only be used, as in “only a theory.”
As for the claim that evolution has never been “proved,” proof is a notion that scientists have been intimidated into mistrusting. Influential philosophers tell us we can’t prove anything in science. Mathematicians can prove things—according to one strict view, they are the only people who can—but the best that scientists can do is fail to disprove things while pointing to how hard they tried. Even the undisputed theory that the moon is smaller than the sun cannot, to the satisfaction of a certain kind of philosopher, be proved in the way that, for example, the Pythagorean Theorem can be proved. But massive accretions of evidence support it so strongly that to deny it the status of “fact” seems ridiculous to all but pedants. The same is true of evolution. Evolution is a fact in the same sense as it is a fact that Paris is in the Northern Hemisphere. Though logic-choppers rule the town (not my favorite Yeats line but apt in this case), some theories are beyond sensible doubt, and we call them “facts.” The more energetically and thoroughly you try to disprove a theory, if it survives the assault, the more it converges on what common sense happily calls a fact.
To a mathematician, a proof is a logical demonstration that a conclusion necessarily follows from axioms that are assumed. Pythagoras’s theorem is necessarily true, provided only that we assume Euclidean axioms, such as the axiom that parallel straight lines never meet. You are wasting your time measuring thousands of right-angled triangles, trying to find one that falsifies Pythagoras’s Theorem. The Pythagoreans proved it, anybody can work through the proof; it’s just true and that’s that. Mathematicians use the idea of proof to make a distinction between a “conjecture” and a “theorem,” which bears a superficial resemblance to the Oxford Dictionary’s distinction between the two senses of “theory.” A conjecture is a proposition that looks true but has never been proved. It will become a theorem when it has been proved. A famous example is the Goldbach Conjecture, which states that any even integer can be expressed as the sum of two primes. Mathematicians have failed to disprove it for all even numbers up to three hundred thousand million million million, and common sense would happily call it Goldbach’s Fact. Nevertheless, it has never been proved, despite lucrative prizes being offered for the achievement, and mathematicians rightly refuse to place it on the pedestal reserved for theorems. If anybody ever finds a proof, it will be promoted from Goldbach’s Conjecture to Goldbach’s Theorem, or maybe X’s Theorem where X is the clever mathematician who finds the proof. . . .
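The distinction Dawkins draws—mountains of confirming instances versus an actual proof—is easy to make concrete. Failing to disprove Goldbach is something a computer can do mechanically for small cases; the sketch below (purely illustrative, and going nowhere near the astronomically larger ranges of the real computational searches) checks that every even number up to 10,000 splits into two primes. The function names are my own, not standard terminology.

```python
# Illustrative check of the Goldbach Conjecture for small even numbers.
# Finding a decomposition for each case is evidence; it is not a proof,
# since no finite search can cover all even numbers.

def is_prime(n):
    """Trial-division primality test -- adequate for small n."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def goldbach_pair(n):
    """Return one pair of primes summing to the even number n > 2,
    or None if no such pair exists (a counterexample -- never yet observed)."""
    for p in range(2, n // 2 + 1):
        if is_prime(p) and is_prime(n - p):
            return (p, n - p)
    return None

# Every even number from 4 to 10,000 has at least one decomposition:
assert all(goldbach_pair(n) is not None for n in range(4, 10001, 2))
print(goldbach_pair(100))  # prints (3, 97)
```

However far such a search runs without finding a counterexample, the conjecture stays a conjecture; only a proof of the kind Wiles supplied for Fermat would promote it to a theorem.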
Fermat’s Last Theorem, like the Goldbach Conjecture, is a proposition about numbers to which nobody has found an exception. Proving it was a kind of holy grail for mathematicians ever since 1637, when Pierre de Fermat wrote in the margin of an old mathematics book, “I have a truly marvellous proof . . . which this margin is too narrow to contain.” It was finally proved by the English mathematician Andrew Wiles in 1995. Until then, some mathematicians argue, it should have been called a conjecture. Given the length and complication of Wiles’s successful proof, and his reliance on advanced twentieth-century methods and knowledge, most mathematicians think Fermat was (honestly) mistaken in his claim to have proved it. I tell this story only to illustrate the difference between a conjecture and a theorem.
A scientific theory, such as that of evolution or heliocentrism, conforms to the Oxford Dictionary’s Sense 1.
[It] has been confirmed or established by observation or experiment, and is propounded or accepted as accounting for the known facts; [it is] a statement of what are held to be the general laws, principles, or causes of something known or observed.
This kind of scientific theory has not been—cannot be—proved in the way a mathematical theorem is proved. But common sense treats it as a fact in the same sense as the “theory” that the Earth is round and not flat is a fact and the theory that green plants obtain energy from the sun is a fact. All are scientific theories: supported by massive quantities of evidence accepted by all informed observers, undisputed facts in the ordinary sense of the word. As with all facts, if we are going to be pedantic, it is undeniably possible that our measuring instruments, and the sense organs with which we read them, are the victims of a massive confidence trick. As Bertrand Russell said, “We may all have come into existence five minutes ago, provided with ready-made memories, with holes in our socks and hair that needed cutting.” Given the evidence now available, for evolution to be anything other than a fact would require a similar confidence trick by the creator, something that few theists would wish to credit.
It is time now to examine the dictionary definition of a “fact.” Here is what the OED has to say (again there are several definitions, but this is the relevant one):
Fact: Something that has really occurred or is actually the case; something certainly known to be of this character; hence, a particular truth known by actual observation or authentic testimony, as opposed to what is merely inferred, or to a conjecture or fiction; a datum of experience, as distinguished from the conclusions that may be based upon it.
Notice that a fact in this sense doesn’t have the same rigorous status as a proved mathematical theorem, which follows inescapably from a set of assumed axioms. Moreover, “actual observation or authentic testimony” can be horribly fallible and is overrated in courts of law. Psychological experiments have given us some stunning demonstrations, which should worry any jurist inclined to give superior weight to “eyewitness” evidence. A famous example was prepared by Professor Daniel J. Simons at the University of Illinois. Half a dozen young people standing in a circle were filmed for twenty-five seconds tossing a pair of basketballs to each other, and we, the experimental subjects, watch the film. The players weave in and out of the circle and change places as they pass and bounce the balls, so the scene is quite actively complicated (http://viscog.beckman.uiuc.edu/flasmovie/15.php). Before being shown the film, we are told that we have a task to perform to test our powers of observation. We have to count the total number of times balls are passed from person to person. At the end of the test, the counts are duly written down, but—little does the audience know—this is not the real test!
After showing the film and collecting the counts, the experimenter drops his bombshell. “And how many of you saw the gorilla?” The majority of the audience looks baffled, blank. The experimenter then replays the film, but this time he tells the audience to watch in a relaxed fashion without trying to count anything. Amazingly, nine seconds into the film, a man in a gorilla suit strolls nonchalantly to the center of the circle of players, pauses to face the camera, thumps his chest as if in belligerent contempt for eyewitness evidence, and then strolls off with the same insouciance as before. He is there in full view for nine whole seconds—more than one-third of the film—and yet the majority of the witnesses never see him. They would swear an oath in a court of law that no man in a gorilla suit was present, and they would swear that they had been watching with more than usually acute concentration for the whole twenty-five seconds, precisely because they were counting ball-passes. Many experiments along these lines have been performed with similar results and with similar reactions of stupefied disbelief when the audience is finally shown the truth. Eyewitness testimony, “actual observation,” “a datum of experience”—all are, or at least can be, hopelessly unreliable. It is, of course, exactly this unreliability among observers that stage conjurers exploit with their techniques of deliberate distraction.
The dictionary definition of a fact mentions “actual observation or authentic testimony, as opposed to what is merely inferred” (emphasis added). The implied pejorative of that “merely” is a bit of cheek. Careful inference can be more reliable than “actual observation,” however strongly our intuition protests at admitting it. . . .
Admittedly, inference has to be based ultimately on observation by our sense organs. For example we use our eyes to observe the printout from a DNA-sequencing machine or from the Large Hadron Collider. But—all intuition to the contrary—direct observation of an alleged event (such as a murder) as it actually happens is not necessarily more reliable than indirect observation of its consequences (such as DNA in a bloodstain) fed into a well-constructed inference engine. Mistaken identity is more likely to arise from direct eyewitness testimony than from indirect inference derived from DNA evidence. And, by the way, there is a distressingly long list of people who have been wrongly convicted on eyewitness testimony and subsequently freed—sometimes after many years—because of new evidence from DNA. . . .
I take inference seriously—not mere inference but proper scientific inference—and I shall show the irrefragable power of the inference that evolution is a fact. Obviously the vast majority of evolutionary change is invisible to direct eyewitness observation. Most of it happened before we were born, and in any case it is usually too slow to be seen during an individual’s lifetime. The same is true of the relentless pulling apart of Africa and South America, which occurs . . . too slowly for us to notice. With evolution, as with continental drift, inference after the event is all that is available to us, for the obvious reason that we don’t exist until after the event. But do not for one nanosecond underestimate the power of such inference. The slow drifting apart of South America and Africa is now an established fact in the ordinary language sense of “fact” and so is our common ancestry with porcupines and pomegranates.
The distinction between the two dictionary meanings of theory is not an unbridgeable chasm, as many historical examples show. In the history of science, theories often start off as “mere” hypotheses. Like the theory of continental drift, an idea may even begin its career mired in ridicule before progressing by painful steps to the status of an undisputed fact. This is not a philosophically difficult point. The fact that some widely held past beliefs have been conclusively proved erroneous doesn’t mean we have to fear that future evidence will always show our present beliefs to be wrong. How vulnerable our present beliefs are depends, among other things, on how strong the evidence for them is. People used to think the sun was smaller than the earth because they had inadequate evidence. Now we have evidence, which was not previously available, that shows conclusively that it is much larger, and we can be totally confident that this evidence will never, ever be superseded. This is not a temporary hypothesis that has so far survived disproof. Our present beliefs about many things may be disproved, but we can with complete confidence make a list of certain facts that will never be disproved. The heliocentric theory of the planets wasn’t always among them, but it is now. So is evolution.
Adapted from The Greatest Show on Earth: The Evidence for Evolution by Richard Dawkins. Copyright ©2009. Adapted by permission of Free Press, a Division of Simon & Schuster, Inc.