Gambling, drinking, and sex! Sounds like fun. Yet each of these was intimately involved with the creation of what students consider to be one of the most grueling courses of their college career: “sadistics,” er, I mean statistics. But without statistics there could be no science and no skepticism. Statistics and evolution are deeply interconnected: modern statistics could not have taken shape without the prior discovery of evolution, which helped legitimize and secularize randomness, and the modern theory of evolution could not have been formed without an understanding of statistics.
There is neither informed skepticism nor secularism without acceptance of the scientific method. Otherwise, those who doubt the Holocaust, evolution, the Moon landing, or immunization would be legitimate skeptics. Science is a method, not a body of information, and it is the only method that provides knowledge about empirical reality. Its motto is “show me the data,” but only statistics gives data meaning. Statistics is the language of proof in science.
As put by historian Theodore Porter: “Statistics has become known in the twentieth century as the mathematical tool for analyzing experimental and observational data. Enshrined . . . as the only reliable basis for judgments such as the efficacy of medical procedures or the safety of chemicals, and adopted by business for such uses as industrial quality control . . . statistical analysis has also come to be seen in many scientific disciplines as indispensable for drawing reliable conclusions from empirical results. . . . Not since the invention of calculus, if ever, has a new field of mathematics found so extensive a domain of application.” Statistics offers the route out of mental slavery.
The mournful cry of psychology majors who have to take statistics is, “Oh, no, it’s math!” Statistics is indeed mathematics, but it is very different from other areas of math.
Sophisticated mathematics is extremely ancient. An ancient parchment showing the work of Archimedes—ironically, overwritten with Christian prayers and thus surviving the cultural frontal lobotomy of the Dark Ages only because it was turned into a prayer book—revealed that Archimedes was on the verge of inventing calculus 1,900 years before its eventual invention. For all intents and purposes, statistics was invented yesterday, in the last decade of the nineteenth century. Why so late? Statistics is the study of randomness by means of probability, and randomness is a difficult concept to cope with for humans, who evolved to perceive meaning and order in their environment. Probabilistic thinking is an inherently new mode of cognition, requiring a worldview that allows for randomness. Statistics as a field of inquiry depends on different reasoning than mathematics, which is a purely logical enterprise.
As Baranoff, Brockett, and Kahane expressed in a recent business statistics text: “It’s hard for us to imagine that only a few centuries ago people did not believe even in the existence of chance occurrences or random events or in accidents, much less explore any method of quantifying seemingly chance events. Up until very recently, people have believed that God controlled every minute detail of the universe. This belief rules out any kind of conceptualization of chance as a regular or predictable phenomenon.”
That belief would also rule out experimentation and thus science. Without the concept of randomness, it is impossible to do an experiment, because experimental proof comes down to seeing how the results of a procedure differ from what randomness would produce. If one believes the outcome of any procedure is due to the will of God, there is no point in experimentation. Indeed, that is why the concept of science is so foreign to the vast majority of human cultures and history. After all, quoting Baranoff, Brockett, and Kahane again, “If all things are believed to be governed by an omnipotent god, then regularity is not to be trusted, perhaps it can even be considered deceptive, and variation is irrelevant and illusive, being merely reflective of God’s will.”
Mathematically, the capacity to deal with randomness was beyond Ionian pre-science. In ancient Greece, a high point of civilization not reached again for about 1,700 years, uncertainty meant not knowing what the gods intended to do. The ancient Greeks could understand randomness or luck as the whims of petty gods, or even chance events as too trivial for gods to worry about, but the notion that random events could be quantified was a mode of thought impossible for them. For example, the ancient Greek philosopher Diodorus Cronus argued that if something did not happen, it had in effect never been possible!
An evolved cognitive fallacy is the attributing of meaning and significance to coincidences. The concept of randomness was alien to ancient thought. Indeed, it remains alien to all but a tiny strand of recent Western thought. Replacing providence with random chance was necessary for science and progress.
What chance did any study of probability have after the rise of Christianity, when mathematics itself was suspect? St. Augustine warned, “The good Christian should beware of mathematicians, and all those who make empty prophecies. The danger already exists that the mathematicians have made a covenant with the devil to darken the spirit and to confine man in the bonds of Hell.”
Of course, a main use of mathematics at the time was for a purpose that is still carried on by newspapers today, to their disgrace: astrology, hence Augustine’s reference to prophecy. But the result of this attitude was summarized by mathematics historian Morris Kline: “From the years 500 to 1400 there was no mathematician of note in the whole Christian world.”
As Matthew 10:29 states, “Are not two sparrows sold for a copper coin? And not one of them falls to the ground apart from your Father’s will.” This mode of thought makes science and experimentation worthless and out of the question. Indeed, theists are forced into more sophisticated theologies in order to engage in science. Statistics did not present a mathematical difficulty but rather an ideological impossibility, before the advent of modern secularism.
The Casting of Lots
Gambling is very ancient. If humans were rational, it would have led us to understand probability long ago, but the tools of gambling were never distinct from the tools of divination, and for thousands of years no one saw the mathematical basis of probability behind games of chance. Dice evolved from the multisided astragalus (ankle) bones of sheep and other domesticated animals. Indeed, these bones were used in both gambling and divination across many cultures and locations since Neolithic times, before 5000 BCE. Bones were first turned into six-sided dice more than four thousand years ago, and yet dice are still called “bones” in slang.
In addition to their frivolous use in gambling, dice were used for a far more important and noble purpose: to reveal the will of the gods. Both the ancient Greeks and Romans used dice for divination; the Roman goddess Fortuna was believed to determine the outcomes of games of chance. Today, some gamblers have another name for her: “Lady Luck.”
Events that secularists in the modern Western world attribute to chance are, everywhere else and in every other era, attributed to divine will. Thus, we still have the I Ching, tarot cards, tea leaves, and bibliomancy.
Bibliomancy, picking a selection at random from a book to communicate with God, is endemic to cultures that have holy books. (That means most of them.) Adherents of Chabad, the largest sect within Orthodox Judaism, use the writings of the Lubavitcher Rebbe, considered by them as Melech HaMoshiach (King Messiah), in this manner in order to continue “communicating” with him though he died in 1994.
Proverbs 16:33 says: “We may throw the dice, but the Lord determines how they fall.” The casting of lots to reveal God’s will occurs over seventy times in the Old and New Testaments. “Lots” is sometimes translated as “dice” and must have referred to something at least similar. To the extent that one believes God wrote the Bible, one is forced to conclude that God approved of this method for learning his will.
By the way, Deuteronomy 25:19 mysteriously commands that “Amalek” be wiped out. Medieval sages, whose writings are taken as the word of God by Orthodox Jews, explain why. As put by a rabbi in an online sermon: “Amalek believed in randomness. . . . This is why they must be obliterated from the face of the earth.”
Gambling, Drinking, and Sex
I promised gambling, drinking, and sex; let us begin with gambling. Games using dice went on for thousands of years without anyone seeing the mathematical basis of probability behind them. The reason, as we’ve seen, is that there was no concept of randomness; the outcome of all events was seen as predetermined by God.
This default mode of human thought is diametrically opposed to scientific thinking. So it is quite possible that if humans lacked an evolved predilection toward gambling addiction, there would be no science. The study of probability as a serious area of mathematics began only because of gambling. But it was not until the very end of the Renaissance that mathematicians even turned their attention to gambling games. The Chevalier de Méré (an intellectual, writer, and self-promoted “nobleman”) was a big-time dice gambler; he began to dabble in the mathematics behind the games hoping to find an “edge.”
The game that wound up costing de Méré dearly was the following: he would roll two dice twenty-four times and bet even money on getting at least one double six. He figured the following: there was a one-in-six chance that one die would be a six and a one-in-six chance that the other would be a six, so there was a one-in-thirty-six chance that both would come up six. Since he was rolling the dice twenty-four times, the chances of winning should be twenty-four out of thirty-six, or two out of three, a sucker bet for his opponents. But it was de Méré who was the sucker. What could have gone wrong? In 1654, he turned for help to one of the greatest mathematical minds of history, Blaise Pascal. Pascal quickly spotted de Méré’s error. Warming to the subject, he consulted with another math genius, Pierre de Fermat. The formal study of probability was born!
What went wrong for de Méré? He was right that the chance of getting a double six when tossing two dice is one in thirty-six, and he was right that on average one should see two double sixes for every three games played. But an expected count is not a probability of winning: he neglected that some twenty-four-roll games produce more than one double six yet result in only one win. The probability of not rolling a double six in twenty-four rolls is (35/36)^24, or about 50.86 percent. Thus, his chance of winning was only 49.14 percent. Inadvertently, he had been giving his opponents a slight edge; probability always wins in the end, and that slight edge was enough to cause him large losses over time.
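De Méré’s arithmetic is easy to check by machine. Here is a minimal Python sketch: the exact figure, plus a Monte Carlo spot-check (the random seed and number of simulated games are arbitrary choices, not anything from the historical episode):

```python
import random

# Exact calculation: a game is lost only when none of the
# twenty-four rolls of two dice produces a double six.
p_lose = (35 / 36) ** 24        # about 0.5086
p_win = 1 - p_lose              # about 0.4914
print(f"Exact probability of winning: {p_win:.4f}")

# Monte Carlo spot-check: play many simulated games.
random.seed(0)
games = 100_000
wins = 0
for _ in range(games):
    for _ in range(24):
        if random.randint(1, 6) == 6 and random.randint(1, 6) == 6:
            wins += 1
            break  # one double six wins the game; extras add nothing
    # a game with no double six in 24 rolls is simply a loss
print(f"Simulated win rate: {wins / games:.4f}")
```

The `break` statement is exactly de Méré’s blind spot in code: a second or third double six inside the same game does not produce a second win.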
This is exactly how casinos work; the bank has a slight advantage, and in the long run that’s enough to make them money-printing operations. For example, in roulette the house has a 5.26 percent advantage. Card-counters are banned in blackjack because they can have the advantage over the bank. A casino will never tolerate that. Note that if only a few people had only a bit of extrasensory perception (ESP), Las Vegas would be back to the desert it was before Bugsy Siegel got there. Casinos are multibillion-dollar experiments that demonstrate conclusively that ESP is absolute bunk.
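The roulette figure quoted above can be derived in a couple of lines. American roulette has 38 pockets (1–36 plus 0 and 00), and an even-money bet, say on red, wins on 18 of them; a quick sketch:

```python
# American roulette: 38 pockets, an even-money bet wins on 18 of them.
p_win, p_lose = 18 / 38, 20 / 38

# Expected value per $1 staked; the house keeps the difference.
ev = p_win * 1 + p_lose * (-1)
print(f"Player EV per $1 staked: {ev:.4f}")
print(f"House edge: {-ev:.2%}")
```

The edge comes entirely from the two green pockets: the payout is priced as if there were 36 pockets, but the ball has 38 places to land.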
The contributions of Pascal and Fermat have been termed in the Baranoff, Brockett, and Kahane textbook “a very distinct departure from the previous conceptualization of uncertainty that had all events controlled by God with no humanly discernable pattern. In the Pascal-Fermat framework, prediction became a matter of counting that could be done by anyone.”
Time for a drink. Perhaps the greatest single achievement of statistics, the ability to make decisions about populations from small samples, was achieved in the quest to make better beer. The Guinness Brewery was among the first to ensure quality control and consistency in its products; it hired a young chemist named William Sealy Gosset (1876–1937) in 1899. He was required to make decisions about the quality of huge batches of barley, malt, and hops when he was able to test only small samples. Gosset proved up to this task, which before the dawn of the twentieth century would have been deemed logically impossible. In 1908, under the pseudonym “Student,” Gosset published the t-distribution and t-test, methods still taught and used today.
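Gosset’s t statistic can be computed with nothing beyond the Python standard library. Below is a minimal sketch; the sample values and the hypothesized mean are made up for illustration, not Guinness data:

```python
import math
import statistics

# Hypothetical quality measurements from five small samples of a batch.
sample = [4.2, 4.5, 3.9, 4.4, 4.1]
mu0 = 4.0  # the batch mean claimed under the null hypothesis

n = len(sample)
mean = statistics.mean(sample)
s = statistics.stdev(sample)          # sample std. dev., n - 1 denominator

# Student's t: distance of the sample mean from mu0,
# measured in estimated standard errors.
t = (mean - mu0) / (s / math.sqrt(n))
print(f"t = {t:.3f} on {n - 1} degrees of freedom")
```

Compared against the t-distribution with n − 1 degrees of freedom, this statistic says how surprising the sample would be if the batch really had mean mu0, which is exactly the small-sample decision Gosset needed to make.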
Sir Ronald Fisher (1890–1962), one of the founders of the modern evolutionary synthesis, solved the so-called “blending problem” by combining Darwinian evolution with Mendelian genetics, producing such innovations in statistics as the analysis of variance (ANOVA), still the basic statistical procedure employed in this area. Fisher also contributed to the Wright-Fisher statistical model of genetic reproduction. Sir Francis Galton (1822–1911), the father of statistics, invented the concept of correlation to study inherited variation, vital to the theory of a rather famous cousin of his, Charles Darwin. Sex, of course, is one of nature’s ways of producing offspring with variability, the mechanism for recombining genes in each generation.
The pioneers of statistics, working in the last decade of the nineteenth century and the first decades of the twentieth, inhabited the United States or England, societies that were the freest ever until that time and still freer than 90 percent of the world today. That is no coincidence.
Gambling, drinking, and sex—there would be no science without them!
The Patterns of Randomness
Statistics studies the patterns produced by randomness and measures uncertainty. G. A. Barnard characterized the central idea of Gosset’s and Fisher’s work as being that “uncertainty may be capable of precise quantitative assessment.”
Flip a coin; whether it comes up heads or tails is random. If you continue flipping the coin, in the long run we expect heads half the time and tails half the time. If you flip, say, thirty coins at a time, repeat this, say, five thousand times, and record how many heads occur each time, you will wind up with (in addition to carpal tunnel syndrome) what is for practical purposes a normal distribution, the classic bell curve. Anything that depends on the sum of many small, independent random factors will end up approximately normally distributed. Every normal distribution is identical in the percentages of the population found within a given number of standard deviations of the mean.
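That coin-flipping experiment is far easier to run in software than by hand. A minimal Python sketch using the numbers above, thirty coins per trial and five thousand trials (the seed and the histogram scaling are arbitrary):

```python
import random
from collections import Counter

random.seed(1)
coins, trials = 30, 5000

# Number of heads in each trial of thirty fair coin flips.
head_counts = [sum(random.random() < 0.5 for _ in range(coins))
               for _ in range(trials)]

# A crude text histogram: the counts pile up in a bell shape around 15.
hist = Counter(head_counts)
for heads in range(coins + 1):
    bar = "#" * (hist.get(heads, 0) // 20)
    print(f"{heads:2d} {bar}")
```

No single flip is predictable, yet the shape of five thousand trials is, which is Galton’s “unsuspected and most beautiful form of regularity” in miniature.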
Galton wrote of the normal distribution: “I know of scarcely anything so apt to impress the imagination as the wonderful form of cosmic order expressed by the ‘Law of Frequency of Error.’ The law would have been personified by the Greeks and deified, if they had known of it. . . . The huger the mob, and the greater the apparent anarchy, the more perfect is its sway. It is the supreme law of Unreason. Whenever a large sample of chaotic elements are taken in hand and marshaled in the order of their magnitude, an unsuspected and most beautiful form of regularity proves to have been latent all along.”
Randomness is the law of the universe. Richard Dawkins stated with his typical bluntness, “The universe we observe has precisely the properties we should expect if there is, at bottom, no design, no purpose, no evil, no good, nothing but blind, pitiless indifference.” This seems indisputable. It is not by itself proof there is no God or gods, but it is clear that if God exists, he/she/they is/are playing dice with the universe. In fact, randomness is the essential building mechanism of the universe we know. As Taner Edis writes: “Indeed, physicists today have become accustomed to thinking of physical order as inseparable from disorder—from sheer randomness. For example, quantum mechanics, the most fundamental description of nature we have, allows us to calculate probability distributions. . . . In modern science, randomness is not just a quirk confined to fundamental physics. It is in fact the best source of novelty—the raw material for the creativity we find in Darwinian evolution and human brains. . . . The patterns we observe in nature, by which we find intelligibility in the world, are often the direct consequence of underlying disorder.”
Chance is blind and random and thus entirely dependable in the long run. That is the secret of statistics. Faced with randomness, it (seemingly paradoxically) uses chaos to make marvelous sense out of our universe—as only statistics can do!
Baranoff, E., Brockett, P., Kahane, Y. 2009. Risk Management for Enterprises and Individuals. Flat World Knowledge.
Barnard, G. A. 1963. “Fisher’s Contribution to Mathematical Statistics.” Journal of the Royal Statistical Society 126.
Dawkins, Richard. 1996. River Out of Eden. New York: Basic Books.
Edis, Taner. 2010. “Is the Universe Rational?” Free Inquiry, February/March.
Kline, M. 1953. Mathematics in Western Culture. New York: Oxford University Press.
Newman, J. R., ed. 1956. The World of Mathematics. New York: Simon and Schuster.
Porter, T. 1986. The Rise of Statistical Thinking, 1820–1900. Princeton, N.J.: Princeton University Press.
Alexander Nussbaum is adjunct full professor in the Department of Psychology at St. John’s University and on the board of advisers of Analytic MedTek Consultants LLC. He is a contributor to a textbook and has published two previous articles in Free Inquiry, as well as articles in other publications.