Intellectual Black Holes and Bullshit

Stephen Law

Even among the world’s best-educated and most scientifically literate populations, ridiculous belief systems abound. Huge numbers of people believe in such things as astrology, television psychics, crystal divination, the healing powers of magnets, and the prophecies of Nostradamus. Many suppose that the pyramids were built by aliens, that the Holocaust never happened, or that the World Trade Center was brought down by the U.S. government. A few would have us believe that Earth is ruled by a secret cabal of lizardlike aliens. Even within mainstream religions many people believe absurdities. Some imams promised seventy-two heavenly virgins to suicide bombers. Other religious authorities insist the entire universe is only a few thousand years old.

How do intelligent, college-educated people end up the willing slaves of claptrap? How, in particular, do the true believers manage to convince themselves and others that they are the rational, reasonable ones and everyone else is deluded?

Cosmologists talk about black holes, objects so gravitationally powerful that nothing, not even light, can break away from them. Unwary space travelers passing too close to a black hole will find themselves sucked in. An increasingly powerful engine is required to resist its pull, until one eventually passes the “event horizon,” after which escape becomes impossible.

My suggestion is that our contemporary cultural landscape contains, if you like, numerous intellectual black holes—belief systems constructed in such a way that unwary passers-by can find themselves similarly drawn in. While those of us lacking robust intellectual and other psychological defenses are most easily trapped, we’re all potentially vulnerable. If you find yourself encountering a belief system in which mechanisms like those described below feature prominently, be wary. Alarm bells should be going off and warning lights flashing, for you may be approaching the event horizon of an intellectual black hole.

Fake Reasonableness

Note that the mere fact that a set of beliefs is attractive doesn’t make it an intellectual black hole. Take a set of beliefs about water, such as that it freezes at 0 degrees centigrade and boils at 100 degrees. People are powerfully wedded to these beliefs because they are genuinely reasonable. The seductive draw of the beliefs that lie at the heart of an intellectual black hole, by contrast, has nothing to do with whether or not they’re reasonable or true. To those inside, the core beliefs may appear reasonable. But the appearance is a façade—a product of the belief system’s ability to disable the truth-detecting power of reason and get its victims to instead embrace habits of thought that are deceptive and unreliable.

The Dangers Posed by Intellectual Black Holes

Why worry about intellectual black holes? What does it matter if some people happen to believe absurd things? No doubt some intellectual black holes exist without causing any great harm. Others, however, are dangerous. The hazards posed by an extreme cult, such as that of the Reverend Jim Jones (which ended in the mass suicide of his followers), are abundantly clear. Once our minds have been captured by such a belief system, we become vulnerable to the wiles of those who control it. Victims have even been led to commit terrorist attacks.

There are less dramatic but still serious dangers. Every year, millions of dollars are spent on alternative medicines that, in many cases, just don’t work. Not only are these medicines ineffective, but people relying on them may expose themselves to serious risk as a result. For example, people may die as a consequence of relying on homeopathic remedies rather than conventional antimalarial medication to protect them against malaria. The belief that homeopathy protects against malaria, or indeed that it has any genuine medicinal effect at all, is not supported by the evidence.

Each year, vast sums are also spent on astrologers, psychics, and others claiming extraordinary powers. Vulnerable people waste both cash and emotional energy seeking out reassurances about lost loved ones that are, in reality, bogus.

So intellectual black holes allow people to be taken advantage of financially. Indeed, they are big business. But victims can, of course, be taken advantage of in other ways, too. Intellectual black holes can lead people to waste their lives. In some cases, true believers may be led to abandon friends and family and throw away real opportunities, all for the sake of furthering their belief system’s hypnotically attractive, if bogus, cause.

On Religion

Examples of intellectual black holes include Young Earth creationism and Christian Science. However, I should emphasize that I am not suggesting that every religious belief system is an intellectual black hole or that every person of faith is a victim. Nor am I arguing that the religious beliefs in question are false, or that they could not be given a proper, robust defense. Just because some religious people choose to defend what they believe by dubious means doesn’t mean that no one can reasonably hold those same beliefs.

On Bullshit

So, to be clear: when I describe an intellectual black hole as a bullshit belief system, it is not the content I’m suggesting is bullshit but the manner in which its core beliefs are defended and promoted.

According to philosopher Harry Frankfurt, whose essay On Bullshit has become a minor philosophical classic, bullshit involves a kind of fakery. A bullshitter, says Frankfurt, is not the same thing as a liar. The bullshitter does not knowingly tell a fib, does not assert something he or she knows to be false. Rather, he or she just says things to suit his or her purposes—to get away with something—without any care as to whether those things are true.

I don’t entirely agree with Frankfurt’s analysis. His definition, it seems to me, is in at least one respect too narrow. People regularly talk about astrology, feng shui, Christian Science, the latest self-help fad, and so on as being bullshit, and their practitioners as bullshit artists, even while acknowledging that those who promote these beliefs typically do so in all sincerity. Not only do the practitioners believe what they say, it matters to them that what they say is true.

What nevertheless marks out practitioners of astrology, feng shui, and Christian Science as bullshit artists, I’d suggest, is the kind of faux reasonableness that they manage to generate—the pseudoscientific gloss that they are able to apply to their core beliefs. They create the illusion that what they believe is reasonable while not themselves recognizing that it’s only an illusion. They typically manage to fool not only others but themselves, too.

On Stupidity

Victims of intellectual black holes need be neither dim nor foolish. Those inside them are often smart. Nor need those who fall afoul of intellectual black holes be generally gullible. Victims may in other areas of their lives be models of caution, subjecting claims to close critical scrutiny, weighing evidence scrupulously, and tailoring their beliefs to robust rational standards. It is just that they are able, as it were, to compartmentalize, suspending these critical habits when it comes to the belief system that has ensnared them.

So if you begin to suspect that you yourself may have fallen into an intellectual black hole, there’s no need to feel foolish. People far cleverer than either you or me have fallen victim.

This article was adapted from Stephen Law, Believing Bullshit: How Not to Get Sucked into an Intellectual Black Hole (Amherst, N.Y.: Prometheus Books, 2011); www.prometheusbooks.com. Copyright ©2011 by Stephen Law. Used with permission of the author and publisher.

Further Reading

  • Justin Barrett, Why Would Anyone Believe in God? (Lanham, Md.: AltaMira Press, 2004).
  • Harry Frankfurt, On Bullshit (Princeton: Princeton University Press, 2005).
  • Daniel M. Wegner, The Illusion of Conscious Will (Cambridge, Mass.: MIT Press, 2002).

Why Do We Believe What We Do?

Why is belief in supernatural beings—such as ghosts, angels, dead ancestors, and gods—so widespread? Belief in such supernatural agents appears to be a near-universal feature of human societies. There is some evidence that a predisposition toward beliefs of this kind may actually be innate—part of our natural, evolutionary heritage. The psychologist Justin Barrett has suggested that the prevalence of beliefs of this kind may in part be explained by our possessing a hypersensitive agency detection device, or HADD.

Human beings explain features of the world around them in two very different ways. For example, we sometimes appeal to natural causes or laws in order to account for an event. Why did that apple fall from the tree? Because the wind blew and shook the branch, causing the apple to fall. Why did the water freeze in the pipes last night? Because the temperature of the water fell below zero, and it is a law that water freezes below zero.

However, we also arrive at explanations by appealing to agents—beings who act on the basis of their beliefs and desires in a more or less rational way. Why did the apple fall from the tree? Because Ted wanted to eat it, believed that shaking the tree would make it fall, and so shook the tree. Why are Mary’s car keys on the mantelpiece? Because she wanted to remind herself not to forget them and so put them where she thought she would spot them.

Barrett suggests that we have evolved to be overly sensitive to agency. We evolved in an environment containing many agents—family members, friends, rivals, predators, prey, and so on. Spotting and understanding other agents helps us survive and reproduce. So we evolved to be sensitive to them—oversensitive, in fact. Hear a rustle in the bushes behind you, and you instinctively spin round, looking for an agent. Most times, there’s no one there—just the wind in the leaves. But, in the environment in which we evolved, on those few occasions when there is an agent present, detecting it might well save your life. Far better to avoid several imaginary predators than be eaten by a real one. Thus evolution will select for an inheritable tendency to not just detect—but overdetect—agency. We have evolved to possess (or, perhaps more plausibly, to be) hyperactive agency detectors.

If we do have an HADD, that would at least partly explain the human tendency to feel that there is “someone there” even when no one can be observed, and so may help explain our tendency to believe in the existence of invisible agents—in spirits, ghosts, angels, or gods.

For example, in his book The Illusion of Conscious Will, Daniel Wegner points out what he believes is the most remarkable characteristic of those using a Ouija board (in which the planchette—often an upturned shot glass—on which the subjects’ index fingers are gently resting appears to wander independently around the board, spelling out messages from “beyond”): “People using the board seem irresistibly drawn to the conclusion that some sort of unseen agent . . . is guiding the planchette movement. Not only is there a breakdown in the perception of one’s own contribution to the talking board effect but a theory immediately arises to account for this breakdown: the theory of outside agency. In addition to spirits of the dead, people seem willing at times to adduce the influence of demons, angels, and even entities from the future or from outer space, depending on their personal contact with cultural theories about such effects.”

Because the movement of the planchette is inexplicable and odd, it is immediately put down to the influence of an invisible agent (though notice the kind of agent invoked varies from group to group, depending on the members’ own particular, culturally led expectations).

Note that the HADD hypothesis does not say that there are no invisible agents. Perhaps at least some of the invisible agents that people suppose exist are real. Perhaps there really are ghosts, or spirits, or gods. However, if we suppose the HADD hypothesis does correctly explain why so many people believe in the existence of invisible agents, then the fact that large numbers hold such beliefs can no longer be considered good evidence that any such agents exist. It will no longer do to say, “Surely not all these people can be so very deluded? Surely there must be some truth to these beliefs, otherwise they would not be so widespread?” The fact is, if the HADD hypothesis is correct, we’re likely to believe in the existence of such invisible agents anyway, whether or not such agents exist. But then the commonality of these beliefs is not good evidence such agents exist.

There was already good reason to be skeptical about appeals to popular support when it comes to justifying belief in invisible agents, as well as many other beliefs of a religious or supernatural character. The fact that around 45 percent of the citizens of one of the richest and best-educated populations on the planet believe that the entire universe is only about six thousand years old is testament to the fact that, whatever else may be said about religion, it undoubtedly possesses a quite astonishing power to get large numbers of people—even smart, college-educated people—to believe downright ridiculous things. Nevertheless, if the HADD hypothesis is correct, it adds yet another nail to the coffin lid of the suggestion that “Lots of people believe it, so there’s got to be something to it!”

Another psychological theory is the theory of cognitive dissonance. Dissonance is the psychological discomfort that we feel when we hold beliefs or attitudes that conflict. The theory says that we are motivated to reduce dissonance by either adjusting our beliefs and attitudes or rationalizing them.

The “sour grapes” of Aesop’s fable The Fox and the Grapes is often used to illustrate cognitive dissonance. The fox desires those juicy-looking grapes, but then, when he realizes he will never obtain them, he adjusts his belief accordingly to make himself feel better—he decides that the grapes are sour.

What role might the theory of cognitive dissonance play in explaining why we are drawn to using belief-immunizing strategies? Suppose, for the sake of argument, that our evolutionary history has predisposed us both toward belief in supernatural agents and toward forming beliefs that are, broadly speaking, rational, or at the very least not downright irrational. That might put us in a psychological bind. On the one hand, we may find ourselves unwilling or even unable to give up our belief in certain invisible agents. On the other hand, we may find ourselves confronted by overwhelming evidence that what we believe is downright unreasonable. Under these circumstances, strategies promising to disarm rational threats and give our beliefs at least the illusion of reasonableness are likely to seem increasingly attractive. Such strategies can provide us with a way of dealing with the intellectual discomfort such innate tendencies might otherwise produce. They allow true believers to reassure themselves that they are not being nearly as irrational as reason might otherwise suggest—to convince themselves and others that their belief in ghosts or spirits or whatever, even if not well-confirmed, is at least not contrary to reason.

So we can speculate about why certain belief systems are attractive and also why such strategies are employed to immunize them against rational criticism and provide a veneer of “reasonableness.” Both the HADD hypothesis and the theory of cognitive dissonance may have a role to play.

Stephen Law

Stephen Law is a lecturer in philosophy at Heythrop College, University of London.

He is the editor of Think, the author of numerous articles and books, and a debater of apologists and theologians. He has a blog at www.stephenlaw.org.
