Is Loss of Faith a Two-Generation Process?

Tom Rees

In February, the Pew Forum on Religion and Public Life published a report showing that although American youth have lost the church-going habits of their parents, they retain strong religious beliefs. In other words, they believe in God but don’t belong to a church—a pattern long associated with European society. Meanwhile in Europe, we’ve learned that the late Pope John Paul II used to spend his summer vacations whipping himself with a belt. Perhaps you don’t see the connection? Not only are they linked, but the factor that connects them may well provide us with a crystal ball into the future of religion in the U.S.A.

Let’s start with the case of the self-flagellating pope. The revelations about his semi-secret life come from a new book by Monsignor Slawomir Oder, the Vatican official spearheading the campaign for John Paul II’s beatification. In addition to the beatings, the pope frequently denied himself food and slept on the bare floor rather than his bed.

Religion watchers know that this sort of behavior is not uncommon among the faithful. Indeed, more extreme examples abound. In the Philippines, men undergo ritual crucifixion during the Easter festival, allowing themselves to be literally nailed to wooden crosses. On the day of Ashura, devout Shia Muslims whip themselves with chains and cut their skin with knives. During the festival of Thaipusam, Hindu devotees carry ritual burdens (kavadi) and endure skin and tongue piercing, even placing hooks through their skin and attaching lines to tractors to pull them. Mahayana Buddhist monks have been known to self-immolate. Shamans the world over engage in any number of painful “body modifications.”

While conventional explanations focus on how these acts of self-sacrifice help the individual reach a higher plane of devotion, a new wave of anthropologists and social psychologists are exploring ways in which evolutionary theory might help us to understand them in terms of social interactions. After all, the factor that really distinguishes religion from ordinary superstitions is that it is a social endeavor. What’s more, while dramatic acts of self-harm are pretty rare, all religions require their adherents to take part in onerous activities. Going to services, avoiding taboo foods, self-denial, wearing signs and symbols—the list goes on. Perhaps the fact that all religions require their members to expend time and effort on rituals is linked in some perverse way to their popularity. That could be the case if the purpose of these rituals is not to change the individual but rather to send a signal to other people. But a signal of what, exactly? Well, here is where it gets controversial.

One idea is that these ritual acts send a signal of how strong one’s beliefs are. Only a true believer, so the argument goes, would bother going to church every week. Since true believers have genuine fear of supernatural punishment, you can trust them to be honest and straight in their dealings. This idea, termed costly signaling, has its roots in the psychology of gang membership. Potential members who really want to belong will be prepared to undergo a painful or unpleasant initiation ceremony, whereas potential free riders will decide that it’s not worth the trouble. There is some evidence that religious rituals and rules really do work this way. For example, Richard Sosis, an anthropologist at the University of Connecticut, has studied American religious communes of the nineteenth century and found that the longest lived were those that imposed the greatest restrictions and burdens on their members.

You’ve probably spotted the problem here. If I can get you to treat me more favorably simply by engaging in religious rituals (going to church, for example), then that’s exactly what I’ll do. In other words, I’ll fake it. In a recent critique, the philosopher Michael Murray laid out in devastating detail the problems with using costly signaling to explain religious rituals, and fakery is the most fundamental one. Whichever way you slice it, if people see some benefit in faking religion, then some of them will, and that fatally undermines the idea that participation in religious rituals is reliable evidence of religious belief.

So if religious acts don’t signal good intent, what do they signal? The answer may well lie in how we learn new facts. Many of our learning experiences present us with the problem of deciding what is true—hence worth incorporating into our worldview—and what is not. I might tell you that there are invisible germs on your hands that cause disease, but without recourse to a microscope I would have difficulty proving it.

Transmitting religious beliefs faces a similar hurdle. I may say that God exists, but with no tangible evidence to prove it, how do you know whether to believe me? Perhaps you might start by looking to see if I walk the walk as well as talk the talk. Do I behave in ways that accord with my beliefs? If I wash my hands before eating, then you might be inclined to think that I really believe all this stuff about germs, and so you might also be persuaded that they exist. Maybe religious rituals perform a similar function. If I tell you that you’ll get a reward in heaven if you believe in my god, well, you might be skeptical. But if I do things that make sense only if I’m a true believer myself, then the credibility barrier is lowered.

Acts that increase the credibility of an individual’s beliefs rejoice in the label “credibility enhancing displays” (CREDs for short) and, though the idea sounds superficially similar to costly signaling, it differs in one crucial way: there’s nothing in it for me if I persuade you to believe me. As a result, there’s no incentive for me to cheat. The only reason for me to do these things is if I genuinely believe that they will benefit me in some way. If you’re smart, you’ll pick up on this and start to believe the same things.

Joe Henrich, an anthropologist at the Centre for Human Evolution, Cognition, and Culture at the University of British Columbia who has studied the CRED phenomenon, thinks that it can explain why seemingly useless ideas can take hold. Using a model of how ideas are transmitted, he found that a belief that carries no tangible benefit but only a cost can out-compete a cost-free belief so long as the cost is linked to a CRED. Henrich’s theory gives a radical perspective on why the pope and other religious self-harmers do what they do. It’s because they have each been infected by a particularly virulent meme—one that subverts the way we learn from others and uses it to self-propagate ruthlessly, despite the damage to its host.
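The transmission model described above can be caricatured in a few lines of code. The sketch below is an illustrative toy, not the published model: the parameter values, the replicator-style update rule, and the assumption that cost and CRED combine multiplicatively are all my own simplifications. The point it demonstrates is the qualitative one from the text: a belief that imposes a cost on its holders can still out-compete a cost-free rival, provided its credibility-enhancing display boosts transmission by more than the cost subtracts.

```python
def cred_dynamics(p0=0.05, cost=0.1, cred_boost=0.3, generations=200):
    """Toy model of CRED-biased cultural transmission.

    Two beliefs compete in a large population:
      - a costly belief: holders pay `cost` (reducing their weight as
        cultural models), but their CRED multiplies the chance that
        observers adopt the belief by (1 + cred_boost);
      - a cost-free belief with neither cost nor CRED (weight 1.0).

    Each generation, learners adopt beliefs in proportion to the
    transmission weight of the models holding them.  The costly belief
    spreads whenever (1 - cost) * (1 + cred_boost) > 1.

    Returns the frequency of the costly belief at each generation,
    starting from an initial frequency `p0`.
    """
    w_costly = (1 - cost) * (1 + cred_boost)  # net cultural "fitness"
    w_free = 1.0
    p = p0
    history = [p]
    for _ in range(generations):
        # Replicator-style update: share of adopters is proportional
        # to frequency times transmission weight.
        p = p * w_costly / (p * w_costly + (1 - p) * w_free)
        history.append(p)
    return history

trajectory = cred_dynamics()
# With these illustrative numbers, (1 - 0.1) * (1 + 0.3) = 1.17 > 1,
# so the costly-but-displayed belief rises from 5% toward fixation.
```

Note the design choice: the cost alone would drive the belief extinct, and the CRED alone would be a free win. It is the combination, cost paid publicly as a display, that lets a burdensome belief spread, which is exactly the perverse link between onerous ritual and popularity suggested earlier.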

CREDs aren’t just relevant to extreme varieties of religious belief. We also use CREDs to decide whether to adopt more conventional doctrines, and this has important implications for the future of religion. Take Sweden, one of the most godless nations on Earth. Recent work by Jonathan Lanman, an anthropologist at the University of Oxford in the United Kingdom, has described Sweden’s path to secularization as a two-stage process. Improved living standards and social security in the postwar period combined to reduce the importance of church-going in the lives of Swedes. Members of this generation retained their religious beliefs but simply found they had better things to do on a Sunday morning. They believed but did not belong.

Then came the problem of passing on their beliefs to their children. They told their children all about their god and how important their beliefs were to them. But something was missing. With no tangible evidence for the existence of God, each new generation looks to CREDs to see whether they should accept what they are being told, and church-going is a classic CRED. If I simply tell you that an invisible being exists who wants you to go to church, you might not take me too seriously. However, if you see me going to church regularly, you might be more receptive to the idea. Once most members of Sweden’s older generations abandoned church-going, their efforts to pass on their beliefs to the younger generation were fatally undermined.

Could something similar be happening in the United States today? According to the Pew Forum, fewer than 20 percent of adults under thirty now attend church regularly. They still believe, of course, just as Sweden’s postwar generation still believed. But what of the next generation? Could it be that they will neither belong nor believe?

Predictions about the future of religion have a habit of going awry. Even so, the decline in church-going among American youth should not be lightly dismissed. Clearly, major changes in the religious landscape of America are underway, and only time will tell if the cognitive theorists have it right!

Tom Rees is a medical writer and a lifelong humanist. His blog, Epiphenom, covers the latest research into the psychology and social science of religion and nonbelief.
