Scientific beliefs are destined to supersede primitive religious views, the 19th-century French philosopher Auguste Comte once argued. His scientific positivism birthed today’s scientism: the notion that science has exclusive access to the truth.
Proponents of this view usually equate “science” with empiricism or, in many fields, with a method of inquiry that employs controls, blinding, and randomization. Now a small group of contemporary psychologists has published a series of provocative experiments showing that faith in science can serve the same mentally stabilizing function as religious belief.
In 2013, a study published in the Journal of Experimental Social Psychology found that when subjects were stressed, they were more likely to agree with statements typifying scientism, such as “the scientific method is the only reliable path to knowledge.” When people felt anxious, they esteemed science more highly than calmer subjects did, just as previous experiments had shown to be the case with religious ideals.
Another study, led by the University of Amsterdam’s Bastiaan Rutjens in 2010, found that uncertain subjects expressed increased faith in God or in evolution, provided that evolution was presented as a structured, predictable process.
In these cases, beliefs about science may be defended emotionally, even if they are false, as long as they provide a reassuring sense of order. That is to say, beliefs about science may be defended thoughtlessly—even unscientifically.
So what does it mean that both religious and scientific outlooks may function to becalm our existential anxieties? What we believe, the parallel implies, can sometimes be less important than how we believe it. In other words, deep faith in science is sometimes just another form of (irrational) extremism.
Psychology, not theology, is at the root of extremism
The psychology of extremism clarifies the essential logic. Last year, the University of Maryland’s Arie Kruglanski detailed evidence that psychology, not theology, is at the root of extremist ideologies.
Extremist groups like ISIL offer adherents a sense of personal worth, he argued, but they also provide believers with certainties about the world that they so desperately need.
Whether studying extremists in Morocco, Northern Ireland, Palestine, the Philippines, Spain, or Sri Lanka, Kruglanski found that believers all displayed a desire for certainty and structure that was higher than average.
For extremists, Kruglanski wrote in the online journal E-International Relations, the world is one of “good versus evil, saints versus sinners, order versus chaos; a pure universe in black and white admitting no shades of gray.”
As his research shows, we all have different baseline levels of the need for closure, but our distaste for ambiguity can also be heightened by uncertainty and stress. Extremism results, in part, when our natural need for order is inflamed by disorder.
The content of extremist beliefs, beyond their reassuring certainties, is incidental. The appeal of moral relativism, which holds that there are no objective criteria for judging norms, seems to spring from this very link between extremism and the craving for certainty.
As NYU student Zachary Fine observed in The New York Times last year: “The byproducts of absolute truths and intractable forms of ideology … historically seem linked to bigotry and prejudice.” Indeed, Kruglanski’s concept of the need for closure and the very study of the intolerance of ambiguity stem from post-World War II attempts to understand Nazism.
The psychology of extremism reveals an important point: part of what makes a belief system dangerous is its dogmatic denial of uncertainty.
The moral duty to reject dogma
If the moral authority of science is rooted anywhere, it is in the opposing stance: in its acceptance of fallibility and its welcoming treatment of ambiguity and the unknown. That is where science contrasts with scientism and with many religious perspectives.
The history of “the” scientific method amounts to a series of (ongoing) attempts to prevent human bias, false certainty, and weakness from compromising the search for knowledge. It reads as a long communal struggle toward a radical self-imposed culture of self-distrust.
In his 2012 book Ignorance: How It Drives Science, Columbia neuroscientist Stuart Firestein argued, “Being a scientist requires having faith in uncertainty, finding pleasure in mystery, and learning to cultivate doubt.” Ignorance, Firestein pointed out, is not only natural but fertile.
As climate scientist Tamsin Edwards put it in an op-ed for Vice, “Uncertainty is the engine of science.”
The virtues of this moral outlook are curiosity, self-doubt, and an independent skepticism, ideals we need more of in our whirlwind era. They in no way conflict with novelist William Faulkner’s “old verities and truths of the heart, the old universal truths lacking which any story is ephemeral and doomed: love and honor and pity and pride and compassion and sacrifice.” Defending uncertainty against premature claims of authority is not a timid calling.
How we know what we know
Scientism, to be sure, does a disservice to this spirit of humility in the face of human ignorance, and as Leon Wieseltier wrote in The New York Times in January, the idea “that nonscientific understandings must be translated into scientific understandings if they are to qualify as knowledge, is increasingly popular inside and outside the university.”
I once heard a prominent researcher argue that “anything not statistically significant is truly worthless,” a staggering claim that dismissively brands Shakespeare and Chekhov as without intellectual value for students of human thought, behavior, and society.
But was fire harnessed by scientific method? Did developed countries become developed through randomized controlled trials? Many innovations obviously work or plainly don’t; testing is often for close calls. That isn’t to deny the progress of science or its unique role but only to relativize it as one precious mode of discovery among others.
Promoting science’s quieter, humbler spirit would have numerous upsides.
If the public were more comfortable with degrees of scientific uncertainty, for one, then climate change “skeptics”—those merchants of doubt, as Naomi Oreskes and Erik Conway dubbed them—wouldn’t so easily be able to conflate minor uncertainties with substantive disagreement. Science wouldn’t appear so harshly incompatible with spirituality. More fundamentally, truly respecting the scientific tradition requires us to acknowledge not only the limits of any given theory or method, but also the partiality of science as a way of knowing the world.