Is your brain wired for science, or for bunk?


This post is by Maarten Boudry, Research Fellow in the Department of Philosophy and Moral Sciences at Ghent University. Here Maarten writes about the inspiration for his recent paper, co-authored with Stefaan Blancke and Massimo Pigliucci, 'What Makes Weird Beliefs Thrive? The Epidemiology of Pseudoscience', published in Philosophical Psychology.

Science does not just explain the way the universe is; it also explains why people continue to believe the universe is different than it is. In other words, science is now trying to explain its own failure to persuade the population at large of its truth claims. In Why Religion is Natural and Science is Not, philosopher Robert McCauley offers ample demonstrations of the truth of his book title. Many scientific theories run roughshod over our deepest intuitions. Lewis Wolpert even remarked: 'I would almost contend that if something fits with common sense it almost certainly isn't science.' It is not so much that the universe is inimical to our deepest intuitions as that it does not care a whit about them (it is nothing personal, strictly business). And it gets worse as we go along. Newton's principle of inertia was already hard to get your head around (uniform motion continuing indefinitely?), but think about the curvature of space-time in general relativity, or the bizarre phenomena of quantum mechanics, which baffle even the scientists who spend a lifetime thinking about them. Science does not have much going for it in the way of intuitive appeal.

Bearing all that in mind, it may seem remarkable, not that so many people refuse to accept the scientific worldview, but that so many have embraced it at all. Of course, science has one thing in its favour: it works. Every time your GPS device tells you where you are, or you heat up your soup by bombarding it with invisible waves, or you blindly entrust your fate to the hands of an able surgeon, you are relying on the achievements of science. Science is culturally successful despite the fact that it clashes with deeply ingrained intuitions. By and large, people accept the epistemic authority of science—sometimes begrudgingly—because they admire its technological fruits and because deep down they know it is reasonable to defer to the expertise of more knowledgeable people. Without its technological prowess, which ultimately derives from the fact that it tracks truth, the scientific worldview would wither away. No system of beliefs could succeed in convincing so many people of so many bizarre and counterintuitive things unless the truth was on its side, at least most of the time.

We can see this if we compare science with some of its contenders: religion, superstition, ideology, and in particular pseudoscience—belief systems that actively mimic the superficial trappings of science, trying to piggyback on its cultural prestige. By definition, pseudoscience does not have truth on its side (except by a sheer stroke of luck), or else we would just call it 'science'. Because they defy reality, pseudosciences can boast of no genuine technological success. The army does not hire psychics (or so one hopes), homeopathy has only the placebo effect to count on, and creationists are marginalized in the scientific community, despite their persistent campaign for recognition.

But how do pseudoscience and other weird belief systems sustain themselves? They profit from exactly what science lacks: intuitive appeal. Almost all pseudosciences tap into the cognitive biases, intuitions and heuristics of the human mind, courtesy of evolution by natural selection. Intuitive appeal makes up for lack of truth value. Pseudosciences have even developed 'strategies' to cope with the threat of adverse evidence, and to withstand critical scrutiny. In my dissertation Here Be Dragons and in a series of papers (here and here) with Johan Braeckman, I refer to these as 'immunizing strategies' and 'epistemic defense mechanisms'. In our recent paper we have further pursued this analysis and compared the cultural dynamics of science and pseudoscience, developing what Dan Sperber called an 'epidemiology of representations'. In this new work, we show how science achieves cultural stability, despite the fact that it flies in the face of pretty much every human intuition, and how 'weird' beliefs can thrive under the false pretense of being scientific. Pseudoscience does not have truth on its side, but it does tap into our innate intuitions and biases, and it is protected by its own in-built survival kit.
