
Conspiracy Beliefs, Democracy, and Confabulation

As part of a British Academy project on Conspiratorial Ideation and Pathological Beliefs, Ema Sullivan-Bissett (University of Birmingham) and Anna Ichino (University of Milan) organised a workshop in Birmingham with speakers from philosophy, psychology, and psychiatry. In this post, I summarise the workshop talks on day one, 24th April 2023.


Workshop poster


Psychologist Karen Douglas kicked off the workshop with a talk on the psychology of conspiracy theories, asking why people believe conspiracy theories and what the consequences of holding such beliefs are. Douglas started with a psychologist's definition of a conspiracy theory: "A belief that two or more actors have coordinated in secret to achieve an outcome. It is a conspiracy that the public should know about." For Douglas, conspiracy beliefs respond to three types of needs:

  1. Epistemic needs: finding meaning and explanation, addressing uncertainty, seeing patterns, wanting closure. People who are more likely to endorse conspiracy theories have lower levels of analytic thinking, lower levels of education, and tend to perceive agency and intentionality where there is none. Intelligence does not seem to be related one way or the other to the tendency to believe conspiracy theories.
  2. Existential needs: placating anxiety, feeling powerless, lacking socio-political control, having an avoidant coping style. People who are anxious and feel less in control of their lives are more likely to believe in conspiracies.
  3. Social needs: wanting to belong and to maintain an image of ourselves and our group as moral and competent. Often this happens by blaming other people or other groups when things do not go well. People who are ostracised, disadvantaged, or excluded are attracted to conspiracy theories too. People also want to feel that they are unique and special while everybody else is in the dark.

Karen Douglas


But are conspiracy theories helpful or harmful? They do not satisfy the epistemic or existential needs people have: people still feel uncertain and do not behave prosocially in the domain the conspiracy theory concerns; moreover, they continue to feel anxious and powerless, and fail to engage in mainstream political action while tending towards extremist views. The same is true for social needs: people who believe in conspiracy theories are more likely to engage in minor crimes and violent protests, are less likely to trust others, and are more prejudiced against other groups.

New research also suggests that believing a conspiracy theory makes people less attractive in interpersonal relationships: they appear less intelligent, sociable, honest, and trustworthy.


Kathleen Murphy-Hollies


Next, philosopher Kathleen Murphy-Hollies argued that conspiracist claims can be understood as confabulations. Agents confabulate when they give reasons for a choice, action, or belief of theirs but get something wrong about the world, failing to capture other factors which were efficacious in bringing that choice or action about. Confabulators do not intend to deceive others and believe the account they give. Confabulations fill a cognitive gap where a more accurate explanation would be. Kathleen argued that confabulations are mostly justificatory.

Kathleen suggested that, in a similar way, when conspiracy theorists give reasons for thinking that certain conspiracies are true, they are providing reasons which justify their position and fill a cognitive gap around the other factors which are actually driving their conspiracist claims (seeking patterns and meaning, a high need for cognitive closure, feeling powerless, etc.). This explains a number of bizarre features of conspiracy theorists found in empirical studies, such as: (i) conspiracy theorists tend to eagerly elaborate on their claims of conspiracy, just as confabulators engage in secondary confabulation when put under pressure by others; (ii) in these elaborations, conspiracy theorists tend to posit more conspiracy and deception, as part of an over-arching monological belief system which is guided by higher-order beliefs about deception.

Confabulations, in the same way, are claims guided by higher-order beliefs with an over-arching theme: that of understanding oneself and the world around oneself. Finally, (iii) conspiracist claims made by the same agent can often be self-contradictory. For example, a conspiracy theorist may believe both that Covid-19 was a complete hoax and that it was a bio-chemical weapon released by China. Understanding these claims as confabulations helps here too, because although they are contradictory when taken as explanations of the world, they are not so when taken as justifications. Confabulations are not just explanations but justifications, and these claims are perfectly consistent when taken as parts of higher-order overarching beliefs focused on the theme of justifying that the agent understands herself and the world around her.

Kathleen discussed a few upshots of this. It makes conspiracist claims a little less bizarre and othering, and a little more understandable. It shifts the focus for interacting with conspiracy theorists away from trying to set the facts straight about how the world works, and instead towards talking about the values and justifications being invoked in conspiracist claims. Finally, with regard to pathology, both options are left open, as is the case with confabulation. The cognitive gap which confabulation fills might be due to pathology (such as dementia or anosognosia), or due to everyday cognitive limitations (such as having implicit bias, or knee-jerk intuitive thinking).


Stephan Lewandowsky


Psychologist Stephan Lewandowsky talked about conspiracy theories and democracy. He started by differentiating between rhetoric and belief: people sometimes deploy the rhetoric of their political party to dismiss scientific views, but there are also people who genuinely endorse conspiratorial alternatives. Moreover, mistrusting one governmental source means mistrusting all of them: people who tend to doubt an official account are primed to doubt more, which leads to political disengagement.

Another worrying feature is that there is basically no violent extremism without a conspiracy theory and, among people who are predisposed to violence, the more they endorse conspiracy theories the more likely they are to react violently to the views they oppose.

Does this suggest that believing conspiracy theories is pathological? Some have argued that some conspiracy theories turn out to be true, but if we look at the famous examples (such as Watergate), they were uncovered by official sources of information (journalists), not by people protesting in the streets or writing blogs. They were uncovered by people who are sceptical, respond to evidence, and strive for coherence, not by people who are subject to conspiratorial ideation, reject contrary evidence, and embrace inconsistencies. Paranoia and narcissism are associated with the endorsement of conspiracy theories, as are schizotypal personality and pseudoscientific beliefs.

So people who endorse conspiracy theories are different from people who don't, though none of their behaviours meets the clinical threshold. What can we do to stop the spread of conspiracy theories? One promising strategy is inoculation, by which we warn people that they may be misled and tell them how they are likely to be misled. This is an effective strategy to stop people from being influenced by conspiracy theories in their thoughts and decisions.


Miriam McCormick


Philosopher Miriam McCormick focused on how to engage with a conspiracy theorist, starting from her views about disagreement. First, she argued that there are good reasons to engage with people who hold very different views, because doing so helps us reflect on our own beliefs and shows respect for our opponents. That said, there is no obligation to engage: for instance, when people report conspiracy theories in bad faith (without really being committed to them), engaging seems fruitless.

One thought is that the type of engagement we should adopt is not open-minded (where we are open to changing our views as a result of the interaction) but closed-minded (where we listen empathetically and are interested in why the person has those beliefs, but we are not open to changing our views). Closed-minded engagement focuses on the believer, not the belief, trying to get them to see their own beliefs as problematic.

The important thing here is to engage in a way that is neither insincere (faking interest in the belief) nor disrespectful (considering the person unworthy of a different type of engagement, a lost cause). The engagement consists of an exploration of the view and of the reasons for which it is endorsed. The challenge to the problematic belief is made indirectly, not via direct confrontation, but by considering the experiences and feelings that make the belief attractive and showing that, although those experiences and feelings are valid, they do not need to lead to the belief.
