
Conspiracy Beliefs between Secret Evidence and Delusion

On 26th and 27th September in Berlin, the Human Abilities Centre for Advanced Studies in the Humanities organised a workshop on conspiracy beliefs and delusions. This is a report of the workshop.


Logo of the Human Abilities Centre


The first speaker, Romy Jaster (Humboldt-Universität zu Berlin), presented a talk on self-immunization in conspiracy theories. Romy thinks about conspiracy theories from an epistemological and philosophy of science perspective, and she started her presentation with the conceptual distinction between "conspiracy theories" as a neutral term (an explanation that involves a conspiracy) and "conspiracy theories" as a negatively-valued term (an explanation that is epistemically deficient). What exactly the epistemic deficit is remains open to debate and controversy.

Romy focused on the idea that conspiracy theories and delusions are both deficient because they are not responsive to counter-evidence. The idea is that conspiracy theories are built in such a way that they become immune to counter-evidence (the cover-up thesis). Pathologising conspiracy beliefs, however, misconstrues them. Leaving aside the question whether they are pathological, Romy concentrated on the cover-up thesis. In this respect, conspiracy theories are very similar to sceptical hypotheses such as the brain-in-a-vat hypothesis (see Coady's work on this), and to beliefs held by people who experience impostor syndrome and gaslighting. That is, they have an in-built capacity to resist counter-evidence.


Romy Jaster

However, the analogies are not perfect. For instance, in sceptical hypotheses there is a deception thesis (things may not be as they seem because someone out there is trying to trick us) that is not exactly the same as the cover-up thesis. First, there is a difference in the entity or individuals doing the deception or the covering-up: if they were able to create ways to counteract all evidence for the conspiracy, they would need to be all-knowing and all-powerful. But whereas the Cartesian demon may have these features, it is implausible that real-life conspirators do.

Instead, most conspiracy theories do not hypothesise an all-powerful deceiving entity but take a more nuanced view, on which some of the ways in which counter-evidence is deflected are not planned or the outcome of evil deliberation but are due to bias and doxastic conservatism. So some conspiracy theories can be empirically challenged. In the latter part of the talk, Romy argued that there are interesting similarities between conspiracy theories and the beliefs of people who experience impostor syndrome and gaslighting, in terms of self-immunisation against counter-evidence.

The second talk of the workshop was by Matthew Parrott (University of Oxford), who discussed delusional content and delusional form. Matthew started by discussing the standard definitions of delusions as psychological states that are individuated on the basis of their content. Explanations of delusions have traditionally been based on the content of the delusions (e.g. explaining the Capgras delusion on the basis of an unusual experience when seeing the face of a familiar person).

Matthew argued for a different conception of delusions, which he called the Dynamic Conception of Delusion, in four steps:

  • existing definitions of delusions are no good as some delusions do not meet the criteria in the definitions;
  • the standard (static) conception of delusions is not well-motivated and we have no theoretically grounded reasons to accept that conception;
  • the dynamic conception is implicitly accepted by the most promising recent work in computational psychiatry (which proposes a purely formal way to account for delusions);
  • there are positive consequences of adopting the dynamic conception of delusions for both the philosophy of delusions and clinical treatment.

Matthew Parrott


Matthew argued against the evidence insensitivity of delusions (a key feature of the standard definition) by discussing work suggesting that people with persecutory delusions are receptive to reasons against their delusions. So the epistemic criteria in the definitions are not met by all cases of delusions. What about the ontological criterion? Are delusions beliefs or not? It does not really matter, because the central presupposition of this debate is that delusions are a type of psychological state individuated by their content. And that is the presupposition Matthew intended to challenge.

Recent computational models of delusions maintain that a number of factors are involved in the generation of delusions. These models are based on how people infer information from observations. Delusions can have multiple causes and are identified as self-maintained attractor states. Although they are termed "states", a delusion is not really a state but a tendency in a dynamic system, a pattern due to the internal structure of the system.

A delusion is a dynamic pattern of behaviour of a complex system. Delusions are not behaviours; rather, behaviours are symptoms of delusions. Delusions are like COVID, and having an implausible belief is like coughing. This way of thinking has several interesting consequences:
  1. we no longer need to individuate delusions based on content. This also explains heterogeneity as symptoms can be manifest or not: all we can say is that some features tend to cluster together; 
  2. treating delusions by talking to people about their beliefs is like treating fever without addressing the presence of the virus. The pathology is the virus, not the fever;
  3. importantly for the purpose of the workshop, this conception can also make sense of the perceived similarities between delusions and conspiracy beliefs, because the conception makes it plausible that there is a lot of continuity between processes underlying the two phenomena as we observe them.

Slide from Lisa Bortolotti's presentation

In the third talk of the day, I presented a paper focused on the analogies between conspiracy beliefs and delusions. Although we can mark some interesting areas of overlap, for instance in the way we attribute conspiracy beliefs and delusions to other speakers and in the cognitive antecedents identified in the formation of conspiracy and delusional beliefs, this does not mean that such beliefs are pathological or that their presence signals that the speaker's agency is lacking or compromised.

I offered a view of delusions and conspiracy beliefs as explanations of salient, unexpected, and often distressing events, explanations that appear implausible and unshakeable to an interpreter who does not share the speaker's perspective. Moreover, I remarked on how the belief with a delusional quality, whether it is a clinical delusion or a conspiracy belief, ends up being central to how speakers see themselves and relate to their physical and social environment.

If the cognitive antecedents identified by psychologists, and indeed other factors contributing to delusion and conspiracy belief formation, are not confined to a pathology but characterise everyday thinking, especially in situations of uncertainty and distress, then the assumption that in people with delusions and conspiracy beliefs something is going wrong (that they are not just mistaken, but out of their minds) should be resisted. I presented a view of delusions and conspiracy theories as expressions of epistemic agency, attempts to identify a causal explanation that makes sense and can support action.

In particular, I suggested that we should adopt an attitude towards speakers (including speakers to whom we would attribute delusional and conspiracy beliefs) that I call the agential stance, and recognise their role as agents with a valuable perspective on the world and with the capacity to pursue shared epistemic projects. Although I have some ideas about what the stance would look like in some clinical contexts (in terms of engaging with empathy and curiosity), the details of what it would amount to in everyday contexts, and in particular in exchanges with people whose beliefs can be harmful, are not fully worked out.

Finally, I showed how this way of thinking about conspiracy theories has inspired a series of resources (animated videos and a game) to encourage critical engagement with new information and to raise awareness of cognitive biases and psychological needs in young people.

The last talk of the day, by Roland Imhoff, considered the question whether conspiracy beliefs are "contaminated mindware". People who are more likely to endorse conspiracy theories perform less well on cognitive tasks and fail to identify bullshit. Roland defined conspiracy theories as "suspicions that an event of societal relevance happened because a few powerful individuals or groups have coordinated in secret to bring it about - to their own advantage and the public's disadvantage."


Roland Imhoff


What is interesting is that if people endorse conspiracy theories, they endorse many of them, and they tend to endorse conspiracy theories that conflict with one another. This suggests that there is something called a conspiracy mentality, a broader view of the world found in people who are more on the suspicious side. It is important to note, though, that paranoia and conspiracy beliefs are two different phenomena: in both cases people see threats, but people with paranoia mistrust everybody (powerful and vulnerable), whereas people with conspiracy beliefs mistrust only very powerful people.

A very interesting result is that people who endorse conspiracy theories evaluate arguments on their own merits, independently of whether the source is credible; whereas people who do not endorse conspiracy theories prefer arguments proposed by sources identified as credible. This might suggest that people who endorse conspiracy theories are less biased than people who do not, but of course it does make sense to trust reliable sources.

The problem is that some tests for conspiracy mentality are contaminated by bullshit receptivity. To avoid this effect, Imhoff's team created a belief-updating test to discriminate between the two. Although conspiracy mentality is distinguishable from bullshit receptivity, there is still a lot of contamination. What can be done to address this? To try and unpack what conspiracy mentality is, the researchers identified six items that are supposed to be highly correlated:

  1. intentionality bias
  2. secrecy bias
  3. refusal of randomness
  4. pattern perception
  5. anti-elitism
  6. heterodoxy

And the results gathered so far suggest that these items are very highly correlated, which means that conspiracy mentality seems to be a robust construct. The first four items are so close that they may be regarded as the same thing, whereas the correlations with anti-elitism and heterodoxy may be culturally dependent (for instance, the correlation with heterodoxy is lower in Russian samples).

On the second day of the workshop, the first talk was by Daniel Munro, who suggested that we should stop talking about "conspiracy theories".


Daniel Munro

Daniel started from a consideration of the definitional questions concerning conspiracy theories: should we define conspiracy theories in a neutral way or in a pejorative way? His idea was to intersect these definitional questions with psychological questions about the nature of conspiracy theories: whether people believe them or have other attitudes towards them.

The neutral definition of conspiracy theories (also called the minimal definition) characterises conspiracy theories as explanations of some phenomenon that cite a group of agents conspiring. The pejorative definition adds another dimension to the minimal definition: the explanation is epistemically defective and the people who endorse it are irrational, irresponsible, or vicious. Far-fetched explanations count as conspiracy theories on both the minimal and the pejorative definition, whereas plausible explanations count as conspiracy theories on the minimal definition but not on the pejorative one.

The psychological question arises from a tension between the fact that people invest a lot in the conspiracy theories they endorse (in terms of time and effort) and the fact that people can endorse conspiracy theories that conflict with one another and may not act on their conspiracy theories. Two views address this tension:

  • Belief view: conspiracy theorists genuinely believe that their theories are true.
  • Imaginative view: conspiracy theorists merely imagine or pretend their theories are true.

Proponents of the imaginative view may think of conspiracy theories as fictional stories (as in the work by Ichino and Ganapini), serious play (in Levy's work), or fantasies of secret knowledge (as in Daniel's own work). Arguments for the imaginative view include the idea that conspiracy theories are not sensitive to evidence as beliefs ought to be; that they are motivated by the desire to be special and unique and by the search for a sense of community; and that they have entertainment value. Another consideration is whether people act on their conspiracy theories (few people do) and whether their endorsement of conspiracy theories is an instance of signalling.

But the imaginative view works only with the pejorative definition, and this brings us back to evaluating the definitions of conspiracy theories. The neutral definition lumps together realistic and far-fetched explanations. But a philosophical account of an explanation is not about singling out what an explanation is an explanation of; it is more about identifying the role of that type of explanation in human practices. The pejorative definition is more unified, but it focuses on epistemic deficiencies when acts of imagination or serious play do not have a crucial epistemic dimension.

So we should replace "conspiracy theories" with "conspiratorial explanations" for the belief view and "conspiratorial fictions" for the imaginative view.

The last of the talks I was able to attend was by Sanja Dembić, who presented on the relationship between delusions and conspiracy beliefs. Should we treat delusions and conspiracy beliefs differently? Sanja developed an argument for the asymmetry thesis: delusional beliefs are pathological whereas conspiracy beliefs are not. Whereas delusions are symptoms of disorder, conspiracy beliefs (apart from those that appear in persecutory delusions) are not associated with pathology.


Sanja Dembić

What are good reasons to believe the asymmetry thesis? There are some common features:

  • they are belief-like states
  • they resist counterevidence
  • they can be implausible or even bizarre
  • they are held with conviction

There are differences too: paranoid delusions are about the person and are very distressing; conspiracy beliefs are not just about the person and can be comforting. So there are differences in content and in the harm associated with the belief. However, the view that they are different because delusions are generally more implausible than conspiracy beliefs is not well motivated. How about harm? Maybe delusions are more harmful than conspiracy beliefs. But this is not to say that conspiracy beliefs are not harmful.

Considerations about cognitive dysfunction and biases also fail to discriminate between pathological and non-pathological beliefs, as there are strong symmetries between delusions and conspiracy beliefs. For Sanja, the only way to make the asymmetry convincing is to say that in the case of delusions, but not in the case of conspiracy theories, there aren't any real or apparent reasons for the belief. So reasons-responsiveness is the key to understanding why delusions are pathological and conspiracy beliefs are not: in delusions there is no shareable meaningful connection between evidence and belief; there are only private reasons for the belief.

All the talks stimulated very interesting discussion and the meeting was (at least for me!) very inspiring and productive.

