
Failures of Introspective Belief Formation

This post is by Chiara Caporuscio (Berlin School of Mind and Brain). Here Chiara discusses some ideas from her paper Introspection and Belief, published in Review of Philosophy and Psychology (2021).


Chiara Caporuscio

Are beliefs about the external world psychologically and epistemically different from beliefs about what is going on in our own mind? The belief that it’s a rainy day outside is formed by weighing different sources of evidence, such as the view from my window or the weather forecast. It might be influenced by motivational factors, such as my desire to have a picnic later in the day. It is prone to error - for example, my upstairs neighbour watering the plants on their balcony might have caused me to jump to conclusions - and it can be revised and updated when new evidence comes in. My belief that I’m feeling happy, on the other hand, has been regarded by a long philosophical tradition as being fundamentally different: direct, incorrigible, and protected from error. In my recent paper, I argue that the way we form most beliefs about our inner world is often not so different from the way we form beliefs about the external world - and, like the latter, it can go astray.

 

Some introspective beliefs can plausibly be defended as highly protected from error because they are exclusive, i.e. they are determined by their target mental state and nothing else. Judgements of this kind include “I am feeling this” (Gertler, 2012) or “This is R”, where R is a phenomenal concept purely constituted by the experience (Chalmers, 2003). However, this also makes them uninformative: their infallibility is not helpful for our everyday goals of communicating our mental states to others, guiding action, or learning something about ourselves. The informative beliefs that better capture daily instances of introspection in practical use, like “I am feeling a throbbing pain”, cannot be infallible in this way, as they require relating the phenomenal character of our mental state to other concepts and experiences.

 

In my paper, I compare informative introspection with regular belief formation. To do so, I employ a five-stage cognitive account of belief formation put forward by Connors and Halligan (2015; 2020), according to which beliefs arise in response to a distal trigger, namely a precursor (stage 1). Different hypotheses to explain the precursor are then formulated in a search for meaning (stage 2) and evaluated (stage 3). The hypothesis that best explains the precursor, given the rest of our beliefs, is accepted as a new belief (stage 4) and goes on to shape further beliefs and lower-level processes (stage 5). In this process, non-pathological errors and delusions can arise when something goes wrong at stages 2 and 3: for example, when we lack the background knowledge that would help us formulate the right hypothesis, when our background beliefs are false and lead us astray, or when our biases or motivational factors lead us to favour the wrong hypothesis.

I argue that the same five stages are likely to be needed for the formation of informative introspective beliefs, which means that false or missing background beliefs, biases, or motivational factors can interfere with stages 2 and 3 and lead them astray. For example, a psychiatric patient who lacks the notion of intrusive thoughts might be unable to formulate the right hypothesis about their mental state, and so mistake their intrusive thoughts for desires; someone angry at a friend for petty reasons might favour the hypothesis that they are perfectly calm because they don’t want to be the kind of person who holds unmotivated grudges. Informative introspective beliefs thus have much the same failure conditions as beliefs about the external world.
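For readers who find a schematic helpful, here is a toy sketch in Python of how the five stages might be strung together, and of where stage 2 (hypothesis generation) and stage 3 (evaluation) can go wrong. All names, hypotheses, and scores are hypothetical placeholders chosen purely for illustration; the sketch is not taken from my paper or from Connors and Halligan’s account.

```python
# Toy illustration only: hypothetical names and scores, not code from the paper.

def form_belief(precursor, available_concepts, background_beliefs, bias=None):
    """Sketch of the five-stage account applied to a single precursor."""
    # Stage 1: the precursor (an unexplained experience) triggers the process.

    # Stage 2: search for meaning - only hypotheses the agent can formulate
    # with the concepts it has are generated; a missing concept means a
    # missing hypothesis.
    hypotheses = [h for h in precursor["candidate_explanations"]
                  if h["label"] in available_concepts]

    # Stage 3: evaluation - hypotheses are scored by their fit with background
    # beliefs; motivational factors (bias) can skew the scores.
    def score(h):
        fit = sum(1 for b in background_beliefs if b in h["supported_by"])
        return fit + (bias(h) if bias else 0)

    # Stage 4: the best-scoring hypothesis is accepted as a new belief.
    accepted = max(hypotheses, key=score)

    # Stage 5: the new belief joins the background and shapes later
    # hypothesis generation and lower-level processing.
    background_beliefs.append(accepted["label"])
    return accepted["label"]


# The angry-at-a-friend case: motivation favours "I am perfectly calm"
# even though the evidence supports "I am angry at my friend".
precursor = {"candidate_explanations": [
    {"label": "I am angry at my friend", "supported_by": ["my friend was late again"]},
    {"label": "I am perfectly calm", "supported_by": []},
]}
wishful = lambda h: 2 if h["label"] == "I am perfectly calm" else 0
belief = form_belief(precursor,
                     available_concepts={"I am angry at my friend", "I am perfectly calm"},
                     background_beliefs=["my friend was late again"],
                     bias=wishful)
print(belief)  # -> "I am perfectly calm"
```

The point is purely structural: an agent missing a concept never generates the corresponding hypothesis at stage 2, and a motivated scoring rule at stage 3 can push acceptance toward the wrong introspective belief.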

So far, I have referred to false introspective beliefs outside of the context of psychiatry. But what about pathological errors in introspective belief formation? According to the Diagnostic and Statistical Manual of Mental Disorders, a delusion is a pathological failure in belief formation “based on incorrect inference about external reality [...]”. A prima facie reason to maintain the external reality condition is the presumed infallibility of introspection. However, if my account is on the right track, we could be as dramatically wrong about our internal world as we are about our external world: our beliefs about our own experience could be not only false but delusional. The possibility of introspective delusion raises questions about the relationship between experience and belief in delusional belief formation and deserves further investigation.
