
Beliefs that Feel Good Workshop


On December 16th and 17th, the Cognitive Irrationality project hosted a workshop on beliefs that feel good, organized by Marie van Loon, Melanie Sarzano and Anne Meylan. This very interesting event dealt with beliefs that feel good but are epistemically problematic in some way, as well as with beliefs for which this is not the case.

While the majority of talks and discussions focused on problematic cases, such as wayward beliefs, self-deceptive beliefs and unrealistically optimistic beliefs, there was also a discussion of epistemic virtues and of the relation between scepticism and our beliefs about the external world. Below, I summarize the main points made in the talks.

Quassim Cassam probed the question of why people hold weird beliefs and theories. Some examples are the theory that the moon landings were faked, or that 9/11 was in reality an inside job. Quassim argued that more often than not, these types of beliefs stem from epistemic vices. Insofar as these vices are stealthy, i.e. not apparent to the person who has them, it can be very hard to persuade people of the falsity of their opinions, as these vices reinforce and insulate the wayward beliefs. While it can happen that these wayward believers suddenly come to realize the inadequacy of their belief system in an act of deep unlearning, this is normally not something we can bring about by means of reasoning and persuasion.

In a talk that straddled the disciplines of philosophy, psychology and neuroscience, Federico Lauria put forward an account of self-deception as affective coping. He argued that we self-deceive in order to avoid painful truths and that a full explanation of self-deception needs to account for affective evaluation of information. He argued that self-deceptive individuals evaluate evidence that their desire is frustrated in the light of three appraisals: (i) ambiguity of the evidence, (ii) negative impact on one's well-being (mediated by somatic markers), and (iii) low coping potential. In addition, self-deceptive people evaluate the state of desire satisfaction positively, which is mediated by increased dopaminergic activity. The affective conflict between truth and happiness is at the heart of self-deception.

In my own talk, I asked whether unrealistic optimism is irrational. I argued that unrealistically optimistic beliefs are frequently epistemically irrational because they result from processes of motivated cognition which privilege self-related desirable information while discounting undesirable information. However, unrealistically optimistic beliefs can be pragmatically rational insofar as they lead individuals to pursue strategies that make a desired outcome more likely. Interestingly, where they are pragmatically rational, they are also less epistemically irrational, as the likelihood of success increases. However, this does not mean that they become realistic or justified. They merely become less unwarranted than they would otherwise have been.

Susanna Rinard proposed a novel response to scepticism, arguing that even if we concede that ordinary beliefs are not well supported by our evidence, it does not follow that we should give up on everyday beliefs. Rather, we may rationally run the risk of holding false beliefs about the external world, because suspending belief incurs a reverse cost: that of not having true beliefs. Furthermore, suspending judgment may come with additional costs, such as being effortful and unpleasant.

Finally, Adam Carter gave an account of insight as an epistemic virtue, drawing both on virtue epistemology and the psychology of insight problem solving. Insight experiences are characterized by a certain ‘aha’ moment in which relations between states of affairs previously seen as unrelated strike the individual. Adam argued that the virtue of insightfulness lies both in cultivating the antecedent stages of preparation and incubation of insights described in the empirical literature, and in moving from insight experiences to judgment and/or verification in a considered way, i.e. not blindly endorsing the result of every insight experience but giving it due weight.

Every talk was followed by a response and lively discussion. Thanks to Anne, Melanie and Marie for organizing a workshop which was both intellectually stimulating and friendly.
