
Ignorance, Misconceptions and Critical Thinking

This post is by Sara Dellantonio and Luigi Pastore. They discuss the theme of their recent paper, "Ignorance, misconceptions and critical thinking", which appeared in Synthese.


Sara Dellantonio

Beliefs such as “Tiny specks of matter don’t weigh anything”, “Most people only use 10% of their brains”, “People with severe mental illness are prone to violence” or “Autism has become an epidemic” are usually defined as misconceptions, i.e., as beliefs that are considered false in light of currently accepted scientific knowledge.

Most studies on misconceptions aim to identify and catalogue the most common misconceptions across scientific fields such as physics, psychology and medicine. In our article “Ignorance, misconceptions and critical thinking” we instead investigate the reasons why such misconceptions are endorsed in the first place.

It turns out that many of our misconceptions are not isolated errors occurring against the background of an otherwise correct explanatory framework. Instead, they are integral parts, and products, of pseudo-explanations.

It is widely acknowledged that misconceptions are harmful. In everyday reasoning and decision-making, we rely on our beliefs. If these beliefs contradict current scientific knowledge, they are most probably erroneous and misleading, and they will lead us to conclusions and actions that are not aligned with a scientific understanding of the world.

According to a widely accepted view, (one of) the most effective ways to eradicate misconceptions is to intervene directly: identify the specific misconceptions, inform people that these positions are wrong, and provide them with the correct facts.


Luigi Pastore

However, if our misconceptions are deeply rooted in our belief systems and closely tied to pseudo-explanations of phenomena, this approach is ineffective: it does not alter the pseudo-explanations these misconceptions rely on and result from. The only truly incisive way to address such misconceptions is to tackle the explanations that support them and to adjust the system of inferences that gives rise to them. By doing so, we can act on all the interconnected false beliefs people hold on a topic, offer good reasons to embrace new beliefs, and provide the means to make further well-grounded and congruent inferences about related phenomena, that is, to develop new insights into this content.

In order for such changes in belief systems to occur, we not only need to acquire new disciplinary knowledge but also to develop adequate reasoning skills. The discipline that appears most promising for developing these skills is critical thinking. At its core, critical thinking aims to improve people’s analytic and evaluative attitudes toward knowledge; it consists in training individuals to reason in a disciplined manner, adhering to clear intellectual standards.

The definition of critical thinking goes back to John Dewey. In Dewey’s view, critical thinking is an active form of knowledge acquisition in which immediate intuitions are weakened and put under scrutiny, and in which the acquisition of new information is accompanied by an adequate comprehension of how this knowledge should be organized and how its various components relate to form a system. Critical thinking appears to be key to developing integrated and coherent belief systems that are continuously examined from the point of view of the evidence they rely on and continuously re-evaluated to ensure all components remain mutually consistent and plausible.
