
Disorder in the Predictive Mind

Over the last few years I have worked more and more on the idea that the brain is a prediction error minimizer. This has now resulted in a book—The Predictive Mind—just published with Oxford University Press.

The Predictive Mind
By Jakob Hohwy
The first part of the book explains the basic idea of prediction error minimization, which mainly stems from work by Karl Friston and others in computational neuroscience. The second part applies this to the binding problem, to cognitive impenetrability, to delusions and autism, and to a range of philosophical questions about misrepresentation. The third part considers how it applies to attention, consciousness, the mind-world relation, and the nature of self.

The prediction error minimization idea says that all the brain ever does is minimize the error of predictions about its sensory input, formed on the basis of an internal model of the world and the body. The better these predictions are, the less error there is. On this view, the bottom-up signals in the brain, beginning with the sensory input, are conceived as prediction errors that work as feedback to the models maintained in the brain and their top-down predictions.

This is a simple idea, but with extreme explanatory ambition. Perception is the process of refining the models so that better predictions are formed, attention is predicting the sensory input precisely, and action is changing the input to fit with the model’s predictions. In this sense, perception, attention, and action are all best conceived in terms of statistical inference: the brain must use statistical tools to make sound inferences about the world based only on its sensory input.
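To make the basic dynamic concrete, here is a minimal toy sketch in Python (an illustration made up for this post, not code from the book or from the computational neuroscience literature): the model is a single estimate of a hidden cause, and the only currency passed around is the prediction error, which either revises the estimate (perception) or drives a change in the input (action).

```python
# A toy sketch of prediction error minimization (not the formal scheme):
# the model is a single estimate mu of a hidden cause, and the sensory
# input y is predicted directly from mu.

def perceive(mu, y, rate=0.1):
    """Perceptual inference: revise the model to reduce prediction error."""
    prediction_error = y - mu        # the bottom-up signal is the error, not the raw input
    return mu + rate * prediction_error

def act(world, mu, rate=0.1):
    """Active inference: change the input so that it fits the prediction."""
    prediction_error = world - mu
    return world - rate * prediction_error

# Perception: the estimate comes to predict a fixed input.
mu, y = 0.0, 1.0
for _ in range(100):
    mu = perceive(mu, y)
print(round(mu, 2))      # ~1.0: the model now predicts the input well

# Action: the input is changed until it fits a fixed prediction.
mu, world = 0.5, 2.0
for _ in range(100):
    world = act(world, mu)
print(round(world, 2))   # ~0.5: the world has been made to fit the model
```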

To be meaningful, inference must be guided by reliable signals, so a key part of inference is to estimate the precision (inverse variance) of the prediction error (or sensory input). This implies that the brain must keep track of the varying levels of noise and uncertainty in the environment, and adjust how much it relies on the sensory input in light of such expected precisions. Mechanistically, this is tied to adjusting the gain on sensory input. Functionally, it is attention.

Simply put, the brain’s predictions of how precise the sensory input is control the “gates of perception”: we attend to signals that are expected to be precise. Optimizing expected precisions is statistically difficult because it is itself a type of inference, in need of further assessment of precision. To avoid a regress, at some point such inference about inference must become uninformative. This induces a certain fragility in a prediction error minimizing system, which suggests that some psychopathologies and some neurodevelopmental problems may be tied to inference on expected precisions.
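The role of expected precision can be put in the same toy terms (again a simplified sketch, assuming a standard Bayesian weighting rather than the full hierarchical scheme): the weight the prediction error carries is set by the expected precision of the input relative to the precision of the prior prediction, so an input expected to be precise passes through the gate, while one expected to be noisy is largely ignored.

```python
# A toy sketch of precision weighting (a simple Bayesian, Kalman-style gain,
# not the full hierarchical scheme): the expected precision of the input,
# relative to the precision of the prior prediction, sets how far the
# prediction error moves the estimate.

def precision_weighted_update(mu, y, pi_input, pi_prior):
    prediction_error = y - mu
    gain = pi_input / (pi_input + pi_prior)   # expected precision controls the gate
    return mu + gain * prediction_error

mu, y = 0.0, 1.0
print(precision_weighted_update(mu, y, pi_input=10.0, pi_prior=1.0))  # ~0.91: input expected to be precise, so attend to it
print(precision_weighted_update(mu, y, pi_input=0.1, pi_prior=1.0))   # ~0.09: input expected to be noisy, so largely ignore it
```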

Parts of the book pursue this idea, in particular for delusions and elements of autism. I speculate that some aspects of schizophrenia may be tied to low expectations of precision, or little trust in the sensory signal, and that some aspects of autism may be tied to too much trust in the sensory signal. This could explain some of the symptoms of these disorders. A key element here is not chronic expectations of precision, but the ability to adjust expectations of precision depending on context. In our own lab, we explore these ideas, in particular concerning autism.

Jakob Hohwy
The hypothesis is that people with autism, and people high in autism-like traits, are slow to adjust their expected precisions when exposed to contexts suggesting uncertainty, and that this affects subsequent action and perception. We use the rubber hand illusion as a tool to induce contexts of uncertainty and have found evidence in support of our hypothesis. Recently, we have begun to expand this experimental research to somatosensory attenuation (the well-known phenomenon that we cannot tickle ourselves), where we find evidence in support of the action-related aspects of prediction error minimization. We are also beginning to look at aspects of social cognition in autism, from the perspective of prediction error minimization.
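A toy simulation can convey the shape of the hypothesis (a hypothetical sketch only, not our experimental paradigm or analysis code): when the context suddenly becomes noisy, an observer that quickly lowers its expected sensory precision down-weights the input and keeps a stable estimate, whereas an observer that is slow to adjust keeps trusting the now-noisy input, and its estimate becomes volatile.

```python
# A toy simulation of slow versus fast adjustment of expected precision
# (a hypothetical sketch, not our experimental paradigm or analysis code).
# At trial 100 the context becomes noisy; the adaptation rate controls how
# quickly the observer's expected precision tracks the change.

import random

def estimate_variability(adaptation_rate, trials=200, switch=100, seed=1):
    rng = random.Random(seed)
    mu, expected_precision, pi_prior = 1.0, 10.0, 1.0
    estimates = []
    for t in range(trials):
        noise_sd = 0.1 if t < switch else 2.0            # context change: more uncertainty
        y = 1.0 + rng.gauss(0.0, noise_sd)
        context_precision = 1.0 / noise_sd ** 2          # assumed observable here, for simplicity
        expected_precision += adaptation_rate * (context_precision - expected_precision)
        gain = expected_precision / (expected_precision + pi_prior)
        mu += gain * (y - mu)
        estimates.append(mu)
    post = estimates[switch:]                            # how jumpy is the estimate after the change?
    mean = sum(post) / len(post)
    return sum((e - mean) ** 2 for e in post) / len(post)

print(estimate_variability(adaptation_rate=0.5))    # fast adjuster: the estimate is comparatively stable
print(estimate_variability(adaptation_rate=0.01))   # slow adjuster: the estimate is far more variable
```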

Versions of all the linked papers are available from my website.
The book is available in paperback, hardback and Kindle; it is also available from amazon.co.uk (amazon.de and amazon.com soon).
