Today's post is by Joe Pierre, Acting Chief at the Mental Health Community Care Systems, VA Greater Los Angeles Healthcare System, and Health Sciences Clinical Professor in the Department of Psychiatry & Biobehavioral Sciences at the David Geffen School of Medicine at UCLA.
The blurry line separating psychopathology from normality, in the real world and in the DSM, has been a longtime interest of mine. Twenty years ago, I attempted to disentangle religious and delusional beliefs using the “continuum” model of delusional thinking based on cognitive dimensions. More recently, I’ve tried to understand other “delusion-like beliefs” (DLBs), including conspiracy theories, a frequent topic of my blog, Psych Unseen. A forthcoming paper models belief in conspiracy theories as a “two-component, socio-epistemic” process involving epistemic mistrust and biased misinformation processing.
Delusions and DLBs remain challenging to distinguish, both in clinical practice and in an internet era where fringe beliefs are often validated. Continuum models can be helpful, along with some categorical guidelines. Delusional beliefs are false; DLBs may not be. Delusions are usually idiosyncratic or unshared, based on subjective experience, and self-referential; DLBs usually aren’t. Instead, DLBs are typically based on learned misinformation, if not deliberate disinformation.
In forensics, the distinction between delusions and DLBs can be crucial. Mass murderer Anders Breivik nearly eluded criminal conviction because of how Norwegian law treats psychosis as an exculpatory factor (see Dr. Bortolotti et al.’s nuanced account). For prosecutors and the expert witnesses supporting their cause, the potential exculpatory role of DLBs therefore presents a sizeable headache. Consequently, a group of forensic psychiatrists led by Dr. Tahir Rahman has proposed a new diagnostic category called “extreme overvalued beliefs” to describe DLBs that they claim are easily differentiated from delusions:
An extreme overvalued belief is one that is shared by others in a person's cultural, religious, or subcultural group. The belief is often relished, amplified, and defended by the possessor of the belief… The individual has an intense emotional commitment to the belief and may carry out violent behavior in its service.
Although I agree that DLBs deserve to be separated from delusions, and that DSM-5 doesn’t help much, I’m a critic of “extreme overvalued beliefs” as a solution for several reasons (see here and here for more details):
First, diagnosing “extreme overvalued beliefs” isn’t nearly as easy as is claimed.
Second, DLBs shouldn’t be “swept under the rug” of a new psychiatric umbrella term. A fuller understanding would benefit from integrating established concepts from psychology (e.g. conspiracy theories), sociology and political science (e.g. terrorist “extremism”), and information science (e.g. belief in misinformation).
Third, “extremism” in overvalued beliefs is defined by criminal behavior, not by dimensional features of the belief itself, leaving unresolved why some people commit violent acts in the service of DLBs while most don’t.
And finally, the conceptualization has a concerning prosecutorial bias, seemingly in the service of thwarting defense efforts to claim incapacity or argue for sentence mitigation due to DLBs. Since many DLBs are learned or indoctrinated, a more nuanced view might see them through a lens of not only epistemic innocence but potential forensic innocence as well. In a world where misinformation and disinformation now run rampant, we should at least consider that the distributed responsibility for DLBs and their occasional forensic impact extends beyond the individual.