Tuesday 1 June 2021

Social Approaches to Delusions (4): Collectively Jumping to Conclusions

Justin Sulik

Here is the fourth post in the series on social approaches to delusions, after the posts by Miyazono, Williams and Montagnese, and Wilkinson. In today's post, Justin Sulik, Charles Efferson, and Ryan McKay discuss their new paper, “Collectively Jumping to Conclusions: Social Information Amplifies the Tendency to Gather Insufficient Data”.

Justin Sulik is a postdoctoral researcher in the Cognition, Values and Behavior group at Ludwig Maximilian University of Munich. Charles Efferson is Professor in the Faculty of Business and Economics at the University of Lausanne. Ryan McKay is Professor of Psychology at Royal Holloway, University of London.


Charles Efferson


Human beings are inveterate misbelievers. At the individual level, our propensity for false beliefs about our prowess and prospects can be costly and dangerous, promoting harmful behaviours like unsafe driving, smoking and overspending. When such misbeliefs – we might call them “collective delusions” – spread and amplify in large groups, however, the consequences can be catastrophic. Widespread suspicions that vaccines cause more harm than good, that coronavirus is just another flu that will pass, or that climate change is a hoax decrease people’s intentions to vaccinate, to self-isolate, or to reduce their carbon footprint, triggering or exacerbating global public health emergencies and environmental disasters.

How can we explain collective delusions? Psychological explanations typically appeal to two main causal pathways (see a and b in the figure). The first is a matter of individual psychology: some of us have cognitive biases or deficits that render us, in isolation, more prone to misbeliefs. The tendency to “jump to conclusions”, for instance – forming beliefs on minimal evidence – is thought to play a role in the formation of clinical delusions. 

[Figure: three causal pathways to collective delusions – (a) individual cognitive biases produce misbeliefs, (b) beliefs spread via social influence, (c) social information shapes the cognitive biases themselves]

The second pathway is social: people are influenced by the beliefs of those around them, particularly by those with status and prestige. For instance, when a high-status individual like President Trump claims that vaccinations are causing an autism epidemic, that the coronavirus pandemic is totally under control, that President Obama was born in Kenya, or that global warming is a hoax created by the Chinese, these beliefs are likely to become more prevalent in the population.

In a new pre-registered study, we test whether a third causal pathway exists (pathway c in the figure), whereby social information affects the cognitive biases themselves, in addition to the beliefs those biases occasion. Can social context explain not just what people learn but also how they learn?

We investigate the “jumping to conclusions” bias specifically. To explore whether a social context can amplify this individual learning bias, we embedded a probabilistic reasoning task (the well-known “Beads Task”) in a social transmission-chain procedure. We recruited 2,000 participants and randomly assigned them to one of 100 chains, or groups (20 participants per group). Within each group, an initial participant (assigned to Position 1) undertook the Beads Task: they drew coloured beads from an urn containing beads of two colours, in order to work out which colour was in the majority. Each participant could decide how much evidence to gather (specifically, how many beads to draw) before making their decision about the majority colour.
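As a concrete sketch of a single (asocial) trial, the snippet below simulates a participant who draws a fixed number of beads and then guesses the more frequent colour. The urn ratio, number of draws, and decision rule are illustrative assumptions, not the study's parameters:

```python
import random

def run_beads_task(majority_ratio=0.6, n_draws=5, seed=None):
    """Simulate one participant drawing beads from a two-colour urn.

    majority_ratio and n_draws are illustrative values; the post does
    not report the study's actual parameters.
    """
    rng = random.Random(seed)
    majority = rng.choice(["red", "blue"])  # hidden from the participant
    minority = "blue" if majority == "red" else "red"

    # Each draw shows the majority colour with probability majority_ratio.
    draws = [majority if rng.random() < majority_ratio else minority
             for _ in range(n_draws)]

    # Simple decision rule: guess whichever colour was drawn more often
    # (using odd n_draws below avoids ties).
    guess = max(set(draws), key=draws.count)
    return guess == majority

# Fewer draws means noisier evidence and more wrong guesses on average.
for n in (1, 5, 15):
    acc = sum(run_beads_task(n_draws=n) for _ in range(10_000)) / 10_000
    print(f"{n:>2} draws: accuracy = {acc:.2f}")
```

Running this makes the stakes of the data-gathering decision visible: the fewer beads a participant draws, the more often their guess about the majority colour is wrong.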



Ryan McKay


Subsequently, a second participant (assigned to Position 2 within their group) was given information about how many beads the first participant gathered before they also undertook the Beads Task. Thereafter, the participant in Position 3 was given information about the data-gathering decisions of participants in Positions 1 and 2, and so on. This was the procedure for half of the groups (our social condition). The other half were assigned to an asocial control condition, which was identical except that participants were not given information about the data-gathering decisions of previous participants in their group.
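To illustrate how information flows down a chain, here is a toy sketch of the design. The copying rule, blending weight, and all numbers are hypothetical additions for illustration only; how strongly (and in which direction) real participants are swayed by their predecessors' choices is precisely what the experiment measures.

```python
import random

def simulate_chain(chain_length=20, social=True, base_n=8,
                   copy_weight=0.5, noise=2.0, seed=None):
    """Toy transmission chain: each position chooses how many beads to
    draw, optionally influenced by predecessors' choices.

    All parameters are hypothetical illustrations, not the study's.
    """
    rng = random.Random(seed)
    draw_counts = []
    for position in range(1, chain_length + 1):
        # Each participant's private inclination for how much to sample.
        private_n = max(1, round(rng.gauss(base_n, noise)))
        if social and draw_counts:
            # Social condition: blend the private inclination with the
            # average of all predecessors' observed draw counts.
            observed = sum(draw_counts) / len(draw_counts)
            n = max(1, round((1 - copy_weight) * private_n
                             + copy_weight * observed))
        else:
            # Asocial control: no information about predecessors.
            n = private_n
        draw_counts.append(n)
    return draw_counts

print(simulate_chain(social=True, seed=1))
print(simulate_chain(social=False, seed=1))
```

In the asocial control, each position's choice depends only on its private inclination; in the social condition, choices become correlated down the chain, so any systematic tilt in how people respond to others' sampling can accumulate.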

We gave a small monetary reward to participants who correctly guessed the majority colour of the urn, but also made them pay for each bead they drew before making this guess. This meant there was always a rationally optimal amount of evidence to gather. Across the board, our participants jumped to conclusions, drawing fewer beads than they optimally should have done. Crucially, this tendency was amplified when participants could observe the evidence-gathering behaviour of others in their chain (the social condition), relative to those who could not (the asocial control). Effectively, participants in the social condition “caught” this tendency from their forebears, and as a result were more likely to arrive at false beliefs about the majority colour, and to earn less money.
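The existence of a “rationally optimal amount of evidence” follows from this payoff structure: drawing another bead buys a little more accuracy but costs a fixed fee, so the expected payoff of drawing n beads, P(correct | n) × reward − n × cost, peaks at some finite n. Below is a minimal sketch of that calculation for a fixed-sample, majority-vote strategy, using placeholder values for the reward, per-bead cost, and urn ratio (the post does not report the study's figures):

```python
from math import comb

def p_correct(n, p=0.6):
    """P(the majority of n draws matches the true majority colour),
    guessing at random on ties; p is the urn's majority ratio."""
    total = 0.0
    for k in range(n + 1):
        pk = comb(n, k) * p**k * (1 - p)**(n - k)
        if 2 * k > n:
            total += pk        # majority of draws shows the true colour
        elif 2 * k == n:
            total += 0.5 * pk  # tie: the random guess is correct half the time
    return total

def expected_payoff(n, reward=1.0, cost=0.02, p=0.6):
    # Reward for a correct guess minus the cumulative cost of n beads.
    return p_correct(n, p) * reward - n * cost

best_n = max(range(31), key=expected_payoff)
print(f"Optimal draws: {best_n}, "
      f"expected payoff: {expected_payoff(best_n):.3f}")
```

Whatever the study's actual parameters, the same logic fixes an optimal sample size, and participants fell short of it; drawing fewer beads than this optimum saves a little on costs but loses more, in expectation, through wrong guesses.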

To contextualise this, let’s return to President Trump. Aside from the kinds of beliefs Mr Trump endorses, he has also displayed a certain attitude to evidence. For example, he is famously allergic to reading. According to Trump’s former chief economic adviser, “Trump won’t read anything—not one-page memos, not the brief policy papers, nothing.” Trump himself has acknowledged that he prefers reports to contain “as little as possible.” Now, again, Trump is a prestigious individual and many people follow his lead. But in this case what they might “catch” from him is not a specific belief (e.g., that vaccines cause autism), but a learning strategy – a means by which people acquire beliefs. In particular, they might acquire a disregard for evidence (at least of the written kind).

Consistent with this, our study demonstrates that when individuals see others collecting minimal evidence before making a decision, they too are more inclined to collect minimal evidence. People who form beliefs on the basis of minimal evidence are more likely to be wrong – so, in shifting the focus from the diffusion of false beliefs to the diffusion of suboptimal belief-formation strategies, we have identified a novel mechanism whereby misbeliefs arise and spread.
