This is part of a series of posts on the new journal, Memory, Mind & Media. Today's post is by Stephan Lewandowsky (University of Bristol) and Peter Pomerantsev (Johns Hopkins University). Their forthcoming article ‘Technology and democracy: a paradox wrapped in a contradiction inside an irony’ will be published shortly as part of the journal's inaugural collection.
Stephan Lewandowsky
Numerous indicators suggest that democracy is in retreat globally. Even countries that had been considered stable democracies have recently witnessed events that are incompatible with democratic governance and the rule of law, such as the armed assault on the U.S. Capitol in 2021 and the unlawful suspension of the British parliament in 2019.
Although the symptoms and causes of democratic backsliding are complex and difficult to disentangle, the Internet and social media are frequently blamed in this context. For example, social media has been identified as a tool of autocrats, and some scholars have questioned whether democracy can survive the Internet. Indeed, recent evidence suggests that social media can cause some anti-democratic political behaviors ranging from ethnic hate crimes to voting for populist parties.
In the opposing corner, social media has been heralded as “liberation technology”, owing to its role in the “Arab Spring” and other instances in which it mobilized the public against autocratic regimes. Similarly, protest movements around the world have relied on social media platforms for the coordination of collective action.
This is the fundamental paradox of the Internet and social media: They erode democracy and they expand democracy. They are the tools of autocrats and they are the tools of activists. They make people obey and they make them protest. They give a voice to the marginalized and they give reach to fanatics and extremists. Each of these opposing views can appeal to supporting evidence.
We suggest that this basic paradox can be resolved only by examining the unique pressure points between human cognition and the architecture of the information ecology.
Peter Pomerantsev
For example, people are known to attend to news that is predominantly negative or awe-inspiring, and they preferentially share messages couched in moral-emotional language. When this fundamental attribute of human attention is combined with the social media platforms’ desire to keep people engaged, so that users’ attention can be sold to advertisers, it becomes unsurprising that online content has so often turned outrage-provoking and toxic: misinformation on Facebook during the 2016 U.S. presidential campaign was particularly likely to provoke voter outrage, and fake news headlines have been found to be substantially more negative than real news headlines.
Protecting citizens from misinformation, and protecting democracy itself, therefore requires a redesign of the current anti-democratic reinforcement structures of the online “attention economy”. In an Internet with democratic credentials, users would be able to see which of their own data have been used to target them, and why. Users would know why algorithms show them one thing and not another. During elections, people would immediately understand how different campaigns target different people with different messages, who is behind each campaign, and how much it spends.
And just as individuals should have more oversight and control over the information environment around them, so should the public have greater oversight and control over tech companies in general. The public needs to be able to understand what social-engineering experiments the companies are running, what their impacts are, and how the companies track the consequences of those experiments.
We consider the redesign of the Internet to be the defining political battle of the 21st century: the battle between technological hegemony and the survival of democracy.