
Elliot Aronson on Hypocrisy

Today's post is by Elliot Aronson, Professor Emeritus of Psychology at the University of California, Santa Cruz, and author of The Social Animal and, with Carol Tavris, Mistakes Were Made (But Not by Me).



Social psychologists define hypocrisy as behaving contrary to one’s values or beliefs; in common vernacular, it can be defined as the failure to practice what one preaches.

I think that just about everybody wants to see themselves as a person of integrity. It is a very powerful desire that transcends individual differences of age, gender, race, socio-economic status, and nationality. This quest to maintain a self-image of integrity is quite touching; at the same time, it often distorts our memory or causes us to stretch to find justifications for actions that might appear hypocritical. Thus, it is far easier to see hypocrisy in others than it is to see it in oneself. If there are individual differences, they lie not in the ability to behave hypocritically, but in the ability either to blind oneself to one's past behavior or to find ways to justify behavior that unbiased observers might judge to be hypocritical.

People will attempt to defend against seeing themselves as hypocrites through forgetting, compartmentalization, or some form of self-justification. To take one example from contemporary American politics, whenever journalists have confronted Donald Trump about having said something in the recent past that contradicts his current position, he frequently responds by denying that he ever said it, even though videotape exists of his having said it on national TV just a week or two earlier. Politicians usually avoid telling outright lies that can easily be exposed via video. Accordingly, I am convinced that Trump was not deliberately lying but that he actually did forget that he said the very thing he now denies having said.

When motivated forgetting fails, people will try to invent reasons that justify their actions. For example, in Stanley Milgram’s classic experiment on obedience, two-thirds of his subjects gave what they believed to be near-lethal electric shocks to an innocent person in obedience to an authority figure. These people generally regarded themselves as decent human beings. When interviewed afterward, they often justified their behavior by claiming that they had no choice: because they had committed themselves to participating in the research, they felt obliged to continue administering shocks, even though they firmly believed they were harming the victim. In addition, many actually convinced themselves that their victim, in some obscure way, deserved what he got.

My own research was directed toward breaking through these justifications and forcing people to become mindful of their own hypocrisy so that they might behave with integrity. For these experiments I sought out situations where the realization that they were behaving hypocritically might lead people to change their behavior in a way that would benefit themselves and society at large. My first experiment came about as a response to the AIDS epidemic. Some background: because AIDS is transmitted primarily through sexual contact, and because condoms are effective at preventing sexually transmitted diseases, AIDS prevention seemed simply a matter of explaining the cause and the means of prevention to sexually active people.

But it wasn’t that simple. In the United States, hundreds of millions of dollars were spent on public-service information campaigns, which produced only a negligible increase in condom use among sexually active people. For example, at my university (which was typical of most universities in America), although students had a healthy fear of AIDS, only 17% of sexually active students used condoms regularly. How come? Although almost all college students believed condoms could prevent AIDS, most considered their use unromantic and a terrible nuisance. Accordingly, without giving it much thought, most succeeded in convincing themselves that it was okay for them not to use condoms because, well, "certainly none of my friends could possibly have AIDS." In short, they implicitly believed that although condom use was essential for others, it might not apply to them.

Together with my graduate students (Jeff Stone and Carrie Fried), I set out to find a way to make students vividly aware of the fact that they were not practicing what they were preaching. We predicted that once sexually active people were confronted with the fact that they were advocating behavior that they themselves were not practicing, they would be motivated to modify their behavior to preserve their integrity.

In our experiment, we instructed college students to compose a speech describing the dangers of AIDS and advocating the use of condoms. In the hypocrisy condition, students (1) recited their speech in front of a video camera and were informed that the video would be shown to high school students, as part of a sex education course, and (2) were made mindful of their own implicit decision not to use condoms by interviewing them and getting them to tell us, in detail, the circumstances in which they failed to use condoms in the recent past. In the control conditions, students either made the speech without having been made mindful or were made mindful without making the speech. Several months later, as part of an “unrelated” telephone survey, students were asked about their sexual behavior. Almost 60% of the students in the hypocrisy condition reported using condoms regularly—about three times the percentage of people in the control conditions. That is, when forced to become mindful of the discrepancy between what they were practicing and what they were preaching, a large percentage modified their practice.

In a subsequent experiment, during a severe drought in California, we used a variation on this hypocrisy procedure to induce students to conserve water by taking shorter showers. Specifically, we intercepted students who were headed for the showers in the university field house and asked them to print their name on a large, prominent poster on the wall that read: “Conserve Water. Take Short Showers. If I can do it so can you.”

In the hypocrisy condition we also made them mindful of their usual shower-taking behavior—which was far from conservative. In one control condition we got them to sign the poster but did not ask them to think about their usual shower-taking behavior; in the other, we simply interviewed them about their past shower-taking behavior without asking them to put their names on the poster. Subsequently, we surreptitiously timed their actual showers. In the control conditions, the average time in the shower was just under 14 minutes. In the hypocrisy/mindful condition, the average time was 3.6 minutes. (Any parent of teenagers can attest that 3.6 minutes in the shower is something of a miracle!)

To summarize, people have a strong need to behave with integrity. But people are also adept at allowing themselves to behave hypocritically, either by blinding themselves to the difference between what they are practicing and what they are preaching or by finding some other justification for their behavior. In our research, my students and I have succeeded in finding relatively gentle ways to help people become more mindful of the gap between their beliefs and their behavior, producing a beneficial modification of that behavior and ultimately restoring their legitimate feelings of integrity.
