Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts
Carol Tavris and Elliot Aronson
Harcourt Books. 292pp. ISBN 978-0-15-101098-1
Cognitive dissonance is such an elegantly simple idea that it’s hard to forget: anybody who took Psych 101 can remember it. In fact, as Carol Tavris and Elliot Aronson remind us in Mistakes Were Made (But Not By Me), the term is now even a mainstay of pop culture.
As you may remember, cognitive dissonance is “a state of tension that occurs whenever a person holds two distinct cognitions (ideas, attitudes, beliefs, opinions) that are psychologically inconsistent,” the authors write. Obviously, it isn’t easy to cope with two contradictory ideas, such as “smoking is a dumb thing to do because it could kill me” and “I smoke two packs a day,” at the same time. So cognitive dissonance produces “mental discomfort,” ranging from “minor pangs to deep anguish,” and people don’t rest easy until they’ve found a way to reduce it.
The resolution of cognitive dissonance follows a universal pattern: you make up your mind about something you’ve struggled over, and then you find good reasons to support your decision. Once the decision is made, you engage in a determined campaign of self-justification to quiet the lingering discomfort of the dissonance.
This theory has a marvelous creation story. In the 1950s, a young social psychologist, Leon Festinger, and two colleagues infiltrated a group that believed the world would end on December 21. The scientists wanted to know what would happen when the group’s prophecy failed. What Festinger and company discovered was that when the world didn’t come to an end, the group’s disappointment unexpectedly turned into exhilaration: the members reasoned that the world had been miraculously spared because of their faith. Their cognitive dissonance (the belief that the world would end versus the reality that it hadn’t) was reduced by the new belief that they’d witnessed a miracle!
This book is lucidly written and full of “aha” examples. How nice that one principle can be applied so handily to almost all of human existence! Why do bad marriages stay together? Why do cops never admit they’ve sent the wrong people to jail, even after new DNA evidence is introduced? Why did Lyndon Johnson initially struggle with the idea of going full-tilt into Vietnam, only to become a true believer in his own decision afterward? More recently, how was Governor Eliot Spitzer able to think of himself as a watchdog for society’s morals even while he patronized call girls? Cognitive dissonance, and the self-justification required to reduce its unpleasant effects, provides one extensively researched answer.
The psychologists who wrote this book have a certain pedigree. Carol Tavris is a respected writer who reports on psychology for popular audiences. Her well-known book Anger: The Misunderstood Emotion, published more than 20 years ago, helped give the lie to the notion that venting anger is therapeutic (most of the time, expressing anger only makes you angrier). Elliot Aronson is one of the world’s most prominent social psychologists, and the only psychologist to have won all three of the American Psychological Association’s top awards. As one of Festinger’s graduate students, he was there at the inception (well, nearly) of the work that put cognitive dissonance on the map.
Sure, cognitive dissonance is a great theory, and it produces terrific yarns. It’s had 50 years to fill its war chest with compelling data. But it’s fair to ask, even of the most readable books and most distinguished authors: why reintroduce readers to this familiar theory? Why now? Simply to celebrate its 50th anniversary?
That’s one reason given by Aronson. But I think there are two other answers to this question. First, cognitive dissonance was the product of a once-great academic specialty we no longer hear so much about: social psychology, the scientific study of how people’s thoughts, feelings, and behaviors are influenced by the actual or imagined presence of others. You probably recall other memorable social psychology experiments from Psych 101: Stanley Milgram’s famous obedience experiments, in which he got ordinary people to follow orders, even though they hated doing so and thought they were harming others. Or Philip Zimbardo’s Stanford prison experiment, in which student “guards” quickly began to abuse their fellow student “prisoners.” These experiments seemed to make a powerful case that, given the right social conditions, ordinary people can be induced to do things they’d never dream of doing by themselves.
As the focus of our culture has shifted increasingly from “we” to “me,” social psychology has been eclipsed by schools of psychology more concerned with individual differences and the biological roots of behavior. This book is intended to restore cognitive dissonance to a place of prominence within the broader concerns of psychology.
Tavris and Aronson argue that we make hundreds of decisions in the course of our ordinary lives, and nearly every one can stir some distress. Even casual purchases need justification to reduce that distress. Did you need that new jacket? Sure: you need to look good for the job, and besides, it was on sale! Reducing cognitive dissonance allows us to negotiate lives filled with instantaneous decisions.
Another reason (and perhaps the main one) that this book has currency now is politics. In fact, the main title, Mistakes Were Made, comes from three quotes placed before the introduction. The first is from Henry Kissinger: “Mistakes were quite possibly made by the administrations in which I served.” That was his response to charges that he’d committed war crimes in Vietnam, Cambodia, and South America. The second quote is from Cardinal Egan of New York: “If, in hindsight, we also discover that mistakes may have been made, . . . I am deeply sorry.” That’s from the Cardinal’s statement about how the Roman Catholic Church dealt with child molesters among its clergy. The third is by a McDonald’s spokesperson: “Mistakes were made in communicating to the public and customers about the ingredients in our French fries and hash browns.” This apology was directed to Hindus and other vegetarians, who weren’t informed that the “natural flavoring” in the chain’s potatoes contained beef byproducts.
All of these statements were deemed apologies, but clearly, a certain distancing was going on. Inherent in the responses was the human urge to scale down the scope of responsibility—and, no doubt, reduce the cognitive dissonance of those issuing the apologies.
So it’s undoubtedly true, as Tavris and Aronson write, that “all of us share the impulse to justify ourselves and avoid taking responsibility for any actions that turn out to be harmful, immoral, or stupid.” But they remind us that most of us will never be in a position to make decisions affecting the lives of large numbers of people. Among those who are, the poster boy in recent years for “tenacious clinging to a discredited belief” has been George W. Bush. It remains to be seen whether anyone will displace him from this inglorious position anytime soon.
Tavris and Aronson ask whether ordinary people can be made to do unpleasant things they wouldn’t do under normal circumstances. Yes, they say, if they’re introduced to the acts gradually, with the right inducements. Soon enough, they’ll be hooked, and their own belief systems and self-justifications will protect them from themselves by reducing their cognitive dissonance.
The authors present us, once again, with the story of Abu Ghraib, where ordinary men and women justified what they’d done to humiliate and abuse Iraqi prisoners, while their superiors washed their hands of any responsibility. The remarkable thing is that everybody involved thought themselves “good people.” Taking a little artistic license, Tavris and Aronson imagine the reasoning of those accused of abuse: “If we deliberately inflict pain on another, the other must have deserved it.”
Of course, not everybody involved in crimes justifies their actions in this manner. But, write Tavris and Aronson, “that relatively small percentage of people who cannot or will not reduce dissonance this way pay a large psychological price in guilt, anguish, anxiety, nightmares and sleepless nights.” No wonder self-justification is such a universal balm for the tender ego!
Tavris and Aronson say that all of us, world leaders and ordinary individuals alike, are at the mercy of cognitive dissonance, and that we also practice “confirmation bias”: we seek evidence that confirms or supports the decisions we make, while ignoring contrary facts. So we naturally lie to ourselves when we find ourselves in unpleasant circumstances. Not only does confirmation bias help us lie to ourselves, say the authors, but it has another, annoying quality: it makes us believe that we can see clearly, even if others can’t.
According to Tavris and Aronson, no one is immune to cognitive dissonance, including therapists. Like everyone else, therapists want to protect their egos and reputations when they’re wrong, as some did during the repressed-memory scandals and the daycare child-abuse hysterias (the authors devote a chapter to these low moments in the history of psychology).
Like many books that present a vast, overarching problem, this one offers only that noble and often elusive solution: awareness. The authors urge us to “rethink our own muddles” and to stay vigilant against a virus that can destroy politics, marriages, relationships, and nations. Who can argue against awareness? But can it stand up to our blind urge to self-justify?
The problem here is that, in the rush to deploy social-psychological generalities, individual differences can be shortchanged. Does cognitive dissonance theory explain why some conflicted people say no, despite the social pressures applied to them? Or why some people really are able to put aside self-justification in difficult circumstances and admit that they were wrong?
That caveat aside, this is a good, readable book with lots of marvelous stories about the pickles everybody (but me) gets into. I, by contrast, see clearly and wisely at all times. Pity. If only people could see what I see, we wouldn’t need cognitive dissonance theory to explain how screwed up we are.
This article first appeared in the May/June 2008 issue.
Richard Handler
Richard Handler is a radio producer with the Canadian Broadcasting Corporation in Toronto, Canada.