Mistakes Were Made (but Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts

4.5 rating

In 100 Words or Less: It's easy to find fault in others, but difficult to see our own mistakes. Cognitive dissonance is the discomfort we feel when confronted with new information which contradicts our most prized beliefs, including the belief that we are intelligent, moral people; this leads us to deny or to justify our mistakes. Confirmation bias causes us to emphasize evidence which supports our beliefs and discredit facts which do not. Hedonic bias causes us to credit ourselves for our successes and blame external situations for our failures. Cultivating awareness of these limitations is the first step to making smarter, more deliberate choices.

  • No human being wants to admit their mistakes. It’s surprisingly difficult for each of us to utter these three words: “I was wrong”. The higher the perceived stakes, the greater our difficulty.
  • When confronted with evidence of our error, our first impulse is often to dig in further. Rather than revise our view or change direction, we find it far easier to deny or justify, even when the evidence against us is strong.

    Although politicians are one prominent example — eventually, they reluctantly admit that “mistakes were made”, but continue to pass the buck — in reality, most of us do this from time to time, if not far more often.

  • Self-justification can actually be more dangerous than an outright lie. A guilty man who is caught may deny his wrongdoing, but fear of the consequences may at least deter him from making the same mistake again. Self-justification, however, allows him to explain away — to others, but especially to himself — the gap between his professed convictions and his actions.

    When we’re able to say, “yeah, I did something which was maybe kinda horrible, but he really deserved it”, it actually predisposes us toward that same behavior again; we’ve found a way to feel good about it. (Naturally, this also prevents us from learning from our mistakes, much to our own detriment.)

    Conversely, when we treat someone well, this actually predisposes us to further acts of selfless behavior in the future. (“I did something really nice for that person, and they really deserved it.”)

  • Self-justification can be useful, to some extent. In and of itself, it needn’t be all bad. Self-justification allows us to “get over it”, to decide between flawed options, to stop second-guessing ourselves or agonizing over our mistakes.

    Taken too far, however, it prevents us from acknowledging our errors, making amends and doing better next time.

    Understanding this is essential to understanding why so many people do such “crazy” things which are not in their self-interest (or in the interest of those they love).

  • Two words which drive self-justification: cognitive dissonance. Cognitive dissonance is a “state of tension that occurs whenever a person holds two cognitions (ideas, attitudes, beliefs, opinions) that are psychologically inconsistent.”

    Cognitive dissonance makes it difficult to admit our mistakes because we want to perceive ourselves as competent, moral and rational people. It explains why we justify immoral decisions to ourselves, for example, raiding the corporate supply closet for our personal use, or taking sodas from the community fridge without paying for them. It’s why Donald Trump and his supporters downplay his disparaging of minorities as “telling it like it is” or his comments about sexual assault as “locker-room talk”.

    As a parent, I might genuinely believe that I shouldn’t yell at my children in anger, but supposing that I have tried and failed to restrain myself on more than one occasion, it would be easy to go on to justify my mistakes in any number of ways. I might reason that they’ll “never learn” if I don’t raise my voice, or that “anyone” would’ve lost their temper in those circumstances.

    In each and every case, our acts of self-justification merely open the door to our continued bad behavior.

  • Cognitive dissonance and confirmation bias. When new information validates our existing beliefs, we are prone to overestimate its accuracy and importance. On the other hand, when it clashes with our thinking, we tend to become critical of the evidence and hypersensitive to minor flaws. This is confirmation bias. In short, we see what we expect to see.

    What’s more, our brains tend to fixate on confirming evidence while failing to notice the absence of evidence, an absence which would, by itself, disconfirm the belief.

    Confirmation bias explains how two otherwise-intelligent people can watch the same heated argument, controversial football replay, or presidential debate and come away with two completely different views of who won. Once again, we see what we expect to see.

  • Cognitive dissonance and hedonic bias. When we explain our own behavior, we tend to take credit for our successes (our insight, our plan, our execution) and to shift blame for our failures (bad timing, bad market, bad luck). Conversely, we tend to credit those same situational forces for someone else’s successes (good timing, good market, good luck) and to blame their failures on a personal lack of insight, planning or execution.
  • We are all “naive realists”. We each possess “the inescapable conviction that we [and we alone] perceive objects and events clearly, as they really are.”

    In one experiment, researchers took peace proposals created by Israeli negotiators and credited them to Palestinians, and vice versa, then asked Israelis to judge them. The Israelis liked the Palestinian proposals which were attributed to Israelis more than they liked the Israeli proposals which were attributed to Palestinians; yet, unsurprisingly, all of the participants insisted that they weren’t influenced by their nationalism and had carefully considered each proposal on its own merits.

    If our present experiences are so easily skewed, our memories of past experiences are even less reliable. We fail to appreciate that even our most vivid memories are heavily influenced by external events and internal biases, including a desire to embellish our contributions or minimize our failings. Our brains omit important facts and even fabricate things which did not happen.

    Oliver Wendell Holmes Jr. observed that “trying to educate a bigot is like shining light into the pupil of an eye — it constricts.” Many of us would rather preserve our opinions at all costs.

  • “Naive realism creates a logical labyrinth because it presupposes two things: One, people who are open-minded and fair ought to agree with a reasonable opinion. And two, any opinion I hold must be reasonable; if it weren’t, I wouldn’t hold it. Therefore, if I can just get my opponents to sit down here and listen to me, so I can tell them how things really are, they will agree with me. And if they don’t, it must be because they are biased.” — Carol Tavris and Elliot Aronson

  • Cognitive dissonance explains why we assign more value to things which are difficult to come by. Whether joining a fraternity with strict initiation requirements, paying lots of money for an expensive handbag, or attending an elite law school, most of us would be reluctant to view these things as poor investments once we’ve attained them. So we reduce cognitive dissonance by selecting thoughts which make the achievement or possession seem more valuable.
  • Self-justification is a slippery slope. How do we start off with good intentions and end up so far afield?

    Why do politicians become corrupt, even though so few of them deliberately set out to cross ethical lines?

    Why do pharmaceutical studies consistently demonstrate funding bias, even though the majority of scientific researchers aren’t consciously cheating?

    Why do police and prosecutors identify the wrong suspect, ignore valuable evidence and insist that they “got the right guy” even when their guy is exonerated by DNA?

    Why do so many marriages disintegrate so tragically, even though the once-happy couple genuinely meant it when they promised each other “till death do us part”?

    Why do countries spend billions of dollars and destroy millions of lives to fight meaningless wars?

    The answer? These things happen when you and I continuously rationalize our errors, one small, seemingly inconsequential step at a time.

    There is a famous experiment — amongst many others — conducted by Stanley Milgram which illustrates this point. Essentially, a series of ordinary American citizens were induced by researchers to deliver what they believed were progressively stronger electric shocks to an innocent person.

  • We are all “stereotypers”; it’s a natural consequence of the brain’s inclination to categorize our experiences. Our brains are lazy; they want to conserve energy. Stereotypes help us to simplify a complex world, to process new information quickly and act decisively. For this reason, we tend to divide the world into “us” and “everybody else”, in many different contexts and often without realizing it.

    Ethnocentrism — the belief that our own culture, nation, race, religion or sports team is superior to all others — serves an evolutionary purpose, strengthening social bonds and increasing our willingness to work, fight and even die for those belonging to our group. However, when we successfully convince ourselves that “everyone else” is stupid, sinful or subhuman, it also allows us to justify treating them immorally without feeling guilt.

    Self-justification occurs when we make conscious choices, but at least then we can anticipate it. More insidiously, it can also occur when we behave badly for unconscious reasons.

  • No true Scotsman. When someone makes a claim and is presented with evidence to the contrary, rather than deny the evidence or modify their original claim, they explain it away. It might go something like this:

    “No Scotsman would steal.”
    “I know a Scotsman who stole.”
    “Well, no true Scotsman would steal.”

    For example, I have well-meaning Christian friends who insist that Christianity is peaceful but Islam is violent. When confronted with numerous examples of Christians who have committed violence, they quickly fire back that such people are not “true” Christians. (Yet, by their logic, the vast majority of Muslims who are peaceful do not represent “true” Islam. Go figure.)

  • Overcoming our natural tendencies to self-justify. Developing awareness of our tendency to paper over dissonance is the first step; it helps us to make conscious choices, rather than falling back upon automatic, self-protective behaviors. We should admit our mistakes, if for no other reason than that we are likely to be found out anyway.

    When I find myself in the midst of a difficult situation, I might try to ask, “Why am I wrong?” Rather than assuming I’m correct, asking this question helps me to think outside the box I’ve already built. I might also try to imagine that I’m helping a friend in a similar position, granting myself some separation and the ability to see things from a more objective, outside perspective.

    Above all, be willing to be wrong. Treat mistakes as a natural part of becoming better, not as a reflection of who you are.

  • “Understanding how the mind yearns for consonance, and rejects information that questions our beliefs, decisions, or preferences, teaches us to be open to the possibility of error. It also helps us let go of the need to be right. Confidence is a fine and useful quality; none of us would want a physician who was forever wallowing in uncertainty and couldn’t decide how to treat our illness, but we do want one who is open-minded and willing to learn. Nor would most of us wish to live without passions or convictions, which give our lives meaning and color, energy and hope. But the unbending need to be right inevitably produces self-righteousness. When confidence and convictions are unleavened by humility, by an acceptance of fallibility, people can easily cross the line from healthy self-assurance to arrogance.” — Carol Tavris and Elliot Aronson

It’s difficult to do this book justice; it contains many more powerful and relevant lessons than can be readily summarized here. I recommend it as one of the more important and influential books I have ever read.

Chris Aram

