I recently read Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts on Dr. Eades’s recommendation. The book centers on cognitive dissonance theory (read this link for a good overview), which is rife with insights into why people hold irrational beliefs. The essence of the theory is that when people are confronted with dissonant beliefs such as “I’m a good, smart person” and “I just made a bad mistake,” they tend to rationalize the latter so that the former is not undermined. Such a person might convince himself that it wasn’t a mistake, or that somebody else was responsible for it. Likewise, when people with poor self-esteem do something good, they tend to rationalize it away, e.g., “It would have happened without me anyway.” This rationalizing process can take someone step by step to the point of justifying things that they would have considered crazy at the outset.
A good example is the Milgram experiment: the volunteers proceeded to deliver more and more powerful “shocks” (up to what they believed were dangerous levels) because they justified it one shock at a time. Once they had delivered the first shock, it wasn’t a big leap to justify giving the second shock, nor the third, and so on. But stopping for fear of harming the person would mean admitting that the previous few shocks, which weren’t much weaker, had also been harmful. They faced the dissonance of admitting that they were wrong to be giving the shocks in the first place, which is why many people rationalized the shocks and continued as instructed. (Interestingly, this suggests that it was the incremental nature of the process that led so many to deliver dangerous shocks, and not just obedience to an authority figure, as commonly believed. Had the authority figure ordered the volunteers to deliver a dangerous shock right from the start, many more of them would have refused, because they wouldn’t have had any dissonance to resolve.)
Confirmation bias plays an important role in rationalizing our beliefs, allowing us to discount or ignore disconfirming evidence and focus on the confirming evidence. People can become entrenched in the craziest beliefs via this process of step-by-step justification with confirmation bias, e.g., flat-earthers, vegetarians, and religious people. Take vegetarianism: let’s say you object to the cruel treatment of animals on factory farms. From there, it’s only a small step to the belief that killing animals is wrong. And from there, it’s a small step to the belief that eating meat is wrong. Confirmation bias smooths each step: you ignore or discount the counter-arguments and convince yourself with all the supporting arguments. Step by step, you slide further and further. By the end of the process, you’ve gone from the reasonable belief that animals shouldn’t be treated cruelly to the absurd belief that eating meat is wrong. The lesson here is that once we become committed to a belief, we become motivated to justify it. Confirmation bias helps smooth the process, and step by step we can end up strongly believing something that we previously would have considered ridiculous.
Cognitive dissonance theory offers some interesting practical advice. For instance, if you’re considering making a big purchase, don’t base your decision on the opinion of somebody who just made that purchase. They’ll be motivated to rationalize the purchase and you’ll tend to get biased advice. Another tip: if you want to win somebody’s friendship, get them to do a favor for you. They’ll be motivated to justify the favor by telling themselves that you’re a good person and you deserved it. Conversely, if you harm someone, you’ll be motivated to justify the harm by convincing yourself that the person deserved it. So venting your anger at someone is counter-productive: you’ll come to hate that person even more.
Cognitive dissonance theory has much to say about mass delusions, which have a veneer of legitimacy because of the sheer number of believers. The main three are religion, statism, and mainstream health (or god, government, and grains). To non-believers, these are completely loony beliefs that persist only because of cultural momentum. Because of this, a believer faces great dissonance in admitting that such beliefs are foolish. It would be very difficult for them to admit that their religion is nothing but a fairy tale; they would experience strong dissonance between “I’m intelligent and rational” and “I strongly believed in a fairy tale.” The dissonance would be even worse for intellectuals, who play a crucial role in maintaining the legitimacy of widespread beliefs. To admit error is to admit that they misled countless people—a terrible thing to do—so there’s a strong motivation to rationalize the belief and convince themselves that they’re right. The dissonance becomes extreme in fields where the ideas have horrible real-world consequences, such as mainstream health or the social sciences. It would be extremely difficult to accept that “I promoted ideas that caused misery and death for countless people.” In these cases, the motivation to rationalize is tremendously powerful, which explains why conversions among these intellectuals are practically non-existent.
It’s important to understand this when working to explode mass delusions. Erroneous beliefs are rarely dropped right away—if they’re dropped at all, it’s often through a step-by-step reversal of the process that led there in the first place. One important conclusion we can draw is that if we want to convince someone of their error, we should point it out respectfully and humbly. If we viciously attack their position as though only an idiot would believe it, they face the dissonance of admitting that they strongly held an idiotic position, and they will be motivated to entrench themselves in it even further. Of course, we don’t always aim to convert the other person, in which case vicious attacks on their position can motivate other critics and win over undecided people. But when we really do want them to change their position, we would be wise to recognize that it will take time for them to correct their beliefs, and that respectful criticism will go much further than a head-on assault.
Perhaps most importantly, understanding cognitive dissonance theory can help us overcome our own biases and avoid the dangers of rationalization. The motivation to rationalize is quite difficult to escape, even for the authors of the book. We may always be susceptible, but we can protect ourselves by being aware of when we rationalize and stopping the process before it goes too far. It can be difficult and even humiliating to admit error, but a strong commitment to truth can provide the motivation. In the long run, a cultural shift in our attitudes towards mistakes would solve the bulk of the problem. If mistakes were considered normal and admission of error honorable, it would be much easier to admit error from the start, before rationalizing our way into delusion.