EXTREME BELIEFS ARE THE HARDEST TO QUIT

It’s a good bet that you know somebody (or know somebody who knows somebody) who has fallen down the rabbit hole of a conspiracy theory or some other extremist beliefs.
 
It’s also a good bet that you find yourself scratching your head, wondering how, even allowing space for differences of opinion and interpretation, people can hold on to such beliefs when they’re so completely contradicted by the facts.
 
The head-scratching is understandable, given that our intuition is that, when it comes to our beliefs, the facts are in the driver’s seat. Yet the science tells us this often isn’t the case.
 
We trust that, if we encounter information that conflicts with a belief we hold, we will change our belief. After all, we think we’re pretty rational, and that would be the rational response in situations where the facts completely obliterate the basis for the belief.
 
But in the battle between facts and beliefs, the facts too often lose out. 
 
That’s particularly true the more extreme somebody’s beliefs are, which should alarm all of us given the sheer number of people who hold these types of beliefs. 
 
Take QAnon as an example. A December 2020 poll by NPR and Ipsos found that 17 percent of Americans believed that “a group of Satan-worshiping elites who run a child sex ring are trying to control our politics and media.”
 
Seventeen percent.
 
That’s the same group of people who believe that “in addition to molesting children, members of this group kill and eat their victims to extract a life-extending chemical called adrenochrome.”
 
And QAnon – at least as conspiracy theories go – isn’t a historical one-off. Point to any time in history, and I can show you some conspiracy theory that gained traction with a significant portion of the population, whether it’s the belief that the moon landing was faked, that AIDS was developed in a lab to kill homosexuals and African Americans, that 9/11 was an inside job, or that Sandy Hook was a false flag.
 
No matter the facts, these conspiracy theories persist. And when confronted with the facts, believers often escalate their commitment, doubling and tripling down on their extreme beliefs. 
 
When Prophecy Fails documents one of the pioneering and most famous studies of this phenomenon. Psychologist Leon Festinger and several colleagues infiltrated a small doomsday cult in 1954. According to transmissions one cult member claimed to be receiving from space aliens, the world would be destroyed by a flood on December 21st of that year. But the aliens would send a spaceship immediately before the flood to rescue true believers.
 
This, of course, set up a situation where on December 21st, when the spaceship and the flood failed to materialize, the facts would come into very clear and indisputable conflict with the extreme beliefs of the cult members. 
 
If you had joined this doomsday cult and, on the prophesied day, there was no flood and there were no aliens, don’t you think your response would be something like, “Oops, I really whiffed on that one,” and you’d walk away from the cult?
 
Of course, we’d all suppose that we wouldn’t join the cult in the first place. But, if we did, I think we’d also like to believe that we would walk away when the prophecy failed. 
 
But Festinger found that when the aliens failed to arrive, not only did the majority of the cult members not abandon their belief system, but they doubled down on it, escalating their commitment to the cult. 
 
Now you might be thinking, “But those people are wacky! After all, they joined a CULT! This obviously doesn’t apply to me.”
 
But this phenomenon isn’t limited to extremists and cult members. This is happening in your life, to you, every single day, in ways big and small. 
 
Even for people who are analytical by nature. 
 
Katy Milkman and John Beshears looked at 18 years of corporate earnings estimates and updates by more than 6,000 stock analysts. They found that when analysts make extreme, out-of-consensus predictions of corporate earnings and those forecasts don’t play out, the analysts don’t modify their views to fit the new information. Instead, they commit even further to their out-of-consensus forecasts. 
 
In comparison, when analysts make mainstream, in-consensus forecasts, they are much more willing to update those forecasts in the face of new information about actual earnings. 
 
In other words, the analysts are less willing to change their minds when their views are out of consensus than when they are in the mainstream. This is particularly concerning because analysts are financially motivated to offer accurate forecasts, so it is not in their financial best interest to stick to their out-of-the-mainstream views.
 
Extreme beliefs, contrary to logic and intuition, seem the most resistant to change in the face of the facts. 
 
What we believe informs the decisions we make. When you hold a false belief, especially one that is extreme, that can drive decisions that hurt you and others.
 
Sometimes that means an analyst costing themselves financially because they refuse to update an earnings forecast. 
 
Sometimes the stakes are even higher, like when a guy believes high-ranking Democrats are operating a child sex ring out of a Washington, D.C. pizzeria (Pizzagate) and decides to shoot up the pizza parlor, searching for non-existent underground tunnels to free children he believed were being held there.
 
Back in 1978, Jim Jones convinced more than 900 people to drink fruit punch laced with cyanide, in what remains one of the largest cult massacres in American history.
 
There are people who died of COVID because they refused to get vaccinated based on the belief that Bill Gates or George Soros was trying to inject them with a microchip. 
 
When most people think about the social media ecosphere, they assume that people are so entrenched in these beliefs because of confirmation bias: people retreat into their informational bubbles, seek out only the information that agrees with them, and never see the stuff that contradicts them.
 
There is a certain hope behind that thinking. If confirmation bias is the problem, then it follows that if we could just penetrate those information bubbles and get the facts through, people would change their minds.
 
But when facing the choice between changing your mind and rationalizing away the facts, the facts too often don’t prevail.
 
Especially when the belief you are protecting is extreme.