If we hold an opinion fiercely and are then told we've got the facts wrong and given a correction, we do not change our minds. No, most of us dig in deeper. We believe what we believe, and no amount of 'true facts' will change our minds. We only accept new information that confirms our original opinion.
Oh, and here's the kicker: The less self-confident we are, and the more afraid and unsure we feel, the more strongly we resist new information that might conceivably change our minds. We keep our own little world safe, and we simply don't change our minds when confronted with new and more accurate information.
"How can we have things so wrong, and be so sure that we’re right? Part of the answer lies in the way our brains are wired. Generally, people tend to seek consistency. There is a substantial body of psychological research showing that people tend to interpret information with an eye toward reinforcing their preexisting views. If we believe something about the world, we are more likely to passively accept as truth any information that confirms our beliefs, and actively dismiss information that doesn’t. This is known as “motivated reasoning.” Whether or not the consistent information is accurate, we might accept it as fact, as confirmation of our beliefs. This makes us more confident in said beliefs, and even less likely to entertain facts that contradict them."
We're wired for this kind of behavior. And this behavior is fed continuously in these times by 24/7 news cycles that have made entertainment out of policy and politicians. We hear slogans in 10-second clips, and we believe.
Keohane writes, “The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon — known as “backfire” — is “a natural defense mechanism to avoid that cognitive dissonance.”
He continues, "These findings open a long-running argument about the political ignorance of American citizens to broader questions about the interplay between the nature of human intelligence and our democratic ideals. Most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas, and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence. In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. And rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions. Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we’re right, and even less likely to listen to any new information. And then we vote."
"The persistence of political misperceptions remains a young field of inquiry. “It’s very much up in the air,” says Nyhan.
"But researchers are working on it. One avenue may involve self-esteem. Nyhan worked on one study in which he showed that people who were given a self-affirmation exercise were more likely to consider new information than people who had not. In other words, if you feel good about yourself, you’ll listen — and if you feel insecure or threatened, you won’t. This would also explain why demagogues benefit from keeping people agitated. The more threatened people feel, the less likely they are to listen to dissenting opinions, and the more easily controlled they are."
Well, there you have it. I think I wrote something about this thing called fear a couple of years ago, when talking about Karl Rove's schemes and plans. Something about keeping the public on edge, uncertain and fearful, because then you've got them in the palm of your hand. Add in that endless news cycle, and look toward the end of democracy. Of course, that's what the fearful are thinking too. We both see the end of democracy, for very different reasons.