It is a common assumption in science that establishing the real facts will change how people feel about an issue, but for most humans this is hardly true. People in science and technology seek out new research and evidence to support claims that go against common belief. Yet even when they find such proof, they have a very hard time "convincing" others of it. And this is where they become disillusioned.
"Why don't people 'believe' the facts?" they ask. Because they are people, it turns out.
There was a time when free thinkers faced serious friction from people who insisted the Earth was flat when it is, in fact, spherical. The clashes between well-known philosophers and the public of their day are still cited whenever someone "defends" their facts, especially in the political context. Political researchers have discovered something that should seriously discourage anyone who trusts in the power of information: facts do not really have the power to change human minds. If anything, they work in almost the opposite way. In a series of studies conducted in 2005 and 2006, researchers at the University of Michigan found that when misinformed political partisans were presented with new facts correcting their mistaken beliefs, they rarely changed their minds. In fact, their beliefs often became even more entrenched. The facts acted like a low-potency antibiotic, strengthening the misinformation instead of curing it.
Anyone who believes something has usually believed it for years, maybe decades. So when confronted with a contradicting fact, they have two options:
Either accept that their whole lives and beliefs have been wrong, and question every one of those beliefs, or
Ignore the fact.
Research shows that if a fact is not life-changing, people will hear you out. But if it threatens the belief system they have built over the years, their defense mechanisms kick in, and they retreat into that shell of beliefs. This phenomenon is called the "backfire effect", a natural way of avoiding cognitive dissonance.
This calls into question the most important principle of a democracy: that a well-informed electorate is the best electorate.
Sadly, it does not work that way. It is very easy for people to be wrong, because their beliefs can be reinforced by scraps of advice, rumors, misinformation, and half-truths; it is very hard for anyone to be reliably right. In the political context, the apparent solution is simple: ignorant people can simply choose not to vote.
But the fact is, the misinformed tend to hold the strongest opinions.
James Kuklinski of the University of Illinois at Urbana-Champaign conducted a study in 2000 in which more than 1,000 Illinois residents were asked about welfare: how much the state spends on it, where the budget is cut, and the like. While more than half were strongly confident that their answers were correct, only 3% actually were. The researchers frankly dubbed this the "I know I'm right" syndrome and called it the single greatest threat to a real democracy. Most people ignore facts, and those who most need to change their beliefs tend to do the exact opposite.
Diving further in, we see that facts are "cold, hard truths" that people can either face or ignore. To actually consider a fact and question their beliefs, people need to feel personally connected to the messenger. The better you connect with people, the more likely they are to hear out the facts you tell them and respond to them positively. But this lowering of the belief-shield can only happen on a personal level, one person at a time.