How we decide who and what to believe
We appear to live in an age of misinformation.
Certain broadcasters and social-media celebrities openly promote falsehoods or misrepresent science and data to their audiences, many of whom seem not to care whether the claims are true, as long as they are hearing what they want to hear.
These figures may promote misinformation out of an over-inflated belief in their own judgment and knowledge, or simply because they relish the chance to proclaim their contrarian or ideological views. Sometimes, it's just self-interest.
Many of us have at least a few controversial beliefs. We might believe that the death penalty deters crime, that raising the minimum wage decreases unemployment, or that raising business taxes will reduce innovation.
We might even believe that women are not as good at maths as men or that the Earth is flat.
Some of these beliefs we will hold strongly.
But when we attempt to justify our beliefs, we often find the evidence pool is very shallow.
Researchers have identified a chronic "illusion of explanatory depth": we systematically overestimate our understanding of how the world works.
We can discover this for ourselves by trying to justify our pet beliefs. To illustrate, when I interrogate myself about why I believe the death penalty is not a deterrent, I find there is not much there beyond the consensus of my peer group (some of whom, I hope, have looked into the evidence), some intuition, and vague memories of reading a few blog posts or newspaper articles. This is not a lot. But perhaps it is not surprising: we simply don't have time to be experts on everything.
The Dunning-Kruger effect, however, is a population-level effect, so no individual can 'have' it. What it primarily tells us is that just because someone is confident doesn't mean they are right. Indeed, there are individual differences in confidence, with some people absurdly sure of themselves and others quite diffident.