(Vox) Why we pretend to know things, explained by a cognitive scientist
New research explains why we pretend to know more than we do.
Why do people pretend to know things? Why does confidence so often scale with ignorance? Steven Sloman, a professor of cognitive science at Brown University, has some compelling answers to these questions.
“We’re biased to preserve our sense of rightness,” he told me, “and we have to be.”
Sloman, the author of The Knowledge Illusion: Why We Never Think Alone, focuses his research on judgment, decision-making, and reasoning. He’s especially interested in the “illusion of explanatory depth,” the term cognitive scientists use for our tendency to overestimate our understanding of how the world works.
We do this, Sloman says, because of our reliance on other minds.
“The decisions we make, the attitudes we form, the judgments we make, depend very much on what other people are thinking,” he said.
If the people around us are wrong about something, there’s a good chance we will be too. And if they’re right, our proximity to truth compounds in the same way.
In this interview, Sloman and I talk about the problem of unjustified belief. I ask him about the political implications of his research, and whether he thinks the rise of “fake news” and “alternative facts” has amplified our cognitive biases.
This conversation has been lightly edited for length and clarity.
This is another way of saying that we live in a community of knowledge.
That’s right. I believe every thought we have depends on thoughts that other people are having. When I cross the street, my actions depend on the thoughts going through the driver’s head. If I get on the bus, the success of my endeavor depends on the thoughts going on in the bus driver’s head.
When I express an attitude about immigration, what am I really doing? What do I really know about immigration? I live in a very limited universe, and so I have to depend on the beliefs and knowledge of other people. I know what I’ve read; I know what I’ve heard from experts. I don’t have any direct experience of the immigration problem; I haven’t visited the border and studied it myself.
In that sense, the decisions we make, the attitudes we form, the judgments we make, depend very much on what other people are thinking.
There are some obvious dangers here, right?
One danger is that if I think I understand because the people around me think they understand, and the people around me all think they understand because the people around them all think they understand, then it turns out we can all have this strong sense of understanding even though no one really has any idea what they’re talking about.
I’m trying to think about all of this in terms of our political circumstances. Most of us don’t understand as much as we think, and yet we’re all cocksure about a range of issues. So when we are arguing about politics, what are we really arguing about? Is it about getting it right or is it about preserving our sense of rightness?
I’m not sure there’s a sharp distinction between wanting to get it right and wanting to preserve our sense of rightness. In the political domain, like most domains in which we don’t just hear or see what’s true, we rely on social consensus. So argument is about trying to convince others while we’re trying to convince ourselves. Getting it right essentially means we’re convinced.
Of course, we’re biased to preserve our sense of rightness, but we have to be. If we weren’t, we’d be starting again each time we approached an issue; our previous arguments would be for naught.
Nevertheless, people differ on this. Everyone has a compulsion to be right, meaning that they want the people around them to think they’re right, and this is easily achieved by mouthing the things that the people around you say. And people who are more capable tend to be better at finding ways to interpret new facts in line with their community’s preconceptions.
But some people do try to rise above the crowd: to verify claims independently, to give a fair hearing to others’ claims, and to follow the data where it actually leads. In fact, many people are trained to do that: scientists, judges, forensic investigators, physicians, and so on. That doesn’t mean they always succeed, just that they’re supposed to try.
I like to live in communities that put a premium on getting things right even when they fly in the face of social norms. This means living with constant tension, but it’s worth it.
This phenomenon, the “illusion of explanatory depth,” applies equally to people on the left and the right. This isn’t a partisan problem; it’s a human problem.
That’s exactly right, and our data shows this clearly.
How do you collect that data? What sorts of experiments have you done to tease out these tendencies?
I run experiments in my lab and over the internet. We try to find representative groups of Americans and ask them questions, mostly hypothetical questions. In the case of the illusion with political policies, we ask people to rate their attitude and their understanding of a policy, then ask them to explain the policy (what it is and how it would lead to specific consequences), and then they rate their own understanding and their attitude again. We find that the attempt to explain reduces their sense of understanding and also makes their attitude less extreme, on average.