Are We Wrong To Think We're Right?

Nov 4, 2016

Part 4 of the TED Radio Hour episode Democracy On Trial

About Julia Galef's TED Talk

Our biases lead us to bend the facts. Writer Julia Galef explains how we can exercise better judgment by developing more empathy and testing our own beliefs.

About Julia Galef

Julia Galef co-founded the Center for Applied Rationality, a nonprofit organization devoted to helping people improve their reasoning and decision-making, particularly with the aim of addressing global problems.


GUY RAZ, HOST:

It's the TED Radio Hour from NPR. I'm Guy Raz, and on today's show, democracy on trial. And one of the things about democracy is that everyone gets a voice. It's basically why democracy is messy: everyone can say, I'm right and you're wrong.

JULIA GALEF: Yeah. If they're in an argument with someone else, they're generally not thinking carefully about the extent to which the other person might be right. They're reflexively reaching for whatever rebuttals they can find to throw at the other person. So the question is: why is our instinct to just defend our own viewpoints and attack other people's viewpoints instead of trying to examine questions objectively?

RAZ: This, by the way, is Julia Galef. She's a scholar who studies why it's so hard for people to see things from someone else's perspective.

GALEF: It's easy for me to look at someone else's perspective and say that's ridiculous. I just don't understand how anyone could possibly think, you know, there's a giant conspiracy to cover up the truth about global warming...

RAZ: Yeah.

GALEF: ...That it's all a hoax or...

RAZ: Right.

GALEF: ...Something. That seems very ludicrous to me. But you can't just evaluate how reasonable that belief is in isolation. You have to look at it in the context of all the other things that person believes, you know, are true about how the world works. And this is part of what makes it hard for people to cognitively empathize with how someone else could believe something without being a totally insane, deluded or evil person. And it's also what makes it hard for people to change their minds.

RAZ: And there's actually a name for this idea that our biases can make us believe things that are objectively false. That idea is called motivated reasoning. And Julia argues that that kind of thinking can actually be incredibly destructive - take, for example, the case of Alfred Dreyfus, a Jewish officer in the French army who was falsely convicted of treason. Julia picks up the story from the TED stage.

(SOUNDBITE OF TED TALK)

GALEF: So I'm going to take you back to 19th century France, where officers in the French General Staff discovered that someone in their ranks had been selling military secrets to Germany. Their suspicions quickly converged on Dreyfus. He had a sterling record, no past history of wrongdoing, no motive as far as they could tell. But Dreyfus was the only Jewish officer at that rank in the army.

And unfortunately, at this time, the French army was highly anti-Semitic. They went and searched Dreyfus's apartment looking for any signs of espionage, and they didn't find anything. And this just convinced them more that Dreyfus was not only guilty but sneaky as well because clearly he had hidden all of the evidence before they had managed to get to it. And they sentenced him to life imprisonment on the aptly named Devil's Island.

So there he went, and there he spent his days writing letters and letters to the French government begging them to reopen his case so they could discover his innocence. But for the most part, France considered the matter closed. This is a case of what scientists call motivated reasoning. It's this phenomenon in which our unconscious motivations, our desires and fears shape the way we interpret information. So some information, some ideas feel like our allies - we want them to win. We want to defend them. And other information or ideas are the enemy, and we want to shoot them down. So this is why I call motivated reasoning soldier mindset. Our judgment is just strongly influenced, unconsciously, by which side we want to win.

And this is ubiquitous. This shapes how we decide how to vote, what we consider fair or ethical. And what's most scary to me about motivated reasoning or soldier mindset is how unconscious it is. We can think we're being objective and fair-minded and still wind up ruining the life of an innocent man.

RAZ: Yeah. I mean, it's interesting because, you know, sometimes we cannot accept that a person that disagrees with us can be right about anything. I mean, you look at an election, for example - right? - and you see a candidate that you disagree with and, you know, you can't accept that what they say is true.

GALEF: I think what's going on when people refuse to acknowledge even minor points on which the opposing candidate is correct, part of what's going on there is that we have this instinctive model where if we allow that a single piece of evidence contradicts our worldview, we are obliged to instantly change our mind about this really important topic. Like, if I acknowledge that Trump was right about this one particular point, then I'm sort of forced to acknowledge, oh, well, maybe he is actually a truth teller and maybe Hillary is crooked and maybe Trump should win.

And that kind of change feels like an absurdly large response to this, you know, one piece of evidence. And so given that we feel like our choices are either acknowledge he's correct about this and then, you know, update our model entirely or find some reason to dismiss him as wrong, we opt for the latter because, you know, the former just seems so absurd. And it is absurd. You shouldn't change your mind about a huge topic just based on one additional small piece of evidence. But you have to allow the possibility of that happening at some point by acknowledging when little bits of evidence contradict your worldview.

RAZ: OK, so let's go back to the story of Alfred Dreyfus, who most everyone thought was guilty, except for one man, Colonel Georges Picquart.

(SOUNDBITE OF TED TALK)

GALEF: He's another high-ranking officer in the French army. Like most people in the army, he was at least casually anti-Semitic. But Picquart began to suspect: what if we're all wrong about Dreyfus? And what happened was that he had discovered evidence that the spying for Germany had continued even after Dreyfus was in prison. Eventually, Georges Picquart managed to get Dreyfus exonerated, but it took him 10 years. A lot of people feel like Picquart can't really be the hero of this story because he was an anti-Semite and that's bad, which I agree with. But personally, for me, the fact that Picquart was anti-Semitic actually makes his actions more admirable because he had the same prejudices as his fellow officers. But his motivation to find the truth and uphold it just trumped all of that.

Picquart is a poster child for what I call scout mindset. It's the drive not to make one idea win or another lose but just to see what's really there. Scouts are curious. They think it's virtuous to test your own beliefs, and they're less likely to say that someone who changes his mind seems weak.

RAZ: OK. So basically, scout mindset - it's kind of like the best version of ourselves.

GALEF: Absolutely. It's not always the most natural way to approach an issue, especially an emotionally charged one. But I think, to the extent that our civilization has made progress - especially moral progress - it's been because there are people who are able to see things through scout mindset. That's what allows us to revise our model of how the world works so that the things that we're trying to do for the world are actually more likely to be good and not bad.

RAZ: All right. All right. So I'm going to make a confession here, Julia, which I think is probably self-evident if you've heard some of this episode. But I actually believe in democracy, right? Like, I think it's a pretty good system. So I'm open, you know, to having my mind changed about a lot of things. But if I met somebody who said, you know, democracy is horrible and it's a terrible system, I don't think they're going to change my mind about that.

GALEF: Yeah. I mean, when two people have really different perspectives, I think the most productive thing to do is to try to look for the crux of the disagreement, where the crux is some underlying assumption or empirical question about the world that, if you found out you were wrong about it, would actually change your mind about the broader question, which, in this case, is, you know, is democracy the best system? And, you know, do the thought experiments of - OK, let's say I found out that, you know, 10 years from now, the average happiness level in China was higher than in the U.S. and people were well-off and, you know, etc., etc. - then would that budge my opinion of the usefulness of democracy? If not, then, you know, you try another thought experiment, etc. And this does not mean that you're always going to end up agreeing with each other at the end of the conversation. But at the very least, both people end up with a much fuller and more fleshed-out model of how the other person is thinking. And I think that is progress.

RAZ: Julia Galef - she's co-founder of the Center for Applied Rationality. You can check out her full talk at ted.com. Transcript provided by NPR, Copyright NPR.