In today's political climate, it sometimes feels like we can't even agree on basic facts. We bombard each other with stats and figures, hoping that more data will make a difference. A liberal might show you the same climate change graphs over and over; a conservative might point to the trillions of dollars of growing national debt. We're left wondering, "Why can't they just see? It's so obvious!"
Certain myths are so pervasive that no matter how many experts disprove them, they only seem to grow in popularity. There's no shortage of peer-reviewed studies showing no link between autism and vaccines, for example, but these are no match for an emotional appeal to a parent worried for his young child.
Tali Sharot, a cognitive neuroscientist at University College London, studies how our minds work and how we process new information. In her book The Influential Mind, she explores why we ignore facts and how we can get people to actually listen to the truth.
Tali shows that we're open to new information – but only if it confirms our existing beliefs. We find ways to ignore facts that challenge our ideals. And as neuroscientist Bahador Bahrami and colleagues have found, we weigh all opinions as equally valid, regardless of expertise.
So how do we identify the experts, the people who have the correct answer about a given fact? Economist Drazen Prelec and colleagues recently published research that shows how to identify what they refer to as "the surprisingly popular vote" on a given topic.
Still, having the data on your side is not always enough. For better or for worse, Sharot says, emotions may be the key to changing minds.
Hidden Brain is hosted by Shankar Vedantam and produced by Maggie Penman, Jennifer Schmidt, Rhaina Cohen, Renee Klahr, and Parth Shah. Our supervising producer is Tara Boyle. You can follow us on Twitter @hiddenbrain, and listen for Hidden Brain stories each week on your local public radio station.
- Tali tells us that fear is a powerful motivator for inaction, but positive feedback is a better motivator for action. She talks about a study in which health care workers were given positive feedback for washing their hands.
- Tali talks about the ability of strong emotional appeals - and even some powerful speeches - to synchronize brain activity across listeners and between listeners and speakers.
- Sometimes crowd wisdom fails us - but the "surprisingly popular vote" means that we can still learn the right answer to a question, even if most of the crowd is wrong.
SHANKAR VEDANTAM, HOST:
Hey there. Shankar here. As we reach the end of 2017, there are four words you almost never hear, whether that's from people in the news or from friends and family. You're right; I'm wrong. We don't hear those words, and we don't say those words because we all have elaborate mental defenses to keep threatening ideas at bay. These psychological defenses can give us a short-term boost even as they expose us to long-term risks. We're taking a bit of time off this week, so we thought we'd share a conversation from March with the neuroscientist Tali Sharot. We found this episode helped us understand why it's so hard to listen to ideas that challenge our beliefs. It also gave us some techniques to help us see the world with fewer biases and preconceptions. As you gather with friends and family to celebrate the new year, we hope you'll find it useful, too.
(SOUNDBITE OF MUSIC)
VEDANTAM: This is HIDDEN BRAIN. I'm Shankar Vedantam. There are certain issues where we just don't see eye to eye with people on the other side of the political aisle. Often, it feels like we don't just disagree over policy. We can't even agree on the facts.
UNIDENTIFIED WOMAN: I think people are mostly just set in their ways.
UNIDENTIFIED MAN #1: I would say I'm right objectively.
UNIDENTIFIED MAN #2: The facts prove that there is global warming. I mean, I don't know how you can dispute it.
UNIDENTIFIED WOMAN: But everybody thinks they're right.
UNIDENTIFIED MAN #3: I think they're wrong. I think they're making a huge mistake because security has to be a very big issue.
VEDANTAM: My guest today has spent years studying the way we process information and why we often reach biased conclusions. She says it's surprisingly difficult for us to change one another's minds no matter how much data we present. But just a little bit of emotion - that can go a long way.
(SOUNDBITE OF MUSIC)
VEDANTAM: Tali Sharot is a cognitive neuroscientist at University College London. In her upcoming book, "The Influential Mind," she explores how our minds work, why we ignore the facts and how you can get people to actually listen to the truth. Tali, welcome to HIDDEN BRAIN.
TALI SHAROT: Thank you, great to be here.
VEDANTAM: Some months ago, you were listening to a Republican presidential debate. And candidate Donald Trump was asked a question about the safety of childhood vaccines. Here's what he said.
(SOUNDBITE OF ARCHIVED RECORDING)
PRESIDENT DONALD TRUMP: You take this little, beautiful baby, and you pump - I mean, it looks just like it's meant for a horse, not for a child. And we've had so many instances - people that work for me - just the other day, 2 years old - 2 and a half years old, a child, a beautiful child went to have the vaccine and came back and a week later got a tremendous fever, got very, very sick, now is autistic.
VEDANTAM: Tali, you're a mom. You have two small children. At the time Trump said this, one of your kids was 2 years old. The other was 7 weeks old. Describe your emotional reaction to what he said.
SHAROT: Yeah, so when I was listening to Trump at that debate, I was already quite concerned because I was a mother of a small child. So you're always concerned about the health and safety of your kids, especially when they're only a few weeks old. And so it really tapped into this fear that I had and the anxiety that I already had. And when he talked about this huge syringe - a horse-size syringe that was going to go into the baby, in my mind I could imagine this syringe inserted into my small, little child and all the bad things that could happen.
And this was a very irrational reaction on my end because I know that there is not an actual link between autism and vaccines. But it's not enough to have the data. Ben Carson - Dr. Ben Carson was on the other end.
(SOUNDBITE OF ARCHIVED RECORDING)
BEN CARSON: Well, let me put it this way. There has been - there have been numerous studies, and they have not demonstrated that there's any correlation between vaccinations and autism.
SHAROT: But that wasn't enough because the data is not enough. And even if the data is based on very good science, it has to be communicated in a way that will really tap into people's needs, their desires. If people are afraid, we should address that.
VEDANTAM: I'm curious. When you sort of contrasted, you know, the weight of the evidence on the one hand and this very powerful image of the horse syringe and your 7-week-old baby on the other hand, how did you talk yourself into trusting the data over that emotional image?
SHAROT: What really helped is that I understood what was happening to me. You know, the first instinct was, like, a stress and anxiety. But because this is what I study, I knew what my reaction was. I knew where it was coming from. I knew how it was going to affect me. And I think that awareness helped me to put it aside and say, OK, I know that I am anxious for the wrong reasons. And this is the action that I should take. It's a little bit when you're on a plane and there is turbulence and you get scared. But telling yourself, I know that turbulence is not actually anything that's dangerous; I know the statistics on safety on planes and so on, it helps. It helps people reduce their anxiety.
(SOUNDBITE OF MUSIC)
VEDANTAM: The facts don't always relieve our anxieties, though. Sometimes they only harden our views. Some time ago, Tali did a study where she presented information to people who believe that climate change is real and to people who are skeptics. She found for both groups, people strengthened their pre-existing beliefs when new information confirmed what they thought. But both groups ignored information when it challenged their views. I asked Tali about this.
SHAROT: Our psychological biases are the same across individuals on average. We all have what's known as a confirmation bias. A confirmation bias is our tendency to take in any kind of data that confirms our prior convictions and to disregard data that does not conform to what we already believe. And when we see data that doesn't conform to what we believe, what we do is we try to distance ourselves from it. We say, well, that data is not credible, right? It's not good evidence for what it's saying. So we're trying to reframe it, to discredit it.
VEDANTAM: I want to sort of push back at you just on one point, which is, sometimes it seems to me that it actually might be rational to reject data that comes in. You know, if I've seen gravity work all my life - it makes objects fall toward the ground - and you take me into a room where an object seems to be levitating, it seems to me the appropriate reaction is not, OK, everything I knew about gravity was wrong, but let me try and understand this anomaly. Let me try and understand why this object is not doing what it's supposed to do. It seems to me - and I am clearly betraying my own beliefs here - given the overwhelming weight of evidence that climate change is real, showing me data that one winter is colder than normal shouldn't make me change my overall view. Should it?
SHAROT: You're absolutely right. So the way that people tend to update their beliefs is that they use the new information in light of what they already believe because that, on average, is the rational way to go, exactly as you said. When we encounter some kind of information that really contradicts what we believe very, very strongly, on average, that information is wrong, right? So I give an example in my book where if someone comes in and says, I just saw pink elephants flying in the sky, and I have a very strong belief obviously that no pink elephants fly in the sky, I would then think that they're either delusional or they're lying. And there is good reason for me to believe that. So it's actually the correct approach to assess data in light of what you believe.
There's four factors that determine whether we're going to change our beliefs - our old belief, our confidence in that old belief, the new piece of data and our confidence in that piece of data. And the further away the piece of data is from what you already believe, the less likely it is to change your belief. And on average, as you go about the world, that is not a bad approach. However, it also means that it's really hard to change false beliefs. So if someone holds a belief very strongly but it is a false belief, it's very hard to change it with data.
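[The four factors Sharot describes map loosely onto Bayesian updating. As a rough illustration of that point - this sketch is mine, not from the episode - a belief can be modeled as a probability, and confidence in the new data as the evidence's assumed reliability:]

```python
def update_belief(prior, evidence_supports, reliability):
    """Bayesian update of a belief held with probability `prior`.

    evidence_supports: True if the new data supports the belief.
    reliability: assumed probability that the evidence is correct
        (i.e., confidence in the new piece of data).
    Returns the posterior probability of the belief.
    """
    # Likelihood of seeing this evidence if the belief is true vs. false.
    p_if_true = reliability if evidence_supports else 1 - reliability
    p_if_false = 1 - p_if_true
    return (prior * p_if_true) / (prior * p_if_true + (1 - prior) * p_if_false)

# A strongly held belief barely moves on contrary evidence of modest
# reliability - Sharot's point about why false beliefs are hard to change.
posterior = update_belief(prior=0.99, evidence_supports=False, reliability=0.8)
```

[With a 99 percent prior, even contrary evidence that is 80 percent reliable leaves the belief above 96 percent - the further the data sits from what you already believe, the less it moves you.]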
(SOUNDBITE OF MUSIC)
VEDANTAM: When we come back, we're going to talk about some of the ways we can reshape the beliefs of other people or our beliefs and try to answer the question, if throwing good information at people doesn't drive misinformation out of circulation, how do you get people to buy the truth? Stay with us.
(SOUNDBITE OF MUSIC)
VEDANTAM: It often feels as though fear is used to motivate us to act.
(SOUNDBITE OF ARCHIVED RECORDING)
RUDY GIULIANI: The vast majority of Americans today do not feel safe.
HILLARY CLINTON: On Sunday, Americans woke up to a nightmare that's become mind-numbingly familiar.
TRUMP: This could be the great Trojan horse of all time.
VEDANTAM: Politicians use fear to get us to vote. TV programs use fear to get us to keep watching. Public health officials use fear to get us to quit smoking. I asked Tali whether fear might be an effective way to persuade people to change their minds and maybe even their behavior.
SHAROT: Fear works in two situations. It works when people are already stressed out, and it also works when what you're trying to do is get someone not to do something, an inaction. For example, if you try to get someone not to vaccinate their kids, fear may work. If there is, you know, an apple that looks bad, I don't eat it. Fear is actually not such a good motivator for inducing action, while hope is a better motivator, on average, for motivating action.
VEDANTAM: You talk about one study in your book where a hospital managed to get its workers to practice hand hygiene, to get staff members to wash their hands regularly. But it turned out the most effective thing wasn't frightening the staff about the risks of transmitting infections. It was something else.
SHAROT: So in a hospital on the East Coast, a camera was installed to see how often medical staff actually sanitized their hands before and after entering a patient's room. And the medical staff knew that the camera was installed. And yet, only 1 in 10 medical staff sanitized their hands before and after entering a patient's room. But then an intervention was introduced - an electronic board that was put above each door. And it gave the medical staff, in real time, positive feedback. It showed them the percentage of medical staff that washed their hands in the current shift and the weekly rate as well.
So any time a medical staff will wash their hands, the numbers will immediately go up, and there will be a positive feedback saying, you know, good job. And that affected the likelihood of people washing their hands significantly. It went up from 10 percent to 90 percent. And it stayed there. Instead of using the normal approach, instead of saying, you know, you have to wash your hands 'cause otherwise you'll spread the disease - basically instead of warning them of all the bad things that can happen in the future which actually results in inaction, they gave them positive feedback.
VEDANTAM: One important idea that Tali has explored is something known as the equality heuristic. It's a mental shortcut, and it's really quite simple. We tend to assign equal weight to everyone's opinion. But sometimes when there's an expert in the room, this mental shortcut can lead us astray.
SHAROT: Different people have different expertise. And it's better to put more weight on people who are more knowledgeable or have more expertise in the domain that we're making the decision in. And there's been a study showing that this equality heuristic is something that people do around the world. So it's not something that people do only in democratic governments. But studies have been conducted in other countries such as China and Iran, and there, too, people go according to the equality heuristic. If they need to make a decision, they will get the opinions of quite a few individuals and then tally them up. And that's how they make their decision instead of actually using the person in the room who is more knowledgeable and has more expertise.
VEDANTAM: But how can you know who the expert is? When your question is about cancer, you can turn to an oncologist. But for other questions, many people might claim to be knowledgeable. It turns out there's a clever technique to separate the experts from the pretenders. It's called the surprisingly popular vote. Let me show you how it works with an example. I have two questions for you. The first is, what's the capital of Brazil? OK, here's a second question. What do you think most people will say is the capital of Brazil? Maybe you thought the capital was Rio de Janeiro. But maybe you knew the right answer, Brasilia. By looking at the results of those two quick polls, I can tell you with no prior information that the capital of Brazil is Brasilia. How would I know that?
SHAROT: Because if I think Rio de Janeiro is the capital of Brazil, I will also think that most people think that. However, if I think Brasilia is the capital of Brazil, I will still think that Rio is the answer that most people will give. So what this ends up being is that Brasilia will be an answer that's more popular than people expect.
VEDANTAM: I have to say it took me a minute to understand how this idea works. It's complicated, but it really is very clever. Here's another example. Let's look at a math problem. In a lake, there's a patch of lily pads. Each day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half the lake? The correct answer is 47 days because the lilies double in area every day. If they fill the lake on day 48, they will fill half the lake on day 47.
But the intuitive answer is the 24th day because 24 is half of 48. If you ask people what the answer is, most people will say 24. A few people will say 47. When you ask people what they think others will say, nearly everyone will predict that others will say 24. Since more people say the answer is 47 than expect others to say the answer is 47, 47 turns out to be the surprisingly popular vote answer and the correct one. It's basically a clever way to figure out who has expert or insider information. Betting markets work on a similar idea. If there's someone in a group who has knowledge about something and you give them a financial incentive to come forward with that information, you can bring insider information to the surface.
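[The tallying idea behind the surprisingly popular vote can be sketched in a few lines of code. This is an illustrative simplification, not Prelec and colleagues' exact estimator, and the function name and data shapes are my own:]

```python
from collections import Counter

def surprisingly_popular(answers, predictions):
    """Pick the answer whose actual share most exceeds its predicted share.

    answers: each respondent's own answer to the question.
    predictions: for each respondent, a dict mapping each possible answer
        to the fraction of the crowd they expect to give it.
    """
    n = len(answers)
    actual = {ans: count / n for ans, count in Counter(answers).items()}
    predicted = {
        ans: sum(p.get(ans, 0.0) for p in predictions) / len(predictions)
        for ans in actual
    }
    # The "surprisingly popular" answer is the one given more often
    # than the crowd predicted it would be.
    return max(actual, key=lambda ans: actual[ans] - predicted[ans])

# The Brazil example: most people answer Rio, but even the Brasilia
# voters expect the crowd to say Rio - so Brasilia beats expectations.
answers = ["Rio"] * 7 + ["Brasilia"] * 3
predictions = [{"Rio": 0.9, "Brasilia": 0.1}] * 7 + [{"Rio": 0.8, "Brasilia": 0.2}] * 3
winner = surprisingly_popular(answers, predictions)  # -> "Brasilia"
```

[Here only 30 percent answer Brasilia, but the crowd predicts just 13 percent will, so Brasilia's actual share beats its predicted share and it wins, even though Rio got more raw votes.]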
SHAROT: So the surprisingly popular vote tends to predict the correct answer in many different domains. It can be factual. It can be math. It can be even estimating the price of art. So they did a little study where they asked MIT students to estimate the price of art. Now, MIT students, for the most part - they don't really have a lot of knowledge about art. And yet when they asked them to estimate the price of art and also to say what other people will think about that and they took the surprisingly popular vote, the surprisingly popular vote was spot on.
VEDANTAM: I wrapped up my conversation with Tali by exploring one last idea about how we might convince others to listen to the truth. It had to do with a study of Princeton students who got their brains scanned while they listened to a story of a young woman named Annabel (ph). Producer Renee Klahr read an excerpt from that story.
RENEE KLAHR, BYLINE: (Reading) I know everyone has some crazy prom stories. But, well, just wait. I was a freshman in high school in Miami, Fla. And I'm new to the freshman scene. I'm new to the high school scene, I should say. And it's almost December, so I've been in high school for about three months. And this boy Charles asks me out. He's British. He's a junior. And he's really cute but sort of shy but just - well, it doesn't matter. So I say, yes. I'm excited.
VEDANTAM: The story goes on. It includes, as Tali writes, love, rejection, blood, alcohol and a couple of policemen - all the requirements of a best-seller. I asked Tali to tell me what happened in the minds of the Princeton students as they listened to that story.
SHAROT: So this is a study that was conducted by Professor Hasson's group at Princeton. And what they showed is that when one person listens to another person's story, the activity in the two brains synchronize. So if you're listening to me, then it is possible that the activity of your brain looks a lot like the activity in my brain. And at the beginning, the activity in your brain listening to my brain, it will be delayed a little bit, right? So I'm thinking about something. I'm saying the words. You're perceiving the words. And then the activity in your brain will follow the activity in mine.
However, they found in that study that after a while, the brain of the listener started preceding the brain of the storyteller because the brain of the listener was predicting what the storyteller will say. So if you looked at the listener brain, you could predict what the activity in Annabel's brain would look like. But the idea of synchronization seemed to be important for influence. Why? In another study that the same group did, they found that when people were listening to very strong speeches, what they found was the brains of the different people listening to those speeches started synchronizing. So if we all listen, for example, to Kennedy's famous moon speech, our brains would likely look very much alike.
(SOUNDBITE OF SPEECH)
JOHN F. KENNEDY: Those who came before us made certain that this country rode the first waves of the Industrial Revolution, the first waves of modern invention and the first wave of nuclear power. And this generation does not intend to founder in the backwash of the coming age of space. We mean to be a part of it. We mean to lead it.
SHAROT: And this is not only in regions that are important for language and hearing. It's also in regions that are important for emotion, in regions that are important for what's known as theory of mind, our ability to think about what other people are thinking, in regions that are important for associations. And you try to think, well, what's common to all these influential speeches that can cause so many people's activity to synchronize? And one of the most important things is emotion. If the storyteller or the person giving the speech is able to elicit emotion in the other person, then he's actually having somewhat of a control on that person's state of mind.
So think about it like this. If you're very sad and I'm telling you a joke, well, you're sad, so you're not going to perceive the joke as I perceive it when I'm happy. But if I'm able to first make you happy and then tell you the joke, well, then you perceive it more from my point of view. So by eliciting emotion, what you're able to do is change the perception of everything that comes after, to perceive information as the person who's giving the speech wants you to perceive it.
VEDANTAM: So you can see how this coupling, this idea that the audience's mind and the speaker's mind are in some ways coupled together - you can see how this could potentially be used to spread good information. You know, you have a great teacher in high school, and you're captivated by the teacher. And you're being pulled along by the story the teacher is telling you, maybe about history or maybe about geography. But you can also see equally how the same thing can work in the opposite direction, that you could be listening to a demagogue, or you could be listening to somebody who has a very sort of captivating rhetorical style. And this person could also lead you astray in just the same way that the great teacher can lead you to knowledge and to positive things.
SHAROT: Absolutely. All the different factors that affect whether we will be influenced by one person or ignore another person are the same whether the person has good intentions or bad intentions, right? The factors that affect whether you're influential can be, can you elicit emotion in the other person? Can you tell a story? Are you taking into account the state of mind of the person that's in front of you? Are you giving them data that conforms to their preconceived notions? All those factors that make one speech more influential than the other or more likely to create an impact can be used for good and can be used for bad.
VEDANTAM: Tali Sharot, I want to thank you for joining me on HIDDEN BRAIN today.
SHAROT: Thank you so much for having me.
(SOUNDBITE OF MUSIC)
VEDANTAM: This week's episode was produced by Maggie Penman and Rhaina Cohen and edited by Tara Boyle. A special shout-out to our former intern Chloe Connelly, who interviewed the liberals and conservatives we heard at the start of this episode. Our staff includes Jenny Schmidt, Parth Shah and Renee Klahr. This week, our unsung hero is Bryan Moffett of National Public Media. NPM is the group that sells our sponsorship messages. Bryan is a great mix of liberal and conservative. He's always liberal in his encouragement and conservative in his promises. As the saying goes, he under-promises and over-delivers. Thanks, Bryan. I'm Shankar Vedantam, and this is NPR. Transcript provided by NPR, Copyright NPR.