Hidden Brain XX
[0] This is Hidden Brain.
[1] I'm Shankar Vedantam.
[2] In 1980, Ronald Reagan was elected president of the United States.
[3] He quickly raised the temperature of the Cold War and assumed a muscular stance toward the Soviet Union.
[4] Let us be aware that while they preach the supremacy of the state, they are the focus of evil in the modern world.
[5] In September 1983, a message flashed in a bunker at Serpukhov-15, a secret Soviet outpost that analyzed satellite data from the United States.
[6] Inside the bunker was a 44-year-old Soviet lieutenant colonel named Stanislav Petrov.
[7] The military commander saw a button pulsing red.
[8] His panel told him the unimaginable had happened.
[9] The United States had launched a ballistic missile.
[10] Within minutes, the satellite data showed four more missiles had been launched.
[11] It looked like the United States was trying to cripple the Soviet Union with a sudden, deadly nuclear attack.
[12] There were only seconds for the Soviets to launch strikes in response.
[13] Stanislav Petrov debated whether to report the attack.
[14] If he did, it could have triggered a massive Soviet response.
[15] The Soviet commander did not do what was expected of him.
[16] He decided the satellite data was wrong and did not report the missiles.
[17] He was right.
[18] The satellite signals were reflections of sunlight off clouds.
[19] You probably have never heard of Stanislav Petrov, but you might owe your life to him.
[20] Retaliatory strikes could easily have killed half the populations of both countries.
[21] Researchers have estimated that the nuclear winter that followed could have killed two billion people worldwide.
[22] There is a lesson in the story about whether fallible human beings should ever have nuclear weapons at their disposal.
[23] But our focus today is on a psychological idea, how our minds work when we are under attack.
[24] It's also the start of a series we're calling Us 2.0.
[25] As we begin what promises to be a pivotal and contentious election season in the United States and many countries around the world, we're taking a close look at how we engage with our opponents.
[26] Over the next few weeks, we'll explore the assumptions we make about our allies and our foes.
[27] We'll look to history for lessons and we'll offer specific strategies to engage constructively with our opponents, whether in the political realm, at the dinner table, or at work.
[28] We begin with the psychology of threat this week on Hidden Brain.
[29] When something bad happens, it's human nature to look for someone to blame.
[30] Needless to say, that person usually isn't us.
[31] The tendency to see others as villains and to cast ourselves as innocent victims causes harm in interpersonal relationships.
[32] It may also lie beneath some of our deepest societal divides.
[33] At the University of North Carolina, psychologist and neuroscientist Kurt Gray studies what happens in our minds when we think about our political opponents.
[34] Kurt Gray, welcome to Hidden Brain.
[35] Thanks so much for having me on.
[36] Kurt, I want to start our conversation with a story that is very far away from politics, but I think it has a deep connection with politics at a psychological level.
[37] When you were a teenager, you used to drive around with a bunch of high school friends.
[38] I understand your car was nicknamed Fireball.
[39] Does that say something about how fast you used to drive?
[40] It does.
[41] I used to drive a two-door Pontiac Grand Am.
[42] It wasn't a flashy car, but I liked to drive it very fast and didn't always pay attention.
[43] So one time you and your friends were heading to a movie when something fairly dramatic happened. Tell me the story of what happened, Kurt.
[44] I was 16, had just got my license, and we were driving in the night to go see a movie, and it had just rained, and so the streets were, you know, shining in the orange sodium lights, and we were roaring up the road because the movie started in five minutes, and we were 10 minutes away.
[45] And I was in the right-hand lane.
[46] There was a lane to my left, and my friend riding shotgun all of a sudden said, Kurt, you're going to miss the turn, turn left.
[47] And so I hauled on the steering wheel to the left.
[48] I didn't look in the lane next to me because I had to cross a lane to be able to turn left, and there was a car driving next to me. Oh, my gosh.
[49] And so as I turned, this car slammed on its brakes, I suddenly became aware it was there.
[50] I slammed on my brakes.
[51] We screeched and squealed.
[52] The roads were wet.
[53] And so we spun around in the intersection.
[54] I didn't hit it, this other car.
[55] It didn't hit me. We didn't hit anything else.
[56] So luckily, everyone was safe.
[57] No one was around.
[58] We ended up stopped in the wrong direction on the other side of the road, just kind of in a desolate night.
[59] I mean, it's still a heart-stopping moment, though, because I think in that instant, everyone must have seen how close they came to a crash.
[60] It was terrifying, and it happened so fast.
[61] I mean, the music was so loud, right?
[62] We barely realized anything.
[63] We were kind of wrapped in our own world.
[64] And so I open my window to start to apologize to the driver of the other car.
[65] It was a silver Mercedes-Benz.
[66] And this driver, this guy gets out of the car.
[67] He was in his early 20s, had pretty nice clothes on.
[68] I remember he had, you know, curly hair.
[69] It was gelled.
[70] He had some silver chains on.
[71] And I just started opening my mouth to say, sorry.
[72] And he, you know, looks at me just daggers right in his eyes.
[73] And his shoulders are set.
[74] And he is coming towards me fast.
[75] And he says, you're fucking dead.
[76] Get out of the car.
[77] I'm going to fucking kill you.
[78] Kurt was terrified.
[79] He stepped on the gas and took off.
[80] To his horror, the driver of the Mercedes hurried back to his car, jumped in, and came after him.
[82] So I took off, and I was flying through strip malls.
[83] It was a really built-up kind of like big box store kind of area.
[84] I was totally panicked.
[85] I had no idea where I was going.
[86] And so, and again, it was dark.
[87] It was no one was around, even though the movie theater, you know, half a mile away was bustling.
[88] No one was around these stores.
[89] And so I'm just taking turn after turn, and he's getting closer and closer on my tail.
[90] And eventually I turn into a parking lot of like a Home Depot store.
[91] And he, you know, revs and gets close to me. And I turn and I go behind this store into the loading dock.
[92] And so there's a steep embankment on my right.
[93] So I'm really like funneled into this little canyon with this guy behind me. And he accelerates up beside me and then in front of me and starts kind of like cutting me off.
[94] He kind of like corrals me into the wall, kind of into a corner, and I realized I was trapped.
[95] Kurt was so paralyzed with fear he could barely think.
[96] And he gets out of his car and starts walking towards me. Again, very menacing.
[97] He's very angry.
[98] And all my friends, we had been talking on the way there, obviously, having fun, you know. Now, deadly silent, no music.
[101] My one friend in the back, who's thinking lucidly, her name's Jesse, she says, lock the door.
[102] And so I immediately locked the door, and a second later, he grabs my handle and just starts to haul on the handle trying to pull me out of the car.
[103] Wow, but the door is locked at this point so he can't get in.
[104] Exactly.
[105] But I also realize I have to defuse the situation, because he's so angry and much bigger than me.
[106] And so I do the only thing I can, which is, you know, start to apologize.
[107] So I unroll my window a few inches.
[108] I say, I'm so sorry.
[109] I know it's my fault.
[110] I wasn't watching where I was going.
[111] And he again says, you know, you're fucking dead.
[112] I'm going to kill you.
[113] And then he reaches into, you know, through the crack in the window.
[114] And he tries to unlock the door from the inside with his hand.
[115] Wow.
[116] Wow.
[117] And so I'm simultaneously trying to stay calm and contrite, apologize to this man, while frantically slapping away his hand so he can't unlock the door.
[118] And then it's clear he's not going to be able to unlock it.
[119] And so he just starts slapping me through the crack in the window.
[120] He's grabbing me by my collar and just kind of shaking me, just repeating like I'm going to kill you, you're dead again and again.
[121] How does this, how does this end?
[122] The friend of mine in the backseat, Jesse, you know, the cogent one, her mom happened to work for a cell phone store.
[123] And cell phones back when I was in high school were not popular.
[124] Not everyone had one, but she had one lent from her mom just in case anything happened if she had to make any phone calls.
[125] And it was, you know, a kind of brick of a phone, as the old ones were.
[126] And she holds it up and she says to this guy, I've got a cell phone, and I'll call the cops.
[127] And so this doesn't sink in right away to the guy who keeps on slapping me and threatening to kill me. And then eventually he stops, it sinks in, and he takes his hand away, and he kind of bends down, and he looks through the crack in the window at all of us in the car, and he says, fine, you call the cops, and I'll tell them what you did.
[128] And this statement was perplexing to me, because clearly in my mind, if I explain what had happened to the police, they would surely be on my side.
[129] I was the one getting assaulted, getting threatened with murder, but I was puzzled because he was so confident that the police would be on his side.
[130] I couldn't understand how he could be so confident that he was in the morally right position and yet I was confident that I was morally correct.
[131] You know, I think when things like this happen to us, you know, we're very quick to try and defend our particular points of view.
[132] But as you're telling me the story, I'm an observer, and I can see things from both points of view.
[133] I can see how he must have been driving along the road, someone swerves in front of him at high speed, nearly kills him, and he says, clearly I'm the victim here.
[134] This crazy teenager could have killed me. And from your point of view, you're saying, you know, I made a mistake and a simple mistake, and I'm really sorry about it.
[135] But surely that mistake doesn't warrant somebody chasing me through dark streets for mile upon mile, cornering me and threatening to kill me. I agree.
[136] And as I started to do research on moral psychology, I came more and more to recognize the genuine concerns that he had about being harmed.
[137] He genuinely felt like he was victimized, and so did I. And so this presented a puzzle to me. We experienced the same situation and had completely opposite perceptions of blame and harm.
[138] I have to ask you what happened that night after your friend threatened to call the police and he said, go ahead, call them.
[139] How did the incident come to an end?
[140] Well, he, after he told us to go ahead and call the cops, he kind of stood there and looked at us for a while and, you know, maybe recognized that we were all frightened teenagers, you know, trapped in a little metal cage like veal in some parking lot.
[141] And he stormed back to his car, slammed his door, and squealed off into the night.
[142] When we come back, how Kurt's story speaks to our deep political divides.
[143] You're listening to Hidden Brain.
[144] I'm Shankar Vedantam.
[145] This is Hidden Brain.
[146] I'm Shankar Vedantam.
[147] At the University of North Carolina Chapel Hill, psychologist Kurt Gray studies the science of political polarization.
[148] Along with other researchers who study how we think about our political opponents, Kurt finds that we make a series of assumptions and draw a series of conclusions about people who disagree with us politically.
[149] These assumptions and conclusions are especially powerful because they happen so swiftly, automatically, and unconsciously that they don't feel like assumptions or conclusions.
[150] They feel like facts, self-evident facts.
[151] The first of these has to do with what we think is happening inside our opponent's minds, or rather what we think isn't happening inside our opponent's minds.
[152] We generally think that we, our side, are smart, that we vote in our own self-interest, and that we do things that make sense.
[153] And we think that we, you know, want policies that are going to help ourselves and the country.
[154] But when we think of our opponents, we think of them as being quite stupid.
[155] We think of them as not voting in their own self-interest, and we think of them as not wanting policies that are going to help them.
[156] And so in one study we did in North Carolina, we presented people with a bunch of amendments that were part of an election a few years ago, and we just asked people about those amendments and why someone on the other side might vote differently than they do on those amendments.
[157] And so we might ask a progressive participant in North Carolina, why might a conservative person vote yes on these things?
[158] And what we found is that people think that those on the other side are dumb and don't appreciate what's best for themselves or their country or the state.
[159] So in other words, we think the other side is filled with sheeple.
[160] Exactly.
[161] We think that we are thoughtful and rational and doing the best we can with complex issues.
[162] And we also think that those on the other side are not those things at all.
[163] They are tricked by the media.
[164] They are deceived by some leader, they are sheeple, and we just think that they're stupid.
[165] So we think our opponents are not very smart. It's also the case that we don't like our opponents very much, but we think our opponents have stronger feelings about us.
[166] Tell me about the work you've done looking at how we feel about our opponents and how we think our opponents feel about us.
[168] So it's true that we don't like folks on the other side, but a lot of research finds that we severely overestimate how much the other side dislikes us.
[169] In one paper, the researchers show that we inflate our estimates of how much the other side dislikes us by somewhere between 50 and 300%.
[170] Wow.
[171] So Republicans might mildly dislike Democrats in general, but if you ask Democrats how much they think Republicans dislike them, they think it's this, you know, deep burning hatred toward them and their political party, and it's just not true.
[172] And, of course, the reverse is true as well, that Republicans believe that the Democrats hate them.
[173] And I understand that this work, I think it was done by Samantha Moore-Berg in 2020, found that the more strongly partisan people are, the more they hold this bias.
[174] That's right.
[175] So no matter what side of the political spectrum you are, the further out on that spectrum you are, the more you inflate your estimates of how much the other side hates you.
[176] I'm wondering what the effect of this is.
[177] If you and I are in conflict with one another and we have a disagreement about something, I can tell myself, you know, Kurt wants X and I want Y, and we can figure out, is there a middle ground between X and Y?
[178] But if I tell myself, you know, Kurt doesn't just want X. Kurt actually hates me and really wants the worst for me. It becomes very difficult to think about splitting the difference between X and Y. That's right.
[179] So compromise, and democracy more generally, require that we're willing to talk with others who might disagree with us, cooperate with others who might disagree with us.
[180] And if you think the other side hates you, it can be hard to even engage in conversation with them, right?
[181] It's a fight for survival.
[182] You know, I'm thinking back to that incident that took place when you were a teenager, the incident where you got into a conflict with another driver.
[183] And one of the things that strikes me is that in that moment when your friend was threatening to call the police, you felt righteous because you felt, clearly I'm the one who has been wronged here, and the police will see my side of the story.
[184] And the other driver felt righteous too and said, surely the police will see my side of the story.
[185] In some ways, when we believe that people will hate us, it gives us license to feel righteous.
[186] Absolutely right.
[187] We feel righteous and in the moral right because not only are we hated, but also because we're being harmed.
[188] There's a villain on the other side.
[189] They're attacking us, and that makes us the victim.
[190] And when we're feeling victimized, we feel licensed to protect ourselves in any way that we can.
[191] I'm wondering how much of this is about, you know, what psychologists sometimes call, you know, cognitive closure, seeking cognitive simplicity.
[192] If I have to say, Kurt wants X, I want Y, what's the middle ground, it's complicated.
[193] But if I can just say, you know, Kurt hates me, Kurt is clearly wrong, I'm in the right, in some ways, it's cognitively simpler.
[194] Yes.
[195] Our minds want simplicity.
[196] And that's especially true when it comes to morality.
[197] And the reason is because if we acknowledge that the moral universe is complicated, then we have to acknowledge that our moral beliefs might sometimes be wrong.
[198] And so when I was in that car, right, recognizing that the other driver had a legitimate, you know, a genuine feeling of victimhood meant that I might be the villain.
[200] I might be the perpetrator there.
[201] And that's a tough pill to swallow.
[202] So we've looked at how we believe that our opponents are stupid and that our opponents hate us.
[203] You say that another belief that we hold about our opponents concerns whether they care about democracy and about our shared civic values.
[204] Talk to me about this research, Kurt.
[205] A team of scientists has found that although people generally support democracy (everyone in America generally supports democracy), we vastly overinflate how much people on the other side don't want to support democracy.
[207] So our side is pro-democracy.
[208] Their side is, if not anti-democracy, at least willing to let democracy slide to win at politics.
[209] That perception means that now we feel threatened.
[210] Now we're in a war.
[211] They're trying to destroy democracy.
[212] And so in a war, we have to fight dirty too.
[213] And the perception that the other side is anti-democratic licenses our side to also do anti-democratic things.
[214] Because they're willing to steal elections, for us to even stay in the game, we should be willing to bend the rules as well.
[216] That's right.
[217] I'm wondering what you make of recent events in the United States.
[218] So, for example, if you think about the January 6th insurrection, how would you think about that?
[219] I think many Democrats would look at that and say, you know, very clearly, these are people who actually try to overturn the election.
[220] Clearly, they are anti-democratic.
[221] It's not just a perception in my head.
[222] So the January 6th example is a great one because from the outside, it seems like these are the folks who are just trying to destroy democracy.
[223] But I think from the inside, if you look at it from their perspective, they think that they are upholding democracy because they were led to believe that the election was stolen.
[224] And maybe what they're doing isn't democratic, but if they thought that Democrats kind of fired the first shot, they're just retaliating.
[225] But in that case, there is a difference between what everyday people think and what political elites are doing.
[226] So I think when it comes to the behavior of the many people who went up to the Capitol on January 6th, it's easy to see how their worldview supports the idea that they are standing up for democracy and for freedom.
[228] I'm not sure that I would be as sympathetic to some elites who are propagating that idea.
[229] So we've looked at how people think the other side is stupid, the other side is irrational, the other side is anti-democratic.
[230] You recently published a study, Kurt, that reported on what Democrats and Republicans in the United States believe about one another when it comes to topics such as murder, child pornography, and embezzlement.
[231] Tell me about the study.
[232] Yeah, so that study, I should say it's not yet published, but all the data and the manuscript are available online.
[233] And in that study, we wondered how people would view the morality of the other side.
[234] And of course, we already know that progressives and conservatives disagree about hot-button issues.
[235] So you might think the other side is wrong about abortion, about capital punishment or immigration.
[236] But there are many moral issues that seem totally uncontroversial, like murder or embezzlement or, as you say, child pornography.
[237] And so the question is, what do people think about those on the other side when it comes to those issues?
[238] Would Republicans think Democrats are okay with child pornography or embezzlement or infidelity and so forth?
[239] And vice versa.
[240] So when we look at the data, we find that consistent with what we'd been talking about before, people vastly overestimate how much those on the other side see these obvious moral wrongs as acceptable.
[241] We show that both Democrats and Republicans think that 15% of the other side view child pornography as acceptable.
[242] That's crazy, right?
[243] The real answer is basically zero, but we really inflate how much the other side is evil and lacks a basic moral sense.
[244] So, I mean, we're in really deep waters here because now our dislike for one another is not just about policy matters. We're not even dressing it up as being about policy matters.
[245] We're actually saying our opponents now are just evil people who are bent on just destroying the world.
[246] So I think this rampant polarization makes people endorse something that we call a destruction narrative, where the other side is motivated by the urge to destroy our side and also America.
[247] And it's really the sense that the other side wants to watch the world burn.
[248] I should say that people think that the other side is more stupid than evil, you know, more misguided than demonic, but it's still not a great place to be, obviously, and there's still a sense that the other side is motivated by destruction. When the other side passes some policy with some unintended negative side effect, as all policies have, some research shows that people think that those on the other side intend those negative policy consequences, that they want to hurt people on the other side.
[250] But of course, that's not true.
[251] Folks are just trying to do the best they can when it comes to these policy preferences.
[252] So we've talked in different ways and offered different examples of how, in our political discourse, we want to see ourselves as the victim, as the one being potentially harmed, and to see the other side as the perpetrator, the villain, the people who are trying to do us harm.
[253] And, of course, the other side feels exactly the same way.
[254] But it raises a really interesting point, which is that the animating force in much of politics might not be animosity and aggression, but it might be a feeling of victimhood, a feeling that we are under siege, that we are under attack.
[255] Can you talk about this idea?
[256] Because I think that's not the way most people think about politics.
[257] People think about politics as being a blood sport, very aggressive.
[258] But the picture that you're painting, I think, is slightly at odds with that, where the feelings of vulnerability that we have are, in fact, the dominant drivers of our perceptions and behavior.
[259] That's right.
[260] So when we think about the motivations of others, we think that they are aggressive.
[261] We think that they are trying to destroy us.
[262] We think they are motivated by some deep instinct to hurt us.
[263] But my reading of the literature and my work suggests that ultimately people are motivated by this desire to protect themselves, to guard against threats.
[264] They're motivated by a sense of vulnerability.
[265] So rather than a destruction narrative, I think politics is better described by a protection narrative, where people are trying to protect themselves and their vulnerabilities.
[266] Where do you think this comes from, this sort of constant need to protect ourselves, to see ourselves as under threat?
[267] Where do you think this comes from, Kurt?
[268] I think our desire to protect ourselves from threat in politics and the modern world comes from way back in human nature.
[269] I think that the human experience is ultimately an experience of threat and fear and worry about our vulnerabilities.
[270] I'm wondering, Kurt, if some people might say, you know, that can't possibly be true.
[271] Humans are at the apex of the planet right now.
[272] You know, every other species should fear humans because, in fact, we are the most deadly predator on the planet right now.
[273] But you're making the case that humans, in fact, are motivated almost entirely by fear, by vulnerability.
[274] There seems to be a mismatch there.
[275] There is a mismatch, and there is no doubt that today we are apex predators. We can hunt wolves from helicopters. We can remake the world. But there's a fallacy in thinking that just because we are predators today, that's how we have always been. In fact, if you look back into the mists of time, where our minds and our human nature evolved, we were not predators at all. Instead, we were prey.
[276] We evolved not as predators, but as prey.
[277] I understand that you had an incident in your own life that brought home to you your own vulnerability as an individual creature.
[278] Tell me the story of what happened, Kurt.
[279] Before I wanted to be a social psychologist, I thought I wanted to be a geophysicist.
[280] And a geophysicist spends a lot of time outdoors in remote locations looking for natural gas or oil.
[281] And so I was very far north.
[282] So if you drive to the border between Montana and Alberta, and you drive 18 hours straight north, and then you turn left and drive for another hour, you come to Rainbow Lake, Alberta.
[283] And then from Rainbow Lake, which is extremely isolated, you take a helicopter ride, another 30 miles into the bush.
[284] That's where we were looking for natural gas.
[285] So there was a crew of five of us, four college students, and one old man named Ian, who was at the time 25, but he seemed like an old man to us.
[286] And we would spend our days in the middle of the wilderness, driving around on snowmobiles and pounding stakes into the ground to try to find natural gas.
[287] On one of these expeditions, Kurt and his team had just finished a tough day's work.
[288] But before the helicopter could come fetch them, bad weather rolled in.
[289] The helicopter pilot told them he'd come get them the next morning.
[290] But it was winter and it was very cold.
[291] And the five of us were literally in the middle of nowhere and had no water, no food, except some leftover sandwiches from lunch.
[292] And we had to spend the night alone in the middle of the Canadian wilderness.
[293] And so we went off to the forest, we built a lean-to, we gathered some firewood, we lit it with gasoline, which I wouldn't recommend unless that was your only source of anything flammable in the middle of the wilderness.
[294] And then we just sat down to wait through the night until the helicopter might be able to come and pick us up.
[295] And I'm assuming it was, apart from the fire, it was pitch dark.
[296] Yeah, it was, it was pitch dark.
[297] It was minus 10 degrees Celsius.
[298] We had no other blankets.
[299] We had no other jackets other than our one-piece fire-retardant Nomex coveralls.
[300] You know, we weren't prepared to weather a night outside.
[301] So we shared the remnants of our lunch.
[302] We sat around the fire talking and then it was time to go to bed.
[303] And so we all, five in a row, we spooned with each other to stay warm, but that proved to be too cold in the night.
[304] And so we eventually found our way back to the fire and we curled up around it in a circle and tried to sleep while the flames were high and while we were warm.
[305] And then when the flames died down, we would wake up and we'd add some wood to the fire and try to sleep again.
[306] And we did that for 10, 12 hours.
[307] Now obviously, you know, there are no sort of human predators out there, but presumably there are animals.
[308] Before this night, I had never thought about predators, right?
[309] I grew up in a city in Canada, but there was a couple times when I woke up in the middle of the night where I felt uneasy.
[310] And you might say, of course you felt uneasy because you were stuck in the middle of the wilderness hoping not to die, you know, of cold or thirst or something like that.
[311] But I just couldn't shake the sense that, you know, there was something out there, and it's so dark you can't see beyond this little circle of light.
[312] So you could look into the woods and there was absolutely nothing but blackness.
[313] And it's not like there's some serial killer out there, right?
[314] It's not like a horror movie because we're so far from civilization.
[315] But I still got this uneasy sense.
[316] And then bit by bit, the sky turns gray.
[317] It's still pretty cloudy out.
[318] And we get up and we stretch.
[319] And as we walk around the campsite, we notice that there are paw prints all the way around.
[320] Very close to where we were sleeping.
[321] And they were lynx paw prints.
[322] And so what had happened in the night was that some lynx had heard us, had smelt us, and had crept close to us in the night.
[323] For our overseas listeners who live in tropical climates, can you tell me what they are?
[324] Lynx are big, fluffy bobcats.
[325] I don't think they could take down an adult man, but I think they could probably eat a small child, and certainly they could rip out the throat of someone who's sleeping in the darkness.
[326] And that realization hit home to me as we sat there in the morning waiting to get picked up by the helicopter.
[328] And we couldn't have done anything to prevent it because humans are weak and we don't have nails and we don't have teeth.
[329] And if there had been a real predator, if it had been a mountain lion, then we wouldn't have stood a chance.
[330] Not very long ago, this was not unusual at all.
[331] You know, 150 years ago and earlier, stuff like this happened probably all the time in all parts of the world.
[332] People were living in close proximity to nature and, in fact, were vulnerable in ways that we simply don't feel today.
[333] Absolutely.
[334] So for the last millions of years of our evolution, we have been vulnerable to predation.
[335] And it's really only in the last 100, couple hundred years, that that threat has basically dropped down to zero for most of us.
[336] For a long time, people were hunted by wolves, tigers, bears. But even today, in our industrialized world, many people are still vulnerable to predators. There was a case in Canada several years ago of a pop singer going for a walk through a Nova Scotia forest, and she was killed and partially eaten by a pack of coyotes. I'm wondering what effect this has on our minds, the fact that in some ways we've had a very long evolutionary history where we are vulnerable and potentially under threat, and a very recent evolutionary history where that threat has receded. What has that done to our minds, Kurt?
[337] Our longstanding vulnerability to predation has really shaped our psychology in our modern world.
[338] Even if we don't think about predators today very much, we are still fundamentally concerned with protecting ourselves from threats.
[339] And those threats might not be sitting in the forest or in the jungle behind our houses, but we are constantly bombarded with threats today when it comes to politics, when it comes to morality.
[340] So we bring forward this long-standing, evolved feeling of threat into our modern political realm.
[341] And this is why we typecast the other side as predators.
[342] And I think it's important to recognize this because fundamentally those folks on the other side who we see as predators also feel like prey.
[343] Even the other driver in the parking lot that night, he felt like the victim, like the prey.
[344] Of course, that doesn't mean that liberals and conservatives have to define harm the same way.
[345] What you might consider harmful might not necessarily be what I consider harmful, which is why we can be worried about different issues.
[346] That's exactly right.
[347] So in my research, we find that liberals might emphasize harms to the environment or they might emphasize harms to members of disadvantaged groups, whereas conservatives might emphasize harms to social order, to those trying to protect our society like police, and perhaps to religious entities like God or the Bible.
[348] You can see this very well even with hot-button issues like immigration.
[349] So progressives might worry about the harm done to undocumented immigrants, who they perceive as vulnerable, whereas conservatives might worry about the harm done by undocumented immigrants who might be criminals or drug traffickers in America.
[350] So both of those positions are motivated by a desire to protect us from harm.
[351] They just emphasize competing harms in that issue.
[352] It's a problem we face in nearly every dimension of our lives.
[353] Our brains were sculpted by evolution over thousands of years.
[354] Our minds today are the product of those evolutionary forces.
[356] We are walking around with machines that were designed, if you will, in the Stone Age.
[357] Unsurprisingly, there are mismatches between what those brains were designed to do and the challenges we confront today.
[358] When we come back, how understanding the psychology of our political conflicts can help to bridge seemingly intractable divides.
[359] You're listening to Hidden Brain.
[360] I'm Shankar Vedantam.
[361] This is Hidden Brain.
[362] I'm Shankar Vedantam.
[363] Psychologist Kurt Gray studies the science of political polarization.
[364] In a number of studies, he and other researchers have found that Democrats and Republicans in the United States, and partisans in other countries, have very strong and very wrong views about their opponents.
[365] We tend to think our opponents are idiotic and irrational.
[366] That's the mild stuff.
[367] We also think they're anti -democratic, evil, and are okay with children being harmed.
[368] We ask ourselves, what is wrong with those people?
[369] How can any decent person have such terrible and misguided thoughts?
[370] Our certitude about our moral superiority means we don't have to understand our opponents or give them the benefit of the doubt.
[371] So you've done a lot of work, Kurt, sort of looking at ways in which we can turn down the temperature on political polarization.
[372] And you say that one of the first and most practical things that we can do is to frame our positions on issues in terms of harm.
[373] So in other words, we think that facts are what bridge divides, but in fact, it's our shared concern about harm that actually is what bridges divides?
[374] That's right.
[375] And we have a big paper with 15 studies that shows that people think that facts are the key to bridging divides, but when you actually give people facts in heated conversations about morality, it doesn't work.
[376] Instead, what does work to bridge divides is allowing people to talk about their own concerns with harm, to talk about their own worries about threats and the pain that they or their family may have suffered.
[378] And that makes them seem less like sheeple.
[379] It makes them seem less stupid and less evil.
[380] Even though they disagree with you, right, they have the same concerns about harm, so they're similar to you, but it also makes sense that they would make this decision, right?
[381] And so now they're not voting against their own self-interest.
[382] They're not being irrational. What they do makes sense.
[383] And that makes people willing to respect them and have conversations with them.
[384] Kurt, one of the things you say is that it's important for us to remind ourselves that the other person's feelings about harm are genuine, even if those feelings of harm seem unfounded to us.
[385] Why is this hard to do and why is it helpful?
[386] It's so hard to recognize the authenticity of other people's perceptions of harm, especially when those perceptions are opposite to our own.
[387] And that's because our perceptions of harm are deeply intuitive.
[388] We feel them in our gut.
[389] If you're a pro-choice person thinking about the abortion debate, in your gut, you know it's about protecting women.
[390] But if you're a pro-life person, then in your gut, you know it's about protecting unborn children.
[391] And the power of intuitions about harm makes it difficult to realize that the other person is authentically trying to protect someone from harm.
[392] But it's so crucial because that's what we need to do to recognize that those on the other side are motivated by protection and not destruction.
[393] You also talk about an idea called moral humility, which you say is different from intellectual humility.
[394] Yeah.
[395] There's been a lot of discussion these days about intellectual humility.
[396] And I think it's important to recognize that you might be wrong about how the world works.
[397] But it's much harder to think that your moral judgments might not be 100% right.
[398] We are deeply motivated to think that we are good people.
[399] And yet moral humility is appreciating that even if we are good people, other people might be good too, and even if they disagree with us, they're still good.
[400] And so what that means is that we might not be 100% right about our moral judgment.
[401] And it's hard to have that kind of humility.
[402] I want to talk about a demonstration of moral humility that was in a recent documentary called Guns and Empathy.
[403] It was produced by a nonprofit organization called Narrative 4 in partnership with New York Magazine.
[404] And during this documentary, one of the participants was a woman named Carolyn Tuft, who was shot three times in a mass shooting at a mall in Salt Lake City, and her 15-year-old daughter was killed.
[405] I want to play you a little clip of what Carolyn said.
[406] Everything I knew is gone.
[407] If people thought that could happen to them and thought they could actually lose their business, lose their house, lose their family, you know, I think that that gun would not have so much, so much hold.
[408] And a little while later, Kurt, there was another person who spoke at the same event.
[409] Her name was Jillian Weiss, and she had a very different view on guns.
[410] She was born with a disability, and she bought a pistol after she was stalked, and after she learned that disabled women were much more likely to be sexually assaulted than women without disabilities.
[411] Let me play you a clip of Jillian.
[412] I have my gun with me in my home, and I feel so much safer knowing that, should anything happen, I can defend myself.
[414] What is the effect of hearing these two different stories on people who are listening?
[415] What's happening in their minds?
[416] Listening to these stories might not persuade you, but it does make you see the position of the person telling these stories as rational, as something that makes sense, and it makes you respect that position, and makes you willing to interact more with that person.
[417] And those feelings of respect and the willingness to engage are essential in our pluralistic democracy, right?
[418] We depend on compromise, on open dialogue in our society.
[419] And so these stories of harm are a good first step at motivating the kind of respect that we need to decrease polarization and increase our willingness to engage with others.
[420] You know, it often seems to many people, Kurt, that the divides that we have in our country and, you know, in many countries around the world are so intractable, so painful, that it can seem as if, you know, there is no way out, there are no solutions, that there's no hope in sight.
[421] And I think that's understandable because the temperature has been turned up to such a pitch.
[422] But you cite a historical example of a moment when people put aside their differences and truly saw the humanity of the other side.
[423] And it occurred in 1914 in the First World War.
[424] Can you tell us what happened?
[425] It was the first Christmas of the First World War and, you know, the sides were dug into their trenches.
[426] They had the barbed wire up.
[427] And even though they were supposed to be killing each other, as Christmas approached, they started being kinder to each other, right?
[428] They would hear each other singing Christmas carols in the trench across from them and might wave at each other, right, shout some pleasantries.
[429] And eventually the situation got so positive that the Germans and the allies decided to have a soccer game in no man's land where they exchanged gifts.
[430] And so this is really an act of defiance against, you know, the generals who wanted them to kill each other, and it was an act of camaraderie and bridging divides that I think is remarkable even today.
[431] Their mission was to literally murder each other, and yet they found space to come together and see past their disagreements.
[432] I think it holds powerful, applicable lessons for our own time.
[433] The elites in our government and in the media are telling us to hate each other and telling us that the other side hates us, but we already know from all the scientific work we talked about today that the other side actually doesn't hate us as much as we think.
[434] And so this should be an inspiration that even in war, real war, people can rise up and come together, and we can too.
[435] Kurt Gray is a psychologist and neuroscientist at the University of North Carolina at Chapel Hill.
[436] He plans to publish a book about these ideas in 2025.
[437] The book is going to be titled Outraged: Why We Fight Over Morality and Politics.
[438] Kurt, thank you so much for joining me today on Hidden Brain.
[439] Thanks for having me. Have you tried to talk with someone who disagrees with you about politics?
[440] Have you found effective ways to get through?
[441] Have you lost friends over political disagreements?
[442] If you'd be willing to share your stories with the Hidden Brain audience, along with any questions you have for Kurt Gray, please record a voice memo and email it to us at ideas@hiddenbrain.org.
[443] That email address again is ideas@hiddenbrain.org.
[444] Use the subject line, politics.
[445] Hidden Brain is produced by Hidden Brain Media.
[446] Our audio production team includes Bridget McCarthy, Annie Murphy Paul, Kristin Wong, Laura Kwerel, Ryan Katz, Autumn Barnes, Andrew Chadwick, and Nick Woodbury.
[447] Tara Boyle is our executive producer.
[449] I'm Hidden Brain's executive editor.
[450] Next week in our Us 2.0 series, the mistakes we make when we try to change someone's mind, and a better way to talk to political opponents.
[451] Asking somebody to give up their moral values, people are willing to fight and die for their values, right?
[452] Like, people really, really are invested in not changing their minds about that.
[453] I'm Shankar Vedantam.
[454] See you soon.