Hidden Brain XX
[0] From NPR, this is Hidden Brain.
[1] I'm Shankar Vedantam.
[2] If you had to choose a romantic partner, would you pick someone who was equally wonderful to everyone, including you, or someone who was especially wonderful to you?
[3] It's a question that fascinates Lalin Anik.
[4] I am an assistant professor at University of Virginia's Darden School of Business.
[5] Lalin and her colleague, Ryan Hauser, ran a set of studies to figure out whether people want a partner who is equal-opportunity with their attention or someone who reserves special treatment for them.
[6] What we find in the paper is people want to be treated uniquely.
[7] The urge to be treated special was so strong that people were willing to pay a price for it.
[8] Take the example of a birthday message.
[9] Imagine that your partner writes a Facebook message that is long and beautiful.
[10] But there's a catch.
[11] Your partner writes this sort of long birthday message all the time for everyone.
[12] If this message goes to everybody, people say, I don't want that beautiful, thoughtful message; just send me a one-liner that says happy birthday.
[13] Why would people care so much about being singled out that they'd accept an inferior message?
[14] If my partner sends me a beautiful message, that's a world in which she or he gives it to everybody else, and that pie might be bigger, but it is divided into multiple parts.
[15] And I think people have the feeling that they're getting less.
[16] In the other world, where they send me a short, almost curt or cold message, but they don't do that for everybody else, that means that the whole pie, although it might be smaller, is mine.
[17] But it is all for me. So many of us want our partners to give us the whole pie.
[18] We expect our loved ones to make us the center of their lives, and we expect to do the same in return.
[19] Showering favors on those we love doesn't just feel natural; it feels like something to celebrate. Our Facebook and Instagram feeds are chock-full of people describing how they've put their loved ones on a pedestal. But what we fail to notice is that our impulse to give special treatment to those closest to us can have terrible moral consequences. Last week, we looked at how few of us actually live by the maxim that everyone is equal. Today, we try to understand why we struggle to put that value into practice, and how actions that come from a place of love can lead to a more unjust world.
[20] "The Sins of Playing Favorites," this week on Hidden Brain.
[21] Mahzarin Banaji has been studying the psychology of discrimination for a long time.
[22] Her first article on the topic came out in 1993 and she's published hundreds of research papers since then.
[23] She's also helped to develop the implicit association test, which millions of people have used to identify their unconscious biases.
[24] Given how deeply she's thought about the ways we treat others, you might think it would take a lot to alter her views about discrimination.
[25] But Mahzarin's understanding of how bias plays out was radically reshaped by two words.
[26] Professor Kaplan.
[27] Those words marked a turning point in a story she first heard, many years ago.
[28] The story was about Mahzarin's friend and colleague, Carla Kaplan.
[29] Carla and Mahzarin are now both professors in Massachusetts, Carla at Northeastern University and Mahzarin at Harvard.
[30] But at the time, they both taught at Yale.
[31] Carla was this absolutely brilliant young woman, articulate beyond measure, gifted writer, a professor of English.
[32] but she also had these wonderful other hobbies.
[33] I mean, she was a quilter, and you would often see her sitting in the back of a lecture, like a little Madame Defarge, you know, sitting there quilting away while she listened to a talk on 18th century, you know, literature.
[34] Despite working in different departments, the two of them got to know each other because they shared a characteristic that was unusual for professors in the 1980s.
[35] They were women.
[36] In the late 1980s, young women, women who looked like us, could only be two kinds of people.
[37] If they dressed like us, they looked like undergraduates.
[38] And if they dressed nicer, they typically looked like administrative assistants.
[39] The few female professors on campus regularly crossed paths.
[40] That's what happened to Mahzarin and Carla.
[41] And we got to know each other, which is how I learned about the story of her hand.
[42] One night, Carla lost her grip on a crystal bowl she was washing.
[43] The bowl hit the sink and shattered.
[44] The jagged edge of this bowl cut her hand from the middle of the palm to her wrist.
[45] Carla's kitchen floor looked like a Jackson Pollock canvas.
[46] Her blood was splattered everywhere.
[47] Her boyfriend patched up her hand with some bandages.
[48] He rushed her over to the emergency room.
[49] The attending doctor was very attentive to her and kept reassuring her that she would be fine.
[50] Carla and her boyfriend tried to get the doctor to understand how important it was for Carla to preserve the dexterity in her hand.
[51] They explained that she was a quilter.
[52] Quilting requires fine finger movement, and they hoped that this stitching up would allow her to have all of her sensitivity back in her hands, and they were assured that indeed that would happen.
[53] And there was no reason to think in that moment that she was receiving anything but the best treatment.
[54] But Carla was about to discover that she was not getting the best treatment possible.
[55] A volunteer at the hospital happened to pass by, recognized Carla and said, Professor Kaplan, what are you doing here?
[56] The doctor froze.
[57] He asked Carla, you're a professor at Yale?
[58] Upon confirming that information, he rushed Carla off to a different part of the building.
[59] The best hand specialist in New England was called in, and with a bunch of other doctors he operated on her hand for several hours into the night.
[60] It was far more extensive surgery than Carla would have received in the emergency room.
[61] The work on the finer nerves of her hand is what would have taken the incredibly long time.
[62] And then I also believe that there was greater attention to the cosmetic part of it.
[63] Carla was grateful to receive such careful attention.
[64] But her gratitude was overshadowed by another feeling.
[65] She was appalled.
[66] Carla was just stunned that her being a quilter was not sufficient to get her the treatment that she should have gotten, but that being a professor did.
[67] And somehow it must be that the doctor did not feel compelled by the quilter story in the same way as he was compelled by the two-word phrase, Yale professor.
[68] When Mahzarin reflected on the story, it rattled her understanding of bias, the very focus of her career.
[69] Had Carla's doctor in the ER refused to perform surgery on her because of her race or gender or some other characteristic, Mahzarin would have immediately considered what happened to be discrimination.
[70] We all would.
[71] But Carla was not denied anything.
[72] The emergency room doctor was kind, attentive, and competent.
[73] Carla was not treated worse than any other patient.
[74] The point of the Carla story is to focus not so much on a negative thing that the doctor did, something where he withheld some basic level of attention to her.
[75] What we're speaking about here are acts of helping, which are things that go above and beyond what we typically do.
[76] The moral problem at the heart of Carla's story was not an act of hate, but an act of special kindness.
[77] What Carla got went above and beyond the ordinary.
[78] She received VIP treatment, and it made a huge difference.
[79] It's hard for me to think that a patch-up job in an ER that required sewing up, versus a fine-grained surgery that goes on for hours, is really not going to have some substantial impact on the person's life.
[80] Mahzarin realized that our definition of discrimination is too narrow to capture this type of harm.
[81] We really measure prejudice in the old-fashioned way by looking for acts of commission.
[82] What do I do?
[83] Do I go across town to burn down the church of somebody who's not from my denomination?
[84] That I can recognize as prejudice.
[85] But when we don't act, that's acceptable.
[86] It's understandable that we rarely feel outraged when someone gets treated better.
[87] We don't see the harm because the harm is invisible.
[88] It's directed at people who don't get the VIP treatment.
[89] But those people were not in the ER with Carla.
[90] They didn't know what they were missing.
[91] If you haven't benefited from this type of treatment, you might have suspicions that favoritism is costing you opportunities, but it's often hard to pinpoint what you're losing.
[92] You don't know what would have happened if someone had recommended you for a job because you both went to the same university, or how your career would have changed if you had been invited to a private party where important company business got discussed.
[93] For Mahzarin, the experience that Carla described in the hospital was like a curtain being pulled back.
[94] Carla suddenly got to see what it was like to be treated the same as everyone else and what it was like to be treated as someone special.
[95] Carla Kaplan and Professor Kaplan each got a glimpse of what life was like for the other person.
[96] It made Mahzarin reflect on her own behavior.
[97] I go through life having no clue that the people around me are the ones I'm helping, unless I pause and ask the question, how do I wish to do this?
[98] Do I intend to help one type of person over another?
[99] Mahzarin can think of instances when she's favored people because they shared something in common with her, like the time a writer asked to interview her for a story.
[100] I typically oblige, but I asked her who she was writing for, and when she told me, I said, I don't think I'll do the interview because I did not hold the magazine she was writing for in high esteem.
[101] I didn't think it presented psychology in a terribly favorable way or as a science.
[102] And I told her that.
[103] And as she was departing, she said, this is on the phone, she said, I used to be a student at Yale when you were there.
[104] And even though I didn't take a course with you, I do remember hearing about your work.
[105] And the next words out of my mouth were, okay, come on over.
[106] I'll talk to you.
[107] You know, what happened to my highfalutin principles that I ought not to talk to a magazine that does not represent my field accurately, and so on?
[108] That little bit of in-group information, the fact that she and I had shared a zip code for four years, was sufficient to make me override my decision.
[109] This type of behavior, of course, is pervasive.
[110] It's why people want to go to elite schools.
[111] It's not just for the education.
[112] It's for the network.
[115] I actually think that this is not only very much pervasive, but that it often appears in the context of doing something good.
[116] We interpret these acts of prejudice as acts of generosity and kindness.
[117] Carla's doctor felt a connection to her because of their common affiliation, and he pulled out all the stops to help.
[118] He probably went home that day, feeling proud.
[119] I think that kind of act of helping towards people with whom we have some shared group identity is really the modern way in which discrimination likely happens.
[120] Favoritism, of course, is not the only way that discrimination happens.
[121] Current news events show us that other forms of prejudice are alive and well.
[122] But favoritism flies under the radar, despite having serious moral consequences.
[123] I think it absolutely does, in a world in which we believe, with our good intentions, that when certain groups of people suffer or when terrible things happen to them, those are of their own making.
[124] Let's say you listen to the story and decide you won't play favorites anymore.
[125] That's admirable, but...
[126] It's not easy, in most cases in life, to say, where is that zero-level neutral behavior where, if I do that, I'm not discriminating?
[127] When we come back, what would it look like if we treated everyone the same?
[128] Be honest.
[129] Yeah, I would sacrifice Hannah.
[130] Stay with us.
[131] This is Hidden Brain.
[132] I'm Shankar Vedantam.
[133] Favoritism is the unnoticed form of discrimination that many of us have experienced in one way or another.
[134] We may have gotten a job interview because the hiring manager went to the same university.
[135] We may have done a favor for someone because we have friends in common.
[137] But there are people who are not only able to notice their impulse toward favoritism, they're able to override it.
[138] Dylan Matthews is one of those people.
[139] His journey to this moral stance had unusual beginnings.
[140] It started with sarcasm and vindictiveness and wanting to annoy someone when I thought it was funny.
[141] It's probably worth noting that Dylan was in middle school at the time.
[142] We had this school project for social studies where we had to write about a threat to the world.
[143] And normal people would do like world hunger or HIV AIDS or something.
[144] And my friend Teresa, who then was a very devout Catholic, did the philosophy of Peter Singer because I think she had heard her dad complaining about Peter Singer and this guy who wanted to abort infants or something.
[145] Peter Singer, the philosopher that Dylan's friend despised, was the same philosopher we spoke to in our last episode.
[146] And because I was 12 or 13 and a jerk, I went out and bought a Peter Singer book and would read it in front of her as much as I could.
[147] And I found much of it persuasive.
[148] So I stopped eating meat was the first thing I did.
[149] But there was also an essay in there called Famine, Affluence and Morality, which was written during the Bangladesh war in 1971, and was an argument that people in rich countries have a moral obligation to give to people in poor countries when doing that doesn't cost them much in terms of their standard of living.
[150] If you missed our last episode, Peter Singer's philosophical position is known as utilitarianism.
[151] In essence, it asks how we can increase the greatest amount of happiness and well -being for the greatest number of people.
[152] In a famous thought experiment, Peter asks you to imagine coming by a drowning child in a pond.
[153] Let's say saving the child means ruining the suit you are wearing, and it costs $200.
[154] Most people say they would rather save the child than save their suit.
[155] If that's the case, Peter says, why not write a check for $200 to a charitable organization that can save a child's life on the other side of the planet?
[156] The analogy was, not donating in those circumstances is like if you see a child drowning in a pond and you don't jump in because you don't want to dirty your suit.
[157] When Dylan first read this thought experiment, he was still a student and didn't have income to donate.
[158] When he finally started working and had money to spare, the drowning child in the pond was still in the back of his mind.
[159] With that in the back of my head, I wanted to donate some of it, and I happened upon a group called GiveWell that recommended highly effective charities for people who wanted to save as many lives as possible with their charitable giving.
[160] He eventually found his way into effective altruism, a movement that Peter Singer helped to inspire.
[161] The aim of effective altruism is to do the most good possible for others.
[162] That means using evidence and reason rather than doing what feels good or is intuitively appealing.
[163] Given his involvement in this movement, it's not a surprise that several years ago, Dylan found himself attracted to someone who takes ethics as seriously as he does, someone who has a sense of duty to others who's willing to give until it hurts.
[164] I'm Hannah Groch-Begley.
[165] What is surprising is that Hannah's moral framework is very different from Dylan's.
[166] I was raised to really think about the person who's in front of you and to think about your family members, to think about your friends, and to think about how can I put them first?
[167] How can I always be there for them?
[168] And then, you know, extending that to how can we make our neighborhood better?
[169] How can we make our city better?
[170] Hannah learned these values from her parents as they went about their daily routines.
[171] My parents always prioritized dinner time.
[172] No matter how busy they were with their careers, we always sat down and at least had dinner together.
[173] Dylan and Hannah's different moral frameworks became evident on their very first date.
[174] They met at a bar in Washington, D.C. I remember it was a problem because they don't serve food, and so we got a little tipsy.
[175] Yeah, we had too many cocktails and not enough to soak them up.
[176] Like all couples on a first date, they flirted.
[177] Of course, being Dylan and Hannah, their flirting took the form of a debate about moral philosophy.
[178] We were having this debate over, is it more important to just sort of maximize happiness in the world, or whatever sort of thing you want to maximize, you should maximize it, and therefore every action should follow from there?
[180] Dylan believed it was essential not to play favorites.
[181] If you want to maximize happiness in the world, you have to treat everyone's happiness as equally important.
[182] It doesn't matter whether someone happens to live on your block or is even a member of your family.
[183] Their happiness should not matter more than the happiness of someone who lives across the world.
[184] Which I thought was wild and insane.
[185] Dylan told Hannah he deliberately chose charities that could maximize the well-being of the greatest number of people.
[186] This usually meant giving to charities that work with the poorest of the poor.
[187] To organizations, for example, that provide bed nets to prevent malaria.
[188] Dylan did not donate to pet projects and causes.
[189] And I am very much not that kind of thinker.
[190] I was much more focused on how do we save a particular community, or how do we help certain groups of individuals.
[191] Hannah found it difficult to understand how you could turn your back on a neighbor to help someone who lives in a place you've never seen.
[192] The differences in their opinions could have driven them apart.
[193] Instead, it became an energizing force.
[194] That was sort of part of the original spark was that we were able to have those debates.
[195] Dylan did not expect Hannah to recalibrate her moral compass to match his, nor did Hannah think Dylan would do that for her.
[197] We both have immense respect for each other's passions and political opinions and desires about how to be a good person in the world, and sometimes we disagree about what the best path on that is.
[198] But the real test came a few months into their relationship.
[199] I think that it became more difficult for me when it became more concrete, which was six months into our relationship.
[200] Dylan had decided to donate one of his kidneys to a stranger.
[201] Dylan said, I want to do this.
[202] I want to start the process.
[203] His logic was straightforward.
[204] There are people who are dying and one of them needs a kidney and this will go to one of them.
[205] So as you're thinking about this, you're basically saying, I have a functioning organ in my body that I want to give to somebody else whom I don't know.
[206] Yeah.
[207] That does not strike you at all as being an unusual thing to say.
[208] I understand that it is an unusual thing to say being in the world and having talked about this with the people who find it to be an unusual thing to say.
[209] But like I didn't need it.
[210] It's a functioning organ, but like, it wasn't doing anything for me. To Dylan, not giving away his kidney was like hoarding food he'd never eat while others went hungry.
[211] Hannah saw it differently.
[212] There were moments when I was very scared that this person who I was starting to love was going to potentially put themselves in a great deal of danger.
[213] It is a very safe procedure, but I'm not a doctor.
[214] So you tell me, oh, you're going to remove a major organ.
[215] Elective surgery, like, I don't understand why you would do that.
[216] Hannah would have struggled less if Dylan's father had needed a kidney and Dylan was a match. But having a major organ extracted for a stranger?
[217] It was very clear to me that this was non-negotiable, that this was going to happen.
[218] I think the question was, am I going to be here for that?
[219] Dylan went ahead with the surgery, and Hannah was there for him.
[220] The surgery itself took almost four hours, and that was just four hours of my life in which I had no idea whether or not Dylan was okay.
[221] She got little sleep that night as she made sure Dylan had someone advocating for his needs around the clock.
[222] The first few days of recovery were rough.
[223] Dylan's experience was recorded by the news site Vox.
[224] He's a staff writer there.
[225] I'm good.
[226] I'm good.
[227] I've only thrown up once so far.
[228] So there's that.
[229] And they have a bunch of exercises that I have to do.
[230] In the video, Dylan blows into a plastic tube.
[232] And apparently that keeps me from getting pneumonia.
[233] Science is a magical thing.
[234] And then I have to do coughs with my cough buddy, three times an hour.
[235] Still pretty painful, but not super painful.
[236] Dylan went back to work two weeks after the surgery.
[237] He wrote an article for Vox with the title, Why I Gave My Kidney to a Stranger, and Why You Should Consider Doing It Too. His article and the video helped publicize this unusual form of altruism.
[238] Some readers told him that they were inspired to donate a kidney because of his piece.
[239] He was honored at a ceremony for kidney donors.
[240] The award he received that night sits on a bookshelf in Dylan and Hannah's living room.
[241] It is this very heavy metal brick that says American hero, Dylan Matthews.
[242] I did not find that embarrassing at all.
[243] It's from the National Kidney Registry, and it was very kind of them to give it to Dylan.
[244] It's hefty.
[245] It's one of the better obelisks I've received in my life.
[246] Dylan and Hannah's differences are fascinating because they both have entirely compelling visions of what it is to be a moral person.
[247] These visions clash with one another.
[248] I was curious how they'd think about a famous moral dilemma known as the trolley problem.
[249] You've probably heard it before, but if not, here's a summary from someone who has studied the problem extensively.
[250] I'm Joshua Greene.
[251] I'm a professor in the psychology department at Harvard University, and I study moral decision -making and high -level cognition.
[252] Joshua says the trolley problem has achieved the status of legend.
[253] The trolley problem has become a kind of meme.
[254] So much so, it's even featured on network TV.
[255] It's on the show, The Good Place, which explores questions in moral philosophy.
[256] Oh, God, Michael, what did you do?
[257] I made the trolley problem real so we could see how the ethics would actually play out.
[258] There are five workers on this track and one over there.
[259] Here are the levers to switch the tracks.
[260] This is the classic version of it.
[261] A trolley, or a train, is hurtling down a track.
[262] Its brakes are shot, and it cannot be stopped.
[263] The trolley is headed towards five people, but you can hit a switch that will turn it onto a sidetrack where it will only kill one person.
[264] And the question is, is it okay to hit the switch to minimize the loss of life?
[265] Is it okay to kill one innocent person to save the lives of five innocent people?
[266] Many people find the moral dilemma painful, but will tell you, yes, if saving five people means unintentionally killing one, then it's the right thing to do.
[267] But there's a way to make the dilemma not just hard, but excruciating.
[268] What if the one person whose life you have to sacrifice is someone you know, someone you love?
[269] I have joked that I think if you were in the horrible situation in which, you know, the five people who are on this trolley track are about to be killed, and the only way to save them is to kill your own child, that Dylan would kill his own child in order to save those five people, because he has this very, I think, rational approach to morality.
[270] The child is one life.
[271] Five people's lives are more than that one child.
[272] I would save the child.
[273] I would easily kill five people on behalf of my own child.
[274] I know that and I'm not a mother.
[275] I just know that that's the kind of mother I would be.
[276] I would kill the five.
[277] Yeah, I wouldn't even question it.
[278] Like, it's five against one.
[279] I would probably kill myself after, out of grief, and so it would be five against two.
[280] But yeah, no, that's not a hard problem for me. Like, those people are all the heroes of their own stories, and they all have loved ones who love them as much as I loved the kid.
[281] And it seems obscene to me to treat my attachment as paramount above their attachments and their lives.
[282] Yeah, I just have totally different intuitions about that from Hannah.
[283] I couldn't resist asking the two of them to consider a scenario that's even closer to home.
[284] This is still hypothetical, but it's a little less hypothetical given that you don't actually have a kid right now.
[285] Let's say it was five people on one track, and it was Hannah on the other track.
[286] What would you do?
[287] Be honest.
[288] Yeah, I would sacrifice Hannah.
[289] I love Hannah very much.
[290] I would have a lot of very tense conversations with their parents afterwards.
[291] I don't think John and Val would be in my life for much longer.
[292] But let me pose the same hypothetical to you now.
[293] So let's say it was five and it's not a child on the other track, but it was Dylan on the other track.
[294] What would you do?
[295] Well, I might kill Dylan because I would know that's what he wanted.
[296] I would never forgive you if you didn't.
[297] Yeah, if I killed five people and he was alive, he'd be very mad at me. So I think I would sacrifice him, but I think it would be hard.
[298] I think in a split-second decision I would save Dylan because, you know, my instinct is to protect my loved ones.
[299] I told Joshua Greene about Dylan and Hannah.
[300] He noticed something revealing.
[301] about their responses to the thought experiment.
[302] She still acknowledged that what he would do is rational, in a sense.
[303] That's the word she used, right?
[304] It wasn't that she didn't understand it.
[305] It's just that she didn't understand how that could get more weight than those basic human and mammalian feelings.
[306] And he said, yeah, I would probably sacrifice my child, and then I would kill myself out of anguish.
[307] So it's not that he doesn't feel it.
[308] It's that he's willing to override it.
[309] And he worries that, in that hypothetical, he could maybe override it in the moment but couldn't override it for the rest of his life, in which case he'd rather end his life.
[310] So I think there actually is some mutual understanding there, despite the fact that in the end the balance tips one way for one of them and the other way for the other.
[311] When we come back, how these different moral frameworks are reflected inside the brain.
[312] Stay with us.
[313] This is Hidden Brain.
[314] I'm Shankar Vedantam.
[315] Favoritism is deeply ingrained in us.
[316] Most of us don't question our impulse to favor our inner circle because we view such behavior as kind and generous.
[317] We believe such acts make us a good friend or a trusted partner.
[318] But some people, like Dylan Matthews, treat those they know and strangers with the same commitment and care.
[319] I admire Dylan greatly for his willingness to give up his kidney for strangers, but I wouldn't do that.
[320] This again is psychologist Joshua Greene.
[321] Joshua knows that while many of us may admire Dylan, most of us, like Hannah, tend to favor those we know.
[322] We don't feel morally compelled to help strangers in the same way we'd help someone we love.
[323] And our whole society is structured around the idea that our lives are defined by these emotionally durable and meaningful personal relationships.
[324] There's an evolutionary reason for this.
[325] Over thousands of years, humans who favored those close to them had an advantage.
[326] They could cooperate, and such cooperation allowed them to out -compete others.
[327] Morality is fundamentally about cooperation.
[328] It's a suite of psychological capacities that enable us to cooperate, to have teamwork, right?
[329] And so, you know, basic rules like don't lie, don't cheat, don't steal, those are rules that make it possible for people to live together.
[330] Whether in a village or a family.
[331] The most basic form of cooperation is cooperation between individuals who are genetically related.
[332] And, you know, what's really going on there is that the gene that is in both of your bodies is essentially saying, same gene here and there, you should be treating each other as alike, or at least not as too different.
[333] You know, genes that encode for caring about the individuals who share your genes and therefore have a copy of that gene, those genes are more likely to spread, right?
[334] So the most basic kind of cooperation is cooperation based on genetic relatedness.
[335] And then the next level up is based on a kind of reciprocity in a direct face -to -face sort of way.
[336] So this is, you know, friends.
[337] You'd share food with each other.
[338] You'd have each other's backs when going into battle, take care of each other's children, take care of each other, you know, when you're sick.
[339] That's what friends are for, right?
[340] And everybody is likely to do better if they have other people that they can cooperate with.
[341] You can imagine the circle of cooperation expanding even further.
[342] You get out to different levels where there might be people who you don't know personally, but you have some kind of connection with.
[343] That is, you're a member of the same religion, or you're a member of the same country, or you have some other kinds of shared connections or interests, or you're a friend of my friend, or a friend of a friend of my friend, right?
[344] Then, you know, those are more symbolically marked.
[345] It's not based on your experience with that person, but what you have encoded about them in a more abstract kind of way.
[346] There's a simple image you can use to picture this.
[347] As the philosopher Peter Singer put it, we live in these moral concentric circles.
[348] If you're a complete narcissist, it's just you at the middle, right?
[349] And then one ring out is your family and your closest friends, and then people who you're more acquainted with, people who are perhaps part of your town, your village, and then broader connections with people, even people who are strangers but are connected to you in some way.
[350] And so in some sense, the most basic question that anyone has to answer is, how big is my us?
[351] What is the range of humans or beings even that I care about?
[352] Dylan's circle of care is very large.
[353] It's basically the entire world.
[354] Hannah's circle is much smaller.
[355] It's her family, her community, her city.
[356] Few of us operate solely with either of these frameworks.
[357] We move back and forth.
[358] Joshua compares these different moral frameworks to settings on a camera.
[359] Think about photography, especially if you're kind of a weak amateur photographer like me. So the camera that I've used for years has these nice automatic settings.
[360] So if I want to take a picture of a mountain from a mile away, then I put it in landscape setting and click, point, and shoot, and I've got a pretty good picture of a mountain from far away.
[361] Or it has portrait mode and dark mode, et cetera, right?
[362] And the idea is that the manufacturer has thought in advance, what are the kinds of things that people need to do all the time?
We're just going to set up these pre-programmed responses for those things, and that's what people use most of the time, and that's what I do.
[364] Occasionally, I get ambitious and will want to do something fancy and maybe have something slightly out of focus and off to the side, and who knows, right?
And so there you put the camera in manual mode, which allows you to adjust the f-stop and everything else by hand.
[366] There are benefits to having both of these camera modes.
It allows you to navigate the trade-off between efficiency and flexibility, or reliability and flexibility.
[368] That is, if I tried to do all those different things that I might do for portraits and landscapes by hand, I might make a mistake, and I'm trying to do this quickly.
It's much better to have a point-and-shoot setting.
But you can't do everything you might want to do with point-and-shoot settings, and that's why the manual mode is there. And in the same way, we humans have gut reactions, we have feelings, we have habits.
[371] In a familiar kind of situation, you don't even have to think or reason about it when it's something that's just obviously a terrible thing to do.
[372] And that's good.
[373] You want to have that kind of efficiency and sturdiness in your everyday basic moral judgment.
[374] But then sometimes life is complicated, and sometimes there are difficult trade -offs, and then that's when you want to shift into manual mode.
[375] Joshua finds this analogy helpful because it answers a question that people tend to ask, which moral framework is better?
[376] And the answer is neither is better.
[377] It depends on what you're trying to do.
[378] If you're trying to do something quickly, fast, and fairly routine, then you want the automatic settings.
[379] And if you're trying to do something more complicated and you're kind of out of your usual element or trying to do something new, then you want the manual mode.
And I think it's the same thing with moral thinking, that when it comes to everyday life, the basic things that everybody should know not to do, the lying, stealing, cheating kinds of things, we want to have automatic settings that just say, nope, nope, nope, nope, can't do that.
[381] But then when we're making difficult decisions, decisions where on the one hand something might feel wrong or break some kind of rule, but on the other hand, it could mean more people will die if you do the thing that seems like the ordinarily right thing to do, then you want to be able to weigh those things against each other.
And that is exactly what our brains do.
[384] Joshua and other researchers have found that different parts of the brain are at work when we're using our manual mode and our automatic mode.
When people are specifically engaging in utilitarian reasoning, you see increased activity in a part of the brain called the dorsolateral prefrontal cortex.
It's part of a network in our brain that's the home of hard thinking and self-control.
[387] Like if you're trying to remember a long string of numbers or something like that, or if you're resisting an impulse.
[388] The automatic mode engages a different part of the brain.
It's a part of the brain called the amygdala, which is a kind of early-warning, attention-directing system that says something's going on here, maybe bad, you've got to pay attention to this.
[390] And so if you see a snake, for example, before you even realize what you've seen at a sort of conceptual level, your brain is already responding with that flash of, whoa, something you need to pay attention to there.
[391] Joshua has used dilemmas like the trolley problem to understand what happens when we shift between the two modes.
[392] The most revealing scenarios pit the two modes against one another.
One of these scenarios is the dilemma of the crying baby.
[394] This is kind of a horrible case, and I think when I first started thinking about this, it was before I had kids, and now that I have kids, it's like even more horrible to think about.
[395] But this is modeled after some stories that people told about what happened in World War II.
[396] And the story is that people are hiding in a basement and the enemy soldiers are outside and there's a baby that starts to cry.
And if the soldiers hear the baby, they'll find everybody and they'll kill everybody, including the baby. Is it morally acceptable to smother the baby in order to save everybody else, knowing that the baby would die anyway if you didn't do anything?
[398] Compared to the standard version of the trolley problem, the crying baby case ramps up the stakes.
[399] One, it's your own child as opposed to a stranger.
[400] And at least the way the story is described, the child's going to die anyway if you don't do that.
[401] And so that's something that also seems to matter.
[402] That is, in general, people are more willing to say that it's okay to do the horrible thing in the name of the greater good if the horrible thing involves hurting somebody who would end up being harmed anyway.
[403] If you give people brain scans as they consider such dilemmas, you can almost see different parts of their minds wrestle with something that feels viscerally wrong but logically correct.
When people have to make an all-things-considered judgment, where they are not told to just think about your feelings or just think about the consequences, but instead to try to weigh those things against each other, you see increased activity in a part of the brain called the ventromedial prefrontal cortex.
[405] This part of the brain integrates these two different types of decision -making.
[406] It's like these different weights get added up and the scale tips one way or another, and that's how we decide what to do.
[407] We told you a second ago that one system for moral reasoning is not better than the other.
[408] That's true, but not always.
[409] There are situations in our modern world for which the fast, intuitive system in our heads is not well designed.
[410] This system was not designed for a world in which we no longer live in tribes, where the people who live near us don't always look like us.
It was not designed for a time when we're capable of helping people on the other side of the world.
[412] When Dylan thinks about the balance between these modes of moral reasoning, his mind jumps to cases where loyalty and love caused real harm.
[413] One example, he says, are the ways in which some men have responded to concerns about sexual misconduct.
[414] There have been a lot of stories of the nature of, hey, I'm a guy in media or politics or entertainment.
[415] I've heard rumors about this guy friend of mine being awful toward women for the last few years.
[416] I'm not going to say anything because I feel like he's a good guy and he deserves better than for me to stick my neck out and get into his business.
[417] That's about a lot of things.
[418] That's about endemic misogyny and depression and accountability, but I think it's also about men placing loyalty before other virtues.
[419] And in a family context, I've had some experience of people acting like the fact that we are family and loyal to each other means they can get away with anything and that no one will judge them because they're family and we stick together no matter what.
[420] And I think in those cases, my attitude and the attitude of people close to me is, no, you don't get to do that.
[421] That there are limits, and we are not going to stick by you no matter what.
[422] I think the sum total of all of our small partialities has added up to a lot of injustices in a lot of cases.
[423] That doesn't mean loyalty is a bad thing always.
Like, you should be loyal to people it's worth being loyal to, but I think the "worth being loyal to" part is sometimes unexamined.
[426] Even those of us who recognize the problem with favoritism may find that actually doing something about it is simply too hard.
[427] Suddenly, we have to account for every decision.
If we spend $200 on our child's birthday party, we've chosen not to spend that $200 saving a poor child in a faraway country.
[430] We have to reckon with what this means.
[431] The life of that faraway child matters less than the happiness of the one who lives in our house.
Spending money on family or friends or ourselves can feel like an extravagance when there's so much deprivation in the world.
[433] Dylan agrees that you can go too far in putting other people's needs first.
[434] He believes that giving to others needs to be sustainable.
I've made the judgment that I'm not going to beat myself up for giving 10% rather than 50% of my income.
[436] Hannah also has a say in this, and not just when it comes to how they give away money.
[437] Well, I guess the liver is one example.
[438] Yeah, I was remembering the...
[439] Wait, you wanted to donate a liver?
[440] Yeah, the dinner.
[441] Yeah, let me tell the story.
[442] You tell the story.
[443] You realize you don't have two livers.
[444] He does not have two livers.
[445] So after Dylan donated his kidney, we were invited to this very nice gala in New York that sort of celebrates kidney donors.
[446] This is the same gala where Dylan was given the award that's on his bookshelf, the one that says American hero.
[447] We were sat at this table with maybe like four or five other kidney donors and their spouses, almost all of whom were men.
[448] The guys start talking and they started talking about how now that they've donated a kidney, what they're thinking about is donating the liver.
[449] There's one guy in particular.
[450] Yes.
[451] Like, I'm just going to do a small piece, so it's for a baby.
[452] Yeah, that if you donate liver to a child, you don't have to donate the entire organ.
[453] You just have to donate a little piece of it because it will grow.
[454] And so then also you can do this multiple times because your own liver will grow back.
[455] And then you could cut a little piece off of it and donate that.
[456] And the guys are all like, yeah, that sounds great.
[457] I've been thinking about this.
[458] It felt like skateboarders, like talking to someone who was trying a new trick they had never tried before.
[459] They're like, that's sweet, man. And all of the spouses are like, absolutely not.
[460] I refuse to do this again.
[461] Joshua is interested in spreading a form of morality that encourages people to use logic and reason to make moral decisions, but stops short of demanding that they donate their organs to strangers.
[462] What I call deep pragmatism.
[463] He compares this approach to dieting.
You can imagine your physiologically optimal diet, but that's not the best diet for you if you're never going to stick to it, right?
[465] The best diet for you is the diet that you're most likely to stick with.
I think being a moral person, even though it's a much more serious issue than dieting, is the same. That's the deep pragmatist perspective: to remember that you're operating within the limits of your own psychology.
[468] What does it take to care for people beyond those in our inner circle?
[469] It requires us to ask the same question that Joshua raised earlier.
[470] How big is my us?
[471] What is the range of humans or beings even that I care about?
[472] We're capable of expanding the range of people who fit in our us.
Mahzarin Banaji, the psychologist, has experienced this herself.
[474] During the tsunami that hit Japan, it took me two seconds to write to my colleague in Japan, because I know somebody who lives there and had spent time with me in graduate school.
[475] I wrote a check immediately to the Red Cross, and all of those actions happened.
[476] And this was in some foreign part of the world that in the lives of my grandparents and parents, somebody in Japan would simply not have been a person associated with them.
So now that Susumu is a part of my in-group, the fact that he lived so far away didn't matter to the act of helping.
Mahzarin sees what happened as a parable about what modern societies make possible.
They give us the opportunity to include people from the out-group into the in-group.
[480] But expanding the size of your us through friendships has limits.
There were other disasters Mahzarin did not respond to because she did not feel a personal connection.
[482] And it did make me wonder about having rules for how I might do this that are not based on what tugs at the heartstrings, but rather where the need is the greatest and in a more neutral way.
[483] Joshua says there is a simple way for people to try and expand their circles of connection.
[484] The number one thing that people who have disposable income can do is make a donation to an organization like the Against Malaria Foundation.
If you can donate $3,000, odds are you can save someone's life.
[486] That's not an exaggeration, right?
[487] And if you and 10 other people can each give $300, you can save someone's life.
[488] Or 100 other people each give $30, right?
[489] Those are real things that ordinary mortal people can do.
Joshua thinks about something he heard from the philosopher Will MacAskill.
[491] If you ran into a burning building and saved somebody's life, that would be, you know, one of the defining moments of your life.
[492] And you'd remember that for the rest of your life and other people would remember you by it.
But most people, you know, who earn a decent professional salary, you could save at least one or two lives every year and not even think much about it.
I mean, it's mind-boggling that we have this power, and yet it's absolutely true.
[496] As humans, we come programmed with an immense capacity for love and loyalty.
[497] We also have the capacity to see the entire world as our family.
[498] Both of these systems can produce immense good, and both have major shortcomings.
[499] Like a skilled photographer who changes her settings, depending on the photo she wants to take, we would be wise to choose our moral settings deliberately.
[500] This episode was produced by Raina Cohen and Thomas Liu.
[501] It was edited by Tara Boyle, Kat Schuchnecht, and Parth Shah.
[502] Our team includes Jenny Schmidt, Laura Querell, and Lushik Waba.
[503] Engineering support from Gilly Moon and Joshua Nua.
Our unsung hero this week is John Lansing.
John is NPR's CEO.
[506] He recently came on the job only to find himself dealing with the effects of a global pandemic and a very intense news cycle.
[507] John has steered NPR these past few months with great empathy.
[508] He comes across as honest, kind, and direct, exactly the qualities you want to see in a leader.
[509] It doesn't hurt that he's also a fan of Hidden Brain.
This episode was the second in a two-part series on moral decision-making.
If you missed the first, it's called Justifying the Means.
[513] For more Hidden Brain, you can follow us on Facebook and Twitter.
[514] If you like this episode, please share it with a friend, or better yet, with a stranger.
[515] I'm having fun imagining our future kid listening to this and me being like, I didn't know you yet, buddy.
[516] It's not personal.
[517] I'll explain to him why we're staying away from trains.
[518] I'm Shankar Vedantam, and this is NPR.