Hidden Brain XX
[0] This is Hidden Brain.
[1] I'm Shankar Vedantam.
[2] On a September evening in 2016, Terence Crutcher's SUV stopped in the middle of a road in Tulsa, Oklahoma.
[3] A woman saw him step out of the car.
[4] The doors of the car were open.
[5] The engine was still running.
[6] The woman called 911.
[7] Officer Betty Shelby was on her way to an unrelated incident when the call came in.
[8] Terence was 40, African American, born and raised in Tulsa.
[9] He was a church-going man with four children.
[10] Betty was 42, white, a mother.
[11] She was born in a small town not far from Tulsa.
[12] In an ideal world, these two Oklahoma natives, close in age, ought to have had more to bring them together than to hold them apart.
[13] But on this evening, there was no small talk or friendly chatter.
[14] The police officer told Terence to take his hands out of his pockets.
[15] According to her attorney, he at first complied.
[16] He then put his hands up in the air.
[17] Moments later, he put his hands back in his pockets.
[18] By this point, multiple police officers had gathered and drawn their guns and tasers.
[19] Overhead, a police chopper filmed events as they unfolded.
[20] From the video, it's hard to tell exactly what's happening on the ground, but an officer in the helicopter thinks Terence isn't cooperating.
[21] Time for a taser, I think.
[22] That looks like a bad dude, too, to be honest.
[24] Moments later, one officer on the ground does fire a taser.
[25] Betty Shelby fires her gun.
[26] Shots fired!
[27] Ooh, 3-21, we have shots fired.
[28] We have one suspect down.
[29] We need EMSA here.
[30] She kills Terence Crutcher.
[31] Later, police discover that he was unarmed.
[32] Soon, accusations are flying.
[33] Some said, maybe the victim was high on drugs.
[34] Others said, maybe the police officer was racist.
[35] At a press conference after the shooting, a journalist asked Scott Wood, Betty Shelby's attorney, about that theory.
[36] Did him being a big black man play a role in her perceived danger?
[37] No, him being a large man played a role in her perceiving danger.
[38] She's worked in this part of town for quite some time.
[39] And just the week before, she was at an all-black high school homecoming football game.
[41] She's not afraid of black people.
[42] Terence Crutcher's sister, Tiffany, sees it very differently.
[43] She thinks her brother was shot because he was black.
[44] That big bad dude was my twin brother.
[45] That big bad dude was a father.
[46] That big bad dude was a son.
[47] That big bad dude was enrolled at Tulsa Community College.
[48] Just wanting to make us proud. Betty Shelby's daughter, Amber, defended her mother.
[49] I am here to reveal to you the side of my mother that the public does not know.
[50] My mother is an incredible, supportive, loving, and caring woman.
[51] She is a wife, a mother, and a grandmother with a heart of gold.
[52] She has always fought for the underdog and stands up for the weak.
[53] And so it went with accusations against and defenses for the person who pulled the trigger and the person who got shot.
[54] Betty Shelby was acquitted of manslaughter charges.
[55] Still, the tenor of the back and forth, the psychological accusations and psychological defenses, is very revealing.
[56] When an incident like this occurs, we want to hear the story of what happened.
[57] We want to know what was going on in the mind of the shooter and the mind of the victim.
[58] Was Terence Crutcher truly a threat?
[59] Did Betty Shelby dislike black people?
[60] What clues explain the behavior of these individuals?
[61] We home in, dig for facts, and look for psychological explanations.
[62] But what if there's another way to think about what happened, one that has less to do with the individuals involved and more to do with the context in which the shooting occurred?
[63] What we're discovering here is that the individual mind sits in society and the connection between mind and society is an extremely important one that should not be forgotten.
[64] Individual behavior and the mind of the village today on Hidden Brain.
[65] I'm Mazarin Banaji.
[66] Mazarin is a psychology professor at Harvard.
[67] She's made a career out of studying the invisible.
[68] For the past 30 years, I've been interested in studying those aspects of our minds that are hidden from our own conscious awareness.
[69] Mazarin's interest began in graduate school.
[70] She was teaching psychology at Yale and looking for a way to measure people's biases.
[71] There was debate over the right scientific method to do this.
[72] You could simply ask people their views, but because prejudice is a sensitive topic, you often don't get anything.
[73] You couldn't walk up to somebody and say, do you agree that Italians are lazy, and have them say yes or no.
[74] They'd just refuse to answer that question.
[75] Our deep -seated discomfort about discussing prejudice was one hurdle for a researcher looking to study the phenomenon.
[76] Mazarin realized there was another barrier.
[77] What if some forms of prejudice are so deeply buried that people themselves don't realize they harbor such bias?
[78] Perhaps we behave in ways that are not known to our own conscious awareness, that we are being driven to act in certain ways, not because we're explicitly prejudiced, but because we may carry in our heads the thumbprint of the culture.
[79] Was there a way to decipher this thumbprint and expose people's hidden biases?
[80] Eventually, Mazarin, with the help of her mentor, Tony Greenwald, and then graduate student Brian Nosek, developed a simple, ingenious test.
[81] It's called the Implicit Association Test, or the IAT.
[82] It's based on the way we group things in our minds.
[83] When you say bread, my mind will easily think butter, but not something unrelated to it.
[84] Like, say, a hammer.
[85] Our brains make associations, and these associations can reveal something important about the way we think.
[86] So the way the IAT works is to simply ask people to sort things.
[87] So imagine that you're given a deck of playing cards, and you're asked to sort all the red cards to the left and all the black cards to the right.
[88] I'll predict that it will take you about 20 seconds to run through that deck.
[89] Next, Mazarin says, shuffle the deck and re -sort the cards.
[90] This time, I'd like you to put all the spades and the diamonds to one side and the clubs and the hearts to the other side.
[91] And what we'll find is that this will take you nearly twice as long to do.
[92] Why?
[93] Because a rule that your brain had learned, red and red go together, black and black go together, is no longer available to you.
[94] Remember that in both scenarios, you are grouping two suits together.
[95] In the first scenario, you're grouping hearts with diamonds and clubs with spades.
[96] In the second scenario, you're grouping hearts with clubs and diamonds with spades.
[97] Because there's a simple rule for the first task, group red with red and black with black, that task is easy.
[98] In the second scenario, you need an extra fraction of a second to think about each card.
[99] You can't follow a simple rule of thumb.
[100] Mazarin and Tony and Brian had an important insight.
[101] These rules of association apply to many subjects, including the way we think about other human beings.
[102] So they created a new sorting task.
[103] Sort for me, faces of black people and bad things together, words like devil and bomb and vomit and awful and failure.
[104] Sort those on one side of the playing deck.
[105] On the other side, put the faces of white people.
[106] And words like love and peace and joy and sunshine and friendly and so on to the other side.
[108] This turns out to be pretty easy for us to do, because, as my colleagues and I will argue, the association of white and good and black and bad has been made for us in our culture.
[109] The test doesn't end there.
[110] After sorting white and good into one group and black and bad into another, you now have to do it again, this time grouping black with good and white with bad.
[111] And when you try to do that, and when I try to do that, the data show that we will slow down, that we can't do it quite as fast, because black and good are not practiced responses for us.
[112] They're not habitual responses for us.
[113] We have to exert control to make that happen because it doesn't come naturally and easily to us.
[114] That's the IAT.
[115] By the way, if you're wondering whether the order of the test makes a difference, it doesn't.
[116] The researchers have presented tests to volunteers in different ways.
[117] It doesn't make a difference if you ask people to first group black with bad or to group black with good.
[118] In both cases, people are faster to associate white faces with positive words and black faces with negative words.
[119] Mazarin thinks the IAT is measuring a form of bias that is implicit or unconscious.
[120] Mazarin herself has taken the IAT many times. To her dismay, the tests show that she has robust levels of unconscious bias.
[121] My brain simply could not make the association of black with good as quickly as I could make the association of white with good.
[122] And that told me something.
[123] It told me it's not the IAT that's screwed up.
[124] It's my head that's screwed up.
[125] Because the IAT is a timed test, the results can be precisely measured.
[126] Implicit bias, in other words, can be quantified.
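Because the test boils down to arithmetic on response latencies, the scoring is easy to illustrate. Here is a minimal Python sketch of an IAT-style score; the reaction times are invented, and the function is a simplified stand-in for the published D-score algorithm, not the actual scoring code.

```python
import statistics

# Hypothetical per-trial reaction times in milliseconds (invented numbers).
congruent = [612, 587, 640, 598, 605, 621, 593, 610]    # e.g., white+good / black+bad block
incongruent = [745, 802, 760, 778, 731, 790, 766, 752]  # e.g., black+good / white+bad block

def iat_style_score(congruent, incongruent):
    """Mean latency difference scaled by the pooled standard deviation,
    a simplified stand-in for the published D-score algorithm."""
    diff = statistics.mean(incongruent) - statistics.mean(congruent)
    pooled_sd = statistics.stdev(congruent + incongruent)
    return diff / pooled_sd

print(f"IAT-style score: {iat_style_score(congruent, incongruent):.2f}")
# Larger positive values mean slower responses on the incongruent block.
```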
[127] Most psychological tests are only available in the lab, but Mazarin and her colleagues decided to do something radical.
[129] They put their test on the internet.
[130] You can find it today at implicit.harvard.edu.
[131] Millions of people have taken this test.
[132] The data has been collected, shared, disseminated.
[133] The IAT is widely considered today to be the most influential test of unconscious bias.
[134] As Mazarin and Tony and Brian were developing the IAT, other researchers were developing different ways to measure bias.
[135] Psychologist Joshua Correll found himself diving into the field shortly after a black man was shot and killed in New York City in 1999.
[136] His name, Amadou Diallo.
[137] Diallo was standing unarmed on the front stoop of his apartment building, and the police thought he looked suspicious, and they approached him, and they ended up shooting him.
[138] And the question that everybody was asking, and this was something that people across the country were wondering about, was: was he shot because he was black?
[139] At the time, Joshua was starting graduate school.
[140] I took that question pretty seriously, and we tried to figure out how we could test it in a laboratory.
[141] Joshua and his colleagues eventually developed a video game.
[142] It was a pretty bad video game, but it did the trick.
[143] It's more like a slideshow where there are a series of backgrounds that pop up on the screen.
[144] And then in one of those critical backgrounds, a person will suddenly appear.
[145] So we've got photographs of, say, 25 or so white men and 25 black men.
[146] And we've photographed these guys holding a variety of different objects: cell phones, a can of Coke, a wallet, a silver pistol, a black pistol.
[147] And so we've just edited the photograph so that the person pops up in the background holding an object, and the player has to decide how to respond.
[149] And they're instructed, if the guy on the screen has a gun, he's a bad guy, and you're supposed to shoot him.
[150] And you're supposed to do that as quickly as you possibly can.
[151] What Joshua wanted to know was whether players would respond differently depending on the race of the target on the screen.
[152] Say, a black guy pops up holding a wallet and a white guy pops up holding a wallet.
[153] What's the likelihood that the black guy gets shot and the white guy doesn't?
[154] If current events are any clue, you may guess the answer.
[155] Here we're looking at, say, you know, the player is responding to a target who's holding a wallet, and the correct decision is to say don't shoot.
[156] And what we found is that they are faster to say don't shoot if the target is white rather than black.
[157] The same held true for armed targets.
[158] Test takers were faster to shoot black targets, slower to shoot white ones.
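To make that comparison concrete, here is a minimal Python sketch of the summary behind a result like this; the reaction times, trial counts, and condition labels are hypothetical, not Joshua's actual data.

```python
from statistics import mean

# Hypothetical reaction times (ms) for correct responses, by condition.
trials = {
    ("white", "gun"): [423, 441, 430],     # correct response: shoot
    ("black", "gun"): [398, 405, 412],     # correct response: shoot
    ("white", "wallet"): [456, 449, 462],  # correct response: don't shoot
    ("black", "wallet"): [498, 510, 487],  # correct response: don't shoot
}

for (race, obj), times in trials.items():
    print(f"{race} target holding a {obj}: mean RT = {mean(times):.0f} ms")

# The reported pattern: faster "shoot" responses to armed black targets than
# to armed white targets, and slower "don't shoot" responses to unarmed
# black targets than to unarmed white targets.
```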
[159] Now, you might think that Joshua would conclude that his test takers were just racist.
[160] But one important similarity between Joshua's test and Mazarin's test is that they do not presume that the people with such biases have active animosity toward African-Americans.
[161] These are not members of the Ku Klux Klan.
[162] It was just exactly what we had predicted, and I guess both kind of hoped and feared, right?
[163] I mean, it's an exciting scientific moment, but it also suggests something kind of deeply troubling: these participants are presumably nice people with no bone to pick.
[164] They're not bigots.
[165] They're not angry at black people in any way.
[166] But what we saw in their data very, very clearly is a tendency to associate black people with threat and to shoot them more quickly.
[167] You could say that both of these psychological tests are academic exercises.
[168] Do they say anything about how people behave in real life?
[169] Joshua Correll is very clear that his video game experiment cannot replicate real life.
[170] It's impossible, he says, to recreate in a lab the fear and stress that a real-world police confrontation can generate.
[172] The IAT has also been criticized for a somewhat hazy link between test results and real -world behavior.
[173] Hundreds of studies have been conducted looking at whether the IAT explains or predicts how people will act.
[174] The results have been mixed.
[175] In some studies, unconscious racial bias on the test seems to predict how people will behave.
[176] Researchers found, for example, that doctors who score high in implicit bias are less likely to prescribe clot-busting heart drugs to black patients compared to white patients.
[177] But other studies, also looking at doctors and black and white patients, find no correlation between results on the bias test and actual behavior.
[178] This discrepancy bothers psychologist Phil Tetlock at the University of Pennsylvania.
[179] He is a critic of the IAT.
[180] It's a test that is enormously intuitively appealing.
[181] I mean, I've never seen a psychological test take off the way the IAT has, and grip the popular imagination the way it has, because it just seems on its surface to be measuring something like prejudice.
[182] Tetlock and other critics are concerned that just because someone shows bias on the IAT doesn't mean that they're going to act in biased ways in real life.
[183] If a test cannot predict how you're actually going to behave, isn't it just an academic exercise?
[184] There is the question of whether or not people who score as prejudiced on the IAT actually act in discriminatory ways toward other human beings in real -world situations.
[185] And if they don't, if there is very close to zero relationship between those two things, what exactly is the IAT measuring?
[186] It turns out a lot.
[187] There's new evidence that suggests that the IAT does in fact predict behavior, but to see it, you have to zoom out.
[188] You have to widen the lens to look beyond the individual and into the community.
[189] Hello, my name is Eric Heyman.
[190] He's a psychology professor at Ryerson University.
[191] Eric became interested in the IAT as he was researching the use of lethal force in policing.
[192] He was trying to design a statistical model that would predict where in the United States people of color are disproportionately likely to be shot and killed by police.
[193] First, he needed some baseline data.
[194] This proved hard since the federal government does not require police departments to report deadly shootings by officers.
[195] We really had no idea about really basic questions, such as how often they were happening, where they were happening, and who they were happening to.
[196] But in 2015, some news outlets, including the Washington Post and the British newspaper The Guardian, began to compile their own databases on police homicides in the United States.
[197] According to official terminology, these are known as justifiable homicides.
[198] So what they were putting together was the most comprehensive list of these justifiable homicides in the United States.
[199] Eric used this data to pinpoint where disproportionate police shootings of minorities were most likely.
[200] Then he turned to the IAT data.
[201] Eric suspected that if bias was a factor in police shootings, it was likely that implicit bias, rather than overt racism, was at play.
[202] Traditionally, the field has found that explicit biases predict behaviors that are under our conscious control, whereas implicit biases predict things that are a little bit more automatic, a little bit more difficult to control.
[204] And this is exactly the sort of behavior that we thought might be involved in police shootings.
[205] People take the IAT anonymously, but they need to provide some information, like their race and where they live.
[206] With the millions of data points the IAT provided, Eric painted a map of bias across the United States.
[207] Some places seem to have lots of bias, others, very little.
[208] Now, he had two databases.
[209] He cross -referenced them to see if there was any connection between communities with disproportionate numbers of police shootings of minorities and communities showing high levels of implicit bias.
[210] A powerful correlation emerged.
[211] So we find that in communities in which people have more racial biases, African-Americans are being killed more by police than their presence in the population would warrant.
[212] Let me repeat this because it's important.
[213] In places where implicit bias in a community is higher than average, police shootings of minorities are also higher than average.
[214] Eric's analysis effectively pinpoints where police shootings are likely to happen.
[215] But here's what makes the finding crazy.
[216] Most people who take the IAT are not police officers.
[217] So we're predicting police behavior without measuring police themselves at all.
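The analysis Eric describes reduces to a correlation across communities: average the IAT scores of the people who live in each place, then ask whether that average tracks how disproportionate police shootings are there. A minimal sketch, with invented numbers standing in for his two databases:

```python
from statistics import correlation  # available in Python 3.10+

# Hypothetical region-level averages of residents' IAT scores.
mean_iat_score = [0.31, 0.38, 0.42, 0.27, 0.45]
# Ratio of African-Americans' share of police shootings to their share of
# the population in each region (1.0 would be proportionate).
shooting_disproportionality = [1.4, 1.9, 2.3, 1.1, 2.6]

r = correlation(mean_iat_score, shooting_disproportionality)
print(f"Pearson r = {r:.2f}")  # a positive r mirrors the pattern described here
```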
[218] Coming up, we explore how a test can predict how people will behave, even when they are not the people who've taken the test.
[219] This is Hidden Brain.
[220] I'm Shankar Vedantam.
[221] On today's show, we're discussing implicit bias, the unconscious attitudes we have in our minds that shape the ways we interact with the world.
[222] Earlier, we introduced the IAT, the implicit association test, a tool used to measure a person's implicit biases.
[223] Psychologist Eric Heyman found a way to predict police behavior by comparing places that have high levels of implicit bias with places where police shootings of minorities are higher than average.
[224] Since police typically don't take the IAT, how could the IAT be predicting their behavior?
[225] Eric thinks the test has tapped into the mind of the community as a whole.
[226] Say there's a neighborhood that's traditionally associated with threat or danger, and the people who live in that neighborhood have these associations between African Americans and threat, or African Americans and danger.
[227] And these would be anybody in this community.
[228] This could be my mother or the person who lives down the street, not necessarily the police officers themselves.
[229] But there's this idea that this attitude is pervasive across the entire area and that when officers are operating in that area, they themselves might share that same attitude that might influence their behaviors in these split -second challenging life -and -death decisions.
[230] Implicit bias is like the smog that hangs over a community.
[231] It becomes the air that people breathe.
[232] Or as Harvard psychology professor Mazarin Banaji might say, the thumbprint of the culture is showing up in the minds of the people living in that community.
[233] There are many examples of this idea that individual minds shape the community and the community shapes what happens in individual minds.
[234] Seth Stephens-Davidowitz is a data scientist who used to work at Google.
[235] We featured him on Hidden Brain before.
[236] In his book, Everybody Lies, Seth explains how big data from Google searches can predict with great accuracy things like the suicide rate in a community or the chances that a hate crime will take place.
[237] We've shown that you can predict hate crimes against Muslims based on searches people make.
[238] People make very, very, very disturbing searches, searches such as kill Muslims or I hate Muslims.
[239] And these searches can predict on a given week how many hate crimes there will be against Muslims.
[240] But I think the right approach to this is not to target any particular individual, to show up at the door of any particular individual who makes these searches.
[241] But if there are many, many searches in a given week, it would be wise for police departments to put extra security around mosques because there is greater threat of attacks.
[242] In other words, what the Google search data is doing is effectively taking the temperature of an entire community.
[243] That's what you're really saying, that you're picking up on things that are in the ether, if you will, in the community that might not show up in the individual but are likely to show up in the aggregate.
[244] Yeah, and I think you don't really know the reason that any particular person makes a search, right?
[245] Someone could be searching kill Muslims because they're doing research, or they're just curious about something, or they made a mistake while typing.
[246] There are a lot of reasons an individual can make these searches.
[247] But if twice as many people are making these searches, well, if I were a Muslim American, I'd want some extra security around my mosque, right?
[248] Asking whether implicit bias affects the behavior of every individual is a little like investigating everyone who types an offensive search term into Google.
[249] A lot of the time, you're going to find nothing.
[250] And yet, when you look at search terms in aggregate, it can tell you with great precision which areas will see the most hate crimes.
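In code, that aggregate logic might look like the following sketch: ignore any single search, but flag a week when the volume of threatening searches runs well above its historical baseline. The counts and the two-standard-deviation cutoff are hypothetical choices, not Seth's actual method.

```python
from statistics import mean, stdev

# Hypothetical weekly counts of threatening searches in one metro area.
weekly_counts = [41, 38, 45, 39, 44, 40, 43, 37, 42, 90]  # final week spikes

baseline = weekly_counts[:-1]
threshold = mean(baseline) + 2 * stdev(baseline)  # illustrative cutoff

if weekly_counts[-1] > threshold:
    print("Elevated aggregate signal: consider extra security around mosques this week.")
```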
[251] For her part, Mazarin Banaji believes that Eric's work is a key link between her psychological data on individuals and sociological insights on how a community behaves.
[252] What we're discovering here is that the individual mind sits in society, and the connection between mind and society is an extremely important one that should not be forgotten.
[253] And that more than any other group of people, social psychologists owe it to the beginnings of their discipline to do both and to do it even-handedly: to be focused on the individual mind, and to be talking about how that mind is both influenced by and is influencing the larger social group around her.
[254] This is why Mazarin says, when a problem has spread throughout a community, when it has become part of the culture, you can't fix it by simply focusing on individuals.
[255] One of the difficulties we've had in the past is that we have looked at individual people and blamed individual people.
[256] We've said, if we can remove these 10 bad police officers from this force, we'll be fine.
[257] And we know as social scientists, and I believe firmly, that that is no way to change anything.
[258] This new way of thinking about bias showed up in the 2016 presidential election.
[259] Democrat Hillary Clinton said implicit bias probably played a real role in police shootings.
[261] I think implicit bias is a problem for everyone, not just police.
[262] I think, unfortunately, too many of us in our great country jump to conclusions about each other.
[263] Republican Mike Pence, now vice president, bristled at the idea.
[264] He said that Hillary Clinton was calling cops racist.
[265] When an African-American police officer's involved in a police action shooting involving an African-American, why would Hillary Clinton accuse that African-American police officer of implicit bias?
[266] I guess I can't believe you are defending the position that there is no bias.
[267] But as Mazarin says, it's not quite right to think of people with implicit bias as harboring the kind of racial animosity we typically think of when we say someone is a racist.
[268] Small kids show implicit bias.
[269] African Americans themselves show implicit bias against other African Americans.
[270] The test isn't picking up the nasty thoughts of a few angry outliers.
[271] It's picking up the thumbprint of the culture on each of our minds.
[273] So, what can we do?
[274] Mazarin is skeptical of those who offer training courses that promise quick-fix solutions.
[275] There are many people across the country who say that they offer such a thing called implicit bias training.
[276] And what they do is explain to large groups of people what might be going on that's keeping them from reaching their own goals and being the good people that they think they are.
[277] And my concern is that when I'm an old woman, I will look back at this time and think, why didn't I do something about this?
[278] Because I don't believe this training is going to do anything.
[279] In Mazarin's view, you can't easily erase implicit bias, because you can't erase the effect of the culture when people are living day in and day out in that same culture.
[280] But she and others argue that there might be ways to prevent such biases from influencing our behavior.
[281] Let's return to psychologist Joshua Correll.
[282] Remember, he's the one who created the shooter video game that found that test takers were more likely to shoot black targets than white ones.
[283] Many of Joshua's initial test takers were students.
[284] Eventually, he decided to see what would happen if police officers took the test, so he went to the Denver police.
[285] We brought down a bunch of laptops and button boxes, a bunch of electronic equipment that we were using to do the study, and we would set it up in their roll call room.
[286] And it was just complete chaos and really, really fun.
[287] And some of the police really wanted nothing to do with us.
[288] But a huge number of them volunteered and they wanted to talk with us afterwards.
[289] At first, the police officers performed exactly the same as everyone else.
[290] Their levels of implicit bias were about the same as lay people who have taken the test, both in response times and in the number of mistakes they made.
[291] But when it came to the actual shooting of targets, the police were very different.
[292] The police officers did not show a bias in who they actually shot.
[293] Those earlier test takers, college students and other laypeople, displayed their bias on response times, mistakes, and who they shot.
[294] But not the police.
[295] But whereas those stereotypes may influence the behavior of the college students and of you and me, the police officers are somehow able to exert control.
[296] So even though the stereotype, say, of threat may come to mind, the officer can overcome that stereotype and respond based on the information that's actually present in the scene rather than information that the officer is bringing to it through his or her stereotypes.
[297] Joshua wondered whether there were certain factors that might keep police officers from exerting this kind of cognitive control over their biases.
[298] He found, among other things, that sleep made a difference.
[299] Those who were getting less sleep were more likely to show racial bias in their decisions to shoot.
[301] And again, that's just consistent with this idea that they might be able to exert control, to use cognitive resources to avoid showing stereotypic bias in their decisions.
[302] But when those resources are compromised, they can't do it.
[303] And they could be compromised in a variety of ways.
[304] Sleep is just one way that we could compromise it.
[305] This, once again, is evidence that you can't train people not to have unconscious bias, but as Joshua suggests, you can do things to make it less likely that people will be affected by their bias.
[306] To be clear, Joshua's experiments are laboratory experiments.
[307] We know that in real life, police officers do shoot people in error.
[308] Now, this could be because in a real -life encounter, stuff happens that makes it very difficult for you to actually think about what you're doing.
[309] So on the street, when somebody pulls a gun on you, it's scary, right?
[310] Like, cops when they're involved in these firefights report some crazy, crazy psychological distortions because they're legitimately freaked out.
[311] If they think somebody poses a life and death threat, they may panic, and it may be hard to bring those cognitive resources online.
[312] In several recent high-profile cases, as in the case of Terence Crutcher and Betty Shelby, police officers have shot people who were not armed.
[313] Of course, officers do not always know whether someone is armed.
[314] It's only in hindsight that we know the officer was or wasn't in real danger.
[315] Joshua's larger point is that police encounters can be inherently stressful.
[316] The uncertainty embedded in a confrontation can make it very difficult to think objectively.
[317] Let's put it another way.
[318] If you're running a police department and want to reduce errors and shootings, it may not be useful to lecture cops on how they shouldn't be racist.
[319] It may be more useful to build procedures that give cops an extra half-second when they're making decisions under pressure.
[320] With practice and a bit of time to exercise conscious control, people can reduce the risk of falling prey to their implicit biases.
[321] There is the potential to control it, right?
[322] The performance of the regular police officers, or even people that we train in our lab, suggests that people don't have to succumb to those stereotypic influences.
[323] They can exert control in certain circumstances.
[324] Mazarin Banaji has a similar solution.
[325] She thinks we need more of what she calls in-the-moment reminders.
[326] For example, it's been found that some doctors prescribe painkillers to white patients far more often than they do to black patients who are reporting exactly the same levels of pain.
[327] The only difference is the patient's skin color.
[328] This suggests that bias is at work.
[329] Mazarin says if the bias is implicit, meaning physicians are acting biased without intending to be biased, a timely reminder can help doctors exercise conscious control over their unconscious associations.
[330] You type in a painkiller that you want to prescribe to a patient into your electronic system while the patient is sitting next to you.
[331] And it seems to me quite simple that when you type in the name of any painkiller, let's say codeine, that a little graph pops up in front of you that says, please note, in our hospital system, we have noticed that this is the average amount of painkiller we give to white men.
[332] This is the average amount we give to black men for the same reported level of pain.
[333] In other words, giving doctors an opportunity to stop for a second to make a decision consciously and deliberately.
[334] This can reduce the effect of implicit bias.
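Here is a minimal sketch of what such an in-the-moment reminder could look like inside an e-prescribing system. The drug, the dose averages, and the confirmation step are all hypothetical; the point is the deliberate pause before the order goes through.

```python
# Hypothetical hospital-wide averages for the same reported level of pain.
avg_dose_mg = {
    "codeine": {"white patients": 42.0, "black patients": 31.5},
}

def order_painkiller(drug: str) -> None:
    """Show the hospital's own averages before the order is confirmed."""
    stats = avg_dose_mg.get(drug.lower())
    if stats:
        print(f"Please note: average {drug} dose in our hospital system,")
        for group, dose in stats.items():
            print(f"  {group}: {dose:.1f} mg")
        print("for the same reported level of pain.")
    answer = input("Confirm this order? (y/n) ")  # the pause is the intervention
    print("Order placed." if answer.lower() == "y" else "Order cancelled.")

order_painkiller("codeine")
```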
[335] Psychology has spent many years understanding the behavior of individuals.
[336] But tools such as the IAT give us a way to understand communities as a whole, maybe even countries as a whole.
[337] We did a study some years ago.
[338] Brian Nosek led this particular project in which we looked at gender stereotypes across many countries in the world.
[339] How strongly do we associate female with science and male with science?
[340] And then we looked at the performance of girls and boys, roughly around eighth grade, on some kind of a standardized test.
[341] And what we discovered is that the stronger the gender bias in a country, that is to say, the stronger the association of male with science in a country, the less well girls in that country did on that mathematics test.
[342] That's very similar to the Heyman kind of result, because we didn't measure the gender bias in the girls and the boys who took the test.
[344] We were measuring something at the level of a country in that case.
[345] And yet, it did predict something systematic about the difference in performance between boys and girls.
[346] When we look at an event like a police shooting, we invariably seek to understand it at the level of individuals.
[347] If something bad happens, we think it has to be because someone had bad intentions.
[348] Implicit bias certainly does act on individuals, but it's possible that its strongest effects are at the level of a community as a whole.
[349] This might be why some police shootings of African-American men are carried out by African-American police officers, and why some physicians who are not prescribing pain medications to people of color might themselves be people of color.
[350] Individuals can do their part to limit the effects of bias on their behavior.
[351] But if you want to fix the bias itself, well, that takes the whole village.
[352] You're listening to Hidden Brain.
[353] I'm Shankar Vedantam.
[354] We'll be back in a moment.
[355] This is Hidden Brain.
[356] I'm Shankar Vedantam.
[357] If you follow football (for our listeners in other countries, we're talking about American football), this will sound familiar.
[359] At several points in a game, the action comes to a stop because a ruling that a referee has made on the field has to be reviewed.
[360] When this happens, everyone turns to the television replay.
[362] And experts both on and off the field decide whether the referee made the right call.
[363] After reviewing the play, the ruling on the field has changed.
[364] The receiver did not get control with two feet down in bounds.
[365] And the officials have overturned it.
[366] Each time a referee's ruling gets reversed, it's a small blemish on his reputation.
[367] If your calls get overturned a lot, people will start to say that you're a bad referee.
[368] What happens in football is an analogy for something much more important.
[369] At Harvard's Kennedy School, political scientist Maya Sen has examined how rulings in the federal courts get upheld or overturned.
[370] And the traditional viewpoint in the legal academy is that for a judge to be reversed, it's actually quite costly.
[371] As in sports, it can be costly to that judge's reputation; it can sort of be a mark that they perhaps reached an incorrect decision, or a decision that was just extreme or unpalatable to the higher court.
[373] In other words, costly to an individual judge, but also costly to the system as a whole.
[374] Reversal sends the case back, right?
[375] So it's sent back to a lower court, usually.
[376] And that lower court doesn't have a reduced workflow as a result.
[377] It just has to accommodate another case being handed back down to it.
[378] And so it's costly not just in terms of reputational costs, but it's actually also costly because it adds more work onto a judge's plate.
[379] We're talking this hour about implicit biases, subtle nudges that might cause us to evaluate some people differently than others.
[380] There has been some really interesting work showing that, for example, employers who look at resumes that are essentially identical, with the names at the top of the resumes either being white names or traditionally African-American names, have found massive differences, for example, in callback rates and things like that.
[381] Maya wanted to know if such biases might also shape which judicial rulings get overturned, or more specifically, which judges get overturned.
[382] Even more simply, she asked, do black judges get reversed more often than white ones?
[383] Before we get to the research, a quick bit of history.
[384] There are hundreds of judges in the U.S. federal court system, and up until the 1960s, all of them were white.
[385] Through the 1960s and into the 1970s, there was a big push to diversify the courts in the United States.
[386] Presidents like Jimmy Carter began appointing black judges to the courts.
[387] The thing is, these judges were usually the most junior members of the system.
[388] Our judicial system is very hierarchical.
[389] So there are certain courts at the bottom, and then there is, you know, the Supreme Court at the top.
[390] And along the way, the way that cases get up to these higher courts is by being appealed.
[391] So if someone goes to federal court and doesn't like the way a judge has ruled, she can challenge that judge's decision by filing an appeal.
[392] So if a case is appealed to a higher court, then that higher court has the opportunity to uphold the lower court's ruling or reverse it.
[393] As we said before, judges whose rulings get overturned a lot are at risk for being seen as less competent.
[394] What did you think about when you were sort of saying, well, reversals might tell me something interesting?
[395] What was the hypothesis that you wanted to explore?
[396] So in the case of reversal, my working hypothesis was whether black judges would be reversed more on appeal than white judges.
[397] You looked at a specific group of judges.
[398] So tell me a little bit about the data set you looked at and why you looked at that data set.
[399] I did.
[400] So for this study, I chose the federal courts.
[401] And I chose to look at federal judges in part because they are sort of the more prestigious of the courts in the United States.
[402] We tend to have the best data.
[403] We tend to have the most information about them, which, from my perspective, as a quantitative data scientist, is appealing.
[404] Maya looked at judges of all races appointed between 1960 and 2012, 10 percent or so of whom were African American, and analyzed how often their cases were appealed and how often their rulings were overturned.
[407] What did you find?
[408] So essentially what we find, what I find, and this is very consistent across different ways of slicing and dicing the data, is that black judges have a consistently higher reversal rate.
[409] They are much more likely to be reversed once a case that they've written has been appealed to a higher court.
[410] And that's the case regardless of whether we control for or take into account differences in the court in which they sit, the kinds of cases that they hear, their age, their gender, their professional experience, their qualifications ratings from the American Bar Association.
[411] It's actually a very, very sticky finding.
[412] So essentially, black judges are more likely to be reversed by higher courts.
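Findings like this usually come out of regressions with controls. Here is a minimal Python sketch of the general approach, assuming a hypothetical judges.csv with one row per appealed case; the file, the column names, and the model are invented illustrations, not Maya Sen's actual specification.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: one row per appealed case, with a 0/1 `reversed` outcome.
df = pd.read_csv("judges.csv")  # invented file and column names

# Logistic regression of reversal on the judge's race, controlling for court,
# case type, and background covariates, in the spirit of the controls listed above.
model = smf.logit(
    "reversed ~ judge_is_black + C(circuit) + C(case_type)"
    " + age + C(gender) + years_experience + aba_rating",
    data=df,
).fit()

print(model.summary())
# A positive coefficient on judge_is_black that survives the controls would
# mirror the "sticky finding" described above.
```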
[413] At the University of Chicago Law School, Dean Tom Miles reviewed Maya's study.
[414] Well, the conclusions of the paper are surprising because, of course, judges, and particularly appellate judges, are very much attuned to look for racial disparities elsewhere in the legal system.
[415] This is a disturbing finding.
[416] Now, what happens to black judges could be the result of racial bias.
[417] But before we reach that troubling conclusion, there is another theory we need to consider.
[418] This theory is much more innocuous.
[419] To explain what I mean, let's take a short detour.
[420] If you lived through the 1990s in the United States, you might be able to sing a couple of lines from this theme song.
[421] On the playground is where I spent most of my days, chilling out, maxing, relaxing all cool, and all shooting some b-ball outside of the school.
[422] The Fresh Prince of Bel-Air isn't set in a courtroom and doesn't really discuss the judicial system, except...
[423] Wait, wait, wait, hold up, Uncle Phil.
[424] Now, before you get started.
[425] One of the main characters, Uncle Phil, is a judge.
[426] He's a straight-laced character who often wears a suit, and he cringes at the way his nephew Will dresses and speaks.
[427] Oh, yo, the plane ride was stupid.
[428] I was up in first class, me?
[429] No, I was saying the plane was dope.
[430] Excuse me?
[431] No, stupid, dope.
[432] Oh, no, that doesn't mean what you...
[433] How would he say?
[434] The flight was really neat.
[435] In the show's pilot episode, Will and Uncle Phil get into an argument.
[436] Will accuses his uncle of forgetting his roots.
[437] Let me tell you something, son.
[438] I grew up on the streets just like you.
[439] I encountered bigotry you could not imagine.
[440] Now, you have a nice poster of Malcolm X on your wall.
[441] I heard the brother speak.
[442] I read every word he wrote.
[443] Believe me, I know where I come from.
[444] Uncle Phil is something of a prototype for the black judges appointed to the federal judiciary in the 1960s.
[445] As Maya says, Black judges who were appointed earlier, so in the 1960s, 1970s, primarily by Jimmy Carter, were more likely to come from the trenches of the civil rights movement.
[446] As a result, she says, studies have found that black judges do tend to vote differently on certain kinds of cases.
[447] Black judges tend to vote in a more progressive or liberal direction on cases having to do with affirmative action and civil rights.
[448] Maya acknowledges that this could be a potential explanation for her finding that black judges are more likely to be reversed than white ones.
[449] They aren't being reversed because they're black.
[450] They're being reversed because they're ideologically liberal.
[451] So therefore, when their cases are being appealed to higher courts, higher courts might have fewer African Americans, perhaps fewer liberals.
[452] And because of that, cases written by black judges who perhaps are more liberal are therefore more likely to be overturned.
[453] So in other words, racial bias might have absolutely nothing to do with Maya's finding.
[454] Right?
[455] At the University of Chicago Law School, Tom Miles says the innocuous explanation is unlikely to be true.
[456] Maya's research, he says, clearly shows that ideological differences cannot explain the outcome she finds.
[457] That's because Maya controls for the effects of ideology.
[458] By looking at whether black Democratic judges are more likely to be overturned than white Democratic judges. And she finds, even in that context, the racial gap exists.
[459] Since judges are appointed by both Republican and Democratic presidents, Maya reran her study by looking only at judges appointed by Democrats.
[460] Presumably, all these judges have liberal sympathies.
[461] But even in this group, the analysis finds that black liberal judges get reversed more often than white liberal judges.
[462] Even when they're facing an appellate review panel that consists of other Democratic judges.
[463] And she finds even in that context, the racial gap exists.
[464] And so on the first inspection, it doesn't appear that ideology is the explanation for the result.
[465] Now, this study doesn't allow us to test whether unconscious biases are producing this outcome.
[466] But Tom says the outcome is at least consistent with what we know about unconscious bias.
[467] It's certainly consistent with the existing literature that implicit biases exist at all levels of society and thus it's not unthinkable to imagine that it might also exist among judges.
[468] But of course, that would be a very controversial claim, to say that judges, in making their legal decisions in which they're relying on legal materials, are influenced by their knowledge of the identity of the lower court judge.
[469] Maya notes that younger black judges don't have the same activist backgrounds as their predecessors.
[470] Black judges today are actually more likely to have attended Harvard Law School than Howard Law School.
[471] And so they're more likely to have come from private practice, more likely to have attained prestigious clerkships, and they're actually starting to look professionally, at least, more like white judges.
[472] Does this mean that younger black judges are less likely to be reversed than their older counterparts?
[473] You might have a sense of where this is headed.
[474] The answer is no. Maya says younger black judges are just as likely to have their cases reversed upon appeal.
[475] Is one of the implications of this that if you get reversed a lot, you are less likely to advance through the courts?
[476] You know, that's a good question.
[477] That's certainly the public or the legal scholarly perception, which is that reversal might be a signal of an underlying problem, right?
[478] So if you have a judge with very, very high reversal rates, that could potentially be a judge who perhaps writes poor-quality opinions or whose legal reasoning is problematic or who's more ideologically extreme or is too large of a risk taker.
[479] As the country has gotten more diverse, we've sought to make the judiciary more diverse as well.
[480] Americans who show up in federal court today are more likely to see an African-American sitting on the bench compared to their counterparts half a century ago.
[481] Many people take justifiable pride in this change.
[482] Where this paper kind of fits in is pushing back on that a little bit and sort of asking, okay, well, you know, it seems to be true that these judges are introducing this diverse viewpoint, but what does that actually mean if that diverse viewpoint is then eventually overruled by higher courts?
[483] In other words, it's one thing to appoint people of color, but for diversity to actually take root and flourish, the entire ecosystem has to embrace the newcomers and their ideas.
[484] If it doesn't, diversity ends up being mostly about optics.
[485] This episode was produced by Jenny Schmidt, Rhaina Cohen, Parth Shah, and Maggie Penman.
[486] It was edited by Tara Boyle.
[487] Renee Klahr does our social media.
[488] We had original music in this episode from Ramtin Arablouei.
[489] For more Hidden Brain, please follow us on Facebook, Twitter, and Instagram.
[490] And if you know a friend who would like to hear our show, please tell them about Hidden Brain.
[491] I'm Shankar Vedantam.
[492] See you next week.