Hidden Brain
[0] This is Hidden Brain.
[1] I'm Shankar Vedantam.
[2] If you were to paint a portrait of the inner workings of your mind, what would it look like?
[3] Maybe you would use bright splashes of color to represent your most intense emotions.
[4] Shades of gray to reflect the complexity and nuance of your thoughts.
[5] Swirls and spirals to express the moments when you're lost in rumination.
[6] No one else could paint this picture.
[7] You know yourself better than anyone else, and you are the one most attuned to your inner rhythms, your fears and insecurities, your hopes and dreams.
[8] Yet, even though we may feel we are acutely aware of every corner of our mental landscapes, it turns out many aspects of our minds are hidden from us.
[9] It's almost like if you imagine a fork in the road and it just goes two different ways, there are just two different paths here.
There's the path that we use for self-judgment, and there's the path that we use for judging others.
[12] And in my view, the path that we use for judging others is we look at their actions.
[13] The path that we use for judging ourselves is we look inwards.
This week on Hidden Brain, we bring you the third installment of our Mind Reading 2.0 series.
[15] In previous episodes, we explored how we read other people's intentions and the social illusions that pervade everyday interpersonal relationships.
Today, in a favorite episode from our archives, we look at one of the most bewildering aspects of how we read minds.
[17] In this case, our own.
[18] On a daily basis, all of us evaluate others.
[19] We think about the claims of people who want to sell us something.
[20] We gauge the ideas of colleagues.
[21] We assess friends and family.
[22] We also regularly look into our own hearts and minds.
[23] We evaluate ourselves.
At Princeton University, psychologist Emily Pronin has studied why our minds come to very different conclusions about ourselves and others.
Emily Pronin, welcome to Hidden Brain.
[26] Thank you, Shankar.
So a few years ago, Emily, you conducted an experiment where you brought volunteers into a lab and you told them about a range of different biases, you know, biases like the halo effect where, you know, you see someone who's very beautiful and you assume this person must also be very intelligent, or a bias like confirmation bias, where, you know, we go looking for information that supports our pre-existing views.
[28] And you did something very interesting.
[29] You asked the volunteers whether they thought that they would fall prey to these biases.
[30] What did they tell you?
[31] We had students in a class so that they kind of all knew each other from being in the class together.
[32] And what we did is we described each bias just in a few sentences.
[33] We didn't use the word bias.
[34] We didn't want to make it sound like a negative thing so that people would say, that's bad.
[35] I don't do that.
[36] We just described it in neutral terms.
[37] Sometimes people do this.
[38] Do you do this?
[39] And what we found is that people said, oh, gee, other people do do that.
[40] You know, that's so great.
[41] You put that into words like that.
[42] I see that all the time.
[43] But me, I know I don't really do that.
[44] So what happened was people recognized the bias as something that people do, and they attributed it to other people.
[45] But they thought that they did it quite a bit less.
[46] And the same thing happens in so many different domains.
If you asked me, do you evaluate the news fairly?
[48] Are you a good judge of policy?
I'll tell you, of course I am, but I can see lots of biases in the people around me.
Emily, you call this the bias blind spot.
[50] What do you mean by the term?
[51] The reason why I came to call it a bias blind spot is that a blind spot refers to a situation where you can see something sort of all around you except in one place.
And so the blind spot is seeing the bias in yourself, because it turns out that people could readily recognize these biases all around them.
[54] Let's look at some specific domains where the bias blind spot affects us.
When it comes to ethics, we're all quick to see conflicts of interest in other people, but slow to see them when it comes to ourselves.
[56] It's such a beautiful example.
[57] So, you know, doctors and gifts from the pharmaceutical industry.
[58] So people have studied this.
[59] And doctors will say, I'm not influenced by gifts.
[60] And oftentimes the gifts are small, right?
[61] They're like you have a Pfizer pen.
[62] Sometimes the gifts are rather large.
[63] Like, we'd love to hear you come and give a talk on your research.
[64] You know, in the Caribbean, you know, we'll fly you over there, you know, in a private jet to give your talk.
And to the credit of the medical industry, I think, you know, they have really worked on trying to root this out because they recognized it as a problem.
So there are no longer free lunches for residents every day, you know, sponsored by various drug companies, as far as I understand.
But the point being, the doctors said that they were not influenced by these gifts, but that other doctors were.
[69] So it's a perfect example of a conflict of interest, not being recognized in self, but seen in others.
[70] The bias blind spot also affects how we think we are affected by marketing and how we think others are affected by marketing.
I want to play you a clip from an ad that I recently came across.
GLH means great-looking hair.
Just spray GLH on and it instantly covers your bald spot, leaving you with great-looking hair.
GLH is not a paint or a cover-up.
[75] It's an amazing powder that clings to the tiniest hairs on your head.
[76] Order GLH now for only...
[77] So that was an ad for spray -on hair.
[78] Now, I don't think I'm influenced by advertising, whether that's commercial advertising or political advertising, but I think other people are quite vulnerable to such persuasion.
Yeah, there's a phenomenon called the third-person effect, whereby people think that persuasive attempts have more of an impact on other people than themselves.
[80] So they say, oh, commercials, you know, political ads, you know, those things, I'm sort of immune to them.
[81] They don't influence me. Whereas people recognize it's a whole industry, you know, it's influencing other people, but we see others as more susceptible to these influences than ourselves.
[82] Yeah.
[83] How does this work in politics?
[84] When we evaluate our political opponents, how does this bias sort of play out in our evaluations, both of people on our side and people on the other side?
[85] Yeah, so that's a great question.
[86] And obviously we've all been thinking about it a lot recently.
[87] So I think there's a lot of things that are going on.
[88] One is, what do I believe are the roots of my political opinions and political beliefs?
[89] And people will swear that the roots of their political beliefs are just in a rational analysis of the issue, right?
[90] So I take the positions I do because those are the correct positions.
[91] If you analyze the issues, if you analyze the state of the country, if you think about what's best for the nation, these are the correct positions.
[92] But they don't view that as being the root to the positions of those on the other side.
So the other side is influenced by ideology, by self-interest, by prejudice, whatever it is.
You know, I was thinking about a study that came out some time ago. This was during the Obama presidency; gas prices were really high, and people were asking, how much is the president responsible for high gas prices or low gas prices?
[95] And what was interesting is that the same group had asked the question to citizens during the presidency of George W. Bush when also gas prices were high.
[96] And what's fascinating and perhaps unsurprising is, of course, when gas prices are high and there's a Republican in the White House, most Republicans think the president has very little control over gas prices and therefore should not be blamed for it.
[97] And Democrats think the president has a lot of control over gas prices and should be blamed for it.
[98] And the tables are exactly turned when you have a Democrat in the White House.
And so in both cases, people, in a very self-interested way, see the data that they have and interpret it in a way that aligns with their political beliefs.
[100] And of course, this is just one example.
[101] There must be hundreds of examples like this.
[102] That's right.
[103] And what's amazing is it's motivated reasoning.
[104] And both of those words are important, right?
[105] So it's motivated.
[106] I'm seeing things in a way that's consistent, right, with my motives, my prior beliefs.
[107] But it's also reasoning because they're not just saying, well, I'm just going to believe whatever makes my side look better.
[108] Done.
[109] Right?
[110] That would be just like pure motivation.
[111] There's reasoning going on.
So people are actually stopping to think, okay, well, what are the factors that influence gas prices, and who might be responsible for them?
[113] And what's going on on the global political stage, you know?
[114] And so if you reason it out, these things are so complex that you can find reasons for almost anything.
[115] Or at least for one of the two sides of the issue anyway.
[116] There is another curious dimension of the bias blind spot.
When we come up with positions on various issues, we're keenly aware of the nuances and subtleties of our opinions.
[118] But we don't extend the same respect to the views of our opponents.
[119] We very rarely say the views of the people who disagree with me are thoughtful and nuanced, right?
[120] I think that's right.
[121] I think that there's more of a tendency to sort of stereotype and caricaturize others and to recognize the nuance and complexity in our own views.
[122] And unfortunately, the political realm, I think, affords that, makes that even more likely, because people can't really express their own ambivalences and nuances, right?
[123] Because that's seen as sort of giving into the other side.
[124] So people do tend to portray themselves as sort of more clear and perhaps even more extreme in that respect as a result.
[125] We've known for a long time that our evaluations of ourselves are very different than our evaluations of others.
[126] The Bible asks why we notice a speck of dust in our brother's eye, but ignore the beam sticking out of our own eye.
[127] When we come back, the psychological quirk that produces the radically different judgments we make of ourselves and others.
[128] You're listening to Hidden Brain.
[129] I'm Shankar Vedantam.
[130] This is Hidden Brain.
I'm Shankar Vedantam.
[132] All of us find it remarkably easy to identify bias among other people, especially our opponents.
[133] And all of us find it maddeningly difficult to spot biases in ourselves.
Psychologist Emily Pronin has spent years studying this discrepancy in our perceptions, and she has found that much of the discrepancy comes down to the different yardsticks we use in judging ourselves and others.
[135] Emily, I'm not sure if you have watched the television show Veep, but on the show there's a character named Jonah Ryan who decides to run for president.
He's been advised that it's a bad look to be single, so he gets together with a woman who happens to be the daughter of a man his mom used to be married to.
So she is his step-sister.
In an interview, Jonah prefers to think of his fiancée as only his former step-sister.
So what would you say to someone who might ask, how can they marry their step-sibling?
[142] I'm not her brother.
[143] Nor have I ever been her brother.
[144] Right.
[145] And the only time anyone could ever say that would be for that one year.
[146] I mean, it's exactly what Woody Allen did and nobody thinks he's weird.
I mean, everybody just hates him because Antz wasn't as good as A Bug's Life.
[148] Exactly.
[149] So this is obviously a comedy show, Emily.
[150] But I'm wondering if you can just start by explaining when it comes to our judgments of other people, what is the yardstick that we use to evaluate that they are biased?
The yardstick that we use is, in one word, behavior.
Their actions.
[153] The process is, it's almost like if you imagine a fork in the road and it just goes two different ways.
[154] There are just two different paths here.
[155] There's the path that we use for self -judgment and there's the path that we use for judging others.
[156] And in my view, the path that we use for judging others is we look at their actions.
[157] The path that we use for judging ourselves is we look inwards.
[158] And when I say look inwards, I mean, we look to things like our thoughts, feelings, intentions, motives.
[159] So if the question is, did I marry my brother?
[160] You know, there's an action.
[161] There's a behavior, right?
[162] I did it or I didn't.
[163] And that's how other people will judge it.
[164] But in judging myself, I might look much more to my motives and my intentions.
[165] Am I someone who would intend to marry their brother?
[166] No, that's weird.
[167] I would never intend to do that.
[168] So I guess I didn't do it.
[169] And so when we are interacting with other people, what we see is them.
[170] We see their actions.
[171] We see their expressions.
[172] But when we experience ourselves, we don't really see that.
[173] Instead, what we perceive is what's inside our heads.
That's the information that we're flooded with.
That's the information that we can't escape: our thoughts and feelings and intentions.
[177] So that's what we give so much weight to.
[178] So one of the things that jumps out at me from what you're saying, Emily, is that our introspection, our access to our own thoughts and feelings, these are with us all the time.
[179] So it's almost, we don't actually have to ask ourselves the question, how do I evaluate myself?
[180] We automatically go to looking inward to our thoughts and feelings.
[181] When it comes to our evaluations of other people, in some ways, we don't have access to their thoughts and feelings.
[182] Those are hidden from us.
[183] And so we use what we have.
And all of this happens without any sort of conscious awareness that it's happening.
[185] Right.
So I don't realize I'm using one yardstick to evaluate your behavior and a different yardstick to evaluate mine.
[187] Right.
[188] I don't think that we really think about that explicitly.
What we use in any judgment, psychologists can tell you, is the information that's salient.
Psychologists like that term, salient.
The information that's available to us, whatever is fresh in our brains, is the information we use.
[193] And so it just so happens that for the self, the information that's fresh in our brains all the time is that stuff that we perceive to be in our brains, right?
[194] Our thoughts and feelings, all that stuff that's sort of just constantly there and that we're sort of constantly aware of.
It's not just that we use different yardsticks in evaluating ourselves and others.
Each of those yardsticks is flawed, and flawed in a different way.
[197] When it comes to evaluating our own behavior through introspection, we imagine that we can see all our motives and intentions, that they are accessible to us.
[198] But it turns out, that is not the case.
There's a bunch of stuff that goes on in our brain that we're not aware of, right?
[200] We're not aware of the sources of our beliefs.
[201] We're not aware of, you know, if I go to the ice cream shop and I choose the chocolate ice cream over the vanilla, I am aware that that was my choice.
[202] But I'm not aware why that was my choice.
[203] That's happening in the brain without my having access to it.
[204] But we sometimes forget that.
[205] So we think that we can look inwards and find out everything, right?
[206] So we forget, for example, that a lot of prejudice and stereotyping happens unconsciously.
[207] And that means I can't look inwards to find it.
[208] I'm not necessarily going to have those racist intentions.
[209] That's something that people talk about a lot now.
So it's not the case that just because we have access to all this information in our heads, it's always going to be probative for making whatever judgment we need to make.
I remember speaking some years ago with the researcher Michael Tesler.
[213] He ran an interesting experiment with Republicans and Democrats, and this is back when the country was debating the Affordable Care Act or Obamacare.
And what Michael Tesler did is he presented volunteers with details of the Affordable Care Act, but he told some of them that the plan had been put forward by President Bill Clinton, a Democratic president who was white, and he told other volunteers the plan was from Barack Obama, a Democratic president who was black.
[215] So same plan, same details, both put forward by a Democratic president, except that one was white and one president was black.
And what he found was that both liberals and conservatives were subtly biased by their feelings about the racial identity of the president.
White racial liberals become more supportive of a policy when it's framed as Barack Obama's than when it's framed as Bill Clinton's.
[218] But white racial conservatives become less supportive of that policy.
[219] And Emily, I feel like this is speaking to what you just said.
[220] If you ask liberals and conservatives, how are you evaluating this policy?
[221] They will dive into the details and say, here's why I like the policy or here's why I don't like the policy.
[222] And neither will say my affinity or my aversion to someone from a different race might be shaping my view on something like the Affordable Care Act.
[223] Yeah, I think that's exactly right, and that sounds like a great study.
[224] And it's not that the subjects were lying.
[225] They were saying what they believed to be the case.
They assumed, incorrectly, that if the race of the president had impacted their judgment, they would know it.
Now, an outsider might be able to notice this pattern much more quickly, because they wouldn't be relying on their intentions.
They'd just be looking at what the person did.
[229] So sometimes when we look at behavior, things can be a little bit easier to see.
[230] Yeah.
Some years ago, Emily, you came up with a theory of why our introspections are unreliable.
[232] You called it the introspection illusion.
[233] What is the introspection illusion?
Although we have access to our introspections, which is sort of what it means to be a conscious person, you know your thoughts and your feelings and your motives and your intentions, and they're there all the time in your head, we have some illusions about what that can do for us.
[235] So we think that that gives us sort of supreme self -knowledge, sort of that we can know all sorts of things about ourselves because we have access to this information.
[236] We also think that our behavior is less important than knowing what's inside of our heads.
[237] In the case of ourselves, it's our intentions that are so important to know.
You know, I was speaking some years ago with Mahzarin Banaji, the psychologist at Harvard, and she said something really interesting to me. She said, you know, if you have a problem with your heart, you might go to a cardiologist to get it checked out.
[239] And when the cardiologist says, here's what's wrong with your heart, you're inclined to believe her because you think the cardiologist knows more than you do about your heart.
You don't tell the cardiologist, it's my heart, therefore I must be the expert on my heart because it belongs to me. But Mahzarin Banaji was saying the same thing doesn't happen with our minds.
[241] It's very hard when an expert comes along and says, let me explain to you how your mind works because at some level all of us feel like we are experts in our own minds.
[242] And that's partly, I think, connected to what you're calling the introspection illusion.
[243] It feels like our mental worlds are so rich and we spend so much time in them that it feels in some ways we understand how they work.
[244] And in some ways that could be an illusion.
[245] That's right.
[246] And look, it's embedded in the history of our own field, you know, in the very early days of psychology, when people wanted to understand the mind, they had people come into the laboratory, sit in a room and introspect.
[247] And they said, this is how we're going to learn how the mind works.
[248] And then, you know, there was a huge backlash.
[249] The behaviorists came along and they said, we cannot learn about how the mind works by asking people to report to us what's going on in their mind.
[250] So they said, we're getting rid of all of that.
[251] So we're just going to focus on behavior because that is observable.
And then, you know, we had sort of a third wave of cognitive psychology, where we realized that there were objective strategies, empirical methods, that we could use to study the mind that did not rely on people telling us what was going on in their own minds.
[254] And I think part of this also rests on the idea that if everything that happens in our mind was actually accessible to conscious introspection, we might, in fact, if we were very honest and very diligent, be able to look inside our minds and see everything.
[255] But in fact, if much of our minds actually are operating outside of our conscious awareness, you know, what our minds are doing is simply not accessible to us through introspection.
[256] Right, exactly.
[257] You brought up the halo effect when we were talking earlier.
And the famous experiment on the halo effect comes from Tim Wilson and Richard Nisbett back in 1977.
And they had people watch a video in which a professor with a quote-unquote foreign accent (I'm not sure what his accent was) was talking.
[260] And then they asked subjects in the experiment to evaluate this professor.
[261] And half of the subjects had seen the professor in the video acting cold, not very likable.
[262] And the other half of subjects had seen him acting warm, very likable.
[263] And afterwards, they asked the subjects what they thought of him.
[264] And what people said in the unlikable professor condition was that they didn't like his accent when they were asked about his accent.
[265] In the likable condition, they did like his accent.
[266] So what happened was the likeability of the professor, which was manipulated by the experimenters, influenced how much people thought the accent was likable.
[267] But they didn't realize that this had happened at all.
[268] They had no access to what had influenced their perception of the accent.
[269] Now, we knew as experimenters, right, because we saw those who got the unlikable professor thought it was a bad accent, and those who got the likable professor thought it was a good accent.
[270] So the experimenters could say, gee, we know how they came to this conclusion.
But the subject didn't know that.
The subject just looked inwards and said, that's a bunch of hooey.
Why would I evaluate someone's accent, how likable their accent is, based on how nice they were?
[274] That makes no sense.
[275] They had no awareness of having done that.
[276] It occurred unconsciously, and they denied it.
[277] You can see the same thing play out on a much larger scale when it comes to politics.
In the 2020 U.S. presidential election, the vast majority of Republicans voted for the Republican candidate, and the vast majority of Democrats voted for the Democratic candidate.
[279] Many people, if you ask them, why did you choose the candidate you voted for, they would give you a nuanced explanation of why candidate A was better than candidate B. But imagine that you're a neutral observer who just landed in the United States from Mars.
[280] You might look at the same results and say, well, people are just voting for their party's candidate.
[281] It doesn't really make a difference who that person is.
[282] Most people tend to dislike such comments because they suggest that our choices are less deliberate than we think.
[283] They prompt us to search for justifications that prove that we are, in fact, in charge of our own minds.
[284] That's right.
We think we're being rational, that we're choosing the candidate based on reasoning, but really what we're doing is rationalizing.
[286] Really, actually, there's other factors that have determined which candidate we prefer.
[287] And then after the fact, we rationalize it by coming up with what seem like rational reasons.
And it feels like our evaluations of ourselves and others, you know, are shaped by these dual forces.
[289] On the one hand, we ascribe greater weight to our own introspection than maybe we should.
[290] But on the other, we discount the introspection of other people.
[291] So in other words, I don't think that my political opponents have actually thought very carefully about how they've chosen their course of action.
I can sort of dismiss them as being easily led, as being sheep. Even as we overvalue our own introspection, we undervalue the internal thought processes of other people.
[293] Yes.
And we've even done experiments where we say, look, maybe the reason why people undervalue others' thought processes is that they just don't have access to them, right?
[295] You know, you have such rich access to what's going on in your head.
So we will give people an entire think-aloud protocol, meaning that before the subject made their decision, they thought aloud into a tape recorder.
They just dumped all their thoughts, and we give that to another subject to hear, and they still disregard it.
So we actually did a study with political beliefs, and this was with Jonah Berger and Sarah Molouki. We made up various California propositions.
[299] One was about increasing the maximum cargo size at the port of Los Angeles.
[300] And then we told people their party's position on them.
[301] So the Democrats support this or the Republicans support this.
And we asked them to choose their position, to vote, essentially.
[303] And we asked them, were they influenced by their party's position?
[304] And subjects said, no, I wasn't influenced by that.
[305] I just evaluated the issue.
[306] Before they said what position they would take, we had them list all their thoughts.
[307] They dumped out all of their thoughts on the issue.
[308] And then we gave that to another person to also evaluate.
[309] And the other person didn't care about any of that.
[310] So people said, I went with my thoughts.
[311] I evaluated the issue.
[312] But the outsider said, oh, gee, I don't need to see all those thoughts.
[313] That's not relevant.
[314] You're a Democrat.
You went with the Democratic position.
[316] Done.
[317] Simple.
[318] One of the other ideas that's connected to your work on how we overvalue the things happening in our minds and pay less attention to the things happening in other people's minds is a phenomenon called naive realism.
[319] Can you talk about what that is and how it connects to your work?
[320] So much of what we're talking about today, I think, is really rooted in the basic functioning of our brains.
[321] It's just how we are designed, right?
[322] So, for example, we have eyes in our head and those lead us to see other people's behaviors, but the eyes don't look inwards.
[323] And there's just sort of basic brain architecture that determines so much.
[324] And naive realism has to do with this.
[325] It has to do with the idea that there are some basic and inescapable beliefs.
[326] And one is that I believe that I see the world as it is in objective reality.
[327] And as a result, I think that others will see the world the same as I do.
[328] And that when others don't, I have to explain it.
[329] And the way that I tend to explain it is either by saying, they don't understand, I need to educate them, or failing that, saying there's something wrong with them.
[330] Either they're stupid or they're biased.
[331] You can see naive realism at work in everyday interactions.
[332] Take, for example, something that the comedian George Carlin observed.
Have you ever noticed when you're driving that anyone who's driving slower than you is an idiot?
[334] And anyone driving faster than you is a maniac.
[335] Yes, I love that quote.
It reminds me of one time I was in the kitchen preparing some food, and my seven-year-old was in the playroom with my father, his grandfather, and they were looking at different cars in a magazine.
[337] And my son kept preferring the big SUVs.
You know, and my father preferred the little boxy-type cars.
[339] And at some point, my father said to him, I know it's a matter of taste, but your taste is stupid.
[340] That's a great story.
[341] That's a wonderful story.
[342] And I feel it speaks to something that I think is really important.
[343] You know, parents and teachers are constantly trying to teach this lesson.
You know, don't jump to conclusions, slow down.
[345] Don't assume you know what's happening in someone else's head.
And yet it's so hard to remember to practice these lessons, right?
[348] I mean, as a parent and as a teacher yourself, do you sometimes go, you know, I'm doing the exact same thing I tell my kids not to do?
[349] Yes, because these things are so automatic and so natural.
[350] These are tendencies that we have to override in ourselves.
[351] We can't eliminate them, right?
Because the tendency to think that you see the world as it is, in objective reality, and therefore, if you like the race car better than the SUV, that it truly is better, is sort of inescapable.
And when children do it, when they're young, they don't even realize that there's a distinction between their perception and reality.
[354] As we get older, we come to recognize, right?
[355] Oh, wait a second, that's a matter of taste, right?
[356] At some level, we come to realize, oh, no, no, there's different perspectives and it's a matter of taste.
[357] But that initial belief that we have from childhood that my perception is reality doesn't really go away.
And so it does actually feel that the car we prefer is the better one.
[359] There are lots of implications that stem from what Emily calls this basic architecture of the brain.
[360] Here is one that should be familiar to all of us.
If I'm late for a meeting, my mind is chock-full of all the reasons I'm late.
[362] Traffic was terrible, I had a child care crisis, and so on.
[363] But if someone else is late for a meeting, I don't have access to all that stuff happening inside their heads.
[364] It's easy for me to think of them as just being irresponsible or careless.
[365] Psychologists call this the fundamental attribution error.
[366] Another implication of this work has to do with the phenomenon of magical thinking.
[367] Magical thinking involves the idea that our thoughts could somehow influence the world around us.
[368] So, for example, if I think ill thoughts about you, can that give you a headache?
[369] Or if I think positive thoughts about my favorite player on the team, will that help them score a goal?
[370] And we found that, in fact, this was the case.
[371] So, for example, we had people think evil thoughts about someone else in the experiment.
[372] We said we're interested in whether you could place a hex on the person, and then they stuck pins in a voodoo doll.
[373] And then the other person reported a headache because they worked for us, and we told them to.
[374] Right.
[375] And what happened was if you were told to think ill about the other person before putting those pins in, you were told, just take a minute and think of something terrible.
[376] Just think of the worst thing you can imagine happening to this person.
[377] And then you stuck the pins in the doll.
[378] Then you felt like you caused the headache and you felt bad.
[379] And we found the same thing with basketball.
[380] Before a big university basketball game, we had people think about the different players and think about how each one of them could contribute to the game and how would they help their teams score well.
[381] And then we asked them after the game how much impact they felt that their thoughts had on the score of the game.
[382] So they thought that they'd impacted the game when they had thought about the players doing well.
[383] And the way that it's related is that it again involves putting way too much weight on what's going on inside our heads, because we're basically saying that what's going on inside my head could give someone else a headache, or that what's going on inside my head when I sit in the stands of a basketball game could influence the players' scores, how many baskets they shot.
[384] I mean, if it's a critical moment in the game, I would feel terrible getting up to leave the room and get some popcorn.
[385] I feel like, I can't let my team down.
[386] How could I do that to them?
[387] The introspection illusion, the bias blind spot, and naive realism have profound consequences in our daily lives.
[388] They do more than shape our thinking in basketball games.
[389] They shape life and death decisions and choices to go to war.
[390] That's when we come back.
[391] You're listening to Hidden Brain.
[392] I'm Shankar Vedantam.
[393] This is Hidden Brain.
[394] I'm Shankar Vedantam.
[395] Psychologist Emily Pronin has found we judge ourselves very differently from the way we judge others.
[396] This is because we use different yardsticks while doing those two things.
[397] We evaluate others based on their behavior, but we evaluate our own actions using introspection, and it turns out introspection is not a useful guide to understanding our own minds.
[398] Emily, I want to talk about some of the implications of this work and the ways in which it plays out in the real world, and I want to start with an example of something that can seem trivial, but that produces widespread conflict across the United States.
[399] I'm going to let Whoopi Goldberg explain.
[400] A survey found that one of the most common arguments this time of year in households across America is what temperature to put the thermostat at.
[401] Everybody relates.
[402] Oh, my gosh.
[403] People are like, uh-huh.
[404] There's a lot of stuff we should be talking about because it's on the list.
[405] But this interests me because I feel like everybody deals with this.
[406] So, Emily, I want to draw attention to the fact that this is a topic which you bring up.
[407] Almost everyone has an opinion about it, and the opinion is often heated.
[408] Tell me how it connects to the conversation we're having about how we think about our minds, other people's minds, and the judgments we arrive at.
[409] I just love the idea of the thermostat wars.
[410] It's just, it's so real.
[411] And I think it goes back to that quote, I know it's a matter of taste, but your taste is stupid.
[412] Because essentially, if I think that the temperature should be set to 73 and you think it should be set to 68, I do realize that this is a matter of taste, right?
[413] That there's no right answer here.
[414] But at another level, I actually think that the temperature that I want it to be at is the correct one.
[415] And it's like that George Carlin quote, right?
[416] If you want it to be hotter than me, you're a little soft and ridiculous, you know, and you're wasting a lot of energy.
[417] If you want it to be colder than me, you've got to be kidding.
[418] Do you really need to be that ascetic and suffer like that?
[419] We can turn up the heat a little bit more, right?
[420] So we think it's a matter of taste, but we also think we're right.
[421] And I've done some research with Nate Sheik and Shane Blackman, where we show this with paintings, sort of people say, oh, paintings, that's a matter of taste, until someone disagrees with them about which are the nice paintings and which are the bad paintings.
[422] And then all of a sudden, they say that that person is wrong.
[423] Tell me a little bit more about the art study.
[424] I'm fascinated by the examples that you used and what you found.
[425] So Shane collected some images of paintings from art history books.
[427] So these were paintings by famous artists.
[428] And they were in major museums.
[429] And they were arranged from abstract to portraits.
[430] And we would show them to a subject and ask them to rate which ones they thought were truly great and which ones they thought were overrated.
[431] And then we showed them a cover story for this of another subject who had supposedly done the same task.
[432] But that subject totally disagreed with them.
[433] So if I thought it was great, they thought it was overrated and vice versa.
[434] And then they had to evaluate this other person.
[435] And although they said that opinions on art were a matter of taste, when they saw this person who disagreed with them, they actually thought that the person was wrong and had been swayed by improper influences, because otherwise the person surely would have agreed with them.
[436] And we've done studies with chefs.
[437] Kobe Sissarski, my student, did a study with chefs.
[438] and chefs showed this exact phenomenon, right?
[439] There's an objectively correct amount that the meat should be cooked, that the pasta should be cooked, you know, how much it should be salted, and those who do it the other way are wrong.
[440] I want to point to something that you said earlier that I think might connect with this, which is in some ways when we think about our own subjective conclusions, when we think about a painting or how long pasta should be cooked, we're actually not thinking of this as being subjective.
[441] It genuinely feels as if we have amassed a whole bunch of objective data and arrived at a conclusion that in some ways feels objective.
[442] So we might say, yes, my taste in art and music is subjective, but it actually feels like it's not, that it's actually objective.
[443] And part of it is that when things come to us through the senses, they come so quickly that we do not feel the operation of the mind being involved.
[444] I know that I prefer the chocolate ice cream to the mint chip ice cream, but if you ask me why, I simply don't have access to that.
[445] And so it doesn't feel like there's been all these intervening processes that could have biased it.
[446] So when the piece of food hits my mouth and I think that it has too much salt, I don't have access to any brain processes that are influencing that judgment.
[447] It's just, yeah, that's too salty.
[448] And it's an immediate feeling.
[449] And because it is so immediate, it's hard to imagine that it could have been biased by anything.
[450] I'm wondering if this is connected in some ways, Emily, to other work that you have done that looks at how we perform during interviews, but also how we judge other people during interviews.
[451] So we all think that we can sit before someone for half an hour, talk to them, and get a pretty good sense of whether this person is a good fit for a job.
[452] But if someone were to come along and say, oh, we can talk to you for half an hour and figure out if you are a good fit for the job, we'd say that's clearly inadequate, because I'm so much more complex than anything that can be ascertained in 30 minutes.
[453] Right.
[454] So there's a term the interview illusion, which I did not coin, and it was about this idea that it's an illusion that you can tell so much from an interview.
[455] If I want to know whether you're going to be a good bricklayer, it's probably a lot more valuable for me to watch you lay bricks, and for me to ask your five prior bosses how well you laid bricks, than for me to sit down and interview you about how good of a bricklayer you are.
[456] And yet, people love interviews.
[457] And even psychologists, we do job interviews.
[458] Why do we do that?
[459] We could just ask the people who've worked with the person.
[460] We could ask their advisors to write letters, and we could read their work, but we do interviews as well, and we put a lot of weight on them.
[461] And the work that I did, we actually had people come into the laboratory in pairs.
[462] These were students who'd never met each other, and they talked to each other for a half an hour.
[463] And we found that at the end of the half an hour, they felt like they'd really come to know the other person, but that the other person had not really come to know them.
[464] So you could only get a small understanding and a small glimpse of who I am from that conversation, but I've got the whole you.
[465] And part of that has to do with the fact that I know all the stuff about me that you didn't find out from that conversation.
[466] I'm aware of all the stuff I didn't say, all the stuff I said that maybe was misleading about who I really am.
[467] I've got all that, but I don't have all that about you.
[468] You can see how these biases might play out in the context not just of interpersonal conflict, but geopolitical conflict.
[469] If you think you see the world accurately and I don't, if you try to set me straight and find you can't change my views, what are you to conclude?
[470] The simplest explanation is that I can't be trusted.
[471] There's no point trying to understand me or reason with me or negotiate with me because I must be either stupid or evil.
[472] It's not just people's actions that influence how we want to respond to them.
[473] It's also our beliefs about what those actions stem from.
[474] And if we believe that individuals are biased, that their mental processes are biased, then we don't believe that it makes sense to try to reason with them.
[475] Is there any evidence that teaching people about the ways in which our minds work, that it actually changes the way they can actually perceive the conflict and perhaps respond differently?
[476] When people learn about these different biases, they're initially very optimistic that what we need to do is educate people about the biases.
[477] So if I just tell people, like my students, here's the different biases that people engage in, that that should solve the problem.
[478] They'll say, gee, I didn't know about all those biases.
[479] Yeah.
[480] And the idea is now that I know about them, I won't do them.
[481] But as you know, that's not how it works.
[482] Because what happens is they say, gee, I didn't have words for all those biases.
[483] But now that you've told me the words, they give me a great vocabulary for describing what all the people around me keep doing.
[484] So that doesn't work.
[485] That doesn't work.
[486] But what Matthew Coogler and I tried was instead to educate people about the importance of unconscious processes.
[487] And we taught people about how a lot of our judgments are rooted in things we don't have access to, so that a lot of things are automatic and a lot of things are biased.
[489] And so we tried to educate them about essentially the introspection illusion, the illusion that we can have access to all these things and the fact that instead much of it is occurring automatically and is biased.
[490] And then we asked people to complete our usual bias blind spot measure where they read about various biases.
[491] And then people no longer showed the blind spot.
[492] So once they understood about the operation of the unconscious and how these things happen automatically, they no longer claimed to be less biased than others.
[493] So then they said, gee, maybe I am biased.
[494] Maybe looking inwards and not seeing bias is not the best way to conclude whether I'm biased or not.
[495] Can you give me a concrete example of a time when you used your own research to change how you thought about something important or to change your own behavior?
[496] I don't know if I can give you a single important example, but I think that as a parent, I find it happening with me all the time that, you know, I'm talking to my kids about someone in our lives who's done something that sort of has, you know, irritated us in some way.
[497] You know, somebody canceled on a plan that we had or, you know, said something that, you know, was insensitive.
[498] And I find myself, you know, doing that thing, right, where I'm about to jump to the fundamental attribution error, and I'm about to say, you know, gee, that was mean or inconsiderate or lazy or whatever, and then I've got my kids there with me, and oh, this is not what I want to teach my children.
[499] And so I say, wait a second, you know, I know it might seem like the person was being inconsiderate, but maybe they're having a really hard day.
[500] I try to sort of teach them and to remind myself to think about people's circumstances instead of jumping right away to that dispositional attribution.
[501] You know, Emily, I'm thinking about your work, I'm realizing that many miscommunications might happen because our thoughts seem so clear to us, but we do a terrible job communicating those thoughts to others.
[502] Things in our minds seem so clear and loom so large to us that we somehow assume they must be clear to others as well.
[503] Yeah, it's interesting.
[504] I think of the example of breakups, romantic breakups, and people sort of, they want to be kind and they want to be considerate and they want to do it nicely.
[505] And the other person is left totally confused and says, oh, you know, yeah, I think we're just taking a break for a few days, or, you know, we just hit a rough patch.
[506] And the person doing the breaking up thinks they have successfully broken up and ended the relationship.
[507] But we forget that our intentions, what we're intending to do is to break up, to close the door, but to do it in a very kind and considerate way.
[508] And what the other person thinks is that you've sent a bunch of mixed messages and you're leaving the door open.
[509] So this is just one of so many examples where we don't recognize our lack of transparency and what we haven't communicated because it's so obvious to us.
[510] I don't know if you're a fan of the show Parks and Recreation, but there was an incident on Parks and Rec that basically is almost exactly the same lines where one party is breaking up with another, but they break up so politely and so kindly that the other party thinks, great, we've had a wonderful chat.
[511] The relationship is now at a higher level than it was before, and one party thinks they've broken up and the other party thinks, wow, we're really in a good place now.
[512] So you're leaving soon?
[513] Back to Indianapolis, briefly, and then on to a town called Snorling, Indiana, for several months.
[514] Never heard of it.
[515] It's quite small.
[516] The cows outnumber the people 40 to 1.
[517] And then after Chris moves, Ann tracks him down, storms into his house, and accuses him of cheating on her.
[518] Oh, God.
[519] I'm so sorry, honey.
[520] I'm so embarrassed.
[521] I was scared that you were cheating on me. No, I'm not cheating on you.
[522] But I'm also not dating you.
[523] We broke up last week.
[524] Yeah, I'm laughing, but it's actually sad.
[525] I mean, it actually causes a lot of suffering in reality.
[526] Yeah.
[527] We've just been through a really bruising political year, Emily.
[528] And the country as a whole has been very divided.
[529] And a lot of people are really asking, is it possible for us to come together as a nation after a very bitter political fight?
[530] I'm wondering if you were to give advice to the nation based on the work that you have done, what would that advice look and sound like?
[531] I mean, I think one thing I would say is that if you were judging yourself by all your positive intentions, right, your good feelings, you know, if your intentions are that you want the country to be in a better place, that you want people to thrive, don't assume that others' intentions are different from your own.
[532] And if you put a lot of weight on your intentions, those others' intentions would deserve just that same amount of weight. So we owe some charity in judging others' behavior by giving some weight to their intentions, and we should not assume that their intentions are so different from our own. And if we can start with that, and start with the charity of trying to find the positive intentions in others, then maybe there is some hope.
[533] But it's so hard to do, especially when things are so divided and feel so divided.
[534] Psychologist Emily Pronin teaches at Princeton University.
[535] Emily, thanks for joining me today on Hidden Brain.
[536] Thank you so much for having me. Hidden Brain is produced by Hidden Brain Media.
[537] Our production team includes Bridget McCarthy, Annie Murphy Paul, Kristen Wong, Laura Querell, Ryan Katz, Autumn Barnes, and Andrew Chadwick.
[538] Tara Boyle is our executive producer.
[539] I'm Hidden Brain's executive editor.
[540] Our Unsung Hero today is HR specialist Max Lenowitz.
[541] When we launched Hidden Brain Media, Max was an onboarding manager at JustWorks, the company that handles our payroll.
[542] Max worked closely with us as we were launching our business, and he patiently answered our many questions about how to pay employees and track things like vacation and sick time.
[543] His good cheer made a busy time feel less stressful.
[544] Thank you, Max, for helping us to find our feet.
[545] For more of our work, please be sure to subscribe to our newsletter.
[546] Every week, we feature interesting research on human behavior and a brain teaser.
[547] Plus, we give you a look at what's coming up on both Hidden Brain and our new show, My Unsung Hero.
[548] You can subscribe at news.hiddenbrain.org.
[549] That's N-E-W-S dot hiddenbrain.org.
[550] Next week in our mind-reading series, we look at how many of us underestimate the goodness in human nature.
[551] Guess what?
[552] People aren't entirely selfish.
[553] Economists have to learn that lesson, too.
[554] People are not entirely selfish.
[555] We actually care a lot about others.
[556] I'm Shankar Vedantam.
[557] See you soon.