Hidden Brain XX
[0] This is Hidden Brain.
[1] I'm Shankar Vedantam.
[2] In 2012, Miranda Dinda had a lot on her plate.
[3] She was 18.
[4] All her friends were getting ready for college and busy being teenagers.
[5] But Miranda was on a different path.
[6] She was pregnant.
[7] So I was very nervous and I felt really uneasy but excited at the same time.
[8] To calm her anxieties, Miranda did some research on childbirth.
[9] I decided I wanted a home birth.
[10] For that, she would need a midwife.
[11] She searched for months for someone who could come to her apartment in rural Pennsylvania.
[12] Eventually, she found a woman who seemed perfect.
[13] We'll call her M, her first initial, to protect her privacy.
[14] From the very first moment they met, Miranda knew everything was going to be fine.
[15] She said she had been a home birth midwife for over a decade.
[16] She was no nonsense, the mother of eight.
[17] She was very open and honest and friendly.
[18] So she definitely seemed like someone who I could trust.
[19] About an hour into that first meeting, M brought up a question that struck Miranda as strange.
[20] She said, have you ever considered not vaccinating?
[21] Miranda hadn't.
[22] Vaccines had never crossed her mind.
[23] So I asked her, you know, I was very confused.
[24] I was like, what do you mean?
[25] Why would I consider that?
[26] M explained that years ago, something bad had happened after she vaccinated her first child.
[27] She went on to describe a progression of events that leads some parents to a powerful but faulty conclusion.
[28] M told Miranda that right after her son got his shots, his development regressed.
[29] One minute he was fine, the next, he was autistic.
[30] She said, the light had left his eyes.
[31] So M decided not to vaccinate her other children.
[32] And she very much implored me to do the same and to look into it.
[33] So I did.
[34] Miranda started on Google.
[35] It led her to Facebook groups.
[36] It's very easy to find them.
[37] So, yeah, even if you just Google, you know, support groups for parents who don't vaccinate, you will find a lot.
[38] The moms in these groups echoed what M had told her.
[39] And they welcomed me with open arms.
[40] And immediately they were just practically bombarding me with information, telling me, your midwife's right, this is why you shouldn't vaccinate, this is why I don't vaccinate, this is what happened to my child who I did vaccinate versus my child who I didn't vaccinate, things like that.
[41] Everyone was caring and attentive.
[42] They didn't just talk about vaccines.
[43] They talked about regular mom stuff, things that Miranda found hard to talk about with anyone else.
[44] Diapers and birth plans and hospitals.
[45] And midwives and breast pumps and stuff like that.
[46] Miranda trusted them.
[47] To me, it seemed so clear.
[48] It seemed like I had just found this secret information that only some people come across, and I thought, why would I not use this information?
[49] Why would I not use this to my benefit, to my child's benefit?
[50] So it did not take me very long at all before I was solidly saying, I will not vaccinate my child when she's born.
[51] Ramona, come close.
[52] She named her daughter Ramona after the song by Bob Dylan.
[53] Ramona as a newborn, she was very active.
[54] She was very bright.
[55] She was very happy.
[56] She was a great baby, honestly.
[57] She was a wonderful baby.
[58] When the doctor said it was time to vaccinate Ramona, Miranda was ready.
[59] She had a script she'd been practicing in her head for months.
[60] And I said, no thank you.
[61] I have decided that I do not want to vaccinate.
[62] Please respect my opinions.
[63] Thank you very much.
[64] For the next two years, Miranda continued to say no to vaccines.
[65] Occasionally, when she encountered information that conflicted with her decision, a pamphlet at the doctor's office, a website, she dismissed it.
[66] I just very quickly went, that's not true.
[67] I don't agree with that, and I moved on.
[68] At some point, though, her conviction started to waver.
[69] Those doting moms on Facebook, they had some weird beliefs.
[70] People denying that AIDS exists.
[71] People saying that the reason there's gay people is vaccines, on and on and on with really crazy conspiracy theories.
[72] And then it hit her.
[73] If she didn't believe those ideas, why was she trusting them on vaccines?
[74] And I stepped back.
[75] I stopped going to the Facebook group as much.
[76] And I decided I needed to look at this issue from a purely logical perspective.
[77] No emotion in it.
[78] No, oh my God, what if something happens to my baby?
[79] And I completely readdressed the issue all over again, pretty much from the start.
[80] Miranda started seeking out perspectives that the moms had urged her to avoid: information from the Centers for Disease Control and medical journals.
[81] I started reading all kinds of things, basically opening my mind to more than just vaccines are bad, to the other side of the coin.
[82] It didn't take her long to change her mind.
[83] She got Ramona vaccinated.
[84] Looking back, Miranda can't believe how easy it was to embrace beliefs that were false.
[85] And what I would say to someone who's about to become a new mom, especially if they're a young mom, is: don't try to confirm your own fears online.
[86] It is so, so easy to Google what if this happens and find something that's probably not true that confirms your fear, that confirms your anxieties.
[87] Don't do that.
[88] Miranda's story tells us a lot about the psychology of false beliefs, how they spread and how they persist even in the face of conflicting information.
[89] Today we look at how we rely on people we trust to shape what we believe and why emotion can be more powerful than facts.
[90] We start with neuroscientist Tali Sharot, a professor at University College London.
[91] She's the author of The Influential Mind: What the Brain Reveals About Our Power to Change Others.
[92] Tali is a mom, and so she understands on a personal level why Miranda was so worried about the safety of her child.
[93] A few years ago, when her baby was just a few weeks old, Tali was listening to a Republican presidential debate.
[94] You take this little beautiful baby and you pump, I mean, it looks just like it's meant for a horse, not for a child.
[95] Candidate Donald Trump had been asked a question about the safety of childhood vaccines.
[96] And we've had so many instances, people that worked for me just the other day, two years old, two and a half years old, a child, a beautiful child, went to have the vaccine and came back, and a week later got a tremendous fever, got very, very sick, now is autistic.
[98] So when I was listening to Trump at that debate, it really tapped into this fear that I had and the anxiety that I already had.
[99] And when he talked about this huge syringe, a horse-sized syringe that was going to go into the baby, in my mind I could imagine the syringe inserted into my small little child and all the bad things that could happen.
[101] And this was a very irrational reaction on my end because I know that there's not an actual link between autism and vaccines, but it's not enough to have the data.
[102] Ben Carson, Dr. Ben Carson was on the other end.
[103] Well, let me put it this way.
[104] There have been numerous studies, and they have not demonstrated that there's any correlation between vaccinations and autism.
[105] But that wasn't enough, because the data is not enough.
[106] And even if the data is based on very good science, it has to be communicated in a way that would really tap into people's needs, their desires.
[107] If people are afraid, we should address that.
[108] I'm curious, when you sort of contrasted, you know, the weight of the evidence on the one hand and this very powerful image of the horse syringe and your seven-week-old baby on the other hand, how did you talk yourself into trusting the data over that emotional image?
[110] What really helped is that I understood what was happening to me. Because this is what I study, I knew what my reaction was.
[111] I knew where it was coming from.
[112] I knew how it was going to affect me. And I think that awareness helped me to put it aside and say, okay, I know that I am anxious for the wrong reasons, and this is the action that I should take.
[113] It's a little bit like when you're on a plane and there's turbulence and you get scared.
[114] But telling yourself, I know that turbulence is not actually anything that's dangerous.
[115] I know the statistics on the safety of planes and so on.
[116] It helps.
[117] It helps people reduce their anxiety.
[118] But facts don't always relieve our anxieties.
[119] Sometimes they harden our views.
[120] Some time ago, Tali wanted to test how people update their beliefs when confronted with new information.
[122] So she presented statements to two kinds of people, those who believed that climate change was real and those who were deniers.
[123] She found that for both groups, when the statement confirmed what they already thought, this strengthened their beliefs.
[124] But when it challenged their views, they ignored it.
[125] Tali says it's because of a powerful phenomenon known as confirmation bias.
[126] Confirmation bias is our tendency to take in any kind of data that confirms our prior convictions and to disregard data that does not conform to what we already believe.
[127] And when we see data that doesn't conform to what we believe, what we do is we try to distance ourselves from it.
[128] We say, well, that data is not credible, right?
[129] It's not good evidence for what it's saying.
[130] So we're trying to reframe it, to discredit it.
[131] I give an example in my book where if someone comes in and says, I just saw pink elephants flying in the sky, and I have a very strong belief, obviously, that no pink elephants fly in the sky.
[132] I would then think that they're either delusional or they're lying, and there's good reason for me to believe that.
[133] So it's actually the correct approach to assess data in light of what you believe.
[134] There's four factors that determine whether we're going to change our beliefs.
[135] Our old belief, our confidence in that old belief, the new piece of data, and our confidence in that piece of data.
[136] And the further away the piece of data is from what you already believe, the less likely it is to change your belief.
[137] And on average, as you go about the world, that is not a bad approach.
[138] However, it also means that it's really hard to change false beliefs.
[139] So if someone holds a belief very strongly, but it is a false belief, it's very hard to change it with data.
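To make the four-factor idea concrete, here is a minimal sketch in Python of one way such an update could be modeled. The specific weighting rule, in which the influence of new evidence shrinks as it gets further from the prior belief, is our own illustrative assumption, not a formula from Sharot's research.

```python
def update_belief(prior, prior_confidence, evidence, evidence_confidence):
    """Toy model of belief updating.

    `prior` and `evidence` are positions on some belief scale (say, 0-100),
    and the two confidence values lie in (0, 1]. The further the evidence
    sits from the prior, the less weight it gets -- an illustrative stand-in
    for the pattern described in the interview, not Sharot's actual model.
    """
    distance = abs(evidence - prior)
    # Weight on the new data: grows with confidence in the data,
    # shrinks with confidence in the old belief and with distance.
    weight = evidence_confidence * (1 - prior_confidence) / (1 + distance / 10)
    return prior + weight * (evidence - prior)


# A strongly held belief barely moves, even when the evidence is far away:
print(update_belief(prior=10, prior_confidence=0.9, evidence=90, evidence_confidence=0.8))
# A weakly held belief moves much further toward nearby evidence:
print(update_belief(prior=40, prior_confidence=0.2, evidence=60, evidence_confidence=0.8))
```

Running it shows the pattern Tali describes: a confidently held belief barely budges, even when the new data is both credible and far from the prior.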
[140] So if data and facts don't work, what does?
[141] How do you get people to buy the truth?
[142] Well, you could try scaring them.
[143] The vast majority of Americans today do not feel safe.
[144] On Sunday, Americans woke up to a nightmare that's become mind -numbingly familiar.
[145] This could be the great Trojan horse of all time.
[146] Politicians use fear to get us to vote.
[147] TV programs use fear to get us to keep watching.
[148] Public health officials use fear to get us to quit smoking.
[150] I asked Tali whether fear might be an effective way to persuade people to change their minds and maybe even their behavior.
[151] Fear works in two situations.
[152] It works when people are already stressed out, and it also works when what you're trying to do is get someone not to do something, an inaction.
[153] For example, if you try to get someone not to vaccinate their kids, fear may work.
[154] If there's, you know, an apple that looks bad, I don't eat it.
[155] Fear is actually not such a good motivator for inducing action, while hope is a better motivator, on average, for motivating action.
[156] You talk about one study in your book where a hospital managed to get its workers to practice hand hygiene to get staff members to wash their hands regularly.
[157] But it turned out the most effective thing wasn't frightening the staff about the risks of transmitting infections, it was something else.
[158] So in a hospital on the East Coast, a camera was installed to see how often medical staff actually sanitized their hands before and after entering a patient's room.
[159] And the medical staff knew that the camera was installed and yet only one in ten medical staff sanitized their hands before and after entering a patient room.
[160] But then an intervention was introduced.
[161] An electronic board was put above each door, and it gave the medical staff positive feedback in real time.
[162] It showed them the percentage of medical staff that washed their hands in the current shift and the weekly rate as well.
[163] So anytime a member of the medical staff would wash their hands, the numbers would immediately go up and there would be positive feedback saying, you know, good job.
[164] And that affected the likelihood of people washing their hands significantly.
[165] It went up from 10% to 90%, and it stayed there.
[166] Instead of using the normal approach, instead of saying, you know, you have to wash your hands because otherwise you'll spread the disease, basically instead of warning them of all the bad things that can happen in the future, which actually results in inaction, they gave them positive feedback.
[168] I wrapped up my conversation with Tali by exploring another idea about how we might convince others to listen to views that conflict with their own.
[169] It had to do with a study of Princeton students who got their brain scanned while they listened to stirring emotional speeches.
[170] What they found was the brains of the different people listening to those speeches started synchronizing.
[171] So if we all listened, for example, to Kennedy's famous moon speech, our brains would likely look very much alike.
[172] Those who came before us made certain that this country... And this is not only in regions that are important for language and hearing, it's also in regions that are important for emotion, in regions that are important for what's known as theory of mind, our ability to think about what other people are thinking, in regions that are important for associations.
[173] And you try to think, well, what's common to all these influential speeches that can cause so many people's activity to synchronize?
[174] And one of the most important things is emotion.
[175] If the storyteller or the person giving the speech is able to elicit emotion in the other person, then he's actually having somewhat of a control on that person's state of mind.
[176] And this generation does not intend to founder in the backwash of the coming age of space.
[177] We mean to be a part of it.
[178] We mean to lead it.
[179] So think about it like this.
[180] If you're very sad and I'm telling you a joke, well, you're sad.
[181] So you're not going to perceive the joke as I perceive it when I'm happy.
[182] But if I'm able to first make you happy and then tell you the joke, well, then you perceive it more from my point of view.
[183] So by eliciting emotion, what you're able to do is change the perception of everything that comes after, to perceive information as the person who's giving the speech wants you to perceive it.
[184] So you can see how this coupling, this idea that the audience's mind and the speaker's mind are in some ways coupled together, you can see how this could potentially be used to spread good information.
[185] You know, you have a great teacher in high school, and you're captivated by the teacher, and you're being pulled along by the story the teacher is telling you, maybe about history or maybe about geography, but you can also see equally how the same thing can work in the opposite direction, that you could be listening to a demagogue, or you could be listening to somebody who has a very sort of captivating rhetorical style.
[187] And this person could also lead you astray in just the same way that the great teacher can lead you to knowledge and to positive things.
[188] Absolutely.
[189] All the different factors that affect whether we will be influenced by one person or ignore another person are the same whether the person has good intentions or bad intentions, right?
[190] The factors that affect whether you're influential can be: can you elicit emotion in the other person?
[191] Can you tell a story?
[192] Are you taking into account the state of mind of the person that's in front of you?
[193] Are you giving them data that conforms to their preconceived notions?
[194] All those factors that make one speech more influential than the other or more likely to create an impact can be used for good and can be used for bad.
[195] Tali Sharot, I want to thank you for joining me on Hidden Brain today.
[196] Thank you so much for having me. We've been exploring why we cling to beliefs.
[197] After the break, we look at how we spread them, from person to person to person.
[198] We'll talk to a mathematician about the power of social networks to circulate ideas.
[199] I'm Shankar Vedantam, and you're listening to Hidden Brain.
[200] This is NPR.
[201] This is Hidden Brain.
[202] I'm Shankar Vedantam.
[203] During the Middle Ages, word spread to Europe about a peculiar plant found in Asia.
[204] This plant had a long stalk with heavy pods attached.
[205] When you cut those pods open, inside you would find a tiny little lamb.
[206] Complete with flesh and wool, like a live animal lamb.
[207] This creature, half plant, half animal, came to be known as the vegetable lamb of Tartary.
[208] Various travel writers wrote that they had either heard about this or that they had eaten one of these lambs, and many of them said they had seen the kind of downy wool from the lamb.
[209] When these narratives made their way to Europe, people felt they had a view of a different world.
[210] Of course, no one in Europe had ever seen the vegetable lamb of Tartary because there was no such thing.
[211] But for centuries, people kept talking about this fantastical creature as if it were real.
[212] It even came up in scholarly works right next to pictures of oak trees and rabbits.
[213] If people hadn't been telling each other about these things, nobody would believe that there were vegetable lambs because nobody had ever seen them, right?
[214] And this is by no means a unique happening at that time.
[216] Of course, we would never fall for vegetable lambs.
[217] We live in an era of science, of evidence-based reasoning, of calm, cool analysis.
[218] But maybe there are vegetable lambs that persist even today, even among highly trained scientists, physicians, and researchers.
[219] Maybe there are spectacularly bad ideas that we haven't yet recognized as spectacularly bad.
[221] Cailin O'Connor is a philosopher and mathematician at the University of California, Irvine.
[222] She studies how information, both good and bad, can pass from person to person.
[223] She is co-author, with James Weatherall, of the book The Misinformation Age: How False Beliefs Spread.
[224] Cailin, welcome to Hidden Brain.
[225] Oh, thank you for having me. So one of the fundamental premises in your book is that human beings are extremely dependent on the opinions and knowledge of other people.
[226] And this is what creates channels for fake news to flourish and spread.
[227] Let's talk about this idea.
[228] Can you give me some sense of our dependence on what you call the testimony of others?
[229] So one reason we wrote this book is that we noticed that a lot of people thinking about fake news and false belief were thinking about problems with individual psychology.
[230] So the way we have biases in processing information, the fact that we're bad at probability.
[231] But if you think about the things you believe, almost every single belief you have has come from another person.
[232] And that's just where we get our beliefs because we're social animals.
[233] And that's really wonderful for us.
[234] That's why we have culture and technology.
[235] You know, that's how we went to the moon.
[236] But if you imagine this social spread of beliefs as opening a door: when you open a door for true beliefs to spread from person to person, you also open the door for false beliefs to spread from person to person.
[237] So it's this kind of double-sided coin.
[238] And what's interesting, of course, is that if you close the door, you close the door to both.
[239] And if you open the door, you open the door to both.
[240] That's right.
[241] So if you want to be social learners who can do the kinds of cultural things we can do, it has to be the case that you also have to have this channel by which you can spread falsehood and misinformation too.
[242] So as I was reading the book, I was reflecting on the things that I know or the things that I think I know.
[243] And I couldn't come up with a good answer for how I actually know that it's the earth that revolves around the sun and not the other way around.
[244] Yeah, that's right.
[245] 99% of the things you believe, you probably have no direct evidence of yourself.
[246] You have to trust other people to find those things out, get the evidence, and tell it to you.
[247] And so one thing that we talk a lot about in the book is the fact that we all have to ground our beliefs in social trust.
[248] So we have to decide what sources and what people we trust and therefore what beliefs we're going to take up, because there's just this problem where we cannot go verify everything that we learn directly; we need someone else to do that for us.
[250] We trust the historian who teaches us about Christopher Columbus.
[251] We trust the images from NASA showing how our solar system is organized.
[252] Now, we say we know Columbus was Italian and we know the earth revolves around the sun.
[253] But really what we mean to say is we trust the teacher and we trust NASA to tell us what is true.
[254] And the social trust and ability to spread beliefs, I mean, it's what's let humans do.
[256] You know, no other animal has this ability to sort of transfer ideas and knowledge dependably from person to person over generation and after generation to accumulate that knowledge.
[257] But you do just see sometimes very funny examples of false beliefs being spread in this same way.
[258] As a philosopher of science, Cailin studies how scientists communicate and share information.
[259] If we rely on scientists to tell us what to believe, who do they rely on?
[260] Turns out, other scientists.
[261] Now, showing that this is the case isn't easy.
[262] The process by which scientists change their minds on questions such as the spread of disease or the movement of objects through space is very complex.
[263] Studying this complex process can be mind-boggling.
[264] Say, for instance, Dr. A talks to Dr. B one day about her research.
[265] It also turns out that Dr. B is collaborating with Dr. C, who recently met Dr. D at a conference.
[266] Now, Dr. D frequently reads Dr. A's papers but doesn't know about Dr. C's research.
[267] A couple of years later, Dr. E reads what Dr. B has written about what Dr. A said in an article that Dr. C cited before Dr. F had even published her results.
[268] Empirically, it's hard to study scientists because things like theory change will happen over the course of 10 or 20 years and involve thousands and thousands of interactions between different scientists.
[270] You know, how would you ever study that?
[271] How would you ever study that?
[272] Because Cailin can't follow all these interactions, she recreates them in a computer simulation.
[273] You'd want to think of it as a really kind of simplified representation of what's happening in the real world.
[274] She creates groups of fictional scientists and she gives them a series of rules like who they can talk to and who they trust.
[275] These simulated scientists collect data and discuss their simulated research.
[276] Cailin sits back and watches what happens.
[277] So one thing we find sometimes in these models is that one agent or scientist will get data supporting the false belief.
[278] They'll share it with the entire community of scientists, and then everyone will come to all believe the false thing at once and sort of ignore a better theory.
[279] And part of what happens there is this social spread of knowledge and belief causing everyone to turn away from a good theory.
[280] So if you have almost too much social influence within a community, that can be really bad because everyone can stop gathering data since the entire community is exposed to the same spurious results.
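To give a sense of what these simulations look like under the hood, here is a minimal sketch in Python in the spirit of the network-epistemology models Cailin describes: a small community of agents testing a new theory against an old one and sharing results. The parameters, the Bayesian update rule, and the fully connected network are illustrative assumptions, not the actual models from The Misinformation Age.

```python
import random

N_AGENTS = 10            # fictional scientists
N_ROUNDS = 200
TRIALS_PER_ROUND = 10
P_OLD, P_NEW = 0.5, 0.6  # true success rates of the old and new theory


def bayes_update(credence, successes, trials):
    """Update the credence that the new theory is better, given binomial data."""
    def likelihood(p):
        return (p ** successes) * ((1 - p) ** (trials - successes))
    num = likelihood(P_NEW) * credence
    den = num + likelihood(P_OLD) * (1 - credence)
    return num / den


def run_simulation(fully_connected=True):
    # Each agent starts with a random credence that the new theory is better.
    credences = [random.random() for _ in range(N_AGENTS)]
    for _ in range(N_ROUNDS):
        results = []
        for i, c in enumerate(credences):
            if c > 0.5:  # only agents who favor the new theory test it
                successes = sum(random.random() < P_NEW for _ in range(TRIALS_PER_ROUND))
                results.append((i, successes))
        for i in range(N_AGENTS):
            for j, successes in results:
                # In a fully connected network everyone sees everyone's data;
                # otherwise agents only update on their own results.
                if fully_connected or i == j:
                    credences[i] = bayes_update(credences[i], successes, TRIALS_PER_ROUND)
    return sum(c > 0.5 for c in credences)  # how many end up believing the better theory


print("Believers in the better theory:", run_simulation())
```

Run it a few times: because everyone sees everyone else's data, one unlucky batch of results early on can pull the whole community below the threshold at once, after which no one tests the better theory again, which is the pattern described above.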
[281] If I hear you correctly, what you're saying is that psychological factors can have an effect, but you can have the spread of bad information even in the absence of biases or stupidity.
[282] Yeah, so one way that the models we look at are really useful is that you can kind of pare away things that are happening in the real world and see, well, suppose we didn't have any psychological biases, suppose we were perfectly rational, would we always come to the right answer in science and in our day-to-day lives? And you see that the answer is no. Coming up, case studies from the world of science, supposedly rational scientific communities that show how good information sometimes fails to spread and how bad information can metastasize.
[283] I'm Shankar Vedantam, and this is NPR.
[284] This is Hidden Brain.
[285] I'm Shankar Vedantam.
[286] Mathematician and philosopher Cailin O'Connor studies how information spreads through social networks.
[287] People who know and trust one another efficiently pass information back and forth and learn from one another.
[288] Unfortunately, the same rules of social trust can sometimes be a roadblock for the truth.
[289] Mary Wortley Montague learned this lesson hundreds of years ago.
[290] She was an English aristocrat who found herself living for a while in what is modern -day Turkey.
[291] So Mary Montague seems to have been really enchanted by Turkish culture.
[292] You know, she was coming from England and aristocratic culture there.
[293] In Turkey, she discovered these beautiful shopping centers, bathhouses; she seems to have been enchanted by bathhouses where there would be a lot of women sort of lounging naked, going in the hot water, drinking hot drinks together.
[295] Another thing that struck Mary about Turkish women, they used an innovative technique to limit the spread of smallpox.
[296] It was called variolation.
[297] What this involved, I mean, it's a bit like vaccination now: you would scratch maybe the arm of a patient and take pus from a smallpox pustule and put that pus into the scratch.
[299] So what would happen after you did that is that the patient would get a very mild smallpox infection.
[300] Some small percentage of patients would die, but many, many fewer than who would die of an actual smallpox infection.
[301] And after they had that more mild infection, they would actually be immune to smallpox.
[302] So this was practiced commonly in Turkey, basically unheard of in England at the time.
[303] Mary Montague had herself had smallpox and survived when she was younger.
[304] She had lost a brother to smallpox.
[305] And so when she encountered variolation in Turkey, she decided, well, you know, why don't we do this in England?
[306] She had her own son variolated, and she decided she was going to try to spread this practice in her native country.
[307] So when she returns to Britain, in some ways Mary Montague here functions like one of your agents in your computer models, because you have one cluster over here in Turkey and one cluster over here in Britain.
[308] And essentially, you have an agent walking over from Turkey to Britain.
[309] And Mary Montague says, here's this wonderful idea.
[310] We can limit the spread of smallpox in Britain.
[311] Britain, in fact, at the time was actually facing a smallpox crisis.
[312] How were her ideas received?
[313] So her ideas were not received very well when she first came back.
[314] One thing we talk a lot about in the book is that almost everyone has what you might call a conformist bias.
[315] We don't like to publicly state things that are different from the people in our social networks.
[316] We don't like to have beliefs that are different from the people around us.
[317] It's somehow very socially uncomfortable to do that.
[318] And we don't like our actions to not conform with the people who we know and love.
[319] So when she got back to England, you know, it was already the case that all these physicians in England didn't believe in variolation.
[320] They thought this was a crazy idea.
[321] And none of them were going to stand out from the pack and say, yeah, I'm the person who's going to try this or going to believe that this practice works, because they were all busy conforming with each other.
[322] And of course, these ideas were coming from another country, a country with very different cultural practices that seemed in some ways very foreign.
[323] The idea and the country itself seemed very foreign.
[324] That's right.
[325] So it's not just that it's a weird new idea that none of them believe in their kind of in group.
[326] It's also that it's coming from Turkey, and furthermore, it's coming from women in Turkey, so it was a practice mostly done by women, and a woman is bringing it to England as well.
[327] So they also don't really trust her as a woman and someone who's not a physician.
[328] So social trust is a really important aspect in understanding how people form beliefs.
[329] Because we can't go out and figure out ourselves whether the things people tell us are true, usually, we just always have to decide who to trust.
[330] And people have little shortcuts in how they do this.
[331] They tend to trust those who are more like them.
[332] They also tend to trust those who share beliefs and values and practices with them.
[333] So, for example, if you are a physician, you might tend to trust a physician.
[334] If you believe in homeopathy, you might tend to trust someone who believes in homeopathy.
[335] We all use these kinds of tricks.
[336] So what we saw in the variolation case with Mary Montague, the physicians aren't going to trust this woman who doesn't share their beliefs and practices, who isn't much like them.
[337] Now you could argue that the physicians who rejected Mary Montague's ideas were not behaving like real scientists.
[338] They weren't being dispassionate.
[339] They weren't being objective.
[340] They were bringing psychological biases into the picture.
[341] Sexism, xenophobia, tribalism.
[342] In the real world, misinformation spreads because of some combination of network effects and psychological and cognitive biases.
[343] You see the same thing in the case of the Hungarian physician, Ignaz Semmelweis.
[344] He was an insider, a man, and a doctor.
[345] He even had the assistance of scientific evidence to support his claims.
[347] But it turned out, even these were not enough to overcome the barriers that confront the truth.
[348] Ignaz Semmelweis was a physician living in Vienna.
[349] He was put in charge of this clinic, the first obstetrical clinic in Vienna.
[350] Next door was the second obstetrical clinic of Vienna.
[351] He was in charge of training new doctors in obstetrics, and at the second clinic they were training midwives.
[352] And shortly after he took over, he realized that something really terrible was going on because in his clinic, 10% of the women were dying, mostly of childbed fever.
[353] While the midwives next door, who presumably, you know, they would have thought had less expertise, only 3% to 4% of their patients were dying.
[354] So Semmelweis was obviously really worried about this.
[355] He had patients who would be begging on their knees to be transferred to the other clinic.
[356] He had this kind of breakthrough moment when a colleague of his was conducting an autopsy and accidentally cut himself.
[357] And then shortly thereafter, he died of something that looked a lot like childbed fever.
[358] Semmelweis realized, well, I've got all these physicians who are conducting autopsies on cadavers and then immediately going and delivering babies, and he thought, well, maybe there's something transferred on their hands.
[360] And he called this cadaverous particles.
[361] Of course, now we know that that is bacteria, but they didn't have a theory of bacteria at the time.
[362] So he started requiring the physicians to wash their hands in a chlorinated solution, and the death rate in his clinic dropped way down.
[363] And of course, the way we think about science, we say, all right, if someone's discovered something wonderful, everyone must have instantly adopted this brilliant new idea.
[364] You would think, right?
[365] And he has this wonderful evidence, right?
[366] It was 10%.
[367] He introduced the practice, goes down to 3%.
[368] But that's not what happened.
[369] So he published his ideas.
[370] And the other gentleman physicians did not take them up.
[371] In fact, they found them kind of offensive.
[372] They thought this is, you know, he's writing that we have dirty hands.
[373] We have unclean hands.
[374] But in fact, we're gentlemen.
[375] They also thought it was just really far out of the range of theories that could possibly be true.
[376] So they didn't believe him, despite the really good evidence and the deep importance.
[377] You know, people's lives were really at stake.
[378] And it took, I mean, decades for his hand-washing practice to actually spread.
[379] In fact, I understand that Semmelweis himself eventually suffered a nervous breakdown.
[380] How did his own story end?
[381] So the way the story goes, though this is a little hard to verify, is that he was so frustrated that people weren't adopting his hand-washing practice that he had a nervous breakdown as a result.
[382] He was put into a Viennese mental hospital where he was beaten by guards and died of blood poisoning a few weeks later.
[383] We've seen how being an outsider or breaking with tradition can be barriers to the spread of good scientific information.
[384] But you could argue that these examples were from a long-gone era of gentlemen physicians and amateur scientists.
[385] But even in the modern day of science, where researchers demand hard evidence to be convinced, it turns out that false, inaccurate, and incomplete information can still take hold.
[386] In 1954, E.D. Palmer published a paper that changed how doctors thought about stomach ulcers.
[387] So what he did was look at a lot of stomachs.
[388] I believe somewhere in the range of a thousand, and he found that there were no bacteria whatsoever in the stomachs that he investigated.
[389] A lot of people at that time had been arguing over whether stomach ulcers were caused by stomach acid or some kind of bacteria.
[390] This was taken as really decisive evidence showing that, okay, well, it can't be bacteria, because everyone thought Palmer's study showed there are no bacteria in stomachs, so it absolutely must be stomach acid.
[391] And of course, in this case, Palmer was not trying to fabricate his data or make up data.
[392] He was sincerely arriving at what he thought was a very good conclusion.
[393] That's right.
[394] And it seems that it just was a problem with his methodology.
[395] Of course, there are bacteria in our stomachs.
[396] He just didn't see them because of the way he was doing his particular experiment.
[397] This was not a fabrication at all.
[398] One of the things that's interesting about this episode involving Palmer and the stomach ulcers is that as individuals essentially came over to believe what Palmer was telling them, there was a consensus that started to grow.
[399] And as each new person added to the consensus, it became a little bit stronger, which made it even harder to challenge.
[400] Yeah.
[401] So although they had been arguing for decades about whether ulcers were caused by acid or by bacteria, at this point, people started to share Palmer's results.
[402] Pretty much everybody saw them.
[403] And this consensus was arrived at, okay, it's acid.
[404] And everyone who had been studying the possibility that bacteria caused stomach ulcers stopped studying that.
[405] Well, not everyone.
[406] Fast forward a few decades to the early 1980s.
[407] In Australia, a physician named Barry Marshall grew skeptical of the acid theory.
[408] His experiments suggested that ulcers were caused by bacteria, not stomach acid.
[409] But this theory was met with stony-faced resistance.
[410] He couldn't even get his articles published.
[411] Scientists sniped at him behind his back, even though, as it turns out, his data was far better than the stomach studies by E.D. Palmer.
[412] Barry Marshall was frustrated that no one seemed willing to listen to his findings.
[414] People were bleeding in my practice and dying from ulcers in my hospital.
[415] I could see it.
[416] So he figured out a way to get everyone's attention.
[417] The only person in the world at that time who could make an informed consent was me. So I had to be in my own experiment.
[418] And so he did this demonstration.
[419] He took bacteria from the stomach of one of his sick patients.
[420] So we cultured a patient with gastritis.
[421] He stirred it into a broth. And then, I drank the bacteria, 10 to the 9th colony-forming units.
[422] He gave himself stomach ulcers, and then he later cured them with antibiotics in this publicity stunt almost to convince people that, in fact, ulcers were caused by bacteria.
[423] Eventually, Barry Marshall and Robin Warren went on to win the Nobel Prize in Medicine for their discoveries.
[424] Mary Montague, the woman who faced resistance in bringing variolation to England, never won a prestigious prize.
[425] But she also found a way to spread the truth.
[426] Like Barry Marshall, she found it had more to do with her sales pitch than with the evidence.
[427] So in the end, she did something really smart, which took advantage of the ways that we use our social connections to ground our beliefs and our trust.
[428] So she ended up convincing Princess Caroline of Ansbach to variolate her own two small daughters and to do it in this kind of public way.
[429] So she got one of the most influential people in the entire country to engage in this practice.
[430] So that did two things.
[431] So number one, it made clear, you know, because she did it in this kind of public way and her daughters were fine, it gave people evidence that this is, in fact, a safe practice and it's a good idea.
[432] But it also made clear to people that if they want to conform to the norm, if they want to share a practice with this really influential person, then they should do the same thing.
[433] And after Princess Caroline did this, variolation spread much more quickly, especially among people who had a personal connection to either Mary Montague or to the princess.
[434] What's fascinating here is that this wasn't, in some ways, a rational way to solve the problem.
[435] It wasn't saying, look, there's really convincing evidence here.
[436] You're almost using a technique that's pretty close to propaganda.
[437] It is a propaganda technique, absolutely.
[438] So propagandists tend to be very savvy about the ways that people use their social connections to ground trust and knowledge and choose their beliefs, and they take advantage of those.
[439] In this case, it was using that social trust for good, but in many cases, people use it for bad.
[440] And if you look at the history of industrial propaganda in the U.S., or if you look at the way Russia conducted propaganda before the last election, people have taken advantage of these kinds of social ties and beliefs to try to convince us of whatever it is they're selling.
[441] One last idea in how you counter bad information.
[442] Semmelweis, as we saw, did not succeed in persuading other doctors during his lifetime to wash their hands thoroughly before they were treating patients.
[443] But of course, now that idea is widely adopted.
[444] What does that tell us, Cailin, about how science in some ways might be self-correcting?
[445] It might not be self-correcting at the pace that we want.
[446] But over time, it appears that good ideas do beat out the bad ones?
[447] Yeah, so we have thousands and thousands of examples in science of exactly that happening of good ideas beating out the bad ones.
[448] Of course, now we can look back and say, oh, well, that good idea won out and that good idea won out.
[449] We can't actually look at right now and know which of the ideas we believe now are correct ones or good ones.
[450] So there are actually philosophers of science, like Larry Laudan and Kyle Stanford, who argue for something called the pessimistic meta-induction, which is something like this: because scientific theories in the past have always eventually been overturned, we ought to think that our theories now will probably be overturned as well.
[451] But there's actually an optimistic side to this, which is that if you look at many theories in the past, ones that were overturned, often the reason people believed them is that even if they were wrong, they were a good guide to action.
[452] Even the theory of stomach acid causing ulcers, well, if you treat stomach acid, it actually does help with ulcers.
[453] You know, it wasn't a completely unsuccessful theory.
[454] It's just that it wasn't totally right, and it wasn't as successful as the bacteria theory of ulcers because antibiotics do better.
[455] One of the interesting implications about all of this is how we should think about the truth.
[456] And in some ways, I think the picture that I'm getting from you is a picture that says the truth is not a binary question.
[457] It's not, you know, is it true?
[458] Is it false?
[459] I mean, some questions, of course, perhaps can be reduced to is it true?
[460] Is it false?
[461] But really, scientists in the business are producing probability estimates for various claims.
[462] And I think what you're saying is that for us to actually be on the right side of the misinformation information divide, it's helpful for us to think in probabilistic terms rather than in binary terms.
[463] Yeah, that's absolutely right.
[464] So we do think it's really important to think about belief in terms of degrees and evidence and believing something strongly enough.
[465] And part of the reason is that there has been this strategy where people who are trying to subvert our beliefs will say, but we're not sure about something.
[466] They'll say, evolution's just a theory, or there's some doubt about global warming.
[467] But ultimately, not being sure about something is not what matters.
[468] We're never really 100% sure about anything.
[469] And if you think about it, think about any belief you could have, you know, that the sun will come up tomorrow.
[470] Well, it always has in the past, but that doesn't mean we're 100% sure it will tomorrow.
[472] There's a really good chance it will tomorrow.
[473] We shouldn't be looking for certainty.
[474] Instead, we need to be saying to ourselves, when do we have enough evidence to make good decisions?
[475] Cailin O'Connor is a philosopher and mathematician at the University of California, Irvine.
[476] She studies how social networks can spread both good information and bad.
[477] Along with James Weatherall, she is co-author of the book The Misinformation Age: How False Beliefs Spread.
[478] Cailin, thank you for joining me today on Hidden Brain.
[479] Oh, thank you so much for having me. This episode was produced by Maggie Penman, Camilla Vargas Restrepo, and Laura Querell.
[480] Our team includes Jenny Schmidt, Parth Shah, Raina Cohen, and Thomas Liu.
[481] Our supervising producer is Tara Boyle.
[482] For more Hidden Brain, please follow the show on Facebook and Twitter.
[483] I'm Shankar Vedantam.
[484] I'll see you next week.