Hidden Brain XX
[0] T -minus 15 seconds.
[1] This is Hidden Brain.
[2] I'm Shankar Vedantam.
[3] On January 28, 1986, the Challenger spacecraft blasted off from Cape Canaveral.
[4] Two, one, and liftoff, liftoff of the 25th space shuttle mission, and it has cleared the tower.
[5] Seconds after liftoff into a clear blue sky, something went wrong.
[6] Flight controllers here are looking very carefully at the situation.
[7] Obviously a major malfunction.
[8] It was much more than a malfunction.
[9] It was a disaster.
[10] As millions of people watched in horror, trails of smoke and debris flew off in different directions.
[11] We have a report from the flight dynamics officer that the vehicle has exploded.
[12] The space shuttle blew up.
[13] Those words flashing across America today: the Challenger exploded this morning just after liftoff from Cape Canaveral,
[14] the crew of seven now presumed dead.
[15] After the explosion, NASA officials were hauled to Capitol Hill for questioning.
[16] Who finally made that decision to go?
[17] What was the chain?
[18] How did that link?
[19] What happened then specifically?
[20] Members of Congress asked the tough questions.
[21] How could this have happened?
[22] Who's to blame?
[23] These are the kinds of questions we have when any catastrophe occurs, whether a spacecraft has exploded, a new war has broken out, or the world is swept up in a fast-moving pandemic.
[25] In all these cases, we want to know who screwed up.
[26] Your government failed you.
[27] Those entrusted with protecting you failed you.
[28] And I failed you.
[29] We ask, who knew what, when?
[30] The State Department had received repeated warnings that the situation was getting worse.
[31] And we try to figure out how...
[32] How do we make sure that this kind of breach...
[33] That such an accident...
[34] It does not, never happens again.
[35] With many of these incidents, there's a sense that they could have been avoided, that someone knew something but didn't say anything.
[36] Or if they said it, they weren't believed.
[37] Or if they were believed, nothing was done.
[38] This week on Hidden Brain, we revisit a favorite 2019 episode about the psychology of warnings and how we can all become better at predicting the future.
[39] Why some warnings get heard, why many are ignored, and the pitfalls of being a prophet.
[40] We begin this tale of warnings made and warnings ignored in the middle of Alaska.
[41] It's a balmy day, barely jacket weather.
[42] I'm in a car riding on a weather-beaten road that leads out of Fairbanks.
[43] Chris Heimstra is driving, and he points out something.
[44] And you can see in the road here, you see all these bumps and all these curves.
[45] These aren't your standard potholes.
[46] The state of the road is a sign of something far more significant.
[47] Beneath the pavement, the ground is disintegrating.
[48] What lies below the asphalt is the Alaskan permafrost, and it's melting.
[49] Permafrost is any soil or ice or rock that's frozen for more than two years, like two consecutive years.
[50] Picture the plants and animals that lived in Alaska over the past tens of thousands of years.
[51] After they died and fell to the ground, they froze.
[52] Permafrost is made up of layer upon layer of this organic frozen material.
[53] There's a place where you can see what's happening deep inside this permafrost.
[54] It's a tunnel that's hundreds of feet long.
[55] It's built into the side of an Alaskan hill.
[56] Chris is a research scientist at the Army Corps of Engineers.
[57] He works in the Cold Regions Research and Engineering Laboratory out of Fairbanks.
[58] He has spent a lot of time in this tunnel, and he's taking me there today to show me something important.
[59] There's 40,000 years of Earth history that's stored and frozen in time.
[60] To get to that frozen history, we walk through a set of gates, over to a wood cabin.
[61] A whole bunch of hard hats.
[62] We head over to what looks like a nondescript wooden shed.
[63] It has a sign on the outside that says, U.S. Army Corps of Engineers, and a warning to watch your step.
[64] Inside, it's pitch black.
[65] At the far end, there's another door that leads into the actual tunnel.
[66] Chris turns on the lights and leads me through.
[67] We walk down to a lower area.
[68] The deeper we go, the lower down you go into the tunnel, the older the sediment, the further back we move in time. When Chris describes what's in the tunnel, his tone is even, but what I can see all around me is completely extraordinary. So on that, that's like a 43,000-year-old piece of probably willow that's been sitting down here for quite a while. 43,000 years old.
[69] It actually looks like it could have fallen last year.
[70] Yeah, it's amazing, at these cold temperatures, how long organic matter can be preserved here.
[71] If you're picturing this like a real-life trip on the Magic School Bus, you're right.
[72] Except there's one part of the experience that doesn't exactly fill me with wonder.
[73] I think the aroma is one thing that people notice right away.
[74] I was just going to ask you, is it just me or does this place stink?
[75] It's definitely got an unusual smell, and that's your organic matter that's coming back into the atmosphere.
[76] All around me, the decaying plants and animals smell like food gone bad in a freezer.
[77] The smell is unpleasant, but what it ought to be is terrifying.
[78] Something is happening here that has consequences for the entire planet.
[79] The organic matter trapped in the permafrost, fungi, plants, animals, it's thawing.
[80] As it thaws, it decays, and as it decays, it releases extraordinary amounts of carbon into the atmosphere.
[81] How much carbon?
[82] Scientists say the amount of carbon that's stored in the permafrost is about double what's in the entire atmosphere.
[83] Let me say it again.
[84] Double.
[85] It's in the deep freeze.
[86] What happens if that temperature goes up or for some reason it thaws a little bit more?
[87] What happens to that carbon?
[88] A place you don't necessarily want it is back in the atmosphere.
[90] We don't want it back in the atmosphere because carbon dioxide contributes to climate change.
[91] But a vicious cycle has already started.
[92] As the planet warms, the permafrost thaws.
[93] All those dead animals and plants and fungi start to decompose.
[94] More decomposition means more carbon released into the atmosphere, which means warmer temperatures, which means even more melting in the permafrost.
[95] What follows is disaster.
[96] The firestorm in Australia has burned an area as large as West Virginia.
[97] So it started raining and no big deal.
[98] And then we see the water going down the street start to get a little bit higher.
[99] At the rate global temperatures are rising, 60% of all the glaciers here in Xinjiang,
[100] nearly 11,000 glaciers, will be gone within 50 years.
[101] Chris doesn't need to turn on the news to see what rising temperatures are doing to the planet.
[102] When he's in the tunnel, he can see it right in front of his eyes.
[103] He can smell it.
[104] He's reminded of it every time he drives on the cracked pavement.
[105] And there's a feeling that he can't escape.
[106] It comes when he's at work, when he talks to strangers, even when he's in the comfort of his own car.
[107] That feeling is futility.
[108] He can see a catastrophe unfolding in front of him, but no one seems to be listening, or people seem to be worried about the wrong things, like in a conversation he had earlier in the day.
[109] The claim was made that most of the CO2 in the atmosphere comes from volcanoes, which isn't the case.
[110] Who made the claim?
[111] It was somebody on the school field trip earlier today.
[112] And I pushed back against that politely.
[113] And it's hard because you've got to be really, it's so hard to be, to be, you don't want to offend people.
[114] You don't want to, because making someone angry isn't going to change their mind at all.
[115] So you've got to be really careful about how you wade into things, and you don't know this person necessarily, and there's not a familial tie.
[116] All we have really in common is that our kids go to the same school, and they're in the same class.
[117] But it was a crucial piece of misinformation that wasn't true.
[118] And then so you just say, like, flat out, like, that's not true.
[119] That's going to shut everything down, and it's not going to help me in any way.
[120] It's just going to make somebody think I'm a jerk.
[121] To Chris, it seems as if the pushback he gets is driven by a larger contempt that some people feel towards science and scientists.
[123] Like, it's not relegated or limited to your work.
[124] It's a critique of you as a person as if, like, you're just trying to lie to people about the work you do, which doesn't make any sense.
[125] I mean, your stock as a scientist is because of your honesty.
[126] If you're not honest as a scientist, your career is over.
[127] and it should be over. Chris feels invisible. He works in a remote place, 40 feet below the surface, in a part of the country that's unfamiliar to most Americans. A lot of them will never make it up here. A lot of them won't go into the permafrost tunnel. A lot of them won't go up above the tunnel, or throughout Alaska where permafrost exists, or even understand what it looks like, what it smells like. So how do you communicate that?
[128] How do you say, like, there's a value in you understanding what you don't currently understand?
[129] How do you get people to see that there's value in understanding what they don't understand?
[130] Chris's question is an ancient question.
[131] For millennia, we've often shunned and shamed people who have warned us of looming disaster.
[132] Why does this happen?
[133] Why do human beings who care about their survival ignore warnings of doom?
[134] When we come back, we're going to do something unusual.
[135] We're going to look for answers to that question, not in science, but in literature.
[136] We're going to dive into Greek mythology and talk about a doomed prophet.
[137] The lessons from her story still resonate today.
[138] You're listening to Hidden Brain.
[139] I'm Shankar Vedantam.
[140] This is Hidden Brain.
[141] I'm Shankar Vedantam.
[142] In Greek mythology, the gods loom large, so large that many of them fill our imaginations even today.
[143] Zeus, with his thunderbolts, Aphrodite, the goddess of love, Hades in charge of the underworld, Athena, goddess of wisdom.
[144] The gods of ancient Greece moved among humans.
[145] Some of those humans were themselves touched with divine powers.
[146] One of the most striking was the prophet Cassandra.
[147] Cassandra remained so memorable that her story has inspired movies, television, even campy pop music.
[148] In 1982, the Swedish band ABBA dedicated a song to her, more than 2,500 years after she was first memorialized by Homer, Aeschylus, and Euripides.
[149] This power that Cassandra had was incredible.
[150] Except, there was one problem.
[151] No one believed her.
[152] It's not Cassandra's fault that she's not believed.
[153] This is Emily Wilson.
[154] I'm a professor of classical studies at the University of Pennsylvania.
[155] Emily is going to take us back to the legend of the kingdom of Troy.
[156] She's going to help us understand what this ancient myth still has to teach us.
[157] Cassandra's father, Priam, was the king of Troy.
[158] It's safe to say that Cassandra didn't suffer from only child syndrome.
[159] Priam has, according to different accounts, either 50 or 100 children; she's one of the many children of Priam.
[160] Cassandra has a blessing that is really a curse.
[161] She can see the future, but no one will believe her.
[162] But even if you set aside the curse, it turns out she also did several things that made it less likely she would be believed.
[164] For one thing, her prophecies were opaque.
[165] When she revealed her visions, she spoke in language that was a little hard to understand.
[166] What's this in front of my eyes now?
[167] Is it a hunting net out of the underworld?
[168] Yes, but a man-trap, too, that sleeps with him, helps plot his murder.
[169] Let the mob, endlessly gorging on this clan, raise a shriek over the sacrifice on which stones will fall in their turn.
She speaks primarily in symbols and metaphors. She can foresee the ox killed at the altar. She can foresee blood and slaughter. She doesn't speak in a way that sort of spells out exactly: as soon as I walk into this house, Clytemnestra is going to take the axe and hack us both to death.
[170] Because that's not the way oracles speak.
[171] She speaks in prophetic language.
[172] Cassandra's best-known prophecy had to do with one of the world's most famous carpentry projects, the Trojan horse.
[173] There's a war between the Trojans and the Greeks, and it's dragging on.
[174] It keeps going for 10 years without decisive victory on either side.
[175] The turning point comes when the Greeks have an idea.
[176] That they should build a great big wooden horse, and have their best warriors on the Greek side hide inside the wooden horse.
[178] They leave the great big wooden horse outside the walls of Troy.
[179] As the Trojans watch from their city, the Greeks get in their ships, seemingly in defeat, and sail away.
[180] Their mysterious gift remains on the beach, and the Trojans debate whether to throw open the city gates and bring in the horse.
[181] Is this some kind of gift for the gods?
[182] Is it a holy offering?
[183] Should we mistrust it?
[184] Should we not mistrust it?
[185] Cassandra knows that this gift is not a gift at all.
[186] She tells her fellow citizens not to open the gates of Troy.
[187] In her own convoluted way, she says, this is a terrible idea.
[188] Don't do it.
[189] But she doesn't have any real power.
[190] Her fellow Trojans don't recognize her as a prophet.
[191] And so she's not taken seriously.
[192] The consequences are disastrous.
[193] The Trojans decide in the end to bring the horse in, and then, of course, in the middle of the night, the Greek warriors spring out of the horse and start slaughtering the citizens inside the city.
[194] You might think that after Cassandra accurately predicts the fall of Troy, some people would start believing her.
[195] You'd be wrong.
[196] She makes another prediction that fails to gain any traction.
[197] As the city is burning and the women of Troy are forced to board Greek ships as slaves, all the prisoners are distraught, all except Cassandra, who seems weirdly happy.
[198] Her mother seems deeply concerned about her, worried for her, as one would be, for a daughter who keeps saying crazy stuff, and who also seems to have this perverse idea that being taken into slavery could be a good thing.
[199] There's a reason why Cassandra has this oddly positive reaction.
[200] Whereas everyone else can only see what's right in front of them, she can see two steps ahead.
[201] She already knows something about her situation.
[202] She knows she's going to die, but so is her captor, the Greek general Agamemnon.
[203] Revenge is coming, even if it means her own death.
[204] Cassandra sees a gleam of hope in the fact that she knows Agamemnon's going to be murdered horribly.
[205] She can see that there's going to be bad things for the Greeks down the line.
[206] So, to recap, there are several things besides the curse that make Cassandra less likely to be believed.
[207] She speaks in cryptic language, doesn't have any formal authority, and is too far ahead of everyone else.
[208] There's one more thing.
[209] She asks too much of the people she warns.
[210] This happens when Agamemnon takes Cassandra back to Greece with him as a slave.
[211] They go to his home.
[212] Agamemnon doesn't know that during his absence, his wife Clytemnestra has started an affair.
[213] He doesn't know that Clytemnestra is not pleased to have him back, but she pretends to be happy.
[214] So she welcomes him into the house along with Cassandra as his human property.
[215] He goes first into the house and then Cassandra pauses to give these prophecies, these riddling prophecies.
[216] Cassandra foresees his death and her own.
[217] Look at this!
[218] Look!
[219] Keep the bull away from the heifer!
[220] She's caught him in her dress, her engine on her black horn, striking.
[221] Into the basin, he falls, where the water lies.
[222] He met his death in the bath.
[223] It lies in wait for him, I tell you.
[224] Or, to translate, Clytemnestra is about to hack her husband to death with an axe.
[225] For Agamemnon to take Cassandra seriously, he would have to see that his life was in danger, that his wife despised him, that far from being a victorious warrior, he was walking into a death trap.
[226] To save his own life, he would have to change his entire outlook.
[227] He wants to think of himself as a strong, triumphant city sacker, and he's only going to process the information that confirms that belief about himself.
[228] And he's going to ignore all the signs, both from Clytemnestra and from Cassandra, that might suggest that his reality is not the only reality.
[229] And in fact, you're missing a whole lot of information here.
[230] Things play out just as Cassandra predicted.
[231] She's caught him in her dress, her engine on her black horn, striking.
[232] Clytemnestra gives him a lovely bath, and she entraps him in a net as well in order to make sure that if he struggles after the first blow, he won't get away.
[233] So she strikes him multiple times until he's good and dead.
[234] and also hacks Cassandra to death.
[235] This sounds like it was a particularly gruesome murder, but maybe I'm wrong.
[236] Maybe this is par for the course in Athenian tragedy.
[237] Oh, I think it's par for the course.
[238] I mean, it's much less gruesome than the death of Pentheus, say, yes.
[239] There's usually some gory death.
[240] Why else would you go to the theatre?
[241] Now, of course, Cassandra was cursed.
[242] By definition, it didn't matter how persuasive she was.
[243] She was never going to be believed.
[244] but her failed attempts to warn those around her can give us insights into how warnings are heard whether they're taken seriously and when they're acted upon.
[245] So what does an effective Cassandra sound like?
[246] I speak loudly all the time because I'm kind of an aggressive person, you know, even though I'll be 70 next year.
[247] My wife said, will you please calm down?
[248] She's been saying it for 43 years now and it hasn't happened.
[249] The actual name of this Cassandra is Andrew Natsios.
[250] His moment of prophecy involved a life-and-death choice that affected hundreds of thousands of people.
[251] Before we get to that, we need to understand the formative moments in Andrew's career.
[252] In many ways, he was an unlikely hero.
[253] In the 1980s, he served in the Massachusetts House of Representatives.
[254] At a Republican National Committee meeting in 1988, he didn't exactly get the star treatment.
[255] The chair recognizes Nestids, Mr. Nastids, from...
[256] Maryland?
[257] Massachusetts.
[258] Nantzios for Massachusetts, right?
[259] Nassios.
[260] Poor printing here, sorry.
[261] Before he got to his Cassandra moment, Andrew was brought in to salvage a boondoggle of a transportation project in Boston.
[262] The Big Dig.
[263] It redirected a massive highway into a tunnel under the city.
[264] Andrew came in several years into the project.
[265] At that point, it was a mess, with huge cost overruns.
[266] Andrew had two things going for him.
[267] He had experience leading big institutions, and he had his temperament.
[268] I'm sort of a type A personality, very kind of aggressive and a dominant figure in any institution that I run.
[269] So I could get the institution to do what I wanted it to do, what I thought was right to do.
[270] He was, in fact, able to figure out why the project's finances were out of whack, but his efforts weren't always appreciated.
[271] It was the most difficult year of my career.
[272] Actually, I actually felt safer in Sudan and Iraq and Afghanistan than I did in Boston.
[273] One person threatened to break my neck while I was investigating the big dig.
[274] Andrew had spent much of his career overseas. His experience leading several national and international organizations caught the eye of President George W. Bush, who appointed him to lead USAID.
[275] That's the agency responsible for America's involvement in international development.
[276] It's here that Andrew had his Cassandra moment.
[277] In 2003, Andrew briefed top members of the Bush administration about escalating violence in Sudan.
[278] By that point, marauders were charging into villages, setting them on fire, sending civilians fleeing.
[279] They were known as the Janjaweed.
[280] They quickly took over a vast and dusty region on the edge of the Sahara.
[281] Darfur.
[282] Andrew's warnings and advice had an impact.
[283] The U.S. supplied billions of dollars in aid to Sudan over the next several years.
[284] In spite of the economic difficulties, our aid will continue to flow.
[285] President Bush also put political pressure on Sudanese leaders.
[286] Some of his measures were behind the scenes, others more public.
[287] The news media began to take note.
[288] Today, President Bush announced tough new economic sanctions against Sudan over the continued persecution of the minority population in Darfur.
[290] Andrew's actions at USAID have become a case study on effective warnings.
[291] That's according to Christoph Mayer.
[292] I'm a professor of European and international politics at King's College, London.
[293] Christoph has spent many years studying how warnings are made and which ones manage to break through the noise.
[294] He says there are several reasons Andrew Natsios managed to persuade the Bush administration to act.
[295] One is he was able to show that a further escalation, with many hundreds of thousands of people affected, was highly likely, and he was able to kind of put that into a presentation, you know, chart the escalation of the conflict into the future.
[297] Andrew commissioned a study that predicted how many people would die if the U .S. didn't intervene.
[298] He also got U.S. spy satellites to take pictures of Darfur.
[299] To photograph the ground every day to show the villages that were being burned from day to day.
[300] And these photographs were so clear that they were unimpeachable, in terms of their quality, in terms of what the atrocities were. And they burned 30,800 villages.
[301] They displaced two million people.
[302] So Andrew laid out clear evidence.
[303] There was something else that helped him make the case to the president.
[304] He was an insider.
[305] Was I taken seriously?
[306] Yes, because I had a relationship with the presidents.
[307] That's presidents, plural.
[308] He had campaigned twice for George H.W. Bush, and he had worked on the George W. Bush campaign in 2000.
[309] He was not seen as one of these do-gooders, one of these kind of liberal NGO types.
[310] Yes, he had a history in the NGO sector.
[311] He was an expert.
[312] But he had also a kind of a conservative pedigree.
[313] He had experience in the armed forces.
[314] He was seen as someone who understood that the president's time was precious,
[315] understood the preferences, and he was seen as kind of part of us.
[316] Besides his political credentials, Andrew was also an insider in another way.
[317] He was a Christian who'd spent years leading an international Christian NGO.
[318] He knew that George W. Bush's identity as a Christian was important to him and important to his re-election chances in 2004.
[319] Christoph Mayer says Andrew laid out the political consequences of inaction in Darfur where many of the civilians being attacked were Christian.
[320] He was able then to show how that kind of escalation would be politically relevant to the Bush administration at the time, how it would impact the re-election chances, and how it would connect to the Christian constituency in the U.S., which, of course, was one of the supporting constituencies of the Bush administration, because it was also largely Christians in Sudan who were affected by this violence.
[321] Andrew implicitly understood a widespread psychological bias.
[322] We all tend to look more sympathetically at suffering when the people who are suffering have something in common with us.
[324] There's another part to this.
[325] Andrew was successful because he didn't ask President Bush to make a major U -turn.
[326] From the very beginning of his administration, the president had been interested in what was happening in Sudan.
[327] The first presidential review that the president ordered was on Sudan policy.
[328] Now, what came to be known as a genocide in Darfur did claim hundreds of thousands of lives.
[329] So it may be hard to see Andrew's warning as effective.
[330] And yet, if Andrew had not acted, Christoph says he believes things would have turned out worse.
[331] I think the Bush administration did act.
[332] It didn't act very early, but it did act politically to put pressure on the conflict parties.
[333] And they did act, I think, as soon as could be probably expected on the humanitarian front, therefore saving lives through humanitarian action and probably preventing the conflict from being even more disastrous than would have been otherwise.
[334] I couldn't have done that alone, but President Bush did it.
[335] I have to say, I said in my book, I think he ended what could have been another Rwandan genocide.
[336] So in Andrew Natsios, we have a plain -spoken leader.
[337] He had the insider credentials to get others on board, and he didn't ask policymakers to do something that was greatly at odds with what they wanted to do anyway.
[338] Contrast that with the doomed prophet we heard about earlier.
[339] Cassandra spoke in riddles.
[340] Prophetic riddles, but riddles nonetheless.
[341] Let the mob, endlessly gorging on this clan, raise a shriek over the sacrifice on which stones will fall in their turn.
[342] Unlike Andrew Natsios, Cassandra wasn't an insider.
[343] As Emily Wilson says, even if somebody is speaking the truth, if it's coming from the mouth of an unauthoritative person or somebody who's dismissed as othered, then that can make it possible to dismiss even a very clear articulation of a scary truth.
[344] And Cassandra was too far ahead of everyone else.
[345] When she and the other Trojan women are being captured as slaves, she doesn't explain that she is happy because she knows the Greeks are going to be killed.
[346] She can see into the future, but she doesn't take others along with her.
[347] There are lessons here for our own time.
[348] Christoph says many modern Cassandras forget it's hard for most people to look far into the future.
[349] Leaders especially are often pulled in different directions.
[350] Paying attention to one risk means fewer resources for others.
[351] If you come in with a vague warning about a distant problem, you're going to get sidelined.
[352] Samantha Power wrote in her book A Problem from Hell about the response from one administrator to a warning he was given: unless his telephones were ringing, he couldn't do anything.
[353] So even if he believed that what she was saying is right, he was so constrained by the lack of, in a sense, public clamoring in the beltway, the kind of support for acting, that he couldn't do something.
[354] Cassandra also asked the people she was trying to warn to stretch too far outside their comfort zone.
[355] Remember when Clytemnestra is giving Agamemnon a lovely bath?
[356] Before she hacks him to death?
[357] Cassandra saw it coming.
[358] Look at this!
[359] Look!
[360] Keep the bull away from the heifer!
[361] But Agamemnon wasn't in a headspace where he could hear the warning.
[362] Christoph says this happens with real-life Cassandras and real-life policymakers.
[363] If leaders have to reject some foundational belief to act on a warning, there's a strong chance that they will simply ignore the warning.
[364] Quite often, what makes warnings so difficult to believe is their political inconvenience.
[365] In fact, this is exactly what happened in the case of the Challenger space shuttle disaster.
[366] A scientific inquiry found that several engineers had had concerns about the safety of what came to be known as the O-rings on the shuttle.
[367] They told NASA managers to delay the launch, but the managers overruled the engineers, and the Challenger took off as planned.
[368] We've painted a picture of warnings that is at odds with the way most of us think about them.
[369] In the conventional telling, someone raises an alarm and everyone jumps up and does something about it.
[370] In reality, warnings are likely to be heard when they're made by someone who's part of our in-group, when the threat is so imminent that nearly everyone can see the danger, and when the solution doesn't require a radical shift in existing strategy.
[371] Unsurprisingly, this means that many warnings will go unheeded, and many Cassandras will be dismissed.
[372] After the break, why you don't need Cassandra-like vision to predict what's to come.
[373] Psychologist Phil Tetlock tells us about the traits of superforecasters.
[374] You're listening to Hidden Brain.
[375] I'm Shankar Vedantam.
[376] This is Hidden Brain.
[377] I'm Shankar Vedantam.
[378] We're surrounded by people who tell us they know what's going to happen in the future.
[379] A lot of people have no idea that Trump is headed for a historic defeat.
[380] Bear Stearns is fine.
[381] Don't move your money from there.
[382] That's just being silly.
[383] These predictions have a few things in common.
[384] The commentators have complete confidence in themselves.
[385] We as the audience love to hear them make a complicated world seem simple.
[386] And finally, no one ever pays a serious price for being wrong.
[387] Donald Trump wins the presidency.
[388] Bear Stearns in the bargain bin.
[389] Sold to rival J.P. Morgan Chase for just $2 a share.
[390] Making predictions is hard, even for the so -called experts.
[391] Ironically, the people who are the best at forecasting the future tend to be ordinary people who happen to know a very important secret.
[392] Predicting the future isn't about being unusually smart or especially knowledgeable.
[393] It's about understanding the pitfalls in the way we think and practicing better habits of mind.
[394] Phil Tetlock is a psychologist at the University of Pennsylvania.
[395] In his book, Superforecasting: The Art and Science of Prediction, Phil explores how we can learn from these people to become better forecasters ourselves.
[396] Phil, welcome to Hidden Brain.
[397] Thank you very much.
[398] So Phil, lots of people watch television at night and millions of people feel like throwing things at their television set each evening as they listen to pundits and prognosticators explain the day's news and predict what's going to happen next.
[399] Of all the people in the country, you probably have more cause than most to hurl your coffee cup at the television set, because starting in 1984, you conducted a study that analyzed the predictions of experts in various fields.
[400] What did you find?
[401] Well, we found that pundits didn't know as much about the future as they thought they did.
[402] But it might be useful before we start throwing things at the poor pundits on the TV screen to consider their predicament.
[403] They're under pressure to say something interesting.
[404] So they resort to interesting linguistic gambits.
[405] They say things like, well, I think there's a distinct possibility that Putin's next move will be on Estonia.
[406] Now, that's a wonderful phrase, distinct possibility.
[407] It's wonderfully elastic because if Putin does move into Estonia, they can say, hey, I told you, there was a distinct possibility he was going to do that.
[408] And if he doesn't, they can say, hey, I just said it was possible.
[409] So they're very well positioned.
[410] Now, if you play the game the way it really should be played, the forecasting game, and use actual probabilities,
[411] so you play it the way Nate Silver plays it, and you wind up with, say, a 70% probability that Hillary will win the election a few days before the election in November 2016, you're much more subject to embarrassment.
[412] If he had said there's a distinct possibility that Hillary would win, he would have been very safely covered.
[413] Because when you ask people to translate distinct possibility into numbers, it means anything from about 20% to about 80%.
[414] The truth is, making predictions is difficult, but many biases also get in the way of making accurate forecasts.
[415] When we make a prediction, and it turns out wrong, most of us don't remember that we'd predicted something different.
[416] In fact, the hindsight bias prompts us to believe we'd gotten it right all along.
[417] We also hail people who make predictions that turn out right, whether in the stock market or politics or sports, but that keeps us from seeing the role of luck.
[418] Many people who get the right answer are just lucky, and some people who get it wrong are just unlucky.
[419] Over time, the laws of probability mean luck can only take you so far.
[420] One reliable way to check if someone's success at predictions is driven by skill or by luck is to have them make lots of predictions and see how they turn out over time.
[422] A few years ago, the federal government launched such an experiment.
[423] They conducted a forecasting tournament where thousands of people logged into computers to make thousands of predictions.
[424] As time passed, the forecasts could be checked for accuracy.
[425] We were one of five academic research teams that were competing to pull together the best methods of making probability estimates of events that national security professionals cared about.
[426] What kind of questions were they asking?
[427] All over the map, quite literally.
[428] So there would be questions about violent clashes in the East or South China Sea.
[429] There would be questions about the Syrian Civil War, about Russian-Ukrainian relations, about the Iranian nuclear program, Colombian narco-traffickers, literally all over the map.
[430] If you were asked to pick someone to answer a difficult question about foreign affairs, you might turn to an Oxford-educated public intellectual who writes a column for a very important newspaper.
[431] You probably wouldn't turn to a retiree in Nebraska who spends his time birdwatching.
[432] But Phil Tetlock says, maybe you should.
[433] He was the opposite of Tom Friedman.
[434] Tom Friedman, of course, being an eminent New York Times columnist, well known for his explanations, but nobody has any idea how good a forecaster he is.
[435] and Bill Flack is an anonymous retired irrigation specialist working in Nebraska, working out of the public library or out of his home, and doing a fabulous job making probability estimates in the intelligence community forecasting tournament.
[436] Super forecasters like Bill Flack turn out to have some things in common.
[437] Tell me about the kinds of philosophies they have and the kinds of thinking styles that you seem to find in common among many of these super forecasters.
[438] I would say the most distinctive attribute of the super forecasters is their curiosity and their willingness to give the idea a try.
[439] And when I say the idea, I mean the idea that forecasting is a skill that can be cultivated and is worth cultivating.
[440] Because it doesn't matter how intelligent you are or how knowledgeable you are.
[441] If you believe that it's essentially impossible to get better at these kinds of tasks, you're never going to try and it's never going to happen.
[442] It's as simple as that.
[443] Super forecasters tend to gather information and update their beliefs in a very particular way.
[444] Phil Tetlock points to Aaron Brown, the chief risk officer of the hedge fund AQR.
[445] Before he was a big shot in finance, he was a big shot in the world of poker.
[446] He was a world class poker player.
[447] And we quote him as saying that you can, you know, you can tell the difference between a world-class poker player and a talented amateur because the world-class player knows the difference between a 60-40 bet and a 40-60 bet.
[448] Then he pauses and says, oh, maybe like 55-45 and 45-55.
[449] Distinguishing more degrees of maybe is an important skill.
[450] Why is that?
[451] Well, the very best forecasters are well-calibrated.
[452] So when they say events are 80% likely, those events happen about 80% of the time.
[453] When they say things are 90% likely, they happen about 90% of the time.
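To make calibration concrete, here is a minimal Python sketch of a calibration check. It is not from the episode, and the sample data is purely illustrative: forecasts are grouped by their stated probability, and each group's stated probability is compared with how often those events actually happened.

```python
# Minimal calibration check: group forecasts by stated probability and compare
# each group's stated probability with how often those events actually happened.
from collections import defaultdict

def calibration_table(forecasts):
    """forecasts: list of (stated_probability, outcome) pairs,
    where outcome is 1 if the event happened and 0 if it didn't."""
    buckets = defaultdict(list)
    for prob, outcome in forecasts:
        buckets[round(prob, 1)].append(outcome)      # bin to the nearest 10%
    table = []
    for stated in sorted(buckets):
        outcomes = buckets[stated]
        observed = sum(outcomes) / len(outcomes)     # how often it really happened
        table.append((stated, observed, len(outcomes)))
    return table

# Illustrative data: a well-calibrated forecaster's 80% calls come true about 80% of the time.
sample = [(0.8, 1), (0.8, 1), (0.8, 1), (0.8, 0), (0.8, 1),
          (0.3, 0), (0.3, 1), (0.3, 0), (0.3, 0), (0.3, 0)]
for stated, observed, n in calibration_table(sample):
    print(f"stated {stated:.0%} -> happened {observed:.0%} of the time over {n} forecasts")
```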
[454] So it makes a difference how frequently you update your forecasts.
[455] If you don't update your forecast reasonably frequently, you're going to fall out of phase with events.
[456] And that means often making adjustments that are relatively small.
[457] You suggest that forecasters should do something that doesn't seem to be very intuitive.
[458] Instead of looking at the particulars of an individual case, you say forecasters should zoom out and figure out how often something has happened historically.
[459] So Daniel Kahneman is probably one of the greatest psychologists of the last hundred years, and he calls that the outside view.
[460] And he says people rarely take the outside view when they do forecasting.
[461] They normally start from the inside and they work out.
[462] But there's a big advantage to you as a forecaster from starting with the outside view and working in.
[463] Take another example.
[464] Let's say you're at a wedding, and you're sitting next to somebody who has the bad taste to ask you, how likely do you think it is this couple is going to stay married?
[466] And you look at the person, it's bad taste and all that.
[467] You see how happy the couple is, and you can see it's a joyous occasion.
[468] You say, I can't imagine these people who are so happy together getting divorced.
[469] I think maybe a 5% chance they're going to get divorced.
[470] Now, if you'd ask that question of a super forecaster, they'd say, well, let's see.
[471] Let's look at the sociodemographics of the couple.
[472] And let's see, what's the base rate of divorce within this sociodemographic group?
[473] Let's say it's 35 or 40% over the next 10 years.
[474] And, let's see.
[475] Okay, I think it's about a 40% chance they'll get divorced in the next 10 years.
[476] Now, that's not the end of the forecasting process.
[477] That's just the beginning.
[478] The real value, though, of doing it this way, starting from the outside and working in, is it puts you in the ballpark of plausibility right away.
[479] 40% is a much more plausible number than 5%.
[480] Now, then you can start adjusting the 40%.
[481] So if you discover things about the couple that suggest they really are deeply bonded to each other and they've known each other a long time and they really understand each other and they've done great things for each other, you're going to lower your probability.
[482] If you discover that the husband is a sociopathic philanderer, you're going to raise the probability.
[483] Those are the inside-view sorts of pieces of data that would cause you to adjust.
[484] Or you might just see them having a small fight and say, well, okay, I'm going to move from 40 to 41%.
[485] And that's one of the interesting things about super forecasters.
[486] They do a lot of updating in response to relatively small events.
[487] And most of the news, most of the time, is what statisticians would call low-diagnosticity news.
[488] It doesn't change things dramatically, but it does change things a little bit.
[489] And appreciating how the news gradually builds up toward one conclusion or another is a very valuable skill.
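For readers who want to see the outside-view-then-adjust habit as arithmetic, here is a small Python sketch. The base rate, the likelihood ratios, and the events are illustrative assumptions, not figures from the episode: you start at the base rate and nudge it with inside-view evidence, including small, low-diagnosticity news.

```python
# Sketch of "outside view first, then adjust": start from a base rate and nudge it
# with inside-view evidence using likelihood ratios (the odds form of Bayes' rule).
# All numbers here are illustrative.

def update(prob, likelihood_ratio):
    """Move a probability up or down; a ratio above 1 raises it, below 1 lowers it."""
    odds = prob / (1 - prob)
    odds *= likelihood_ratio
    return odds / (1 + odds)

p = 0.40                 # outside view: base rate of divorce for this group over 10 years
p = update(p, 0.5)       # inside view: the couple seems deeply bonded -> revise down
p = update(p, 1.05)      # low-diagnosticity news: a small argument -> tiny bump up
print(f"adjusted estimate: {p:.0%}")   # moves, but stays anchored near the base rate, far from 5%
```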
[490] I'm wondering if one reason many of us start with the inside view rather than the outside view is that just at an emotional level, that's how our minds think, that, you know, you see a couple, and you put yourself in the shoes of that couple.
[491] And you try and imagine what's happening in their lives.
[492] And we think in stories.
[493] And we imagine what life must be like for that couple.
[494] And we're trying to see how that story will turn out.
[495] And we're trying to follow the narrative where it leads, rather than do this much more abstract, remote process of saying, let me start with the rough estimate of how often something happens.
[496] There's something about that in some ways that requires us to step outside this narrative frame that we often use to understand the world.
[497] I think that's right.
[498] We're quite readily seduced by stories.
[499] Another example might be, let's say I ask you, how likely is it that in the next 10 years there'll be a flood in North America that kills more than 1,000 people, and ask you to make an estimate on that?
[500] Let's say I ask another person to make the estimate.
[501] How likely is it that there'll be a flood in California that will be caused by an earthquake cracking a dam leading to a massive outflow of water?
[502] Now, when I put the two questions together like that, it's pretty obvious that a flood anywhere in North America due to any cause has got to be more likely than a flood in California caused by an earthquake cracking a dam, right?
[503] The California event is obviously a subset of the more general North American flood thing.
[504] But people don't see it that way.
[505] The California earthquake dam story is more like a story.
[506] It's like a story.
[507] You can actually put it together in a more meaningful way, whereas a flood anywhere in North America is kind of abstract and vague.
[508] People can transport themselves into the world.
[509] And you can imagine it's like a movie, like a Hollywood movie playing out.
[510] And they can see it happening.
[511] Yes, I can see that happening.
[512] And that pumps up the probability, and that screws up your forecasting track record.
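Phil's flood example is, at bottom, a point about probability: the vivid, specific scenario is a subset of the broader event, so it can never be the more likely of the two. A tiny sketch with made-up numbers shows the constraint.

```python
# The specific scenario (an earthquake cracks a dam in California) is a subset of
# the broader event (any deadly flood in North America), so its probability can
# never be higher. The numbers below are made up purely to illustrate that.

p_flood_north_america = 0.10                 # any qualifying flood, anywhere, any cause
p_quake_dam_given_flood = 0.05               # share of those floods caused that specific way
p_california_quake_dam_flood = p_flood_north_america * p_quake_dam_given_flood

assert p_california_quake_dam_flood <= p_flood_north_america
print(p_california_quake_dam_flood)          # 0.005 -- necessarily the smaller number
```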
[513] So again, think of forecasting as a skill that can be improved with practice.
[514] When you're making a prediction, start with the base rate, the outside-in view.
[515] Beware of the risks of storytelling.
[516] Finally, amateurs make three kinds of predictions.
[517] Yes, no, and maybe.
[518] Professionals have many gradations of maybe, and they attach specific probability estimates to their predictions, allowing them to go back and learn where they went wrong.
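One standard way to attach a score to those gradations of maybe is the Brier score: the squared gap between the stated probability and what actually happened. The sketch below uses purely illustrative forecasts to show how a sharp, correct probability beats a hedged 50-50 call.

```python
# Brier score: the squared gap between the stated probability and the outcome
# (1 if the event happened, 0 if it didn't). Lower is better. A flat "maybe"
# can never do as well as a sharp, correct probability. Data is illustrative.

def brier(prob, outcome):
    return (prob - outcome) ** 2

forecasts = [
    (0.9, 1),   # said 90%, it happened          -> 0.01
    (0.6, 0),   # said 60%, it didn't            -> 0.36
    (0.5, 1),   # a flat "maybe"                 -> 0.25 no matter what happens
]
scores = [brier(p, o) for p, o in forecasts]
print(f"average Brier score: {sum(scores) / len(scores):.3f}")
```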
[519] But even if you do all these things, I asked Phil how you can be really sure that predictions that turn out correct are because of good technique.
[520] You know, there's an old trick that they play in business schools to talk about the role of luck, where they divide the class into pairs, and they say, you know, have a coin toss between, between each person.
[521] And then the winner of each of those coin tosses competes against another winner.
[522] And after 12 rounds, there's one person who has, who has declared the winner.
[523] And of course, that person in a business sense might seem to be extraordinarily good.
[524] But really all that's happened is they've happened to win 12 coin tosses in a row.
[525] They've just been very, very lucky.
[526] How do you distinguish between people who are lucky and people who are actually very good?
[527] Well, that is indeed the $64,000 question.
[528] And it comes up in finance too.
[529] I mean, there are some finance professors out there who would argue that the really famous super investors, Warren Buffett or Ray Dalio and people like that, are in some sense like coins that come up heads 20 or 30 or 40 times in a row.
[530] We have a lot of people competing in financial markets, and they're making a lot of predictions over long stretches of time.
[531] So you're going to expect some streaks.
[532] And when we get really streaky performance, we declare we found a genius.
[533] So the skeptics would say, well, Phil Tetlock is doing essentially the same thing here.
[534] He's anointing people who are essentially lucky.
[535] So we built in a lot of statistical checks, but you can never be 100% sure.
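The coin-toss worry can also be put to a quick test in code: give a large pool of forecasters zero skill, so every call is a coin flip, and count how many still look brilliant. The pool size and number of calls below are illustrative, chosen to mirror the twelve-round example.

```python
# Toy simulation of luck masquerading as skill: with 4096 pure guessers each making
# 12 yes/no calls, on average one of them gets every call "right" by chance alone.
import random

random.seed(0)
n_forecasters = 4096
n_calls = 12

perfect_streaks = sum(
    all(random.random() < 0.5 for _ in range(n_calls))   # every call happens to be right
    for _ in range(n_forecasters)
)
print(f"{perfect_streaks} of {n_forecasters} pure guessers got all {n_calls} calls right")
# Expected count is n_forecasters * 0.5**n_calls = 1, which is why tournaments need
# many questions and statistical checks before declaring anyone a genius.
```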
[536] One of the other critiques of Phil's work is that the kinds of questions that superforecasters are answering are not really the questions that people want answered.
[537] Most of us are interested in the big questions.
[538] Who's going to win the current standoff between the United States and Russia?
[539] Super forecasters tend to answer much more narrow questions.
[540] Is Russia going to invade Ukraine in the next 18 months?
[541] I think that's a very fair criticism of the first generation of forecasting tournaments, that we put all of our effort into improving forecasting accuracy.
[542] Now, I'm not going to say the questions that people were trying to answer were trivial, but could we have made the questions more relevant to deep policy questions?
[543] I think the answer is yes.
[544] I think we should be focusing as much on the insightfulness of the questions as the accuracy of the answers.
[545] I'm going to ask you one final question, and this is also, I think, a potential critique of super forecasting, but it comes in the form of a forecast that I'm going to make.
[546] The reason I think many of us make forecasts or look to prognosticators and pundits to make forecasts is that it gives us a feeling like we have a handle on the future.
[547] It gives us a sense of reassurance.
[548] And this is why liberals like to watch the pundits on MSNBC and conservatives like to watch the pundits on Fox.
[549] You know, a more cautious style sort of says, you know, the chance that you're going to die from cancer is 65.3%.
[550] These estimates run up against a very powerful psychological impulse we have for certainty that we actually want someone to hold our hand and tell us you're not going to die.
[551] We don't want a probability estimate.
[552] We want actually an assurance that things are going to turn out the way we hope.
[553] So here's my last question for you.
[554] If someone advises people to do something that runs against their emotional need for well-being and reassurance,
[555] I'm going to forecast that advice, however well-intentioned, however accurate, is likely not going to be followed by most people.
[556] What do you make of my forecast, Phil?
[557] Well, I think there's a lot of truth to what you say.
[558] I would say this.
[559] I would say people would be better off if they were more honest with themselves about the functions that their beliefs serve.
[560] Do I believe this because it helps me get along with my friends or my boss, helps me fit in, helps me feel good about myself, or do I believe this because it really is the best synthesis of the best available evidence?
[561] But you're right, when people sit down in their living room and they're watching their favorite pundits, they're cheering for their team.
[562] It's a different kind of psychology.
[563] They're playing a different kind of game.
[564] So all I'm saying is you're better off if you're honest with yourself about what game you're playing.
[565] Psychologist Phil Tetlock is the author of Superforecasting: The Art and Science of Prediction.
[566] Phil, thank you for joining me today on Hidden Brain.
[567] My pleasure.
[568] Hidden Brain is produced by Hidden Brain Media.
[569] Our audio production team includes Bridget McCarthy, Annie Murphy Paul, Kristin Wong, Laura Kwerel, Ryan Katz, Autumn Barnes, and Andrew Chadwick.
[570] Tara Boyle is our executive producer.
[571] I'm Hidden Brain's executive editor.
[572] Our Unsung Hero this week is Sophia Dawkins.
[574] Sophia has spent years studying the conflict in South Sudan.
[575] She patiently helped us understand the complicated history of the region to get the nuances right.
[576] Sometimes, the most important contribution someone can make to a story is to get things out of the story.
[577] Thank you, Sophia.
[578] If you like Hidden Brain, please consider making a financial contribution to help us make the show.
[579] You can do so by going to support.hiddenbrain.org.
[580] Any amount really helps and we truly appreciate your support.
[581] I'm Shankar Vedantam.
[582] See you soon.