Freakonomics Radio XX
[0] Hey there, it's Stephen Dubner.
[1] You are about to hear a special episode of our other live podcast.
[2] Tell Me Something I Don't Know.
[3] If you're a fan of Freakonomics Radio, I think you'll really like this episode.
It features some of the smartest people in behavioral science, some of whom you'll recognize from previous Freakonomics Radio episodes, like Angela Duckworth, the author of Grit.
[5] You can subscribe to Tell Me Something I Don't Know wherever you get your podcasts.
[6] Also, please join us for our next round of live tapings in New York City in early October.
You can get tickets at TMSIDK.com.
[8] You can also click Be On the Show for a chance to come on stage and tell me something I don't know.
[9] Thanks for listening.
[10] Why do I read?
[11] Why do I have conversations?
[12] Why do I have to go to school?
[13] Why do I pay attention?
[14] Why do I pay attention?
[15] Because I want to be amused.
[16] Because I want to get outside my comfort zone.
[17] But mostly.
Mostly because I want to find out stuff.
[19] Find out stuff.
[20] Find out stuff.
[21] Because I want you to tell me something I don't know.
[22] Good evening, I'm Stephen Dubner, and this is Tell Me Something I Don't Know.
[23] Tonight, we are coming to you from Philadelphia, Pennsylvania.
[24] We have got an audience full of very bright people who come on stage and tell us fascinating stuff.
[25] If it goes as planned, we will all be a bit smarter by the time we're through.
Joining us tonight as co-host: the author of Grit: The Power of Passion and Perseverance.
[27] She's a psychology professor at the University of Pennsylvania.
[28] Please welcome Angela Duckworth.
[29] Angela, let's see what we know about you so far.
[30] We know that when you were a kid, your dad liked to tell you, Angie, you're no genius.
[31] But then you proved him wrong years later by winning a MacArthur Genius Award.
[32] Nicely done.
We know that before becoming an academic, you were a management consultant and a school teacher, and that you have advised the Obama White House, the World Bank, NBA and NFL teams, and Fortune 500 CEOs.
We know that your colleagues have called you, and I quote, the most extroverted person, the quickest learner, and the fastest thinker they've ever met.
[34] And I just want to say we'll be the judge of that tonight.
[35] Yeah, no pressure, low expectations.
[36] We're also told that you like to swear like a sailor.
[37] So I'm looking forward to that as well.
[38] Yes, I will be censored.
[39] It's going to be bad.
[40] All right, Angela Duckworth, that's what we know.
Tell us something, please, that we don't know about you.
In addition to being a MacArthur, I was captain of the football cheerleading squad at Cherry Hill High School East.
[44] I needed to go against the stereotype of an Asian female, so I went out for cheerleading.
[45] I can see that you could bring the cheer, though.
[46] You've got a lot of enthusiasm.
[47] I was above average.
[48] Angela, we are very happy and flattered to have you here tonight for telling me something I don't know.
[49] Now, I should say, a special episode of our show, and we've got you, Angela, to thank.
We're here because today you held a conference at Penn with a bunch of world-class academics, and then we dragged a bunch of them over here to be our guests on Tell Me Something I Don't Know.
Now, just because you know all of them doesn't mean you can't rough them up, okay?
[52] They are going to come on stage, and they'll each tell us some interesting fact or idea.
[53] You and I will hear them out.
[54] We'll poke and prod a bit.
[55] And then our live audience at the end of the night will pick a winner.
[56] Now, victory will be based on three simple criteria.
[57] Number one, did they tell us something we truly did not know?
[58] Number two, was it worth knowing?
[59] And number three, was it demonstrably true?
To help with that demonstrably true part, would you please welcome our real-time fact-checker, Mike Maughan.
Mike is head of global insights at the software company Qualtrics.
[62] He is one of the smartest and truthfully nicest people that we know.
[63] Now, Mike, all that said, how do we know you've got the chops tonight to keep our super smart guests in line?
Well, I get to work with some of the leading researchers and highlight some of the most interesting things that they've ever found using Qualtrics.
[65] That includes how seeing pictures of Marilyn Monroe versus Abraham Lincoln caused you to eat healthy or unhealthy foods.
[66] Why men with facial hair are more likely to hold traditional gender stereotypes, and why Republican women find Democratic men more attractive.
[67] Okay, can I hear a little bit about the Democratic men, please?
[68] The finding on Democrats is that good girls like naughty men.
And so Republican women were more likely to think that the exact same person was attractive if he was a Democrat, because it gave them a sense of, you know, going against the grain, choosing something that was wrong, and that was attractive to them, the devilish nature of good girls.
[70] All right, Mike Maughan, very happy to have you here tonight.
[71] It is time now to play.
[72] Tell me something I don't know.
[73] We'll call it the Ultra Egghead Edition.
Would you please welcome our first guest, Colin Camerer.
[75] All right, Colin, please tell us what you do?
[76] I'm a behavioral economics and neuroscience professor at Caltech.
[77] Okay, Colin, I'm ready.
So are Angela Duckworth and Mike Maughan.
So what do you know that's worth knowing that you think we don't know?
[81] I know a mathematical formula for when people are intellectually curious.
[82] It applies to adults and babies.
For example, in the late 1980s, when I taught at Penn, actually, a band that I invested in played on this very stage.
So if you're having a feeling of curiosity about the name of that band, that feeling of curiosity is in your brain somewhere, and I can tell you where.
[85] So you can tell us where in the brain resides curiosity, essentially, yes?
[86] Yes.
[87] And you know this how?
From scanning brains of people at Caltech, who are unusual.
[89] If you've seen the Big Bang Theory, that's basically a documentary about Caltech.
[90] But our imaging results have been replicated a couple of times in Germany, and I think a couple of other studies.
[91] So these are Caltech students being made curious how, exactly?
So the curiosity we study is something called epistemic or intellectual curiosity.
It's trivia questions.
So we ask questions like: what animal, other than humans, can get sunburned?
That's a question that provokes a lot of people to say, I'm curious.
Not only do they say it, they're willing to wait longer to find out the answer, and their pupils dilate, open up, right before they find out the answer, as if they need the pupils dilated to, like, drink in the information.
Well, don't we all kind of now want to know what animal besides us gets sunburned?
[98] Now let's forget about everything else and just, yeah.
[99] Mike, you got a computer there, buddy.
[100] I hope it matches our answer.
[101] This study was a little while ago.
Maybe there are some new animals.
Pigs, is that the answer?
[103] Who needs a computer?
[104] We have an audience here.
[105] Somebody from the audience knew that.
[106] Volunteer fact checker, thank you.
[107] Wisdom of the crowds.
[108] All right, so wait, Colin, so what you're asking us is to try to figure out which part of the brain houses curiosity or what happens when we exercise that part of the brain?
[109] What's the mathematical formula?
[110] So there's a mathematical formula, but there's also a location, right?
[111] These are two different things?
Or are there both of these things?
Well, you compute the mathematics of a particular question that says this one is going to be high curiosity and that one low curiosity.
[114] And then you do a bunch of fancy math.
[115] And you ask where in the brain is blood flowing into regions to a degree that's associated with the level of curiosity.
[116] We call that the brain encoding a number.
[117] I can tell you what some of those regions are.
And then that led to a very interesting prediction too, which was very surprising to us and others.
[119] Okay, so I don't know the mathematical formula.
[120] I do know a little about functional MRI.
But is it... it is not the dorsolateral prefrontal cortex?
No.
That's what I was going to say.
You were going to say that, right?
It is not the insula.
That's usually what people guess, the dorsolateral.
Yeah, I know, so cliché, right?
So I'm sorry, that was conventional.
Come on, come on, bring something special.
Nucleus accumbens?
Nucleus accumbens, yes.
Bang, bang, bang, bang.
Got it.
Anything else?
Anything else light up in the brain?
Yes, parahippocampal gyrus.
I can't believe I didn't get that one.
Actually, I forgot about an important one called the midbrain.
The midbrain is boringly named.
Like, where is it?
[124] What is the evidence, if any, that curiosity is intrinsically and broadly good for us?
[125] I asked my colleagues who study disorders.
[126] Is there a group of people who are deeply uncurious?
[127] For example...
[128] Cheerleaders.
Cheerleaders are really curious.
[130] Actually, my colleague said amygdala patients.
He studied patients who have damage on both sides of the amygdala.
[132] These patients are very uncurious about their disease and about a lot of other things.
So although we didn't see amygdala activity, lots of these regions that we're talking about, amygdala, nucleus accumbens, are involved in lots of different things.
[134] You know, that's just an area which is active because finding out things you're curious about seems to be rewarding.
[135] So how do you make people more curious?
[136] Okay, here's the formula.
We asked people in these studies what animal gets sunburned, and we asked them, how likely do you think it is that you know the answer?
[138] It turns out if they have no clue, they're not very curious.
[139] Once they know a little bit, they get more curious.
And the sweet spot is 50-50, when you're equally likely to think I know or I don't know.
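Camerer doesn't spell out the formula on stage, but the 50-50 sweet spot he describes is the signature of an inverted-U relationship between your estimated probability of knowing the answer and how curious you feel. One common stand-in for that shape (an assumption here, not his published model) is the binary entropy of that probability, which peaks exactly at 50-50 and falls to zero at certainty either way:

```python
import math

def curiosity(p_know):
    """Illustrative inverted-U curiosity score: the binary entropy of the
    subject's estimated probability of already knowing the answer.
    A sketch standing in for Camerer's formula, which the episode
    doesn't state explicitly."""
    if p_know <= 0 or p_know >= 1:
        return 0.0  # total ignorance or total certainty: no curiosity
    return -(p_know * math.log2(p_know) + (1 - p_know) * math.log2(1 - p_know))

# Curiosity peaks at the 50-50 sweet spot and falls off toward certainty.
for p in (0.05, 0.25, 0.5, 0.75, 0.95):
    print(f"p(know)={p:.2f}  curiosity={curiosity(p):.2f}")
```

On this sketch, a trivia question you feel you have no clue about, or one you're sure you know, scores near zero, while one you're torn on scores highest, matching the pattern described in the studies.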
So that's why you set us up to guess what the band was that played on the stage of the Trocadero when you were both a Penn professor and also somehow a music producer?
[143] Correct.
So if you were curious, your nucleus accumbens and parahippocampal gyrus were active.
And if you were wrong, as we might find out soon, then the curiosity actually acts as a memory enhancer; it helps you remember.
[146] Okay.
I actually didn't have a guess, because I can't name a single band.
From which decade is this, by the way?
[148] 90s.
I taught here from '83 to '91.
I think it was around '87.
[151] Journey, Asia.
[152] I wish, journey.
[153] I can't name any other bands.
[154] Stephen, help me. Does it rhyme with Fred Filckplan?
[155] Yes.
[156] Bing, bang, bang, bang.
The Dead Milkmen.
That should pay for a lot of vacation homes, I think.
Well, shacks, actually, shacks.
And not on the nicest part of the Jersey Shore.
So Mike Maughan, my midbrain and other parts are hurting with how much Colin Camerer has taught us tonight.
[162] Can you tell us anything factual or otherwise?
[163] A few things.
[164] Hippos, elephants, freshly shorn sheep, they all can get sunburned.
[165] Oh, really?
[166] Just for the record.
Anyway, so curiosity does indeed hit the reward center of the brain, but so does eating Double Stuf Oreos.
Trust me, I know.
[168] And on curiosity, it is indeed a memory enhancer, and that's why we take pains to avoid being wrong.
[169] So our brains are designed to be more easily stimulated than they are satisfied.
[170] And so curiosity stimulates the brain.
[171] The easiest way to maybe think of this is in how most of us remember names.
So when you meet someone, and two minutes later they say something like, great to meet you, slugger, you have a pretty good idea that they have literally no interest in who you are or in getting to know you ever again.
[173] So does that help, buddy?
Thank you, Mike.
And Colin Camerer, thank you so much for playing Tell Me Something I Don't Know.
[176] Thank you.
[177] Very well done.
[178] I think we're in for a pretty good night, Angela Duckworth.
[179] That was good.
[180] That was good stuff.
[181] I'm getting warmed up.
Would you please welcome our next guest, Ayelet Fishbach?
[183] Hello there.
[184] Why don't you introduce yourself to our fine crowd here, please.
[185] I'm a professor of behavioral science and marketing at the University of Chicago.
[186] We've heard of the University of Chicago.
[187] Okay, what do you have to tell us tonight then, please?
Well, you may have heard about the marshmallow test and the idea that putting immediate rewards aside is the way to accomplish our life goals.
Well, I find in my studies that immediate rewards are actually also critical to achieving our long-term goals.
Success at, actually, at exercising, at healthy eating, is about the immediate rewards that you are getting from pursuing these goals.
So the famous Walter Mischel marshmallow test: you put a kid in a room with a marshmallow, you tell them not to eat it, and basically the kids who eat the marshmallow become felons and the ones who don't, right?
[192] So you're saying that instant gratification is important and necessary for long -term goals?
[193] Yes.
[194] Angela, this seems pretty much...
Counter to my entire life's work?
[196] Is that what you're going to say?
Well, I was hoping you would not notice.
[198] No, I noticed.
[199] I noticed.
So, Ayelet.
[201] How can it be that immediate gratification and doing what feels good right away is going to help you be healthy, go to college, have a good job, safe retirement?
[202] Can you explain this?
Yes, so immediate gratification is important for, is predictive of, any behavior.
Immediate gratification is what predicts eating the marshmallow, in which case one might not make it to college.
But immediate gratification also predicts how much one can stick to studying, okay?
The student that likes studying will study more than the student that doesn't like studying, whereas the student that finds studying more important will not necessarily study much more than the student that finds it less important.
And so immediate gratification is just important for everything.
What I think some analyses miss is that we are very much responding to present rewards.
[208] And so the way to get people to do anything is basically by including present rewards.
[209] Isn't that just another way of saying that, you know, incentives matter along the way?
Wouldn't an economist just take that tack?
[212] Well, incentives can be of two types, so they can be immediate and they can be delayed.
[213] Actually, we ran one study in which we asked people to do a task that either pays more or less, or is slightly more interesting versus slightly more boring.
[214] Now, if you ask people to predict how much they can stick to this task, they tell you that they can stick longer to a task that pays more.
[215] But when you measure the actual persistence, the money didn't make much of a difference.
[216] People were able to stick longer with a task that was more interesting.
[217] So there are kids who grow up to go to University of Chicago and Caltech, and they probably are the kids who were like, oh my God, there was extra Latin homework.
[218] Awesome.
[219] Like, so excited.
[220] But what about the kids who don't actually find Latin homework more immediately gratifying than Snapchat?
[221] I don't think that kids can learn if they're not interested.
[222] Wouldn't you agree?
[223] Yes, I would actually agree.
[224] And I do think that this is not counter to my life's work.
I won't have anything to do tomorrow if it's entirely counter.
[226] I do know that in your book, Angela, it's passion and person, right?
[227] And passion is the first step toward developing the thing that you're going to be gritty about.
[228] So I don't see them as conflicting.
[229] I do, however, wonder about a separate conflict, perhaps.
So, Ayelet, I recall that you were on a program called Freakonomics Radio talking about waiting in lines and how waiting helps develop patience, which is, as we all know, a virtue, right?
[231] So doesn't that piece of your work run a little bit counter?
[232] Her life's work is counter to her life's work.
[233] No. And let me explain why.
[234] It is a key to success.
[235] It's almost the definition of success to be able to pursue the long -term rewards.
[236] What we find is that the way to get there is using immediate rewards.
Now, there are other things that can help us.
To the extent that we remove from the environment the immediate temptation, to the extent that we make people wait because they have to wait, that makes them better able to go for the long-term success.
Mike Maughan, I guess we'd call this the upside of instant gratification.
[239] What more can you tell us?
[240] So delayed gratification is obviously a thing, but as you've mentioned, people do want to see progress to keep them motivated throughout the process.
[241] For example, those that want to lose weight, I tried exercising once, but I found out I was allergic to it.
[242] My skin flushed, my heart started racing, I got sweaty, I was out of breath.
[243] Overall, it seemed quite dangerous.
So I did find this joint Princeton-Harvard study talking about immediate versus delayed gratification.
[245] And the most useful part of this study was this line that said the finding supports the growing view among economists that factors other than pure reasoning often drive people's decisions.
[246] Oh, really?
[247] Thanks, science, that was helpful.
[248] Mike, thank you.
And Ayelet Fishbach, thank you very much for playing Tell Me Something I Don't Know.
Would you please welcome David Laibson.
So David Laibson, would you introduce yourself, please?
[253] Good evening, everyone.
[254] I'm the chair of the Harvard Economics Department.
[255] I'm a behavioral economist.
[256] My academic expertise is procrastination.
[257] David, very happy to have you.
[258] Tell us something we don't know, please, sir.
So I'm going to get really dry here.
Let's talk about IRAs and 401(k)s.
[261] You got those?
[262] You got those retirement savings accounts?
[263] Yeah, yeah.
[264] Shout out for a few dollars away.
So in the U.S., those accounts are basically liquid.
There's only a 10% penalty, and for lots of different activities, you can take the money out with no penalty at all.
So we in the U.S. have highly liquid retirement savings, but when we look at other countries, all that money is locked down.
[269] It's absolutely illiquid.
[270] You can't touch it until you retire.
And it turns out that for every dollar that goes into the 401(k)/IRA retirement savings system in the U.S., 40 cents leaks out before people reach retirement.
[272] So we've got a leaky boat.
[273] A leaky boat, which you're implying leads to a lot of cat food in the end, yeah?
You're saying that people spend their retirement savings early, and then when it comes time to need it, what's left?
About half the population in the U.S. now is retiring with a retirement account, and the other half doesn't have anything.
[276] Maybe they've never had an account, or maybe they had one, and let the money leak out at some point along the way.
[277] Now, all of that leakage isn't a disaster.
[278] Some of it's coming out for good reasons.
[279] But, you know, I think looking at those numbers gives one pause.
[280] This is a very serious topic, but I want to know what you meant by cat food.
[281] Well, he was talking about cat food.
[282] Is that related to a...
[283] You're like, and then there's a cat food.
[284] Yeah, that's a pretty classic.
[285] You're too young, Angela, I think.
[286] Is this a, what is this?
[287] Our generation knows that eating cat food means you're retired and don't have enough money.
[288] Oh, literally eating the cat food yourself.
[289] Oh, that's really sad.
[290] That's terrible.
[291] But maybe it's just that now cat food is so much better that it's considered a good outcome.
[292] But David, wait, so let me ask you this.
[293] It sounds like you're advocating for either larger penalties or different rules.
[294] Are you advocating for that?
Probably not larger penalties, because then I think people would still take the money out and just pay those larger penalties.
It would probably be a sort of regressive tax.
[297] Maybe we should have more money in these retirement accounts and maybe more money that's locked up.
So, for example, maybe the employer match should be illiquid.
[299] Or maybe you can take out the money you put in, but not the interest on it.
[300] So there should be some liquidity, but maybe not all of it should be liquid.
[301] But my question then would be, if you make it more locked, essentially, in some fashion, wouldn't there be people then who decide simply forget it?
[302] If I can't access that money, then I'm going to choose to not put it in the first place.
[303] It's certainly a possibility, but there's no evidence that the liquidity of the accounts is a big driver in people's willingness to put money into those accounts.
[304] But again, I'm not proposing that it be totally locked up.
Maybe just part of those funds would be made illiquid, so that people retire with something in these accounts.
Can you tell us anything about the people who are more likely to make early withdrawals?
[307] So it tends to be people with lower levels of income, tends to be people with more financial distress.
So we have another kind of facet of the economy, which is that the people who are getting the most out of these 401(k) accounts tend to be the people who are already doing very well in the economy.
[309] So it's kind of a double victory for them.
[310] They're thriving with high incomes, and they're thriving by getting the benefits of these tax -deferred retirement accounts.
[311] This is very sad.
[312] This is like a very sad story.
[313] And people were eating cat food and that's sad.
[314] Is there anything else that's like super optimistic or possibly easy retirement savings?
[315] Sad.
Well, I'll give you maybe a small bit of brightness.
[317] Not much.
[318] A little bit.
So when you auto-enroll people in retirement savings plans, they don't tend to opt out.
And when you auto-enroll them at a 3% savings rate or a 6% savings rate, they tend to stay in and not opt out.
[321] So it looks as though people have the capacity to save a little bit more, and we're not helping them in many cases to do that.
[322] So we can partially address this problem.
[323] I think the easiest way to address it would be to do more auto -enrollment in retirement savings plans.
Roughly about half of the private-sector workforce now has a 401(k) plan at work, but the other half of the working population doesn't even have a plan at work right now.
[325] We need to migrate this benefit to the other half of the private sector workforce.
[326] And, you know, I'm optimistic that we're going to go down that path in the decades ahead.
[327] Why?
[328] Well, the idea that people need help, believe it or not, this is another one of those can't believe it took economists this long to figure it out.
The idea that people may need help saving for retirement really became obvious only in the 1990s, when the first generation of 401(k) plans and IRA plans weren't working.
[330] And so there's a growing awareness that people need a nudge.
[331] People need a little help to get into these savings plans.
[332] And it was actually a bipartisan effort.
Back under Bush the son, there was something called the Pension Protection Act, which kind of automated retirement savings, and both sides of the aisle got behind it.
[334] So there is a tradition of bipartisan support for making saving easier.
[335] So I see a future where that alliance that occurred under the Second Bush administration might reform and we'll see more of this in the future.
[336] Really interesting.
Mike Maughan, anything to add?
[338] There's reason for hope.
I'd like to share with you that there is grain-free herbed duck confit with sweet potato cat food.
[340] Oh.
[341] Awesome.
Thank you, Mike.
And David Laibson, thank you so much for playing Tell Me Something I Don't Know.
Would you please welcome our next guest, Max Bazerman.
[346] Would you please introduce yourself?
[347] Sure.
[348] I'm a professor at the Harvard Business School and I co -direct the Center for Public Leadership at the Harvard Kennedy School.
[349] Very good.
[350] So it sounds like we have something we can learn from you.
[351] What do you have for us tonight?
Well, I thought I'd try a demonstration.
I'm going to give you a problem to help me solve: you're picking between four different investment funds, and I'm going to describe the four of them to you and ask you which one you want to invest in.
The first fund, over the last nine years in total, has outperformed the market by about 30%.
We'll call that A. Fund B has outperformed the market by about 45% over the last nine years in total, with moderately erratic year-to-year performance.
The third fund, Fund C, has outperformed the market by 65%, but with absolutely no volatility in performance over the last nine years.
So a nice, smooth trend.
And finally, there's Fund D, which has outperformed the market by 70%, but with a lot of erratic behavior.
So: A has outperformed the market by 30%; B by 45%; C by 65% with very smooth returns; and Fund D by 70% with a lot of erratic differences across years.
[359] And with that, I need you to vote.
[360] How many of you are going with A, the tobacco fund?
How many of you are going with B, 45%?
Okay, we've got about 10% of the audience there.
[363] How about Fund C?
[364] That's a, that's very popular, very, very popular.
[365] And how about Fund D?
[366] Okay, moderately popular, but not nearly as popular as C. So, Stephen, you're the one who wrote a book on economics, I believe.
So can you tell us, is it possible to dramatically outperform the market by 65% over a nine-year period of time with absolutely no volatility in performance?
I would have to say, considering how nicely you've constructed that question for me, in which the only possible wise answer is no: I would say no, that's not possible, Max Bazerman.
[369] Excellent.
[370] So a quick fact is that it's not possible to dramatically outperform the market over a nine -year period of time with absolutely no volatility.
[371] And yet many of you were perfectly comfortable investing in that.
[372] And what I want to tell you, which will make you a little bit more upset about your choice of fund C, is that I was describing Bernie Madoff's Ponzi scheme and the vast majority of you picked it.
[373] The other thing that's interesting, when I ask after they picked Fund C, is there anything wrong with any of these funds?
[374] Many people in the audience will tell me that Fund C isn't possible.
[375] So people have access to that information.
[376] But the question, what do you invest in, leads people to look for high returns, low volatility, and they basically have the idea of an ethical problem fade from their existence.
[377] So we have the capacity to, you know, to notice, but so often we fail to notice critical information in our environments.
[378] And while many of us have been trained to focus, we also need to notice.
[379] And when something doesn't quite fit, we need to look further and ask some questions to find out what's going on.
[380] Too often we don't, and we stick to a very superficial level of noticing.
[381] That's so interesting.
[382] In this case, is what you're calling a failure to notice tied to you know, a desire for it to be true?
[383] I think partially.
[384] And the real world is probably even more accurate to say that people wanted it to be true.
[385] But I'm guessing there are cases contrary to that, right?
[386] If I am scared that I'm very ill and I have what might be a symptom of that, I might, quote, fail to notice as well, right?
[387] Yeah.
[388] So I think that there's things that prompt us to be fearful.
[389] But it's already on our radar screen and something makes it salient and makes us hypervigilant about it.
But there are many, many more cases, from 9/11 to every possible financial scandal that we've read about, where there were dozens, if not hundreds or thousands, of people who had access to the critical information that should have allowed them to notice that there was a problem and didn't notice.
So we can look back on things like Bernie Madoff or 9/11 and say it was a failure of noticing, not a failure of knowing.
[392] So what is it now that we are not noticing that we should be noticing?
[393] What a lot of people don't notice, despite the fact that they see it occur all the time, is that your frequent flyer miles are becoming worth less and less and less.
[394] Use your miles, even if you need to give them to your cousin who you don't like.
[395] That's useful.
[396] That's actually extremely useful.
But more broadly, I find it fascinating that society trusts institutions like auditors or credit rating agencies to provide us with independent advice, and we allow these firms to exist in a form that creates incentives for them to simply want to please their clients and not do the very job that they exist to do in the first place.
[398] How can any one of us get better, generally, at noticing?
[399] What I find is that if you simply ask yourself, what's going on, if you pick up your head and ask, what are the key challenges, threats, opportunities around us, you start to see them very, very regularly.
[400] So my reaction is in life, when we can't figure out what's going on, that's a really good hint that we should look further.
Thank you so much, Max Bazerman.
[402] Thank you all.
[403] Nicely done.
[404] Would you please welcome our next guest, Katie Milkman.
[405] Katie Milkman, would you please introduce yourself?
[406] I'm a professor at the Wharton School at the University of Pennsylvania, and I study decision -making.
[407] Very good.
[408] We like decisions.
[409] Tell us something we don't know about them, please.
Well, I'm going to take a page from Colin Camerer's book and leave you a little bit curious.
[411] And I'll tell you that there is a certain type of news article, a certain section of the newspaper, that's more likely to go viral.
[412] You're more likely to email a story if you read it in this section than in any other.
[413] Can you guess what that section might be?
[414] Would it be the cute cat section of the newspaper?
[415] Or the cat food section?
Possibly, if there were a cute cat section of the newspaper, that would win.
[418] But since there isn't, you'll have to choose a more typical category.
[419] Which section of the newspaper, Angela?
[420] I guess op-eds.
[421] Don't people, like, forward op-eds?
[422] Or is that just my friends?
[423] People do forward op-eds, but that's actually not the section that I was going to mention.
[424] About 20% of the stories on the New York Times homepage get forwarded to a friend by email.
[425] From this section, about 30% of them go viral.
[426] What about health?
[427] Like, how to lose...
[428] You're getting closer.
[429] Getting closer, very warm.
[430] A section like health?
[431] The business section?
[432] Is the business section like health?
[433] It's not the business section.
[434] It turns out that you can predict what kinds of stories will go viral based on the kinds of emotions that they produce.
[435] More positive stories are more likely to go viral.
[436] Stories that make your heart race are more likely to go viral.
[437] And stories that inspire awe are particularly likely to go viral.
[438] Does that help you guess which section?
[439] Oh, Sunday Styles?
[440] That's like my favorite section.
[441] Is it that?
[442] No. It's the Science Times.
[443] Oh.
[444] No way.
[445] Really?
[446] It skyrockets.
[447] So about 20% of the articles that appear in the New York Times make the most emailed list.
[448] And 30% from the Science Times make the most emailed list.
[449] So what kind of science is most likely to be viral?
[450] And I guess really, why?
[451] I mean, why do people care about it in that way?
[452] Yeah.
[453] Well, the kind of science that's most likely to go viral is good news about science.
[454] Right.
[455] We discover something exciting that feels like it could make our lives
[456] better.
[457] An example of a story that went particularly viral was a story about what happy people don't do.
[458] So it was based on research about people who are happy and all of the habits they have that maybe lead them to be happy.
[459] You can see how that's interesting, useful, awe-inspiring, sciencey, positive, and that was a big winner on the internet.
[460] So Katie, here's what I want to know.
[461] How did you determine it?
[462] What were the data overall and how did you measure?
[463] Well, I'm a nerd.
[464] So what we did is we built a web crawler, which is a little automated machine that goes and captures data on the Internet.
[465] And our little web crawler went and captured every story that appeared on the New York Times every 15 minutes, both on the homepage and on the most emailed list to figure out what was its rank.
[466] And we did this for three months, and then we crunched the numbers and looked at which kinds of articles were the most likely to make it to the most emailed list: were they lead stories in the newspaper, you know, did they have a photograph that appeared with them, and so on.
That careful analysis led us to conclude that more positive stories are more likely to go viral.
[468] In general, stories that make your heart race, so stories that make you angry or anxious, but not stories that make you sad.
[469] When you say you controlled for whether they appeared with a big photo and so on, are you also controlling for how much time a story spends on the homepage, for instance, and how do you measure that?
[470] Because that's not so easy.
[471] Exactly.
[472] Well, since we're snapping a photo, essentially, of the homepage every 15 minutes, we could see exactly how many minutes in units of 15 a story spent in a given spot.
[473] And we can compare two stories that got exactly the same amount of attention in terms of the marketing that the paper gave it and see which one actually was more likely to make the most emailed list.
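Katie's description of the data collection, snapshotting the homepage every 15 minutes and tallying how long each story stayed up, can be sketched in a few lines of Python. This is a hypothetical illustration, not the study's actual code: the LinkExtractor class and the snapshot function are stand-ins of my own for whatever the real crawler did.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the text of every <a> tag, as a stand-in for headlines."""
    def __init__(self):
        super().__init__()
        self.in_link = False
        self.headlines = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link = True
    def handle_endtag(self, tag):
        if tag == "a":
            self.in_link = False
    def handle_data(self, data):
        if self.in_link and data.strip():
            self.headlines.append(data.strip())

def snapshot(html, seen_minutes):
    """Record one 15-minute observation for each headline on the page."""
    parser = LinkExtractor()
    parser.feed(html)
    for headline in parser.headlines:
        seen_minutes[headline] = seen_minutes.get(headline, 0) + 15
    return seen_minutes

# One simulated snapshot; a real crawler would fetch the live page in a
# loop, sleeping 15 minutes between fetches, and record the most-emailed
# list the same way so exposure and virality can be compared.
exposure = snapshot('<a href="/x">Story A</a> <a href="/y">Story B</a>', {})
```

Running this repeatedly over three months, as Katie describes, gives each story's homepage exposure in units of 15 minutes, which is exactly the control she says the comparison needs.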
[474] Excellent.
[475] Mike, Katie Milkman's been telling us about what makes viral viral.
[476] You agreeing over there?
[477] So first of all, I just want to say that nothing in the New York Times actually goes viral.
[478] Like, can we just be serious?
[479] It's not like it's a cat video.
[480] You mean compared to cat videos.
[481] Right.
[482] But if we actually knew what caused something to go viral, we'd be billionaires.
[483] And every marketing department in the world is guilty of just saying, well, let's just make a viral video to get the word out.
[484] Like, thanks, that's really helpful.
[485] I'll just go do that.
[486] But I will say the top two most shared articles on Facebook last year, both follow this pattern exactly, right?
[487] So the most shared article was "New Alzheimer's treatment fully restores memory function."
[488] So we're on to something.
[489] Wait, what's the other one?
[490] I'm curious.
[491] The next one was, how sensitive is your OCD radar?
[492] Hey, Katie Milkman, thank you so much for playing.
[493] Angela, before we get back to our unbelievable guests tonight, I have got some questions for you, some lightning round questions that we wrote just especially for Angela Duckworth.
[494] You ready?
[495] I'm ready.
[496] In 30 seconds or less, how can someone get grittier?
[497] Grit is passion and
[498] perseverance for very long-term goals.
[499] It takes more than 30 seconds.
[500] Fair enough.
[501] Who's the most inspirational person you've ever met?
[502] The person who leaps to mind is my idol Carol Dweck, because if I'm really, really good, I'll be half as good as she is.
[503] Stanford psychologist, growth mindset.
[504] Angela, tell us about someone that you would consider incredibly successful despite a lack of grit.
[505] When I was interviewed by Larry King, he said he only scored one out of five on my scale. He actually scored the lowest that you could on all of the questions on my grit scale that were about having the same interest over time, because he said, look, every week I have a different guest.
[506] I get very bored.
[507] I need a new person.
[508] Well, he's been very interested in suspenders for a long time.
[509] And here's the thing.
[510] I was like, Larry, you've been interviewing people, okay, different people, but basically interviewing people since you were nine.
[511] So you actually are gritty.
[512] So I cheated.
[513] He's actually pretty gritty.
[514] Yeah, you did cheat.
[515] I cheated.
[516] Angela, we know that your parents were Chinese immigrants, and yet, as you have written, quote, against stereotype, you can't play a note of piano or violin.
[517] Really what I want to know is, was it that you wouldn't or you couldn't?
[518] It was that I couldn't.
[519] So in addition to grit, there's talent.
[520] And I didn't have any of it for music.
[521] The world does not need any more Asian females from South Jersey playing piano and violin.
[522] Like none.
[523] They need more cheerleaders.
[524] Now, despite that, we also know that you signed up your own daughter for ballet to make her grittier after you saw her fail at opening a box of raisins.
[525] So my question is, did she ever learn to open raisins?
[526] Yes. My 14-year-old, Lucy, can now, with effort, open her own box of raisins, and we're holding her to that standard.
[527] Congratulations.
[528] That's a great family accomplishment.
[529] So, Angela, we know that you graduated magna cum laude from Harvard, but that your husband, a real estate developer, graduated summa cum laude from Princeton.
[530] So my question is, is he really smarter than you, or is this just because Princeton is really easy?
[531] Princeton is really easy.
[532] As noted earlier, you are allegedly a world-class swearer.
[533] I think I'm passionate.
[534] I think it's just me expressing my passion for my, you know, whatever I'm saying.
[535] What do you do if your kids swear?
[536] I don't actually, I don't care if they swear.
[537] I think swearing's okay.
[538] You know, in the grand scheme of things, kids are going to do some things that are bad.
[539] Swearing is not one of them.
[540] And finally, Angela Duckworth, what is a duck worth?
[541] Priceless.
[542] I agree.
[543] Nicely done, Angela.
[544] All right, it is time to get back to our game.
[545] Would you please welcome our next guest, Kevin Volpp.
[546] Kevin, tell us what you do, please.
[547] I'm a physician, a professor at the School of Medicine and the Wharton School, and the director of the Center for Health Incentives and Behavioral Economics at the University of Pennsylvania.
[548] Excellent.
[549] I bet you know a ton of stuff.
[550] We do not know, so tell us one.
[551] Well, so let's imagine you're a large employer, and you have a population of people who smoke, and you'd like to help them quit smoking.
[552] You know that cigarettes cost about $10 a day if you smoke a pack a day, so each person could actually save $3,650 a year if they just quit.
[553] And what was interesting is we found in two different studies that incentives of $750 to $800 actually tripled smoking cessation rates, which on the surface doesn't make a lot of sense because this is a much smaller amount than people could have saved on their own.
[554] So if you give them what's relatively a little bit of cash up front, relative to what they'd save by quitting smoking on their own, they're much more likely to do it.
[555] Yeah, I think there are probably three reasons for this.
[556] One is that the savings from quitting, you experience that over a long period of time.
[557] It's a discounted revenue stream.
[558] People always discount the future to a large degree.
[559] Secondly, there's a mental accounting salience issue: if somebody offers me this money, it's very visible. A third issue is that giving me this money up front really helps to offset the procrastination I'd otherwise have. A lot of people want to quit. About 70% of Americans say they want to quit, but only about 3% per year actually do. It's very hard to quit smoking. There are a lot of immediate costs, and I'm discounting the benefits far in the future. If I pay people an incentive, that helps to move some of the benefit into the present.
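Kevin's first reason, that the savings are a discounted revenue stream, can be made concrete with a toy calculation. Nothing below comes from the study itself: the 40% annual discount rate and the present-bias factor are assumptions chosen purely for illustration.

```python
# Toy illustration: the $3,650 a pack-a-day smoker saves per year arrives
# as a stream of small daily amounts, so its present value is well below
# its face value, and with present bias it shrinks further.

DAILY_SAVING = 10.0   # dollars saved per smoke-free day
ANNUAL_RATE = 0.40    # assumed steep annual discount rate (illustrative)
PRESENT_BIAS = 0.3    # assumed down-weighting of all future payoffs (illustrative)

# Convert the annual rate to an equivalent per-day discount factor.
daily_factor = (1 + ANNUAL_RATE) ** (1 / 365)

# Discount each day's saving back to today and sum over one year.
present_value = sum(DAILY_SAVING / daily_factor ** day for day in range(1, 366))

# With quasi-hyperbolic present bias, the entire future stream is scaled
# down again, landing near the range of the $750-$800 upfront incentive.
felt_value = PRESENT_BIAS * present_value
```

Under these assumed parameters the stream's discounted value is roughly $3,100, and the present-biased "felt" value drops to roughly $930, which is one way to see why a much smaller cash payment today can compete with the face value of a year of savings.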
[560] Do you think it's the money, or do you think it's just the, like, delight, the dopamine of like, oh, I got whatever, I got $7.
[561] It may actually be just the reward of feeling like you got points, kind of, that way.
[562] Yeah, it's a good question, and it's hard for me to be able to say for sure.
[563] What we did in these studies typically is we'd give them some money early on for quitting at, let's say, two weeks to four weeks, and then we'd give them some money a little bit later. So the idea was to try to help them along, give them some feedback as they went, but also try to hold some of that money out so that they could achieve their long-term quitting goal.
[564] Can you tell us a little bit more about the quitting and people who do quit?
[565] How many stay quit, I guess, is the number one question.
[566] Number two, you say that you triple the effect by giving them $750.
[567] How do you know that yours stay quit, and how long do they have to stay quit for it to count?
[568] So first of all, in our control group, we had a quit rate of 5%.
[569] We found in the incentive groups that people quit about 15% of the time.
[570] What we did is we measured it using saliva or urine cotinine, a metabolite of nicotine, checking whether there was any of that metabolite present in their saliva or in their urine.
[571] We found in these interventions that when we ran them for 12 months and we tested people six months later, we found that even though there is some recidivism, the quit rate ratio stayed about the same.
[572] So it lasts at least six months. Did you follow them after that?
[573] No, just out to 18 months.
[574] That's the longest we followed people.
[575] How does this upfront cash incentive compare to the heretofore most successful incentives for smoking cessation?
[576] So if you compare the quit rates of people who are offered nicotine replacement therapy, the quit rate ratios are about the same as what we achieved.
[577] So I could say that's good, but the quit rates, even though they were tripled, are still only 15%.
[578] So there's a lot of room for improvement.
[579] Mike, Kevin Volpp is telling us about quitting for cash.
[580] Anything to add?
[581] I think Mark Twain said it best.
[582] He said it's easy to quit smoking.
[583] I've done it hundreds of times.
[584] So yes, what you're saying is true with some caveats.
[585] The primary issue with this theory is that money isn't the only, or perhaps even the main, driver of smoking or quitting smoking, because addiction has so many other things involved in it.
[586] In fact, and this is true, 50 % of people who have quit smoking relapse when they're drunk.
[587] And so that becomes the major issue: there are so many other externalities factoring into what goes on.
[588] Kevin Volpp, thank you so much for telling us something we did not know.
[589] This is our final guest of the evening, so would you please give them a warm welcome.
[590] Dean Karlan.
[591] Hi.
[592] Hey, Dean.
[593] Tell us about yourself, please.
[594] What do you do?
[595] I'm a behavioral and development economist at Yale University, and I started two different nonprofits.
[596] One is called Innovations for Poverty Action, and the other is called Impact Matters.
[597] Excellent.
[598] Tell us something we don't know, and as the final guest, make it good.
[599] So we worked with this charity in the United States called Freedom from Hunger, and their normal charitable fundraising appeal sent out a letter to their prior donors and would make a fairly emotional appeal.
[600] It would talk about Maria and a pig farm and how they're helping her make more money and feed her children, et cetera.
[601] They agreed to add a treatment in which we talked about a randomized trial.
[602] And we said, hey, not only is there a Maria on this pig farm, but we actually conducted a randomized trial to measure the impact of our program.
[603] And we found positive impacts on a bunch of stuff.
[604] The basic question is, what did we find?
[605] We have a nice emotional appeal that just talks about Maria, this identifiable person who was helped, versus Maria, this identifiable person, plus a little bit of wonk.
[606] So the question is whether added wonk adds effectiveness?
[607] Is that the question, essentially?
[608] Talking about whether a charity is effective.
[609] Does this motivate people to give more or not?
[610] Angela?
[611] I'm going to guess yes.
[612] I'm going to guess that heart-wrenching story plus evidence of effectiveness is better than heart-wrenching story alone.
[613] So the answer is, on average, it didn't matter at all.
[614] I was incorrect.
[615] But it turns out here, the interesting thing was looking within the sample at two different types of people.
[616] People who'd been giving a lot in the past and people who'd been giving very little, and by very little I mean $5, $10, $15, $20.
[617] Turns out the people who'd given very little, when you got wonky, they actually gave less.
[618] The people who gave a lot in the past, they gave more.
[619] So they had a positive reaction, and the other people had a negative one.
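The subgroup pattern Dean describes, a negative treatment effect for small prior donors and a positive one for large prior donors, amounts to computing the effect of the evidence letter within each segment. The records below are hypothetical, invented only to mirror the direction of his result.

```python
# Hypothetical donation records: (prior-donor segment, got wonky letter, gift).
# Invented to mirror the pattern described: the evidence ("wonky") letter
# lowers gifts from small prior donors and raises them from large ones.
donations = [
    ("small", False, 10), ("small", False, 12), ("small", True, 6), ("small", True, 8),
    ("large", False, 100), ("large", False, 120), ("large", True, 150), ("large", True, 170),
]

def avg_gift(segment, wonky):
    """Average gift within one segment-by-treatment cell."""
    gifts = [g for s, w, g in donations if s == segment and w == wonky]
    return sum(gifts) / len(gifts)

# Treatment effect of the evidence letter within each segment.
effect_small = avg_gift("small", True) - avg_gift("small", False)  # negative
effect_large = avg_gift("large", True) - avg_gift("large", False)  # positive
```

The point of the within-segment comparison is exactly the one Dean draws next: an effect that averages out to zero overall can still be sharply positive for one group and negative for another.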
[620] Is it ineffective, do you think, because of what you call the wonkiness?
[621] In other words, is it because of complication or because they think you're bragging or above them or maybe don't need their money?
[622] I don't think it was bragging.
[623] I think of it as a couple things.
[624] One is they're giving for an emotional appeal and you get wonky and it's like, that's just not why they're giving.
[625] The other is that it might actually just remind them, even though your message is this works, you're actually reminding them this might not work.
[626] And now they're like, well, you know, that was just one study.
[627] Is this going to work again?
[628] But the people who were giving more, the idea is maybe they're a little bit more sophisticated and thoughtful in their giving, and they're focused on giving more to fewer charities, for instance.
[629] And so for them, the idea that there's evidence behind this charity is like, well, that's great.
[630] And so they gave more.
[631] So how can we get the people who are only giving a little bit?
[632] Is there any way that we can get them to actually care about evidence of impact?
[633] I'm not sure that that's the goal.
[634] So one of the things we're doing at the group I mentioned, Impact Matters, is to try to do audits of charities to help people know which are evidence-based and which are not.
[635] But I think for the small, tiny donor who's giving $5, $10, it's probably not the right motivation.
[636] The right move might be to help them give more, and then to help the charity they're giving to choose effective things, or the aggregator who's collecting their donations to allocate to good charities.
[637] So here's a crass question, but if a charity is doing well, the results are good, and you can promote that to your donors.
[638] And if big donors like that message, then do you really care so much that the small donors don't like the message?
[639] Well, small donors do add up.
[640] And there's a lot of charities that actually depend quite heavily on small donors.
[641] So does this lead you then to send different kinds of messages, I gather, to large and small donors?
[642] That is the prescription that comes out of this.
[643] And so, you know, the challenge is, of course, figuring out exactly how to do that, where to draw the line.
[644] And can you share with us any other prescriptions, generally, for fundraising appeals, you know, what's successful in terms of matching funds, you know, happy photo versus sad photo?
[645] Just tell us what you know about that.
[646] So happy versus sad: I would be shocked to learn that happy is not better than sad.
[647] I'll go with you on that.
[648] On matching, one of the earliest ones we did found that the ratio of the match did not matter.
[649] Mike, when charities are too good for their own good, what more can you tell us?
[650] Okay, so a few things.
[651] First of all, happy or sad images, you're unfortunately wrong.
[652] Sad images work way better.
[653] Jacqueline Novogratz, who started the Acumen Fund, said that people give because they want to feel good, not because they want to do good.
[654] If you tell them that they can give money to help this woman eat for a week, then they're happy to do that because I, as an individual, can solve that problem, can meet that need.
[655] I think the key is that there's no one answer to this.
[656] It's not, you know, the problem I have with the quote you just gave us is that people are not that simple, and people are very different.
[657] And so some people do care about effectiveness.
[658] And the question is how to segment the market, how to make sure that you can do the most to help that segment.
[659] Really interesting.
[660] Dean Karlan, thank you so much for playing.
[661] Tell me something I don't know.
[662] Can we give one more hand, please, to all our guests tonight?
[663] Great job.
[664] It's time now to pick a winner.
[665] Our live audience will have the final vote, but before we turn it over to them, the three of us, my co-host Angela Duckworth, fact-checker Mike Maughan, and me, Stephen Dubner, we will each weigh in.
[666] The three criteria, everyone should remember: did they tell us something we truly did not know?
[667] Was it worth knowing, and was it demonstrably true?
[668] So, Angela, of everything we heard tonight, what would just be your favorite in terms of something you truly did not know?
[669] I know it's hard.
[670] My favorite of the seven friends that were on stage tonight would be, well, I guess I'll pick Ayelet Fishbach, for defying my life's work.
[671] I like that.
[672] I like the spirit of scholarship triumphing over all, right?
[673] Truth above all.
[674] I'll take the next criteria, which is who told us something that was really worth knowing?
[675] Honestly, I think that maybe Max Bazerman and the idea of failing to notice because some of my favorite ideas in learning generally are those things that in retrospect are really obvious, but you wouldn't necessarily know how to articulate them.
[676] And so also, I like that it's universal and important and useful.
[677] Mike Maughan, on factual terms, is there something that rang the truest of all?
[678] Well, given that this is all original research from some of the smartest people in the world, I think we're going to go with good job.
[679] It was true.
[680] But I will say this.
[681] Last year, I was going through this period where I lost 45 pounds in two months.
[682] It's very unhealthy.
[683] So I'm actually going to go with Ayelet Fishbach as well, because I think that there's something to that incremental progress that helps you reach a bigger goal.
[684] Excellent.
[685] Good point.
[686] All right then, audience, you've heard from us, but we don't pick the winner.
[687] You do.
[688] It's time now to do that.
[689] So who will it be?
[690] Dean Karlan with Too Good for Your Own Good.
[691] Kevin Volpp with Quitting for Cash.
[692] Katie Milkman with What Makes Viral Viral, Max Bazerman with Failure to Notice, David Laibson with Early Retirement Withdrawal, Ayelet Fishbach with The Upside of Instant Gratification, or Colin Camerer with This Is Your Brain on Curiosity.
[693] Please take out your phones, follow the texting instructions on the screen.
[694] While our live audience is voting, let me ask you a favor.
[695] If you enjoy Tell Me Something I Don't Know, please spread the word, give it a nice rating on Apple Podcasts, Stitcher, or wherever you get your podcasts.
[696] If you want to listen to this show without ads, sign up for Stitcher Premium at stitcherpremium.com slash tell me. Okay, the audience vote is in.
[697] Once again, thanks to all our guest presenters.
[698] Our winner tonight, put your hands together for Professor Max Bazerman.
[699] Thanks so much for telling us about failure to notice.
[700] Congratulations.
[701] Now, to commemorate this victory, Max, we'd like to present you with this certificate of impressive knowledge, which reads in full: I, Stephen Dubner, in collaboration with Angela Duckworth and Mike Maughan, do solemnly swear that Max Bazerman told us something we did not know, for which we are extremely grateful.
[702] That is our show for tonight.
[703] I hope we told you something you did not know.
[704] Huge thanks to Angela Duckworth, Mike Maughan, to our great guests, and thanks especially to you for coming to play Tell Me Something I Don't Know.
[705] Have a great night.
[706] Thank you.
[707] On our next show, we are off to Minneapolis, and we're joined by Krista Tippett, host of On Being.
[708] We talk about comfort food, pillows, and for some reason, turtles.
[709] So if we could listen to conversations between these female turtles, it would sound very much like female humans saying, where are all the single men?
[710] That's next time on Tell Me Something I Don't Know.
[711] Tell Me Something I Don't Know is produced by Dubner Productions in association with Stitcher.
[712] Our staff includes Allison Hockenberry, Emma Morganstern, Harry Huggins, Brian Gutierrez, Dan DeZula, and Rachel Jacobs.
[713] Our live engineer in Philadelphia was Nathan Rossboro.
[714] David Herman is our technical director.
[715] He also composed our theme music.
[716] Thanks also to our good friends at Qualtrics, whose online survey software has been so helpful in putting on this show.
[717] You can subscribe to Tell Me Something I Don't Know on Apple Podcasts, Stitcher, or at TMSIDK.com.
[718] You can also listen without ads by signing up for Stitcher Premium at stitcherpremium.com slash tell me. You can find us on Facebook, Twitter, and Instagram.
[719] Thanks for listening.