Armchair Expert with Dax Shepard
[0] Welcome, welcome, welcome to Armchair Expert, Experts on Expert.
[1] I'm Dan Rathers.
[2] I'm joined by the Duchess of Duluth and the King of Kiwis.
[3] King of Kiwis is great.
[4] It's great, the King of Kiwis.
[5] Let me pitch you guys a new, I know you're not supposed to give yourself nicknames.
[6] That's kind of gross.
[7] Well, you have Boulder.
[8] I guess I have a long history of it, so I stop now.
[9] You know, Howard Stern named himself the King of All Media.
[10] Oh, he named himself that?
[11] Oh, that's been a thing for 30 years.
[12] He's the king of all media.
[13] And there was a period where it was kind of a joke.
[14] But then he really did end up being the king of all media.
[15] So it's kind of also a self-fulfilling prophecy.
[16] So I was thinking, what can I call myself the king of?
[17] Because it's such an exciting title, as you just found out, king of Kiwis.
[18] King of conveyance.
[19] I think it works.
[20] It's got a ring to it.
[21] King of conveyance.
[22] So if you want to go somewhere in a boat, I'm your guy, a motorcycle, a car.
[23] Any kind of conveyance, I'm the king of conveyance.
[24] I like that because that's also an expensive word.
[25] Conveyance?
[26] And you have a good vocabulary.
[27] Oh, thank you.
[28] So I guess, yeah, implicit in there is he's a wordophile.
[29] You weren't confused by conveyance.
[30] No, a little for a second.
[31] It's fun being conveyed by you.
[32] Oh, thank you.
[33] Thank you, David.
[34] Scary and fun.
[35] That nice sweet mix.
[36] It wakes you up.
[37] This is a ding, ding, ding for a future two-part Flightless Bird that has to do with conveyance.
[38] Let me just finish this chewy bagel.
[39] Montreal.
[40] So chewy.
[41] You have to say that's the operative word in describing these bagels.
[42] Have you had them, David?
[43] I've had, I think I've had it.
[44] I think I've had it once.
[45] Had it once, yeah.
[46] I don't mean to make a bit out of your situation, but would you say you have bagel blindness?
[47] Like, do you know you're eating the same bagel?
[48] A bagel's a bagel to me. So you do?
[49] They all look the same.
[50] It depends about the filling, but a bagel, it's basically a bagel.
[51] A fresh bagel tastes like a fresh bagel.
[52] Right.
[53] Or bagel, as my friend in New Zealand said it for a really long time without knowing it was wrong.
[54] So I say it wrong and people give me heat for it.
[55] And I really am saying it how I once imitated our friend Jonathan Yackley.
[56] But then it just was permanently stuck in there and I can't say it bagel.
[57] Bagel.
[58] I like it.
[59] It's fine.
[60] I don't know why people get upset when other people mispronounce words because it's like, Aren't you bored with the pronunciation that you've been hearing?
[61] Mix it up.
[62] And maybe they feel like they just want to correct you.
[63] Like, I have to correct David when he says maths.
[64] Whoa, it's controversial.
[65] Is that a defendable position for you?
[66] Like, can you say there's a plural of math?
[67] What I found out, and I backed down very quickly from this, but apparently the rest of the world is all maths.
[68] Maths.
[69] No, math.
[70] No, it's maths.
[71] Everywhere but America is maths.
[72] Oh, everywhere.
[73] I'm actually doing an episode just called Math.
[74] No. And it's about math and maths and the metric system.
[75] Oh, that's great.
[76] Wait, wait, wait.
[77] That can't be true.
[78] No, it's maths pretty much everywhere except for America and maybe two other countries.
[79] Canada.
[80] I'm not lying.
[81] Okay.
[82] Yeah, people came to your defense.
[83] They did?
[84] People got angry.
[85] I backed down too quickly.
[86] I didn't defend myself.
[87] How many?
[88] That's the problem with David.
[89] You think you've, like, bested him in an argument?
[90] I've seen the New Zealanders.
[91] Too conflict-averse.
[92] Your response was you were just illiterate.
[93] You, I just assumed I was wrong.
[94] I said it was a New Zealand thing.
[95] You said it was just a you thing.
[96] I panicked.
[97] So I mean...
[98] I think you also called him out.
[99] He says it wrong a lot.
[100] It happened on that one episode and then the next episode he did it again.
[101] He did it again and you called him out again.
[102] Yeah.
[103] I stick by if he's in America, he has to say math.
[104] Can I ask you quickly?
[105] And it is wrong in America to say maths.
[106] Do you even have competitive debate in New Zealand high schools?
[107] I was on a debate team.
[108] How did that look?
[109] Once.
[110] Like, you're right, good point.
[111] I hadn't looked at it from that angle.
[112] It's over in about 20 seconds.
[113] Absolutely.
[114] Everyone soft boils an egg and moves on.
[115] Okay, so we're not here to talk about the way you say math or maths or how quickly you concede to a counterpoint.
[116] What?
[117] Can I just double down that it's math?
[118] You have to say math in America.
[119] Sure.
[120] I mean, again, it goes back to what I think.
[121] I love it that he says maths.
[122] It's so stupid and weird and jarring and it's so exciting and novel and unique that I love it.
[123] I love that you, oh, there's about 20 words I love that you, knowing.
[124] Yeah, I like that.
[125] It's seven syllables the way you say known.
[126] I like to add some syllables.
[127] I like to add some esses.
[128] Yes.
[129] Keep at it, Sini.
[130] Don't listen to her.
[131] Hey, you're not supposed to throw your wife under the bus to protect your child.
[132] I'm not doing that.
[133] And I want you to keep riding him like a rented mule.
[134] Like, this is what I want for both of you.
[135] You just want Discord.
[136] I want content.
[137] Today, we have a very pleasing person.
[138] Also from another area of the world, William MacAskill by way of Scotland.
[139] We should have asked him about math versus maths.
[140] Now we have to ask every single person who comes in who's foreign.
[141] Well, he's a professor at Oxford.
[142] So he could have probably tapped someone on the shoulder that was passing behind him and got an Oxford English Dictionary.
[143] Ding, ding, ding.
[144] Dang.
[145] Yes, William is a philosopher and an ethicist, and he is a professor at University of Oxford.
[146] There was a period where he was the youngest professor in the, I don't know, in the country, in the world, in his neighborhood, somewhere.
[147] He is a very, very young professor of philosophy.
[148] Wunderkind.
[149] Wunderkind, big time.
[150] He's got a great book called Doing Good Better, but he has a new book out, and that's what we're here to talk about today, called What We Owe the Future.
[151] And I'm going to tell you, I don't want to spoil anything, but we got bested in this motherfucker.
[152] Well, we got robbed.
[153] We got robbed.
[154] We got bested.
[155] And Rob really took it on the chin.
[156] He got ensnared in something that he had no business in.
[157] Barely.
[158] I like it.
[159] Ride them.
[160] We'll see.
[161] Whip, whip, whip.
[162] We'll see.
[163] Also, it is important to say, our sons, David and Rob, got in a fight the other day over a cord.
[164] And you were downstairs and I was trying to break up this kid fight.
[165] Okay.
[166] And I said, wait till your dad comes up here.
[167] Right.
[168] And you know, that's the kind of thing brothers fight over.
[169] Some object there's one of that they both want.
[170] It was ridiculous.
[171] He wouldn't let me. I'm feralit myself.
[172] He was, like, knotting up the cable.
[173] It's amazing.
[174] Both perspectives here.
[175] Okay.
[176] Please enjoy William MacAskill.
[177] Wondery Plus subscribers can listen to Armchair Expert early and ad-free right now.
[178] Join Wondery Plus in the Wondery app or on Apple Podcasts.
[179] Or you can listen for free.
[180] wherever you get your podcasts.
[181] He's an object to.
[182] He's an out of chair.
[183] What's the team here?
[184] What is it?
[185] It is.
[186] It's gorgeous.
[187] Jessica is.
[188] The Row.
[189] The road.
[190] James Perse.
[191] Jim Percy.
[192] No, it's The Row, and so are these pants.
[193] Oh, my God.
[194] And my hair.
[195] Oh, my God.
[196] The road in your hair.
[197] Yeah.
[198] You know how you can tell?
[199] They do this seam that goes down the back of their shirts.
[200] Sure, I see it.
[201] I see it.
[202] Do you see it, Will?
[203] I'm not sure what we're looking at.
[204] I'm learning, as you are, real time.
[205] This is the signature seam of the designer of a shirt she wears.
[206] Okay.
[207] If you were an archaeologist who was trying to date some pottery, right?
[208] We would look at different Moche designs.
[209] We could tell what year it is, so that's a good clue for us if we ever find this in a dig.
[210] Okay, well, let's just hope there's no volcano that preserves you for the whole time.
[211] It's a very hard seam to perfect, so, you know.
[212] Yeah.
[213] You're married, right?
[214] Do I call you William, or do you go by Bill?
[215] Will is good.
[216] I was married, but no longer.
[217] Oh, no. I'm sorry to hear that.
[218] I have a partner.
[219] I'm very happy and still good friends with my ex -wife.
[220] That's wonderful.
[221] That's wonderful.
[222] That's a good story.
[223] This is the last thing you want to talk about, but didn't you take her last name?
[224] Yeah.
[225] We both took her grandmother's maiden name.
[226] It did mean that I kind of just took her name and then we separated.
[227] And in that time, I'd publish Doing Good Better.
[228] Well, I was going to say, you're kind of painted into a corner because now you're known professionally with this name.
[229] So then she changed her name again.
[230] Oh, she did.
[231] This is her fourth last name now.
[232] This is what women deal with all the time.
[233] This is interesting.
[234] You never hear it from this perspective.
[235] My mother had this same setup where she had taken my second stepdad's name, had built kind of a little career, and then had it for all this time and had to get out of it.
[236] You know, had to declare at some point.
[237] but she weirdly declared it when she married her fourth husband.
[238] Instead of taking his name, she grabbed her old maiden name back.
[239] Oh, she was like, I learned my lesson here.
[240] Stick with what you were given, yeah.
[241] As they say here in the States, well, if it ain't broke, don't fix it.
[242] Well, I'm pretty happy I came out MacAskill.
[243] It's a good one.
[244] Yeah.
[245] You and Monica are the exact same age.
[246] Oh, no way.
[247] Not exact.
[248] But to the date?
[249] You're four months older, five months older.
[250] Okay.
[251] August 24th.
[252] When's yours?
[253] March 24th.
[254] Oh.
[255] Yeah, 24.
[256] That's really fun.
[257] And you're 35.
[258] 35, exactly.
[259] Oh, you've done a lot more than me. And we're both a little old to the new decks.
[260] Is that right?
[261] Daddy's 47 years old.
[262] Still with the original last name I was given.
[263] I'm from a bygone era.
[264] Like, I'll be saying references you guys don't even get.
[265] Not only that, Monica, you already felt shitty about yourself, as you should.
[266] Because he's a professor at Oxford.
[267] I don't know how many books you've written.
[268] There's certainly two.
[269] I know of your other book.
[270] This is my third, yeah, actually.
[271] Thrice Times published in the novel form, that'll make you feel shitty.
[272] But is it true?
[273] I think I learned this on Tim Ferriss.
[274] Were you the youngest philosophy professor ever?
[275] When I got appointed, I was the youngest associate professor of philosophy in the world.
[276] Oh.
[277] Well, I'm the youngest girl to co-host Armchair Expert.
[278] Which is much cooler than being a philosophy professor, I can tell you.
[279] Well, it depends, right?
[280] We all want to be what we're not.
[281] So she'd love to be a world -renowned intellectual.
[282] Like, that would be a great bit of status.
[283] Well, really, what I want to be is a supermodel.
[284] You could be both?
[285] But I can't be either, unfortunately.
[286] I believe in you.
[287] You've just got to have a little bit of faith.
[288] Okay, maybe second life.
[289] Have you tried modeling in a country with a mean height dramatically lower than the U.S.?
[290] You might just be trying in the wrong country.
[291] That's true.
[292] That's true.
[293] I don't want to get in any racial profiling, but clearly there's countries where you would seem at least average height or above.
[294] Definitely, but I still think the supermodels there are very tall.
[295] This is great for the fact check.
[296] That might transcend culture.
[297] Do you think that's possible, Will?
[298] I guess I don't know the ins and outs of the modeling industry well enough to be able to advise, but I think if you put your mind to it.
[299] Okay.
[300] I should have started with a more broad question.
[301] Are you playful?
[302] Because if I were you, I'd already be like, are we going to talk about anything that I have written about, that I'm knowledgeable about?
[303] If we vaguely hit on effective altruism at some point, that's a bonus in my view.
[304] Okay, great.
[305] We'll get there.
[306] We'll get there.
[307] But I think we're kind of charmed by your youth, by your cuteness.
[308] Thank you, ditto.
[309] I love the gap in your front teeth.
[310] Like, if I could surgically get, like, you've nailed it.
[311] Yeah, it's right in the peak.
[312] Okay.
[313] Another fun thing that has nothing to do with anything.
[314] Playfulness.
[315] You're like a dogyhouser.
[316] You're ahead of schedule.
[317] When you're thinking of your life as the story of your life, will this put you ahead of schedule for retirement?
[318] Here's what my fear for you is.
[319] Is you're racing, you're racing, you're racing, you're racing.
[320] And then is it a race to retirement?
[321] Is there another chapter we've yet to invent that comes post-work, pre-death?
[322] I don't really know.
[323] Maybe I'll just switch careers.
[324] So in the course of writing this book, a number of people were like, oh, maybe you should write sci -fi.
[325] I would kind of enjoy that.
[326] There you go.
[327] Maybe that can be a threat to all of the effective altruism colleagues that I have.
[328] If they're ever too annoying, I'll abandon all this stuff.
[329] It's great to have some leverage.
[330] A viable threat is good.
[331] Okay.
[332] This is going to be fun for us.
[333] because I got to say, even when I was thinking of how I feel about this topic, I feel a little bit like Nietzsche, which I don't want to feel like Nietzsche.
[334] I'm not a Nietzschean or whatever a Nietzschean is.
[335] Are you an Ubermensch?
[336] Oh, wow.
[337] Was that the name of his, like a devotee of his?
[338] Someone that believed in nihilism?
[339] That was the Superman, the person who transcends ordinary morality and truly changes the world.
[340] Oh, wow.
[341] I think it's like Dax Shepard, Napoleon, maybe Jesus.
[342] Oh, God.
[343] God, we're already there.
[344] I'm just flattered you learned my whole name.
[345] Honestly, I had a little peak of pride when you knew my whole name.
[346] I'm not even kidding.
[347] That's the good thing about it.
[348] You always find the good.
[349] I do.
[350] I find the brown lining and the silver lining.
[351] Let's give you an accolade before we start dancing.
[352] Our hero, Bill Gates, we're Gatesians, whatever that is.
[353] Gatesians.
[354] Yeah.
[355] He called you a data nerd after my own heart.
[356] I think it's the only time in his life he's ever used the word heart.
[357] I know.
[358] Oh, my God.
[359] But he has to combine it with data.
[360] Yeah.
[361] The only way you feel safe.
[362] What a high compliment.
[363] Oh, my, I'm so jealous.
[364] Now, let's talk about your previous work a little bit.
[365] I think we should touch on Doing Good Better because I do think it sets the table for your newest book, What We Owe the Future.
[366] So I watched your TED Talk.
[367] It has this simple question of like, how do we make the world better?
[368] That's exactly right.
[369] But then how do we make it better in as effective a way as possible?
[370] So we have limited time, limited money.
[371] How can we use that time and money to do as much good as we can?
[372] And what I like about that book is that you had to break some eggs along the way.
[373] I love counterintuitive things.
[374] It's why I love reading Malcolm Gladwell.
[375] Oh, you thought this is how it worked?
[376] It's the opposite.
[377] So you break some hearts.
[378] Let's just talk about fair trade for one second.
[379] That was a heartbreaker, I imagine, for the many people fighting hard for fair trade.
[380] There's just this general fact that many, many ways of trying to do good from people with like best of intentions, wonderful people, just are not particularly effective.
[381] Some can do harm, but I think the most often thing is just that you fail to get most of the potential.
[382] And so with fair trade, I mean, I think in general, ethical consumption just doesn't do very much compared to targeted donations to the most impactful non-profits.
[383] And then in the particular case of fair trade, it's not really obvious to me that you're doing more good by buying fair trade coffee from a middle-income country than non-fair trade coffee from a very poor country.
[384] It's tough to meet the standards, and so that means in general we're taking purchases away from, like, the poorest countries and toward kind of middle-income countries where they could hit those standards.
[385] You set up a little framework for your theory and one is you look at the three things that should guide the decisions on what you're going to expend your resources on and you think of addressing the most pressing issues.
[386] They've got to be big, they've got to be solvable, and ideally they've been neglected, for reasons we can get into. But right out of the gates, if you're going for the total, to your point, you're better off elevating the
[387] health and income of the poorest countries first before maybe we address the middle-income countries.
[388] Absolutely.
[389] So if you're trying to improve the lives of other people, the most obvious thing is to focus on the very poorest people.
[390] We don't appreciate just how extreme global inequality is.
[391] We're 100 times richer than the poorest people.
[392] And even between middle-income and very low-income people, that's a factor of 10 or so.
[393] Money goes further, the less you have of it.
[394] And so the kind of impact you have by helping the very poorest people, that can be like tens, hundreds of times as much as the impact by helping people who are richer.
[395] Can you explain really what fair trade is?
[396] Like, I'm like, I know what that is.
[397] And then when I'm really thinking, I don't.
[398] If you had to explain it to someone else.
[399] If I had to explain it, I could not explain it.
[400] Yeah, fair trade, it's just like a stamp of approval or certification system.
[401] For some goods, if the production of those goods meets certain criteria, certain working conditions, for example, or certain levels of
[402] pay for the people who work in that organization.
[403] But the idea is if the companies kind of meet those conditions, then their products get a fair trade stamp of approval.
[404] It's like being certified organic.
[405] Yeah, exactly.
[406] And then another one that's counterintuitive would be boycotting sweatshops.
[407] We're not supposed to do that?
[408] Bad news.
[409] Well, take it away.
[410] You're just a homing missile on the most controversial bits of the book.
[411] You've already probably picked this up, but I am cynical and I'm not a good person like you.
[412] And so, yes, you'll see a pattern emerge.
[413] But you'll have fun defeating me because you're smarter and more educated.
[414] I'm also an optimist as well, and I should say most of the work is about all the good things we can do, not about errors we make.
[415] But sweatshops, we completely agree that working conditions in sweatshops, in poor countries are just absolutely abominable, like awful, they're terrible.
[416] But the key question is, what's the counterfactual?
[417] What would the person be otherwise doing?
[418] And the answer could be like unemployment, which is much worse, backbreaking farm labor, perhaps even worse paid, prostitution, picking from garbage heaps.
[419] These things are even worse.
[420] When you kind of survey workers, they're, like, excited to work in what we call sweatshops.
[421] And the issue is just that we're not taking seriously the sheer scale of extreme poverty, like just how bad that is, such that sweatshop jobs are comparatively good.
[422] And so if you boycott them and, like, buy local, you know, buy American, then you're just taking away the best source of employment that those extremely poor people have.
[423] And if you look at just most countries, as you industrialize, you have this period of intensive factories.
[424] As the country gets richer, working conditions improve, that's the trend that we've always seen.
[425] And so the thing that we should be trying to do is just help those countries get richer, both by the donations that we can give to highly effective non-profits, but also by purchasing things from those countries.
[426] Yeah, as unattractive as the proposition might be, when you also look at the collateral effect of, okay, now you've got a factory with all these people.
[427] Sure, it's not what we would want for them.
[428] But now you also got some things popping up around the factory.
[429] It is just the seed that starts growing into something more sustainable and better.
[430] The huge success stories in development over the last 50 years, places like Taiwan or South Korea, that generally went via what we would call sweatshops, very intensive factory work.
[431] Then you go through the kind of manufacturing phase of development and move on to producing services instead.
[432] And my hope is that the poorest countries in sub -Saharan Africa have a similar revolution in living standards.
[433] You can actually chart all these countries currently somewhere on that path.
[434] You've got Vietnam that's midway through that process.
[435] China seems to be on the upper echelon of that process.
[436] It's all happened in front of us.
[437] Yeah, I mean, if you just look at China, that's 600 million people who've been lifted out of poverty.
[438] And that's as a result of market liberalization that allowed goods to be made in China for many decades.
[439] And now they're moving to having things that are more like a service economy.
[440] What about places like India where there's such a differential between the rich and the poor that I feel like as the country gets richer, I'm not sure how much it's actually helping the poor people.
[441] It's kind of like the rich get richer, the poor kind of stay.
[442] In these cases, what you tend to find is that inequality increases within the country, but the poorest people are also better off.
[443] So that's happening in India, in China, Brazil.
[444] Yeah, that's one of the very confusing aspects of our income inequality in this country.
[445] On the surface, the gap has gotten wider and wider.
[446] And at the same time, the generalized standard of living has also improved for the poorest of the poor.
[447] So it's like a confusing bit of analysis.
[448] And the U.S. is particularly extreme as well in the degree to which the very, very richest people in the U.S. have these statistically increasing incomes or wealth.
[449] And the sluggishness with which the poorest people's incomes in the U.S. have increased.
[450] So the U.S. is particularly bad on this dimension.
[451] Yeah, there's an asymmetry between the pace.
[452] Yeah, exactly.
[453] But also it is forgotten.
[454] The standard of living has gone up.
[455] The infant mortality has gone down.
[456] The rate of hunger has diminished.
[457] Similarly, if you just look at subjective well-being as well, so there's this idea that comes from the 70s, but has been proven wrong with subsequent data, that people aren't getting any happier over time.
[458] And in fact, they are.
[459] It's just that the data wasn't good enough and some of the analysis was bad back in the 70s.
[460] So hopefully the data will flow out and people will learn over time.
[461] As countries get richer over time, which they tend to do, people get happier as well.
[462] Okay.
[463] And so you kind of break out three moral issues within that book.
[464] You do global health.
[465] Have you ever been an advisor to the Gates Foundation?
[466] Because it just sounded like when I heard your TED talk that all the issues you listed are ones that they have targeted specifically.
[467] I mean, I've met a bunch of the people involved with the Gates Foundation.
[468] I just think they're one of the best foundations that are out there in terms of the work they do in global health and
[469] development.
[470] And Bill Gates himself is very cost -effectiveness -minded.
[471] And so the approach that I'm outlining is very continuous with the work that the Gates Foundation is doing.
[472] Yeah, focusing on the big diarrheal diseases, malaria, measles, these are the huge killers.
[473] They're certainly neglected.
[474] You're not getting diminishing returns as you continue to fund them.
[475] Factory farming, that's a concern that falls into your three moral silos.
[476] And then the last one is the one you're most interested in, or at least the one you spend the most time talking about, which is existential risk.
[477] It is kind of weirdly comedic to watch you give that talk in 2018.
[478] You must feel like Nostradamus a little bit.
[479] I think it's a sad state of affairs if you make predictions that turn out true, if the predictions involve the deaths of tens of millions of people, unfortunately.
[480] Right.
[481] But, oh, pandemic.
[482] So you and I might be leaving some out. Your existential risk biggies are nuclear war, climate change, and of course pandemics.
[483] AI would be the other big one.
[484] And then a kind of meta-category of war in general.
[485] Those are the things that I think we should be most focusing on.
[486] in terms of things that could be utterly catastrophic, and thereby have impacts not just on the present generation, but for the entire long-run future.
[487] Okay, so here's my only pushback on Doing Good Better.
[488] I would say it has a sweeping kind of utilitarian approach to it.
[489] I mean, it's utilitarian flavored in the sense that it's about optimizing.
[490] It's about doing as much good as possible, and it's really squarely focused on people's well -being.
[491] How can they make them better off?
[492] But it's a little different in a couple of ways.
[493] Utilitarianism would say that supposing Kristen is happier than you are, Dax.
[494] then I should kill you in order to save her.
[495] That is true in this case.
[496] Okay, maybe this is true in that case, but normally we'd think it's morally wrong for me to cut you up and take out your organs and give them to five other people.
[497] And effective altruism, doing good better, it's not about that.
[498] It's just about doing good.
[499] It's not about the ends justifying the means.
[500] Okay.
[501] When you marry a morality to it, even just the word moral, a moral approach to making the world better, and we start with a prioritization, here's my concern.
[502] Morally, every single medical student, in this country at least, should be studying heart disease and cancer, period.
[503] Those are the two whoppers.
[504] I think the key issue is when you say every single medical student, because that's very different from how I am thinking about things and how I think we should be thinking about things.
[505] Because I can't control the actions of every single medical student, but maybe a medical student asks me, what should I do with my career?
[506] Then the decision you should make is what should you do, given what everyone else is focusing on?
[507] Given that enormous amounts of effort are spent on heart disease and cancer, it could well be the case, and I think probably is the case, that focusing on some more neglected disease could be even higher impact, where things that don't affect rich countries, like neglected tropical diseases, like malaria, like tuberculosis, they tend to get much less research funding.
[508] And then things that are kind of catastrophic, but occasional, like pandemics, people pay very little attention to them in advance.
[509] And so that's where the kind of neglectedness aspect is coming in.
[510] Given that there's already so much attention on some issues, what are the things where not as many people are focusing on it?
[511] And so your additional effort can do even more good.
[512] Yet at the same time, I'm like, if someone's donating to their local shelter, if people are just trying to make anything better, if you focus on anything, I'm pretty impressed.
[513] If someone wants to go cure toenail fungus, that's great.
[514] I don't think morally there's a call about whether they should be doing something that'll have a more maximal outcome or impact.
[515] I definitely see effective altruism.
[516] Its enemy is apathy.
[517] It's not that its enemy is people doing ineffective things.
[518] And I think we have successfully gotten a lot of people off the couch and doing good stuff, or maybe like off Twitter and doing good stuff.
[519] That being said, there are big differences in how much impact you can have.
[520] And so if you imagine you see a burning building, there's two wings on fire.
[521] If you go run into one, you can save 10 lives.
[522] You run into another wing.
[523] You can only save one life.
[524] What would you do?
[525] What do you think you ought to do?
[527] Well, here's where it gets complicated.
[528] Is it your sister?
[529] Are there five serial killers in the building?
[530] Did you ever watch this show?
[531] You might have heard of it, The Good Place.
[532] It talks about some issues like this.
[533] Have you read Mike's book on philosophy, his new book?
[534] I haven't read it yet, but I really want to.
[535] I've listened to some of his podcasts.
[536] It's very funny as an academic seeing some of these ideas hit the mainstream.
[537] Were you Chidi?
[538] Oh my God, you definitely are.
[539] If only I were that good looking.
[540] Oh, right.
[541] But I am a professor of ethics and moral philosophy.
[542] And you're both Wills.
[543] And does that Will have a gap in his teeth?
[544] I don't think so, but I'm not sure.
[545] Okay, we can make that happen.
[546] Chidi's real name is Will, so this is incredible.
[547] Okay, so I concede that the value of the lives are all equal in that they all have a similar criminal background and a similar future.
[548] Yes, I'm going to go into the building with five people.
[549] Yeah, exactly.
[550] You'd save five over saving one.
[551] That's just the world we're in.
[552] We are in the mother of all thought experiments, philosophical thought experiments right now.
[553] Chidi driving the trolley down the tracks.
[554] That's the situation we're in
[555] now, because it literally is the case that, I don't want to make you feel bad, but you have money in your bank account that could save one life.
[556] You don't know what I have, Will.
[557] It could save 10 lives, it could save 100 lives, or it could protect other women from fistula.
[558] It could improve the education of other people.
[559] We are surrounded by all of these different moral problems facing us right now, and we can help any of them.
[560] And it's just as real as if we're witnessing the burning building with the one child in one wing and five in another.
[561] And so what I want to get people to do, at least, is to, like, appreciate that's the kind of reality we're in, and at least think that through.
[562] That's what I want them to do too, and I appreciate you.
[563] So I'm not even going to make the joke I had on the back burner.
[564] You're going to.
[565] Go ahead.
[566] I'm not going to move.
[567] No, no, no. I don't want to lighten that statement.
[568] He landed it.
[569] You can't be a tease.
[570] Come on.
[571] Okay.
[572] So the real question for me would be on one side of the building, there's five people.
[573] You could save their lives for $16 ,000.
[574] And in the building on the right, there's a very powerful Dukati that could be purchased.
[575] That's how disgusting the analogy would have to be for me. That's tricky.
[576] I mean, it's just terrible.
[577] It's diabolical.
[578] You would not.
[579] You would definitely save the people.
[580] That's the thing.
[581] And that's what Will's trying to force me to see is that that choice is there.
[582] It's not presented to me in that way.
[583] No one actually asked me, hey, when you buy this, these five people die, are you going to do it?
[584] Of course, I want it at that time.
[585] But what I do instead is I just ignore that.
[586] Right.
[587] It's very easy to ignore.
[588] That is one thing that, even when you're saying it now, I'm like, yeah, I get it.
[589] But, like, how do you get people to feel it?
[590] How do you get people to feel it?
[591] Because intellectually, it's one thing.
[592] Early on, I was just trying to decide how much of my income should I give and so on.
[593] I did just load up images from Google image search of children with horrific.
[594] Oh, God.
[595] Tropical diseases.
[596] Because I was like, I want to confront this.
[597] I want to confront this reality.
[598] Honestly, I just think it is impossible for people to truly appreciate.
[599] But one thing is just being part of a community where it's like, oh, we take these
[600] things seriously.
[601] Yeah.
[602] Yeah.
[603] It helps align our kind of monkey brains that are not geared up for this stuff with what's actually best, where it's like you can get the reward and support for doing good things.
[604] The great threat to your life of being excluded from your community, our hardwiring to fear that, gets enacted.
[605] Yeah.
[606] Yeah.
[607] Okay, blast holes in this.
[608] Can't wait.
[609] This is the lie I tell myself.
[610] I actually do intend to Gates it, right?
[611] I don't want to leave everything I've made to a bunch of family members who I've already bought houses for.
[612] That's not the game plan.
[613] And this is a battle I have with my wife.
[614] My wife is just like you, Will.
[615] Perhaps she'll have her last name at some point.
[616] I will say, sure, we could give away half of what's in our bank account right now.
[617] Or I can turn half of what's in our bank account into much, much more.
[618] And then at the end of this whole thing, you can fucking give the whole thing.
[619] But the money, if wisely managed, makes more money, begets more money.
[620] At the end of the day, you could be giving away a lot more if you don't give it away this minute.
[621] Blast away.
[622] I'll blast it.
[623] Thankfully, I have published articles on this topic.
[624] Oh!
[625] Hey!
[626] You're going to be citing actual papers in this rebuttal.
[627] I come prepared.
[628] One big thing is just a very practical thing.
[629] Will you actually give it if you're trying to do so in 30 years?
[630] You should have a bit of self-scepticism at least.
[631] Well, we could agree on this.
[632] I'm five years older than my wife.
[633] I'm not nearly as healthy.
[634] I'm going to be dead long before she is.
[635] She is going to be in charge of where the money goes.
[636] There's no question.
[637] Well, either way, this is going to be many decades in the future.
[638] But here's the bigger concern.
[639] The best opportunities for doing good I think are going to get used up over time, at least if you're focused on the kind of global health and well-being.
[640] Because the overall arc is it does improve, and then it's more costly to improve beyond that improvement.
[641] Exactly.
[642] So the proportion of people in the world living in extreme poverty is decreasing over time.
[643] The amount of money flowing into causes recommended by effective altruism is increasing dramatically over time.
[644] Back when I started this, it was like a few thousand dollars; now it's like hundreds of millions of dollars per year going to these causes.
[645] What a stud.
[646] For real!
[647] That's so cool.
[648] I hope you're proud of yourself.
[649] Maybe it's us two that should end up together instead.
[650] It'd be your life's work to pry it all away from my greedy little hands.
[651] I just go for the money, a gold digger.
[652] Altruistic gold digger, that's a good expression.
[653] That could have been the name of this show.
[654] That's my follow-up book, Marrying to Give.
[655] But yeah, it does mean, though, that if you're waiting 20 to 30 years to give,
[656] you're getting a return on investment so you can give a greater amount of money, but potentially the opportunities you're giving to are going to be considerably worse.
[657] That's kind of a big deal because there are very large donors coming in, scaling up their giving over time.
[658] That makes sense.
[659] If you chart the returns on the money versus, we'll call it the expense of impact, let's say: for $5 right now I can save a life, but in 30 years, inflation being neutralized, it's probably going to cost me $12 to save a life because we'll have saved all those lives that could be saved for $5.
[660] I think $5 is very good.
[661] Maybe it's like $5,000.
[662] But, you know, just arbitrarily, yes.
[663] Okay, great.
[664] You've got me. You've won.
[665] You've bested me. Now we need to talk about What We Owe the Future.
[666] I was going to say something.
[667] Oh, please say something.
[668] No, I was.
[669] And then I forgot.
[670] And then I got distracted by the gap.
[671] Yeah.
[672] It's very, you know, that's a part of attraction that no one even really factors in is like interest.
[673] Yeah, being a little distinctive.
[674] Yes, I try to tell Monica this all the time.
[675] She's like, well, I don't look like the girl on the cover of the magazine.
[676] I'm like, yeah, let me tell you about the girl on the cover of the magazine. It's so boring, I've seen it a million times. The symmetry is so benign and perfect that it doesn't even spark my interest, whereas you got this gap, I got this goofy nose. Monica's perfect, but you know what I'm saying. You undercount novelty. It's true. And interest. Oh, I remember, there we go, we got her there. Thank you. Do you think that you're more programmed inherently to believe this because you're not American, and it's so American to be like, get the big house, get the car, have all the money.
[677] And then you can start thinking about other people, but it's very individualistic.
[678] People in the U.S. actually give a lot more than people in the U.K. It's about like 2% of the income on average in the U.S. and 0.7% in the U.K. Good job.
[679] It's been a while since I felt good about being American.
[680] Thank you for that little treat along the way.
[681] Well, I mean, maybe you're just compensating for poor Social Security.
[682] Or our history.
[683] Well, we've all got a history.
[684] Sure, sure, sure.
[685] Although you're Scottish, right?
[686] So you're great.
[687] Only because we were too incompetent.
[688] We attempted colonization and then just...
[689] We just brought...
[690] It's the Darien project.
[691] We sunk half the economy.
[692] We go to Central America.
[693] We've got these, like, woolly jumpers because we've got no idea what we're preparing for.
[694] Bankrupted ourselves, and then we had to be ruled by the English.
[695] It's a nightmare.
[696] I didn't know that.
[697] I didn't either.
[698] You paid the ultimate price for expansion.
[699] Exactly.
[700] I do think being Scottish gives you some unique worldview.
[701] You're in the shadow of...
[702] of this very historic empire.
[703] And you're not thrilled that you're under the rule of that empire.
[704] So it's like you're benefiting from the raised wealth of it all, yet they're kind of the haves and you're the have -nots.
[705] So it's almost like you're going to have an innate skepticism of the opulence.
[706] It's certainly true that there's this Scottish culture, the mentality.
[707] It's very like working class pride.
[708] It can have negative consequences as well.
[709] So it's like pretty anti-ambition, kind of cutting down tall poppies.
[710] But it does also mean there is a bit of cultural hardwiring for concern for the poor, concern for the disempowered, skepticism about people who are flaunting their wealth and thinking that they're big shots.
[711] Yeah.
[712] And so, yeah, I think that does guide me a bit.
[713] There are many people in effective altruism who are from Scotland, from Australia, which I think has a similar culture.
[714] Stay tuned for more Armchair Expert, if you dare.
[715] We've all been there.
[716] Turning to the internet to self-diagnose our inexplicable
[717] pains, debilitating body aches, sudden fevers, and strange rashes.
[718] Though our minds tend to spiral to worst -case scenarios, it's usually nothing, but for an unlucky few, these unsuspecting symptoms can start the clock ticking on a terrifying medical mystery.
[719] Like the unexplainable death of a retired firefighter, whose body was found at home by his son, except it looked like he had been cremated, or the time when an entire town started jumping from buildings and seeing tigers on their ceilings.
[720] Listeners, it's Mr. Ballin here, and I'm here to tell you about my podcast.
[721] It's called Mr. Ballin's Medical Mysteries.
[722] Each terrifying true story will be sure to keep you up at night.
[723] Follow Mr. Ballin's Medical Mysteries wherever you get your podcasts.
[724] Prime members can listen early and ad-free on Amazon Music.
[725] What's up, guys?
[726] It's your girl Keke, and my podcast is back with a new season, and let me tell you, it's too good, and I'm diving into the brains of entertainment's best and brightest, okay?
[727] Every episode, I bring on a friend and have a real conversation.
[728] And I don't mean just friends.
[729] I mean the likes of Amy Poehler, Kel Mitchell, Vivica Fox.
[730] The list goes on.
[731] So follow, watch, and listen to Baby,
[732] This Is Keke Palmer on the Wondery app or wherever you get your podcasts.
[733] Okay, so What We Owe the Future.
[734] Now, chapter one, and it's the topic of your New York Times piece that came out today.
[735] Thanks for noticing that.
[736] Yes, another feather in your cap.
[737] Longertism.
[738] Am I saying that?
[739] Longertism?
[740] Longerism.
[741] No, there's a T in there.
[742] Long-termism.
[743] Long-termism, yeah.
[744] I kind of wanted you to keep saying it, though.
[745] It's fun to watch people, like, try to get a ring around a bottle.
[746] People failing is amusing.
[747] Long-termism.
[748] By the way, I got a chip on my shoulder even in your TED Talk because you're projecting out about the human species.
[749] You know, we've been here for 200,000 years.
[750] There's a fun graph you put up if you chart our economic output over those 200,000 years.
[751] It doesn't exist until the Industrial Revolution.
[752] And then it's just this enormous wall.
[753] And your argument is that there should be an adjoining moral ascension, which I think is great.
[754] But even when you're talking about, okay, the average lifespan of a mammal on planet Earth, we're only the equivalent of like 10 years old in that projected outcome.
[755] And you say, you know, not only should we be here for X amount of more years, but then you factor in our competency and our intelligence, and really you're looking at extending it beyond that, and then even into potentially exploring the universe and being star people.
[756] And this is what makes me sound like a nihilist or a Nietzschean.
[757] If we were observing a group of otters talking about how important it was that they preserve themselves all the way into other galaxies, I would be like, y'all aren't that important.
[758] It's not going to be a crisis in the universal reality if the otter doesn't make it.
[759] And I got to just say, from a little bit of a place of humility, who gives a fuck if we're here in 10,000 years?
[760] Why do we care that our species goes on millennia to other planets in the universe?
[761] We're not more important than any other animal here.
[762] So let's start there.
[763] In a sense, I agree with you.
[764] I've got no particular attachment to this species homo sapiens.
[765] As you mentioned, factory farming, I think the fact that we tend to privilege human beings above all other animals is like generally a very bad thing.
[766] However, we are distinctive in that we have the ability to reason, to make long-term plans, to create
[767] technology.
[768] This is a really big deal because if we were to go extinct, I think it's pretty unlikely that some other species would evolve such similar abilities.
[769] It's at least like unclear.
[770] Maybe it's like 50-50 that would happen.
[771] Yeah, even if the Earth was sustainable, it was 65 million years from the first mammal to us.
[772] That's a long time before you'd get another us.
[773] It's a long time, exactly.
[774] But also, we may have just been lucky, in fact, to even have human level intellectual abilities and culture.
[775] I have an anthropology degree, Will.
[776] You don't have to tell me, okay?
[777] Well, you had Joe Henrich on.
[778] He's one of my favorite thinkers today.
[779] The idea is like cumulative cultural evolution.
[780] Maybe that's just a kind of fluke.
[781] Yes.
[782] Martin Luther.
[783] That guy came along and changed our brains permanently because he insisted everyone read.
[784] What?
[785] What a fucking fluke.
[786] So wild, exactly.
[787] And so I care about all animals, not just human beings.
[788] But here's something.
[789] Within a billion years, the Earth's biosphere will be gone because of the expanding sun.
[790] It is possible to change that.
[791] It's actually not even that complex from a physics perspective.
[792] If you just take the sun and extract some of the hydrogen from it.
[793] The fuel?
[794] The fuel, exactly.
[795] Then it burns more efficiently and cooler.
[796] And then you can return that fuel later.
[797] That would extend the lifespan of the sun by many billions of years.
[798] And in fact, with sufficient tech, and again, there's nothing in principle why you couldn't do this, you can ferry like a small stream of brown dwarfs to further fuel the sun longer.
[799] So we could actually protect the biosphere for potentially tens of trillions of years.
[800] This is why you should be a sci -fi writer.
[801] Do you think brown dwarfs?
[802] I don't know why he's convinced the dwarfs will be brown, but it seems anti-racist, to be honest.
[803] Unless he's using it in like a slave labor way.
[804] Maybe I could lead that charge.
[805] Oh, I think India will be in the mix, for sure.
[806] How tall are you?
[807] By feet, I think I qualify.
[808] Okay, this isn't about you.
[809] I'm sorry.
[810] All right, fine.
[811] These little stars, they're not really burning.
[812] Okay.
[813] How did we get onto this again?
[814] We just ensnared you in like a racist thing.
[815] Well, it's just hard to hear a brown dwarf.
[816] Dwarf in reference to people and not star systems.
[817] This is the end of my career.
[818] Thank you, thanks.
[819] I hope it was a good time.
[820] Glad it went out with a bang.
[821] I just have to speak for the laymen, and I know that when they hear brown dwarfs, they're like, wait, what?
[822] What am I?
[823] What?
[824] So I had to call it out.
[825] Yeah, stars.
[826] Just take small stars that aren't really burning, feed them into the sun.
[827] But this means that, yeah, we could keep the biosphere going.
[828] We could keep life on Earth, not just human life, going for hundreds of times as long as otherwise.
[829] So in the long run, if we care about the biosphere, it's only humans.
[830] Otters aren't able to do that.
[831] Okay, here's where you got me with long-termism.
[832] So I was like, you're just an arrogant human that thinks it's so important that we sustain our existence.
[833] But you got me, as philosophers often do, with all these hypothetical moral quandaries.
[834] So the example you give, you're hiking, you have a glass bottle.
[835] Monica, you see a deer.
[836] You're shocked by it.
[837] Now, he didn't go into nearly as fun detail as I'm doing.
[838] He just throws it out there.
[839] Did I pee my pants?
[840] You pee your pants.
[841] Okay.
[842] And then you drop your bottle.
[843] Okay.
[844] And now there's broken glass all over the hiking trail.
[845] Ew. And you would pick up the glass because it would not be hard for you to imagine a little kid walking without shoes and cutting their feet.
[846] Here's the crux, you clever motherfucker.
[847] He says, I care about the feet of a kid.
[848] So I'm going to clean it up.
[849] And I care about a kid if it's today or next week.
[850] And I guess I care even more than a hundred years from now if the kid's feet get cut, because I dropped it and broke the glass.
[851] Yeah, yeah.
[852] And when I looked at it that way, I was like, fuck, yes.
[853] I don't give a fuck about man in 10,000 years.
[854] Fuck those people.
[855] But if they get cut on my glass.
[856] Oh, that's good.
[857] I don't like it.
[858] Appealing to people's guilt.
[859] Yeah, it's a cheap shot, it's clever, and it kind of worked for me. That's a big win.
[860] It's a good jumping off place.
[861] Changing your mind.
[862] It's something to be celebrated.
[863] She's had Adam Grant on?
[864] Three or four times.
[865] Yeah, yeah.
[866] He's changed my mind enough, I just want to say.
[867] That's enough with Adam changing my mind.
[868] Like, he's used his three lives, and then that's enough.
[869] No. Okay, good.
[870] Well, you can pass the baton onto me, and now I'm convincing you.
[871] You know, it's common sense the idea that we would care about future generations.
[872] If you think about climate change, carbon dioxide stays in the atmosphere, at least some of it, for hundreds of thousands of years, and keeps warming throughout that period.
[873] That's a big deal.
[874] We care about that.
[875] Or like radioactive nuclear waste that might last tens of thousands of years.
[876] Or just loads of projects we engage in, like science, innovation, moral progress, building cathedrals or, like, storing information in libraries.
[877] We do that for a mix of the benefits on the present and impact on the future.
[878] So I actually think there's this strong core of moral common sense, which is like, yeah, of course I care what happens to my grandkids and their grandkids.
[879] And so what I'm doing is just taking that aspect of common sense, moral thought, and going through its implications.
[880] Because once you appreciate, oh, actually, maybe we really are at the beginning of history.
[881] Maybe we have hundreds of thousands of years still to go, if we're the typical mammal species, hundreds of millions if we can survive for as long as the Earth is still habitable, hundreds of trillions if we one day take to the stars.
[882] If you care about future people, okay, there's a lot at stake with anything we could do to impact that future.
[883] Right, so largely we're all pretty myopic in whatever daily struggle we have.
[884] If you have kids, they're taking up 40% of your bandwidth right out of the gates.
[885] Tell us how people can shape the course of history or how they're involved in it.
[886] I think it's easy to feel like maybe states are shaping history or institutions are shaping history.
[887] Obviously, this little ant can't possibly be shaping history.
[888] So states and institutions are obviously enormously impactful.
[889] It's worth bearing in mind that they consist of collections of people.
[890] And so ultimately, what's actually making decisions are individuals.
[891] And you can contribute.
[892] So we talked about giving earlier.
[893] Absolutely anyone can take a fraction of their income, donate it to highly effective organizations that are working
[894] on these problems, such as preventing the next pandemic, or such as helping control the development of artificial intelligence or reduce the risk of nuclear war. We can do those things just by donating some of our money to organizations working on these causes.
[895] And this is a way that anyone can get involved.
[896] In effect, instead of being on the front lines, you are doing it via your money rather than working on it directly, but that doesn't change the impact that you're having.
[897] Okay, so I think a lot of us suffer, me more than others, from a pretty healthy cynicism of
[898] organizations.
[899] I think one of the biggest barriers to getting people to give, especially people like me, is a bad history with a lot of organizations.
[900] And no one has time to vet an organization.
[901] Like, you'll go to their web page.
[902] It'll be flowery.
[903] Beyond that, you're like, I don't know, now a Yelp review is a disgruntled person.
[904] I don't trust them.
[905] Like, how does one even begin to evaluate who would be impactful to donate to?
[906] For most people, it's just not worth your time to, like, try and do it yourself.
[907] It would be kind of like, how should I invest all
[908] of my money rather than getting any sort of outside advice or using an index fund.
[909] Instead, you can rely on experts.
[910] And so we've created a set of funds, the Effective Altruism Funds, or you can look at GivingWhatWeCan.org, which is where you can also pledge to give some of your income.
[911] And then you're kind of giving the money into a fund that then distributes it to whatever organizations people are thinking, okay, these are really the most impactful ones, the work they're doing makes a lot of sense, and they really need more funding.
[912] And that way you can kind of guarantee that the money is being channeled to somewhere that is extremely high impact.
[913] And this isn't a challenge.
[914] I'm just curious.
[915] Why would one trust you?
[916] Ultimately, you've got to assess that on its own merits.
[917] I mean, one thing is that there are just very few people really trying to do this ultra-rigorous, in-depth research.
[918] But then you can look.
[919] Our history is via work in global health and development.
[920] And so you can look at GiveWell for the recommendations that are made there.
[921] And then kind of through the effective altruism community and some of the foundations we work with, such as Open Philanthropy or the Future Fund, which I advise.
[922] You can see write -ups of some of the things that have been funded.
[923] So you can at least see the kind of reasoning and the transparency and understand some of the organizations that are receiving funds.
[924] And then ultimately, I guess, make up your own mind.
[925] Maybe you then think, oh, no, these guys are full of shit, and I don't trust them either.
[926] But my bet is that you wouldn't think that.
[927] Say both of the websites again.
[928] GivingWhatWeCan.org.
[929] GivingWhatWeCan.org.
[930] Exactly.
[931] An organization I helped to set up, encouraging people to give at least 10% of their income.
[932] It also has funds that you can donate to that then distribute the money to the most effective places.
[933] You can also look at Effective Altruism funds in the broader Effective Altruism community for more information and recommendations too.
[934] Okay.
[935] Can we do what we would say here, Reader's Digest, kind of cursory look at the trajectory of changes?
[936] Yeah, there are two ways that we can impact the long -term future.
[937] One is by safeguarding civilization, just ensuring we survive at all.
[938] That's reducing the chance of extinction from a pandemic or all-out nuclear war or something.
[939] And, you know, maybe you'll object to that.
[940] You're like, who cares about these humans anyway, like you were saying?
[941] I think we should, but there's another way of impacting the long term, which I call trajectory changes, which is making the future better, even in those worlds where we survive a long time.
[942] The key thing there is improving society's values.
[943] I think we've made moral progress over time, but I'm very worried that that might stall at some point.
That we've reached a stagnation in moral progress.
[945] Exactly.
To take a kind of vivid example of this, imagine if the Nazis had won World War II, a little bit of counterfactual history.
[948] And let's say they'd gotten very powerful indeed.
[949] So they had succeeded in their aims of global domination and instituting what they called a thousand -year empire.
[950] Would we expect much moral change and progression?
[951] And I think probably not.
[952] They had a totalitarian state.
[953] They had a very fixed ideology.
[954] They wanted that to stay.
[955] And that would have been terrible.
We would have lost out on an enormous amount of our potential because the world would have been guided by fascist ideology rather than something liberal and egalitarian.
[957] And I'm worried that something similar could happen in the future.
[958] So if you look at history, over and over again, you see ideologies get power and then try to lock themselves in and entrench themselves.
[959] The people leading those ideologies, they try to eradicate the competition.
[960] They try to effectively kind of brainwash people into subscribing to their moral view and then persisting with that.
[961] In a sense, I think it's just somewhat lucky that that's not happened to the world already.
But I actually think there's a significant risk that it's only a matter of time.
And I think some things that could happen, like the result of a third world war, formation of a world government, the first space settlement, or the development of artificial intelligence that is as powerful or even more powerful than human intelligence, those things could precipitate a moment of what I call value lock-in, where a single set of values just determines how the whole future goes.
[964] And if those values are bad, then that could be utter catastrophe.
[965] So we would all get the impact of nuclear holocaust.
[966] That doesn't need any explaining.
[967] A world war, we can see how that would end up in one world victor.
Why does a one-world government happen? How does that materialize?
The most natural thing is after the outbreak of some third world war.
But I think the thing I worry most about is that there's some technological development that enables a small group, like a single country, maybe even less than that, could be a company, to get enormous amounts of power, more power than the rest of the world combined, such that it can then essentially take over and implement its regime.
[972] This is what you've seen in the past with conquest and then formation of larger empires.
[973] It's what we could see in the future.
[974] To ground it in a present -day example, neither of us are going to accuse this company of this, but is Facebook a candidate for that?
[975] You have something that's transcending all national borders.
[976] It's got a common code.
Minimally, we're all internalizing some shared culture that was determined by some young white dudes in a room.
I don't think I'd pick out Facebook in particular, but in general, large companies are getting very big.
[979] They often have revenues larger than many countries.
[980] Apple might be a better example.
[981] Or Amazon.
[982] I mean, Amazon, Google, I think all of these big ones, the idea that the most important actors in the future could be companies rather than countries might seem really wild.
But the very idea of a nation state is not that old.
[984] City states used to be how it was.
[985] And before then, it was like hunter -gatherer bands.
[986] New forms of organization are things that we should be taking very seriously as possibilities when we're expanding our horizons a little bit.
[987] But really the key thing for any of these is just if a small group suddenly gets enormous amounts of power, which I think would be possible in particular via artificial intelligence.
[988] There's this kind of key moment where once we develop AI that is able to design better AI, so it's able to do machine learning and chip fabrication and so on.
[989] Then it seems like we should get much faster rates of growth and technological progress.
It should be exponential once the smarter beings take over and create smarter beings.
[991] Faster than exponential, in fact.
[992] Oh.
[993] We're exponential at the moment, but it would go super exponential.
And it could be the case that you get extremely rapid tech progress.
[995] Imagine like centuries' worth of technological progress happening within years.
[996] And that could be very bad.
[997] I mean, look at what happened when Western European countries developed industry.
[998] What did they do?
[999] They used that power to kind of conquer most of the globe.
In the same way, if future technological advances, such as via AI, give enormous amounts of power to relatively small groups, that could also be used for kind of conquest or domination, and then subsequently entrenchment of bad values.
[1001] And that's something I think we should be worried about.
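[A rough sketch, not from the conversation, of the growth-rate point: ordinary exponential growth keeps a fixed rate, while the super-exponential case Will describes has the rate itself rise as the level of technology rises. The 0.5 rate and the quadratic feedback term below are illustrative assumptions, not anything stated in the episode.]

```python
# Toy comparison (illustrative only): exponential vs. super-exponential growth.
# Assumption: "super-exponential" is modeled as dx/dt = r * x**2, i.e. the growth
# rate itself rises with the level x, versus ordinary exponential dx/dt = r * x.

def simulate(rate_fn, x0=1.0, dt=0.01, t_max=10.0, cap=1e9):
    """Euler-integrate dx/dt = rate_fn(x); stop early if x exceeds cap."""
    x, t = x0, 0.0
    while t < t_max and x < cap:
        x += rate_fn(x) * dt
        t += dt
    return t, x

t_exp, x_exp = simulate(lambda x: 0.5 * x)      # steady 50% rate: smooth compounding
t_sup, x_sup = simulate(lambda x: 0.5 * x * x)  # rate feeds on itself: finite-time blow-up

print(f"exponential:       after t = {t_exp:.1f}, level is about {x_exp:,.0f}")
print(f"super-exponential: blows past the cap already around t = {t_sup:.1f}")
```

The toy only makes the qualitative point in the transcript: with a fixed rate you get steady compounding, but once the rate compounds on itself, what would otherwise take centuries can compress into a very short window.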
[1002] So hypothetical, Google very well -intentioned, Facebook very well -intentioned, YouTube, seemingly well -intentioned.
[1003] No one really truly understood the algorithm that keeps your interest.
They didn't realize that the way it would naturally keep your interest is bringing you further and further into the extreme and the fundamental, on both the right and the left wing.
[1005] Somehow, with all of our genius, that was unforeseeable.
So start by acknowledging that the outcomes are highly unimaginable.
We've seen it already with what is a weaker AI.
[1008] Yeah.
[1009] Some of the worries I have about AI are continuous with what is happening now.
So one thing you point to there is just that the algorithms are getting more power.
[1011] And that doesn't guarantee that we end up getting what we want.
And one of the worries is that as AI becomes more and more powerful, much smarter than us, we're just like offloading more and more of societal decision making onto these AI systems.
[1014] Then we just lose control.
And then we get a future that is guided by values and goals that are utterly alien to us.
[1016] Well, also values and goals can evolve.
We had a great mathematician on who just reminded people that an algorithm doesn't discover the truth.
[1018] We kind of have that notion that it can discover the truth better than we can more rapidly.
[1019] But all it does is discover the means to an outcome that you've established as the goal.
[1020] So even in our best selves right now, the most evolved progressive people in 1776 still allowed slavery.
We have to acknowledge that, believe it or not, the ethics and morals we currently hold are not the apex; we're not at the apex yet.
[1022] So we would set goals wrongly that would be executed perfectly.
[1023] And then the machine itself would be too big for us to adjust or reset the goals, right?
[1024] Yeah.
I think you're completely right about that. Just 250 years ago, 200 years ago, most people around the world, in fact, were very comfortable with the institution of slavery.
And so what's the kind of best moral view going to be like, the sort of one that we converge on after a long period of careful reflection and debate and so on?
[1028] Maybe that's as alien to us today or as weird to us today as quantum mechanics is to Aristotle.
[1029] Could totally be the case.
[1030] But there's a worrying human tendency, I think, to say, hey, I figured it out.
[1031] I know the best thing, and I'm going to make it happen.
[1032] And that's national socialism or Stalinist communism.
[1033] or Mao's Cultural Revolution, that's a very dangerous tendency, I think.
[1034] And if there's some technological development such as AI that could entrench power in the hands of a small number of people, then that's perhaps exactly what we could see.
[1035] So what I really want us to aim towards is a period of long, careful reflection, deliberation, diversity of moral views, people can debate and figure stuff out before we take any irrevocable actions.
[1036] And the worry is that that doesn't happen.
[1037] Instead, technology allows one group to seize power, implement their worldview, and then that's what we've got, and that's what the future is guided by.
[1038] Well, it quickly becomes a utilitarian argument.
Sure, we could choose not to do it, but we are racing China to do it.
[1040] That's evident today.
[1041] So then it becomes like, well, we can't take our foot off the gas because we're surrendering to whatever version they come up with.
[1042] So you can't exit the arms race of AI.
Yeah, and I think in that situation where it's like the U.S. versus China, maybe neither the U.S. nor China wins and the AI wins instead.
[1044] Yeah, for sure.
[1045] And so I think one thing we can work on if you want to make the long -term future go better is global cooperation and coordination.
[1046] Many of the issues I focus on are called global public goods.
These are issues where everyone's acting in a way that's rational for them.
If the U.S. is like, well, if I don't do it, China will do it.
And China's like, if I don't do it, the U.S. will do it.
The best outcome would be if the U.S. and China are willing to cooperate and say, hey, advanced AI is going to bring an enormous number of challenges.
We don't want this to happen.
[1052] We're going to go a little slow.
[1053] We're going to have careful regulation that will enable us to handle this safely.
[1054] That's a general category of things that I think can be enormously impactful for the long term.
If we first acknowledge it as a zero-sum game, and we say to China, you're right.
[1056] One of us is going to win.
[1057] And when we believe it's us, it's worth the gamble.
[1058] But if you just allow for a minute to imagine it's not you, China, or it's not us, America.
[1059] Let's hone in on how much we could lose if we don't win.
[1060] Might be the motivation for some cooperation.
[1061] Yeah, it used to be that European countries were at war most of the time.
[1062] That is now unthinkable.
Similarly between the UK and the US. Maybe we can get to a state where the whole world has as friendly relations as the US does with the UK and Europe.
I mean, if you distill it down into the simplest idea, through some crazy series of events, British folks and Americans have decided we're an in-group.
Maybe not an in-group internally, but globally we're an in-group.
And the real challenge is to somehow label this world as the in-group.
[1067] Exactly.
[1068] And then crucially, doing that while still maintaining diversity of world views.
[1069] Mm -hmm.
If there's global conquest with a single ideology, then everyone's part of the in-group.
[1071] That is not what we want either.
[1072] Growth ends there as well.
[1073] Exactly, yeah.
So that's the balancing act: having global cooperation without global conformity and the slowdown of the kind of moral progress that we need in order to get to a really good future.
A uniculture, we don't want.
[1077] Not without an awful lot of reflection and careful debate and diversity.
Okay, so if we can acknowledge that what we deem morally appropriate changes over time and that we don't really have that good of an eye to see what will be ahead, how can you, I don't want to say guarantee, but kind of, like, guarantee that, let's take some of these places where we'd put our money, some of the websites and stuff.
[1080] How can you guarantee that our money is going to go to something that won't in a hundred years be...
[1081] Regarded as bad.
[1082] Yeah.
There aren't any guarantees in any of this ultimately, so you're having to make the best decisions you can with the evidence available at the time.
[1084] I do think many of the things that we want to promote are very robust.
And this is something I particularly favor, in that when I think, oh, across a lot of different ways of looking at this, different moral perspectives, it still looks like a really good thing to do.
[1087] That's much better than maybe some very narrow targeted thing that looks good on your own particular moral worldview, but is not as generally robustly good.
[1088] How big is the effective altruism group?
If you look at people who've taken the Giving What We Can 10% pledge, there's about 7,000 of us.
People who would say yes to the question, are you part of the effective altruism community, I think is probably something like 10,000, maybe 20,000.
[1091] This is a total tangent, but I think we'll have fun doing it.
[1092] I imagine you've read the Yuval Harari books, have you?
Yeah, yes, Sapiens.
[1094] He introduces us to this amazing concept that every single thing's a story.
[1095] And so many of them are easy to agree with.
[1096] Oh, money's a mental construct.
[1097] Yeah, I get that.
[1098] Religion's a mental construct.
[1099] Sure, I get that.
[1100] Oh, and I see how it materializes how it serves us.
[1101] We can trust one another.
[1102] We can come together in bigger groups because we all believe that this money has a value that we assigned to it.
As he goes through the stories and he says human rights is a story, were you like me, where I was like, whoa, hold on now?
[1104] That's not a story.
[1105] Did you have a similar feeling of like, wait, hold on.
This is a big question within moral philosophy, whether there are moral truths, as it were, whether there's just a fact of the matter about what's right or wrong, morally speaking.
[1107] And I'm actually pretty sympathetic to the answer being yes.
[1108] So if you say is torturing a small child wrong, I'm like, yes, it's wrong.
[1109] That is a fact.
[1110] That's easy.
You could be a nihilist in the technical sense, where you just think there's no fact of the matter about anything to do with right or wrong or good or bad.
[1112] Or you might be a subjectivist, so you think, yeah, what is right or wrong or good or bad?
[1113] That's really just deep down a matter of my all things considered preferences over what should happen, my idealized preferences.
And I'm sympathetic to the idea that there are kind of moral facts.
[1115] It's a further question about whether human rights, is that just a way of speaking, like a bit of a shortcut?
[1116] I tend to think yes, actually.
I think what really matters is people's fundamental interests.
[1118] Stay tuned for more armchair expert, if you dare.
[1119] I think human rights as a story, it is ever changing and evolving.
So it's like the first human right might be not to get tortured.
Once we've conquered that, some of us are starving, others aren't.
We throw starvation on the list.
[1123] This is basically what you're arguing for impact.
[1124] It'll be always an evolving story of human rights as more and more are addressed.
[1125] First and foremost, you shouldn't be subject to a genocide.
[1126] Okay, yeah, yeah, yeah, we agree on that.
[1127] And it's going to evolve.
[1128] It's going to continue ideally to incorporate everyone having the most amount of everything that's good.
[1129] Yeah, and that's ultimately the view that I come down on.
[1130] What I want is just I look at every individual, and I think, what's best for you, what do you ideally want for yourself?
[1131] And I want to give you that as much as possible.
Maybe those wants coincide with the things that appear on the list of human rights.
[1133] But maybe not.
[1134] Like, maybe you just really care about having sex and eating ice cream, and those don't appear on the list of human rights.
[1135] If that's more important to you than the right to clean water or the right to free assembly or something, then I'm like, it's entirely up to you.
[1136] It's about what's good or bad for you.
[1137] Rather than this abstract, here's the list of most important things for human beings.
[1138] I think where it gets really complicated, and as someone who still has one foot in cultural relativism, you look at feminism in the Middle East.
[1139] That's where it gets really rich.
[1140] That's where it gets really for us hard to...
[1141] Well, hold on.
[1142] Because if you tell me wearing the hijab is what you want, I then go as an arrogant Westerner, well, that's because you're so subjugated that they've convinced you that that would be an act of self -empowerment.
[1143] I won't even believe you.
That would be one of these things that would be very hard to put on a human rights list.
[1145] Like, you can't make women cover their faces.
[1146] I would put that on the list.
[1147] And it's really tough because, on the one hand, I think the best piece of evidence that we have for what is good or bad for someone, is just their carefully considered preferences about what they want.
[1148] On the other hand, people can be deluded about that.
[1149] Obviously, we can get brainwashed, we can be tricked.
[1150] And so for those issues where you think, oh, maybe people really are being deluded, that's tough because how exactly do you know?
[1151] You've got to really reflect on that.
[1152] In general, my inclination is just to take people at face value.
[1153] If people have options and they pick one thing, I can step back from that.
[1154] Exactly.
[1155] As long as they're fully informed and they're really thinking about this, and it's not under some threat of coercion in the background.
[1156] How can you separate in those cases?
[1157] Yeah, it's really hard to get a real answer in those cases.
[1158] That's why morality and philosophy is juicy, because it ain't binary.
[1159] It's not empirical.
[1160] Yeah.
[1161] Also, it can be the case that different forms of society are equally good or approximately equally good at bringing about flourishing overall.
[1162] You could have one libertarian country and one communist country.
[1163] It's like an empirical question.
[1164] Maybe they both work equally well under certain circumstances with certain cultures.
[1165] Travel the world.
[1166] It's all trade -offs, right?
[1167] Yeah, exactly.
[1168] And we're making some progress and being able to look at this.
[1169] You can look at people's well -being and how they answer various well -being -related questions, even across countries.
[1170] And that starts to give us a handle.
The Scandinavian countries do very well on those, as well as on more objective measures like health, life expectancy, and other things.
[1172] But it's something we're still learning about.
[1173] Well, that's one of your chapters.
[1174] Is it good to make happy people?
[1175] Yeah.
[1176] So obviously it's good to make people better off if they already exist.
[1177] But supposing now that you're not talking about benefiting people who will definitely exist anyway.
[1178] But instead, you're increasing the size of the population.
[1179] You're making more people.
[1180] Well, I don't know about this.
[1181] Because there's two things.
[1182] I mean, one is if you're having kids, this is the obvious thought.
[1183] But the bigger factor is on this question of extinction, where if civilization came to an end, that would mean two things.
[1184] So one is the enormous amount of suffering in the near term.
But then there would be a second thing of just the loss of all the future lives that could have been born, which is an enormous number, you know, trillions upon trillions of future people that could have existed.
And the question is like, is that a moral loss?
[1188] Yeah, that's where I'm confused.
[1189] Okay, and I argue, yes.
[1190] So this is the hardest part of the book, for sure.
[1191] But here's one thought.
[1192] Supposing you could clap your hands and you would bring into existence a billion people and they would have lives of intense suffering, horrific torture, worst possible suffering, for 10 years and then they die.
[1193] Do you think you ought to do that?
[1194] Or do you think it's neutral to do that, or do you think you ought not to do that?
[1195] Ought not.
[1196] Yeah, no, no, thank you.
[1197] Yeah, they're serving a pharaoh in 2660 BC?
[1198] Nah, you can skip that.
[1199] Okay, great.
[1200] So we think that we can have reasons not to bring into existence lives that are sufficiently bad.
[1201] Can I just say technically what he just did to us?
[1202] He got us to agree to the null hypothesis.
[1203] Okay.
So I'm now going to flip it, which is just, okay, if you agree it's bad to bring into existence lives that are sufficiently bad, why is it not good to bring into existence lives that are sufficiently good?
[1205] You'll never be able to guarantee that.
But in this thought experiment, you can, because we could guarantee it was 10 years of suffering, and by the same token we're going to guarantee it's 10 years of flourishing.
[1207] Yeah.
[1208] I guess.
[1209] You got to say yes, unfortunately.
[1210] He's like F. Lee Bailey or one of these legendary lawyers where it's like, yeah, I guess that glove doesn't fit him.
[1211] Well, sure.
[1212] But it's not reality, but yes.
[1213] You're right.
[1214] Of course.
Well, it's just F. Lee Bailey is the first one that popped into my mind.
[1216] And he was very famous and effective in a courtroom.
[1217] And then he had Johnny Cochran.
If the glove don't fit, you must acquit.
[1219] Anyways, we just were like, we agree the glove doesn't fit.
[1220] That's what we just did.
[1221] And then he said, so then the opposite is you must acquit.
[1222] Okay.
[1223] I hope I'm making better arguments than Johnny Cochran did.
[1224] Well, he made an impeccable argument.
[1225] You're making a better argument.
[1226] Is it more effective?
[1227] That's up in the air.
[1228] Anyways, we got distracted.
The conclusion I come to is like, actually, if someone will have a life that is sufficiently good, it's flourishing and happy, it is a sort of moral loss for that life not to exist.
[1232] And so when we think about the extinction of the human species or the end of civilization, then the fact that all of those people to come, all of those lives would be lost.
That just is a moral loss.
[1234] And that gives an additional reason to care about the end of civilization or the end of humanity.
[1235] Yeah.
[1236] You know, another way to even make it a little more finite, Monica, I almost would position it like, it's easier for me because I have kids, right?
[1237] If I imagine I have the choice to give other people this otherworldly experience of sharing a life with this little person, would I want that for other people?
[1238] God, yes.
[1239] But I would even say, if you don't have that experience, think of the apex moment in your life with you and Cali.
[1240] No, I know, but none of this is guaranteed.
[1241] I could have a kid.
[1242] I can provide for a kid.
[1243] I can do a lot for them.
[1244] And they could be miserable.
[1245] They could have a mental health issue.
[1246] There's no guarantee that that person will flourish.
[1247] Yeah, there's not a guarantee, that's right.
[1248] But you can at least make informed estimates.
[1249] So one of the things I did in the course of this book was I actually commissioned a survey to ask people in both the US and India, and this is by leading psychologists, to ask them just, are you happy with your life?
As in, do you regret being born or not?
[1251] Do you think your life has more happiness than suffering in it?
[1252] And it turns out in both countries, the large majority of people say yes, that they are happy to have been born.
[1253] They think their life had more happiness than suffering.
[1254] If they could live their life again, they would choose to do so.
[1255] Then if you think, like, oh, and maybe I can do better than that average because I can bring my kids up particularly well, then I think it's a benefit you're bestowing on that child.
[1256] That's a good counter.
[1257] I know you're a liberal person, so, I mean, everyone's thinking it, who's listening to this.
[1258] I need you to explain then why we shouldn't all be pro -life.
[1259] I am so choice.
[1260] He's going to compare the suffering of the mother to the suffering of the embryo.
[1261] You could definitely talk about that.
[1262] I was just going to say a different thing.
[1263] I do conclude in the book that it's good to have kids, and that's one way of making the world better.
[1264] So you can donate to charity, you can work in a career that has a big impact, or, yeah, you can have kids and bring them up well.
[1265] And I actually think for a few different reasons, that's a way of making the world better.
In none of those cases do I think the government should get involved and, like, force people to donate to particular charities or work in particular areas, nor do I think they should interfere with people's reproductive rights and reproductive choices.
[1267] Because it's not like I think we're obligated to have as many kids as possible.
[1268] I just think it's one pathway to living a good life.
Can I pitch something? Because I like you and I like what you're saying and I too want the world that you want.
[1270] But again, I'm a different personality type than you.
[1271] I'm not inclined to think much out of my circle, out of my immediate family.
[1272] But I want to pitch.
[1273] There'll be many people that are like me listening and they're like, I'd love to care.
[1274] I just don't.
[1275] This is the answer for myself.
[1276] I'm an individualist, and one of my tenets is I leave the world minimally as good as I found it.
[1277] That's my own moral compass.
And then through that, I even open up the idea of, wow, beyond just leaving it as I found it, I actually am finding I can improve the lives of the people around me. And for me, my own ego, my own vanity, my own hero of my story, I get a deep satisfaction out of that.
[1279] I like the guy at the end of the night lying in bed who elevated some of the people around me. And I also am of the opinion that you're a big liberal.
[1280] That's great.
[1281] You're thinking of the community.
[1282] I think there's a lot of validity to the right leaning people and conservatives, which is, hey, make your little community good.
[1283] Make your neighborhood good.
[1284] I also think that this whole thing can trickle upwards from people who are relatively selfish and just want to make the five people around them healthier and flourishing.
[1285] I think when that happens, those people are exposed to five more people and it fans out.
[1286] I think there's a micro level and personal level to explore that works in tandem with your approach and my wife's approach.
[1287] And I guess I'm appealing to the people like me in suggesting that.
[1288] Look, there's two ways of thinking about doing good in the world.
[1289] There's this kind of obligation framing.
[1290] The child's drowning in front of you, you're an asshole if you don't save them.
[1291] But it's also this different framing that I find very motivating, which is if you save a life, that's this amazing thing.
[1292] My brother saved a woman's life from drowning when he was, teenager.
He won an award at the school, like Mum and Dad were over the moon.
[1294] I haven't asked him, but I think that would be like really one of the most meaningful moments.
[1295] Well, minimally on his ledger, he also probably called some kid a dork.
[1296] And that sucks.
[1297] And that was on the ledger.
[1298] Well, he bullied me when I was a kid.
[1299] There you go.
[1300] There you go.
[1301] But by God, he saved that woman's life and it probably put him in the black a little bit.
[1302] But now imagine you were doing that like every week.
[1303] Over and over again, you're saving these lives.
[1304] You would think, wow, this is just this remarkable existence I've had.
[1305] Well, we can do that all the time.
Like, that is just what moral reality is, and you can live like that all the time, saving dozens of lives over the course of your life.
[1307] And then when we look to these other issues, okay, now think even bigger potentially.
[1308] Think, oh, I was like one of the important people who helped win the war against the Nazis and prevent a totalitarian Nazi state from ruling over the world.
Or I was one of the people that helped avoid World War III between the U.S. and USSR.
Like, imagine those things in history; you would think, I've had this, like, amazingly important life.
[1312] Well, actually, again, that is something that anyone can do.
[1313] And maybe it feels more boring than the, like, action movie that you might be used to.
[1314] Maybe it's just cutting a check to a really impactful organization.
But, morally speaking, it's the same.
[1316] In terms of your impact on the world, it's the same.
[1317] Okay.
[1318] I don't know if it's the gap in your teeth.
[1319] I don't know if it's how charming you are.
[1320] You're making me blush.
[1321] Fuck you for this.
[1322] This happens once or twice a year.
[1323] You do?
[1324] Uh -huh.
[1325] I'm not going all the way.
[1326] You're not.
[1327] Okay.
[1328] I'm not doing 10%.
When we get off, I'm going to donate 13,500, which was the exact amount I wanted to buy this new motorcycle for.
[1330] Holy shit.
I'm going to donate 13,500 to Ways We Can Change.org.
GivingWhatWeCan.org.
[1333] Rob's the one that will make me do this.
[1334] He's writing it down.
[1335] Okay.
[1336] I'm donating a Ducati instead of getting a Ducati.
[1337] That's really nice.
[1338] People you help will be thankful.
[1339] Okay.
[1340] Boy, it's hard to get a penny out of me. So just pat yourself on the back.
[1341] I'm doing it right now.
[1342] Wonderful.
[1343] I'm really pleased that you've been swayed by the force of reason.
[1344] Bamboozled, some might call it.
[1345] No, you're doing a really good thing.
[1346] Okay, and then hopefully a bunch of people will check out what we owe the future.
Again, like you and Mike Schur, I don't like people like you because you're better than me and I'm competitive, but at the same time I'm so fucking grateful that I'm not the only personality type on planet Earth or this place would be miserable to live in.
So I truly, from the bottom of my heart, I thank you, Will, for doing the work you do and for writing a book like What We Owe the Future.
I hope everyone checks it out, orders it immediately.
And, you know, maybe you weren't going to buy a Ducati, but maybe you were going to buy a shirt with a seam in the back of it.
[1351] Okay.
[1352] How much did you spend on the shirt with the seam in the back?
[1353] Look at this.
[1354] I've fucked her now.
[1355] Too much money for the shirt with the seam.
[1356] You're embarrassed to say?
[1357] Yeah, I'm a little embarrassed to say.
[1358] But will you donate that amount of money?
[1359] I'll donate that amount times three.
[1360] Oh my God!
[1361] It's not that much.
[1362] It's unfortunately not that much.
[1363] Holy fuck.
[1364] The telephone lines are open.
[1365] People are calling.
[1366] Rob, Rob, you silent little shit.
[1367] Rob, what were you going to buy recently that you might instead?
[1368] I don't have anything.
[1369] I was going to buy.
[1370] You haven't been eyeing anything on the internet?
[1371] This is the problem with a minimalist.
[1372] We need multiple strategies for this.
[1373] I was thinking I'm going to go iPhone at some point.
[1374] Okay.
[1375] You're going to donate an iPhone value?
[1376] Yeah.
[1377] Okay, there we go.
[1378] Rob's in for an iPhone.
[1379] Nice.
She's in for three X of a Roe shirt, and I'm in for a new Ducati.
[1382] Okay.
[1383] The fucking phones are ringing off the hook.
[1384] Fantastic.
[1385] Well, maybe listeners can message in and say what other things they gave up and donated instead.
[1386] And maybe you could shout out next time.
[1387] No, we will.
[1388] So listen.
[1389] So when you listen to this episode, go on to Instagram and tell us what thing you didn't buy and donate it instead.
[1390] It'll be very fun.
[1391] And we will heart it and we will commend you for your sacrifice.
[1392] I want to do something, but I can't yet.
[1393] Okay.
[1394] I would do 10%.
[1395] I have to finish my house.
[1396] Okay, so, well, she's got to, you understand.
[1397] I'm in the middle, I'm in the middle of a pretty intense construction project, and I think my business manager would not be happy if I did that.
[1398] Well, also, she's already tithing 10 % of her income because her church has a lot of litigation for assault cases.
[1399] So we've got to keep the coffers full to pay out those plaintiffs.
[1400] Yeah, you know, you get it.
[1401] I understand.
[1402] Okay, well, next time I'm on the show, I'll check in again.
[1403] You'll both be donating 10%.
[1404] There we go.
[1405] And then we'll talk about your choice of career.
[1406] Fuck you, Will.
[1407] What a pleasure meeting you.
[1408] I wish you a ton of luck with what we owe the future.
[1409] I hope everyone gets it.
[1410] I hope people donate whatever weird thing they were going to buy that they didn't need like me. Yeah, that's that.
[1411] Thanks so much.
[1412] This has been a blast.
[1413] Absolute joy.
[1414] So nice meeting you, Ben.
[1415] All right, take care.
[1416] And now my favorite part of the show, the fact check with my soulmate Monica Padman.
[1417] Oh, and action.
[1418] Hello.
[1419] Whoa.
[1420] I just feel weird.
[1421] What happened?
[1422] They feel like small.
[1423] Oh.
[1424] Because David was wearing them last.
[1425] Does he wear them in the extra small?
[1426] Oh, my God.
[1427] His head's so small?
[1428] That's not true.
[1429] His head's enormous.
[1430] You know, Callie's head is enormous.
[1431] It is?
[1432] You can't tell visually.
[1433] I know.
[1434] What an optical illusion.
[1435] It's a fun pop out because, like, yeah.
[1436] You would never guess, but she wears a bigger helmet than Max.
[1437] Oh, that's, I guess, I don't know if that's a feather in her cap or embarrassing for Max or who cares?
[1438] No, I think it's who cares.
[1439] Yeah, but don't you think a man wants to wear, like a man wants to wear bigger shoes than his wife.
[1440] Yeah, but I think.
[1441] And bigger gloves.
[1442] I don't think it has anything to do.
[1443] I don't think it has anything to do with Max's head so much as it has to do with Callie's head.
Right, it's more a comment on her head.
[1445] Yeah.
[1446] I have the inverse of her.
[1447] You'd look at me and think I have a big head, right?
[1448] Visually?
[1449] I don't.
[1450] I have a small head.
[1451] I wear a large helmet, but really I could get away with a medium helmet.
[1452] You pick large just because you want people to think it's large.
[1453] No, I'm actually in between, and I'd rather have it be a hair loose than a hair tight.
[1454] Oh, okay.
[1455] Tight is uncomfortable.
[1456] But loose is fine.
[1457] It's safer.
Tight is safer, for sure.
[1459] Yeah.
[1460] Yeah.
[1461] The loose is more comfortable.
[1462] And I will say I've been in a slew of accidents, and as you know, I've never had any real helmet damage.
[1463] Can you knock on wood right now, please for me?
[1464] Knock, knock, knock, knock, knock.
[1465] That's fake wood.
[1466] I think you need to go to the desk.
[1467] First of all, it's not fake wood.
[1468] Compressed board is still wood, Rob.
[1469] Can you just please?
[1470] He throws these bombs out, and then he goes over to his phone.
[1471] It's like, what is he doing?
[1472] I know.
[1473] Okay, so I'm knocking on wood.
[1474] Thank you.
[1475] But in my different motorcycle.
[1476] accidents.
[1477] I'm wearing helmets in them.
Only one time did I have a scratch on my helmet, the one where I went over the handlebars, and it was still minor.
[1479] I've certainly seen dudes come back from accidents and their helmets are like crushed.
[1480] Anyways, I've opted for a little tiny bit more comfort.
[1481] That's not the point.
[1482] The point is it looks like I got a big old head on my shoulders and I don't.
[1483] I have a very medium size like my build, medium build.
[1484] Yeah, it's all proportional.
[1485] Okay.
[1486] Speaking of helmets.
[1487] Okay.
[1488] Wow.
[1489] Ding ding ding, ding helmets?
[1490] Yes.
[1491] Incredible weekend.
Yes, well, particularly Sunday, for the race, it was incredible. Yes. People who follow me on Instagram, they have to know ASAP how there was a Red Bull race car in the driveway. Which is a great question. How is it there? How is it? Yeah, I think some people thought perhaps I purchased one, which, oh, I think even if you get a deal on it, you're looking at a ten million dollar car. Oh my God. Anyways, our friend Jeremy at Red Bull, who, you know, I love him.
[1493] Yes, he is the best brand ambassador a company could have.
[1494] So Jeremy, who has always been our host when we've gone to races and gone to the Red Bulls suite, we text each other often.
He's also really good friends with Ricciardo.
[1496] Oh, he is.
Yes, and in fact, I was introduced to Jeremy through Ricciardo.
Because when Ricciardo was at Red Bull, I think he and Jeremy were buddies.
[1499] Because who wouldn't be friends with Danny?
[1500] Or Jeremy, but they're a match made in heaven.
[1501] True.
[1502] So at any rate, I text with him.
[1503] know, semi -regularly.
[1504] And he hit me up like, I don't know, three, four weeks ago.
[1505] And he said, hey, by the way, we have two rolling chassis that we put on display at different places.
[1506] And they're not really going to be used until Austin, if you'd ever want one at your Formula One Sunday screening party.
[1507] Yeah.
And of course, I was like, oh my, yes, every race.
[1509] Like, any time that could be at my house.
[1510] And I'm pretty impressed with myself.
[1511] I didn't tell anyone it was going to be...
[1512] You told me. But that makes me feel so special.
[1513] I loved it.
[1514] I didn't tell anyone.
[1515] Yeah, yeah, yeah, yeah.
[1516] Are you mad at yourself or me?
No, I've, no, no, no, no, no. I'm not mad at anyone.
[1518] I feel like that in itself needs a further explanation, which is A, I trust you weren't going to tell anyone.
[1519] Yeah.
[1520] B, I assessed, I wouldn't be denying you the awe and shock of seeing it in the driveway.
[1521] Because you're not a crazy gearhead.
[1522] Oh, yeah.
[1523] Oh, this is funny.
[1524] You took that in such a different direction than I took it.
[1525] Oh, how did you take it?
[1526] Because I took it as I want to tell someone.
[1527] Yes.
[1528] So I got to be that person.
[1529] And obviously, Kristen knew and Carly.
[1530] Yeah, yeah, yeah, yeah.
[1531] I knew too.
[1532] Shut up, Rob.
[1533] You did not know that.
[1534] No, Carly told me. I didn't tell anyone.
[1535] You are getting really close to a physical altercation with me this morning.
[1536] Because look, let's put it this way.
[1537] I want to back up.
[1538] Okay.
[1539] If I thought you were going to pull into the driveway and go, oh, my God, what the fuck?
[1540] I wouldn't have denied you that.
[1541] So I'm just saying there is a calculus in why I would have even.
[1542] You know me, I'm not, I don't tell people what I got on for Christmas.
[1543] I don't believe in opening presents early.
[1544] You'll wait for the surprise.
In fact, the anticipation of the surprise is almost better than the surprise.
[1546] So I didn't think I would be denying you that really.
[1547] Okay, but I didn't tell Charlie, I didn't tell Ryan, I didn't tell Matt.
[1548] I didn't tell Sean or Nina.
[1549] I told one more person, I don't want it to make you feel less special.
[1550] Who?
[1551] Eric, because I didn't think he would give a shit.
[1552] Oh, that's fine.
[1553] Yeah, I was like, he's not going to even care.
[1554] He won't even notice it.
[1555] Like, if I don't tell him it's coming, he'll walk in and pass the car and not have noticed that.
[1556] That's true.
[1557] And then he'll people be taking pictures.
[1558] And he'll first notice that people are taking pictures, but not of what yet.
[1559] Yeah.
[1560] Okay.
[1561] So I didn't tell anyone which is so hard to do.
[1562] Also, I wanted to.
[1563] You didn't tell anyone except for four or five people.
[1564] Oh, you're such a little boy.
[1565] Yeah, yeah, yeah.
[1566] I didn't tell any people.
[1567] And, but the added, here was now another layer of this in depth.
[1568] This is like one of your stories where you go to the mall.
[1569] Okay.
[1570] The other element I had to keep on the table was I needed people to come.
[1571] Like this wasn't a Sunday to skip.
[1572] Oh.
[1573] People come.
[1574] They come.
[1575] They don't come, right?
[1576] So I was in this really tricky situation where I was like, I had to let people know, like, you have to come to this Sunday without telling them there's a Red Bull race car in the driveway.
[1577] Oh, how did you do that?
[1578] I was just like, we're going to blow it out for Italy, food.
[1579] I see.
[1580] Spaghetti.
[1581] Yeah.
[1582] Yeah, it was really, that was for me. That was your Red Bull race car.
[1583] Yes.
[1584] And I don't think I told you I was cooking spaghetti.
[1585] You didn't.
[1586] I arrived and there was a pot of my favorite food.
[1587] Well, I like to think you walked in and were popped in the nose.
[1588] Boom, with the smell.
[1589] Well, there was a lot going on because there's amazing smells and you guys did blow it out.
[1590] And Kristen made this incredible table scape with an olive oil bar.
[1591] It could have been a display.
[1592] Like when you walk through, not Park Avenue, in New York.
Buca di Beppo.
[1594] No, not the Pope Room.
[1595] When you're walking and you see the window displays of really nice places.
[1596] It could have been a window display somewhere.
[1597] It was beautiful.
[1598] She did a great job.
[1599] And so I was first, I was stuck.
[1600] Then I moved over and then there was spaghetti, my favorite food.
[1601] One of my deathbed foods.
[1602] All of that willpower or self.
[1603] Yeah.
[1604] My brain's not on fire today.
[1605] I think because I did eat about six pounds of carbohydrates yesterday and gluten in particular.
[1606] No, self -restraint.
[1607] Self -restraint.
[1608] Yeah.
[1609] It was rewarded because people arrived and what was great is no one, people didn't arrive all at once.
[1610] Even though you tell people it's at 11, they come in scatty wampas.
[1611] Yeah.
You got some guys there at 11, 10:45.
[1613] Charlie was first on the scene.
Charlie and Eric and the boys.
[1615] It was great.
He couldn't believe what he was seeing.
[1617] We got to swim in that for a while.
[1618] That's nice.
[1619] And then next up was Matt Collins.
[1620] Oh, another great recipient.
[1621] Yes.
[1622] And then last, well, second to last on the scene was Ryan Hanson, which he gave me the best reaction.
[1623] He was, almost did a backflip.
And then, okay, last thing I'll add is the last two to arrive, Shaun White and Nina, rolled in.
[1625] Well, actually, you were last.
[1626] Thank you.
[1627] But you're not a, you don't have to be there for the beginning of the race.
[1628] The race is immaterial to you.
[1629] You pop in, take a little peek.
[1630] Except I was very excited about this race in particular.
[1631] One, because I knew about the car.
[1632] I didn't know about the olive oil bar or the spaghetti.
[1633] That was a huge excitement.
But I was very excited because Danny was starting fourth.
[1635] Yes, yes.
[1636] Anywho, point is when Sean and Nina rolled in, they rolled in in their car past us.
[1637] They stopped.
[1638] They waved.
Hi, but no, just a normal wave. Yeah. Then they parked, and I was like, oh, they don't even really care. Yeah. And why not? We already know he was on the grid getting sucked off by Hamilton. Yeah. Oh, cool, you got one on a model of the car. Anyhow, they hadn't seen it. Then they walked up and they were like, oh my God. They went bananas. Oh, good. And they were like, how did we not see that? Anyways, that's all to say I showed a little self-restraint and it really paid off in the end because people really, there were five waves of excitement.
[1640] Can I ask you a hard question?
[1641] And I want you to be honest.
[1642] I think you know what it is.
[1643] I don't.
[1644] My guard is totally down.
[1645] Oh, God.
[1646] Who are you the most excited and be honest?
[1647] Yeah, I will be.
[1648] About seeing that car in the driveway.
[1649] Charlie.
[1650] Okay.
[1651] That's what I was hoping your answer would be.
[1652] But I wasn't going to judge you either way.
[1653] Yeah.
[1654] But that's what I was hoping.
[1655] Yeah.
[1656] Charlie for sure.
[1657] Okay.
[1658] Yeah.
[1659] And then, Sean?
[1660] No, it kind of went in order of how interested people are in F1.
[1661] So, yeah, Matt was probably next.
[1662] Great.
[1663] Because it just, we care so much.
[1664] Yeah, and you guys, you've been watching all season.
[1665] Yeah, we don't miss a practice.
[1666] We're texting during all the practices.
[1667] People are reading stuff throughout the week.
[1668] Did you hear this is happening?
[1669] Like, it's our religion.
[1670] So definitely those two.
[1671] That's nice.
And then come as they may. What's the saying?
[1673] I don't know.
[1674] Let the chips fall where they may on the rest of the guests.
[1675] Sure.
[1676] I did make one joke that maybe was good or bad.
[1677] I'll let you decide.
[1678] This is also why he's a dangerous friend, Sean.
[1679] We don't need another Dax in the mix.
[1680] This is what he said immediately.
He's like, is that thing, does it have an engine in it?
[1682] And I'm like, no, if it had an engine, don't you think I'd be driving it around the neighborhood to get attention?
[1683] It's like, yeah, he's like, yeah, he's like, I want to get in it.
[1684] And I'm like, yeah, I don't think we're supposed to like take it apart so you can get in it.
[1685] He's like, already now he wants to get in it.
And then he's like, we need to tow that thing up to the top of, like, Mulholland so you can go down like a soapbox derby car.
[1687] Like he went, we got to go out now and test ourselves in this thing.
[1688] That's what I mean we don't need another.
[1689] You're right.
[1690] Yes.
[1691] An adrenaline junkie.
[1692] And then it woke up my.
[1693] Oh, no. Yeah, my competitive spirit.
[1694] And I thought, wow, I had been so content with it just sitting here in the yard.
[1695] But now that he mentions it, I should have had Carly tow me and that thing.
[1696] behind the Hellcat and see if we get up to 120.
[1697] Okay.
[1698] Can you just call me before you make any decisions for the rest of your life?
[1699] Yes.
We'll be on Los Feliz Boulevard.
[1701] Like, you'll look out your window and you'll see, is that a Red Bull car with no engine?
[1702] I'm just shaking my head.
Shaun White's driving the Hellcat.
[1704] Listen.
[1705] It's a good test for you.
[1706] It is to test my maturity.
[1707] Yeah.
[1708] Also, thank you, Jeremy.
[1709] Thank you, Red Bull.
[1710] Thank you, Prego.
People in the comments were like, how dare you drop Danny Ric like this, or you're betraying Daniel Ricciardo.
[1713] Because of the car?
[1714] Because it's Red Bull.
[1715] Oh.
[1716] And so I just wanted everyone to know the complete history.
[1717] I've always been a Max fan.
[1718] I was a Max fan before I met Daniel.
[1719] I also am a huge Daniel fan.
I'm also a Charles Leclerc fan.
[1721] I have a capacity to love and root for a lot of things.
[1722] You are absolutely right.
[1723] That is a huge part of your personality.
[1724] Which part?
That you don't have that piece where, like, in a sporting event, you, like, hate everyone except the person you're rooting for. Right. I don't love people at the exclusion of all others. Right. I will, like, just to give everyone a little bit of a break, many people do. Many people watch sports in that way, have like an extreme loyalty towards an athlete, and you just don't, and you never have. So I think it is worth saying that. And I also think it's more complicated than people are giving it credit for, which is, if Max was my best friend, I likely wouldn't have a Leclerc shirt on, because that's his actual competitor for the championship.
[1727] Likewise, I might be a bigger Lando Norris fan.
[1728] I don't know.
If Danny wasn't in the mix, I might be a bigger Lando Norris fan, but I'm not rooting for Lando Norris because I want my friend Daniel to beat his teammate Lando Norris, because that's what's relevant in Daniel Ricciardo's life right now, is that he is better than Lando Norris.
Also, everyone that Daniel's competing with?
[1732] I'm rooting against those people.
[1733] But Max and Daniel are not competing at all.
[1734] They're just not.
[1735] Right now.
[1736] Yeah.
[1737] Currently, they're not.
[1738] The McLaren is not a competitive car.
It's in fifth in the constructors' championship.
[1740] So there's so many ways in which people can have victory in that sport.
[1741] Totally.
[1742] I also think you can compartmentalize it.
I think, like, you watch the race and you can watch it as the race and then separate that from your friendship with Danny.
[1744] But I can't.
You came in a good mood because Daniel was starting in fourth, and you got progressively more and more... what's the cute thing we say about you when you get antsy and tantrumy?
[1746] Oh, tantrum.
[1747] Yes, you were getting tantrumy.
[1748] Well, I didn't like what they were doing to him.
[1749] Right.
[1750] And you were saying things that were crazy.
Like, they're making him come in, as if he was the only person that had to pit.
[1753] He had to pit.
[1754] And they were like, why can't he just not pit?
[1755] Yeah.
[1756] I still feel like that.
[1757] I think they tried to screw him.
[1758] They did try to screw him at one point.
[1759] They told him to hang back.
[1760] Yeah, even Matt said it.
[1761] Oh, I missed that part.
[1762] But let's go back to the tires.
If his tires would have gone 54 laps, which they wouldn't have, they literally would blow up.
[1764] They don't last that long.
[1765] Even if by some miracle he could get 54 laps out of his tires.
[1766] They would be so not sticky the last 30 laps, the last 20 laps, the last 10 laps, it'd be like he were driving on ice and everyone else was not.
So he'd be losing five, six seconds a lap, lap after lap after lap, as opposed to coming in and taking the 20-second pit stop.
[1768] They held him there for four minutes.
[1769] And I did not like it.
[1770] No, he was doing well.
[1771] Third of the race, he was in there.
[1772] But here's the problem.
[1773] And this is why I say Max and he aren't competing.
[1774] So by the second lap, you had Max, Leclerc, and George Russell, Mercedes, Ferrari, Red Bull.
[1775] And then you had Daniel, Pierre Gasly, and Lando.
So two McLarens and an AlphaTauri.
[1777] Within laps, there's 10 football fields between those two groups of cars.
And it's not because that's the difference in the driving ability.
[1780] Right.
[1781] Those cars didn't have any top speed.
[1782] Every straightaway, they're getting another 12 feet ahead of them, lap after lap.
[1783] So that's what I'm saying, like, Daniel can't, what's he going to do?
[1784] No, I know.
[1785] Make his car go faster on the straightaway?
[1786] That's why sometimes this sport makes me crazy.
[1787] Uh -huh.
[1788] Because I'm like, just go a little faster, Danny, but like he literally can't because the car won't.
[1789] Yeah, doesn't go faster.
[1790] That's very upsetting.
[1791] Yeah, that's why I think, yes, if you're only watching for who wins the race, then it's very disappointing because 19 people lose.
[1792] I just.
[1793] We love Daniel, and he's our friend, and we want him to get to race for Red Bull or for Ferrari or one of the top teams and be competitive.
[1794] That's what we would love for him.
[1795] Yeah, it hurts my heart.
[1796] Yeah, of course.
[1797] Oh, here's a good question.
Now, this one would be mildly challenging, like your question about the Shaun White thing. Okay, what if you found out that Daniel doesn't give a fuck, that he's happy as a clam? He's not. That's not... I know, but that's just... No, I'm... Here's my question. I asked you a real question. You ask me a fake question. You have not talked to Daniel. I saw his post. He was not happy. What I'm saying is, you haven't asked Danny, how are you doing after the race? And what if he said, oh my God, I had the greatest time in Italy, we went to this restaurant, and I totally don't give a fuck because I already am off that team.
Like, what if you found out that there's zero suffering on his end?
[1801] I'm not suggesting that's the case.
[1802] But if you found that out, would you be fine with it or would you still be upset?
[1803] I would be fine with it.
[1804] You really would?
[1805] Yes.
[1806] Okay, good, because like your suspicion that I would value Sean's opinion more, I'm a little suspicious because of your relationship with winning and losing and what it means.
[1807] Yeah.
[1808] That you couldn't, even if it didn't bother him, it would bother you greatly.
[1809] What's happening when I'm watching this is, oh, my God, he's going to be so sad.
[1810] Like, as a winner, as a fellow winner, I do know what that feeling is like, miserable.
[1811] You don't, though.
[1812] Why?
[1813] Because you left the sport on top.
[1814] No, no. But, like, we had competitions and stuff where, like, we fucked up.
[1815] And it was awful.
[1816] Okay.
And I just know that you can't be at that level of a sport and not care, like it's not a possibility.
[1819] But if he said, yeah, I was bummed out for like a minute and then I moved on and we, yeah, had fun and did this.
[1820] Yeah, it's going to be what it's going to be.
[1821] And I drive as hard as I can and that's what I do.
[1822] I'm open to the fantasy that that could happen or is happening.
[1823] Yeah.
[1824] There was the added heartbreak of last year he won that race.
[1825] Yeah.
And he hadn't won a race in a long time, and he was already being criticized last year.
[1827] Yeah.
And so now he's starting the race in fourth, and you're like, you have this glimmer of hope, like maybe he can do it again this year and shut all these motherfuckers up. Yes. I just want him to be happy. Isn't it great, though? It's just all stories in our head, like, because he won the race last year. It is, it is a story, but, like, the stakes are high. It's like, he's leaving that team, and will he really get a ride next year? I fucking hate all those people, and I want him to, you know, give them a big fuck you. I can also be, I can love Daniel and also be very objective about his performance this year.
[1830] Like, I love him.
[1831] I think he's a brilliant driver.
[1832] I also think it's possible that because their long -term bet is on Lando, which it is, he has like a fucking eight -year contract or something.
[1833] He's their driver.
[1834] I know.
[1835] That sucks.
[1836] But knowing that all their eggs are in his basket, it does make sense as a team that they would design the car to absolutely benefit his driving style.
Now, Max has a driving style, right? Like, Max likes oversteer. Most drivers don't like oversteer. The Red Bull car is designed for Max, who likes oversteer, and it's got to be that way. That's why they're winning. So I can also acknowledge that, okay, McLaren's a team, they've got all their eggs in Lando's basket, they have to make the car be exactly what Lando Norris wants. And I also think it's quite possible that Daniel in a different car, in his previous Renault, would be beating Lando.
[1838] I like to think, largely the car is just not for him.
[1839] I'm not bummed they fired him.
[1840] I don't want him to stay at McLaren and be miserable and be a half second behind Lando every time.
[1841] I'd way rather see him go to a team where he's the star and they set up the car for him.
[1842] And then he starts blowing motherfuckers away.
[1843] So, you know, for me, it's all very complicated and I'm weirdly optimistic.
[1844] It wouldn't make me happy to see him stay at.
[1845] McLaren another year and just suffer.
[1846] Yeah, that's true.
[1847] I think partly maybe why people get riled up, like if they say like, bye, you've abandoned Danny or something.
[1848] We're all projecting our own feelings onto that.
[1849] Would you be loyal to me if I was in that position?
[1850] What would you be saying?
[1851] Would you be also rooting for this other person?
[1852] You know, I think that's what people are doing.
[1853] And I can relate.
[1854] Like, not that it's healthy, but I can relate to being like, oh, my God, where's your loyalty?
[1855] Because if you can do that to him, like, could you do that to me?
[1856] Right, I guess.
[1857] But my loyalty to you as your friend is honesty.
[1858] Yeah.
[1859] I don't believe in going along with people's lies because you love them or ignoring reality because you love somebody.
[1860] If that's somebody's expectation of me as a friend that I can only like them or only root for them.
[1861] That's not a friendship I want.
[1862] Like, I have friends who are hosts of shows, and I watch a different show.
[1863] They're my good friend.
[1864] There was a period where I would stay up and watch John Oliver.
[1865] Now, Jimmy Kimmel is one of my very, very good friends.
[1866] I don't watch his show.
[1867] I don't stay up and watch that show.
[1868] And I'm talking about how great John Oliver is in public.
[1869] I should be talking about how great Jimmy Kimmel is.
[1870] He's my friend.
[1871] And my commitment to Jimmy Kimmel as a friend is that when he's in trouble and he calls me, I'm always there for him.
[1872] Yeah.
[1873] You know what I'm saying?
[1874] That's what I'm bringing to it.
[1875] It's not that I'm your cheerleader.
[1876] Yeah.
[1877] I think people are different.
[1878] Yeah, they want different things out of relationships, which is totally fine.
[1879] Well, not necessarily.
[1880] I just think people provide different things.
[1881] So maybe that is not what you provide.
[1882] Right.
[1883] But maybe someone else does and you can have all those things.
[1884] I do.
[1885] Also, I am your cheerleader.
[1886] Take it away from me. I'm just saying.
[1887] in general.
[1888] If you're not getting everything from one person, that's okay.
[1889] That's normal.
[1890] Here's what I think is weird.
[1891] I need you to root for me. And I need to be your number one pick of a sport that you love because we're friends.
[1892] That feels a little grody to me. Like, I have lots of friends who love smartless and don't listen to this show.
[1893] Uh -huh.
[1894] It's not like my expectation, if we're close, is that you shouldn't listen to Smartless and should listen to our show.
[1895] I know that this is a parallel.
[1896] Like, it's a really good analogy, but it's not registering for me as much because I think content feels different to me than like a sport.
[1897] Well, there can only be one winner, you could argue.
[1898] There can only be one winner.
[1899] They're literally doing the exact same thing.
[1900] But I guess that's what I'm trying to explain to everyone who follows me on Instagram who thinks it's that.
[1901] It's so much more complex than just Max is the winner.
[1902] Totally.
[1903] It's just complicated because I can be like so ride or die Danny because I don't care that much about F1.
[1904] Right, exactly.
[1905] So it's easy for me to walk in and be like, I only care about what's happening with Danny.
[1906] If he's not winning, I don't want anything to do with this sport.
[1907] Yes.
[1908] Anyway, really fun weekend.
[1909] Really fun weekend.
[1910] You did not go in the sauna, sauna.
[1911] I didn't.
[1912] How was the sauna?
[1913] Sauna was sweaty.
[1914] Yeah.
[1915] We packed six in there yesterday.
[1916] Yeah, it was tight.
[1917] Oh, tight, tight, tight.
[1918] A little claustrophobic.
[1919] Yeah, you can get four very comfortable in there.
[1920] Five, you're at the limit. Six was past capacity, but we made it work.
[1921] Ooh.
[1922] Eric licked Amy.
[1923] Oh, yeah.
[1924] Yeah, that's his thing.
[1925] Yeah, that's his thing.
[1926] What did he say she tasted like?
[1927] Her feet?
[1928] No. Here's another thing I'm learning: in addition to the foot fetish, he likes to taste what people's sweat tastes like.
[1929] He tried Kristen's a while
[1930] ago, and then Molly's, and then yesterday he tried Amy's. I don't know if Erika let him, or... I might try. He had a little taste of mine, and, um, he said Amy's and mine were similar, which was kind of shocking. I said to him, what does hers taste like, Athletic Greens, like minerals, vitamins? He said, no, it's kind of fried chickeny, like yours. Gamy? Amy? I want to say gamy. He didn't use the word gamy. And Amy Hanson does not smell at all. That's not what I want to conclude from this. But at any rate, he thought that her sweat tasted a bit like mine, which was curious.
[1931] Yours tastes like fried chicken?
[1932] Yeah, it's what he's, but he's not the master of adjectives.
[1933] You've got to really wonder, like, what he meant by fried chicken.
[1934] I know, I think that's very visceral.
[1935] Like, I know what that tastes like.
[1936] That's salty and meaty.
[1937] It's gross, right?
[1938] No, I love fried chicken.
[1939] Well, I love fried chicken, too, but when you imagine licking someone's sweat and it tastes like fried chicken, that's, see, like...
[1940] Hold on, I got to call him.
[1941] Okay.
[1942] This is Eric.
[1943] Oh, let me call Molly.
[1944] Hi, you've reached Molly.
[1945] What are they doing?
[1946] I don't even think about that.
[1947] Call their daughters.
[1948] I'm going to text it.
[1949] All right, well, TBD on the sweat.
[1950] I wonder what my sweat tastes like.
[1951] I want him to lick mine.
[1952] Didn't he taste yours?
[1953] No, he's never tasted.
[1954] You guys never co -mingled in there.
[1955] Mm -mm.
[1956] Huh.
[1957] I'm going to guess yours is like mine.
[1958] So we have such a similar diet and stuff.
[1959] Fried chickeny.
[1960] Fried chickeny.
[1961] Because we both eat a lot of fried...
[1962] I wonder if, like, you know, trying to read between the lines with him, it's so difficult.
[1963] But I wonder if it's really he's picking between savory and sweet.
[1964] That's what he really means.
[1965] Like, oh, this is kind of a savory sweat.
[1966] Interesting.
[1967] Versus this kind of a sweet sweat.
[1968] Huh.
[1969] Key sweat.
[1970] Well, but this is interesting a little because would you say you sweat a lot?
[1971] In there a tremendous amount.
[1972] But as a person, I'm not a big sweater.
[1973] Okay.
[1974] Although this sauna regimen has permanently changed me, I'm afraid.
[1975] Because now when I work out, I sweat and I never in the past.
[1976] Oh, wow.
[1977] It's almost like once that shit gets active and you get used to doing that, I don't know.
[1978] I'd love to have someone smarter than me explain that to me. But yes, now I seem to, although I'm not sweating right now.
[1979] It's not like I now have a damp forehead.
[1980] Right.
[1981] But some people, like, sweat real fast and easy.
[1982] Yeah.
[1983] Amy is one of them.
[1984] She sweats a lot.
[1985] Oh, she is.
[1986] Oh, wonderful.
[1987] So I wonder sometimes if that has anything to do with it.
[1988] If your taste has something to do with how often you sweat.
[1989] What would you say your sweat is like?
[1990] I barely sweat.
[1991] Like, Kristen.
[1992] Dangerously, yeah.
[1993] She also doesn't sweat a lot.
[1994] Yours has an explanation.
[1995] You have no water in you.
[1996] I know.
[1997] There's nothing.
[1998] There's nothing to sweat out.
[1999] to come out.
[2000] When I was working out, and just blood was pouring out of my pores.
[2001] It would be kind of hot.
[2002] That'd be so scary.
[2003] Like a demon?
[2004] No, she's not been injured.
[2005] She's sweating.
[2006] Okay.
[2007] Speaking of cool girls.
[2008] Yeah.
[2009] The row.
[2010] Oh, we're okay.
[2011] We're back.
[2012] After the long break, we're back.
[2013] It's relevant because this is Will McCaskill.
[2014] How is that relevant?
[2015] Because I donated three times one row shirt.
[2016] Okay, that makes a ton of sense, but you can see where I forgot.
[2017] So I told you at the beginning of the episode that there's a seam that goes down the back of those shirts.
[2018] Signature.
[2019] It's called a French seam.
[2020] French seams are sewn twice, encasing the raw edge within the seam and creating a very neat, delicate seam that is ideal for sheer or lightweight fabrics.
[2021] Anyway, it's a beautiful seam.
[2022] It's in keeping with the row.
[2023] Oh, my God, there's a jacket I want so bad.
[2024] From the row?
[2025] Yeah.
[2026] Mary Kate and Ashley, if you want to send me...
[2027] Send me the link.
[2028] I owe you a birthday present.
[2029] No, absolutely not.
[2030] I'm buying it.
[2031] No. You don't even know what it is.
[2032] I'm going to call whatever number I have from her that's 16 years old.
[2033] I'm going to say, Monica needs some jacket.
[2034] And she's going to go, which one?
[2035] You're going to say I don't know.
[2036] Wow.
[2037] This is out of blue.
[2038] And I'm going to go, wow, I'm impressed your number still works.
[2039] All these things are.
[2040] going to happen. We're finally going to get down to the, yeah, what have you been up to? Oh, fuck, it's too much time. I need this jacket. She's going to say, I have multiple jackets, and I'm going to say, I guess I'll take them. Oh my God, if I had every jacket. Anyway, so, okay, are supermodels across the globe tall, or is it regional? Because I said that I wanted to be a supermodel. Yes. And we suggested maybe go to another country.
[2041] Yeah.
[2042] Potentially the, um, the pygmy people of the, um, no, this is a real anthropological term of the, the, the, oh, God, what is it?
[2043] The Belgian Congo.
[2044] Yeah, read some more words.
[2045] Congo, African pygmies, Congo pygmies, Central African foragers.
[2046] The name of their population, though, like their culture is like the, the bunch of B words.
[2047] Yes, yes, it is a bunch of B words.
[2048] Bambuti and Batwa.
[2049] Yeah, so you need to be a Bambuti or Bambaqua model because those are the famous pygmy people of Africa.
[2050] But you know what's sort of unfortunate is supermodels are like they're just global.
[2051] But you could be a model.
[2052] Sure, a print model.
[2053] Yeah.
[2054] I guess.
[2055] It's great.
[2056] But a supermodel, the ones that walk the runway, they have to be tall.
[2057] Actually, it's a requirement.
[2058] The standard height requirement for a female fashion model is five feet nine inches to six
[2059] feet. Like, they won't even, the agencies won't take you. If you... See, it won't, because it's for the clothes. So I'm never going to be one, even though you guys tried to force me to believe I could. The reality is I can't. Okay, well, let me ask you this. Oh, God, back to this topic I can't get off of. One comforting thing about having that Red Bull car in the driveway, yeah, was even if I had dedicated my entire life to being an F1 driver, I physically can't fit. There's no fucking way.
[2060] They'd have to build an entirely different kind of car for me. That seat is so small.
[2061] Yeah.
[2062] It's a non -starter.
[2063] I would have dedicated, whatever, from five years old go-karting to, you know, and at about 17 it would have become obvious.
[2064] You're not going to be able to do this.
[2065] But this is what we should do in our time machine, though, because I actually wonder if your body would have adjusted if you started that young.
[2066] Gymnast bodies, they get adapted.
[2067] Yeah.
[2068] So I would have stayed really narrow because my shoulders were always being pushed up against.
[2069] Yeah.
[2070] Also, I'm a bit claustrophobic.
[2071] I would be, whew.
[2072] Yeah, but you also probably would have not been if you had been in that car every day.
[2073] I'd have been five-six, slight build.
[2074] Yeah.
[2075] Wow.
[2076] And your head would have been so small.
[2077] But maybe if I always put these long clothes on you, you would have become five foot ten.
[2078] Maybe.
[2079] Oh, my God.
[2080] Anyways.
[2081] But you're saying I'd be like a bonsai tree.
[2082] Yeah.
[2083] Okay.
[2084] Oh, my God.
[2085] Picture a pygmy dax.
[2086] That's so cute.
[2087] Do you know what pygmy means?
[2088] Yeah, small.
[2089] But not just small, proportionally small.
[2090] How big would you be as a pygmy?
[2091] Like a foot?
[2092] No, no, no. A pygmy dax would be like five feet tall.
[2093] Oh.
[2094] Oh, you thought these people were like a couple feet tall?
[2095] That's like mythical.
[2096] That would be like Peter Pan.
[2097] Five feet tall is just like not that.
[2098] That's my height.
[2099] I wish you could be two and a half feet.
[2100] But the exact same proportions everywhere.
[2101] That'd be fun.
[2102] That'd be so fun.
[2103] Well, that was me when I was, I guess, four.
[2104] I would carry you like I carry Delta.
[2105] Carry me up to the attic.
[2106] Yeah.
[2107] But I have the same oversized mouth.
[2108] Like, I'm as loud in this version of me. Yeah, it's you.
[2109] Okay.
[2110] It's you.
[2111] That might even make me endearing.
[2112] Like, oh, look at this little fucking insect over here trying to.
[2113] No, don't call him that.
[2114] That's my friend.
[2115] That's my friend.
[2116] I can call myself whatever I want.
[2117] You can't talk badly about yourself to me. Okay.
[2118] I don't allow that.
[2119] That'd be like getting incarcerated for taking a nude photo of yourself and staring at it.
[2120] And you had your own child pornography.
[2121] Like if you were 10 and you did it and then they put you in jail for looking at yourself.
[2122] Yes, that would be the same as me making fun of my pygmy self and getting canceled for making fun of just me. Yeah.
[2123] But I just don't want you to make fun of yourself at all.
[2124] Oh, okay.
[2125] On average, people in the U .S. give 2 % of their income.
[2126] That's what he said.
[2127] That's correct.
[2128] Also, more than half, 56%, of Americans donated to charity in 2021; the year before, it was 55%.
[2129] It says it's still down from pre -pandemic levels in 2019 when about two -thirds made charitable contributions.
[2130] I'm impressed by that.
[2131] Me too.
[2132] Pretty good.
[2133] The only thing I will say that's a little bit confusing about that statistic, and maybe it shouldn't be, is that I do believe they count contributions to a church as that. Oh. You know what I'm saying? So you've got, like, X amount of the country that is active Christians going to church and putting money in the basket and/or tithing 10% of their... Yes, that's tax deductible. That's, like, churches are, whatever, C corps, whatever they are, where it's a charity and it's a write-off. The big movement against Scientology is that they have church tax-exempt status and people want that taken away.
[2134] I see.
[2135] Erica was telling us yesterday, she used to be Mormon, and she's since left the church, but she grew up Mormon.
[2136] And so she had to give 10 % of her allowance.
[2137] And her dad gave 10 % of his income.
[2138] And he was a wealthy man. Yeah.
[2139] Good for them.
[2140] Good for the church.
[2141] That's what I'm saying.
[2142] Yeah.
[2143] Anyway, I was like.
[2144] That's what you want.
[2145] You want rich parishioners.
[2146] When I start my cult, I will ask people to tithe.
[2147] Oh, okay.
[2148] If I don't ask them to tithe, the cult won't feel legitimate.
[2149] Okay.
[2150] You know what I'm saying?
[2151] You're like, I don't know.
[2152] It feels legit, but he's not asking me to pay.
[2153] He's not asking me for any of my money.
[2154] That's weird.
[2155] What's his agenda?
[2156] You'd almost trust someone more who is after your money.
[2157] Okay.
[2158] I will have tithing.
[2159] And because I'll have tithing, then naturally, I'll try to land some big fish.
[2160] Whoppers, whales.
[2161] But where's the money going?
[2162] Up my nose, probably.
[2163] What?
[2164] If I have a cult, well, if I have a cult, I'll hate myself, and I'll be sleeping with all the parishioners.
[2165] Oh, my God.
[2166] I'll just be in a vortex of, so.
[2167] We need buildings and statues and stuff built.
[2168] That's true.
[2169] We need a lot of statues.
[2170] Good fun, that.
[2171] By the way, you know how much I dislike my own image.
[2172] I can't think of anything worse than statues of myself everywhere.
[2173] I'd just be reminded that that's what I look like every time I walk by.
[2174] I'd have to really be gratuitous.
[2175] Like, all the statues of the leader, Dax Shepard, will look like Brad Pitt.
[2176] It'll be very confusing.
[2177] People think they're worshipping Brad Pitt and then I show up.
[2178] That might get you some extra peeps.
[2179] Oh, because they want to just stare at statues of Brad Pitt.
[2180] Should we just start a Brad Pitt cult and he's, in absentia, the leader?
[2181] And I'm like the mouthpiece of him.
[2182] Sure.
[2183] His Sheela.
[2184] Yeah, I'd be Ma Anand Sheela.
[2185] That's right.
[2186] Of course, because they're not competing.
[2187] I have one last follow-up conversation about Ricciardo.
[2188] Okay.
[2189] I almost think it makes me more trustworthy as a friend if he wasn't my favorite driver, because if he leaves, does he think, oh, the thing Dax liked about me,
[2190] that I was his favorite driver, isn't in the mix anymore?
[2191] Well, that's never what the thing I, what I liked about him.
[2192] Maybe.
[2193] I don't want this to end with you thinking that I've said you're not a good friend.
[2194] I don't think that.
[2195] I think you think I'm a good friend.
[2196] You are.
[2197] You've had a lot of friends.
[2198] And I think I made your top tier.
[2199] You're my best friend.
[2200] Yes.
[2201] So I would never say otherwise.
[2202] But all I'm saying is that people project their own insecurities as do I. You're my favorite podcaster.
[2203] Ooh, that's tough.
[2204] I really like Malcolm.
[2205] I wanted to be honest with you.
[2206] Cool.
[2207] Really like Malcolm.
[2208] I love Malcolm.
[2209] Malcolm's my favorite podcast.
[2210] How does that feel?
[2211] Can I have you in a two-way tie?
[2212] Yeah, I think he should be.
[2213] I think he should be your favorite podcast.
[2214] But I want to put you in a two -way tie.
[2215] I don't want to be your favorite podcaster.
[2216] I want to be your best friend.
[2217] In a two-way tie with Malcolm.
[2218] But I don't value that as much as I value being your best friend.
[2219] Good.
[2220] I should hope not.
[2221] But am I?
[2222] Your best friend?
[2223] Yeah.
[2224] Of course.
[2225] Oh, my God.
[2226] Oh.
[2227] There's shit broken in the background.
[2228] Hi.
[2229] We're recording.
[2230] Wait, is it.
[2231] Hi, Eric.
[2232] Hi.
[2233] Eric's working out.
[2234] Hi.
[2235] Hi, buddy.
[2236] We just have a question for the fact check.
[2237] Thank you for talking to us while you're working out.
[2238] We need to know about the taste of everyone's sweat.
[2239] Oh, right.
[2240] Well, you know, and I got this off the podcast I listened to.
[2241] There's a big differential in sweat taste depending on how much your body sweats out.
[2242] So, like, Amy had almost no taste.
[2243] Oh, she had almost no taste.
[2244] And then there's like a body, a body taste that everybody's got a little bit different, just like their scent.
[2245] You know what I mean?
[2246] Yeah.
[2247] Like their pheromones.
[2248] Yeah, Kristen was kind of like celery.
[2249] Right.
[2250] With salt on it.
[2251] She was the saltiest of the girls.
[2252] Of the group.
[2253] And then Amy was literally nothing.
[2254] And then Molly had like a little mild salt with some muskiness.
[2255] Oh.
[2256] Ooh.
[2257] Dax reminded me, the first thing that went through my head was fried chicken.
[2258] Right, right, right.
[2259] But stop them real quick.
[2260] Can you hear me, Eric?
[2261] Yeah.
[2262] Yeah, I already mentioned that you actually had put Amy in the fried chicken category yesterday.
[2263] No, no, I didn't say she tasted like fried chicken.
[2264] Oh, you didn't?
[2265] It must have been I was talking about you, or maybe I just said it wrong.
[2266] Oh, okay.
[2267] No, Amy is the farthest thing from fried chicken.
[2268] Oh, we misrepresented both you and Amy.
[2269] And then Charlie, this sounds a little weird, but he was just like, man. Uh -huh.
[2270] I'm not surprised.
[2271] Yeah, he was just like a man, like an outdoor man. So did he taste like a lumberjack, like trees or like smoked meat?
[2272] Yeah, like somebody from Game of Thrones.
[2273] Like a big, like the big mountain guy. Maybe like moose mixed with redwood. Oh, wow. That's good. That's good. Does that make sense? Like somebody who would be on Alone. Like a big guy on Alone. Yeah. Did you taste your own? And can you be objective? No, no, no. It's like, if you taste your own, it's kind of like just tasting your own spit. It doesn't taste like anything. Well, you're sweating right now. Molly, will you lick... Oh, great. She so doesn't want to do this. This is true love. It's a tiny bit salty. Like, almost like salt water. Is it sexy at all?
[2274] No No Is there any food taste?
[2275] Food taste?
[2276] Like salmon?
[2277] Even, could you put it in the category of either savory or sweet? Could we start there?
[2278] You would think sweet, with how much sugar he eats, but it was more savory.
[2279] Okay.
[2280] Is anyone sweet?
[2281] Was Amy sweet, Eric?
[2282] Well, Amy was closer to sweet than salty because she was nothing.
[2283] But I do wonder how much of it is psychological, because I love Amy so much and I look at her as, like, the best person on Earth.
[2284] Maybe like psychologically I want her to taste like something.
[2285] And that's, so I don't know what part of it was psychological and what she was actually like.
[2286] But it is what it is.
[2287] She tasted very clean.
[2288] Very clean, almost like a virgin. Oh, wow. Um, Eric, have you ever, when I haven't noticed, um, taken a lick of my feet? I think that would... We're getting there. I have a... I slowly have to work my way into it, because I don't know how comfortable you are with it yet. I'm ready. I mean, I have held your feet and caressed them and stroked them and shaved them for probably a half hour each foot.
[2289] Yeah.
[2290] But putting them in my mouth would be, that's another.
[2291] We'd have to get Molly's permission.
[2292] Oh, yeah.
[2293] She doesn't mind.
[2294] Okay, Dax, do you have anything else for these two?
[2295] Well, just that you immediately said, I think Eric would like to hear this, you were like, I want him to taste me. Yeah, I do.
[2296] I want to know what your expert analysis.
[2297] What your analysis would be, but I'm also nervous.
[2298] Let's do this.
[2299] Let's, next time when Dax is there, let's
[2300] do the sauna, and then I'll lick, I'll lick them in the sauna. Okay. And Molly, will you lick it too, so we have two different opinions? Two tasters. Yeah, because, I don't know, I'd like to have some objectivity. And Molly, can I, can I go down on you the next time we're in the sauna?
[2301] Oh my God. That is... What do I do if it's... What did you say, Eric? If they tasted bad, I don't think I could tell you. That's my worry. I want you to be honest with me. Tell me. You're going to taste like blood, we figured out.
[2302] Yeah, I might taste like blood because I don't have much I'm dehydrated.
[2303] And so she sweats blood.
[2304] Right.
[2305] I sweat blood.
[2306] Blood would be okay.
[2307] At least blood blood could be kind of sexy.
[2308] Okay.
[2309] Well, I just want you to be honest with me and don't worry about hurting my feelings because you already build me up so much with my feet.
[2310] Okay, okay.
[2311] Even if your feet don't taste good, you still have the best-looking feet in the pod.
[2312] Oh, thanks.
[2313] Pod's best feet.
[2314] Oh, he's your number one fan for your feet.
[2315] Now, that would bother you.
[2316] If you found out he started liking someone else's feet, you'd feel like Daniel.
[2317] I would be really upset.
[2318] Yeah.
[2319] Well, trust me, in our pod, you don't have any competition.
[2320] Oh, wow.
[2321] I love this.
[2322] Oh, what a flatterer.
[2323] We love you.
[2324] We love you guys.
[2325] Love you guys so much.
[2326] Keep working out.
[2327] Love you.
[2328] Wow, that was fun.
[2329] All right.
[2330] I love you.
[2331] I love you.
[2332] Follow Armchair Expert on the Wondery app,
[2333] Amazon Music, or wherever you get your podcasts.
[2334] You can listen to every episode of Armchair Expert early and ad-free right now by joining Wondery Plus in the Wondery app or on Apple Podcasts.
[2335] Before you go, tell us about yourself by completing a short survey at Wondery.com slash survey.