Steven Pinker

Armchair Expert with Dax Shepard
Full Transcription:

[0] Welcome, welcome, welcome to Armchair Expert.

[1] Experts on Expert.

[2] I'm Dax Shepard.

[3] I'm joined by Monica Lily Padman.

[4] Hello there.

[5] Hi, we're in your hotel room.

[6] We are.

[7] In London, England.

[8] We have traveled across the pond.

[9] Yeah, we've traveled far and wide.

[10] We have.

[11] And we're in a little bit of a time crunch.

[12] We are, and I'm drinking out of his tiny teacup.

[13] And I'm about to have one of the luckiest experiences ever.

[14] Oh, my God.

[15] You are.

[16] Which is I'm about to go to McLaren.

[17] About an hour north of London, to go see where the cars are made, where the race team happens, watch Danny and hopefully Lando in the simulator.

[18] That is so fun.

[19] Will you take some video?

[20] I'm going to document the entire thing.

[21] All right.

[22] ABR, always be rolling.

[23] Today, we have one of our very favorite intellectuals.

[24] We have been mildly obsessed with him for going on seven years, maybe.

[25] Yes, we tried to get him really early on in the podcast.

[26] We weren't cool enough then.

[27] Yeah.

[28] But he seems to think we're cool enough now.

[29] And you'll actually hear that story.

[30] We kind of like gang rushed him at the event.

[31] Oh, yeah.

[32] We urged him to come on the podcast.

[33] At any rate, the man we are talking about is Steven Pinker.

[34] He is an award-winning experimental cognitive psychologist and best-selling writer on language, mind, and human nature.

[35] He's currently the Johnstone Family Professor in the Department of Psychology at Harvard University.

[36] His books include The Language Instinct, How the Mind Works, The Blank Slate, The Stuff of Thought, and The Better Angels of Our Nature.

[37] He has a new book out called Rationality: What It Is, Why It Seems Scarce, Why It Matters.

[38] I just want to add my two cents on why I love Steven Pinker so much: he's just among the smartest folks on the planet.

[39] Yes.

[40] With a current of optimism supported by his big, brilliant brain that I find to be so...

[41] Refreshing?

[42] No, relieving.

[43] It's relieving.

[44] In a world of pessimism and thinking that this whole experiment's going to explode at any moment, Steven Pinker is a really wonderful voice.

[45] I agree.

[46] In the direction of, we will make it.

[47] So please enjoy Steven Pinker.

[48] Wondery Plus subscribers can listen to Armchair Expert early and ad-free right now.

[49] Join Wondery Plus in the Wondery app or on Apple Podcasts.

[50] Or you can listen for free wherever you get your podcasts.

[51] I'm recording.

[52] How do I sound?

[53] You sound great.

[54] You look even better.

[55] Can we talk about your eyeballs for one second because I don't think it can be an accident that your shirt is almost perfectly the same color and it's making them pop like you can't imagine.

[56] I should remember that if it's a good look.

[57] Well, I'll make a note of that.

[58] How frequently, and I'm being serious now, how frequently do you receive compliments about your eyes?

[59] Because they're pretty stunning.

[60] Not that often.

[61] I'll take them when I get them.

[62] Oh.

[63] Well, come to us more often.

[64] I should.

[65] So, Steven, we met one time briefly, the three of us, I remember.

[66] You?

[67] Oh, you do?

[68] I wouldn't have thought you would remember.

[69] Oh, I remembered very well.

[70] Are you kidding?

[71] No, I'm not kidding.

[72] I operate with a great deal of insecurity, so this isn't off -brand for me, but I'll just say, I was so excited to meet you.

[73] And then when we left, I was like, I said to Monica, I'm like, I don't know if he really wanted to meet me. Oh, of course I did.

[74] Are you kidding?

[75] In the green room of the Dolby Theatre, where they do the Academy Awards, right?

[76] Right?

[77] Yes, yes.

[78] So are you in San Francisco right now?

[79] I'm in Berkeley.

[80] Oh, you're in Berkeley.

[81] And are you doing something at Berkeley or are you doing something at Stanford?

[82] Or are you just vacationing in the Bay Area?

[83] I'm on sabbatical in the Bay Area through May. I have not yet finagled an official connection to Cal. I may or may not.

[84] I have a lot of friends there and I intend to take advantage of being here.

[85] Are you a professorphile at all?

[86] Because right away, I think about the fact that Oppenheimer was so happy where you're at currently.

[87] Well, they're my people.

[88] They're my colleagues.

[89] They're my friends.

[90] So, sure.

[91] Yeah.

[92] We are.

[93] We're obsessed with scholars.

[94] We've created a term called uniphile, which is interesting because neither of us went to prestigious schools per se.

[95] She went to Georgia and I went to UCLA.

[96] But, boy, do we have an irrational fascination with Stanford and Harvard.

[97] And then - It's because we didn't go to those schools, I would say.

[98] Not that it's weird.

[99] It's actually part of why.

[100] or maybe you would know.

[101] But like, we had Danny Kahneman on.

[102] The notion of how flawed we are at making sound decisions is kind of shocking at times just how off -base we can be with what would be a rational decision for a human to make.

[103] And yet we make all these other decisions.

[104] And we're starting, I think, in the zeitgeist, people are understanding like confirmation bias or maybe noise, these new concepts where we're starting to kind of acknowledge how flawed we are at thinking.

[105] So I think your book couldn't have come at a better time for my own personal interest in the topic.

[106] I guess I would ask why you endeavored to write rationality.

[107] Well, like a lot of cognitive psychology professors, I've been teaching it for years.

[108] It's a lot of fun to teach the work of Danny Kahneman and Amos Tversky.

[109] It catches people off guard.

[110] It forces them to think twice.

[111] Once when I gave one of their demonstrations, a student shouted out from the class,

[112] "I'm ashamed for my species."

[113] Also, the flip side of human irrationality is that we have developed these tools for being more rational: distinguishing causation from correlation, logic, probability, Bayesian reasoning named after Reverend Thomas Bayes' famous rule, statistical decision theory.

[114] And I have a feeling, like a lot of social scientists, that it just wouldn't be a bad idea if a lot of people just got the...

[115] concepts behind logic and probability and correlation versus causation.

[116] Just as we learned to read, we learn to write, we learn history.

[117] Learning how to think statistically is just something that every educated person should do, but I didn't know of a particular book or course that tried to put it all between two covers.

[118] So I wanted to do that.

[119] But then, of course, as you point out, it's on everyone's minds these days.

[120] Because the first question that I get, when I tell people, first that I was teaching a course and now that I'm writing a book on rationality, is: why is the world losing its mind?

[121] How do you explain QAnon and chemtrails and homeopathy and belief in ghosts?

[122] So for a species that was kind of smart enough to discover DNA and go to the moon and invent smartphones and vaccines, why is there so much kind of flapdoodle circulating around?

[123] Why are we so susceptible to weird beliefs?

[124] So it's not just that it's a pressing issue, like, why do so many people sign on to QAnon?

[125] Yeah.

[126] But, you know, I'm a psychologist, and I kind of also want to know what is it about the way our minds work that make these crazy beliefs seem so plausible to so many people.

[127] Yeah, and it's such a big topic.

[128] I mean, I don't even know where to jump in, but let's just first you brought up causality versus correlation.

[129] So I think a really great nuts-and-bolts example of that is, during the kind of rising awareness around autism, people started correlating that they would get this booster shot for their children, and that was about the time they would start seeing signs of autism.

[130] So to them, it felt very, very obvious.

[131] My kid was normal.

[132] I got this vaccine.

[133] Now my kid is showing these signs of autism.

[134] So that's correlation.

[135] They are correlating those two events, right?

[136] And then through the great luck of, I don't know if it was the Danish or the Swedish, they do these epidemiological studies.

[137] They have access to all of the citizens' medical records.

[138] Within five minutes, they simply compare who got the vaccine, who didn't, rates of autism.

[139] We know immediately, oh, there's no causality.

[140] Oh, that's great.

[141] Thank God we had them to step in and say, this is just a correlation, not causality.

[142] So that would be an example, right?

[143] But we do this, I do this nonstop.

[144] Yes, absolutely, because the way to distinguish correlation from causation is something that none of us can really do in our spare time, namely a randomized controlled trial.

[145] You get a bunch of people, half of them get the drug, and half of them

[146] get the placebo, and you wait, you measure the outcome, you see if the ones who got the drug differ from the ones who got the placebo.

[147] Now, of course, not only are these very expensive, but for things like do vaccines cause autism, you can't do the study to satisfy your curiosity.

[148] Right.

[149] You're limited by what you can ethically do.

[150] So those are a number of the issues that pop up when you try to distinguish causation from correlation.
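The randomized trial Pinker describes can be sketched as a toy simulation. This is purely illustrative, not from the book: the sample size, base rate, and effect size below are invented, and the point is only that when the "drug" truly does nothing, randomization makes the two groups' outcome rates agree up to sampling noise.

```python
import random

random.seed(0)

def simulate_rct(n=10_000, base_rate=0.02, drug_effect=0.0):
    """Toy randomized controlled trial: randomly assign each person to
    drug or placebo, then compare outcome rates between the two groups."""
    drug_outcomes, placebo_outcomes = [], []
    for _ in range(n):
        group = random.choice(["drug", "placebo"])
        # Outcome probability is identical unless the drug truly has an effect.
        p = base_rate + (drug_effect if group == "drug" else 0.0)
        outcome = random.random() < p
        (drug_outcomes if group == "drug" else placebo_outcomes).append(outcome)
    rate = lambda xs: sum(xs) / len(xs)
    return rate(drug_outcomes), rate(placebo_outcomes)

drug_rate, placebo_rate = simulate_rct()
# With drug_effect=0.0, any difference between the rates is sampling noise,
# which is the kind of null result the Scandinavian vaccine registries found.
print(drug_rate, placebo_rate)
```

The ethical catch Pinker raises is exactly that you cannot randomize real children to vaccine versus no vaccine, which is why the registry studies, comparing outcomes across an entire population, were the next best thing.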

[151] It almost happens hourly in our life as humans.

[152] Because we eat food and then we feel a certain way, or we get a rash, or we do this.

[153] So I'm constantly like trying to correlate what I'm eating, which is giving me what result, without any knowledge of it's actually causing it.

[154] Oh, yeah.

[155] Even though we make errors of confusing correlation and causation all the time, I mean, people do have a sense that they're different.

[156] In the book, I recount a joke told by my grandfather, who was born in Warsaw in 1900, about the man who gorged himself on this dish called cholent.

[157] Cholent is a heavy meat and bean stew that observant Jews leave on the stove over the Sabbath because you're not allowed to cook on the Sabbath.

[158] So it sits there simmering and it's not exactly Nouvelle cuisine.

[159] It's pretty heavy.

[160] Anyway, the joke is about the man who gorged on cholent and a glass of tea, and then he lay on the couch moaning that the tea made him sick.

[161] Now, presumably you had to have been born in Poland in 1900 to find it as uproarious as he did.

[162] But it does convey the idea that you don't have to be a scientist to know that confusing correlation and causation is a problem.

[163] Yes, I guess when that joke was told in the 1900s, nearly everyone listening had been guilty of gorging themselves on the stew, so they're like, oh, I know that feeling.

[164] Hit that mirror neuron and really amplified the joke.

[165] The heavy bean and meat stew is more likely than the glass of tea to have been the causal agent even though they were correlated.

[166] Yes.

[167] Now, another thing is, you say it is in the zeitgeist, as we just said, and I have to imagine a lot of that is driven by the fact that we're really starting to question racism.

[168] Like we had a very binary definition of racism, which was like, if you own slaves, that's racist.

[169] Okay, we got rid of slaves.

[170] Oh, these laws are kind of racist.

[171] Now we're getting into the nitty gritty of our biases against outgroups.

[172] And we're starting to really, I think, examine what we're carrying around in our head.

[173] Do you think that's related to this current interest in it?

[174] It might be just a general kind of self-consciousness as a species:

[175] what are the various flaws, and what are the kind of subtle ways of understanding ourselves that aren't black and white?

[176] The black and white fallacy is one of the failures of critical thinking that are often singled out in curricula on critical thinking.

[177] Yeah.

[178] Well, that was going to be one of my questions.

[179] So I'll just ask it now.

[180] So it seems like we all have a real draw towards the definitive.

[181] Like it makes us feel safe to hear something is definitive.

[182] This is right.

[183] This is wrong, that's good for you, this is bad for you.

[184] And it seems like almost everything in life is somewhere on this spectrum, where you're aiming for maybe 70% right or 70% good for you, but we're really attracted to something being definitively one way or another.

[185] Is that true about the humans?

[186] It is.

[187] We like to put things into discrete boxes.

[188] And so there's a whole set of fallacies of reasoning, of people dividing what are inherently continuous into black or white categories.

[189] The philosophers even have a name for it: there's a version called the paradox of the heap, which goes as follows: one grain of sand is not a heap.

[190] And if something's not a heap and you add a grain of sand to it, that's not a heap either.

[191] But of course, if you do it thousands of times, you end up with a heap.

[192] The paradox is, according to the logic that we accepted, there's no such thing as a heap.

[193] But there is such a thing as a heap.

[194] Or there's the dieter's fallacy.

[195] One more French fry won't make me fat.

[196] But then, should I have another one?

[197] Well, one more French fry won't make me fat.

[198] Well, should I have another one?

[199] Well, one more French fry won't make me fat.

[200] Yeah, that's it.

[201] Yeah.

[202] You can state the truth while also denying the reality.

[203] Exactly.

[204] Or the mañana fallacy.

[205] What difference will it make if I do it tomorrow?

[206] And then tomorrow comes: well, what difference does it make if I do it tomorrow?

[207] Yeah.

[208] So, yeah, there's a whole set of fallacies and also, debates, because sometimes where you draw the line is not so clear, because everything is a continuum.

[209] So even something like, what's a vegetable?

[210] Well, okay, carrot, for sure that's a vegetable.

[211] And broccoli, yes.

[212] What about garlic?

[213] What about parsley?

[214] And then you get things that the Republicans will classify pizza as a vegetable in justifying school lunch budgets that have a lot of, yeah, a lot of payoffs from the fast food industry.

[215] And that's where people kind of draw a line and say, pizza is not, that's going too far.

[216] Pizza's not a vegetable.

[217] And lots of other examples, like, did Bill and Monica have sex?

[218] Well, what exactly is sex?

[219] Where do you draw the line if this part of the body is in that part of the body?

[220] And, you know, how much?

[221] Is the SUV a car or a truck?

[222] Where do you draw the line between a car or a truck?

[223] It makes a difference because a lot of the emissions came because Americans went crazy bananas over SUVs, which got away with laxer mileage and emission standards because they were counted as trucks, not cars, even though people use them as cars.

[224] So yes, there's a whole set of issues on all or none versus gradual categories.

[225] And then you mentioned risk.

[226] There's a whole other set of issues on once you accept that there are things that are continuum, then there's a question of how do you assess costs and benefits of different courses of action, given that nothing is certain in life.

[227] It's always a matter of probabilities.

[228] You can never be 100 percent safe.

[229] Any medical treatment isn't guaranteed to cure you.

[230] Surgery might get your cancer, but it might be unnecessary.

[231] It might leave you with pain and disfigurement.

[232] How do you weigh up the costs and benefits?

[233] That's another example of a tool of rationality that I try to explain in the book, sometimes called rational choice or expected utility.

[234] And it's, of course, quite relevant in public health decisions, such as should we get everyone to mask, not really knowing whether, how effective masking is.

[235] Well, if we're wrong and people wore masks for nothing, it's a small price to pay.

[236] But if we're wrong in the other direction, and we say masks are useless and people go out and cough on each other, and it turns out that masks really were effective, then we've paid a much bigger price.

[237] So the decisions are both at the individual and the social level of how bad would it be if we were wrong in each of the two ways of being wrong?

[238] Yeah, you've got to not just measure the probability of being wrong, but you have to measure the outcome of being wrong in relation to the outcome of being right.

[239] So the math keeps evolving as you get deeper into thinking about it probably.

[240] That's exactly right.

[241] And so I have a whole chapter on exactly that.

[242] And whether people miscalculate or do things that if you were to get them to think it through, they probably couldn't justify.

[243] Like a lot of people, they buy an appliance and the salesman says, well, do you want to also get the protection plan, the extended warranty?

[244] And I think about a quarter to a third of people do buy the extended warranty.

[245] But if you do the math, I mean, is it really?

[246] Does it really make sense to take out a life insurance policy on your toaster?

[247] Sure, sure.

[248] And would they be offering it if the odds were not in their favor?

[249] Exactly right.

[250] Yeah, I grew up with a car salesman father, so I've never bought any extended anything because he's like, that's where the money's at for us.

[251] We get in an extended warranty, and that's all profit.

[252] Well, you know, and there's a case in which that does make sense.

[253] So it does make sense to take out insurance on your house, even though the insurance company has to make a profit.

[254] The amount that everyone pays in premiums has to be greater than the amount the insurance company pays out when a house does burn down.

[255] But the thing is that from your point of view, losing a house would be so catastrophic, you can't just dip into your savings to replace it, that you're willing to suffer a little bit of a loss on average just so you're covered on the downside, the catastrophic outcome.

[256] So it makes sense for a house.

[257] It doesn't make sense for a waffle iron.
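The house-versus-waffle-iron contrast comes down to expected-value arithmetic. As a rough sketch (all dollar figures and probabilities below are invented for illustration, not quotes from the conversation): in both cases the premium exceeds the average loss, so both policies lose money on average, but only the house loss is catastrophic enough that paying the markup is rational.

```python
def expected_cost_of_insuring(premium, loss, p_loss):
    """Compare average out-of-pocket cost with and without coverage.

    Returns (cost_if_insured, expected_cost_if_uninsured) for a single
    possible loss of size `loss` occurring with probability `p_loss`.
    """
    cost_if_insured = premium          # you pay the premium; the loss is covered
    cost_if_uninsured = p_loss * loss  # on average you lose p_loss * loss
    return cost_if_insured, cost_if_uninsured

# Hypothetical toaster warranty: a $10 plan on a $40 toaster that
# fails 5% of the time. Expected uninsured cost is only $2, so the
# warranty loses you $8 on average, and a dead toaster is no catastrophe.
print(expected_cost_of_insuring(premium=10, loss=40, p_loss=0.05))

# Hypothetical house policy: $2,000/year against a $500,000 loss with a
# 0.3% annual probability. Still negative expected value, but the loss
# is ruinous, so covering the downside is worth the markup.
print(expected_cost_of_insuring(premium=2000, loss=500_000, p_loss=0.003))
```

The asymmetry is that utility is not linear in money: losing $500,000 hurts far more than 12,500 times as much as losing $40, which is why the same arithmetic justifies one policy and not the other.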

[258] This is such a sidebar, but I want to say it to you because you probably already know it.

[259] But I have a friend who works for a big insurer, and he explained to me, which I found really interesting, that the money and insurance isn't actually having a margin that they pay out less than they take in.

[260] The profit in an insurance company is that it holds vast sums of money while waiting for that to happen; they're actually designed to operate almost at net zero between payouts and premiums taken in.

[261] But the holding of that mass amount of money and the ability to make fractional money on that money is their actual pursuit.

[262] And I was like, oh, that's fascinating, and it makes me trust it a tiny bit more.

[263] Yes.

[264] Right.

[265] I feel like there's an extra layer of complication in all of this, which is your personality, right?

[266] Like your personal level, like, let's say AppleCare.

[267] If you're someone who is constantly throwing your phone around or who doesn't want to have a case on it, you probably should get AppleCare, whereas maybe the average person doesn't need it, but you have to factor in these personal traits you have.

[268] Well, that's absolutely right, because the probabilities might differ, a typical person indeed may not abuse their toaster.

[269] Yeah.

[270] In fact, probably none of us abuse our toaster, but there's a continuum in terms of how much we abuse our iPhones, right?

[271] Uh -huh.

[272] And then what might be rational for a person who doesn't mind putting it in a case and handling it gingerly may not be rational

[273] for someone who's just not willing to baby their iPhone, and it may pay them.

[274] And in fact, the people who baby their iPhones but take out the extended warranty are subsidizing the people who toss their iPhone around.

[275] They're the young people in the health care system.

[276] That's right.

[277] Exactly right.

[278] Anyway, these are the kinds of things.

[279] So what we've been just talking about in the last few minutes is in part what is sometimes called normative models, that is, theories of how you ought to choose or think if you were a rational person.

[280] And that's kind of the benchmarks.

[281] Once we say, well, this is what rationality is, then we can ask the question, well, how rational are people?

[282] Do they do what the theory says a rational person or a rational agent of any kind in that situation ought to do?

[283] So in my book, I spend some time on these various normative models of what we ought to

[284] do.

[285] And then I ask the question, what do people do?

[286] How rational is a typical person?

[287] And you can't answer that question unless you have some kind of benchmark, ground truth, some answer to the question of what is rationality.

[288] And then you can say, well, do people do that?

[289] Yeah.

[290] And is there a way to quantify, like, what percentage of our decisions approach rationality?

[291] That's too big of a question, maybe?

[292] There could be.

[293] I mean, it's typically not so much what percentage of our decisions, but it's just on average.

[294] Does the pattern of our decisions fit with what our best models tell us is what a rational person ought to do?

[295] Or are there ways in which we ignore something that's really, really relevant, that we pay too much attention to something that is irrelevant?

[296] And all of these things have been worked out.

[297] And I'm sure in your conversation with Danny Kahneman, a number of them came up because he's the guy who discovered a lot of them.

[298] Yeah.

[299] And just to remind people in a two -second sentence, this becomes really, really clear.

[300] and obvious to observe when you offer people different games or they could win or lose money.

[301] And what you find out immediately is that if the odds are the exact same between these two problems, but one is positioned in a way you'll lose and one's positioned in a way you'll win, you'll ignore the same outcome statistics because we have loss aversion.

[302] Humans would prefer to not lose money over.

[303] Yeah.

[304] So all these little games that you can create to demonstrate, right?

[305] I probably didn't do a great job of laying that out, but...

[306] No, no, you laid it out perfectly, and it's part of a kind of irrationality where, at least in the theory of rational choice, if you're choosing between two alternatives, the language you use to describe the two alternatives shouldn't make any difference.

[307] And that's almost one of the definitions of rationality.

[308] You know, a rose by any other name should smell just as sweet.

[309] If I give you the same option, and I use this sentence to describe it or that sentence to describe it, if you're really rational, it shouldn't matter, but it does.

[310] And you pinpointed one of the ways: if you

[311] frame the same option in terms of losses, people are scared away;

[312] if it's framed in terms of gains, they're not, even if it's the same thing.

[313] It's the same exact thing, yeah.

[314] So for example, just to be concrete, imagine there's a risky drug or a risky surgical treatment.

[315] If you say 90% of the people who undergo this treatment survive, then people will say, well, I'll take those odds.

[316] If you say 10% of the people who undergo this treatment die, then they'll say, oh, no, way too dangerous,

[317] even though those are two ways of saying the same thing.

[318] But it matters.

[319] So the way things are framed matters, even though it shouldn't.
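The survival/mortality example can be made concrete in a couple of lines. This is only a sketch of the equivalence being discussed (the function names are mine, and 90% is the rate used in the conversation): the two sentences are generated from the same single number, so a rational chooser should treat them identically.

```python
def survival_framing(survival_rate):
    """State the outcome as a gain: the fraction who live."""
    return f"{survival_rate:.0%} of patients survive"

def mortality_framing(survival_rate):
    """State the identical outcome as a loss: the fraction who die."""
    return f"{1 - survival_rate:.0%} of patients die"

# Both sentences describe the exact same distribution of outcomes,
# yet in Tversky and Kahneman's experiments the loss framing
# scares people away while the gain framing does not.
print(survival_framing(0.9))   # 90% of patients survive
print(mortality_framing(0.9))  # 10% of patients die
```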

[320] And we do have overall a negativity bias.

[321] Bad is stronger than good.

[322] Bad news captures attention more than good news.

[323] This is something that I've dealt with in trying to convince people that a lot of measures of human well -being have improved.

[324] People say, well, what do you mean?

[325] There are wars, there are pandemics, there are depressions.

[326] And I say, well, yeah, there are all those things.

[327] And you remember them all because the mind gloms on to the negative, especially, and journalism just multiplies that.

[328] But it's much harder to pay attention to the good things, the fact that extreme poverty has gone down every year for 30 years.

[329] The fact that huge parts of the world haven't had a war in decades and decades and decades.

[330] But we tend not to notice those things.

[331] Well, for one thing, we don't notice non-occurrences, but also people, and in particular editors, tend to pick the most negative.

[332] Well, and they'll just, they'll shape it to focus on the negative.

[333] So, like, one way to look at Afghanistan would be, oh, it's our longest war.

[334] What a bad mark on our record.

[335] But then if you evaluated it: oh, my God, a tenth of the people died in this war that died in the shorter war, Vietnam.

[336] And then if you look at Vietnam and go, oh, a tenth of the people died in that war that died in World War II, which was even shorter.

[337] Like, there's so many ways to evaluate the cost of that campaign.

[338] Certainly, casualties have just precipitously fallen since World War II, right?

[339] Oh, by a lot.

[340] Yes, absolutely.

[341] If you want to put numbers to it, probably in the early 50s during the Korean War, worldwide, there were about 20 war deaths per 100,000 people per year.

[342] Now, even with some pretty gruesome wars going on in Syria and Yemen and Afghanistan, it's more like 1 per 100,000 per year.

[343] So it's gone down by a factor of 20.

[344] And partly that's because there are all these parts of the world that were constantly at each other's throats.

[345] And they kind of got sick and tired of it, like Southeast Asia.

[346] There aren't any wars in Southeast Asia, that is, wars with two armies fighting each other.

[347] In fact, in the Americas, when Colombia signed a deal with the guerrillas a few years ago, it put an end to the last official war in the Western Hemisphere.

[348] So an entire hemisphere is free from war.

[349] And again, we tend not to notice it.

[350] It's nothing happening.

[351] And also positive events don't lodge in the brain the way negative events do.

[352] Well, this is why I most like you, because your last book, which I loved, Enlightenment Now, walks us through what the ideals of the Enlightenment were, and they involve rationality.

[353] That's a big cornerstone of the Enlightenment.

[354] And you look at, and you have a great TED Talk.

[355] I urge everyone to watch your TED Talk.

[356] You kind of map out the arc of our experiment as civilized humans.

[357] And to your point, it'd be hard to find a metric where we're not bettering ourselves.

[358] But, of course, in the moment, it doesn't feel that way.

[359] And no one would aim to dismiss the real feelings they're experiencing and the real fears, but it also, I do believe, is helpful to put it into a grander context: that the experiment in general is moving forward in a very delightful way.

[360] Yeah, it's important for a number of reasons.

[361] One of them is it kind of gives us the gumption to try to attain more progress.

[362] If you just think, despite everyone's efforts to make the world a better place, it just gets worse and worse and worse, the conclusion is, well, why bother?

[363] Yeah, throw in the towel.

[364] Throw in the towel.

[365] The poor will always be with you.

[366] We're cooked no matter what we do, so let's just party.

[367] Whereas if what we do makes a difference and what's the proof that it makes a difference, well, what our parents did.

[368] They made a difference and our grandparents and our great -great -grandparents.

[369] So we're not kind of stuck with a fixed level of suffering.

[370] It's not a kind of biblical Ecclesiastes worldview that there'll always be suffering and there'll be death and life sucks and then you die.

[371] Well, I mean, yeah, you do die, but after more years of life than it used to be.

[372] Yeah, yeah, yeah.

[373] And there are still poor people, and there should be fewer and fewer.

[374] But the fact that there are far fewer poor people than there were even 30 years ago means, well, let's keep pushing.

[375] I mean, the World Bank and the United Nations of all institutions set as one of the Millennium Development Goals, eliminating extreme poverty from the face of the earth by 2030.

[376] Now, they probably won't meet that target, but the fact that they even set it as a target.

[377] Yeah, that would have been unthinkable as a goal 50 years ago.

[378] They would have been unthinkable, and it's not as if they're a bunch of kind of folk singers or hallmark card designers.

[379] But the reason they set the goal is it's a bit beyond our reach, but not that much beyond our reach, given how much reduction in poverty there already has been.

[380] So knowing about the progress that has occurred can be kind of a cure for cynicism

[381] and fatalism.

[382] Why bother?

[383] But also, you want to know what actually did work and what didn't work.

[384] And you kind of want to look backwards and say, well, they tried this.

[385] And that didn't really budge the needle.

[386] But man, man, did we ever improve our condition through, for example, vaccines, or sanitation.

[387] Huge difference.

[388] Yeah, you're so positive.

[389] But part of that, to me, feels like arrogance.

[390] And I'll tell you how.

[391] I see this a lot in the religious community, different waves of people that they're certain they're going to see the end of days in their lifetime.

[392] Now, even if I believe in their religion, their people have been here for 2,000 years, each generation convinced they were going to see the end.

[393] There's a little bit of arrogance like, oh, I'm going to witness this on my watch.

[394] You know, like, I'm going to ignore that we've been here for 150,000 years doing it.

[395] And yet I'm going to be born at the perfect time to watch the total meltdown of this experiment.

[396] It feels a little arrogant.

[397] Oh, no, I think that's a real phenomenon.

[398] You certainly make your life into quite the drama.

[399] If you are there to witness the end, it gives you more gravitas, more drama, more importance, if you're the last generation.

[400] So, yes, I think there is that tendency.

[401] We like to write scripts with really dramatic endings.

[402] Well, even our last five -year, six -year political climate, which I acknowledge is horrendous, and I pray for some solution to it.

[403] But to have heard so many people say it's never been as bad before and then point out that the Civil War happened, and they're like, well, you know, but not that.

[404] I'm like, yeah, yeah, we all fought each other with guns.

[405] Like, remember that?

[406] That's pretty fucking bad.

[407] Yes, that's right.

[408] People really did die of a lot of things.

[409] Stay tuned for more Armchair Expert, if you dare.

[410] We've all been there, turning to the internet to self-diagnose

[411] our inexplicable pains, debilitating body aches, sudden fevers, and strange rashes.

[412] Though our minds tend to spiral to worst -case scenarios, it's usually nothing, but for an unlucky few, these unsuspecting symptoms can start the clock ticking on a terrifying medical mystery.

[413] Like the unexplainable death of a retired firefighter, whose body was found at home by his son, except it looked like he had been cremated.

[414] Or the time when an entire town started jumping from buildings and seeing tigers, on their ceilings.

[415] Hey listeners, it's Mr. Ballin here, and I'm here to tell you about my podcast.

[416] It's called Mr. Ballin's Medical Mysteries.

[417] Each terrifying true story will be sure to keep you up at night.

[418] Follow Mr. Ballin's Medical Mysteries wherever you get your podcasts.

[419] Prime members can listen early and add free on Amazon Music.

[420] What's up, guys?

[421] It's your girl, Keke, and my podcast is back with a new season, and let me tell you, it's too good, and I'm diving into the brains of entertainment's best and brightest, okay?

[422] every episode, I bring on a friend and have a real conversation.

[423] And I don't mean just friends.

[424] I mean the likes of Amy Poehler, Kel Mitchell, Vivica Fox.

[425] The list goes on.

[426] So follow, watch, and listen to Baby,

[427] This Is Keke Palmer on the Wondery app or wherever you get your podcasts.

[428] Comparing our recently departed president with, say, George W. Bush, who is obviously a much more civil guy, more of a gentleman.

[429] He started a war.

[430] He started two wars.

[431] What am I saying?

[432] That resulted in the deaths

[433] of some 10 ,000 Americans to say nothing of the hundreds of thousands of Iraqis and Afghans.

[434] And whatever you want to say about Trump, he didn't start any new wars.

[435] I, too, was pointing that out in my liberal bubble.

[436] Like, as much as I despise this guy, we should be looking at how fortunate we are that he wasn't terribly effective.

[437] Like, that's kind of, to me, the real headline.

[438] But he did start some social wars, I would say.

[439] Indeed.

[440] Indeed.

[441] We've got to give him credit for that.

[442] Yeah.

[443] But still, social wars versus the Civil War, where 650 ,000 Americans died.

[444] Of course.

[445] Social wars, they're pretty awful to live through, but not that many people, not that many people died.

[446] And by the way, it's not that I give him credit.

[447] I give him credit for being ineffectual and chaotic and disorganized.

[448] So a small mercy.

[449] But there is a change in the times.

[450] And you don't want to say that war is something like hemlines or, you know, lapel widths in men, that goes in and out of fashion.

[451] But there is something to that in that will you kind of gin up your poll numbers by starting a war?

[452] Will you get people's patriotic hearts pumping?

[453] And that's less and less true over the decades.

[454] That like the jingoistic flare is dissipating or something?

[455] Exactly.

[456] The idea that I'm going to die for my country, well, exactly what are you going to die for?

[457] If your country is not under an existential threat, but if it's for the glory, if it's for adding square inches on a map, if it's for the...

[458] Keeping oil prices $9 lower per barrel?

[459] Yeah, people aren't willing to die for those things as much as they used to be.

[460] Now, that could be a problem if we ever faced an existential threat.

[461] One could argue it's a problem in terms of, to the extent that there could be such a thing as a just war, that is, people are rescued from slavery, from genocide.

[462] Hitler.

[463] Yeah, exactly.

[464] Is it a problem that people won't make sacrifices for the greater good?

[465] On the other hand, they won't make sacrifices for the egos of leaders or for nebulous concepts like national preeminence and glory.

[466] Well, you could argue there's a couple world leaders currently on the stage that if they had the latitude to do some of these other things, Stalinist things, they would.

[467] Oh, you bet.

[468] Yeah, I don't think the character of Putin and Stalin is maybe that different.

[469] or Kim Jong -un or any of these people, but there doesn't seem to be the will of the people to go along with that now.

[470] So that too seems like an improvement.

[471] Certainly they would want to do all those things.

[472] It is.

[473] And after I wrote The Better Angels of Our Nature and I plotted graphs on number of wars, number of war deaths, I kind of became part of a political science community.

[474] I mean, I shouldn't take credit for it.

[475] I got the data from organizations that keep careful track.

[476] And there are political scientists who've been debating this for many years.

[477] But there are debates among these people as to what drove down the number of wars?

[478] Why don't we have world wars or even old -fashioned wars where you have two armies between two countries, each flying a flag, meeting on the battlefield, chucking artillery shells at each other, huge tank battles, huge naval battles.

[479] The wars that we have tend to be civil wars, where you've got the popular front for the liberation of whatever, fighting some crummy government in a poor country.

[480] It isn't like two rich countries with their armies facing each other on the battlefield.

[481] So what drove that change?

[482] And there are different theories.

[483] One theory is that democracies are less likely to fight each other because when people have skin in the game and it's their sons who have to fight a war, they might be a little more squeamish than a leader who can do whatever he wants and drag the whole population in.

[484] That's one theory.

[485] The other theory is that we become more commercial.

[486] Interdependency.

[487] Yeah.

[488] Yeah, we're more economically interdependent.

[489] We're probably not going to go to war with China because they make all our stuff and we owe them too much money.

[490] Yeah, yeah, yeah, yeah.

[491] And China would be kind of crazy.

[492] The reason that the government has maintained its control and the reason that China's been so successful is they sell the world a lot of stuff.

[493] You can't kill your customers.

[494] You don't kill your customers.

[495] Exactly.

[496] That's another theory.

[497] Customers always right.

[498] I mean, you don't kill them.

[499] It's the opposite.

[500] They're always right.

[501] Right.

[502] Another theory is there's lots of international organizations.

[503] There's NATO, there's the EU, there's the UN, and leaders kind of see each other, and they develop kind of norms about what civilized people just do or don't do.

[504] And declaring a war has become kind of uncool.

[505] Putin did grab Crimea, but he's paid a price, and no one says, oh, yeah, that's okay, you can have it.

[506] It's like, what a faux pas. As the kids would say now, not a good look.

[507] Not a good look.

[508] It's just not a good look.

[509] One political scientist, a guy named John Mueller, he and I kind of go

[510] back and forth on that, where I say, look, if you say that people stop fighting wars because war became less popular, it's kind of circular, isn't it?

[511] It's like the old story about the guy who's asked to explain how sleeping pills work, and he says, well, it's their dormitive power.

[512] Okay, that's just a word.

[513] Yeah, yeah, yeah.

[514] It's a different word.

[515] Right.

[516] I mean, it doesn't really explain anything.

[517] John says we might have to get used to that.

[518] It may just be that the reason there are fewer wars is that war is less popular. Maybe it is like the bustle. The bustle's not going to come back, probably. Yeah, there's so many theories, because even as you're saying it, I'm saying one thing that dramatically changed: we used to receive a very curated story about whatever war campaign we were in. Yeah. And that's not the case now. You're going to see every angle. Someone's going to hold up a cell phone. You're going to see what it really looks like. It's not guys with American flags. It's just like helter -skelter chaos and kids. We see a different version of it now. I think that's absolutely right, and that's huge, because there used to be military censors who would prevent the New York Times or the L .A. Times from publishing photos of gory war casualties because it would be bad for morale.

[519] They can't get away with it as much anymore.

[520] And it was said when I was a kid and the Vietnam War was raging, people said it's the first war that was brought into people's living rooms.

[521] Yeah.

[522] In real time.

[523] And immediately, people weren't in favor of that war in the way they were World War II, right?

[524] They saw how gnarly it is.

[525] I think that's right, yeah.

[526] You think of the war you grew up with. My Lai, those people would have had cell phones.

[527] Yes, right.

[528] We would have seen how horrific that was instead of waiting 30 years to uncover it.

[529] Interesting.

[530] Right.

[531] But also, along with it not being cool, it's also that technology has made it so that you don't have to be face to face, and whoever killed the most people is the more dominant one.

[532] The more dominant one is the more technologically advanced society.

[533] So that's, I think, how we define dominance now is just totally.

[534] That's right.

[535] Wealth doesn't come from land, so it's not that the country that has the most land, or even the most resources.

[536] If that were true, the world's richest country would be the Democratic Republic of the Congo.

[537] I mean, they're just overflowing with diamonds and cobalt, but it's not a rich country, it's a poor country.

[538] And whereas there are countries like Singapore and Israel and Taiwan.

[539] I mean, they're the new oil fields, basically, if you think of it that way.

[540] Yeah.

[541] And in general, even countries that have both resources and know -how, it's really the know -how that makes them rich.

[542] There's the cliche that there's no silicon in Silicon Valley.

[543] And some other country wanting to get rich, if they try to invade Silicon Valley, they'd kind of be missing the point.

[544] Yeah, they would like stop for a coffee at Starbucks and be like, which one of these buildings is going to be the keys to the kingdom?

[545] I don't know.

[546] They all look huge.

[547] Exactly, yes.

[548] So why I'm attracted to you and your messages, I do think it's ultimately optimistic, and I personally am optimistic about our experiment.

[549] I had an anthropology teacher at the end of our course tell us, whatever happens, I want you guys to recognize how new this experiment is, and I don't want you to lose faith in it.

[550] I want you to observe our civilization in full context of that we appear at December 31st at 11 p .m. on the geological calendar.

[551] Like, keep that in mind, right?

[552] This is brand new and we're doing pretty damn good.

[553] And I took that to heart, and it's kept me an optimist.

[554] Now, I am far more pessimistic about people, only because I don't trust anyone on the planet more than myself.

[555] And I am so fucking fallible.

[556] It's insane.

[557] And this is with a ton of tools and checking in regularly in a community, keeping me accountable.

[558] And it's still nearly impossible.

[559] So I just wanted you to touch on this thing that I

[560] heard once, and it might have been even from, oh, who do we love that talks a lot about empathy?

[561] Oh, a Paul Bloom?

[562] Paul Bloom.

[563] I think I might have heard this.

[564] He's a former student of mine, yeah.

[565] Oh, that's so cool.

[566] Yeah.

[567] We're having him on soon.

[568] I think he pointed out, I want to say it was him I got this idea from, but that the ability of an individual to be rational is almost impossible, but the ability of a group of people to be rational or a system to be rational is really achievable.

[569] What are your opinions on that, the individual versus the collective or versus a system that can be put in place to help us be rational?

[570] Yeah, I think that that's more or less true.

[571] It can't just be a group.

[572] It's got to be a group that's operating with the right rules.

[573] And the rules have to be what is going to bring us to truth, rationality, objectivity.

[574] And so a good contrast is, on the one hand, Wikipedia, which is not perfect, but it's pretty awesome, considering that it's just a bunch of amateurs.

[575] And when people compare it to say Britannica, it's kind of up there.

[576] It's not that different.

[577] So compare that to say Twitter.

[578] And if you did a sample of tweets, how accurate are they?

[579] Well, not so much.

[580] So they're both groups.

[581] They're both networks.

[582] The difference is that in Wikipedia, there's a set of principles.

[583] If you become a Wikipedian, you've got to adopt a neutral point of view.

[584] Everything has to be sourced.

[585] You can't use it as a soapbox.

[586] to make an argument.

[587] Yeah.

[588] It's not a place to propose theories.

[589] Exactly.

[590] And whatever you put down, someone else can erase.

[591] And that, in turn, can't just be taking turns, writing something erasing it, writing something erasing it.

[592] There are then kind of higher authorities, more Wikipedia wizards, who adjudicate.

[593] So there's a whole set of rules that make it so that the back and forth, give and take, the cacophony of voices, will kind of blunder its way towards something that's likely to be true.

[594] Yeah, inch its way towards the truth every iteration.

[595] Yeah.

[596] Twitter goes in the opposite direction.

[597] Like, you say something inflammatory and then I pour more gas on it, and by the fifth person who comments you're talking about Nazis and all these archetypes of evil, and you're like, wait, wait, wait, we're talking about a rubella vaccine.

[598] What the fuck?

[599] How is it ever involved in this?

[600] How'd this happen?

[601] Well, there is that law, and I forget who it's named after, that, this is way before social media, this was just in, like, online bulletin boards, where the probability that someone will be compared to Hitler approaches one.

[602] Oh, my God.

[603] That's amazing.

[604] Wow.

[605] But likewise, in other communities, so the point that Paul Bloom made, and Jonathan Rauch makes it in his book, The Constitution of Knowledge, and I confronted it in writing Rationality, in grappling with this paradox that the three of us talked about at the top

[606] of the session, namely, on the one hand, we're not a stupid species.

[607] We discovered DNA.

[608] We got to the moon.

[609] We discovered cell phones and vaccines.

[610] I like that you say we doubled our lifespan.

[611] There's no other animal that's ever doubled their lifespan, yeah.

[612] Yeah.

[613] On the other hand, there's an awful lot of poppycock circulating and we're vulnerable to all kinds of misinformation and fake news and paranormal woo -woo.

[614] So how do you resolve them?

[615] How can a species whose individuals are so susceptible to nonsense, put their heads together to come up with something that is actually true.

[616] So it is true that it comes out of groups, comes out of groups where people can spot each other's fallacies.

[617] If one person gets carried away with their ego, well, someone else probably has a big ego too.

[618] They can counter that person.

[619] It's kind of like peer review, right?

[620] I mean, ultimately.

[621] That's a perfect example.

[622] So peer review, democratic checks and balances.

[623] I think James Madison, when he was writing the Constitution, said ambition should counter ambition.

[624] Can we really quickly?

[625] I would imagine we agree on this.

[626] I am often shocked by my liberal.

[627] I'm liberal.

[628] Let me own my liberalness.

[629] But I'm in a liberal bubble.

[630] And I do see on the faces of many of my liberal friends that they truly believe in their cells that if the government was 100 percent Democrats, this place would be an Eden.

[631] And I simply don't think that.

[632] I think the right has a super valid opinion.

[633] They are there to poke holes in our arguments. Like, I see this two -party system, as flawed as it is, as at least some form of checks and balances or peer review.

[634] You're going to get attacked if your idea isn't strong.

[635] I think we need each other.

[636] Absolutely.

[637] And that mindset is at the basis of democracy, that societal improvement doesn't depend on giving power to good people.

[638] Because no one's that good.

[639] No one's that smart.

[640] You know, no one's that wise.

[641] We think we are.

[642] Again, that's part of the problem.

[643] We all think we are.

[644] We can't all be right.

[645] And some of my favorite policies over my lifetime have turned out to yield the opposite result we were hoping for.

[646] That can happen.

[647] No one's smart enough to play everything out in their imagination, in the theater of their mind's eye.

[648] So you've got to be able to learn.

[649] You've got to be able to evaluate policies and say, well, gee, did it really reduce crime?

[650] Did it really reduce pollution?

[651] And some things do.

[652] And some things don't.

[653] And you've got to have opinions voiced and ideas scrutinized and criticized and attacked on their content, of course.

[654] If it's just people attacking each other's characters, that's not going to get you anywhere.

[655] Yeah, I'm not taking medical advice from a bald guy.

[656] Wait, wait, wait, wait.

[657] Wait, why are you talking about me being bald?

[658] Or like, I don't know if you ever caught the original Saturday Night Live weekend update where they had a mock debate between a liberal and conservative played by Dan Aykroyd and Jane Curtin?

[659] Yes, and he'd say, Jane, you insufferable bitch.

[660] That's a play.

[661] No, no, Jane, you ignorant slut.

[662] Oh, wow, yeah.

[663] Jane, you ignorant slut.

[664] And she'd counter with, Dan, you pompous ass.

[665] It's too bad we don't have that anymore because, as usual, nowadays, a lot of people who are eager to find signs of sexism and racism in everything don't get jokes.

[666] And so people say, oh, Saturday Night Live is so sexist.

[667] They actually called one of the comedians a slut.

[668] They said, no, no, no, no. That was actually making fun of people.

[669] Oh, as you might guess, as two comedians, we're very, very sensitive to this.

[670] What frustrates me the most is ask yourself who's being made fun of.

[671] You must, that's the smallest requirement before you attack this joke.

[672] They just hear a word and freak out.

[673] Yeah, who is being made fun of?

[674] Quite often, we act like other people to parody them to point out that they're absurd.

[675] So you can't not have those characters in our media.

[676] The first time I ever wrote an article for The New York Times, there's an editor who said, the second rule of journalism is never use irony, because there'll always be a large percentage of readers who won't get it, who think you're being straight.

[677] Now, of course, that was an exaggeration.

[678] Now, as comedians, that, oh, my God, that must be...

[679] Well, to be honest, I don't do print interviews.

[680] I'm like, I'll talk to anyone in front of a camera so you can feel my intention through my inflection, but if you try to read my inflection, it's

[681] going to go south.

[682] It always does.

[683] Even with emojis.

[684] Yeah, yeah, totally.

[685] But part of it circles back to what you were talking about earlier, where negative things get the biggest clicks.

[686] So people, the media, want to take that line from SNL and make it a derogatory story.

[687] They don't want it to be about how funny the sketch is.

[688] They want it to be inflammatory.

[689] No, that's right.

[690] When you brought it up, I wanted to add, and you would do a better job than me, but don't feel bad that that bad news is more powerful.

[691] When we look on a biochemical level, the actual chemical you receive, for a good job, you found a tree with fruit, remember this tree with fruit, that's dopamine and whatever else, versus I ate this, I got ill for five days, you get a different chemical message in your brain that is on a chemical level 10 times stronger, or however many times stronger, right?

[692] Like our chemistry, we can't change that.

[693] The negative is going to be a more powerful chemical.

[694] Yeah, I think that's right.

[695] And there's a reason why the chemistry, is asymmetrical, which is that bad things can hurt you a lot more than good things can help you.

[696] So at the extreme end, there's death, and death isn't something that just really, really sucks.

[697] It's kind of worse than that.

[698] It's game over, and you don't get to play again.

[699] And so we're very, very conscious of the downside.

[700] On the upside, if I were to say, and actually, I got this from Amos Tversky, Kahneman's collaborator, when he and I were colleagues at Stanford many years ago.

[701] This has stuck with me. It was decades ago.

[702] And he said, how many things could happen to you today that would make your life much, much better?

[703] Well, I don't know.

[704] I mean, maybe I could win the lottery if I bought a ticket or how good can it get?

[705] And also he said, and how much better can you imagine feeling than you feel right now?

[706] Well, I guess I could have a spring in my step.

[707] I can be kind of euphoric, I suppose.

[708] And if you've done illicit drugs in your life like I have, you're looking at 3x better?

[709] Exactly.

[710] Yeah.

[711] Now flip the question.

[712] How much worse?

[713] could you feel than you're feeling right now.

[714] Oh.

[715] Yeah.

[716] And how many things could happen that make your life much, much worse?

[717] Oh, my God, it's bottomless.

[718] Yeah.

[719] That's why we live in a state of fear.

[720] Yeah, well, that's why fear evolved to be more powerful than pleasant anticipation.

[721] There really are more bad things that can happen, and they can be objectively worse for you than the good things that happen.

[722] And that's wired into our emotional palate.

[723] The problem is it means that we're miscalibrated.

[724] When there are things that don't depend on our day -to -day experience, but can be demonstrated with data and statistics, we tend not to appreciate them.

[725] Cases where there really have been spectacular improvements, undreamed of in our evolutionary history, and we have trouble swallowing them, ironically.

[726] Yeah.

[727] I bet a lot more of our decisions are made than we're aware of in avoidance of something than in pursuit of something.

[728] That's probably right.

[729] We are loss averse, as Tversky and Kahneman put it.

[730] Yeah.

[731] Okay.

[732] So this is the big one that I want your opinion on because I don't hear it talked about a lot.

[733] See if I can lay this out in a coherent fashion.

[734] Basically, we have objective data.

[735] We have an atomic reality.

[736] But that atomic reality passes through our body.

[737] It passes into our eyes and through this filter, this human filter, which is just rife with subjectivity.

[738] I think we fetishize reality, and almost the notion that we could make these rational decisions a

[739] little bit, I fear, ignores that all that data and objectivity is irrelevant other than it's passing through us and we're experiencing it.

[740] So I'm doing a terrible job asking this question, but do we factor in emotional reality?

[741] I guess that's the question.

[742] As part of our objective to make people feel safe even when it's perhaps irrational, like an example I could give you is when you talk politics and you hear people on the right talk about immigration, you hear people on the left talk about immigration, you hear one side throwing a bunch of numbers at the other side, and it's not at all alleviating their fear.

[743] It's not addressing it.

[744] It's not attempting to comfort it.

[745] And I think we sometimes are stuck in these paradigms where there's a reality and we should be in search of it.

[746] And I do wonder if it ignores what the emotional reality of being on this planet is.

[747] What do we service, I guess?

[748] Do we service atomic reality or do we service the experiential emotional reality.

[749] I don't know.

[750] You're absolutely right.

[751] It is other humans that we deal with, including political leaders and influencers.

[752] And we can't pretend that they're not human.

[753] And so people's emotional reaction is something that we do have to take into account, even anticipating cases where everyone's going to be a bit irrational.

[754] Well, if you're rational, that factors into your rationality, namely, how can I get done what has to be done, such as public health, such as dealing with climate change, given that people are going to react to my message in sometimes emotional ways. And really effective leaders, ones who we actually value, not just effective in the sense that they get things done, but that they get the right things done.

[755] They probably know how to combine the right policy with the right message and the right affect.

[756] Franklin Roosevelt might have been the paradigm case where he had to get the country out of the Depression.

[757] He had policies, and economists still debate how effective they were, but the Depression did end.

[758] But he also made people sing Happy Days Are Here Again.

[759] And he had a jaunty air, even though he himself was a paraplegic, but he exuded confidence in a way that created its own reality, because part of the problem was, how do you jumpstart an economy where everyone is withdrawing their money because they think it's going to get worse, which makes things worse.

[760] How do you revive the animal spirit so that people are willing to take the chances?

[761] Well, that is itself a problem in the emotions.

[762] And I don't want to use the word emotional engineering.

[763] That sounds kind of ominous.

[764] Eugenic -y, yeah.

[765] Exactly.

[766] But nonetheless, it is true that, especially in a democracy where people's emotions matter.

[767] The difference between a demagogue and what we might call an effective leader, a great leader, is the great leader combines the rationally optimal policy with the emotional packaging, messaging uplift that gets people to accept it.

[768] It's a unique combination.

[769] And maybe a stupid example I could give is basically if Monica's cold and I look at the thermometer in the house and I say, oh, no, it's 74.

[770] You're not cold.

[771] Humans aren't cold at 74.

[772] Like, I might be atomically correct or biologically correct, yet it's completely irrelevant to her.

[773] I'm always cold.

[774] The person.

[775] You can't tell me I'm not cold.

[776] Oh, yeah, thermostat wars.

[777] Yes.

[778] Oh, if you're married, you know all about thermostat wars.

[779] Oh, yes.

[780] I can't tell you how often my wife says it's freezing here.

[781] I go, honey, it is the same degree in this bedroom every night.

[782] So it couldn't have been warm last night and freezing tonight.

[783] But yeah, man, I wish I would have been able to lay that out more.

[784] I just wonder, is science infusing the fact, the reality that all of this stuff is irrelevant unless the experience of it is confronted or brought into play or evaluated or measured or serviced as opposed to just we know the objective truth and you've got to get on board with it.

[785] That seems like that hasn't been the most productive strategy over the last couple years.

[786] To put it mildly.

[787] Yeah.

[788] So part of it is, and people are different.

[789] They differ in terms of how much they enjoy thinking and reasoning and persuasion.

[790] In a democracy, you've got to appeal to a lot of people.

[791] So there are people who are willing to listen to the argument, crunch the numbers.

[792] There are others, and probably most people, they align themselves with the authorities they can trust.

[793] And this includes people who we like to think end up on the right side of issues.

[794] So, for example, the people who acknowledge human -made climate change, they don't do it because they understand atmospheric chemistry.

[795] In fact, polls show that a lot of people who believe in human -made climate change are kind of out to lunch about the causes.

[796] They think it's global warming, it has something to do with the ozone hole, and it's about plastic straws.

[797] Uh -huh.

[798] S -U -Vs?

[799] Your SUVs, at least, there's a little bit of a contribution.

[800] Yeah.

[801] I read Bill Gates' book on it, and if you really break it down, the things we focus on, they, too, are irrational and asymmetrical.

[802] Comparatively, yeah.

[803] Yeah.

[804] No one's talking about concrete.

[805] When's the last time you saw a commercial about how to get green concrete?

[806] Exactly, and fertilizer.

[807] Yeah, steel manufacturing, all these things.

[808] So that is right.

[809] And to the extent that people, do end up with opinions that are consistent with the science, it's because they trust the scientists.

[810] The guys in the white coat, the gals in the white coat.

[811] They're my kind of people, and I believe what they say, and they've had a good track record so far, so I'm not going to actually take a course in chemistry; if they say it, it's probably true.

[812] And we all have to do that.

[813] None of us knows everything.

[814] It can be a problem if the scientific establishment seems too aligned with one side of the political spectrum.

[815] If people think scientists are on the left and they're on the right, they'll say, well, why should I trust the scientists?

[816] So I tell my fellow scientists this.

[817] We've got to safeguard our own credibility by not appearing to be just another propaganda faction.

[818] Well, in the numbers, yeah, I was just at this conference and there was a great professor from Stanford who was of an opposing opinion to me, but I couldn't possibly argue his data, which was like, it's not like it's 80 % liberal on campuses.

[819] It's really, really high.

[820] Like, colleges are incredibly liberal if they're plotted on a political spectrum.

[821] It actually is a problem, the lack of viewpoint diversity.

[822] My friend Harvey Silverglate, the civil liberties hero and lawyer, says that on a modern college campus, diversity means people who look different and think alike.

[823] Yeah.

[824] Oh, wow.

[825] I went to, as we established earlier, University of Georgia, which, God, I mean, would I say it was liberal? Not really.

[826] Also, when we think about this topic, we are often thinking about elite schools in this country.

[827] And they are often very liberal, very skewed.

[828] But there are a lot of other colleges.

[829] Backwaters?

[830] Like Georgia?

[831] I guess we'll call it if I have to make the point.

[832] I mean, you're going to see Georgia's, yeah.

[833] It's a pretty major university.

[834] But you're right that there is a lot of variation.

[835] And there's also variation across faculties within schools.

[836] So arts and science tend to be the most left.

[837] And within them, for example, cultural anthropology is almost off the scale compared to...

[838] That was my major, yeah.

[839] Yeah, yeah, yeah.

[840] Compared to economics.

[841] And then if you go outside of arts and science, the school of engineering, the business school, they tend to be less leftist.

[842] The public health schools tend to be more leftist.

[843] So you're absolutely right.

[844] There is variation.

[845] Still, if you compare universities as a whole, they are always to the left of the population.

[846] Now, in the title of your book, Rationality, why it seems scarce?

[847] Why does it seem scarce?

[848] So I was careful there.

[849] Originally, it was going to be why it's scarce.

[850] But I wasn't willing to concede that people are irrational, because a lot of the people who believe in these kooky theories, in QAnon, in chemtrails.

[851] Will you tell us about chemtrails?

[852] We have a show called Armchaired and Dangerous where we explore conspiracies and attempt to debunk them.

[853] Oh, oh, yes.

[854] But I'm not familiar with chemtrails.

[855] What are chemtrails?

[856] Oh, chemtrails.

[857] Oh, my goodness.

[858] You didn't know that those jet contrails in the sky are actually mind-altering drugs dispersed in a secret government program to pacify the population?

[859] You haven't done your research.

[860] Oh, man. I know.

[861] Also, if they think it's to pacify the population, are they not paying attention?

[862] It's not doing a very good job.

[863] So, if you believe in that, you'd have to acknowledge it's a flawed gas.

[864] Yeah, they need to keep R&D in the gas.

[865] The people who believe that, presumably they hold jobs, they pay their taxes, they keep gas in the car, they get the kids to school.

[866] So it's not like they're irrational across the board.

[867] And can I add, this goes right back to the conversation we're just having, which is I don't believe the way through to that person is to explain to them how that's physically impossible.

[868] Right.

[869] I think the route through that is like, what is it they're really afraid of?

[870] How do we find out what they're really afraid of and address that?

[871] Because we could spend the rest of our lives arguing how plausible that system of inoculating the population is.

[872] It's not.

[873] But why do that?

[874] That's not even getting close to what's going on.

[875] Right.

[876] So basically, the people who subscribe to a lot of these conspiracy theories hate the establishment.

[877] They distrust the establishment.

[878] And they're willing to say that establishment politicians, business people, intellectuals, celebrities are capable of anything.

[879] And you might say, well, wait a second.

[880] Do you literally believe that there's a cabal of Satan-worshipping pedophiles, pedophilic cannibals?

[881] Children in their basement.

[882] Yeah.

[883] So, yeah, I mean, they sacrifice them and drink their blood.

[884] And when it comes to things like that, it's not clear what you mean by believe. For a lot of people, the view that all of your beliefs should be verified as true or false.

[885] You should only believe the true ones.

[886] That's kind of a pretty, it sounds obvious, but it's a pretty exotic belief.

[887] It's kind of a scientifically informed, post-Enlightenment mindset.

[888] But for most of human history, there were things that you couldn't experience in your day-to-day lives.

[890] Like, where does disease come from?

[891] What was the origin of the universe?

[892] What do people really do behind closed doors in the White House or in the executive suite of General Electric?

[893] You can't find out.

[894] So your beliefs aren't things where you care whether they're actually true or false.

[895] They are ways of building up solidarity with other people.

[896] They're good stories.

[897] They're morally edifying.

[898] They're enlightening.

[899] They're empowering.

[900] And people hold a lot of beliefs, not so much that they're committed to the literal truth or falsity, but because they are morally good things to believe.

[901] They identify the good guys and the bad guys.

[902] Yeah, this good and evil.

[903] Yeah, if someone says that Hillary Clinton ran a child sex ring out of a pizzeria, are they really just saying, Hillary, boo?

[904] She is so bad that she, for all we know, she could be doing that.

[905] That's how bad she is.

[906] Right.

[907] Yeah.

[908] Now, granted, there are some people who take conspiracy theories, literally.

[909] There was the guy who burst into the pizzeria with his guns blazing.

[910] There are people who stormed the Capitol believing that the election had been stolen.

[911] And so that's why conspiracy theories stop being harmless when people switch over from, oh, what a great story, to it really happened.

[912] It can be really dangerous.

[913] Yeah.

[914] But I think for a lot of people, it's just like when you challenge some people on their religious beliefs. They say, do you really think that Jesus rose from the dead and walked on water and healed the sick?

[915] And it's like, well, it's not a matter of proving it true or false.

[916] It makes my life meaningful to believe it.

[917] And how can you prove that he didn't?

[918] And I think a lot of beliefs are like that.

[919] Like, it expresses my values.

[920] I want to hear what you think about my theory on the conspiracy theory uptick, or what at least seems like an uptick.

[921] And I'm very compassionate to it, which is if I am excluded from a system, if I'm not thriving, if I can't find a job, and I'm not prospering in any way, this system has excluded me and I'm worthy of inclusion.

[922] And people should feel that way.

[923] Why am I excluded from this system and other people are included?

[924] Well, the system has to be broken or it would include me, this nice person.

[925] Like, that is a thought process you should have.

[926] You should value yourself enough to think you're worthy of being included into this system that other people seem to be prospering from.

[927] So I think as more people get displaced by the changing landscape of our manufacturing and all these other elements, the option of accepting I'm just not thriving in the system that does allow people to thrive would be such an admission of their own failure.

[928] That's not going to happen.

[929] So it becomes very plausible that the system's rigged against you and that there's people that are intentionally doing this.

[930] Like, to me, that all follows.

[931] If you first feel excluded by the system... Oh, absolutely.

[932] I think there's a lot of insight there, that people just feel alienated from the system.

[933] And so they're willing to believe anything of it.

[934] They're willing to express anything of it.

[935] At the extreme, there is a hard core of people who have what is sometimes called the need for chaos.

[936] They just think the whole system is so corrupt and evil that they'd be happy to watch it burn.

[937] Yeah, that's scary.

[938] Which is scary because things can get a lot worse and people can make it a lot worse.

[939] But a lot of the most destructive elements in politics and voting and the embrace of conspiracy theories are in people who just think the entire edifice is rotten.

[940] The only real fix we have for this is the time machine that puts people back in the 1500s for like two weeks.

[941] That's it.

[942] You send them there for two weeks and then they come back and you go, how you like this place?

[943] I'm going to say something a little controversial here here at the end.

[944] It's that they do feel excluded, but because they feel entitled to the inclusion.

[945] It's not like they had been given all this stuff.

[946] Like I don't know percentage-wise how race plays into these conspiracy theories, but from what I can see of the people storming the Capitol, of the people I know in QAnon.

[947] I don't see very many black people there because they don't have this sense of...

[948] But you see them at a BLM march breaking windows because this system has excluded them.

[949] And they're going, this system fucking sucks if you can kill us at three times the rate of white people.

[950] Well, yeah.

[951] We reject this system.

[952] Right.

[953] And those are real things that are happening to them as opposed to I'm entitled to be rich and yet I'm not so I'm mad.

[954] Well, but even if you feel entitled to have employment, you live in the middle of Ohio and there were like five manufacturing plants, and now all there is is opiate addicts, you're like, this system is flawed.

[955] Yes.

[956] So I don't respect it.

[957] I have no respect for it and I don't mind dismantling it.

[958] Yeah, but it's like, but are we entitled to employment in this fashion?

[959] Like, I don't know.

[960] I'm not entitled to be an actor.

[961] I want to be one, but that doesn't mean I should be one.

[962] I think you deserve to be.

[963] And if you don't become a very famous actor, you know, you don't become a very famous actor.

[964] You should burn down Paramount.

[965] Okay, all right.

[966] As long as I have permission.

[967] I just think it's interesting that there's that.

[968] No, I agree that is a profound observation.

[969] And some of the popularity of authoritarian populism comes from groups who feel that they've been displaced from a position of influence and whose status has gone down.

[970] And that's why not just in the United States, but in Europe, the populist support often does come from the ethnic majority that is not as dominant as it used to be.

[971] Yeah.

[972] And Monica, you're also right that even the idea that governments have the responsibility of keeping everyone employed and keeping everyone prosperous is historically kind of recent.

[973] I suppose it's probably since the Great Depression and Roosevelt.

[974] But beforehand, governments were there to prevent your country from being invaded and its people carried off as slaves.

[975] Maybe to prevent you from being stabbed by highwaymen and brigands.

[976] Maybe collect some taxes for roads and garbage collection.

[978] The idea that the government should have both the responsibility and the competence to manage an economy, which is just what we expect of governments now.

[979] It's a huge ask, but I will just point out really quick, those people are witnessing that when the elites fuck up, they are taken care of.

[980] So if they're running Bear Stearns, there will be a safety net.

[981] So it does appear that the government is protecting certain people.

[982] Well, and sometimes the government is protecting certain people.

[983] Yeah, yeah, objectively.

[984] That's true.

[985] Okay, so why it seems scarce.

[986] Did we answer that?

[987] Oh, one of the reasons is that we're just so partisan. The myside bias, namely that you endorse the beliefs that make your coalition, your team, your tribe, your posse, your political party, your sect look good and the other side look stupid and evil, is a really powerful motive.

[988] So there's that kind of tribalism, sectarianism, myside bias, as it's called.

[989] There's the fact we all have certain primitive intuitions that are probably wired into us because that's the way you survive in a natural environment, such as that diseases are caused by pollutants or contaminants, and so I don't want a vaccine.

[990] Are you kidding?

[991] That's actually injecting a germ into me. Conversely, if I am sick, I should vomit, I should bleed, I should get rid of toxins.

[992] These are kind of deep-seated primitive intuitions that are hard to unlearn unless you sign on to the scientific consensus, which a lot of people don't.

[993] Or that throughout the day I make plans, the tools, the food, everything around me has been designed by someone.

[994] You take a few steps beyond that and you can think, well, the universe must have been planned.

[995] Everything happens for a reason.

[996] There's, you know, synchronicity.

[997] There's...

[998] All this correlation we're seeing, yeah.

[999] That's right.

[1000] There's got to be something.

[1001] It can't be a coincidence.

[1002] So there are these primitive intuitions that not everyone has unlearned.

[1003] And then there's the distinction between beliefs that we hold because we think they're literally true and that really affect our lives, and beliefs that are good stories.

[1004] They're enlightening.

[1005] They paint the right people as good guys and the right people as bad guys.

[1006] And the idea that, no, that's not why you should believe something.

[1007] You should only believe it if it's true.

[1008] And not everyone signs on to that.

[1009] Most of us don't when it comes to certain sacred beliefs.

[1010] Can I give you my most mind-blowing experience with this? It just happened three weeks ago.

[1011] In fact, when I saw you and Sam debate, or him interview you, however you want to frame that, the tail end of it became a debate between your guys' respective views on where AI is going to go.

[1012] And he comes to it with great fear and you come to it with some optimism.

[1013] But that's not even the relevant point.

[1014] I was rereading Notes from Underground, Dostoevsky. I don't know when that was written, sometime in the 1800s.

[1015] And by God, I am reading a chapter where he goes into great detail that our mathematics has gotten to a point where these new math professors are so brilliant...

[1016] And they're so good at predicting outcomes that our free will is gone in a matter of minutes.

[1017] And I thought, oh, my God, this is the AI fear.

[1018] But he had it about mathematics.

[1019] Oh, humans just have a great fear of losing their free will.

[1020] Like, that's all that's really going on, that we hang the argument on different developments and technology.

[1021] But here's a guy in the 1800s that's convinced we won't have free will because calculus has gotten so good.

[1022] No, that's right.

[1023] I forgot about that from Notes from Underground, but he did. In The Brothers Karamazov, there are debates about it. One character says, it's just, it's amazing.

[1024] There are these little cells in our head and they have little tails, and when they vibrate...

[1025] That's all the thinking consists of.

[1026] That creeped me out.

[1027] I don't think I'm ready to believe that.

[1028] Yeah.

[1029] So he did grasp these things.

[1030] So I think there's that and the fear of determinism.

[1031] Am I out of the loop?

[1032] Is it all going on in my brain without me?

[1033] Is one of the fears?

[1034] Another one is the fear of technological runaway, which is also pretty old.

[1036] I mean, it goes way back. There's the Jewish legend of the golem, a clay monster that was animated by the name of God written on a piece of paper and placed in its mouth, and then goes on a rampage.

[1037] Frankenstein.

[1038] And even before that, Pandora's Box and Prometheus, he stole fire from the gods and didn't turn out too well for him.

[1039] He got chained to a rock and an eagle pecked out his liver forever.

[1040] But the idea that our own creations can turn on us.

[1041] Oh, uh, sorcerer's apprentice.

[1042] Oh, sure.

[1043] Including the Mickey Mouse version in Fantasia.

[1044] Yeah.

[1045] So the idea that our technology will summon black magic or dreadful powers from the gods is an archetype.

[1046] Primitive fear, yeah.

[1047] Primitive fear that I do think plays out in some of the AI scenarios.

[1048] Yeah.

[1049] Can I hit you with one last thing?

[1050] Because I've been in this program for 17 years, Alcoholics Anonymous.

[1051] I am both cynical of it and I witness what it does, right?

[1052] So it's been a very fun endeavor for me because I'm an atheist.

[1053] This program requires a belief in higher powers.

[1054] Higher power, right?

[1055] Yeah.

[1056] And, you know, at some point I had to make a decision in my head: do I value living enough to adopt something I don't believe for a minute, to save my life?

[1057] I think it's a unique situation to find oneself in, where your life kind of depends on you committing to some ideas that maybe you don't really hold.

[1058] But the one step in the program that I find to be such a breakthrough is the fourth step. If you're not familiar with it, all they ask you to do is list the people you have resentments against on the left side, right?

[1059] And you fill that page up, maybe there's 30 people, maybe there's 10, maybe there's 100.

[1060] And in the next column, you have to write what they did.

[1061] What do they do?

[1062] Well, Mike's always trying to get me fired, right?

[1063] That's why I resent Mike.

[1064] And the third column is, what does that threaten in your life?

[1065] Well, it threatens my economic security.

[1066] And then the last one is like, well, what could I do to prevent that from happening?

[1067] Well, I could stop being late.

[1068] If I don't give Mike anything to say to my boss, I can't get fired.

[1069] But the magic of this step is if you have a hundred people, Stephen, on your resentment list, I promise you.

[1070] I've never seen it go another way.

[1071] You have three fears that are being triggered by all these different people.

[1072] Mine are economic insecurity, my status, people thinking I'm dumb.

[1073] So all my issues with almost anyone I've ever had an issue with in my whole life have boiled down to these three fears I have.

[1074] And for me, it's been so helpful to identify what three fears I have, how they're running my life, and always check all of my emotions and my thoughts and my reactions against those three fears I have.

[1075] Oh, is this just that?

[1076] Oh, it is.

[1077] I'm constantly afraid I'm going to end up penniless.

[1078] Okay, so that threatened that.

[1079] And I wonder if being rational should involve recognizing what one's fears are, because they seem to taint so much how we process the information we're receiving.

[1080] I wonder what your thoughts are on that about our fears and how they taint us.

[1081] Oh, yes, absolutely.

[1082] People often say, well, how can you extol rationality?

[1083] It has these limitations.

[1084] It's not the totality of our emotional life.

[1085] It doesn't acknowledge people's irrationalities.

[1086] But the thing about rationality is it can always back up a level and take into account our own emotional life, other people's emotional lives, the fact that each one of us is irrational and factor in workarounds for those very limitations that it can recognize.

[1087] And if that doesn't work, if you thought you were going to deal with the irrationality and you don't, well, you can step up a level above that and say, well, where did I go wrong in factoring people's irrationalities?

[1088] Rationality is, as they say, recursive in that it can always think about itself and think about its limitations.

[1089] So that's why ultimately, as long as we're discussing anything at all, we've got to be rational about it.

[1090] And that does not exclude being rational about our irrationality, stepping back, popping up a level, taking into account the totality of human experience, including our irrationalities as something that our rationality ought to deal with.

[1092] Correct for.

[1093] I think of it in an experiment like you would correct for elevation.

[1094] So we know at what temperature water should boil at 5,000 feet.

[1095] I'm searching in my life for those corrections.

[1096] I think that's a great analogy.

[1097] It's exactly right.

[1098] And the thing with the 12 steps, I think, like a lot of successful social and religious movements, is that it might mix in a little bit of religious mumbo-jumbo, but with a lot of real human insight, including, say, identifying the people that you resent, the people that you have harmed, the commitment to make amends to the people you've harmed.

[1099] A lot of these things kind of make a lot of sense in terms of just a rational way of dealing with your life.

[1100] And if they're mixed in with surrender to a higher power, it might even be a metaphorical way of saying there are parts of myself that I can't control, parts of my world that I can't control.

[1101] And so there's probably some wisdom that is framed in ways that not everything is literally true.

[1102] Yeah, I think it's actually just a humility step, to be honest.

[1103] It's me saying something makes the sun come up and go down, and it's not Dax Shepard.

[1104] Yeah, I think that makes a lot of sense, which is why it has been successful for so many people, as well as tapping into the part of ourselves that really can exert free will, not in the sense of anything magical or mystical or outside of our brain, but the fact that we do have these massive frontal lobes that take in information from the rest of the brain, including our memories, our plans, and that really can implement certain decisions if they're engaged to do so.

[1105] This is what cognitive behavior therapy does, rational emotive therapy.

[1106] Right.

[1107] We've got this frontal lobe.

[1108] It's not always in control completely, but we should recruit it as much as possible.

[1109] Yeah, most of the best decisions are made up there, at least minimally.

[1110] Yeah.

[1111] Yeah, and also it becomes a system, it becomes an accountability system where people... you're saying something in AA and someone's like, yeah, I don't know if that's true.

[1113] When I was in this situation, I found out this.

[1114] So it is a peer review.

[1115] Ultimately, it's an accountability with built -in peer review.

[1116] That's interesting because going back to an earlier point in our conversation, just about anything that we managed to accomplish that we could call rational, we do in some kind of system with peer review, checks and balances, fact -checking, editing, one person's flaws are pointed out by someone else.

[1117] Yeah, I guess my big, big wish in the first step towards rationality for me would just be like, everyone have some humility.

[1118] It's so hard to have.

[1119] Start with the recognition that you're a super flawed computer that spits out wrong data all the time.

[1120] Oh, and that's the essence of science, too, is I think Richard Feynman had a great quote.

[1121] The first imperative of science is not to fool yourself, because you are the easiest person to fool.

[1122] Yeah, yeah, yeah.

[1123] One thing I wrote down in preparation to talk to you was that anyone who has a partner in life quickly recognizes they'd be much better at predicting what their partner will do in a situation than the partner themselves would be.

[1124] And that is true of myself, too.

[1125] My wife could tell you exactly the four next things that are going to happen when you introduce new stimuli to me, and I wouldn't be able to do that.

[1126] You know, my last thought was, and I've said that three times, but this truly is it, I often think this.

[1127] I often just try to recognize what we're up against.

[1128] As an anthro major, I have a pretty good sense of how we lived at one time.

[1129] And all of this great hardware we had to keep us alive in that environment.

[1130] And now we just live in an environment that doesn't really have the risks or the threats that it once had, yet we didn't get rid of all that hardware.

[1131] So how much of this stuff we deal with, the stuff that keeps us from being rational, is just that we have a hyper-alert system for fear and threats, when in fact we don't really live in that situation anymore?

[1132] I think that is a large part of it, and it isn't just that we used to get eaten by lions.

[1133] It's also that a lot of the apparatus that makes us more rational, data and logic and probability theory, are pretty recent inventions.

[1134] And even in modern societies, most people just don't get exposed to them.

[1135] It's not even so much we should look back to remember what it was like on the Savannah.

[1136] I mean, although there's something to that.

[1137] But it's, we should look at like 99.9% of human experience, which is face-to-face, day-to-day, here and now, not so academic, not so formal, not so mathematical.

[1138] We're lucky enough that we have developed data sets and randomized controlled trials and all of these tools to get us closer to the truth and communities with peer review and checks and balances.

[1139] But we didn't evolve with those truth-refining institutions.

[1140] We evolved in a case where it was kind of our own eyeballs and gossip and hearsay.

[1141] Now we can do better, but it isn't intuitive that that's how we do better.

[1142] We rely on our memories.

[1143] We rely on our images, our narratives, what people on our team reassure us about.

[1144] So we've got to kind of remind ourselves, we've got this infrastructure that really does get closer to the truth than any of our intuitions.

[1145] Yeah.

[1146] And we should safeguard those institutions, and not blindly trust them, because the whole point is you don't trust anything all the time.

[1147] You always should be somewhat skeptical, but you should assign credence where credence is earned.

[1148] If we really did wipe out smallpox and people don't get measles anymore, we should take that seriously.

[1149] How did we get there?

[1150] And the ways of thinking that got us there, we should put some more stock in.

[1151] Great.

[1152] And now the last thing I want you to tell me is why would a human being... Now, as you say, some people are interested in this.

[1153] I'm interested in this.

[1154] I cannot wait to read Rationality.

[1155] I have the personal goal of getting a more objective view of the issues I confront, because I don't love the emotional toll of being irrational, and I don't like the cortisol dump.

[1156] I don't like any of that.

[1157] I do aspire to choose the best thing when it's in front of me with some regularity.

[1158] That would make my life easier.

[1159] What would be a selfish reason someone should read Rationality?

[1160] Because in some weird way, you could imagine someone going, well, I'm not a scientist.

[1161] I don't set policy.

[1162] Why on earth is it incumbent upon me to be rational?

[1163] But I would imagine everyone's life could benefit with some rationality.

[1164] I'll give a couple of reasons.

[1165] One, it's got some good Jewish jokes.

[1166] It's got some good days.

[1167] Always love those.

[1168] But also, I like to think there's some interesting stuff in there.

[1169] But I do cite studies that show that people who commit fewer fallacies, fewer statistical and logical fallacies actually have better outcomes in life on average.

[1170] They're less likely to get sick.

[1171] They're less likely to get into accidents.

[1172] They're less likely to get fired.

[1173] Life really is better if you are more rational, on average.

[1174] I'm going to make the argument that if you care about social justice, about the expansion of equality and rights.

[1175] A lot of those social movements were driven by a rational argument.

[1176] Too much for us to go into right now.

[1177] Yeah.

[1178] But if you go back, there were very rational, logical people making the case against slavery, making the case for women's rights, making the case against war, making the case against dictatorship.

[1179] And so good things, I argue, come from being rational.

[1180] That's why the third part of the subtitle is why it matters.

[1181] Stephen, you're so radical.

[1182] I'm so glad we got to talk to you.

[1183] I'll just add, to people who have not read you.

[1184] Not only are you brilliant and have all the data, but you're just a beautiful writer.

[1185] So I put you in that Malcolm Gladwell category where it's like, I'll eat anything you're serving.

[1186] It's just going to be delicious.

[1187] So I really hope people will check out Rationality: What It Is, Why It Seems Scarce, Why It Matters.

[1188] Stephen, thank you so much for giving us so much time.

[1189] Yeah, we appreciate it.

[1190] Thank you, Dax.

[1191] Thank you, Monica.

[1192] I'm sorry we weren't in the same room together, but circumstances conspired today, but I hope we'll have another chance.

[1193] If history tells me anything, you'll write another book in about 2.8 years.

[1194] So I think we could do this again in person.

[1195] Either way, I hope we'll get to meet at some point.

[1196] Monica, it's great to meet you virtually and to speak with you.

[1197] Dax, it was great to see you again.

[1198] Thank you.

[1199] Thank you.

[1200] Thank you.

[1201] Take care.

[1202] Follow Armchair Expert on the Wondery app, Amazon Music, or wherever you get your podcasts.

[1203] You can listen to every episode of Armchair Expert early and ad-free right now by joining Wondery Plus in the Wondery app or on Apple Podcasts.

[1204] Before you go, tell us about yourself by completing a short survey at Wondery.com slash survey.