Armchair Expert with Dax Shepard XX
[0] Happy Thanksgiving, everybody.
[1] Turkey Day.
[2] Turkey Day.
[3] I stole that impersonation of a turkey from Aaron Weekly.
[4] Oh, that's his impersonation?
[5] He does such a good turkey impersonation.
[6] What a niche.
[7] I know.
[8] Before I knew him, I think I went gobble, gobble, gobble, gobble, gobble, gobble, gobble, gobble, gobble, go.
[9] Pretty standard.
[10] Yeah, but then he goes, and I think it sounds more like a turkey.
[11] Yeah, I think you're right.
[12] Well, it's a great day.
[13] It's a day to give thanks.
[14] Today on Experts on Expert, we have...
[15] Sam Harris, who everyone knows.
[16] I love his podcast, Waking Up with Sam Harris.
[17] And this has been a long time coming.
[18] We've talked about Sam endlessly.
[19] Yes.
[20] But what is really relevant, and I want to give a big, big warning, is that this episode, more than any we've ever done, probably, is, for lack of a better term, potentially divisive.
[21] And we're quite critical about religion in varying degrees between he and I and then Monica, there's kind of three points of view.
[22] But, you know, I just want to warn everyone that if that is a topic that will not make you feel good, I just want to warn you because it's quite heavy in that.
[23] Yeah.
[24] And another thing to just remind people is that this podcast is about learning about our guest and what makes them who they are and what they're interested in and what makes their work
[25] tick. Yes. Um, so this stuff that we are talking about is part of Sam's DNA at this point. It's, it's what he finds the most compelling, so there's no way we're going to be able to interview Sam and not talk about this stuff, because it's what he finds important. And that's okay, everyone can have different things that they... He's written books on it. Yeah. Yep. So if you're interested in Sam Harris, uh, please enjoy. It is a dense mother. Yeah, he's a smart cookie.
[26] Don't try to listen to it after you've gotten all that in your tummy because you'll probably get a little sleepy.
[27] That's true.
[28] And again, you all know how I feel about the menu.
[29] I don't love it.
[30] But most certainly right now, I've got at least 6,000 calories of white enriched flour by way of my favorite Hawaiian rolls.
[31] I don't mind giving King's a shout out.
[32] It's a free shout out.
[33] Boy, boy, boy.
[34] You know who gets me through this Thanksgiving?
[35] King's Hawaiian rolls.
[36] I can't believe you have to "get through" it. With the stuffing and the sweet potato casseroles, you have to get through that?
[37] I got to get through it, but I got to tell you, it's a nice, soft ride when I tear into that loaf of King's Hawaiian.
[38] So please enjoy Sam Harris and happy Thanksgiving, everybody.
[39] Wondery Plus subscribers can listen to Armchair Expert early
[40] and ad-free right now.
[41] Join Wondery Plus in the Wondery app or on Apple Podcasts.
[42] Or you can listen for free wherever you get your podcasts.
[43] He's an armchair expert.
[44] He's an armchair expert.
[45] So, uh, yeah.
[46] Sam Harris, I don't think there's been any person on our podcast that's been talked about more.
[47] I think we bring you up.
[48] If I had to say a percentage, I got to say it's 80%, right?
[49] At least, yeah, four out of five episodes, we either cite you or we talk about some other guests you've had on.
[50] So you are our North Star, even though our show doesn't resemble yours at all.
[51] Right, that's hilarious.
[52] Yeah.
[53] That's great.
[54] We even considered having a sound effect every time we bring you up.
[55] We wanted to play a few bars from your theme song.
[56] Oh, did we give you that music?
[57] You supplied it and then we got lazy.
[58] It's not that it didn't work.
[59] We never followed through.
[60] I'm sure it would work.
[61] But for folks who don't, I do think a lot of our listeners have checked you out because we do talk about you incessantly.
[62] But for folks who don't know about you, you have a philosophy degree from Stanford.
[63] Yep.
[64] Yeah, you're a neuroscientist.
[65] Well, I have a PhD in neuroscience.
[66] The truth is I function more as a philosopher of mind these days because I am not in the lab running experiments.
[67] Or at least it's been a while.
[68] It's been about a year since I published a proper neuroscience paper.
[69] And honestly, I went into neuroscience as very much as a philosopher of mind.
[70] I just wanted to know more about the brain as a basis from which to talk about the mind.
[71] My motives have always been to think and write and speak about this intersection between philosophy of mind and moral philosophy and scientific breakthroughs in understanding ourselves biologically and just our general conception of how to live a good life.
[72] You couldn't have been born at a more perfect time to have been interested in philosophy.
[73] And then also this technology comes along where you can actually look and see what's happening inside, right?
[74] To some degree.
[75] Yeah.
[76] Oh, yeah.
[77] Not totally there yet.
[78] But you can actually, you can propose questions to people, right?
[79] And then you can just watch what part of their brain is firing while they're evaluating that question, right?
[80] You can really study any act of higher cognition or emotion that is
[81] compatible with people being perfectly still inside a scanner.
[82] Right.
[83] Motion is the thing that would kill your data.
[84] So, and there are many things you can do, but there are many things you obviously can't do.
[85] But, you know, take studying someone athletically.
[86] You know, they need to be visualizing what they're doing, but they can't be actually doing any of it.
[87] You also are an author.
[88] You wrote The End of Faith.
[89] You wrote The Moral Landscape, and you wrote Waking Up: A Guide to Spirituality Without Religion, and you have the Waking Up podcast, in which I assume you stole that title from your own book.
[90] Indeed.
[91] That's a safe assumption.
[92] But I came to know you.
[93] I believe a friend of mine recommended that Jonathan Haidt...
[94] I never pronounce that correctly.
[95] Haidt, yeah.
[96] He hates when you say hate.
[97] As he should, as he should.
[98] But yeah, the first podcast of yours I heard was, but let me back up.
[99] That's not true.
[100] Many of you might know.
[101] And when I'm explaining to friends who you are, I go straight at, well, he's the guy who fought Ben Affleck on Bill Maher.
[102] Right.
[103] Yes.
[104] Would you say that's one of your more famous moments?
[105] It is for better or worse.
[106] Yeah, that went more viral than most things.
[107] Yeah.
[108] And that was before I was even like a big consumer of your thought process or anything.
[109] But the thing that was really glaring to me in that debate where I was getting frustrated was you were attacking an ideology.
[110] And he was insisting you were attacking a group of people.
[111] Right.
[112] And would you say that was kind of the division in your mind that was happening?
[113] Yeah.
[114] Because it was about Islam and Muslims, and you're very critical of Islam, as you are very critical of Christianity and Judaism and probably all organized religions.
[115] And I think you were trying to make the distinction saying, not only are we allowed to, but it's incumbent upon us to attack bad ideologies, right?
[116] We have some obligation to point out bad ideologies.
[117] Is that fair?
[118] Yeah.
[119] So the distinction is utterly clear in my mind, but it's surprisingly difficult to get across for most people.
[120] So, and sometimes proves impossible.
[121] So the, it's the distinction between ideas and people.
[122] And, you know, I have a very strong sense.
[123] I think there's a fair amount of evidence to back this up that all the, the mayhem we see created by people in the world, most of it is not the result of bad people doing bad things they would do anyway because they're bad.
[124] It's the result of good people, or at least psychologically normal people, people just like ourselves, under the sway of bad ideas.
[125] I think bad ideas are like bad software.
[126] Yeah.
[127] So bad ideas are far more powerful than bad people.
[128] Again, it's not to say that, you know, the Hitlers don't exist, or the people who are true psychopaths or sadists who will do bad things in whatever context they find themselves, presumably.
[129] But ideas are really what do the heavy lifting for us.
[130] And in the most perverse case, you have people who are genuinely good people, really compassionate people, really committed to helping others.
[131] But their conception of what helps is nuts, right?
[132] And completely at odds with any rational project for maximizing human well-being.
[133] So there are people who are, you know, in the Christian context, who think that Jesus is going to come back in their lifetime and throw sinners like me into a lake of fire.
[134] And that's a good thing because, you know, God, the creator of the universe, all-knowing and all-loving, has for some reason created the universe this way.
[135] It's this kind of massive psychological experiment where you're tested to see whether or not you can believe in him on bad evidence.
[136] Right.
[137] And that this is just the way things should be.
[138] And they're genuinely moved.
[139] Some of these people are genuinely moved by compassion to spread the gospel to others.
[140] And so, and this is true, even with a group like ISIS, you know.
[141] It's like, many people think that ISIS was simply attracting all the world's psychopaths to one place.
[142] And these are people who would do terrible things anyway.
[143] But that's clearly not the case.
[144] I mean, you just have to read the biographies of specific people who have, you know, dropped out of medical school in London to go fight for ISIS.
[145] And it's, they're under the sway of very specific ideas about jihad and martyrdom and just the way the universe is structured morally.
[146] You had a great guest on, I forget his name, but he himself had allowed himself to be recruited.
[147] Do you recall the guest I'm talking about?
[148] He had put himself in a position to be recruited numerous times in the pursuit of writing about it.
[149] And one of the things he said he found to be kind of unilaterally used as an approach is they're really preying upon your sense of guilt, right?
[150] That we all have this guilt, these things that we've done that are shameful,
[151] and that this is a path to redemption for our own personal shame.
[152] And that's a part of all the recruitment.
[153] And I thought, well, now we're getting close to something human I can comprehend, which is we do all as humans seem to have this perverse feeling that we need to repent.
[154] It just seems like a really easy thing to get people to succumb to.
[155] It's this, you know, the Christians are born with original sin, right?
[156] And this is a very easy concept for people to adopt because we all do feel shame
[157] and we feel guilt about not being perfect human beings.
[158] And we kind of carry around this luggage of it.
[159] And if someone offers you a way to not carry that around anymore, it can be appealing.
[160] Yeah.
[161] I mean, there's so much about human psychology that is not optimal.
[162] I mean, we have not evolved to be...
[163] We have a lot of bad vestigial shit, right?
[164] Yeah.
[165] It helped us at one time that now probably is hurting us.
[166] Yeah, or it helps us in certain contexts, but those are not the majority of the contexts in which we live.
[167] You have a strong capacity to fear other people or to worry about the sexual experience of your, you know, your mate or your child.
[168] And, you know, there are taboos around sexuality and, you know, it leverages jealousy and disgust in a way that religions canonize, right?
[169] So then you have whole cultures where, you know, you can have a phenomenon like honor killing.
[170] where, you know, so in the most perverse case, you have, you know, the phenomenon, again, this is, this is widespread across the globe where if a girl gets raped, the response of the men in her life is not compassion.
[171] It's a shame that is so overwhelming that they want to kill her for having brought this shame on the family.
[172] The only way to expunge this shame is to kill her.
[173] And, I mean, this is where political correctness on this idea takes you.
[174] And this is, I mean, this is the, this is the, to come back to where we started, this was the fight I got into with Ben Affleck because he just said, you know, that any criticism of Islam was, quote, gross and racist.
[175] I mean, that's the meme that got exported from that encounter.
[176] It's just, you know, I was clearly talking about ideas.
[177] And not only was I clearly talking about ideas, I was, I said, you know, I bracketed what I said with this claim that, you know, this is, it's very hard for people to make this distinction between ideas and people, right?
[178] Because it is, you know, Islam is a set of ideas.
[179] You can convert to it and away from it.
[180] It doesn't matter.
[181] The color of your skin is irrelevant.
[182] You know, Muslims are not a race.
[183] Yeah, you can choose to be a part of that group.
[184] Yeah.
[185] And so I made all of this clear.
[186] And then, you know, he exemplified all of the problems of having this conversation because he then just, you know, was unwilling to make this distinction between ideas and people.
[187] And I don't want to interrupt this with something trivial, but I'm going to, Monica's number one crush in her whole life has been Affleck.
[188] Longtime crush.
[189] Yeah.
[190] Now, what did you feel like when you were watching this?
[191] It's okay.
[192] I watched it.
[193] And I understand everyone's points.
[194] Yeah.
[195] But Monica, what's your background?
[196] Are you Indian or?
[197] Yeah.
[198] But Hindu, Christian, Muslim?
[199] Kind of nothing.
[200] But my parents are Hindu, technically.
[201] But I wasn't raised in any real religion.
[202] But when you were looking at Ben through rose-colored glasses, were you... As I always do.
[203] As you always do, were you weighting his side of the argument a little bit?
[204] Or did you even remember the feelings you were having?
[205] Like, wow, he's making kind of a logical point, but I do love him.
[206] No. Well, "but I do love him" at the end of every sentence.
[207] Right, right, as it should be.
[208] But I also, my first inclination is always to
[209] side on the more liberal, more people-based, more Ben Affleck side.
[210] Like, that's my first response.
[211] But I am also logical enough to see the difference.
[212] And I love your podcasts.
[213] I'm equally a fan.
[214] We talk about your podcast all the time.
[215] It's the foundation of our friendship.
[216] I can see the difference between ideas and people.
[217] But I also see that he probably felt fear.
[218] You were weaponizing the right.
[219] Fear.
[220] Exactly.
[221] It was based out of this fear that whatever you were going to say was going to help, you know, propel a group.
[222] Right.
[223] The actual racists who do hate Muslims.
[224] It's you, you, it's not your quote.
[225] I think it is Maajid, Maajid, who said that the left has it wrong on Islam and the right has it wrong on Muslims, which I think, like, cuts right through to my opinion of it.
[226] Yeah.
[227] Well, and just to be clear, you can be focused on the suffering of people, but you should pick the right people.
[228] So I'm very focused on the suffering of people in the Muslim community who are in hiding because of their ideas.
[229] They're the largest victims of all this to begin with, right?
[230] They're the largest victims of Muslim terrorism.
[231] I mean, if you hear that a suicide bombing happened somewhere, you can safely bet that it was Muslims for the most part who were blown
[232] up, right? Yeah, it's just not, it's not Americans getting blown up day after day. Well, I will say, um, when I've, uh, you know, vocalized similar opinions to yours, people are quick to say, well, the Christians are just as bad. And I will say, absolutely, and if I were at a dinner party during the Crusades, you would hear me talking incessantly about how crazy the Crusades are. I just, I don't happen to be at dinner parties during the Crusades. I mean, any one of these religions has had a period, you know, that has been as productive in that capacity.
[233] But it just so happens that currently there's a leader in it.
[234] Yeah.
[235] There are contexts in which dogmatic Christianity might be worse.
[236] I mean, the specific consequences of specific beliefs is what matters.
[237] And so it's, you know, Christianity has been the hegemonic religion, in that in our government, like it or not, church and state isn't truly separated,
[238] and we've had policies for 200 years now that are strongly motivated by Christian ideals that have had gigantic global impacts that I don't think you would deny.
[239] No, no, I wouldn't.
[240] Generally speaking, Christianity has been beaten into submission by modernity for the last 200 years.
[241] So you're not getting the same kind of theocratic overreach.
[242] But you can definitely find pockets of Christianity that look a lot more like the Taliban than, you know, even Bible-thumping Christianity in the U.S. does.
[243] Sure.
[244] And I mean, in a place like Uganda, you know, Christians are, you know, killing gay people and killing children who they think are witches.
[245] I mean, it is, it is the Middle Ages in certain...
[246] Yeah.
[247] But the crucial thing there is that all you need to do that as an otherwise good person is a sincere belief in the reality of witchcraft, right?
[248] Yeah, yeah.
[249] You simply, you don't understand something about medicine or about meteorology or whatever you think these witches were affecting.
[250] But a belief in witchcraft is all you need to do the crazy things that inquisitors and others...
[251] Because you're saving your community.
[252] You have ultimately some kind of altruistic goal that you can justify.
[253] Stay tuned for more Armchair Expert if you dare.
[254] What's up, guys?
[255] This is your girl Keke, and my podcast
[256] is back with a new season, and let me tell you, it's too good.
[257] And I'm diving into the brains of entertainment's best and brightest, okay?
[258] Every episode, I bring on a friend and have a real conversation.
[259] And I don't mean just friends.
[260] I mean the likes of Amy Poehler, Kel Mitchell, Vivica Fox, the list goes on.
[261] So follow, watch, and listen to Baby,
[262] This Is Keke Palmer, on the Wondery app or wherever you get your podcasts.
[263] We've all been there.
[264] Turning to the internet to self-diagnose our inexplicable pains, debilitating body aches, sudden fevers and strange rashes.
[265] Though our minds tend to spiral to worst-case scenarios, it's usually nothing, but for an unlucky few, these unsuspecting symptoms can start the clock ticking on a terrifying medical mystery.
[266] Like the unexplainable death of a retired firefighter, whose body was found at home by his son, except it looked like he had been cremated, or the time when an entire town started jumping from buildings and seeing tigers on their ceilings.
[267] Hey listeners, it's Mr. Ballin here, and I'm here to tell you about my podcast.
[268] It's called Mr. Ballin's Medical Mysteries.
[269] Each terrifying true story will be sure to keep you up at night.
[270] Follow Mr. Ballin's Medical Mysteries wherever you get your podcasts.
[271] Prime members can listen early and ad-free on Amazon Music.
[272] So those are all the ways I agree with you.
[273] Now, a couple of the ways that I feel like you don't give a ton of attention to is, just by weird coincidence, I happened to read this book recently about the CIA, you know, arming what became the Taliban, or all these Mujahideen people, against Russia.
[274] And then I also happened to watch like a four-part Frontline about the birth of the conflict between Iran and Saudi Arabia.
[275] I don't know if you caught that on Frontline.
[276] It was really fascinating.
[277] But it gave me this kind of historical context that I was missing, which is there really weren't these big swaths of
[278] fundamentalist Islamists prior to the West dealing with Iran over oil and the Shah.
[279] And we really did create this modern fundamentalism.
[280] It wasn't anything like what it resembles today in 1970.
[281] But, you know, we wanted to secure our oil.
[282] We had a puppet regime in there, right?
[283] And the Shah was overthrown in Iran.
[284] And then all of a sudden, the Wahhabis in Saudi
[285] Arabia started going, well, we need to overthrow the royal family. And then the royal family, which at that time, it was, for Middle Eastern standards, a pretty liberal country, right?
[286] They started making all these concessions to the Wahhabis.
[287] And that all really started from our meddling in that area.
[288] Would you concede to that?
[289] Yeah, well, I would concede that we certainly funded and enabled and collaborated with certain of these factions, and, you know, "blowback" is the phrase that is used to describe what happened as a result, specifically with Afghanistan and al-Qaeda.
[290] So yeah, that's all clearly documented.
[291] The reality, however, is that the, quote, fundamentalists have been there for 1,500 years, because, I mean, the fundamentals of the faith haven't changed since the 7th century.
[292] So it's there to be leveraged.
[293] But they weren't really mobilized.
[294] There was an imperial power that was controlling them, basically.
[295] And it became a very easy villain hero story to sell and perpetuate, right?
[296] So we have to take kind of responsibility for that.
[297] We made it really fertile for that to happen.
[298] But I mean, we had other things.
[299] No one should find it easy to globally defend the history of U.S. foreign policy across the board.
[300] I mean, there's no question we've done things that at least in hindsight look unconscionable.
[301] But you do have to keep in mind what a threat the spread of communism seemed, and in fact was.
[302] I mean, you can't be at all nostalgic about, you know, what reality was in the Soviet Union for people living under communism.
[303] And you certainly can't feel that it's a bad thing that communism hasn't spread over the whole globe, given what that reality was.
[304] So we were in...
[305] Yeah, now we have the advantage of hindsight going that we probably didn't need to intervene militarily.
[306] Those systems collapsed themselves.
[307] Right.
[308] But yes, but now...
[309] There was no way to know that.
[310] There was no way to know that.
[311] But it is frustrating when you watch any show on the history channel.
[312] It's basically like this huge militaristic endeavor over something that, like, would have worked itself out.
[313] Yeah.
[314] Again, I'm not clear how much...
[315] In each case, whether the...
[316] conflict was necessary or not.
[317] I mean, I think you could probably argue that the Soviet defeat in Afghanistan was part of what caused this collapse.
[318] Yeah.
[319] And just the fact that we were outspending them militarily to the degree that we were and they were trying to keep up with us and the, you know, the kind of the bankrupting, the bankrupting effects of that.
[320] Yeah, it's, it's, it's ugly.
[321] You know, I mean, the details are ugly and I'm sure we don't know all the details.
[322] And, you know, it's an interesting problem that the more and more we understand the consequences of our warmaking, right?
[323] The more evidence of collateral damage we see may make it impossible for us to fight not only unnecessary wars, but necessary wars.
[324] So as you know, I'm not one of Trump's many fans.
[325] Oh, we've heard.
[326] So I won't go on a rant about Trump, but, you know, it's, it's conceivable.
[327] I don't know that we're in that situation, but it's definitely conceivable that having someone who incredibly advertises his, his callousness and his ignorance, such that our adversaries recognize that they may not be dealing with a rational actor...
Because what we're finding ourselves encountering a lot, I mean, this happened a lot with jihadists across the board: we have adversaries who consciously leverage our own ethical scruples against us. You know, I mean, the starkest example for me here, and it's something that you never hear talked about very much, but it is morally and politically one of the more interesting and consequential asymmetries you'll ever find: you have the situation where there are people who use human shields, and then there are people who have to figure out how to respond to the people who are using human shields.
[329] So, you take it, this happens with the Israelis and the Palestinians from time to time, and it's happened with us in our engagements in Iraq and Afghanistan. And it is such an amazing thing when you think about it, because it is one group knowing that the other group is more civilized, more scrupulous, more concerned about the deaths of non-combatants that aren't part of their group.
[330] I mean, we're not talking about human shields, you know, like in The Road Warrior, where they, you know, they captured some of your, you know, people and you strapped them to the cars.
[331] And, I mean, you have people who literally will put their, you know, rifles over the shoulders of children and shoot at our troops.
[332] And we, and our troops will be unable to shoot back in the normal way, and be deterred to whatever degree. But can I, I just want to own how I can relate to that bizarre thought process, which is: I will be in my car, let's say I'm texting and driving, and I get pulled over by the cop. I've just established that I'm willing to break a rule. I just did it, I openly break the rules of society. Then I get pulled over, and then the cop I interact with, I'm assuming he will act in complete accordance with the law, to the letter of it.
[333] Like I could push in any way and I know he has to stay in this box, right?
[334] But it is very ironic because I just proved I personally don't give a shit about staying in the box, but yet I'm expecting this person to stay in the box and not pull me out of the car.
[335] You know, it is, it is interesting that you can both be acknowledging the frailty of the laws and yet expecting the other person to adhere to them.
[336] Yeah, although in that case, it seems like you're engaged in...
[337] First, you've made a probabilistic calculation that you're probably not going to get caught, right?
[338] And so it's like it's not, you're not, you're not breaking the rules with the certainty that you're going to get caught.
[339] But again, these are, I'm not, I'm not focused so much on rules here.
[340] I'm focused on the, the underlying ethics and, and compassion and concern about human well-being.
[341] I guess I, I just know what it's like to have an expectation that the other side is going to behave the way they're supposed to.
[342] There is an utterly cynical, fully conscious, and very systematic and well-informed program to use liberal values against liberalism.
[343] Right.
[344] And this happens, I mean, there are far more benign versions of this, but it is a kind of asymmetric warfare: liberalism is always vulnerable to its own scruples being used against it as weapons.
[345] But wouldn't you agree, though, that that's just a price we have to pay to live the way we do?
[346] Right.
[347] So, yeah, we both agree that that's, unfortunately, it's not unlike our legal system where it's like, yeah, we got to live with the fact that some guilty people are going to get off to ensure we don't put anyone innocent away.
[348] That's just, we got to make that agreement.
[349] Except for the corner condition we started talking about where you could imagine a situation where we have a necessary war we have to fight.
[350] You know, the next Hitler emerges somewhere, right?
[351] And yet we find ourselves civilizationally incapable of doing what would actually be required to win that war.
[352] Yeah, like, how much collateral damage could we accept in our warmaking, given that we can actually see what that looks like?
[353] Right.
[354] I think we could, you know, we certainly couldn't fight World War II the way we fought it and probably shouldn't.
[355] I mean, there were many things we did that probably didn't materially
[356] affect the outcome of the war, that cost, you know, hundreds of thousands of lives that didn't have to be lost. But so, I mean, it's all, it's all to the good until conceivably it isn't, and we'll never necessarily know in advance when we cross that line. Your bigger fear is that we have established a society and a culture right now where we can't even really evaluate when that time may be, right? That we have so much policing of any kind of ideas that might be offensive or threatening that we won't even, if that is ever required of us, we no longer have the system by which we could bring up some very provocative idea without being labeled something and then excommunicated.
[357] Well, this is a concern I have just across the board on topics that are far less fraught than this one.
[358] I mean, talking about anything of social importance now seems to carry with it not only conceivable, but very likely, reputational risk that you're going to get smeared as some kind of untouchable, you know. And, I mean, if you're talking about race or gender or police violence, virtually anything you see that blows up as a controversy on social media is the result of somebody having said something which in context might have been totally reasonable, or at least not, you know, beyond the pale.
[359] Or if it was beyond the pale, something for which people should be allowed to apologize, right?
[360] They should be able to recognize, oh, you know, that came out wrong.
[361] You know, I'm not that big of an asshole.
[362] But there's just, again, it's amazing the degree to which this is all being manufactured by social media.
[363] It's a technologically driven phenomenon.
[364] Yeah.
[365] Because I'm not encountering this in the real world.
[366] No, it is interesting.
[367] The example I've been giving a lot lately is, you know, Obama 10 years ago was openly against gay marriage and, in fact, was supporting policy to prevent that from happening.
[368] We, thank goodness, we've not labeled him a homophobe for the rest of his life.
[369] We allowed for him to change his mind and then get on the right side.
[370] And I just, I don't see the latitude for people to even get on the right side sometimes.
[371] Yeah.
[372] Or if you're, we have a similar fear.
[373] And that, yeah, that was one of the things I wanted to talk to you about,
[374] is this kind of sensitivity on college campuses. Which, traditionally, if you believe in the ideals of the Enlightenment, and you believe that a lot of our progress has come from people making kind of provocative claims: Darwin suggesting, you know, Eve was not made from the rib of Adam, at that time, was very threatening and offensive to a whole bunch of people, or the fact that we actually revolve around the sun and not the opposite.
[375] That got Galileo put under house arrest by the Pope.
[376] So those examples we can look at and quite clearly say, oh, no, you need to always have a safe place to present something.
[377] And there is, I guess, and I don't know if I'm overreacting and overly fearful of this, or it is happening enough that it needs to be addressed.
[378] I fear that students now seem to have this idea that they have some kind of constitutional right to not be offended or have their feelings hurt.
[379] And it just seems very counterproductive to intellectual discourse and throwing a lot of things at the wall, finding out some are wrong, they don't hold up.
[380] We need to throw those bad ideas up there, don't we, to even find out that they don't hold up.
[381] Oh, absolutely.
[382] The model for me is really a philosophy seminar.
[383] In a philosophy seminar, when you're talking about the foundations of morality, for instance, you need to be able to think out loud about why something may or may not be wrong.
[384] And so, you know, it is completely fair game.
[385] In fact, expected.
[386] It's just totally routine for someone in a philosophy seminar on a college campus to say, well, okay, so what exactly would be wrong about killing everyone in their sleep tonight?
[387] Right.
[388] So like painlessly, right?
[389] So, first of all, there'd be no pain.
[390] You're not imposing suffering on anyone.
[391] Yeah.
[392] And there'd be no one around to be bereaved afterwards.
[393] There'd be no sadness because everyone would be dead.
[394] So if the wrongness of an act relates to human suffering on some level, what's wrong with killing everyone in their sleep?
[395] Now, that is a perfect starting point for a conversation about the foundations of morality.
[396] Yeah.
[397] And you have to be able to say that.
[398] And people have to be able to, based on a principle of charity, recognize that a perfectly good, non-murderous, non-psychopathic person in the right conditions would be tempted to float that very idea.
[399] And free to evaluate it.
[400] Yeah, but imagine what would be done to that person on social media if, you know, in a political context or a pseudo-journalistic context, that comment were excerpted.
[401] Well.
[402] And people said, well, this schmuck can't even figure out what's wrong with killing all of humanity in their sleep.
[403] Right, right.
[404] Right, right. You know, or the Jonathan Haidt example of: you go on vacation with your sister and you guys sleep together, and there's no pregnancy involved, and no one ever knows. Is that amoral? Like, just going down that road, you're a dangerous pervert for the rest of your life. Yes, if a famous film director proposed that question on Instagram, they would lose the franchise they're directing. Right. Yeah, yeah, it's, it's very dangerous. There's a really common debate happening currently, I hear it a lot on here and on TV and social media, which is this notion of separating the art from the artist. And the thing I thought of today that I want to kind of compare it to, and ask what your opinion is, is what we would never ask ourselves, you know: should we be able to separate the science from the scientist? So if we found out that Pythagoras was a confirmed pedophile, right? No one would ever suggest... It seems, it seems a good chance that he was. Well, that's true, he's in Greece, he's in ancient Greece, different... So let's just say he was.
[405] He was a serial rapist of children.
[406] No one in the world would say, well, we should stop using the Pythagorean theorem because it would be so blatantly obvious that this is this one thing the person actually did contribute to society and benefits all of us.
[407] And we can use it to great ends.
[408] And so no one would even consider that, nor would they with Galileo or anyone else, right, if we found out something terrible about them.
[409] So it seems so blatantly obvious that we should never throw out the science.
[410] because the scientist is an asshole.
[411] Right.
[412] So my deduction from that is, if we are saying we should throw out the art because the artist was terrible, what we're really saying is that art isn't as valuable to human history as science is.
[413] Well, I think there's one more thing that could justify the double standard there, or the seeming double standard, which is that with art, the value is the wisdom and creativity and insight of the artist being successfully transferred in that medium.
[414] So it's like, I think it's also, it's interesting to consider how your feelings about a work of art change if you are told that actually a person didn't even create this.
[415] This was just an AI algorithm that just randomly, you know, sorted through data space and came up with this beautiful painting of a little girl.
[416] So you had this experience.
[417] You're looking at this painting.
[418] Imagine, you know, let's take a famous painting like, you know, Christina's World, you know, the Andrew Wyeth painting that most people will have seen.
[419] I'll pretend I know what you're talking about. Continue.
[420] It's a beautiful, beautiful piece.
[421] Everyone Google this.
[422] I have a lithograph of it.
[423] So it's, but this is a painting that has been much interpreted.
[424] Seems to, I mean, there are kind of many layers of what, of meaning that it is successfully transmitting, right?
[425] So, for instance, the, it's this girl in a field looking back at a farmhouse.
[426] Oh, looking back at the farmhouse.
[427] I know this, yeah.
[428] And, like in a nightgown.
[429] Yeah.
[430] She's like in a dress or, yeah, but so.
[431] Well, I see a nightgown and you see a dress.
[432] Yeah, yeah, yeah.
[433] That's right.
[434] We're already.
[435] Yes, yes, we're at the, it's already the Tower of Babel.
[436] So.
[437] A very sexy negligee.
[438] Yeah.
[439] So she's in a negligee.
[440] She's in a field of marijuana.
[441] But so, but she's far, she's far from the house and there's something kind of, you know, lonely and sad about it.
[442] It's, people look at it.
[443] It's beautiful, but it's sort of, there's something a little dark about it.
[444] And then there's something about the way she's painted,
[445] where I think it's plausible to say that she may be a paraplegic, so that, like, she can't actually get back to the farmhouse, or the farmhouse is, you know, further away than it would be for someone who's not compromised in that way. And so, I'm sure there are many layers of interpretation that I don't know. But you just, it sounds like you just transfer whatever your fears are into this painting. But, but just imagine if we found out that this was just computer generated and there was nothing in the mind of the artist, because there was no artist, right?
[446] That changes our sense of, of its value on some levels.
[447] Like, okay, this is just, all right, this is just a kind of an accident of, you know, ones and zeros.
[448] Right.
[449] And I mean, it's not to say that you couldn't, at some point in the future, create an artificial intelligence that would be a real artist, you know, could even be a conscious artist.
[450] But let's assume that we don't have that now.
[451] And this was just, you know, just pure blind algorithm.
[452] Control, alt, 796, enter.
[453] And spit out.
[454] And we got Christina's World.
[455] Yeah.
[456] So that, that does deflate the whole thing.
[457] And there's something similar here where if you find out that the artist was a, you know, a serial killer, you know, like you're admiring one of the clown paintings of, you know, John Wayne Gacy.
[458] Yeah.
[459] That does change your feeling about it.
[460] And there's not that kind of residue with the Pythagorean theorem.
[461] But it's just that it either works or it doesn't.
[462] Okay.
[463] So there's two things.
[464] One is you would make the choice to let that affect your feeling of it.
[465] You would be retroactively changing what your impression of the painting was.
[466] You know, you decide.
[467] But I liked The Cosby Show and now it's not funny to me. Right.
[468] Well, it's not funny.
[469] At least it's, it has another element to it because in that case, I mean, now we're not, we're not even talking about just a work of art. We're talking about seeing the person.
[470] That's a bad example.
[471] I just, I shit the bed on that example.
[472] But let me say, let's take it one at a time.
[473] Do you think that Copernicus's contribution to humankind is more or less or equivalent to Shakespeare's?
[474] Well, it's certainly in the same ballpark.
[475] Yeah.
[476] And again, if we found out that Shakespeare was this horrific guy who was anti-Semitic or something, would we be making the right decision to rob all of us of
[477] his work and all the happiness and joy and flourishing that is derived from it, because the source was tainted?
[478] That to me seems logically lazy or something.
[479] It seems like, you know, maybe it bumps for you for a second, but you need to work through that.
[480] Well, you can sort of, again, there's no bright line here, but if you make the art good enough and you make the moral infraction of the artist trivial enough, well, then it becomes a non-issue.
[481] It's when those two curves begin to cross and the art wasn't that great anyway and this person is a monster on the scale of Hitler.
[482] Well, then it's trivial.
[483] Then we know that we don't have to pay attention to this guy anymore.
[484] But the interesting territory is where, you know, the art is great or the contribution to human knowledge is great.
[485] And the guy by any moral standard of today belonged in jail.
[486] Yes, yes, that, you know, that's interesting, and it's just, there's no clear answer, you just have to... Well, that's what I'm finding. It's like, um, and again, I think it's one of our failings, and you would know neurologically more than I do about why we're so drawn to it, but we're so drawn to a black and white, yes or no, we have a definition, and we're so opposed to evaluating everything as it comes. It's so much more laborious to do that. But like, yes, as we go through these examples, I'll bring up Michael Jackson. You know, I think, on any given day, in a world of six billion people, probably several hundred thousand people's day was made by hearing "Beat It" on the radio this morning around the globe, right? And it just seems like a weird proposition to me that because he was likely a terrible pedophile, those 700,000 people should not have a great day.
[487] It just, and so it's interesting.
[488] You give that example and people generally go, no, I don't think we should take Michael Jackson off the radio.
[489] I don't think many people would agree to that.
[490] Many people would say, well, we should take Cosby off TV, which I have no dog in that fight.
[491] I didn't watch the show to begin with.
[492] And it is weird to look at the person, I suppose, that has done these heinous things.
[493] But it is all dicey, and it almost requires a, like, per-incident evaluation, but we have now aligned ourselves so firmly in left, right, all these binary compartments, that it seems less and less possible for us to do that.
[494] Do you think that's a problem?
[495] Well, it is a bandwidth or a cognitive overhead problem.
[496] And the reason why we need a sort of moral bureaucracy much of the time is that it just, we don't have the time or the attention span to figure out everything as though it were the first time we were considering this problem in human history.
[497] So you just sort of, you need to find bright lines where it's, okay, well, this is the bin we put everything of that type in and we don't think about it.
[498] But in this space, it is, I don't see how we would generate those criteria.
[499] It really is a case -by -case basis of necessity.
[500] And it's also just not even a, it's not the government doing this.
[501] This is not a policy that has to be, you know, global or, or federal or even local.
[502] This is, you're talking about what does Disney do with its archive when it finds out that one of its stars did something heinous, right?
[503] Right.
[504] And that just seems like a judgment call, which now, you know, unfortunately, is being driven by an amazingly thin-skinned concern about what's happening on Twitter in the next 24 hours.
[505] Yeah.
[506] Yeah, it appears to me, without any data to back this up, that it's all, everything's being steered by the craziest 5% on the left and the craziest 5% on the right.
[507] They're making the loudest headlines, and then that is what people are reacting to.
[508] But just in this case of separating art from the artists, it appears that there's only two camps to be in.
[509] You're either in favor of separating the art from the artist, or you're saying, no, there is no separation.
[510] I'm against it.
[511] Like, those appear to be the options out there for people.
[512] If you're 18 and you're defining your identity, I feel like that's a camp you have to pick.
[513] Yeah, well, I just don't pick because it's just there's a third option.
[514] Okay.
[515] The third option is give me more of the details.
[516] Like how much do I like the art?
[517] Right, right.
[518] And how bad was this person and how sure are we that there's not another side to the story?
[519] You know, so take Picasso as an example.
[520] You know, I have a vague sense.
[521] I don't think I've ever read a biography on Picasso.
[522] And so I consider myself fairly uninformed about the details of his life, but I have a vague sense that he was, like, bad to young women.
[523] An incredibly boorish paramour of all the women who got involved with him.
[524] And, you know, he's just, he's somebody who, you know, by any modern standard, I'm sure was a total schmuck with respect to his relationships with women.
[525] And also just a colossal narcissist.
[526] I mean, like, that was probably of a piece with how he treated other people.
[527] So not someone who I would expect to meet and think, well, he's really a great guy.
[528] Like, you know, I really like this guy.
[529] I'd go camping with him, yeah.
[530] But, you know, his art obviously is, I mean, he defined, you know, more than a generation of the visual arts.
[531] And the idea that we would, I'm not sure what we would have to discover about Picasso that would make it remotely tempting to say, all right, enough is enough.
[532] We've got to close the Picasso Museum.
[533] Yeah.
[534] We've got to just purge, you know, every...
[535] But isn't there a potential hypocrisy there, which is basically, if you're a shitty artist, you better behave very well.
[536] If you're a genius artist, then, you know, we can accept it.
[537] That does just morally feel a little hypocritical, right?
[538] Well, no, but part of this is the advantage that Picasso now gets
[539] of, you know, being dead and having done his work in another time.
[540] So today, the Picasso today, whoever that is, if he's caught with a, you know, an underage girl who he's beaten up, you know, there's no question.
[541] That's a career wrecking phenomenon.
[542] Yeah.
[543] And it should be.
[544] Because we're at a standard now.
[545] The time when these people lived is relevant.
[546] Like whatever he did then, which I think he was terrible, he was terrible,
[547] but in regard to the rest of the world at that time, he probably was sort of normal.
[548] That's probably how most people conducted themselves.
[549] Or maybe like a standard deviation above.
[550] Yeah, exactly.
[551] But we have evolved.
[552] So you have, we have to apply that to art as well and science probably.
[553] Yeah.
[554] I mean, in honesty, I wish I had about 11 hours with you because there's so many things I, let's just, I have a tendency to not say the things that should go without saying.
[555] It's a bad habit of mine.
[556] I think you have a similar bad habit.
[557] Like I'll start with the obvious.
[558] I absolutely adore you.
[559] I think you're brilliant.
[560] I enjoy so many of your arguments.
[561] But I've now, I've consumed several hundred hours of you.
[562] Right.
[563] I don't think anyone has this level of access to another person unless they're that person's therapist.
[564] Yeah.
[565] The truth is my wife doesn't listen to my podcast.
[566] You may know more about what I think on certain topics than she does.
[567] Yeah, but, along the way, and let me also say, you know, you're very polarizing, as you are well aware, and I have defended you publicly at different times, and then I've myself gotten ensnared.
[568] Yeah, I've gotten ensnared a little bit.
[569] And the thing that it frustrates me about that is I also think we are living in a paradigm where for me to like you, I'm condoning every single thing you say, which I totally reject.
[570] I maybe agree with 30% of the things my wife says, and I fucking married
[571] her. Like, I have devoted my life to her, and I don't agree with most of what she says. Right. Well, I hope to, I hope to beat 30%. Oh, you're way above 30%. But there are things that we differ on, as you would expect to differ. I mean, I, I don't know how anyone would just, you know, agree across the board with someone else. It seems very improbable. Well, let's see if we can take my foot out of your mouth and you can tell me the things you disagree about, and so, spare, we can spare you. Well, but even before we do that, I just want to, I just want to point out this thing that I just want to reject very publicly, which is like, I love Talib Kweli.
[572] You guys have had your issues.
[573] There's parts of Talib's thing I totally love and agree with.
[574] There's other parts I think he's out to lunch on, and that's fine with me. Well, let me just close a loop on that, because, again, this is a pure confection of social media.
[575] I had no idea who he was.
[576] Oh, okay, okay.
[577] And he just started tweeting at me, attacking me for things I didn't believe.
[578] if I recall correctly.
[579] And at a certain point, I said, listen, you have no idea who I am or what I think.
[580] So you're like, what are you going on about?
[581] And he just kept doubling down.
[582] And I think Maajid got involved there too.
[583] I think he was attacking.
[584] I think that's how it started actually.
[585] He was, both Maajid and I were getting attacked by him.
[586] So that was just, that was just what he was doing that day on Twitter.
[587] Sure.
[588] I have no, I have, I still don't know who he is really or what he thinks.
[589] And I have no desire to rehash that whole thing again, because I basically was taking your position when I had him on.
[590] He was on our podcast.
[591] Yeah.
[592] And he was and he was awesome.
[593] And then I found myself kind of having to take your position, which I'm not as smart as you.
[594] So I doubt I executed the way you could have.
[595] But it did force me into like, you know, defending all of your thoughts, which was just an interesting experiment in and of itself.
[596] But I do think it's, we do, again, I really buck up against this binary thing where on social media, if I vocally state that I like you, that now, that's an invitation for someone to find the most arcane thing that you've ever said and then tell me I need to defend it, or I am somehow this person because I like you. Yeah. And I think we've kind of lost this, a little bit of gray area, where it's like, you can totally like people and also disagree with parts of what they're saying. Oh, yeah. Do you find this at all? Like, I assume when you have someone like Jordan Peterson on, and I have to imagine there are a lot of points Jordan makes that you agree with. Yeah.
[597] So we've had now four public debates.
[598] I mean, they're not billed as debates, but they definitely have the character of debates, at least for part of the time on stage.
[599] And so there's a lot we agree about and there's a lot we disagree about.
[600] And it's just that there's a value in having a conversation with someone who you're not totally aligned with.
[601] And especially if it's a good faith conversation where this person is engaged in the same effort you are to get at the truth and to vet their own ideas, you know.
[602] And so, you know, those conversations have been very useful and fun.
[603] But the clearest example I've noticed of late of the kind of thing you're talking about.
[604] Did you follow this thing on Twitter with, with Mark Duplass?
[605] I'm friends with Mark.
[606] I have not heard about what happened.
[607] Oh, this was big on Twitter.
[608] Oh, geez.
[609] Call Mark and ask him what happened.
[610] So I've never met Mark, but he.
[611] He's a lovely guy.
[612] Let me just start with saying that.
[613] Yeah, he seems fantastic.
[614] And he seems, you know... He, I admire his work, but he went out on Twitter and he said, you know, I don't have it verbatim, but the spirit of it was, you know, for all of my liberal friends, if you want to hear a smart conservative who will challenge a lot of your ideas, follow Ben Shapiro.
[615] Oh, okay.
[616] And then, you know, the liberal mob went into Ben Shapiro's, you know, tweet record.
[617] Oh, geez.
[618] And pulled out the worst things he'd ever tweeted.
[619] And Ben has got some whoppers, you know, which he, he hasn't deleted.
[620] He hasn't deleted.
[621] He didn't delete them because he doesn't want to call...
[622] It's like an admission of being that old person.
[623] He didn't want to call more attention to them by deleting them, or make it seem nefarious that he was deleting them.
[624] But he has responded to, I mean, he's recanted them and he's, he's gone through every permutation of apologizing and contextualizing, you know, defending, as the case may be, his past tweets and utterances.
[625] But he's got quite a paper trail.
[626] And so Mark got just savaged by his own audience.
[627] And in, I don't know if it was 24 hours later, but in some very quick time frame, he completely caved in and issued a public apology to his audience.
[628] And it was just, it was the worst example of the mob winning.
[629] I mean, it was completely unnecessary because it should just be obvious that he doesn't agree with everything.
[630] I mean, you know, I disagree with probably 80% of what Ben believes on any given topic.
[631] Ben's an Orthodox Jew, you know, like we don't agree about much.
[632] He's a genuine conservative.
[633] But there should be no reputational cost to saying, hey, this guy, you know, he's on the other team.
[634] Like, I'm not aligned with this guy, but he's interesting.
[635] You owe it to yourself to figure out the best version of your disagreement with this guy because he's smart, right?
[636] Yes.
[637] That should be totally valid in Mark's world.
[638] And what his fan base, or the segment... and this is another problem with social media: who knows who he was actually hearing from?
[639] I mean, in his feed, he felt like he was...
[640] There was a consensus.
[641] It was a tsunami of consensus saying, you have just destroyed your career.
[642] Yes.
[643] And yet, what was it?
[644] Was it, you know, 1,500 people telling him this?
[645] I don't, yeah, it's, and it's the, it's the most vocal, most...
[646] And don't you think that's uniquely on the left where we can't wait to eat one another?
[647] Like, it's like we're so excited to eat ourselves.
[648] Yeah, yeah.
[649] This is an asymmetry that I've long bemoaned, which is that on the left, the left does something that the right never does.
[650] It eats its own in a way that is totally, politically speaking, dysfunctional.
[651] And, I mean, you could see it in the juxtaposition of what happened to Al Franken and what didn't happen to Roy Moore.
[652] So, Al Franken, I mean, it's like, whatever, I'm not sure what should have happened to Al Franken.
[653] I mean, it seemed, but the juxtaposition with Roy Moore, who was not even bothering to deny that he as a 30-year-old man was sleeping with 14-year-old girls, right?
[654] He was paying no penalty at all in Republican circles.
[655] Now, I'm not saying that the left should be more like that, but clearly this is, this is an asymmetry that the left will suffer under in any head-to-head contest against the right.
[656] Stay tuned for more Armchair Expert, if you dare.
[657] So this is another thought I wanted to ask you, which is you've debated a bunch of people, and maybe they're not framed as debates or they become debates or whatnot.
[658] I got to see you live, talking with Steven Pinker.
[659] Yeah, that, hilariously, we were having sound problems in that theater and you were adjusting the monitors.
[660] And so I can't tell you how many people were amazed that, you know, celebrities are coming forward to adjust the equipment on stage.
[661] Well, you're clearly getting feedback from the monitors that were pointed out.
[662] And I just couldn't sit there and do nothing.
[663] And I was like, this is so arrogant of me to get up and start adjusting these speakers, but someone has got to do it.
[664] If brave men, what is the saying?
[665] Find a famous person who can climb on stage and adjust the sound equipment.
[666] Anyway, it was much appreciated, but it was hilarious.
[667] But the one thing I'm curious about, because I will sometimes, I'll hear episodes of yours and I'll think you fared one way or another.
[668] Sometimes I think you kind of decimate people.
[669] Sometimes I think people make better points on certain topics.
But when I'm talking to fellow Sam Harris heads in my social circus, circle, that's a great Freudian slip, you've, at least in my circle, kind of reached deity status, where some people are just going to blindly go with you, you know, regardless of where you end up.
[671] Some people are just there.
[672] They're going to be there.
And I think, as an anthropologist, I'm not one because I don't practice, but as someone who studied anthropology, there is this unavoidable attraction we have to the alpha, or whoever we establish as alpha.
[674] and many people can occupy that position.
[675] But once you've kind of said that this is maybe one of my alphas, like my intellectual alpha, whatever it is, we do have a real capacity to kind of just go with people.
[676] Would you agree?
[677] I mean, it's part of what perpetuates religion and all these different things.
Except, honestly, this may sound self-serving, but I feel like I have trained my audience to be incredibly fickle; they hold me to a standard that I notice many other people's audiences don't hold them to.
And this asymmetry was brought out really clearly in my public conversations with Jordan Peterson, because there it was very much a kind of team Jordan, team Sam vibe.
And so you could hear, in various moments, something one of us would say would segment out the audience, right?
But I have an audience that is holding me to a standard of reasonableness and consistency and openness and honesty that I've trained them to hold me to.
[683] I mean, like, intellectual honesty, if I have a brand, it's that.
[684] Like, if I'm wrong, I need to admit it fast, right?
Or if I appear wrong, I need to notice that and then figure out how this misperception is occurring and cancel it. Whereas someone like Jordan, amazingly, is functioning in a kind of quasi-cultic, quasi-religious mode, and I'm not saying he's necessarily consciously engineering this, but he can just change his tone in a way which will get a thousand people in the audience to applaud.
[686] And it has a kind of revivalist thing happening.
[687] Yeah.
And it's my audience, and this is something I love about my job, frankly: I am hearing from people who are holding me to a standard of coherence and consistency and reasonableness that I have just endlessly advertised that I want to be held to and I want to hold myself to.
So I have an amazing error-correcting mechanism.
You know, so when I get something wrong, I mean, this is a real difference between me and my many public critics, and I would even say enemies in certain cases.
I can probably count on one or two hands the people who have made kind of a cottage industry around misrepresenting my views as publicly as possible.
I mean, there are people who will cut together audio from my podcast, making it sound like I'm saying the opposite of what I was actually saying in context, and release that as though, look, look at this racist bastard.
[693] But then someone like Glenn Greenwald will tweet that out to his million followers now.
[694] And so Glenn and I have gone back and forth fighting.
[695] But the difference between our audience is that he never apologizes for an error, right?
He can, consciously or not, retweet something that's completely spurious about me, and he'll hear in his @ mentions from thousands of people saying, this is the error you've made.
[697] He will never apologize, right?
If I get the slightest thing wrong about somebody like Glenn Greenwald, the slightest thing, it could be a hair's breadth between what I said he thinks and what he actually thinks, my fans notice and say, oh, no, you have to apologize for that, right?
[701] Right.
[702] And that's fine because I think actually, you know, misrepresenting somebody, somebody even an enemy is not fair game at all.
[703] This is what I was going to commend you on is that one of the ways I think you have a tremendous amount of integrity is that you never make straw man arguments.
[704] You don't misrepresent the other person's point to help perpetuate your own point.
[705] You do a very thorough job of understanding the other person's point of view and you have a commitment to be able to make their argument to them as well as they would be able to make it.
[706] I mean, I think that's just, forget about the ethics of it.
[707] That's just good rhetoric.
[708] That's just good argumentation.
[709] If you want to persuade somebody of anything, you should take the best version of the position you're arguing against.
And it's just incredibly helpful in a public debate to be able to restate the other person's position in a way that they sign off on.
[712] So you're not talking past each other at that point.
[713] Like, yeah, now I know what we're.
And it also allows you to then go forward, which is where you're hoping to go.
[715] But what I will say is that what draws me to science in general is this notion of continual betterment, that ultimately everything's peer reviewed and no one's perfect.
And just through this process of criticism, it gets better and better and better.
And that's what science has over the Bible, which is that the Bible was just written.
[718] There is not going to be any updates.
[719] And there's no new chapters.
[720] No one's going to account for any of the things we learn.
[721] And so I just think anything you would hope to believe in or look to for answers should be something that implicitly is designed to evolve and continue to take on new information, right?
[722] And react to that.
[723] Yeah.
So with that said, I've not heard it. You probably correct yourself.
[726] I'm curious, are there any of these debates you can think of that you've lost?
[727] That's what I want to know.
There are certainly debates I've had where the person I was debating credibly felt that he won, for his audience.
This is often the mismatch between the audiences.
But have you ever left one of these and gone, fuck me, Stephen pointed out something really kind of profound?
[731] I need to reorient myself on this.
[732] Like, I'm going to give an example.
So when I saw you and Stephen Pinker start debating AI, right.
And you have a pretty outspoken fear of where AI could potentially go.
[735] Yeah.
[736] Stephen does not seem to have that fear.
[737] No, he doesn't.
[738] Yeah.
And you guys went at it a bit, and you were both ironically quoting some of the same authors, but they were saying different things to each of you, which is very human.
[740] Yeah.
But when he pointed out, in my opinion, that you're implanting into any of these AIs the fundamental thing that they don't have and that all organisms do have, which is simply that all organisms innately are competing for space and against one another.
And that in itself is something so unique to an organism that we would have to so thoroughly design it into a machine that it just seems crazy we would take the time to do that, because it would ensure our eventual annihilation.
[743] And when he said that and that connected for me, I was like, yeah, that's right.
More than half of our hard wiring and programming is for us to have the highest fecundity rate.
[745] And that's just a weird hiccup of being an organism that a machine doesn't have.
[746] At any rate, when you were debating him on that, did you take that point and go like, oh, yeah, that's something to think about?
[747] No, well, that actually seems like a red herring to me. I don't remember that moment in that conversation, but my general feeling about Steve on this point is that he's not, for one reason or another, actually engaging the best points on the other side.
[748] I mean, he has a kind of straw man version.
[749] Certainly in his book, it's an utter straw man version of the concern.
I mean, the people who are concerned about this are not concerned about Terminator-style robots suddenly becoming malicious, like where you have an AI that was benign, but then it forms a hatred of humanity.
And there are several people who are arch-skeptics about this concern.
For them, first of all, not only is it crazy to think we could ever build AI that wouldn't be aligned with our interests, we may never build a general AI at all.
The whole thing might be a pipe dream, right?
[754] Sure, sure.
They're skeptical about just the fundamental phenomenon of building this technology.
[756] I don't think those people are seriously engaging with the real arguments.
And so, yeah, that wasn't a moment where I felt like, oh.
But had it been, I mean, there are many moments like that when I'm in dialogue with somebody.
[759] Because I want you to be like science, you know.
[760] Yeah.
[761] I want to see you evolve.
[762] I want to see you occasionally come on and admit that you were wrong.
Like in that same live event, you were asked, what is something profound you've reversed your position on recently, and the one you could think of was the death penalty, which I too have had my position reversed on, but I thought that was maybe an easy one.
It was a clear one and it was a big one. But I think I also said this recently on my podcast, because it's a more recent one, in response to that very question. So I had Niall Ferguson on the podcast. We were talking about Trump, and Niall is a historian and the husband of Ayaan Hirsi Ali.
[764] Oh, okay.
[765] So they're quite the power couple.
The Kristen Bell and Dax Shepard of social justice.
[767] Of anti -theocracy and history in this case.
[768] But Neil's a powerhouse in his own right, and he's a historian.
He's now got an appointment at the Hoover Institution.
[770] I guess he's kind of a libertarian.
[771] He's not a true conservative, but he's, you know, kind of, I guess, grouped right of center often on many of these questions.
[772] But he's certainly not a social conservative.
But he's not a Trump supporter; he's just kind of stayed out of the fray on this topic to some degree, certainly more than I have.
And so when I got into my litany of reasons to think Trump was just, you know, the orange nightmare that we need to find some way to expunge, he said, well, have you ever thought about the counterfactual situation? Just, you know, what if Hillary had won?
[775] What would the world look like if Hillary were president?
[776] And because everything you're saying about how bad it is that Trump is president seems to be predicated on your having formed a clear conception of what the world would be like if Hillary were president and how much better a world that would be.
[777] Right.
[778] And he had many reasons to think that the world would be perhaps in some ways worse, right?
And one of his reasons, which is a somewhat paranoid and cynical but possibly true reason, is that he thought that if Trump had lost the election to Hillary, the response of the Trump voters would have been something close to an actual insurrection.
[780] I mean, like we would have had a real problem on our hands.
Because, you know, Trump would have said this was a fake result.
[782] Yeah, rigged election.
[783] It was totally, I mean, he was clearly preparing the ground for that response.
[784] And who knows what chaos would have ensued?
And in Niall's view, which, you know, I'm not sure I agree with, the election of Trump was kind of lancing a boil that really had to be lanced.
Like, this was a powder keg.
And had Hillary been elected, it would have exploded.
And so what was interesting to me is that all of my complaining about Trump was informed virtually not at all by engaging that exercise.
[790] Right, by comparing what her potential presidency would be versus his.
[791] Right.
[792] And, you know, all of the possible consequences.
[793] And so it definitely changed my view of our current situation.
It's not that Trump looks any better to me. It's just that I realized, wait, there are many things I don't know.
[795] I'm being motivated by a presumption of knowledge that I don't actually have.
[796] And so it took a lot of the energy out of my concern.
[797] Yeah, yeah.
And so that was a, quote, debate-style point that landed.
But this is just the way I am, which is noticeably different from the way many people I engage are, and I've said this many times on my podcast: I don't want to be wrong for a moment longer than I need to be.
[800] Right.
Like, I've got to say, the moment is uniquely painful.
[802] I've been wrong on here and had to admit it.
And if it happens quickly enough, all the other person notices is that they have the pleasure of educating you on a certain point.
[804] Like if you're wrong, like, oh, seriously, that's the way it is?
[805] Oh, wow.
[806] So I didn't know that.
[807] Okay.
[808] And so if you can revise your worldview in real time under pressure, it doesn't have this character of, oh, fuck, that's embarrassing.
[809] I just, you know, I dug in and now I just got slaughtered, right?
The world isn't what I thought it was.
[812] So now what is it?
So for you to admit you're wrong publicly, you don't have any fear that it retroactively makes the other arguments you've made seem to lose credibility?
Like, do you feel this responsibility to get it right all the time, because getting it wrong would inadvertently diminish all the other points you've made?
[815] I feel a responsibility to be honest in every moment.
[816] Right.
So if I'm wrong and don't see it, well, then that's, you know, it's an error, but it's not a violation of my core ethic, which is to be honest.
[819] Right.
If I don't see it, I don't see it.
[821] Right.
[822] Maybe there's something I don't understand.
[823] Undoubtedly, there are many things I don't understand.
[824] So I'm wrong about those things or capable of being wrong about those things because I'm just ignorant of certain facts or I'm, you know, not logical enough or whatever it is.
[825] Yeah.
The moment that I recognize that I'm in the presence of someone who's making a good point that is running against the grain of one of my stated beliefs, those dominoes begin to fall really quickly for me.
And that is actually the pleasure of conversation for me. I mean, it's great to have a podcast where you can do this, where you can just invite some smart person on who's much more of an expert on any given topic you might be talking about than you are.
[828] I mean, there's something that.
[829] That's very admirable because you have a specialty, which is neuroscience.
[830] And really, you're rarely talking to a fellow neuroscientist.
[831] Yeah, yeah.
No, it's very often, at least 90% of the time, that I'm talking to someone who knows much more about something than I do.
And that something is forming a significant part of the conversation, right?
[834] Yes.
[835] This is the position I currently find myself in.
I'm at least in a La-Z-Boy.
But unlike a normal journalist or a normal interviewer, whose job is just to kind of poke that person and otherwise get out of the way, I am usually approaching it as a conversation where I'm going to take up at least 40% of the airtime, because I do have a dog in the fight.
[838] I have opinions about what we're talking about.
[839] And I want to pressure test those against someone who also has strong opinions and for good reason because this is really what we're talking about now is really in their wheelhouse.
[840] Yeah.
And so it puts me on the line in a way that, it's like my friend Joe Rogan, who's got, as you know, a huge podcast.
[842] We've talked to many of the same people.
[843] But at the end of the day, he can, you know, if he puts his foot in his mouth, he can say, you know, what the fuck?
[844] I'm just a comic.
[845] You know, like, I'm not like, I don't have to get this right.
[846] You know.
And so he's doing a high-wire act, but there's a net, right?
[848] Yeah.
[849] I have a high wire act.
[850] And if I'm falling, it's just a different dynamic.
But for me, there really is nothing at stake
[852] as long as I am honest.
[853] Right.
So I'm either honestly wrong, or I recognize that I'm wrong and then I change my commitment, you know, more or less instantaneously.
[855] Right.
This could just be my own personal preference, but I'm always curious about, if you think about epistemology, right, how we know what we know.
I'm equally as interested in why I chose the things to learn about as I am in the actual things I've learned about.
And even when you and I had dinner, I was, well, I was falsely seeing a parallel between you and me, because we're both children of divorce and stuff.
[860] Right.
And so, only one of us races motorcycles.
[862] Oh, yes.
[863] Well, that, that's true.
[864] But I'm focusing on the similarities.
[865] It would be fun for me to hear you question even why you're drawn to the things you're drawn to.
[866] Or is that something you do?
[867] Do you think you have a handle on why?
[868] You're clearly drawn to the empirical, right?
[869] There is some kind of comfort in that.
[870] Are you interested in why you're interested in things?
[871] I guess that's my question.
[872] I don't do a lot of that.
[873] I guess the question I have asked is, why am I touching topics that predictably produce so much pain?
[874] I don't need to do this, right?
Like, why do a podcast on race?
[876] Like, well, from any angle.
[877] Yeah, yeah.
[878] Like, what's, what's in it for me now?
Yeah, but there's no blue ribbon at the end of this for you.
[880] Yeah, but my last podcast was on race, right?
[881] Right.
And I recognize the reason why I do it: it does come down to the coincidence of human suffering and conflict, and my sense that it's obviously unnecessary.
So it's the unnecessary part of human misery that really motivates me to react or respond with whatever I can bring to that problem.
Yeah, but you could have gotten a Ph.D. in jazz history or Renaissance painters or any number of things there are to study in this world.
[886] And you were so clearly drawn to philosophy and then the brain.
And I just wonder if you have any sense, from your childhood, of why those things seemed like questions you needed to ask, or if they were comforting when you got that information.
Well, ironically, it's a quasi-religious calling, in the sense that the thing that brings many people to religion, or at least anchors them in their religious beliefs, is death: the reality of death, and having lost people, and wanting to know what that's all about, and how you can prepare yourself emotionally for the inevitable losses that we'll all experience in life.
[888] And, you know, I had a few people close to me die early.
[889] I mean, the one that really impacted me was a best friend died when I was 13.
[890] And then my father died when I was 17.
But, I mean, I had dogs die early as a kid, and I had a grandfather die earlier than that.
So death was something that punctuated my life for as long as I can remember.
I actually went into, I kind of made an independent study of religion, and, I wouldn't have known it at the time, but, you know, quasi-New Age phenomena and psychic phenomena. You know, just what happens after death? What is consciousness? What are we?
But so you had some incidents early where you were forced to recognize the vulnerability of being human. If your friend dies when you're 13 years old and your dad dies when you're 17, it's quite obvious that you have no assurance that you're just going to make it to 100 years old. That's pretty obvious. I guess you trained jujitsu, right? You had the podcast with the security specialist, Gavin de Becker.
I had a few. It was Gavin de Becker, I had Scott Reitz, who's a former SWAT operator, right, and then I had Jocko Willink, the Navy SEAL. So I had a few.
So for me, I'm not as interested in security, because I just don't really have a fear of that stuff. For right or wrong, of the many weird proclivities of my personality, I'm just in general not very fearful.
[894] Right.
Do you think, in general, you're fearful, or just that your worldview is a little fearful?
[896] I wouldn't.
[897] You'd call it realistic, I'm sure, but that would be the trap.
Fear is not the emotion that I experience that causes me to pay attention to any of these things, or to do any of these things, like training jiu-jitsu.
I mean, first of all, training jiu-jitsu is just flat-out fun and addictive.
[900] I mean, it's like once you start doing it, it's like it is the most addictive sport I've ever encountered.
So fear is the last thing from your mind when you're training jiu-jitsu.
But even, like, working with firearms, once you go down the rabbit hole, having decided, well, this is worth your attention, then it becomes just like a guilty pleasure.
[903] It's like, you know, I imagine it's something like, riding motorcycles or whatever.
[904] I mean, let's say riding motorcycles.
[905] Like one way to get into it was, you know, there are many safety situations where being able to ride a motorcycle could save your life, right?
[906] Like in some end of the world phenomenon, like you're going to want to know how to ride a motorcycle.
[907] Sure.
[908] So let's say you believe that and then you get into it.
[909] Well, then all of a sudden, riding motorcycles, presumably is hugely fun.
[910] And you're not motivated by fear while doing that.
[911] That's a great example.
[912] So riding motorcycles, I love it.
[913] It's a huge part of my life.
[914] It has been for 30 plus years.
[915] Right.
I also have the awareness that I, as a child, was trying to do anything that could establish me as manly.
[917] That was just one of many things.
[918] I would jump off the roof.
I would jump off tall bridges.
[920] I would fight other dudes.
[921] I was in constant search of approval from my peers that I was manly.
[922] And I do believe that's because I just didn't have a dad around.
[923] There was no dad to go like, well done, son.
[924] You really did it.
[925] And so I was just endlessly looking for that approval from my male peers.
And the way you won glory where I'm from is you fucking jumped a motorcycle over a bonfire.
[927] You know, so by God, I was going to do that.
[928] Or you punched the dude at the bar who called your girlfriend a bitch.
[929] Like, these are the things I did.
[930] So it's working on two levels.
[931] I very much enjoy it.
I do, I love it.
[933] I could tell you a million reasons why it is meditation.
If I go to the track on a motorcycle, it's the one place in my life where, for 30 minutes, I absolutely can't afford to have another thought other than the next turn, braking, and accelerating.
[935] If I have another thought, I will crash.
[936] And so that in itself is meditative and I enjoy it.
[937] But if I'm just being honest about what got me into it, it was one of many ways to establish my manliness.
And I have neuroses like any human being.
[939] And it's one of many.
[940] Yeah.
[941] Well, so there is something similar for me there in that, yeah, also having been raised by a single mom.
[942] Yeah.
There's, again, another thing. Nothing ever happened to impress this on me, but I think it was just feeling like there wasn't a man in the house.
[945] Right.
[946] And feeling that there's some reality of vulnerability.
You know, like you have a mom who is out having to function in the world with her young son.
At a certain point, the young son can perceive the world to be a hostile place, or perceive the stress in his mother, you know, when she doesn't feel safe. And, you know, we traveled. I mean, she was, by my lights now as a parent, incredibly brave in how she took me out in the world. We went to Africa, we went to China when it had just opened up.
I mean, that's pretty baller.
Yeah, I mean, she was a single mom, and she didn't have a history of having traveled as a young person. So she just wanted to travel for the first time in her life, and she had an eight-year-old or nine-year-old kid with her who she had to take.
[949] And those are fantastic experiences.
But the reality is, I'm a young boy in a world where men are noticing my mom as a single woman, right?
So, you know, I hated it when my mom wore tank tops, because she had big boobs.
[952] I was like, geez, Louise, I can see guys are looking at my mom's boobs.
[953] So I don't have explicit memories of that, but I'm sure that happened.
[954] They happened.
[955] So, yeah, so I'm sure that got in there.
But then it's just a matter of having thought through these admittedly somewhat rare circumstances.
[957] But the truth is they're not all that rare.
[958] I mean, you know, like I feel pretty rational in the degree to which I prioritize certain risks.
For instance, like most people, I have some nascent anxiety around flying.
[960] Right?
So if you put me on an airplane and there's any significant turbulence, part of my brain is saying, okay, is this the moment where it all ends?
But the reality is that flying is so safe that it is irrational to worry about dying in a plane crash.
[963] Totally.
But it is totally rational to worry about dying while texting while driving.
[965] And yet almost nobody worries about that, right?
[966] That's right.
Or of heart disease or cancer or any number of the big killers.
[969] Yeah.
[970] And so the things I worry about are fairly in proportion to the actual risk in so far as I understand these risks.
Again, the psychology isn't fear, in the sense that when you put your helmet on to ride your motorcycle, or when I put my seatbelt on in my car, it's not fear that is inspiring that move.
It's just an awareness that the difference between getting into an accident with a seatbelt on or off is enormous, right?
[973] So it's worth putting it on.
[974] Yeah.
[975] And it's the same way, you know, locking my doors at night.
[976] It's like I could just leave the front door open, but there's a reason to lock it.
[977] And yet implementing that is not at each moment driven by this feeling of anxiety, at least I'm certainly not aware of it being.
[978] Have you ever had psychotherapy?
[979] I did, a long time ago.
[980] Okay.
[981] Did you like it?
[982] Do you recommend it at this point for me?
No, I just, I'm endlessly curious why I'm doing anything I'm doing.
[985] I don't think I'm just randomly doing anything.
[986] I mean, it feels like I am.
[987] I'm responding to a sensation of hunger or horniness or all these things.
[988] But way lower than that, I am usually the victim of something I'm overcoming from my childhood.
[989] I think it's kind of inescapable.
With respect to what I pay attention to, like on my podcast, or what I write about, that's largely driven by, I mean, two filters.
Like, is it interesting?
And is it consequential?
[993] And sometimes just pure interest wins and sometimes just pure consequence.
I mean, this is sort of boring, but there's no way around it: let's talk about how bad social media is, or whatever it is.
[996] But it's the convergence of those two things where this is actually ethically or intellectually interesting and it's also either hugely consequential now or it will be.
[997] Like AI is a great example there.
I think it stands a chance of being an enormous problem if we do it badly.
[1000] And it's just very cool to think about, too.
[1001] Oh, yeah.
When you, like, did you read Homo Deus, the second one?
[1003] Yeah, yeah.
When he starts talking about what your phone could potentially do: it's measuring your cortisol level and your blood sugar, and you're about to walk into a meeting, and it tells you, don't speak in this meeting, because the last time you did, when your levels were this high, you got reprimanded.
Like, that kind of being able to set a goal for yourself, which is basically setting a goal for your narrative self that's going to be at odds with your experiential self, is fucking fascinating.
And will the device prioritize your narrative self or your experiential self?
[1007] That is a fascinating thought experiment, I think.
[1008] Yeah.
[1009] And it's one that we need to run for ourselves even in the absence of that technology.
[1010] Absolutely.
So much of what we do is this negotiation between what it's like, as a matter of process, to do those things, and the payoff of being able to retrospectively say we're satisfied.
[1012] Feel good or bad about it.
[1013] In my experience on planet Earth, rarely did those things meet peacefully.
[1014] Almost everything I enjoy doing is something that my narrative self was not going to be proud of that night.
[1015] That's the tragedy of the human condition.
So, in keeping with what we're talking about with fear: I, too, share your critique of Islamists.
[1018] You know, I have a fear of that.
But I also would recognize, and I've heard you already admit to this, but I'm going to take you further down that path, the crazy high improbability of ever being the victim of a terrorist attack in the United States.
Yes, except, I mean, there's one caveat there.
[1021] Yes, it is true to say that very few people are dying in the United States from terrorism.
[1022] Yes.
[1023] I mean, really, really tiny, less than lightning.
[1024] Yeah, Islamist or otherwise, right?
[1025] Yeah.
But it is also true to say that there are people waking up every day trying to figure out how to get their hands on nuclear material or biological weaponry, so as to bring off an attack that is orders of magnitude bigger than anything we've ever seen, right?
[1028] Yes, that's certainly true.
[1029] But I think the more compelling argument you've made is that it's not really about the terrorist attack.
[1030] It's our response to the terrorist attack.
So the last time we had a significant one, on 9/11, we invaded Iraq and we invaded Afghanistan, and we've spent $1.5 trillion on those endeavors.
And so the reason this issue is so paramount is that our appetite to overreact is so large; that's what we're really trying to alleviate or avoid.
[1034] Yeah.
[1035] And we have to price that in because I think that overreaction is inevitable.
And some of it is not even a choice; some of it's just what happens to the stock market when the next 9/11 hits.
[1037] Right.
[1038] Like what global financial crisis will that precipitate?
[1039] And I agree with you on both those points.
[1040] But my question to you is, if that's true, and I agree it is, then why not make the focus of your work confronting the disease and not the symptoms?
[1041] So the disease is human overreaction, which I think is...
[1042] Well, no, there's two diseases.
Because the other disease is the fact that hundreds of millions of people, at a minimum, think that you should be killed for leaving the faith.
[1044] I mean, to take Islamism as this one problem, right?
[1045] So, you know, and I know the sorts of people who are now threatened with death for having left the faith, right?
So, like, this is a human rights concern that is so much bigger than the kinds of legitimate concerns we have in our society, where we're just not getting it right in terms of political equality between men and women, or transgender issues, or all of that stuff.
Fine, let's worry about that too.
But I know people who are in hiding for being gay, or for saying they doubt the existence of God.
[1049] And, you know, this is just on a human rights level and kind of a freedom of speech level.
A traditionally liberal concern.
Yeah, the liberal in me finds this intolerable. And the feminist in me, who's actually concerned about the plight of women, you know, if I go down the list of, let's prioritize this.
Well, you know, the fact that 98% of girls in Somalia are still getting clitorectomies, right?
[1053] And knowing the details of what that's like.
We called it female genital mutilation when I was in anthropology.
[1055] Well, just don't call it female circumcision because that gives people the sense that it's somehow equivalent to male circumcision.
[1056] And so, you know, how big a problem is that?
And really quick, let's just say the difference being that the ultimate goal of male circumcision, when it was enacted, was cleanliness, and the true motivation behind a clitorectomy is to deny a woman pleasure during sex.
Yeah, and there are different goals, right?
Yes.
And the details of the procedure and the health ramifications, all of it. You're just misled by a term to call them both circumcision.
Yeah.
You know, so if you're actually concerned about women, I just can't see how having battery acid thrown in the faces of girls whose crime is just trying to go to school in Afghanistan, how that isn't just so important to focus on.
It is kind of, just coming back to the terrorism angle, there are two problems.
There's our potential overreaction to any human-engineered act of terrorism, but then there's just this clash of worldviews, which has just millions of casualties.
And it's clear what the right side of history is here.
[1058] You know, like we don't have to debate whether girls should be allowed to learn to read anymore.
[1059] But yeah, how does one focus energy and confront this irrational overreaction that we all have?
Because it's kind of centered in this notion of blame, right?
Was it Paul Bloom who was talking about it?
Who was talking about how, if you're just the victim of an accident, medical malpractice, versus the guy who was falsely telling people they had cancer in order to treat them, the results are the same, but we've attached blame to it.
Betrayal bias is what that's called.
Betrayal bias, yeah.
Because if we as a country were just about alleviating as many untimely deaths as possible, we would start going through the list if we were allocating resources, and this includes our military and everything else, right? We would start with cancer, we'd go to heart disease. If we're honest about wanting to prevent death, we would confront and attack and allocate resources to about 6,000 things before we got to many of the things we're most upset about.
[1063] Yeah, except certain things trouble us more than mere death.
[1064] I mean, and it's not necessarily wrong to be troubled by them.
[1065] So, you know, to have someone in your life die of heart disease suddenly is one thing.
[1066] To have them run over by a car is another thing.
To have them raped and tortured and then murdered is another thing.
[1068] Now, the net result is the same.
They're no longer here.
[1070] Right.
[1071] And you're now deprived of their presence in your life quite suddenly and unexpectedly.
[1072] But there's a surrounding set of facts that, for better or worse, changes the level of human suffering in response to that loss.
[1073] And in certain cases, I would agree, it's starkly irrational.
And if you could just sort of change the way you think about it, well, then you'd feel better and everyone would feel better.
Well, maybe we wouldn't be allocating 33 cents of every dollar to the military.
[1076] Right.
[1077] Bigger than the next 11 militaries combined, it seems a little disproportionate to me. I have a real quick thing to say about the radicalism conversation.
[1078] I don't think anyone would disagree with the absurdity and craziness and horror that is happening in all those places.
But don't you think that just saying that over and over again without solutions is not all that helpful?
[1081] Like when you had Fareed Zakaria on, I feel like he was really kind of saying that, like education.
[1082] I mean, that was an example of a kind of an illegitimate move he made.
[1083] I mean, he sort of played, it wasn't the race card, but it was kind of the quasi -race and culture card.
[1084] I mean, basically he said, listen, I come from the community.
[1085] I'm a Muslim.
[1086] You're not, you know, like, I know that what you're doing is counterproductive.
[1087] Right.
[1088] Right.
And, like, it's a trump card he could play with me, which he can't play with Maajid Nawaz, you know, and Maajid and I are totally aligned on this.
So it was kind of an illegitimate move, because it certainly wasn't an argument I take seriously.
But there clearly is a path forward.
And it's the path forward that every other religion has been forced to take, which is to moderate its crazy medieval views under pressure from secular, humanistic, scientific, democratic, liberal values.
But I don't think you're representing his position super fairly, because I listened to that, and I loved that episode, and to me I heard you both saying the exact same thing, which is that this change has to be led by moderate Muslims. You were very insistent that, for that to happen, we must first start with acknowledging all of the asinine parts of the scripture, and that it first needs to be debunked, right, or acknowledged that, let's say, jihad is bad or whatever. You were pretty dug into this notion that, no, that needs to be a part of how all this happens. And I think he was saying, that's not going to be helpful to them; you coming in and pointing out all the errors in their thinking isn't actually going to be the solution. It's going to be the moderates, who have already become a little more moderate, living as an example and then intervening.
Well, yeah, so, I mean, because, again, I have the evidence. There are two things. One is, I acknowledged, and I'm sure I acknowledged to Fareed, that there are different roles to play here. Like, my role is not to parachute down into Islamabad and start talking.
Right, you don't think that would go well?
Yeah, I'm not tempted to try.
Put your jiu-jitsu to the test.
My fear-based worldview suggests that it would be a bad experiment. You know, I see the evidence.
There are people who were essentially jihadists who are now atheists because they were watching my YouTube videos.
So there are people who have definitely come out because of me, or people like me, you know, Christopher Hitchens or Richard Dawkins.
[1095] But I'd be the first to say that that's not the normal use case, right?
[1096] And the normal case is something that's far safer and just a little bit more reasonable, you know, taking the rough edges off of being within eyesight of Qatar or Dubai.
Those are the things to me that seem like how that happens.
Well, there's that too, but, yeah, I mean, there's just the effect of modernity. But again, that's a two-edged sword, because modernity is also advertising its moral and spiritual bankruptcy in every moment.
So you're getting this pendulum swing away from all the ways in which people have trivialized human life, back into a more fundamentalist adherence to not just Islam, but every kind of religious faith.
Yeah, for as many people who are being converted by how much better life is getting materially and socially in the West, say, or in a place like Dubai compared to a place like Afghanistan, there are many people who are reacting against the superficiality of that, the mere materialism of that, and finding it necessary to be more doctrinaire and more dogmatic.
[1103] Okay.
The last thing I want to say is there was a really interesting moment when you interviewed Eric Weinstein, where he was kind of going through the fact that he had had a pretty traditional Orthodox Jewish wedding, I believe, and he himself is a self-professed atheist, right?
[1105] And you were kind of like questioning that.
[1106] And he just simply said to you, well, that's because you, Sam, would put truth at the number one principle in your life.
Like, truth is the number one guiding light in your life.
What was profound about that observation to me was simply that I, too, put that first.
[1109] I fetishize the truth.
[1110] I want some kind of certainty.
[1111] And it just occurred to me, oh, wow, it is a choice you make.
[1112] And you could put any number of things in the number one slot, right, that you're trying to service in your life.
[1113] It could be truth.
[1114] It could be compassion.
It could be love. It could be any number of things.
[1117] And I was thinking you're married.
You've stayed married, which leads me to believe there have to be many times in your real life where it's on the table for you to point out that what she said is inaccurate.
[1119] That's not actually what happened on the way to get groceries.
[1120] But you smartly go, that's not going to be my priority right now.
[1121] My priority is going to be comforting my partner who I did something that triggered some emotional thing in her or a fear.
[1122] And my job in this moment is to throw away the facts and get to what's important, which is comforting my partner, which I've agreed to do.
[1123] Does that happen or you don't ever budge on that?
[1124] You go, no, I did not run the stop sign.
I can tell you, I stopped for two seconds.
Well, I mean, I'm a little more of a stickler for truth than that.
But yeah, I mean, this is something I get into in my book Lying. It's not so much that truth is paramount for me, I mean, it's close to that, but it's more that honesty is paramount. And sometimes the honest truth is that you don't know something, right? Or your state of knowledge is essentially a coin toss: you have a sense that it's 50% likely that something's true or false, say. So to honestly represent one's knowledge of the world, to oneself and to others, is my default position.
[1128] But as you rightly point out, there are cases where you feel like you have to pick your battles or there's no reason to go there, right?
[1129] There's no, like there's some.
[1130] Or it's about being honest about your objective, which is to be a good partner to somebody.
[1131] Yeah, yeah.
But I find that honesty, and the truth insofar as you can apprehend it, tends to be so useful that there's rarely much tension between being honest and being good, or being supportive or compassionate or a good friend. Now, it's not that I'm never in that situation. But once honesty becomes your kind of master value, right, and everyone in your life knows it, then you sort of try to train everyone in your life to know what to expect from you, right?
[1133] Like you're the guy, like, people don't often ask me what I think unless they actually want to know what I think.
[1134] Sure.
[1135] You know?
Now, what if they're, like, do you like my overalls?
[1137] Yeah, right.
[1138] Yeah, exactly.
[1139] I very much like them on you.
[1140] Okay, okay.
[1141] Occasionally I encounter a mismatch between, like, someone else's expectation of what, you know, honest communication is like and my own.
You know, like, the person doesn't know me very well. Or someone has asked for creative feedback on something, and I start to give it, and I realize, okay, wait a minute, this person really just wanted to be told that their thing was great.
Yeah, they want a cheerleader.
Yeah, we're just not on the same page. But rarely is that the case for me. And it's amazing how clarifying of one's life it is to have advertised to yourself and to the world that that's the way it's going to be.
Sure.
But I think we'd both agree your obligation to the colleague who wants your input on a paper is different from the covenant you've entered with this human being to emotionally support each other and make each other feel safe.
[1143] Right.
That the stakes of that are much higher, and they do require, I mean, just in my own case, my wife will go, you never offer to do blank.
[1145] And in my mind, I'm like, well, no, Wednesday I did.
[1146] And then I think Sunday I remember I did.
[1147] And I can start building a pretty defendable case that she's just erroneous in this claim.
[1148] And it's so tempting.
[1149] I want to do it.
[1150] No, I get tempted by that.
But it doesn't yield what I'm ultimately trying to achieve, which is that I should always be trying to make this human I've committed to be with feel as safe and taken care of as possible.
[1153] At the expense quite often of reality.
[1154] Not necessarily at the expense of reality because it's not that you're agreeing that her representation is true necessarily.
[1155] You could just move on.
[1156] Yeah, exactly.
[1157] I fast forward to, honey, I'd be so happy to help more.
[1158] Sometimes I don't think about it.
[1159] So if you could meet me halfway and just remind me, I'd love to help you get out the door.
[1160] Like, let me do that and then help me meet those needs, right?
Yeah, but that falls for me under the picking-your-battles rubric, where it's just not important at that moment to get the record straight.
[1162] Correct.
[1163] It's just like, what can I do?
[1164] What can I do now to solve this problem right now?
And I would argue the emotional truth is as important as the historical record truth, the archaeological record, right?
[1166] It is as important.
[1167] But I don't see any dishonesty entailed there.
What would be dishonest is to say, oh, you're right, you're right, I didn't do those things, all the while believing I did those things and that you're delusional, right?
[1173] That introduces a kind of distance.
[1174] Now you're managing the person as though they're not a rational actor or they're not on the same team.
And this is at the core of, I think, your frustration with mankind: that we are irrational.
And the expectation that we'll be rational is actually antithetical to what we are.
[1178] It actually, it doesn't exist.
[1179] So none of it's logical, none of it's real.
No, I'm not saying that we're always motivated by reason.
I mean, we have many drives that don't come from any kind of chain of reasoning.
[1182] And it's hard to reason us out of them.
But we can rationally understand what's going on and navigate around all of that.
[1184] Yeah.
I mean, where I'm trying to lead you is that I do think there are times you could extend this paradigm out to some of the debates you're in.
Do you feel like some of these seeming chasms that exist are ones we just can't cross?
[1188] I do think some of it's going to be illogical and irrational because it's about making other humans feel safe.
[1189] Yeah, I don't think the bearded maniacs who are throwing battery acid in the faces of young girls need to feel safe.
I think they need to feel terrified that their worldview is going to be canceled, because it will be, because we have enough of a consensus that they're barbarians.
So, like, there are certain people who will only respond to the starkest possible encounter of, not an inch further.
[1192] Yeah, yeah.
And so, like, the people who are threatening to, you know, kill novelists.
[1194] It's just so downstream.
That solution is treating the symptom and not trying to figure out why on earth someone would be recruited in the first place.
[1196] Well, no, but that's actually not the symptom.
It is the actual disease: it's the ideas.
[1198] It's like the idea about the price we're paying for the certainty of paradise, right?
I mean, this is what's, again, back to where we started: it's not that these are bad people.
[1200] It's worse than that.
Most of these people are good people, or normal people.
Like, I don't, well, I don't know. The Americans that have been recruited over here and have fucking flown to join ISIS, to suggest that they aren't the victims of some kind of emotional damage as kids, I find hard to believe.
Okay, but then you've got to work harder, because people do extreme things.
I prayed you'd say that to me.
Okay, I spent two years in silence in my 20s. Not all at once, in, like, you know, three-month increments, one-month increments.
[1203] But of all the things I could have been doing, right, I was on silent retreat.
[1204] Yeah, almost as radical as joining ISIS, by the way.
No, no, but for a different idea. Like, when I look at someone like John Walker Lindh, you know, the so-called American Taliban, he was the first American we noticed, you know, right after 9/11.
[1206] He was in the basement of his.
[1207] Yeah, he got, he got caught and wounded.
[1208] He's now in prison.
[1209] and I think he's completely unrepentant.
[1210] I think he's just as fundamentalist as he ever was.
[1211] He's been in prison for whatever it is, 14 years or so.
When I looked at him and got his backstory, he went to these countries, I think he started in Saudi Arabia.
I forget how he got to Afghanistan, but maybe he was in Pakistan.
But he learned Arabic and he just got totally immersed, hook, line, and sinker, in this Muslim religiosity, based on just a sincere conviction that this was the true account of the world.
[1215] But really quick, do you think that that guy felt included in his group?
Do you think he was on the football team and felt like he had an identity that was shared with many around him?
I forget his backstory there, but I just have to imagine.
It is totally possible to be the quarterback of the football team and then to join ISIS.
[1219] Really?
[1220] Totally.
I mean, I was a very social, reasonably self-actualized guy; I had lots of friends, lots of intellectual interests.
[1222] I dropped out of Stanford and decided there is something much deeper here that I have to pay attention to.
[1223] Yeah.
[1224] And I don't regret any of it, but it could have been something that I would now regret.
[1225] It could have been fundamentalist Islam.
[1226] Had I believed, you know, that the Quran was the perfect document.
Where you and I diverge: I do think there are probably innumerable reasons, but I think some people are susceptible to it and some are not.
Well, I mean, obviously the human mind is very complicated, and it's hard to find happiness in this world, whatever your circumstance.
[1229] Take it out of the context of jihadism or religious extremism.
[1230] You just take something like the opioid crisis, right?
Well, it's like, you can be Prince and have a problem with drugs, and you can be somebody who has no prospects in life and have a problem with drugs.
[1232] Addiction is very egalitarian in that way.
[1233] Yeah.
[1234] But you could make the argument that as life gets better and better, you know, as more people have a life like you and I are currently enjoying, well, then the temptation to become an addict must go down, right?
Well, maybe it goes down a lot, by half, say, but still it's possible to basically have everything and have your life run off the rails with addiction.
[1236] Yeah, and we've got to be clear what metrics we're using to evaluate life getting better, right?
[1237] So we're using money, you've got money, you've got education, you've got physical health, you've got people around you who love you and are worried about you.
You know, I mean, you could be Anthony Bourdain, who's got, from the outside, a great life, right?
[1239] He's doing exactly what he wants.
He's got a career that everyone envies. What could be more fun than doing what he's doing, provided you like to travel?
[1242] And then he hangs himself, right?
[1243] And then you're left with this absolute refutation of the apparent goodness of his life.
I mean, he was even into jiu-jitsu, right?
He was like a total jiu-jitsu addict, right?
[1246] And like that was the coolest thing to see.
Well, he was an addict, period.
[1248] So a lot of those things answered themselves.
[1249] Right.
[1250] But he's like 60 years old.
[1251] He's getting into the most fun sport ever.
[1252] I had never met him.
[1253] But, you know, we had friends in common.
And so I'm looking at this from afar thinking, man, that's just so cool: this guy is older than me, but apparently has way more energy and great luck with his health.
[1255] And then he kills himself.
And so that's a sign that there is still considerable room for a human mind to be profoundly unfulfilled, even when basically all the other boxes are checked.
[1257] Yes.
And if you have listened to this podcast a lot, that is the theme.
[1259] So I get to talk with a lot of people who are rich and famous.
[1260] And my first question is generally, you're rich and famous.
[1261] Did it cure all the things you thought it would cure?
[1262] Right.
[1263] And almost across the board, it didn't.
[1264] And it didn't do it for me. So then the question is, what does make you feel good?
It's counterintuitive, the things that do.
[1266] So imagine what it would be like for you and all of your rich and famous and successful and good looking guests who should be happy.
[1267] Yeah.
To be fully immersed in a culture that has an answer for all of that remaining existential angst.
[1269] So we have to win this war of ideas.
[1270] We cannot give an inch on this.
[1271] And it's not racism, certainly, to beholding the line here.
[1272] It's actually the only rational, compassionate response to the suffering of millions of people who are living in contexts
[1273] where their lives are threatened every day by the theocrats around them.
[1274] Yeah.
[1275] It's like, well, I've kept you for almost an entire day.
[1276] Which is, you just, you think you made all of our...
[1277] This is the way it is on a podcast.
[1278] Yeah.
[1279] Well, Sam, you really have made us happy by coming in.
[1280] You've been the object of our desire since we started.
[1281] Wow.
[1282] Pleasure to do it.
[1283] All right.
[1284] Hello.
[1285] Is this Jameson?
[1286] Jameson.
[1287] I think you're there, and I want to say on behalf of Monica and I that we're so grateful you bought a left-handed mug, and we would very much love to pass on your generosity to a foundation or a charity that you believe in.
[1288] Is there something, an issue that you care about that we can help with?
[1289] I was hoping you could donate to the Open Medicine Foundation.
[1290] Open Medicine Foundation.
[1291] And what do they do?
[1292] What do they do?
[1293] Well, I have a chronic illness called ME/CFS, and they do a lot of research for that.
[1294] They raise funding because it's really underfunded.
[1295] And what kind of symptoms do you have with your condition?
[1296] Well, speaking is difficult, as you can tell.
[1297] Yeah, and I, I, I'm doing better now.
[1298] I couldn't eat solid food for about a year and a half.
[1299] Oh, wow.
[1300] Yeah, it was pretty, pretty awful.
[1301] And, you know, some people are worse, worse than I am.
[1302] They're, you know, they have to have, you know, like, feeding tubes, and they can't speak at all.
[1303] You know, small stuff is taxing.
[1304] I need, I need a lot of help.
[1305] And so do you listen to a lot of podcasts?
[1306] Oh, you know, yeah. You know, yours has been a really wonderful distraction for me. Just gives me a little bit of escape. And I, you know, I just don't sleep well at all. So, you know, I noticed that you guys usually post it at like 2 in the morning or something, so I'm usually awake, right?
[1307] You might be our first. Jameson's our very first listener. Now, the last question I want to ask you before we go: your name is Jameson. Is your father an alcoholic? Does he love whiskey?
[1308] Yeah, it's interesting. It's spelled different. Oh, okay. So, like, I'm actually named after a road. But my dad, actually, he was an alcoholic. He's been sober for, I think, like 30 years or something.
[1309] Oh, wow.
[1310] We have that in common.
[1311] Yeah.
[1312] Is it okay if I talk about the Open Medicine Foundation a little bit?
[1313] Of course, please do, yeah.
[1314] They, you know, they work to raise money for research funding.
[1315] They work a lot with Stanford and UC San Diego.
[1316] And so they're trying to get more research for Lyme disease and ME/CFS, which is the main one that I was talking about.
[1317] And others, like fibromyalgia and stuff like that.
[1318] And are these all autoimmune diseases in a way?
[1319] You know, ME/CFS is pretty mysterious.
[1320] You know, studies have shown that it affects energy at the cellular level.
[1321] So it's kind of like something is attacking the mitochondria.
[1322] There are some inflammation markers that are pretty high.
[1323] Well, as someone with arthritis, I find inflammation really sexy, as you know.
[1324] So I just always want to know if I've got like a fellow sexy brother out there with inflammation.
[1325] Yeah, exactly.
[1326] And it affects like 24 million people around the world.
[1327] So it's something, you know, and it's obviously very personal for me. So I really would like to, you know, raise awareness.
[1328] And people can donate.
[1329] Oh, actually, you know what?
[1330] I was going to mention, if they donate before the 27th, donations are tripled.
[1331] Oh, so someone's matching donations?
[1332] Yeah, anonymous donors.
[1333] So if you guys donate that $2,000 from the mug, it'll be $6,000.
[1334] Oh, fantastic.
[1335] We will do that before November 27th.
[1336] Before November 27th.
[1337] Great.
[1338] Okay, great.
[1339] You can go to OMF