#365 – Sam Harris: Trump, Pandemic, Twitter, Elon, Bret, IDW, Kanye, AI & UFOs


Lex Fridman Podcast XX


Full Transcription:

[0] The following is a conversation with Sam Harris, his second time on the podcast.

[1] As I said two years ago, when I first met and spoke with Sam, he's one of the most influential, pioneering thinkers of our time, as the host of the Making Sense podcast, creator of the Waking Up app, and the author of many seminal books on human nature and the human mind, including The End of Faith, The Moral Landscape, Lying, Free Will, and Waking Up.

[2] In this conversation, besides our mutual fascination with AGI and free will, we do also go deep into controversial, challenging topics of Donald Trump, Hunter Biden, January 6th, vaccines, lab leak, Kanye West, and several key figures at the center of public discourse, including Joe Rogan and Elon Musk, both of whom have been friends of Sam and have become friends of mine.

[3] somehow in an amazing life trajectory that I do not deserve in any way and in fact believe is probably a figment of my imagination.

[4] And if it's all right, please allow me to say a few words about this personal aspect of the conversation of discussing Joe, Elon, and others.

[5] What's been weighing heavy on my heart since the beginning of the pandemic, now three years ago, is that many people I look to for wisdom in public discourse stopped talking to each other as often, with respect, humility, and love, when the world needed those kinds of conversations the most.

[6] My hope is that they start talking again.

[7] They start being friends again.

[8] They start noticing the humanity that connects them that is much deeper than the disagreements that divide them.

[9] So let me take this moment to say with humility and honesty, why I look up to and am inspired by Joe, Elon, and Sam.

[10] I think Joe Rogan is important to the world as a voice of compassionate curiosity and open-mindedness to ideas, both radical and mainstream, sometimes with humor, sometimes with brutal honesty, always pushing for more kindness in the world.

[11] I think Elon Musk is important to the world as an engineer, leader, entrepreneur, and human being who takes on the hardest problems that face humanity and refuses to accept the constraints of conventional thinking that made the solutions to these problems seem impossible.

[12] I think Sam Harris is important to the world as a fearless voice who fights for the pursuit of truth against growing forces of echo chambers and audience capture, taking unpopular perspectives and defending them with rigor and resilience.

[13] I both celebrate and criticize all three privately, and they criticize me, usually more effectively, from which I always learn a lot and always appreciate.

[15] Most importantly, there is respect and love for each other as human beings, the very thing that I think the world needs most now in a time of division and chaos.

[16] I will continue to try to mend divisions, to try to understand, not to deride, to turn the other cheek if needed, to return hate with love.

[17] Sometimes people criticize me for being naive, cheesy, simplistic, all of that.

[18] I know.

[19] I agree.

[20] but I really am speaking from the heart, and I'm trying.

[21] This world is too fucking beautiful not to try, in whatever way I know how.

[22] I love you all.

[23] And now, a quick few second mention of each sponsor.

[24] Check them out in the description.

[25] It's the best way to support this podcast.

[26] We got Notion for AI-powered note-taking and team collaboration, Indeed for hiring great teams, and Masterclass for online learning.

[27] Choose wisely, my friends.

[28] Also, if you want to work with our team, we're always hiring, go to lexfridman.com slash hiring.

[29] And now, onto the full ad reads.

[30] As always, no ads in the middle.

[31] I try to make this interesting, but if you must skip them, please still check out our sponsors.

[32] I enjoy their stuff.

[33] Maybe you will too.

[34] This show is brought to you by Notion, a note-taking and team collaboration tool, my favorite note-taking and team collaboration tool, and they have a new feature.

[35] Notion AI, that I've been using and loving. And this thing is probably the best implementation of a system that uses a language model to generate text, because it integrates across the entirety of your note-taking process, and it adds just a giant number of small and big features that help you out, that save a lot of time, but also make everything more fun and creatively sort of inject ideas into your workflow.

[36] So just to list some features, they can edit the voice and tone of the text you already wrote so they can rewrite it in a different tone.

[37] They can make the text shorter or longer, which I love.

[39] Also, they can simplify the text, which to me is at the core of the writing process: make things as simple as possible, but not simpler, as Einstein said.

[40] And to have tools that give you ideas how to do that, not necessarily completely automate everything, but give you really powerful ideas of how to get 90% there.

[41] This is just brilliant.

[42] Also, if there's technical jargon, they can rewrite the text and explain it more simply.

[43] What else?

[44] They can obviously summarize the text.

[45] If you start writing, they can continue your writing.

[46] If you're having trouble starting to write and there's a blank page glaring back at you, they can generate based on a topic, a bunch of texts to get you going.

[47] I mean, there's so many just amazing features.

[48] I love it when great, powerful language models or any idea in AI is then injected into a tool that's actually usable and useful and powerful across a number of use cases to a huge number of people.

[49] I mean, this is really, really, really exciting.

[50] Notion AI helps you work faster, write better, and think bigger, doing tasks that normally take you hours in just minutes. Try Notion AI for free when you go to notion.com slash lex.

[51] That's all lowercase, notion.com slash lex, to try the power of Notion AI today.

[52] This show is also brought to you by Indeed, a hiring website.

[53] I've used it.

[54] I continue to use it to hire folks for the teams I've been on and led, from engineering to creative.

[56] Everything requires a rigorous, systematic, artistic (all many adjectives I want to use) process to build up an amazing team, because there's nothing more important to the success of an endeavor, or the success of life, or to just your contentment, happiness, joy, and fulfillment, and a source of meaning, than the team that you take on the hard challenges of life with, of work with.

[57] So you should use the best tools for the job of hiring, and you should take hiring very, very, very seriously.

[59] Don't overspend on hiring.

[60] Visit indeed.com slash lex to start hiring now.

[61] That's indeed.com slash lex.

[62] Terms and conditions apply.

[63] This show is also brought to you by Masterclass.

[64] $180 a year gets you an all-access pass to watch courses from the best people in the world in their respective disciplines.

[65] One of the people I just recently talked to is Chris Voss.

[66] He is a former FBI hostage negotiator.

[67] Brilliant guy.

[68] Off the mic, I really enjoy talking to him.

[69] There is kindness, camaraderie, thoughtfulness, humor, wit, also a certain sort of cultural density and complexity hailing from New York or whatever that rich, sexy accent is from.

[70] It's just really fun to listen to him discuss what he's really good at.

[72] That was true on the podcast, and that is very much true in his masterclass.

[73] Well, he really systematically breaks down his ideas of what it takes to negotiate: negotiate with terrorists, negotiate with hostage takers, negotiate with bank robbers.

[76] But I think the most important thing is negotiating in everyday life, negotiating in business relationships, all of that.

[77] It's just really brilliant, concise, clear, actionable advice that he gives.

[78] And that's true for almost every single masterclass they have, and you get access to all of them.

[79] Get unlimited access to every masterclass and get 15% off an annual membership at masterclass.com slash lex.

[80] This is the Lex Fridman Podcast.

[81] To support it, please check out our sponsors in the description.

[82] And now, dear friends, here's Sam Harris.

[83] What is more effective at making a net positive impact on the world?

[84] Empathy or reason? It depends on what you mean by empathy. There are at least two kinds of empathy. There's the cognitive form, which is, you know, I would argue, even a species of reason. It's just understanding another person's point of view. You understand why they're suffering or why they're happy. You have a theory of mind about another human being that is accurate, and so you can navigate in relationship to them more effectively.

[85] And then there's another layer entirely, not incompatible with that, but just distinct, which is what people often mean by empathy, which is more a kind of emotional contagion, right?

[86] Like you feel depressed and I begin to feel depressed along with you because, you know, it's just, it's contagious, right?

[87] I, you know, we're so close and I'm so concerned about you, and your problems become my problems, and it bleeds through, right?

[88] I think both of those capacities are very important, but the emotional contagion piece, and this is not really my thesis, this is something I have more or less learned from Paul Bloom, the psychologist who wrote a book on this topic titled Against Empathy.

[89] The emotional social contagion piece is rather often a bad guide for ethical behavior and ethical intuitions.

[90] Oh boy.

[91] And I'll give you the clear example of this, which is we find stories with a single identifiable protagonist who we can effortlessly empathize with far more compelling than data, right?

[92] So if I tell you, this is the classic case of the little girl who falls down a well, right?

[93] You know, this is somebody's daughter.

[94] You see the parents distraught on television.

[95] You hear her cries from the bottom of the well.

[96] The whole country stops.

[97] I mean, there was an example of this, you know, 20, 25 years ago, I think, where it was just wall to wall on CNN.

[98] This is just the perfect use of CNN.

[99] It was, you know, 72 hours or whatever of continuous coverage of just extracting this girl from a well.

[100] So we effortlessly pay attention to that.

[101] We care about it.

[102] We will donate money toward it.

[103] I mean, it's just, it marshals 100% of our compassion and altruistic impulse.

[104] Whereas if you hear that there's a genocide raging in some country you've never been to and never intend to go to, the numbers don't make a dent.

[105] And we find the story boring.

[106] We'll change the channel in the face of a genocide, right?

[107] It doesn't matter.

[108] And it literally, perversely, it could be 500,000 little girls have fallen down wells in that country, and we still don't care, right?

[109] So, it's, you know, many of us have come to believe that this is a bug rather than a feature of our moral psychology.

[110] And so empathy plays an unhelpful role there.

[111] So ultimately, I think, when we're making big decisions about what we should do, and how to mitigate human suffering, and what's worth valuing, and how we should protect those values.

[112] I think reason is the better tool, but it's not that I would want to dispense with any part of empathy either.

[113] Well, there's a lot of dangers to go on there, but briefly to mention, you've recently talked about effective altruism on your podcast.

[114] I think you mentioned some interesting statement, I'm going to horribly misquote you, but that you'd rather live in a world, like it doesn't really make sense, but you'd rather live in a world where you care about maybe your daughter and son more than 100 people that live across the world, something like this.

[115] Where the calculus is not always perfect, but somehow it makes sense to live in the world where it's irrational in this way, and yet empathetic in the way you've been discussing.

[116] Right.

[117] I'm not sure what the right answer is there, or even whether there is one right answer.

[118] There could be multiple peaks on this part of the moral landscape.

[119] But so the opposition is between an answer that's articulated by, you know, someone like the Dalai Lama, right?

[120] You know, really any exponent of, you know, classic Buddhism would say that sort of the ultimate enlightened ethic is true dispassion with respect to friends and strangers, right?

[121] So you would, you know, the mind of the Buddha would be truly dispassionate.

[122] You would love and care about all people equally.

[123] And by that light, it seems some kind of ethical fault, or at least, you know, a failure to fully actualize compassion in the limit, or, you know, enlightened wisdom in the limit, to care more, or even much more, about your kids than the kids of other people, and to prioritize your energy in that way, right?

[125] So you spend all this time trying to figure out how to keep your kids healthy and happy, and you'll attend to their minutest concerns, however superficial.

[126] And again, there's a genocide raging in Sudan or wherever, and it takes up less than one percent of your bandwidth.

[128] I'm not sure it would be a better world if everyone was running the Dalai Lama program there.

[129] I think some prioritization of one's nearest and dearest ethically might be optimal, because we'll all be doing that, and we'll all be doing that in a circumstance where we have certain norms and laws and other structures that force us to be dispassionate where that matters, right?

[130] So, like, when I go to, when my daughter gets sick, and I have to take her to a hospital, you know, I really want her to get attention, right?

[131] And I'm worried about her more than I'm worried about everyone else in the lobby.

[132] But the truth is, I actually don't want a totally corrupt hospital.

[133] I don't want a hospital that treats my daughter better than anyone else in the lobby because she's my daughter and I've, you know, bribed the guy at the door or whatever, you know, or the guy's a fan of my podcast or whatever the thing is, you don't want starkly corrupt, unfair situations.

[134] And when you're, when you sort of get pressed down the hierarchy of Maslow's needs, you know, individually and societally, a bunch of those variables change and they change for the worse, understandably.

[135] But yeah, when things are, when everyone's corrupt, you're in a state of collective emergency.

[136] You know, you've got a lifeboat problem.

[137] You're scrambling to get into the lifeboat.

[138] Yeah, then fairness and norms and the other vestiges of civilization begin to get stripped off.

[139] We can't reason from those emergencies to normal life.

[140] I mean, in normal life, we want justice, we want fairness, and we're all better off for it, even when the spotlight of our concern is focused on the people we know, the people who are friends, the people who are family, people we have good reason to care about.

[141] We still, by default, want a system that protects the interests of strangers, too.

[142] And we know that, generally speaking, just in game theoretic terms, we're all going to tend to be better off in a fair system than a corrupt one.

[143] One of the failure modes of empathy is our susceptibility to anecdotal data.

[144] Just a good story will get us to not think clearly.

[145] But what about empathy in the context of just discussing ideas with other people?

[146] And then there's a large number of people, like in this country, you know, red and blue, half the population believes certain things on immigration, or on the response to the pandemic, or any kind of controversial issue, or whether the election was fairly executed.

[147] having an empathy for their worldview, trying to understand where they're coming from, not just in the explicit statement of their idea, but the entirety of the roots from which their idea stems, that kind of empathy while you're discussing ideas.

[148] What is, in your pursuit of truth, the role of having empathy for the perspective of a large number of other people, versus raw mathematical reason?

[149] I think it's important, but I just, it only takes you so far, right?

[150] It doesn't, it doesn't get you to truth, right?

[151] It's not, truth is not a, it's not decided by, you know, democratic principles.

[152] And certain people believe things for understandable reasons, but those reasons are nonetheless bad reasons, right?

[153] They don't scale, they don't generalize, they're not reasons anyone should adopt for themselves or respect, you know, epistemologically.

[154] and yet their circumstance is understandable and it's something you can care about, right?

[155] And so yeah, like, I mean, just take, I think there's many examples of this that you might be thinking of, but I mean, one that comes to mind is I've been super critical of Trump, obviously, and I've been super critical of certain people for endorsing him or not criticizing him when he really made it, you know, patently obvious who he was, you know, if there had been any doubt initially.

[156] There was no doubt when we have a sitting president who's not agreeing to a peaceful transfer of power, right?

[157] So I'm critical of all of that, and yet the fact that many millions of Americans didn't see what was wrong with Trump, or didn't see through his con, right? I mean, they bought into the idea that he was a brilliant businessman who might just be able to change things because he's so unconventional, and, you know, his heart is in the right place. You know, he's really a man of the people, even though he's, like, you know, gold-plated everything in his life. They bought the myth somehow, largely because they had seen him on television for almost a decade and a half, pretending to be this genius businessman who could get things done.

[159] It's understandable to me that many very frustrated people who have not had their hopes and dreams actualized, who have been the victims of globalism and many other current trends, it's understandable that they would be confused and not see the liability of electing a grossly incompetent, morbidly narcissistic person into the presidency.

[160] So, which is to say, there are many, many millions of people who I don't necessarily blame for the Trump phenomenon.

[161] But I can nonetheless bemoan the phenomenon as indicative of, you know, a very bad state of affairs in our society, right?

[162] So it's, there's two levels to it.

[163] I mean, one is, I think you have to call a spade a spade when you're talking about how things actually work and what things are likely to happen or not.

[164] But then you can recognize that people have very different life experiences.

[165] And yeah, I mean, I think empathy and, you know, probably the better word for what I would hope to embody there is compassion, right?

[166] Like really, you know, to really wish people well, to really wish strangers well, effortlessly wish them well.

[167] I mean, to realize that there is no opposition between, at bottom, there's no real opposition between selfishness and selflessness, because wise selfishness really takes into account other people's happiness.

[168] I mean, you know, which do you, do you want to live in a society where you have everything, but most other people have nothing?

[169] Or do you want to live in a society where you're surrounded by happy, creative, self-actualized people who are having their hopes and dreams realized?

[170] I think it's obvious that the second society is much better, however much you can hoard your good luck.

[171] But what about having empathy for certain principles that people believe?

[172] For example, the pushback, the other perspective on this, because you said bought the myth of Trump as a great businessman.

[173] There could be a lot of people that are supporters of Trump who could say that Sam Harris bought the myth that we have this government of the people, by the people, that actually represents the people, as opposed to a bunch of elites who are running a giant bureaucracy that is corrupt, that is feeding themselves, and they're actually not representing the people. And then here's this chaos agent, Trump, who speaks off the top of his head. Yeah, he's flawed in all this number of ways. He's more a comedian than he is a president type of figure, and he's actually creating the kind of chaos that's going to shake up this bureaucracy, shake up the elites that are so uncomfortable, because they don't want the world to know about the game they got running on everybody else.

[174] So that's the kind of perspective that they would take and say, yeah, there's these flaws that Trump has, but this is necessary.

[175] I agree with the first part.

[176] So I haven't bought the myth that it's, you know, a truly representative democracy in the way that we might idealize.

[177] And, you know, on some level, I mean, this is a different conversation, but at some level I'm not even sure how much I think it should be, right?

[178] Like, I'm not sure we want, in the end, everyone's opinion given equal weight about, you know, just what we should do about anything.

[179] And I include myself in that.

[180] I mean, there are many topics around which I don't deserve to have a strong opinion.

[181] because I don't know what I'm talking about, right?

[182] Or what I would be talking about if I had a strong opinion.

[183] So, and I think we'll probably get to that, to some of those topics, because I've declined to have certain conversations on my podcast just because I think I'm the wrong person to have that conversation, right?

[184] And it's, and I think it's important to see those bright lines in one's life and in the moment politically and ethically.

[185] So, yeah, I think, leaving aside the viability of democracy, I'm under no illusions that all of our institutions are, you know, worth preserving precisely as they have been up until the moment.

[186] This great orange wrecking ball came swinging through our lives.

[187] But I just, it was a very bad bet to elect someone who is grossly incompetent.

[188] and worse than incompetent, genuinely malevolent in his selfishness.

[189] And this is something we know based on literally decades of him being in the public eye.

[190] He's not a public servant in any normal sense of that term.

[191] And he couldn't possibly give an honest or sane answer to the question you asked me about empathy and reason and like how should we, you know, what should guide us.

[192] I genuinely think he is missing some necessary moral and psychological tools, right?

[193] And this is, I can feel compassion for him as a human being because I think having those things is incredibly important and genuinely loving other people is incredibly important and knowing what all that's about is, that's really the good stuff in life.

[194] And I think he's missing a lot of that.

[195] But I think we don't want to promote people to the highest positions of power in our society who are far outliers in pathological terms, right?

[196] We want them to be far outliers, in the best case, in wisdom and compassion and some of the topics you've brought up.

[197] I mean, we want someone to be deeply informed.

[198] We want someone to be unusually curious, unusually alert to how they may be wrong or getting things wrong consequentially.

[199] He's none of those things.

[200] And insofar as we're going to get normal mediocrities in that role, which I think is often the best we could expect, let's get normal mediocrities in that role, not once-in-a-generation narcissists and frauds.

[201] I mean, it's like, let's just take honesty as a single variable, right?

[202] I think you want, yes, it's possible that most politicians lie, at least some of the time.

[203] I don't think that's a good thing.

[204] I think people should be generally honest, even to a fault.

[205] Yes, there are certain circumstances where lying, I think, is necessary.

[206] It's kind of on a continuum of self-defense and violence.

[207] So it's like if you're going to, you know, if the Nazis come to your door and ask you if you've got Anne Frank in the attic, I think it's okay to lie to them.

[208] But, you know, Trump, arguably there's never been a person that anyone could name in human history who's lied with that kind of velocity.

[209] I mean, it's just, he was just a blizzard of lies, great and small, you know, both pointless and effective.

[210] But it just says something fairly alarming about our society that a person of that character got promoted.

[212] And so, yes, I have compassion and concern for half of the society who didn't see it that way, and that's going to sound elitist and smug or something for anyone who's on that side listening to me. But it's genuine.

[213] I mean, I understand that, like, I barely have the, I'm like one of the luckiest people in the world, and I barely have the bandwidth to pay attention to half the things I should pay attention to in order to have an opinion about half the things we're going to talk about, right?

[214] So how much less bandwidth does somebody have who's working two jobs, or, you know, a single mom who's, you know, raising multiple kids, you know, even a single kid?

[215] It's just, it's unimaginable to me that people have the bandwidth to really track this stuff.

[216] And so then they jump on social media and they get inundated by misinformation, and they see what their favorite influencer just said, and now they're worried about vaccines. It's just, we're living in an environment where the information space has become so corrupted, and we've built machines to further corrupt it. I mean, we've built a business model for the internet that further corrupts it.

[217] So it is just chaos in informational terms, and I don't fault people for being confused and impatient and at their wits' end. And yes, Trump was an enormous fuck-you to the establishment, and that was understandable for many reasons.

[218] To me, Sam Harris, the great Sam Harris, is somebody I've looked up to for a long time as a beacon, a voice of reason. And there's this meme on the internet, and I would love you to steelman the case for it and against, that Trump broke Sam Harris's brain.

[219] That there is something disproportionate to the actual impact that Trump had on our society.

[220] He had an impact on the ability of balanced, calm, rational minds to see the world clearly, to think clearly, you being one of the beacons of that.

[221] Is there a degree to which he broke your brain?

[222] Otherwise known as Trump derangement syndrome, a medical condition.

[223] Yeah, I mean, I think Trump derangement syndrome is a very clever meme because it just throws the problem back on the person who's criticizing Trump.

[224] But in truth, the true Trump derangement syndrome was not to have seen how dangerous and divisive it would be to promote someone like Trump to that position of power.

And in the final moment, not to see how untenable it was to still support someone, a sitting president, who was not committing to a peaceful transfer of power.

[226] I mean, if that wasn't a bright line for you, you have been deranged by something, because that was, you know, one minute to midnight for our democracy, as far as I'm concerned.

[227] And I think it really was but for the integrity of a few people that we didn't suffer some real constitutional crisis and real emergency after January 6th.

[228] I mean, if Mike Pence had caved in and decided not to certify the election, right, you can literally count on two hands the number of people who held things together at that moment.

[229] And so it wasn't for want of trying on Trump's part that we didn't succumb to some, you know, real, truly uncharted catastrophe with our democracy.

[230] So the fact that that didn't happen is not a sign that those of us who were worried that it was so close to happening were exaggerating the problem.

[231] I mean, it's like, you know, you almost got run over by a car, but you didn't.

[232] And so, you know, the fact that you're adrenalized and you're thinking, you know, boy, that was dangerous.

[233] I probably shouldn't, you know, wander in the middle of the street with my eyes closed.

[234] You weren't wrong to feel that you really had a problem, right?

[235] And came very close to something truly terrible.

[236] So I think that's where we were, and I think we shouldn't do that again, right?

[237] So the fact that he's still, he's coming back around as potentially a viable candidate.

[238] You know, I'm not spending much time thinking about it.

[239] frankly, because it's, you know, I'm waiting for the moment where it requires some thought.

[240] I mean, it did, it took up, I don't know how many podcasts I devoted to the topic.

[241] It wasn't that many in the end, you know, against the number of podcasts I devoted to other topics.

[243] But there are people who look at Trump and just find him funny, entertaining, not especially threatening.

[244] It's like, you know, it's just good fun to see somebody who's just not taking anything seriously.

[245] And it's just, just putting a, you know, a stick in the wheel of business as usual again and again and again and again.

[246] And they don't really see anything much at stake, right?

[247] It doesn't really matter if we don't support NATO.

[248] It doesn't really matter if he says he trusts Putin more than our intelligence services.

[249] I mean, none of this matters. It doesn't matter if he's, on the one hand, saying that he loves the leader of North Korea and, on the other, threatening to, you know, bomb them back to the Stone Age, right, on Twitter. It can all be taken in the spirit of kind of reality television.

[250] Like, this is just, this is the part of the movie that's just fun to watch, right?

[251] And I understand that.

[252] I can even inhabit that space for a few minutes at a time.

[253] But there's a deeper concern that we're in the process of entertaining ourselves to death, right?

[254] That we're just not taking things seriously.

[255] And this is a problem I've had with several other people we might name who just, who just appear to me to be goofing around at scale.

[256] And they lack a kind of moral seriousness.

[257] I mean, they're touching big problems where lives hang in the balance, but they're just fucking around.

[258] And I think they're really important problems that we have to get our head straight around.

[259] And we need, you know, it's not to say that institutions don't become corrupt.

[260] I think they do.

[261] And I think, and I'm quite worried that, you know, both about the loss of trust in our institutions and the fact that trust has eroded for good reason, right, that they have become less trustworthy.

[262] I think, you know, they've become infected by political ideologies that are not truth-tracking.

[263] I mean, I worry about all of that.

[264] But I just think we need institutions.

[265] We need to rebuild them.

[266] We need experts who are real experts.

[267] We need to value expertise over, you know, amateurish speculation and conspiracy thinking and just, you know, and bullshit.

[268] What kind of amateur speculation are we doing on this very podcast?

[269] I'm usually alert to the moments where I'm just guessing or where I actually feel like I'm talking from within my wheelhouse.

[270] I try to telegraph that a fair amount with people.

[271] So yeah, I mean, but it's not, it's different.

[272] Like, I mean, you can invite someone onto your podcast who's an expert about something that you're not an expert about.

[273] And then, in the process of getting more informed yourself, your audience is getting more informed.

[274] So you're asking smart questions.

[275] And you might be pushing back at the margins, but you know that when push comes to shove on that topic, you really don't have a basis to have a strong opinion.

[276] And if you were going to form a strong opinion that ran counter to the expert you have in front of you, it's going to be by deference to some other expert who you've brought in or who you've heard about or whose work you've read or whatever.

[277] But there's a paradox to how we value authority in science that most people don't understand.

[278] And I think we should at some point unravel that because it's the basis for a lot of public confusion.

[279] And frankly, it's the basis for a lot of criticism I've received on these topics, where, you know, people think that I'm against free speech, or I'm an establishment shill, or that I'm just a credentialist.

[280] That I just think people with PhDs from Ivy League universities should, you know, run everything.

[281] It's not true, but there's a ton of confusion.

[282] There's a lot to cut through to get to daylight there because people are very confused about how we value authority in the service of rationality generally.

[283] You've talked about it, but it's just interesting, the intensity of feeling you have.

[284] You've had this famous phrase about Hunter Biden and children in the basement.

[285] Can you just revisit this case?

[286] So let me give another perspective on the situation of January 6th and Trump in general.

[287] It's possible that January 6th and things of that nature revealed that our democracy is actually pretty fragile.

[288] And that Trump is not an ultra-competent malevolent figure, but is simply a jokester.

[289] And he just, by creating the chaos, revealed that it's all pretty fragile. Because you're a student of history, and there's a lot of people, like Vladimir Lenin, Hitler, who were exceptionally competent at controlling power, at being executives and taking that power, controlling the generals, controlling all the figures involved, and certainly not tweeting, but working in the shadows behind the scenes to gain power.

[290] And they did so extremely competently, and that is how they were able to gain power.

[291] The pushback with Trump: he was doing none of that.

[293] He was creating what he's very good at, creating drama, sometimes for humor's sake, sometimes for drama's sake, and simply revealed that our democracy is fragile.

[294] And so he's not this once-in-a-generation horrible figure.

[295] Once-in-a-generation narcissist.

[296] No, I don't think he's a truly scary, sinister, you know, Putin-like or, you know, Hitler-like figure.

[297] Not at all.

[298] I mean, he's not ideological.

[299] He doesn't care about anything beyond himself.

[300] So it's not, no, no, he's much less scary than any really scary, you know, totalitarian, right?

[301] I mean, and he's...

[302] He's more Brave New World than 1984?

[303] This is what, you know, Eric Weinstein never stops

[304] badgering me about, but, you know, he's still wrong, Eric.

[305] You know, my analogy for Trump was that he's an evil Chauncey Gardiner.

[306] I don't know if you remember the book or the film, Being There, with Peter Sellers.

[307] But, you know, Peter Sellers is this gardener who really doesn't know anything, but he gets recognized as this wise man and he gets promoted to immense power in Washington because he's speaking in a semblance of wisdom. He's got these very simple aphorisms, or what seem to be aphorisms. He's just talking; all he cares about is gardening.

[308] He's just talking about his garden all the time, but, you know, he'll say something like, yeah, you know, in the spring the new shoots will bloom, and people read into that some kind of genius political insight, and so he gets promoted. And so that's the joke of the film.

[309] For me, Trump has always been someone like an evil Chauncey Gardiner.

[310] I mean, it's not to say he's totally... and yes, he has a certain kind of genius.

[311] He's got a genius for creating a spectacle around himself, right?

[312] He's got a genius for getting the eye of the media always coming back to him.

[313] But it's a kind of, you know, self-promotion that only works if you actually are truly shameless and don't care about having a reputation for anything that I or you would want to have a reputation for, right?

[314] It's like it's pure, the pure pornography of attention, right?

[315] And he just wants more of it.

[316] I think the truly depressing and genuinely scary thing was that we have a country, at least half of the country, given how broken our society is in many ways, that didn't see anything wrong with

[317] bringing someone who obviously doesn't know what he should know to be president, and who's obviously not a good person, right, obviously doesn't care about people, can't even pretend to care about people, really, in a credible way.

[318] And so, I mean, if there's a silver lining to this, it's along the lines you just sketched: it shows us how vulnerable our system is to a truly brilliant and sinister figure, right?

[319] I mean, I think we really dodged a bullet.

[320] Yeah, someone far more competent and conniving and ideological could have exploited our system in a way that Trump didn't.

[321] And that's, yeah, so if we plug those holes eventually, that would be a good thing, and he would have done a good thing for our society, right?

[322] I mean, one of the things we realized, and I think nobody knew, I mean, I certainly didn't know it and I didn't hear anyone talk about it, is how much our system relies on norms rather than laws.

[323] Yeah, civility almost.

[324] Yeah, it's just like it's quite possible that he never did anything illegal.

[325] You know, truly illegal.

[326] I mean, I think he probably did a few illegal things, but like illegal such that he really should be thrown in jail for it.

[327] You know, at least that remains to be seen.

[328] So all of the chaos, all of the, you know, diminishment of our stature in the world, all of just the opportunity costs of spending years focused on nonsense, all that was just norm violations.

[329] All that was just, that was just all a matter of not saying the thing you should say.

[330] But that doesn't mean they're insignificant, right?

[331] It's not illegal for a sitting president to say, no, I'm not going to commit to a peaceful transfer of power, right?

[332] We'll wait and see whether I win.

[333] If I win, the election was valid.

[334] If I lose, it was fraudulent, right?

[335] But aren't those humorous perturbations to our system of civility, such that we now know what the limits are, and now we start to think about that and have these kinds of discussions?

[336] But that wasn't a humorous perturbation because he did everything he could, granted he wasn't very competent, but he did everything he could to try to steal the election.

[337] I mean, the irony is he claimed to have an election stolen from him all the while doing everything he could to steal it, declaring it fraudulent in advance, trying to get the votes to not be counted as the evening wore on, knowing that they were going to be disproportionately Democrat votes.

[338] because of the position he took on mail-in ballots.

[339] I mean, all of it was fairly calculated.

[340] The whole circus of the clown car that crashed into four seasons landscaping, right?

[341] And you got Rudy Giuliani with his hair dye, and you got Sidney Powell and all these grossly incompetent people lying as freely as they could breathe about election fraud, right?

[342] And all of these things are getting thrown out

[343] by, you know, Republican, largely Republican election officials and Republican judges. It wasn't for want of trying that he didn't maintain his power in this country.

[344] He really tried to steal the presidency.

[345] He just was not competent, and the people around him weren't competent.

[346] So that's a good thing, and it's worth not letting that happen again.

[347] But he wasn't competent, so he didn't do everything he could.

[348] Well, no, he did everything he could.

[349] He didn't do everything that could have been done by someone more competent.

[350] Right.

[351] But with the tools you have as a president, you can do a lot of things.

[352] You can declare emergencies, especially during COVID.

[353] You could postpone the election.

[354] You can create military conflict that gives, you know, any kind of reason to postpone the election.

[355] There's a lot of ways.

[356] But he tried to do things, and he would have had to do those things through other people, and there were people who refused to do those things.

[357] There were people who said they would quit.

[358] They would quit publicly.

[359] Right.

[360] I mean, again, there are multiple books written about the last hours of this presidency.

[361] And the details are shocking in what he tried to do and tried to get others to do.

[362] And it's awful, right?

[363] I mean, it's just awful that we were that close to something, to a true unraveling of our political process.

[364] I mean, it's the only time in our lifetime that anything like this has happened.

[365] And it's deeply embarrassing, right, on the world stage.

[366] It's just like we looked like a banana republic there for a while.

[367] And we're the lone superpower.

[368] It's not good.

[369] And so we shouldn't, like... there were the people who thought, well, we just need to shake things up.

[370] And this is a great, great way to shake things up.

[371] And having people, you know, storm our Capitol and, you know, smear shit on the walls.

[372] That's just more shaking things up, right?

[373] It's all just for the lulz.

[374] There's a nihilism and cynicism to all of that, which, again, in certain people is understandable.

[375] You know, frankly, it's not understandable if you've got a billion dollars and you have a compound in Menlo Park or wherever.

[376] It's like there are people who are cheerleading this stuff, who shouldn't be cheerleading this stuff and who know that they can get on their Gulfstream and fly to their compound in New Zealand if everything goes to shit, right?

[377] So there's a cynicism to all of that that I think we should be.

[378] deeply critical of.

[379] What I'm trying to understand and analyze is not the behavior of this particular human being, but the effect it had, in part, on the division between people.

[380] And to me, the meme of Sam Harris's brain being broken by Trump represents... you're like the person I would look to to bridge the division.

[381] Well, I don't think there is anything profitable to be said to someone who's truly captivated by the personality cult of Trumpism, right?

[382] Like, there's no conversation I'm going to have with Candace Owens, say, about Trump that's going to converge on something reasonable, right?

[383] You don't think so?

[384] No, I mean, I haven't tried with Candace, but I've tried with, you know, many people who are in that particular orbit.

[385] I mean, I've had conversations with people who won't admit that there's anything wrong with Trump, anything.

[386] So I'd like to push for empathy versus reason.

[387] Because when you operate in the space of reason, yes.

[388] But I think there's a lot of power in you, Sam Harris, showing that you're willing to see the good qualities of Trump, publicly showing that.

[389] I think that's the way to win over.

[390] But he has so few of them.

[391] He has fewer good qualities than virtually anyone I can name, right?

[392] So he's funny.

[393] I'll grant you that he's funny.

[394] He's a good entertainer.

[395] There's others. Look at just the policies and actual impacts he had.

[396] I've admitted that.

[397] No, no. So I've admitted that many of his policies I agree with, many, many of his policies.

[398] I mean, so probably more often than not, at least on balance, I agreed.

[399] So I agreed with his policy that we should take China seriously

[400] as an adversary, right?

[401] And I think, I mean, again, you have to, there's a lot of fine print to a lot of this because the way he talks about these things and many of his motives that are obvious are things that I don't support, but take immigration.

[402] I think there's, it's obvious that we should have control of our borders, right?

[403] Like, I don't see the argument for not having control of our borders.

[404] We should let in who we want to let in, and we should keep out who we want to keep out, and we should have a sane immigration policy.

[405] So I didn't necessarily think it was a priority to build the wall, but I never criticized the impulse to build the wall, because if, you know, tens of thousands, hundreds of thousands of people are coming across that border and we are not in a position to know who's coming, that seems untenable to me. And I can recognize that many people in our society are, on balance, the victims of immigration.

[406] And there is, in many cases, a zero-sum contest between the interests of actual citizens and the interests of immigrants, right?

[407] So I think we should have control of our borders.

[408] We should have a sane and compassionate immigration policy.

[409] We should let in refugees, right?

[410] So, you know, Trump on refugees was terrible.

[411] But, no, like, I would say 80% of the policy concerns people celebrated in him are concerns that I either share entirely or certainly sympathize with.

[412] So like that's not the issue.

[413] The issue is a threat to democracy in some fundamental way.

[414] The issue is largely what you said it was.

[415] It's not so much the person.

[416] It's the effect on everything he

[417] touches, right?

[418] He has this superpower of deranging and destabilizing almost everything he touches, and sullying and compromising the integrity of almost anyone who comes into his orbit.

[419] I mean, so you looked at these people who served, you know, as chief of staff or in various cabinet positions; people had real reputations, you know, for probity and level-headedness, whether you share their politics or not.

[420] I mean, these were real people.

[421] These were not, you know, some of them were goofballs, but, you know, many people who just got totally trashed by proximity to him and then trashed by him when they finally parted company with him.

[422] Yeah, I mean, people just bent over backwards to accommodate his norm violations, and it was bad for them, and it was bad for our system.

[423] But none of that discounts the fact that we have a system that really needs a proper house cleaning.

[424] Yes, there are bad incentives and entrenched interests. And I'm not a fan of the concept of the deep state, because it, you know, has been so propagandized, but yes, there's something like that, you know, that is not flexible enough to respond intelligently to the needs of the moment, right?

[425] So there's a lot of rethinking of government and of institutions in general that I think, you know, we should do, but we need smart, well-informed, well-intentioned people to do that job.

[426] And the well-intentioned part is hugely important, right?

[427] It's just, give me someone who is not the most selfish person anyone has ever heard about in their lifetime, right?

[428] And what we got with Trump was that: like, literally the most selfish person I think anyone could name.

[429] I mean, and again, there's so much known about this man. That's the thing.

[430] It was like it predates his presidency.

[431] We knew this guy 30 years ago.

[432] And this is why, to come back to those inflammatory comments about Hunter Biden's laptop.

[433] The reason why I can say with confidence that I don't care what was on his laptop, and that includes any evidence of corruption on the part of his father, is this.

[434] Now, there's been precious little of that that's actually emerged.

[435] So it's like there is no, as far as I can tell, there's not a big story associated with that laptop as much as people bang on about a few emails.

[436] But even if there were just obvious corruption, like Joe Biden was at this meeting and he took, you know, this amount of money from this shady guy for bad reasons, right?

[437] Given how visible the lives of these two men have been, right, given how much we know about Joe Biden and how much we know about Donald Trump and how they have lived in public for almost as long as I've been alive, both of them, the scale of corruption can't possibly balance out between the two of them, right?

[438] If you show me that Joe Biden has this secret life where he's driving a Bugatti and he's living like Andrew Tate, right, and he's doing all these things I didn't know about, okay, then I'm going to start getting a sense that, all right, maybe this guy is way more corrupt than I realize.

[439] Maybe there is some deal in Ukraine or with China that is just like, this guy is not who he seems. He's not the public servant he's been pretending to be.

[440] He's been on the take for decades and decades, and he's just, he's as dirty as can be.

[441] He's all mobbed up and it's a nightmare.

[442] And he can't be trusted, right?

[443] That's possible if you show me that his life is not at all what it seems.

[444] But on the assumption that having looked at this guy for literally decades, right, and knowing that every journalist has looked at him for decades, just how many affairs is he having, just how much?

[445] How many drugs is he doing?

[446] How many houses does he have?

[447] Where, you know, what is, what are the obvious conflicts of interest, you know?

[448] You hold that against what we know about Trump, right?

[449] And I mean, the litany of indiscretions you can put on Trump's side that testify to his personal corruption, that testify to the fact that he has no ethical compass, there's simply no comparison, right?

[450] So that's why I don't care about what's on the laptop.

[451] And now, if you tell me Trump is no longer running for president in 2024, and we can put Trumpism behind us, and now you're saying, listen, there's a lot of stuff on that laptop that makes Joe Biden look like a total asshole, okay, I'm all ears, right?

[452] I mean, in 2020, it was a forced choice between a sitting president who wouldn't commit to a peaceful transfer of power and a guy who's obviously too old to be president, who has a crack-addicted son who lost his laptop. And I just knew that I was going to take Biden in spite of whatever litany of horrors was going to come tumbling out of that laptop. And that might involve, sort of... So the actual quote is, Hunter Biden literally could have had the corpses of children in the basement. There's a dark humor to it, right? Which, I think, you speak to. I would not have cared. There's nothing... It's Hunter Biden, it's not Joe Biden. Whatever the scope of Joe Biden's corruption is, it is infinitesimal compared to the corruption

[453] We know Trump was involved in.

[454] It's like a firefly to the sun is what you're speaking to.

[455] But let me make the case that you're really focused on the surface stuff, that it's possible to have corruption that masquerades in the thing we mentioned, which is civility.

[456] You can spend hundreds of billions of dollars, or trillions of dollars, on war in the Middle East, for example, something that you've changed your mind on

[457] in terms of the negative impact that it has on the world.

[458] And that, you know, with the military-industrial complex, everybody's very nice, everybody's very civil, just very upfront.

[459] Here's how we're spending the money.

[460] Yeah, sometimes money somehow disappears in different places, but that's the way, you know, war is complicated.

[461] And it's everyone is very polite.

[462] There's no Coke and strippers or whatever is on the laptop.

[463] It's very nice and polite.

[464] In the meanwhile, hundreds of thousands of civilians die.

[465] A lot of hate, just an incredible amount of hate, is created because people lose their family members, all that kind of stuff, but there's no strippers and coke on a laptop.

[466] Yeah, but it's not just superficial.

[467] It is when someone only wants wealth and power and fame, that is their objective function, right?

[468] They're like a robot that is calibrated just to those variables, right?

[469] And they don't care about the risks we run on any other front.

[470] They don't care about environmental risk, pandemic risk, nuclear proliferation risk, none of it, right?

[471] They're just tracking fame and money and whatever can personally redound to their self-interest along those lines.

[472] And they're not informed about the other risks we're running, really.

[473] I mean, in Trump, you had a president who was repeatedly asking his generals, why couldn't we use our nuclear weapons?

[474] Why can't we have more of them?

[475] Why do I have fewer nuclear weapons than JFK?

[476] Right?

[477] As though that were a sign of anything other than progress, right?

[478] And this is the guy who's got the button, right?

[479] I mean, somebody's following him around with a bag waiting to take his order to launch, right?

[480] That is just a risk we should never run.

[481] One thing Trump has going for him, I think, is that he's, he doesn't drink or do drugs, right?

[482] Although, you know, people allege that he does speed, but, you know, let's take him at his word.

[483] He's not deranging himself with pharmaceuticals, at least, apart from Diet Coke.

[484] There's nothing wrong, just for the record, let me push back on that.

[486] There's nothing wrong with that.

[487] I mean, I consume a very large amount.

[488] I occasionally have some myself.

[489] There's no medical, no scientific evidence that I've observed of the negatives. You know, all those studies about aspartame and all of that... no, I don't know.

[490] I like, I hope, I hope you're right.

[491] Um, yeah, I mean, everything you said about the military industrial complex is true, right?

[492] And it's been, we've been worrying about that on both sides of the aisle for a very long time.

[493] I mean, that's just, you know, that phrase came from Eisenhower.

[494] Um, it's, uh, I mean, so much of what ails us is a story of bad incentives, right?

[495] And bad incentives are so powerful that they corrupt even good people, right?

[496] How much more do they corrupt bad people, right?

[497] Like, so it's like, at minimum you want reasonably good people, at least non-pathological people, in the system trying to navigate against the grain of bad incentives.

[498] And better still, all of us can get together and try to diagnose those incentives and change them, right?

[499] And we will really succeed when we have a system of incentives where the good incentives are so strong that even bad people are effortlessly behaving as though they're good people, because they're so successfully incentivized to behave that way, right?

[500] That's, you know, so it's almost the inversion of our current situation.

[501] So yes, and you say I changed my mind about the war, not quite.

[502] I mean, I was never a supporter of the war in Iraq.

[503] I was always worried that it was a distraction from the war in Afghanistan.

[504] I was a supporter of the war in Afghanistan.

[505] And I will admit in hindsight, that looks like, you know, at best a highly ambiguous and painful exercise, you know, more likely a fool's errand, right?

[506] It's like that, you know, it did not turn out well.

[507] It's, it wasn't for want of trying.

[508] I don't, you know, I have not done a deep dive on, on all of the failures there.

[509] And maybe all of these failures are failures in principle.

[510] And maybe it's just, maybe that's not the kind of thing that can be done

[511] well by anybody, whatever our intentions.

[512] But yeah, the move to Iraq always seemed questionable to me. And we knew the problem, the immediate problem at that moment, you know, Al-Qaeda was in Afghanistan and, you know, then bouncing to Pakistan.

[513] Anyway, you know, so, yes, but my sense of the possibility of nation-building, my sense of, you know, insofar as the neocon spirit of, you know, responsibility and idealism, that, you know, America was the kind of nation that should be functioning in this way as the world's cop, and we have to get in there and untangle some of these knots by force rather often, because, you know, if we don't do it over there, we're going to have to do it over here kind of thing.

[514] Yeah, some of that has definitely changed for me in my thinking.

[515] There are obviously cultural reasons why it failed in Afghanistan, and if you can't change the culture, you're not going to force a change at gunpoint in the culture.

[516] It certainly seems that that's not going to happen.

[517] And it took us, you know, over 20 years, apparently, to realize that.

[518] That's one of the things you realize with the war: there's not going to be a strong signal that things are not working.

[519] You can just keep pouring money into a thing, a military effort.

[520] Well, also, there are signs of it working, too.

[521] You have all the stories of girls now going to school, right?

[522] You know, the girls are getting battery acid thrown in their faces by religious maniacs, and then we come in there, and we stop that, and now girls are getting educated, and that's all good, and our intentions are good there.

[523] And, I mean, we're on the right side of history there.

[524] Girls should be going to school.

[525] You know, Malala Yousafzai should have the Nobel Prize, and she shouldn't have been shot in the face by the Taliban, right?

[526] We know what the right answers are there.

[527] The question is, what do you do when there are enough, in this particular case, religious maniacs, who are willing to die and let their children die in defense of crazy ideas and moral norms that belong in the seventh century?

[528] And it's a problem we couldn't solve, and we couldn't solve it even though we spent trillions of dollars to solve it.

[529] This reminded me of the thing that you and Jack Dorsey jokingly had for a while, the discussion about banning Donald Trump from Twitter. But does any of it bother you now that the Twitter Files came out? I mean, it has to do with sort of the Hunter Biden laptop story. Does it bother you that there could be a collection of people that make decisions about who to ban or not?

[530] And that that could be susceptible to bias and to ideological influence.

[531] Well, I think it always will be, or in the absence of perfect AI, it always will be.

[532] And this becomes relevant with AI as well.

[533] Yeah.

[534] Because there's some censorship on AI happening.

[535] And it's an interesting question there as well.

[536] I don't think Twitter is important as people think it is, right?

[537] And I used to think it was more important when I was on it. And now that I'm off of it... I mean, first let me say it's just an unambiguously good thing, in my experience, to delete your Twitter account, right?

[538] It's like it is just, even the good parts of Twitter that I miss were bad in the aggregate, in the degree to which it was fragmenting my attention.

[539] The degree to which my life was getting doled out to me in periods between those moments where I checked Twitter, right, and had my attention diverted.

[540] And I was, you know, I was not a crazy Twitter addict.

[541] I mean, I was probably a pretty normal user.

[542] I mean, I was not someone who was tweeting multiple times a day or even every day, right?

[543] I mean, I think I probably averaged something like one tweet a day.

[545] But in reality, it was like, you know, there'd be four tweets one day and then I wouldn't tweet for, you know, the better part of a week.

[546] But I was looking a lot, because it was my newsfeed.

[547] I was just following, you know, 200 very smart people, and I just wanted to see what they were paying attention to.

[548] And they would recommend articles and I would read those articles.

[549] And then when I would read an article that I thought I should signal-boost,

[550] I would tweet.

[551] And so all of that seemed good.

[552] And, like, that's all separable from all of the odious bullshit that came back at me, largely in response to this Hunter Biden thing.

[553] But even the good stuff has a downside.

[554] And it comes back to just this point of your phone being this perpetual stimulus, which is intrinsically fragmenting of time and attention.

[555] And now my phone is much less of a presence in my life.

[556] And it's not that I don't check Slack or check email.

[557] I use it to work, but my sense of just what the world is and my sense of my place in the world, the sense of where I exist as a person, has changed a lot by deleting my Twitter account.

[558] I mean, and it's just... we all know this phenomenon.

[559] I mean, we say of someone, that person's too online, right?

[560] Like, what does it mean to be

[561] too online?

[562] And where do you draw that boundary?

[563] How do you know what constitutes being too online?

[564] Well, in some sense, just being, I think being on social media at all is to be too online.

[565] I mean, given what it does, given the kinds of information it signal-boosts.

[566] And given the impulse it kindles in each

[567] of us to reach out to our audience in specific moments and in specific ways, right?

[568] It's like, there are lots of moments now where I have an opinion about something, but there's nothing for me to do with that opinion, right?

[569] Like, there's no Twitter, right?

[570] So, like, there are lots of things that I would have tweeted in the last months that are not the kind of thing I'm going to do a podcast about.

[571] I'm not going to roll out 10 minutes on that topic on my podcast.

[572] I'm not going to take the time to really think about it.

[573] But had I been on Twitter, I would have reacted to this thing in the news or this thing that somebody did, right?

[574] What do you do with that thought now?

[575] I just let go of it.

[576] Like chocolate ice cream is the most delicious thing ever.

[577] Yeah, it's usually not that sort of thing.

[578] But it's just, but then you look at the kinds of problems people create for themselves.

[579] You look at the life deranging and reputation destroying things that people do.

[580] And I look at the analogous

[581] things that have happened to me. I mean, the things that have really bent my life around professionally over the past, you know, decade.

[582] So much of it is Twitter.

[583] I mean, honestly, in my case, almost 100% of it was Twitter.

[584] The controversies I would get into, the things I thought I would have to respond to in a podcast. Like, I would release a podcast on a certain topic, and I would see some blowback on Twitter.

[585] You know, it would give me the sense that there was some signal that I really had to respond to.

[586] Now that I'm off Twitter, I recognize that most of that was just totally specious, right?

[587] It was not something I had to respond to.

[588] But yet I would then do a cycle of podcasts responding to that thing, like taking my foot out of my mouth or taking someone else's foot out of my mouth.

[589] And it became this self-perpetuating cycle, which, I mean, you know, if you're having fun, great.

[590] I mean, if it's generative of useful information and engagement, professionally and psychologically, great.

[591] But and there, you know, there was some of that on Twitter.

[592] I mean, there were people who I've connected with because, you know, one of us DMed the other on Twitter, and it was hard to see how that was going to happen otherwise.

[593] But it was largely just a machine for manufacturing unnecessary controversy.

[594] Do you think it's possible to avoid the drug of that?

[595] So now that you've achieved the Zen state, is it possible for somebody like you to use it in a way that doesn't pull you into the whirlpool?

[596] And so anytime there's attacks, you just, I mean, that's how I tried to use it.

[597] Yeah, but it's not the way I wanted to use it.

[598] It's not the way it promises itself as a...

[599] You wanted to have debate.

[600] I wanted to actually communicate with people.

[601] Yeah.

[602] I wanted to hear from the person because, again, it's like being in Afghanistan, right?

[603] It's like there are the potted cases where it's obviously good, right?

[604] It's like in Afghanistan, the girl who's getting an education, that is just here.

[605] That's why we're here.

[606] That's obviously good.

[607] I have those moments on Twitter where it's okay, I'm hearing from a smart person who's detected an error I made in my podcast or in a book or they've just got some great idea about something that I should spend time on.

[608] and I would never have heard from this person in any other format, and now I'm actually in dialogue with them, and it's fantastic. That's the promise of it: to actually talk to people. And so I kept getting lured back into that. No, the sane or sanity-preserving way of using it is just as a marketing channel. You just put your stuff out there and you don't look at what's coming back at you. And, you know, I'm on other social media platforms that I don't even touch.

[609] I mean, my team posts stuff on Facebook and on Instagram.

[610] I never even see what's on there.

[611] So you don't think it's possible to see something and not let it affect your mind?

[612] No, that's definitely possible.

[613] But the question is, and I did that for vast stretches of time, right?

[614] But then the promise of the platform is dialogue and feedback, right?

[615] So, like, why am I, if I know that for whatever

[616] reason I'm going to see, like, 99-to-1 awful feedback, you know, bad-faith feedback, malicious feedback.

[617] Some of it's probably even bots, and I'm not even aware of who's a person who's a bot, right?

[618] But I'm just going to stare into this fun house mirror of acrimony and dishonesty that is going to, I mean, the reason why I got off is not because I couldn't recalibrate and find equanimity again with all the nastiness that was coming back at me, and not that I couldn't ignore it for vast stretches of time, but I could see that I kept coming back to it hoping that it would be something that I could use, a real tool for communication, and I was noticing that it was insidiously changing the way I felt about people, both people I know and people I don't know, right?

[619] Like people I, you know, mutual friends of ours, who are behaving in certain ways on Twitter, which just seemed insane to me. And then that became a signal I felt like I had to take into account somehow, right?

[620] You're seeing people at their worst, both friends and strangers.

[621] And I felt that, as much as I could sort of try to recalibrate for it, I felt that I was losing touch with what was real information, because people are performing, people are faking, people are not themselves, or you can see

[622] people at their worst.

[623] And so I felt like, all right, what was being advertised to me here, on not just a daily basis, you know, but an hourly basis, or sometimes in increments of multiple times an hour.

[624] I mean, I probably checked Twitter, you know, at minimum 10 times a day, and maybe I was checking it a hundred times a day on some days, right, where things were really active and I was really engaged with something. What was being delivered into my brain there was subtly false information about how dishonest and, you know, just generally unethical totally normal people are capable of being, right?

[625] It was like, it was a fun house mirror.

[626] I was seeing the most grotesque versions of people who I know, right?

[627] People who I know I could sit down at dinner with, and they would never behave this way.

[628] And yet they were coming at me on Twitter in, I mean, it was essentially turning ordinary people into sociopaths, right?

[629] Like, you know, there are analogies that many of us have made.

[630] It's like, one analogy is road rage, right?

[631] Like, people behave in the confines of a car in ways that they never would if they didn't have this metal box around them, you know, and moving at speed.

[632] And it's, you know, all that becomes quite hilarious and, you know, obviously dysfunctional when they actually have to stop at the light next to the person they just flipped off.

[633] And they didn't realize, they didn't understand, that the person coming out of that car next to them with a cauliflower ear is someone who they never would have, you know, rolled their eyes at in public, because they would have taken one look at this person and realized...

[634] This is the last person you want to fight with.

[635] That's one of the heartbreaking things, is to see people who I know, who I admire, who I know are friends, be everything from snarky to downright mean and derisive towards each other.

[636] It doesn't make any sense.

[637] Like this is the only place where I've seen people I really admire who have had a calm head about most things, like really be shitty to other people.

[638] It's probably the only place I've seen that.

[639] And I choose to maybe believe that that's not really them.

[640] There's something about the system.

[641] Like, if you go paintballing, you and Jordan Peterson, you're going to shoot your friends, yeah.

[642] Yeah, you're going to shoot your friends, but you kind of accept that that's kind of what you're doing in this little game that you're playing.

[643] But it's sometimes hard to remind yourself of that.

[644] And I think I was guilty of that, definitely.

[645] You know, I don't think I ever did anything that I really feel bad about, but, yeah, it was always pushing me to the edge of snideness somehow, and it's just not healthy. So the reason why I deleted my Twitter account in the end was that it was obviously making me a worse person. And so, yeah, is there some way to be on there where it's not making you a worse person?

[646] I'm sure there is, but given the nature of the platform and given what was coming back at me on it, the way to do that is just to basically use it as a one-way channel of communication.

[647] It's just marketing.

[648] It's like, here's what I'm paying attention to.

[649] Look at it if you want to and just push it out.

[650] And then you don't look at what's coming back at you.

[651] I put out a call for questions on Twitter.

[652] And actually, quite surprisingly, there's a lot of good ones.

[653] I mean, even if they're critical, they're being thoughtful, which is nice.

[654] I used it that way, too, and that was what kept me hooked.

[655] But then there's also TouchBalls69, who wrote a question.

[656] Ask what...

[657] I can't imagine.

[658] This is part of it.

[659] One way to solve this is, you know, we've got to get rid of anonymity for this.

[660] Let me ask the question.

[661] Ask Sam why he sucks was the question.

[662] Yeah, that's good.

[663] Well, one reason why I sucked was Twitter.

[664] That was, and I've since solved that problem.

[665] So, TouchBalls69?

[666] TouchBalls69 should be happy that I suck a little bit less now that I'm off Twitter.

[667] I don't have to hear from TouchBalls69 on the regular.

[668] The fact that you have to see that, it probably can have a negative effect, even just in moderation. Just to see that there is... Like, for me, the negative effect is

[669] slightly losing faith in the underlying kindness of humanity.

[670] Yeah, that was for me, yeah.

[671] You can also just reason your way out of it, saying that this is anonymity, and this is kind of fun, and it's kind of just the shit show of Twitter.

[672] It's okay, but it does mentally affect you a little bit.

[673] Like, I don't read too much into that kind of comment.

[674] It's like that's just, that's just trolling, and it's, you know, I get what's, I get, I understand the fun, the person is having on the other side of that it's like do you though i do well i do i don't i mean i don't behave that way but i do and for all i know that person could be you know 16 years old right so it's it's like it could be also an alt -a -con for elin i don't know well yeah yeah yeah um no i'm pretty sure Elon would just tweet that uh you know under his own name at this point um oh and you love each other.

[675] Okay, so, speaking of which, now that Elon has taken over Twitter, is there something that he could do to make this platform better?

[676] Twitter, and just social media in general. Because of the aggressive nature of the innovation that he's pushing, is there any way to make Twitter a pleasant place for Sam Harris?

[677] Maybe.

[678] Like in the next five years.

[679] I don't know.

[680] I think I'm agnostic as to whether or not he or anyone could make a social media platform that really was healthy.

[681] So you were just observing yourself, week by week, seeing the effect it has on your mind, and on how much you're actually learning and growing as a person, and it was negative.

[682] Yeah, and I've also seen the negativity in other people's lives.

[683] I mean, it's obviously, I mean, he's not going to admit it, but I think it's obviously negative for Elon, right?

[684] I mean, that was one of the things: you know, when I was looking into the fun house mirror, I was also seeing the fun house mirror on his side of Twitter, and it was just even more exaggerated.

[685] It's like, when I was asking myself why he was spending his time this way, I then reflected on why I was spending my time this way, to a lesser degree, right?

[686] And at lesser scale and at lesser risk, frankly, right?

[687] And it's not just Twitter.

[688] I mean, this is part an internet phenomenon.

[689] It's like the whole Hunter Biden mess that you explored.

[690] Explored.

[691] That was based on, I mean, I was on somebody's podcast, but that was based on a clip taken from that podcast, which was highly misleading as to the general shape of my remarks on that podcast.

[692] I had to then do my own podcast untangling all of that, and admitting that even in the full context, I was not speaking especially well and didn't say exactly what I thought in a way that would have been recognizable to anyone, you know, even someone not functioning by a spirit of charity.

[693] But the clip was quite distinct from the podcast itself.

[694] The reality is that we're living in an environment now where people are so lazy and their attention is so fragmented that they only have time for clips. 99% of people will see a clip and will assume there's no relevant context they need to understand what happened in that clip. And obviously the people who make those clips know that, and they're doing it quite maliciously. And in this case, the person who made that clip, and subsequent clips of other podcasts, was quite maliciously trying to engineer some reputational immolation for me. And it was being signal-boosted by Elon and other prominent people who can't take the time to watch anything other than a clip, even when it's their friend, or someone who's ostensibly their friend, in that clip, right?

[695] So it's a total failure, an understandable failure, of ethics: everyone is so short on time, and they're so fucking lazy, and we now have these contexts in which we react so quickly to things, right?

[696] Like, Twitter is inviting an instantaneous reaction to this clip that it's just too tempting to just say something and not know what you're even commenting on.

[697] And most of the people who saw that clip don't understand what I actually think about any of these issues.

[698] And the irony is, people are going to find clips from this conversation that are just as misleading, and they're going to export those, and then people are going to be dunking on those clips.

[699] And, you know, we're all living and dying by clips now, and it's dysfunctional.

[700] See, I think it's possible to create a platform.

[701] I think we will keep living on clips, but, you know, when I saw that clip of you talking about children and so on, just knowing that you have a sense of humor, I knew you just went to a dark place in terms of humor.

[702] So, like, I didn't even bother.

[703] And then I knew that the way clips work is that people will use it for virality's sake.

[704] But giving a person the benefit of the doubt, that's not even the right term.

[705] It's not like I was, it's really, like, interpreting it in the context of your past.

[706] The truth is, I even give Trump the benefit of the doubt when I see a clip of Trump.

[707] Because there are famous clips of Trump that are very misleading as to what he was saying in context.

[708] And I've been honest about that.

[709] Like the whole, you know, "good people on both sides" scandal around his remarks after Charlottesville.

[710] Like the clip that got exported and got promoted by everyone, you know, left of center, from Biden on down, you know, the New York Times, CNN: there's nobody that I'm aware of who has honestly

[711] apologized for what they did with that clip.

[712] He did not say what he seemed to be saying in that clip about the Nazis at Charlottesville, right?

[713] And I have always been very clear about that.

[714] So it's just, you know, even people who I think should be marginalized, people who should be defenestrated because they really are terrible people who are doing dangerous things for bad reasons, I think we should be honest about what they actually meant in context, right?

[715] And this goes for anyone else we might talk about, you know, where the case is much more confusing.

[716] But, yeah, and I'm sure we're going to get to AI, but, you know, the prospect of being able to manufacture clips with AI and deep fakes, where it's going to be hard for most people most of the time to even figure out whether they're in the presence of something real, you know, forget about being divorced from context.

[717] There was no context.

[718] I mean, that's a misinformation apocalypse that we're right on the cusp of, and, you know, it's terrifying.

[719] Well, it could just be a new world, like Alice going to Wonderland, where humor is the only thing we have and that will save us.

[720] Maybe in the end, Trump's approach to social media was the right one after all.

[721] Nothing is true and everything is absurd.

[722] But we can't live that way.

[723] People function on the basis of what they assume is true, right?

[724] They think...

[725] People have functioned.

[726] Well, to do anything.

[727] It's like, I mean, you have to know what you think is going to happen, or you have to at least give a probabilistic weighting over the future.

[728] Otherwise, you're going to be incapacitated.

[729] by it. People want certain things, and they have to have a rational plan to get those desires gratified. They don't want to die; they don't want their kids to die. You tell them that there's a comet hurtling toward Earth and they should go outside and look up, right? They're going to do it. And if it turns out it's misinformation, you know, it's going to matter, because it comes down to, like, what medicines do you give your children, right? Like, we're going to be manufacturing fake journal articles.

[730] I mean, I'm sure someone's using ChatGPT for this, you know, as we speak.

[731] And if it's not credible, if it's not persuasive now to most people, I mean, honestly, I'll be amazed if it's a year before

[732] We can actually create journal articles, completely fake, that it would take, you know, a PhD to debunk.

[733] And there are people who are celebrating this kind of, you know, coming cataclysm.

[734] But it's just the people who don't have anything to lose who are celebrating, or who are just so confused that they don't even know what's at stake.

[735] And then there are the people, the few people we could count on a few hands, who have managed to insulate themselves, or at least imagine they've insulated themselves, from the downside here enough that they're not implicated in the great unraveling we are witnessing or could witness.

[736] The shaking up of what is true.

[737] So actually that returns us to experts.

[738] Do you think experts can save us?

[739] Is there such a thing as expertise, and experts at something?

[740] How do you know if you've achieved it?

[741] I think it's important to acknowledge upfront that there's something paradoxical about how we relate to authority, especially within science.

[742] And I don't think that paradox is going away, and it doesn't have to be confusing.

[743] And it's not truly a paradox.

[744] It's just like there are different moments in time.

[745] So it is true to say that within science, or within rationality generally, I mean, whenever you're having a fact-based discussion about anything, it is true to say that the truth or falsity of a statement does not even slightly depend on the credentials of the person making the statement, right?

[746] So it doesn't matter if you're a Nobel laureate.

[747] You can be wrong, right?

[748] The last sentence you spoke could be total bullshit, right?

[749] And it's also possible for someone who's deeply uninformed to be right about something or to be right for the wrong reasons, right?

[750] Or someone just gets lucky. And there are middling cases where you have, like, a backyard astronomer who's got no credentials, but he just loves astronomy, and he's got a telescope, and he's spent a lot of time looking at the night sky, and he discovers a comet that no one else has seen, you know, not even the professional expert astronomers. And, my God, I think that happens less and less now, but some version of that keeps happening, and it may always keep happening in every area of expertise, right?

[751] So it's true that truth is orthogonal to the reputational concerns we have among apes who are talking about the truth.

[752] But it is also true that most of the time, real experts are much more reliable than frauds or people who are not experts, right?

[753] So, and expertise really is a thing, right?

[754] And, you know, when you're flying an airplane in a storm, you don't want just randos coming into the cockpit saying, listen, I've got a new idea about how we should tweak these controls, right?

[755] You want someone who's a trained pilot, and that training gave them something, right?

[756] It gave them a set of competences and intuitions, and they know what all those dials and switches do, right?

[757] And I don't, right?

[758] I shouldn't be flying that plane.

[759] And so when things really matter, you know, and putting this at 30,000 feet in a storm sharpens this up, we want real experts to be in charge, right?

[760] And we are at 30,000 feet a lot of the time on a lot of issues, right?

[761] And whether they're public health issues, whether it's a geopolitical emergency like Ukraine, I mean, climate change.

[762] I mean, just pick your topic.

[763] There are real problems, and the clock is rather often ticking, and their solutions are non-obvious, right?

[764] And so expertise is a thing, and deferring to experts much of the time makes a lot of sense. At minimum, it prevents, you know, spectacular errors of incompetence and just, you know, foolhardiness.

[765] But even in the case where you're talking about someone... I mean, people like ourselves, who are well educated, we're not the worst possible candidates for, you know, the Dunning-Kruger effect.

[766] When we're going into a new area where we're not experts, we're fairly alert to the possibility that it's not as simple as things seem at first, and that we don't know how our tools translate to this new area.

[767] We can be fairly circumspect, but also, because we're well educated and we're pretty quick studies, we can learn a lot of things pretty fast, and we can begin to play a language game that sounds fairly expert, right?

[768] And in that case, you know, the invitation to do your own research, when times are good, I view as an invitation to waste your time pointlessly.

[769] Now, the truth is times are not all that good, right?

[770] And we have the ongoing public display of failures of expertise.

[771] We have experts who are obviously corrupted by bad incentives.

[772] We've got experts who, you know, perversely won't admit they were wrong when they in fact, you know, are demonstrated to be wrong.

[773] We've got institutions that have been captured by political ideology that's not truth tracking.

[774] I mean, this whole woke encroachment into really every place, you know, whether it's universities or science journals or government, I mean, that has been genuinely deranging.

[775] So there's a lot going on where experts, and the very concept of expertise, have seemed to discredit themselves. But the reality is that when anything matters, when there's anything to know about anything, there is a massive difference most of the time between someone who has really done the work to understand that domain and someone who hasn't.

[776] And if I get sick or someone close to me gets sick, you know, I have a PhD in neuroscience, right?

[777] So I can read a medical journal article and understand a lot of it.

[778] Right?

[779] And I, you know, so I'm just fairly conversant with, you know, medical terminology.

[780] And I understand the methods.

[781] And I'm alert to the difference because, you know, in neuroscience I've spent hours and hours in journal clubs, you know, analyzing the difference between good and bad studies.

[782] I'm alert to the difference between good and bad studies in medical journals, right?

[783] And I understand that bad studies can get published and, you know, et cetera.

[784] and experiments can be poorly designed.

[785] I'm alert to all of those things, but when I get sick or when someone close to me gets sick, I don't pretend to be a doctor, right?

[786] I've got no clinical experience.

[787] I don't go down the rabbit hole on Google for days at a stretch trying to become a doctor, much less a specialist in the domain of problem that has been visited upon me or my family, right?

[788] So if someone close to me gets cancer, I don't pretend to be an oncologist.

[789] I don't go out and start reading in journals of oncology and try to really get up to speed as an oncologist, because it's a bad and very likely misleading use of my time, right?

[790] And if I had a lot of runway, if I decided, okay, it's really important for me to know everything

[791] I can.

[792] At this point, I know someone's going to get cancer.

[793] I may not go back to school and become an oncologist, but what I want to do is I want to know everything I can know about cancer, right?

[794] So I'm going to take the next four years and spend most of my time on cancer.

[795] Okay, I could do that, right?

[796] I still think that's a waste of my time.

[797] I still think at the end of, even at the end of those four years, I'm not going to be the best person to form intuitions about what to do in the face of the next cancer that I have to confront.

[798] I'm still going to want a better oncologist than I've become to tell me what he or she would do if they were in my shoes or in the shoes of my family member.

[799] I'm going to... you know, what I'm not advocating is a blind trust in authority.

[800] Like if you get cancer and you're talking to one oncologist and they're recommending some course of treatment, by all means get a second opinion.

[801] get a third opinion, right?

[802] But it matters that those opinions are coming from real experts and not from, you know, Robert Kennedy Jr., you know, who's telling you that, you know, you got it because you got a vaccine, right?

[803] It's like we're swimming in a sea of misinformation, where you've got people who are moving the opinions of millions of others who should not have an opinion on these

[804] topics.

[805] Like, there is no scenario in which you should be getting your opinion about vaccine safety, or climate change, or the war in Ukraine, or anything else that we might want to talk about, from Candace Owens, right?

[806] It's just like, she's not a relevant expert on any of those topics.

[807] And what's more, she doesn't seem to care, right?

[808] And she's living in a culture that has amplified that not-caring into a business model, an effective business model, right?

[809] And there's something very Trumpian about all that.

[810] Like, that's the problem is the culture.

[811] It's not these specific individuals.

[812] So the paradox here is that expertise is a real thing, and we defer to it a lot as a labor-saving device, and just based on the reality that it's very hard to be a polymath, right?

[813] And specialization is a thing, right?

[814] And so the people who specialize in a very narrow topic know more about that topic than the next guy, no matter how smart that guy or gal is, and those differences matter.

[815] But it's also true that when you're talking about facts, sometimes the best experts are wrong.

[816] The scientific consensus is wrong.

[817] You get a sea change in the thinking of a whole field because one person who's an outlier, for whatever reason, decides, okay, I'm going to prove this point, and they prove it, right?

[818] So, somebody like the doctor who believed that stomach ulcers were not due to stress but were due to H. pylori infections, right?

[819] So he just drank a vial of H. pylori bacteria, quickly got an ulcer, and convinced the field that at minimum H. pylori was involved in that process.

[820] Okay.

[821] So yes, everyone was wrong.

[822] That doesn't disprove the reality of expertise.

[823] It doesn't disprove the utility of relying on experts most of the time, especially in an emergency, especially when the clock is ticking, especially when you're in this particular cockpit and you only have one chance to land this plane, right?

[824] You want the real pilot at the controls.

[825] But there's just a few things to say.

[826] So, one, you mentioned this example with cancer and doing your own research.

[827] There are several things that are different about our particular time in history.

[828] One, doing your own research has become more and more effective, because the Internet made information

[829] a lot more accessible, so you can read a lot of different meta-analyses, you can read blog posts that describe to you exactly the flaws in the different papers that make up the meta-analysis.

[830] And you can read a lot of those blog posts that conflict with each other, and you can take that information in, and in a short amount of time you can start to make good-faith interpretations.

[831] For example, I don't know, I don't want to overstate things, but if you suffer from depression, for example, then you could go to an expert, a doctor that prescribes you some medication.

[832] But you could also challenge some of those ideas, seeing, like, what are the different medications, what are the different side effects, what are the different solutions to depression, all that kind of stuff.

[833] And I think depression is just a really difficult problem that's very, I don't want to, again, state incorrect things, but I think it's...

[834] There's a lot of variability of what depression really means.

[835] So being introspective about the type of depression you have and the different possible solutions, just doing your own research as a first step, before approaching a doctor or as you gather multiple opinions, could be very beneficial in that case.

[836] Now, that's depression.

[837] That's something that's been studied for a very long time. Now, with a new pandemic that's affecting everybody,

[838] it's, uh, you know, with the airplane analogy, it's like 9/11 or something: the new emergency just happened, and every expert in the world is publishing on it and talking about it. So doing your own research there could be exceptionally effective, and asking questions. And then there's a difference between experts, virologists... and it's actually a good question: who exactly is the expert in a pandemic?

[839] Yeah.

[840] But there's the actual experts doing the research and publishing stuff, and then there's the communicators of that expertise.

[841] And the question is whether the communicators are flawed to a degree where doing your own research is actually the more effective way to figure out policies and solutions.

[842] Because you're not competing with experts.

[843] You're competing with the communicators of expertise.

[844] That could be the WHO, the CDC, in the case of the pandemic, or politicians, or politically involved science figures like Anthony Fauci.

[845] There's a question there of the effectiveness of doing your own research in that context.

[846] And the competing forces there, incentives that you've mentioned, is you can become quite popular by being contrarian.

[847] By saying everybody's lying to you, all the authorities are lying to you, all the institutions are lying to you.

[848] So those are the waters we're all just swimming in.

[849] But I think doing your own research in that kind of context could be quite effective.

[850] Let me be clear.

[851] I'm not saying you shouldn't do any research, right?

[852] I'm not saying that you shouldn't be informed about an issue.

[853] I'm not saying you shouldn't read articles on whatever the topic is.

[854] And yes, if I got cancer or someone close to me got cancer, I probably would read more about cancer than I've read thus far about cancer.

[855] and I've read some.

[856] So I'm not making a virtue of ignorance and a blind obedience to authority.

[857] And again, I recognize that authorities can discredit themselves or they can be wrong.

[858] They can be wrong even when there's no discredit.

[859] There's a lot we don't understand about the nature of the world.

[860] But still this vast gulf between truly informed opinion and bullshit exists.

[861] It always exists.

[862] And conspiracy thinking is rather often, most of the time, a species of bullshit, but it's not always wrong, right?

[863] There are real conspiracies, and there really are just awful corruptions of, you know, born of bad incentives within our, you know, our scientific process, within institutions, and again, we mentioned a lot of these things in passing, but what woke political ideology did to scientific communication during the pandemic was awful, and it was really corrosive of public trust, especially on the right, for understandable reasons.

[864] It was just, it was crazy, some of the things that were being said, and still is.

[865] And these cases are all different.

[866] I mean, you take depression.

[867] We just don't know enough about depression for, you know, anyone to be that confident about anything, right?

[868] And there are many different modalities in which to interact with it as a problem, right?

[869] So, yes, pharmaceuticals have whatever promise they have, but there's certainly reason to be concerned that they don't work well for everybody; I mean, it's obvious they don't work well for everybody, but they do work for some people.

[870] But again, depression is a multifactorial problem, and there are different levels at which to influence it.

[871] And there are things like meditation, there are things like just life changes.

[872] And, you know, one of the first things about depression is that when you're depressed, all of the things that would be good for you to do are precisely the things you don't want to do.

[873] You don't have any energy to socialize.

[874] You don't want to get things done.

[875] You don't want to exercise.

[876] And all of those things, if you got them up and running, they do make you feel better in the aggregate.

[877] But the reality is that there are clinical level depressions that are so bad that it's just, we just don't have good tools for them.

[878] And it's not enough to tell someone that there's some life change,

[879] that there's something they can embrace that's going to be an obvious remedy for that.

[880] I mean, pandemics are obviously a complicated problem, but I would consider it much simpler than depression in terms of, you know, what's on the menu to be chosen among the various choices.

[881] Just less multifactorial.

[882] The logic by which you would make those choices, yeah.

[883] So it's like we have a virus, we have a new virus, it's some version of bad, you know, it's human transmissible, we're still catching up, we're catching up on every aspect of it. We don't know how it spreads.

[884] We don't know how effective masks are.

[885] Well, at a certain point, we knew it was respiratory, but, yeah, whether it's spread by fomites and, like, all that, we were confused about a lot of things, and we're still confused.

[886] It's been a moving target this whole time, and it's been changing this whole time, and our responses to it have been, you know, we ramped up the vaccines as quickly as we could, but, you know, too quickly for some, not quickly enough for others.

[887] We could have done human challenge trials and got them out more quickly with better data.

[888] And I think that's something we should probably look at in the future because to my eye, that would make ethical sense to do challenge trials.

[889] But, and so much of my concern about COVID, I mean, many people are confused about my concern about COVID.

[890] My concern about COVID has, for much of the time, not been narrowly focused on COVID itself and how dangerous I perceive COVID to be as an illness; it has been, for the longest time, even more a concern about our ability to respond to a truly scary pathogen next time.

[891] Like, outside those initial months, you know, give me the first six months to be quite worried about COVID and the unraveling of society.

[892] And a supply of toilet paper.

[893] You want to secure a steady supply of toilet paper.

[894] But beyond that initial period, when we had a sense of what we were dealing with, and we had every hope that the vaccines were actually going to work, and we knew we were getting those vaccines in short order, right?

[895] Beyond that, and we knew just how dangerous the illness was and how dangerous it wasn't, for years now, I've just been worrying about this as a failed dress rehearsal for something much worse, right?

[896] I think what we prove to ourselves at this moment in history is that we have built informational tools that we do not know how to use and we have made ourselves, we've basically enrolled all of human society into a psychological experiment that is deranging us and making it virtually impossible to solve coordination problems that we absolutely have to solve next time when things are worse.

[897] Do you understand who's at fault for the way this unraveled?

[898] The distrust in institutions and the institution of science that grew, seemingly exponentially, or got revealed through this process, who is at fault here?

[899] And what's to fix?

[900] So much blame to go around, but so much of it is not a matter of bad people conspiring to do bad things.

[901] It's a matter of incompetence and misaligned incentives and just ordinary, you know, just plain vanilla dysfunction.

[902] But my problem was that people like you, people like Brett Weinstein, people that I look to for reasonable, difficult conversations on difficult topics, lost their minds a little bit, became emotional and dogmatic in style of conversation, perhaps not in the depth of actual ideas, but in, you know, a tweet, something of that nature. And not about you, but it just feels like the pandemic made people really more emotional than before. And then Kimbal Musk responded, I think, something I think you probably would agree with, maybe not: I think it was the combo of Trump and the pandemic. Trump triggered the far left to be way more active than they could have been without him, and then the pandemic handed big-government nanny-state lefties a huge platform on a silver platter, I want to punch, and here we are.

[903] I would agree with some of that.

[904] I'm not sure how much to read into the nanny state concept, but.

[905] But, yes, it basically got people on the far left really activated, and then gave control to, I don't know if you'd say nanny state, but just control to government that, when executed poorly, has created a complete distrust in government.

[906] My fear is that there was going to be that complete distrust anyway, given the nature of the information space, given the level of conspiracy thinking, given the gaming of these tools by an anti-vax cult.

[907] I mean, there really is an anti-vax cult that just ramped up its energy during this moment.

[908] But it's a small one.

[909] It's not to say that every concern about vaccines is a species of, was born of, misinformation or born of this cult, but there is a cult that is just, you know, and the core of Trumpism is a cult.

[910] I mean, QAnon is a cult.

[911] And so there's a lot of lying and there's a lot of confusion.

[912] You know, there are, it's almost impossible to exaggerate how confused some people are and how fully their lives are organized around that confusion.

[913] I mean, there are people who think that the world's being run by pedophile cannibals and that, you know, Tom Hanks and Oprah Winfrey and Michelle Obama are among those cannibals.

[914] I mean, like, they're adjacent to the pure crazy, there's the semi-crazy, and adjacent to the semi-crazy, there's the grifting opportunist asshole.

[915] And the layers of bad faith are hard to fully diagnose, but the problem is all of this is getting signal boosted by an outrage machine that is preferentially spreading misinformation.

[916] It has a business model that guarantees it is preferentially sharing misinformation.

[917] Can I actually, just in a small tangent, how do you defend yourself against the claim that you're a pedophile cannibal?

[918] It's difficult.

[919] Here's the case I would make, because I don't think you can use reason.

[920] I think you have to use empathy.

[921] You have to understand.

[922] But, like, part of it, I mean, I find it very difficult to believe that anyone believes these things.

[923] I mean, I think that, and I'm sure there's some number of people who are just pretending to believe these things, because, again, this is sort of like the 4chanification of everything.

[924] It's just Pepe the Frog, right?

[925] Like, none of this is what it seems.

[926] They're not signaling an alliance with white supremacy or neo-Nazism, but they're not not doing it.

[927] Like, they just don't fucking care.

[928] It's just cynicism overflowing its banks, right?

[930] It's just fun to wind up the normies, right?

[931] Like, look at all the normies who don't understand that a green frog is just a green frog, even when it isn't just a green frog, right?

[932] It's like it's just gumming up everyone's cognitive bandwidth with bullshit, right?

[933] I get that that's fun if you're a teenager and you just want to vandalize our newsphere, but at a certain point, we have to recognize that real questions of human welfare are in play.

[934] Right?

[935] Like, there really are wars getting fought or not fought, and there's a pandemic raging, and there's medicine to take or not take.

[936] But, I mean, to come back to this issue of COVID, I don't think I got so out of balance around COVID.

[937] I think people are quite confused about what I was concerned about.

[938] I mean, like, yes, there was a period where I was crazy, because anyone who was taking it seriously was crazy, because they had no idea what was going on.

[940] And so it's like, yes, I was wiping down packages with alcohol wipes, right?

[941] Because people thought it was transmissible by touch, right?

[942] So when we realized that was no longer the case, I stopped doing that.

[943] But so there, again, it was a moving target.

[944] And a lot of things we did in hindsight around masking and school closures looked fairly dysfunctional, right?

[945] But I think the criticism that people would make about your talking about COVID, and maybe you can correct me, is that you were against skepticism of the safety and efficacy of the vaccine.

[946] So people who get nervous about the vaccine but don't fall into the usual anti-vax camp, of which I think there was a significant enough number,

[947] they're getting nervous.

[948] I mean, especially after the war in Afghanistan and Iraq, I too was nervous about anything where a lot of money could be made.

[949] And you just see how the people who are greedy, they come to the surface all of a sudden.

[950] And a lot of them that run institutions are actually really good human beings.

[951] I know a lot of them, but it's hard to know how those two combine together when there's hundreds of billions, trillions of dollars to be made.

[952] And so that skepticism, I guess the sense was that you weren't open enough to the skepticism.

[953] I understand that people have that sense.

[954] I'll tell you how I thought about it and think about it.

[955] One, again, it was a moving target.

[956] So there was a point in the timeline where it was totally rational to expect that the vaccines were working, that they were reasonably safe, and that COVID was reasonably dangerous, and that the tradeoff for basically everyone was that it was rational to get vaccinated, given the level of testing and how many people had been vaccinated before you, given what we were seeing with COVID, right?

[957] That that was a forced choice.

[958] You're eventually going to get COVID and the question is do you want to be vaccinated when you do, right?

[959] There was a period where that forced choice where it was just obviously, reasonable to get vaccinated, especially because there was every reason to expect that while it wasn't a perfectly sterilizing vaccine, it was going to knock down transmission a lot, and that matters.

[960] And so it wasn't just a personal choice.

[961] You were actually being a good citizen when you decided to run whatever risk you were going to run to get vaccinated because there are people in our society who actually can't get vaccinated.

[962] I know people who can't take any vaccines.

[963] They're so allergic to, I mean, they in their own person seem to justify all of the fears of the anti-vax cult.

[964] I mean, it's like they're the kind of person who Robert Kennedy Jr. can point to and say, see, vaccines will fucking kill you, right?

[965] Because of the experience, and we're still, I know people who have kids who fit that description, right?

[966] So we should all feel a civic responsibility to be vaccinated against egregiously awful and transmissible diseases for which we have relatively safe vaccines to keep those sorts of people safe.

[967] And there was a period of time when it was thought that the vaccine could stop transmission.

[968] Yes.

[969] And so again all of this has begun to shift.

[970] I don't think it has shifted as much as Brett Weinstein thinks it's shifted, but yes, there are safety concerns around the mRNA vaccines, especially for young men, right?

[971] As far as I know, that's the purview of actual heightened concern.

[972] But also, there's now a lot of natural immunity out there, a lot of, basically everyone who was going to get vaccinated has gotten vaccinated.

[973] The virus has evolved to the point in this context where it seems less dangerous.

[974] You know, again, I'm going more on the seemings than on research that I've done at this point, but I'm certainly less worried about getting COVID.

[975] I've had it once.

[976] I've been vaccinated.

[977] It's like, so you ask me now, how do I feel about getting the next booster?

[978] I don't know that I'm going to get the next booster, right?

[979] So I was somebody who was waiting in line at four in the morning, you know, hoping to get some overflow vaccine when it was first available.

[980] And that was, at that point, given what we knew, or given what I thought I knew, based on the best sources I could consult, and based on, you know, based on anecdotes that were too vivid to ignore, you know, both data and personal experience, it was totally rational for me to want to get that vaccine as soon as I could.

[981] And now, I think it's totally rational for me to do a different kind of cost-benefit analysis and wonder.

[982] listen, do I really need to get a booster, right?

[983] You know, like how many of these boosters am I going to get for the rest of my life, really?

[984] And how safe is the mRNA vaccine for a man of my age, right?

[985] And do I need to be worried about myocarditis for, you know?

[986] All of that is completely rational to talk about now.

[987] My concern is that at every point along the way, I was the wrong person and Brett Weinstein was the wrong person, and there's many other people I could add to this list, to have strong opinions about any of this stuff.

[988] I just disagree with that.

[989] I think, yes, in theory, I agree 100%, but I feel like experts failed at communicating.

[990] Not at doing...

[991] They did.

[992] And I just feel like you and Brett Weinstein actually have the tools with the internet, given the engine you have in your brain of thinking for months at a time deeply about the problems that face our world, that you actually have the tools to do pretty good thinking here.

[993] The problem I have with experts.

[994] But there would be deference to experts and pseudo-experts behind all of that.

[995] Well, the papers, you would stand on the shoulders of giants, but you can surf those shoulders better than the giants themselves.

[996] Yeah, but I knew we were going to disagree about that. Like, I saw his podcast where he brought on these experts who, many of them, had the right credentials.

[997] But for a variety of reasons, they didn't pass the smell test for me. One larger problem, and this goes back to the problem of how we rely on authority in science, is that you can always find a PhD or an MD to champion any crackpot idea.

[998] I mean, it is amazing, but you could find PhDs and MDs who would sit up there in front of Congress and say that they thought smoking was not addictive, you know, or that there was no direct link between smoking and lung cancer.

[999] You can always find those people.

[1000] And so, you know, some of the people Brett found were people who had obvious tells, to my eye.

[1001] I mean, and I saw them on, some of the same people were on Rogan's podcast, right?

[1002] And, um, it's hard, because if a person does have the right credentials, and they're not saying something floridly mistaken, and we're talking about something where there are genuine unknowns, right?

[1003] Like, how much do we know about the safety of these vaccines, right?

[1004] It's at that point, not a whole hell of a lot.

[1005] I mean, we have no long-term data on mRNA vaccines.

[1006] But to confidently say that millions of people are going to die because of these vaccines, and to confidently say that ivermectin is a panacea, right?

[1007] Ivermectin is the thing that prevents COVID, right?

[1008] There was no good reason to say either of those things at that moment.

[1010] And so, given that that's where Brett was, I felt like there was just nothing to debate.

[1011] We were both the wrong people to be getting into the weeds on this.

[1012] We're both going to defer to our chosen experts.

[1013] His experts looked like crackpots to me, or at least the ones who were most vociferous on those edgiest points.

[1014] And your experts seem like, what is the term, mass hysteria?

[1015] I forgot the term.

[1016] Well, no, but it's

[1017] like with, you know, climate science.

[1018] I mean, this poll, it's received as a canard in half of our society now, but the claim that 97% of climate scientists agree that human-caused climate change is a thing, right?

[1019] So do you go with the 97% most of the time, or do you go with the 3% most of the time?

[1020] It's obvious you go with the 97% most of the time for anything that matters.

[1021] It's not to say that the 3% are always wrong.

[1022] Again, things get overturned.

[1023] And yes, as you say, and I've spent much more time worrying about this on my podcast than I've spent worrying about COVID, our institutions have lost trust for good reason, right?

[1024] And it's an open question whether we can actually get things done with this level of transparency and pseudo-transparency, given our information ecosystems.

[1025] Can we fight a war, really fight a war that we may have to fight, like the next Nazis, can we fight that war when everyone with an iPhone is showing just how awful it is that little girls get blown up when we drop our bombs, right?

[1026] Like, could we, as a society, do what we might have to do to actually get necessary things done when we're living in this panopticon of just, you know, everyone's a journalist, right?

[1027] Everyone's a scientist.

[1028] Everyone's an expert.

[1029] Everyone's got direct contact with the facts, or semblance of the facts.

[1030] I don't know.

[1031] I think yes, and I think voices like yours are exceptionally important, and I think there's certain signals you send in your ability to steelman the other side, in your empathy, essentially.

[1032] So that's the fight, that's the mechanism by which you resist the dogmatism of this

[1033] binary thinking, and then if you become a trusted person that's able to consider the other side, then people will listen to you as the aggregator, as the communicator of expertise.

[1034] Because, like, we all just haven't been able to be good communicators. I still, to this day, don't really know: what am I supposed to think about the safety and efficacy of the vaccines today?

[1035] As it stands today, what are we supposed to think?

[1036] What are we supposed to think about testing?

[1038] What are we supposed to think about the effectiveness of masks or lockdowns?

[1039] Where are the great communicators on this topic who consider all the conspiracy theories, all the communication that's out there, and actually aggregate it together and are able to say, this is actually what's most likely the truth?

[1040] And also some of that has to do with humility, epistemic humility, knowing that you can't really know for sure, just like with depression.

[1041] You can't really know for sure.

[1042] I'm not seeing those communications being effectively done, even still today.

[1043] Well, the jury is still out on some of it, and again, it's a moving target.

[1044] And some of it, it's complicated.

[1045] Some of it's a self-fulfilling dynamic where, like, so, like, lockdowns: in theory, a lockdown would work if we could only do it, but we can't really do it.

[1046] And there's a lot of people who won't do it because they're convinced that this is the totalitarian boot, you know, and finally on the neck of the good people who are always having their interests, you know, introduced by the elites, right?

[1047] So, like, if you have enough people who think that lockdown, for any reason, in the face of any conceivable illness, is just code for the new world order coming to fuck you over and take your guns, right.

[1049] Okay, you have a society that is now immune to reason, right?

[1050] Because there are absolutely certain pathogens that we should lock down for next time, right?

[1051] And it was completely rational in the beginning of this thing to attempt to lock down (we never really locked down), to attempt some semblance of a lockdown just to, quote, bend the curve, to spare our health care system, given what we were seeing happening in Italy, right?

[1052] Like, that moment was not hard to navigate; at least in my view, it was obvious at the time.

[1053] In retrospect, my views on that haven't changed, except for the fact that I recognize maybe it's just impossible given the nature of people's response to that kind of demand, right?

[1054] We live in a society that's just not going to lock down.

[1055] Unless the pandemic is much more deadly.

[1056] Right.

[1057] So that's a point I made, which, you know, was maliciously clipped out from some other podcast, where someone's trying to make it look like I want to see children die.

[1058] Look, "it's a pity more children didn't die from COVID," right?

[1059] This is actually the same person who, and that's the other thing that got so poisoned here.

[1060] It's like that person, this psychopath, or effective psychopath, who's creating these clips of me on podcasts, this second clip of me seeming to say that I wish more children died

[1061] during COVID, but it was so clear in context what I was saying that even the clip betrayed the context, so it didn't actually work.

[1062] This psycho, and again, I don't know whether he actually is a psychopath, but he's behaving like one because of the incentives of Twitter.

[1063] This is somebody who Brett signal boosted as a very reliable source of information, right?

[1064] He kept retweeting this guy at me, against me, right?

[1065] And this guy, at one glance, I knew how unreliable this guy was, right?

[1066] But I think I'm not at all settled.

[1067] One thing I think I did wrong, one thing that I do regret, one thing I have not sorted out for myself, is how to navigate the professional and personal pressure that gets applied at this moment where you have a friend or an acquaintance or someone you know who's behaving badly in public, or behaving in a way that you think is bad, in public, and they have a public platform where they're influencing a lot of people, and you have your own public platform where you're constantly getting asked to comment on what this friend or acquaintance or colleague is doing.

[1068] I haven't known what I think is ethically right about the choices that seem forced on us

[1069] at moments like this.

[1070] So like I've criticized you in public about your interview with Kanye.

[1071] Now, in that case, I reached out to you in private first and told you exactly what I thought.

[1072] And then when I was going to get asked in public or when I was touching that topic on my podcast, I more or less said the same thing that I said to you in private, right?

[1073] Now that was how I navigated that moment.

[1074] I did the same thing with Elon, at least at the beginning.

[1075] You know, we have maintained good vibes, which is, which is not, I don't think, I disagree with you, because beyond good vibes in the moment, there's a deep core of good vibes that persists through time between you and Elon, and I would argue probably between some of the other folks you mentioned. I think with Brett, I failed to reach out in private to the degree that I should have, and we never really had, we had tried to set up a conversation in private that never happened, but there was some communication. But it would have been much better for me to have made more of an effort in private than I did before it spilled out into public.

[1076] And I would say that's true with other people as well.

[1077] What kind of interaction in private do you think you should have with Brett?

[1078] Because my case would be, beforehand and now still, the case I would make, and this is part of the criticism you sent my way.

[1079] Maybe it's useful to go in that direction.

[1080] Actually, let's go in that direction, because I think I disagree with your criticism as you stated it publicly, but this is...

[1081] You mean of your interview with Kanye.

[1082] Yeah, yeah, yeah.

[1083] The thing you criticize me for is actually the right thing to do with Brett.

[1084] Okay, you said Lex could have spoken with Kanye in such a way as to have produced a useful document.

[1085] He didn't do that because he has a fairly naive philosophy about the power of love.

[1086] Let's see if you can maintain that philosophy in the presence.

[1087] Let's go.

[1088] No, it's beautiful.

[1089] He seemed to think that if he just got through the minefield to the end of the conversation, where the two of them still were feeling good about one another and they could hug it out, that would be, by definition, a success.

[1090] So let me make the case for this power of love philosophy, right?

[1091] And first of all, I love you, Sam.

[1092] You're still an inspiration and somebody I deeply admire.

[1093] Okay.

[1094] Back at you.

[1095] To me, in the case of Kanye, it's not only that you get to the end of the conversation and have hugs, it's that the display that you're willing to do that has power.

[1096] So even if it doesn't end in hugging, the actual turning the other cheek, the act of turning the other cheek itself communicates both to Kanye later and to the rest of the world that we should have empathy and compassion towards each other.

[1097] There is power to that.

[1098] Maybe that is naive, but I believe in the power of that.

[1099] So it's not that I'm trying to convince Kanye that some of his ideas are wrong, but I'm trying to illustrate that just the act of listening and truly trying to understand the human being opens people's minds to actually questioning their own beliefs more.

[1100] It takes them out of the dogmatism.

[1101] It de -escalates the kind of dogmatism that I've been seeing.

[1102] So in that sense, I would say the power of love is the philosophy you might apply to Brett, because the right conversation to have in private is not about, hey, listen, you know, the experts you're talking to, they seem credentialed, but they're not actually as credentialed as they're presenting themselves.

[1103] They're not grounding their findings in actual meta-analyses,

[1104] in the papers and so on, like making a strong case.

[1105] Like, what are you doing?

[1106] It's going to get a lot of people in trouble.

[1107] But instead just saying, like, being a friend in the dumbest of ways, being, like, respectful, sending love their way, and just having a conversation outside of all of this.

[1108] Like, basically showing that, like, removing the emotional attachment to this debate, even though you are very emotionally attached because in the case of COVID, specifically, there is a very large number of lives at stake.

[1109] But removing all of that and remembering that you have a friendship.

[1110] Yeah, well, so I think these are highly non -analogous cases, right?

[1111] So your conversation with Kanye misfired, from my point of view, for a very different reason.

[1112] It was, it has to do with Kanye.

[1113] I mean, so Kanye, I don't know, I've never met Kanye.

[1114] So obviously I don't know him.

[1115] But I think he's either obviously in the midst of a mental health crisis or he's a colossal asshole, or both.

[1116] I mean, actually those aren't mutually exclusive.

[1117] So one of three possibilities, he's either mentally ill, he's an asshole, or he's mentally ill and an asshole.

[1118] I think all three of those possibilities are possible for the both of us as well.

[1119] No, I would argue none of those are likely for either

[1120] of us, but, um, possible; not to say we don't have our moments. But, so, the reason not to talk to Kanye, so, I think you should have had the conversation you had with him in private.

[1121] That's great.

[1122] And there's no, I've got no, uh, criticism of what you said had it been in private.

[1123] In public, I just thought, you're not doing him a favor.

[1124] If he's mentally ill, right, he's in the middle of a, a manic episode or, or, you know, I'm not a clinician, but I've heard it said of him that he is bipolar.

[1125] You're not doing him a favor sticking a mic in front of him and letting him go off on the Jews or anything else, right?

[1126] We know what he thought about the Jews.

[1127] We know that there's not much illumination that's going to come from him on that topic.

[1128] And if it is a symptom of his mental illness that he thinks these things, well, then you're not doing him a favor making that even more public.

[1129] If he's just an asshole and he's just an anti-Semite, an ordinary garden-variety anti-Semite, well then there's also not much to say unless you're really going to dig in and kick the shit out of him in public.

[1130] And I'm saying you can do that with love.

[1131] I mean, that's the other thing here is that I don't agree that compassion and love always have this patient embracing acquiescent face.

[1132] They don't always feel good to the recipient, right?

[1133] There is a sort of wisdom that you can wield compassionately in moments like that, where someone's full of shit and you just make it absolutely clear to them and to your audience that they're full of shit.

[1134] And there's no hatred being communicated.

[1135] In fact, you could just, it's like, listen, I'm going to do everyone a favor right now and, you know, just take your foot out of your mouth.

[1136] And the truth is, you know, I just wouldn't have aired the conversation.

[1137] Like, I just don't think it was a document that had to get out there, right?

[1138] I get that many people, this is not a signal you're likely to get from your audience, right?

[1139] Like, I get that many people in your audience thought, oh, my God, that's awesome.

[1140] You're talking to Kanye and you're doing it in Lex style where it's just love and you're not treating him like a pariah.

[1141] And, you know, you're holding this tension between he's this creative genius whose work we love.

[1142] and yet he's having this moment that's so painful, and what a tightrope walk. And I get that maybe 90% of your audience saw it that way. They're still wrong, and I still think that was, on balance, not a good thing to put out into the world. You don't think it opens up the mind and heart of people that listen to that?

[1143] Just having them see a person.

[1144] If it does, it's opening it up in the wrong direction, where just gale-force nonsense is coming in, right?

[1145] I think we should have an open mind and an open heart, but there's some clear things here that we have to keep in view.

[1146] One is, the mental illness component is its own thing.

[1147] I don't pretend to understand what's going on with him.

[1148] But insofar as that's the reason he's saying what he's saying, do not put this guy on camera and let him go off.

[1149] Sorry, on that point real quick, I had a bunch of conversations with him offline, and I didn't get a sense of mental illness.

[1150] That's why I chose to sit down.

[1151] Okay.

[1152] And I didn't get it, I mean, mental illness is such a...

[1153] But when he shows up in a gimp hood on Alex Jones' podcast, either that's more, you know, genius performance in his world or he's unraveling further.

[1154] I wouldn't put that under mental illness.

[1155] I think there's another conversation to be had about how we treat artists.

[1156] Right.

[1157] Because they're weirdos.

[1158] They're very, I mean, you know, taking words from Kanye as if he's like Christopher Hitchens or something like that, like very eloquent, researched, you know, written many books on history, on politics and geopolitics, on psychology.

[1159] Kanye didn't do any of that.

[1160] He's an artist just spouting off.

[1161] And so it's a different style of conversation and a different way to treat the words that are coming out of him.

[1162] Let's leave the mental illness aside.

[1163] So if we're going to say that there's no reason to think he's mentally ill, and this is just him being creative and brilliant and opinionated, well, then that falls into the asshole bucket for me. It's like then he's someone, and honestly, the most offensive thing about him in that interview from my point of view is not the anti-Semitism, which we can talk about, because I think there are problems just letting him spread those memes as well, but the most offensive thing is just how delusionally egocentric he is or was coming off in that interview and in others.

[1164] He has an estimation of himself as this omnibus genius to rival, not only to rival Shakespeare, to exceed Shakespeare, right?

[1165] I mean, he's like, he is the greatest mind that has ever walked among us, and he's more or less explicit on that point, and yet he manages to talk for hours without saying anything actually interesting or insightful or factually illuminating, right?

[1166] So it's complete delusion of a very Trumpian sort.

[1167] It's like, you know, when Trump says he's a genius who understands everything, but nobody takes him seriously, one wonders whether Trump takes himself seriously. Kanye seems to believe his own press.

[1168] He actually thinks he's, you know, just a colossus.

[1169] And he may be a great musician.

[1170] You know, I'm not, you know, I've, it's certainly not my wheelhouse to compare him to any other musicians.

[1171] But, um, one thing.

[1172] It's patently obvious from your conversation that he's not who he thinks he is intellectually or ethically or in any other relevant way.

[1173] And so when you couple that to the anti-Semitism he was spreading, which was genuinely noxious and ill-considered and has potential knock-on effects in the black community.

[1174] I mean, there's an ambient level of anti-Semitism in the black community that it's worth worrying about and talking about anyway.

[1175] There's a bunch of guys, you know, playing the knockout game in Brooklyn, just punching Orthodox Jews in the face.

[1176] And I think letting Kanye air his anti-Semitism that publicly only raises the likelihood of that rather than diminishes it.

[1177] I don't know.

[1178] So let me say just a couple of things.

[1179] So one, my belief at the time was it doesn't.

[1180] It decreases it.

[1181] Showing empathy while pushing back decreases the likelihood of that.

[1182] It might, it might on the surface.

just look like it's increasing it, but that's simply because the anti-Semitism, or the hatred in general, is brought to the surface and people talk about it.

[1184] But I should also say that you're one of the only people that wrote to me privately criticizing me. And like, out of the people I really respect and admire, that was really valuable, and painful, because I had to think through it for a while.

[1185] It still haunts me because the other kind of criticism I got a lot of was people basically saying things towards me, based on who I am, that they hate me. You mean anti-Semitic things?

[1186] Yeah, anti-Semitic things.

[1187] I just hate the word anti-Semitic.

[1188] It's like racist.

[1189] But here's the reality.

[1190] So I'm someone, so I'm Jewish, although obviously not religious.

[1191] I have never taken, you know, I've been a student of the Holocaust, obviously.

[1192] I know a lot about that, and there's reason to be a student of the Holocaust.

[1193] But in my lifetime and in my experience, I have never taken anti-Semitism very seriously.

[1194] I have not worried about it.

[1195] I have not made a thing of it.

[1196] I've done exactly one podcast on it.

[1197] I had Bari Weiss on my podcast when her book came out.

[1198] But it really is a thing, and it's something we have to keep an eye on societally, because it's a unique kind of hatred, right?

[1199] It's unique in that it seems, it's knit together with, it's not just ordinary racism, it's knit together with lots of conspiracy theories that never seem to die out.

It can by turns equally animate the left and the right politically.

[1201] I mean, what's so perverse about anti-Semitism, look at the American context.

[1202] With the far right, you know, with white supremacists, Jews aren't considered white, so they hate us in the same spirit in which they hate black people or brown people or anyone who is not white.

[1203] But on the left, Jews are considered extra white.

[1204] I mean, we're the extra beneficiaries of white privilege, right?

[1205] And in the black community, that is often the case, right?

[1206] We're a minority that has thrived.

[1207] And it seems to stand as a counterpoint to all of the problems that other minorities suffer, in particular, you know, African Americans in the American context.

[1208] And yeah, Asians are now getting a little bit of this, you know, like the model minority issue.

[1209] But Jews have had this going on for centuries and millennia, and it

[1210] never seems to go away.

[1211] And again, this is something that I've never focused on, but this has been at a slow boil for as long as we've been alive, and there's no guarantee it can't suddenly become much, much uglier than we have any reason to expect it to become, even in our society.

[1212] And so there's kind of a special concern at moments like that where you have an immensely influential person in a community who already has a checkered history with respect to their own beliefs about the Jews and the conspiracies and all the rest.

[1213] And he's just messaging, you know, not especially opposed by you or anyone else who's giving him the microphone at that moment, to the world.

[1214] And so that, you know, set off my spidey sense.

[1215] Yeah, it's complicated.

[1216] It's, the stakes are very high.

[1217] And I'm somebody that's been close to this, obviously, through family and also through reading a lot about World War II.

[1218] And just this whole period, it was a very difficult conversation.

[1219] But I believe in the power, especially given who I am, of, not always, but sometimes, often, turning the other cheek.

[1220] Oh, yeah.

[1221] And, again, things change when they're for public consumption.

[1222] And, you know, it's like, I mean, the cut for me, the use case I keep stumbling upon, is the kinds of things that I will say on a podcast like this or if I'm giving a public lecture versus the kinds of things I will say at dinner with strangers or with friends.

[1223] Like if you're in an elevator, like if I'm in an elevator with strangers, I do not feel, and I hear someone say something stupid, I don't feel an intellectual responsibility to turn around.

in the confines of that space with them and say, listen, that thing you just said about X, Y, or Z is completely false, and here's why, right?

[1225] But if somebody says it in front of me on some public dais where I'm actually talking about ideas, that's when, you know, there's a different responsibility that comes online.

[1226] The question is how you say it, how you say it.

[1227] Or even whether you say anything in those moments.

[1228] I mean, there are moments, there are definitely moments to privilege civility or just to pick your battle.

[1229] I mean, sometimes it's just not worth it to get into it with somebody out, out in real life.

[1230] I just believe in the power of empathy both in the elevator and when a bunch of people are listening.

[1231] That when they see you willing to consider another human being's perspective, it just gives more power to your words after.

[1232] Well, yeah, but until it doesn't.

[1233] Like if you, because you can, you can, right, you can extend charity too far, right?

[1234] You can, like, it can be absolutely obvious what someone's motives really are.

[1235] Right.

[1236] And they're, you know, dissembling about that, right?

[1237] And so then your taking their representations at face value begins to look like you're just being duped, and you're not actually doing the work of putting pressure on a bad actor, you know?

[1238] So it's, it's, and again, the whole, the mental illness component here makes, makes it very difficult to think about.

[1239] what you should or shouldn't have said to Kanye.

[1240] So I think the topic of platforming is pretty interesting.

[1241] Like, what's your view on platforming controversial people?

[1242] Let's start with the old, would you interview Hitler on your podcast, and how would you talk to him?

[1243] Oh, and a follow-up question.

[1244] Would you interview him in 1935, 41, and then like 45?

[1245] Well, I think we have an uncanny valley problem with respect to this issue of whether or not to speak to bad people, right?

[1246] So if a person is sufficiently bad, right, if they're all the way out of the valley, then you can talk to them and it's just totally unproblematic to talk to them because you don't have to spend any time signaling to your audience that you don't agree with them.

[1247] And if you're interviewing Hitler, you don't have to say, listen, I just got to say, before we start, I don't agree with the whole, you know, genocide thing.

[1248] You know, I just think the, uh, killing, you know, killing mental patients in vans and all that, all that's a bad look. Uh, so you just, it can go without saying that you don't agree with this person, and you're not platforming them to signal boost their views. You're just trying to, if they're sufficiently evil, you can go into it very much as an anthropologist would. You just want to understand the nature of evil. You just want to understand this phenomenon, like, how is this person who they are, right?

[1249] And that strikes me as an intellectually interesting and morally necessary thing to do, right?

[1250] So, yes, I think you always interview Hitler.

[1251] Wait, wait, wait, wait, wait, wait, wait.

[1252] Well, when you know, once he's Hitler.

[1253] But when do you know it?

[1254] Once he's legitimately.

[1255] But when do you know it?

[1256] Is genocide really happening?

[1257] Yeah, yeah, yeah.

[1258] 42, 43?

[1259] No, no, if you're on the cusp of it where it's just he's someone who's gaining power and you don't want to help facilitate that, then there's a question of whether you can undermine him while pushing back against him in that interview.

[1260] So there are people I wouldn't talk to just because I don't want to give them oxygen and I don't think that in the context of my interviewing them, I'm going to be able to take the wind out of their sails at all, right?

[1261] So it's like for whatever, either because it's an asymmetric advantage, because I just know that they can do something that they, within the span of an hour that I can't, that I can't correct for, you know, it's like they can light many small fires and it just takes too much time to put them out.

[1262] That's more like on the topic of vaccines, for example, having a debate on the efficacy of vaccines.

[1263] Yeah.

[1264] It's not that I don't think sunlight is usually the best disinfectant.

[1265] I think it is.

[1266] you know, even these asymmetries aside, I mean, there are, it is true that a person can always make a mess faster than you can clean it up, right?

[1267] But still, there are debates worth having, even given that limitation.

[1268] And they're the right people to have those specific debates.

[1269] And there's certain topics where, you know, I'll debate someone just because I'm the right person for the job and it doesn't matter how messy they're going to be.

[1270] It's just worth it because I can make my points

[1271] land, at least to the right part of the audience.

[1272] So some of it is just your own skill and competence and also interest in preparing correctly?

[1273] Well, yeah, yeah, and the nature of the subject matter.

[1274] But there are other people who just by default, I would say, well, there's no reason to give this guy a platform.

[1275] And there are also people who are so confabulatory that they're making such a mess with every sentence that you, insofar as you're even trying to interact.

with what they're saying, you're going to, you're by definition going to fail, and you're going to seem to fail to a sufficiently large uninformed audience, where it's going to be a net negative for the cause of truth no matter how good you are.

[1277] So like, for instance, I think talking to Alex Jones on any topic for any reason is probably a bad idea, because I just think he's, he's just neurologically wired to just, I mean, utter a string of sentences.

[1278] He'll get 20 sentences out, each of which, you know, contains more lies than the last.

[1279] And there's just, there's not time enough in the world to run down, and certainly not time enough in the span of a conversation, to run down each of those leads to bedrock so as to falsify it.

[1280] I mean, he'll just make shit up.

[1281] Or make shit up and then weave it in with, you know, half-truths and micro-truths that give some semblance of credibility to somebody out there.

[1282] I mean, apparently millions of people out there.

[1283] And there's just no way to untangle that in real time with him.

[1284] I have noticed that you have an allergic reaction to confabulatorization.

[1285] Yeah, confabulation.

[1286] Confabulation, that if somebody says something, a little micro-untruth, it really stops your brain.

[1287] Here I'm not talking about micro-untruths, I'm just talking about making up things out of whole cloth.

[1288] Just like, if someone says something, well, what about, and then the thing they put at the end of that sentence is just a set of pseudofacts, right, that you can't possibly authenticate or not.

[1289] in the span of that conversation, they will, you know, whether it's about UFOs or anything else, right?

[1290] They will seem to make you look like an ignoramus when, in fact, everything they're saying is specious, right, whether they know it or not.

[1291] I mean, there's some people who are just crazy, there are some people who are just bullshitting and they're not even tracking whether it's true, it just feels good, and there's some people who are consciously lying about things.

[1292] But don't you think there's just a kind of jazz masterpiece of untruth that you should be able to just wave off by saying, like, well, none of that is backed up by any evidence, and just almost, like, take it to the humor place?

[1293] Well, but the thing is, it's, I mean, just, the place I'm familiar with doing this and not doing this is, is, um, on specific conspiracies like 9/11 truth, right?

[1294] Like the 9/11, so I, because of my, because of what 9/11 did to my, uh, intellectual life.

[1295] And they really just, you know, it, it sent me down a path for the better part of a decade.

[1296] Like, I became a critic of religion when I, I don't know if I was ever going to be a critic of religion, right?

[1297] Like, but that, like, it happened to be in my wheelhouse because I spent so much time studying religion, uh, on my own.

[1298] and I was also very interested in the underlying spiritual concerns of every religion.

[1299] And so I was, you know, I devoted more than a full decade of my life to just, you know, what is real here?

[1300] What is possible?

[1301] What is the nature of subjective reality and how does it relate to reality at large?

[1302] And is there anything to, you know, who was someone like Jesus or Buddha?

[1303] And are these people frauds, or are they, are these just myths, or is there really a continuum of insight to be had here that is interesting?

[1304] So I spent a lot of time on that question through my 20s, the full decade of my 20s.

[1305] And that was launched in part by 9/11, truth there?

[1306] No, but then when 9/11 happened, I had spent all this time reading religious books, empathically understanding the motivations of religious people, right?

[1307] just how fully certain people believe what they say they believe, right?

[1308] So I took religious convictions very seriously.

[1309] And then people started flying planes into our buildings.

[1310] And so I knew that there was something to be said about the core doctrines of Islam.

[1311] Yeah, exactly.

[1312] So I went down, so that became my wheelhouse for a time, you know, terrorism and jihadism and related topics.

[1313] And so the 9/11 truth conspiracy thing kept getting aimed at me, and the question was, do I want to debate these people? Like Alex Jones, perhaps? Yeah, so Alex Jones, I think, was an early purveyor of it, although I don't think I knew who he was at that point. And privately, I had some very long debates with people. You know, one person in my family went way down that rabbit hole, and I just, you know, every six months or so, I'd literally write the two-hour email, you know, that would try to deprogram him, you know, however ineffectually.

[1314] And so I went back and forth for three years on that topic, in private, with people.

[1315] But I could see the structure of the conspiracy.

[1316] I could see the nature of how, of how impossible it was to play whack-a-mole sufficiently well so as to convince anyone of anything who was not seeing the problematic structure of that way of thinking.

[1317] I mean, it's not actually a thesis.

[1318] It's a proliferation of anomalies that don't, you can't actually connect all the dots that are being pointed to.

[1319] They don't connect in a coherent way.

[1320] They're incompatible theses that are not, and their incompatibility is not being acknowledged.

[1321] But they're running this algorithm of things are never

[1322] what they seem, there's always malicious conspirators doing things perfectly.

[1323] We see all, we see evidence of human incompetence everywhere else.

[1324] No one can tie their shoes, you know, expertly, anywhere else.

[1325] But over here, people are perfectly competent.

[1326] They're perfectly concealing things.

[1327] Like, thousands of people are collaborating, you know, inexplicably.

[1328] I mean, incentivized by what, who knows, they're collaborating to murder thousands of their neighbors, and no one is breathing a peep about it.

[1329] No one's getting caught on camera.

[1330] No one's breathed the word of it to a journalist.

[1331] And so I've dealt with that style of thinking, and I know what it's like to be in the weeds of a conversation like that, and the person will say, okay, well, but what do you make of the fact that all those F-16s were flown 800 miles out to sea on the morning of 9/11, doing an exercise that hadn't even been scheduled for that day, but it was, and now, all of these, I dimly recall some thesis of that kind, but I'm just making these things up now, right?

[1332] So, like, that detail hadn't even been scheduled for that day.

[1333] It's inexplicably run that day.

[1334] So how long would it take to track that down, right?

[1335] The idea that this is anomalous, like, there was an F -16 exercise run on a, And it wasn't even supposed to be been run that day, right?

[1336] Someone like Alex Jones, their speech pattern is to pack as much of that stuff in as possible at the highest velocity that the person can speak.

[1337] And unless you're knocking down each one of those things to that audience, you appear to just be uninformed.

[1338] You appear to just not be informed, you know. He didn't know about the F-16s.

[1339] Sure.

[1340] He doesn't know about Project Mockingbird.

[1341] You haven't heard about Project Mockingbird?

[1342] I just made up Project Mockingbird.

[1343] I don't know what it is, but that's the kind of thing that comes out, in a conversation like that.

[1344] That's the kind of thing, frankly, I was worried about in the COVID conversation, because not that someone like Brett would do it consciously, but someone like Brett is swimming in a sea of misinformation on social, living on Twitter, getting people sending the blog post and the study from, from, you know, the Philippines that showed that in this cohort, Ivermectin did X, right?

[1345] And not, like, to actually run anything to ground, right?

[1346] You have to actually do the work journalistically and scientifically and run it to ground, right?

[1347] So for many, for some of these questions, you actually have to be a statistician to say, okay, they used the wrong statistics.

[1348] in this experiment, right?

[1349] Now, yes, we could take all the time to do that, or we could, at every stage along the way, in a context where we have experts we can trust, go with what 97% of the experts are saying about X, about the safety of mRNA, about the transmissibility of COVID, about whether to wear masks or not wear masks.

[1350] And I completely agree that that broke down unacceptably over the last few years, but I think that's largely social media and blogs, and the efforts of podcasters and Substack writers were not just a response to that.

[1351] It was a, I think it was a, it was a symptom of that and a cause of that, right?

[1352] And I think we're living in an environment where people, we've basically, we have trained ourselves not to be able to agree about facts on any topic, no matter how urgent, right?

[1353] What's flying in our sky?

[1354] You know, what is, you know, what is, what's happening in Ukraine?

[1355] Is Putin just denazifying Ukraine?

[1356] I mean, like, there are people who we respect who are spending time down that particular rabbit hole.

[1357] Like, this is, you know, maybe there are a lot of Nazis in Ukraine, and that's the real problem, right?

[1358] Maybe Putin's not the bad actor here, right?

[1359] How much time do I have to spend empathizing with Putin to the point of thinking, well, maybe Putin's got a point and it's like, what about the polonium and the nerve agents and the killing of journalists and the Navalny and, like, does that count?

[1360] Well, no, listen, I'm not paying so much attention to that because I'm following all these interesting

[1361] people on Twitter, and they give me some pro-Putin material here.

[1362] And there is a, there are some Nazis in Ukraine.

[1363] It's not like there are no Nazis in Ukraine.

[1364] How am I going to weigh these things?

[1365] I think people are being driven crazy by Twitter.

[1366] Yeah.

[1367] But you're, you're kind of speaking to conspiracy theories that pollute everything.

[1368] But every, every example you gave is kind of a bad faith style of conversation.

[1369] But it's not necessarily knowingly bad faith. I mean, the people who are worried about Ukrainian Nazis, to my, I mean, they're some of the same people.

[1370] They're the same people who are worried that Ivermectin got suppressed.

[1371] Like, Ivermectin is really a panacea, but it got suppressed because no one could make billions on it.

[1372] It's the same, it's literally, in many cases, the same people and the same efforts to unearth those.

[1373] You're saying it's very difficult to have conversations with those kinds of people. What about a conversation with Trump himself? Would you do a podcast with Trump? No, I don't think so. I don't think I'd be learning anything about him. It's like with Hitler, and I'm not comparing Trump to Hitler, but clips, guys, here's your chance, you got this one. With certain world historical figures, um, I would just feel like this is an opportunity to learn something that I'm not going to learn.

[1374] I think Trump is among the most superficial people we have ever laid eyes on.

[1375] Like, he is in public view, right?

[1376] And I'm sure there's some distance between who he is in private and who he is in public, but it's not going to be the kind of distance that's going to blow my mind.

[1377] And I think, so I think the liability of how, for instance, I think Joe Rogan was very wise not to have Trump on his podcast.

[1378] I think all he would have been doing is he would have put himself in a situation where he couldn't adequately contain the damage Trump was doing, and he was just going to make Trump seem cool to a whole new, you know, a potentially new cohort of his massive audience, right?

[1379] I mean, they would have had a lot of laughs.

[1380] Trump's funny.

[1381] I mean, the entertainment value of things is so influential.

[1382] I mean, there was that one debate where Trump, you know, got a massive laugh on the, you know, his line, only Rosie O'Donnell, right?

[1383] The truth is, we're living in a political system where if you can get a big laugh during a political debate, you win.

[1384] It doesn't matter who you are.

[1385] Like that's the level of, you know, it doesn't matter how uninformed you are, it doesn't matter that half the debate was about what the hell we should do about, about, you know, a threat of nuclear war or anything else.

[1386] It's, we're monkeys, right?

[1387] And we like to laugh.

[1388] Well, because you brought up Joe.

[1389] He's somebody like you, I look up to.

[1390] I've learned a lot from him because of who he is privately as a human being.

[1391] Also, he's kind of the voice of curiosity to me. He inspired me in that.

[1392] So unending, open-minded curiosity, much like you are the voice of reason.

[1393] They recently had a podcast.

[1394] Joe recently had a podcast with Jordan Peterson, and they brought you up, saying they still have hope for you.

[1395] Yeah.

[1396] Any chance to talk to Joe again and reinvigorate your friendship?

[1397] Yeah, well, I reached out to him privately when I saw that. Did you use the power of love? Joe knows I love him and consider him a friend, right? So there's no issue there. He also knows I'll be happy to do his podcast when we get that together, you know. So I've got no policy of not talking to Joe or not doing his podcast. I mean, I think we got a little sideways along these same lines, where, you know, we've talked about Brett and Elon and other people, but it was never to that degree with Joe, because Joe's in a very different lane, right?

[1398] He's unconsciously so.

[1399] I mean, Joe is a stand-up comic who just is interested in everything, interviews the widest conceivable variety of people, and just lets his interests collide with their expertise or lack of expertise.

[1400] I mean, he's, again, it's a super wide variety of people.

[1401] He'll talk about anything, and he can always pull the rip cord saying, you know, I don't know what the fuck I'm saying, I'm a comic, I'm stoned, we just drank too much, right?

[1402] Like, it's very entertaining, it's all in, you know, to my eye, it's all in good faith.

[1403] I think Joe is an extraordinarily ethical, good person.

[1404] Also, doesn't use Twitter.

[1405] Doesn't really use Twitter.

[1406] Right, yeah, yeah.

[1407] The crucial difference, though, is that he is an entertainer first.

[1408] I mean, I'm not saying he's not smart and doesn't understand things.

[1409] I mean, what's potentially confusing is he's very smart, and he's also very informed.

[1410] His full-time job is, you know, when he's not doing stand-up or doing color commentary for the UFC, his full-time job is talking to lots of very smart people at great length.

[1411] So he's created, you know, the Joe Rogan University for himself, and he's gotten a lot of information crammed into his head.

[1412] So it's not that he's uninformed, but he can always, when he feels that he's uninformed, or when it turns out he was wrong about something, he can always pull the ripcord and say, I'm just a comic, we were stoned, it was fun, you know, don't take medical advice from me. I don't play a doctor on the internet, right?

[1413] But I can't quite do that, right?

[1414] You can't quite do that.

[1415] We're in different lanes.

[1416] I'm not saying you and I are in exactly the same lane.

[1417] But for much of Joe's audience, I'm just this establishment shill, just banging on about, you know, the universities and medical journals.

[1418] And it's not true, but that would be the perception.

[1419] And as a counterpoint to a lot of what's being said on Joe's podcast or, you know, certainly Brett's podcast on these topics, I can see how they would form that opinion. But in reality, if you listen to me long enough, you hear that I've said as much against the woke nonsense as anyone, even any lunatic on the right who can only keep that bright shining object in view, right?

[1420] So there's nothing that Candace Owens has said about wokeness that I haven't said about wokeness, insofar as she's speaking rationally about wokeness.

[1421] But we have to be able to keep multiple things in view, right?

[1422] If you could only look at the problem of wokeness and you couldn't acknowledge the problem of Trump and Trumpism and QAnon and the explosion of irrationality that was happening on the right and bigotry that was happening on the right, you were just disregarding half of the landscape.

[1423] And many people took half of the problem in recent years.

[1424] The last five years is a story of many people taking half of the problem and monetizing that half of the problem and getting captured by an audience that only wanted that half of the problem talked about in that way.

[1425] And this is the larger issue of audience capture, which is, I'm sure, an ancient problem, but it's a very helpful phrase that I think comes to us courtesy of our mutual friend, Eric Weinstein.

[1426] And audience capture is a thing, and I believe I've witnessed many, you know, casualties of it.

[1427] And if there's anything I've been on guard against in my life, you know, professionally, it's been that.

[1428] And when I noticed that I had a lot of people in my audience who didn't like my criticizing Trump, I really leaned into it.

[1429] And when I noticed that a lot of the other cohort in my audience didn't like me criticizing the far left and wokeness, or thought I was, you know, exaggerating that problem, I leaned into it, because I thought those parts of my audience were absolutely wrong, and I didn't care about whether I was going to lose them.

[1430] There are people who have created, you know, knowingly or not, there are people who have created different incentives for themselves, because of how they've monetized their podcast, and because of the kind of signal they've responded to in their audience.

[1431] And I worry about, you know, Brett would consider this a totally invidious ad hominem thing to say, but I really do worry that that's happened to Brett.

[1432] I think I cannot explain how, with all the things in the universe to be interested in, and all the things he's competent to speak intelligently about, you do 100 podcasts in a row on COVID, right?

[1433] It's just, it makes no sense.

[1434] In part, audience capture can explain that.

[1435] I absolutely think it can.

[1436] What about do you, like for example, do you feel pressure to not admit that you made a mistake on COVID or made a mistake on Trump?

[1437] I'm not saying you feel that way, but do you feel this pressure?

[1438] So you've attacked audience capture within the way you do stuff, so you don't feel as much pressure from the audience, but within your own ego.

[1439] I mean, again, the people.

[1440] People who think I'm wrong about any of these topics are going to think, okay, you're just not admitting that you're wrong, but now we're having a dispute about specific facts.

[1441] There are things that I believed about COVID or worried might be true about COVID two years ago that I no longer believe or I'm not so worried about now, and vice versa.

[1442] I mean, things have flipped, certain things have flipped upside down.

[1443] The question is, was I wrong? So here's the cartoon version of it, but this is something I said probably 18 months ago, and it's still true.

[1444] You know, when I saw what Brett was doing on COVID, you know, let's call it two years ago, I said, even if he's right, even if it turns out that ivermectin is a panacea and the mRNA vaccines kill millions of people, right?

[1445] He's still wrong right now.

[1446] His reasoning is still flawed right now.

[1447] His facts still suck right now, right?

[1448] And his confidence is unjustified now.

[1449] That was true then.

[1450] That will always be true then, right?

[1451] And so not much has changed for me to revisit any of my time points along the way.

[1452] Again, I will totally concede that if I had teenage boys and their schools were demanding that they be vaccinated with the mRNA vaccine, I would be powerfully annoyed, right?

[1453] I wouldn't know what I was going to do, and I would be doing more research about myocarditis, and I'd be badgering our doctors, and I would be worried that we have a medical system and a pharmaceutical system and a health care system and a public health system that's not incentivized to look at any of this in a fine-grained way, and they just want one blanket admonition to the entire population: just take the shot, you idiots.

[1454] I view that largely as a panicked response to the misinformation explosion that happened, and the populist resistance, animated by misinformation, that just made it impossible to get anyone to cooperate, right?

[1455] So part of it is, again, a pendulum swing in the wrong direction.

[1456] It's somewhat analogous to the woke response to Trump and the Trumpist response to woke, right?

[1457] So a lot of people have just gotten pushed around for bad reasons, but understandable reasons.

[1458] But yes, there are caveats. Things have changed about my view of COVID, but the question is, if you roll back the clock 18 months, was I wrong to want to platform Eric Topol, you know, a very well-respected cardiologist, on this topic, or, you know, Nicholas Christakis to talk about the network effects of, you know, whether we should close schools, right?

[1459] He had written a book on COVID.

[1460] He's, you know... network effects are his wheelhouse, both as an MD and as a sociologist.

[1461] There was a lot that we believed we knew about the efficacy of closing schools during pandemics, right?

[1462] During the Spanish flu pandemic and others, right?

[1463] But there's a lot we didn't know about COVID.

[1464] We didn't know how negligible the effects would be on kids compared to older people.

[1465] We didn't know, like the...

[1466] My problem... I really enjoyed your conversation with Eric Topol, but also didn't.

[1467] So he's one of the great communicators in many ways on Twitter, like distillation of the current data.

[1468] But he, I hope I'm not overstating it, but there is a bit of an arrogance from him that I think could be explained by him being exhausted by being constantly attacked by conspiracy theorists, like anti-vaxxers.

[1469] To me, the same thing happens with people that start drifting to being right-wing: they get attacked so much by the left, they become almost irrational and arrogant in their beliefs.

[1470] And I felt your conversation with Eric Topol did not sufficiently empathize with people that have skepticism, but also did not sufficiently communicate the uncertainty we have.

[1471] So, like, many of the decisions that were made, many of the things you were talking about, you were kind of saying there's a lot of uncertainty, but this is the best thing we can do now.

[1472] Well, it was a forced choice.

[1473] You're going to get COVID.

[1474] Do you want to be vaccinated when you get it?

[1475] That was always, in my view, an easy choice.

[1476] And that's true up until you start breaking apart the cohorts and you start saying, okay, wait a minute, there is this myocarditis issue in young men.

[1477] Let's talk about that.

[1478] Before that story emerged, it was just clear that even if it's not knocking down transmission as much as we had hoped, it is still mitigating severe illness and death. And I still believe it is the current view of most people competent to analyze the data that we lost something like 300,000 people unnecessarily in the US because of vaccine hesitancy.

[1479] But I think there's a way to communicate with humility about the uncertainty of things that would increase the vaccination rate.

[1480] I do believe that it is rational and sometimes effective to signal impatience with certain bad ideas, right?

[1481] And certain conspiracy theories and certain forms of misinformation.

[1482] You think so?

[1483] I just think it makes you look like a douchebag most times.

[1484] Well, I mean, certain people are persuadable.

[1485] Certain people are not persuadable. But no, because there's not enough time, it's the opportunity cost.

[1486] Not everything can be given a patient hearing.

[1487] So you can't have a physics conference and then let people in to just trumpet their pet theories about, you know, the grand unified vision of physics.

[1488] when they're obviously crazy, or obviously half crazy, or they're just not, you know... like, you begin to get a sense for this when it is your wheelhouse, but there are people who kind of declare their irrelevance to the conversation fairly quickly without knowing that they have done it, right?

[1489] And the truth is, I think I'm one of those people on the topic of COVID, right?

[1490] Like, it's never that I felt, listen, I know exactly what's going on here.

[1491] I know these mRNA vaccines are safe.

[1492] I know exactly how to run a lockdown.

[1493] No, this is a situation where you want the actual pilots to fly the plane, right?

[1494] We needed experts who we could trust.

[1495] And insofar as our experts got captured by all manner of things...

[1496] I mean, some of them got captured by Trump.

[1497] Some of them were made to look ridiculous just standing next to Trump while he was bloviating about, you know, whatever... that it's just going to go away.

[1498] There's just 15 people, you know, 15 people on a cruise ship, and it's just going to go away.

[1499] This is going to be no problem.

[1500] Or it's like when he said, you know, many of these doctors think, I understand this better than them.

[1501] They're just amazed at how I understand this.

[1502] And you've got doctors, real doctors.

[1503] The heads of the CDC and NIH, standing around just ashen-faced while he's talking... all of this was deeply corrupting of the public communication of science.

[1504] And then, again, I've banged on about the depredations of wokeness.

[1505] The woke thing was a disaster, right?

[1506] Still is a disaster.

[1507] But the thing is, there's a big difference between me and Brett in this case: I didn't do 100 podcasts on COVID.

[1508] I did like two podcasts on COVID.

[1509] My concern about COVID can be measured in how many podcasts I did on it, right?

[1510] It's like once we had a sense of how to live with COVID, I was just living with COVID, right?

[1511] Like, okay, get vaxxed or don't get vaxxed.

[1512] Wear a mask or don't wear a mask.

[1513] Travel or don't travel.

[1514] Like, you've got a few things to decide, but my kids were stuck at home on iPads, you know, for too long.

[1515] I didn't agree with that.

[1516] It was obviously not functional.

[1517] I criticized that on the margins, but there was not much to do about it.

[1518] But the thing I didn't do is make this my life and just browbeat people with one message or another.

[1519] We need a public health regime where we can trust what competent people are saying to us about what medicines are safe to take.

[1520] And in the absence of that... even in the presence of that, craziness is going to proliferate, given the tools we've built.

[1521] But in the absence of that, it's going to proliferate for understandable reasons.

[1522] And it's not going to be good next time, when something orders of magnitude more dangerous hits us.

[1523] And insofar as I think about this issue, I think much more about next time than this time.

[1524] Before this COVID thing, you and Brett had some good conversations; I would say you were friends.

[1525] What do you admire most about Brett outside of all the criticism we've had about this COVID topic?

[1526] Well, I think Brett is very smart and he's a very ethical person who wants good things for the world.

[1527] I mean, I have no reason to doubt that.

[1528] So the fact that we're crosswise on this issue does not mean that I think he's a bad person.

[1529] I mean, the thing that worried me about what he was doing, and this was true of Joe, and this was true of Elon, this was true of many other people, is that once you're messaging at scale to a vast audience, you incur a certain kind of responsibility not to get people killed.

[1530] And I did worry that, yeah, people were making decisions on the basis of the information that was getting shared there.

[1531] And that's why I was, I think, fairly circumspect.

[1532] I just said, okay, give me the center of the fairway expert opinion at this time point and at this time point and at this time point and then I'm out, right?

[1533] I don't have any more to say about this.

[1534] I'm not an expert on COVID.

[1535] I'm not an expert on the safety of mRNA vaccines.

[1536] if something changes so as to become newsworthy, then maybe I'll do a podcast.

[1537] I just did a podcast on the lab leak, right?

[1538] I was never skeptical of the lab leak hypothesis.

[1539] Brett was very early on saying this is a lab leak, right, at a point where my only position was, who cares if it's a lab leak, right?

[1540] Like, the thing we have to get straight is, what do we do, given the nature of this pandemic?

[1541] But also we should say that you've actually stated that it is a possibility.

[1542] Oh, yeah.

[1543] You just said it doesn't quite matter.

[1544] I mean, the time to figure that out...

[1545] Now, I've actually had my podcast guests on this topic change my view of this, because, you know, one of the guests, Alina Chan, made the point that, no, actually the best time to figure out the origin of this is immediately, right?

[1546] Because otherwise, you lose touch with the evidence.

[1547] And I hadn't really been thinking about that.

[1548] If you come back after a year, you know, there are certain facts you might not be able to get in hand.

[1549] But I've always felt that it didn't matter for two reasons.

[1550] One is we had the genome of the virus, and we were very quickly, immediately, designing vaccines against that genome.

[1551] And that's what we had to do.

[1552] And then we had to figure out how to vaccinate and to mitigate and to develop treatments and all of that.

[1553] So the origin story didn't matter.

[1554] Generically speaking, either origin story was politically inflammatory and made the Chinese look bad, right?

[1555] And the Chinese response to this looked bad, whatever the origin story, right?

[1556] They're not cooperating.

[1557] They're stopping their domestic flights, but letting their international flights go.

[1558] I mean, it's just they were bad actors, and they should be treated as such regardless of the origin, right?

[1559] And, you know, I would argue that the wet market origin is even more politically invidious than the lab leak origin.

[1560] I mean, why do you think?

[1561] Because lab leak, to my eye, the lab leak could happen to anyone, right?

[1562] We're all running, all these advanced countries are running these dangerous labs.

[1563] That's a practice that we should be worried about, you know, in general.

[1564] We know lab leaks are a problem.

[1565] There have been multiple lab leaks of even worse things that haven't

[1566] gotten out of hand in this way, but, you know, worse pathogens.

[1567] We're wise to be worried about this, and on some level, it could happen to anyone, right?

[1568] The wet market makes them look like barbarians living in another century.

[1569] Like, you've got to clean up those wet markets.

[1570] Like, what are you doing putting a bat on top of a pangolin, on top of a duck?

[1571] It's like, get your shit together.

[1572] So, like, if anything, the wet market makes them

[1573] look worse in my view.

[1574] Now, I'm sure that what they actually did to conceal a lab leak, if it was a lab leak, all of that's going to look odious.

[1575] Do you think we ever get to the bottom of that?

[1576] I mean, one of the big failures, I would say, of Anthony Fauci and so on is failing to be transparent and clear and just a good communicator about gain-of-function research, the dangers of it.

[1577] Like, you know, why it's a useful avenue of research, but also why it's dangerous.

[1578] Just being transparent about that, as opposed to just coming off really shady.

[1579] Of course, the conspiracy theorists and the politicians are not helping, but this just created a giant mess.

[1580] Yeah, no, I would agree.

[1581] So that exchange with Fauci and Rand Paul that went viral, yeah, I would agree that Fauci looked like he was taking refuge in kind of very lawyered

[1582] language and not giving a straightforward account of what we do and why we do it.

[1583] And so, yeah, I think it looked shady, it played shady, and it probably was shady.

[1584] I mean, I don't know how personally entangled he is with any of this, but yeah, gain-of-function research is something that I think we're wise to be worried about.

[1585] And insofar as I judge myself adequate to have an opinion on this,

[1586] I think it should be banned, right?

[1587] Like, that's probably a podcast I'll do, you know, if you or somebody else doesn't do it in the meantime.

[1588] You know, I would like a virologist on to defend it against a virologist who would criticize it.

[1589] Forget about just the gain-of-function research.

[1590] I don't even understand virus hunting at this point.

[1591] It's like, I don't know.

[1592] I don't even know why you need to go into a cave to find the next virus that could be circulating among bats that may jump zoonotically to us.

[1593] Why do that when we can sequence in a day and make vaccines in a weekend?

[1594] I mean, like, what kind of head start do you think you're getting?

[1595] That's a surprising new thing, how quickly you can develop a vaccine.

[1596] That's, uh, yeah, that's really interesting, but the shadiness around lab leak.

[1597] I think the point I didn't make about Brett's style of engaging this issue is that people are using the fact that he was early on lab leak to suggest that he was right about ivermectin and about mRNA vaccines and all the rest. Like, no, none of that connects. And it was possible to be falsely confident. No one should have been confident about lab leak early, even if it turns out to be a lab leak, right? It was always plausible, it was never definite, and it still isn't definite. Zoonotic is also quite plausible.

[1598] It certainly was super plausible then.

[1599] Both are politically uncomfortable.

[1600] Both of the time were inflammatory to be banging on about when we were trying to secure some kind of cooperation from the Chinese, right?

[1601] So there's a time for these things, and it's possible to be right by accident.

[1602] That's the thing.

[1603] It matters. Your reasoning, the style of reasoning, matters, whether you're right or not.

[1604] You know, it's like, because your style of reasoning is dictating what you're going to do on the next topic.

[1605] Sure, but this is a multivariate situation here.

[1606] It's really difficult to know what's right on COVID, given all the uncertainty, all the chaos, especially when you step outside the pure biology and virology of it and you start getting to policy.

[1607] It's really... Yeah, there are just tradeoffs. Like transmissibility of the virus. Sure. Just knowing, if 65% of the population gets vaccinated, what effect would that have?

[1608] Just even knowing those things, just modeling all those things.

[1609] Given all the other incentives, I mean, Pfizer, I don't know what to think.

[1610] You had the CEO of Pfizer on your podcast.

[1611] Did you leave that conversation feeling like this is a person who is consciously reaping windfall profits on a dangerous vaccine and putting everyone at intolerable risk? Or did you think this person was making a good-faith attempt to save lives and had no taint of bad incentives? Or something in between? The thing I sensed, and I felt in part it was a failure on my part, but I sensed that I was talking to a politician.

[1612] So it's not that I was thinking there was malevolence there, or benevolence.

[1613] There was, he just had a job.

[1614] He put on a suit, and I was talking to a suit, not a human being.

[1615] Now, he said that his son was a big fan of the podcast, which is why he wanted to do it.

[1616] So I thought I would be talking to a human being.

[1617] And I asked what I thought were challenging questions; the internet thinks otherwise.

[1618] Every single question in that interview was a challenging one.

[1619] But it wasn't grilling, which is what people seemed to want to do with pharmaceutical companies.

[1620] There's a deep distrust of pharmaceutical companies.

[1621] What was the alternative?

[1622] I mean, I totally get that windfall profits at a time of public health emergency looks bad.

[1623] It is a bad look, right?

[1624] But how do we reward and return capital to risk-takers who will spend a billion dollars to design a new drug for a disease that maybe only harms a single-digit percentage of the population?

[1625] It's like, well, what do we want to encourage?

[1626] And who do we want to get rich?

[1627] I mean, so the person who cures cancer, do we want that person to get rich or not?

[1628] We want the person who gave us the iPhone to get rich, but we don't want the person who cures cancer to get rich.

[1629] What are we trying to do?

[1630] I think it's a very gray area.

[1631] So what we want is the person who declares that they have a cure for cancer to have authenticity and transparency.

[1632] I think we're good now as a population at smelling bullshit.

[1633] And there is something about the Pfizer CEO, for example, and CEOs of pharmaceutical companies in general: just because they're so lawyered up, with so many marketing and PR people around them, you just smell bullshit.

[1634] You're not talking to a real human.

[1635] It just feels like none of it is transparent to us as a public.

[1636] So like this whole talking point that Pfizer is only interested in helping people just doesn't ring true, even though it very well could be true.

[1637] It's the same thing with Bill Gates, who

[1638] seems to be, at scale, helping a huge number of people in the world.

[1639] And yet there's something about the way he delivers that message where people are like, this seems suspicious.

[1640] What's happening underneath this?

[1641] There are certain kinds of communication styles that seem to serve as better catalysts for conspiracy theories.

[1642] And I'm not sure what that is, because I don't think there's an alternative to capitalism in delivering drugs that help

[1643] people.

[1644] But also at the same time, there seems to need to be more transparency.

[1645] And plus, like, regulation that actually makes sense, because it seems like pharmaceutical companies are susceptible to corruption.

[1646] Yeah.

[1647] I worry about all that.

[1648] But I also do think that most of the people going into those fields, and most of the people going into government, are doing it for good.

[1649] They're non -psychopaths trying to get good things done and trying to solve hard problems.

[1650] And they're not trying to get rich.

[1651] I mean, many of the people are. It's like, bad incentives are the thing.

[1652] Again, I've uttered that phrase 30 times on this podcast, but almost everywhere it explains normal people creating terrible harm, right?

[1653] It's not that there are that many bad people.

[1654] And, yes, it makes the truly bad people that much more remarkable and worth paying attention to.

[1655] But the bad incentives and the power of bad ideas do much more harm.

[1656] Because, I mean, that's what gets good people running in the wrong direction or doing things that are clearly creating unnecessary suffering.

[1657] You've had, and I hope still have, a friendship with Elon Musk, especially over the topic of AI.

[1658] You have a lot of interesting ideas that you both share, concerns you both share.

[1659] Well, let me first ask, what do you admire most about Elon?

[1660] Well, you know, I've had a lot of fun with Elon.

[1661] I like Elon a lot.

[1662] I mean, Elon, who I knew as a friend,

[1663] I like a lot.

[1664] And, you know, it's not going to surprise anyone.

[1665] I mean, he's done and he's continuing to do amazing things.

[1666] And I think if many of his aspirations are realized, the world will be a much better place.

[1667] I think it's just, it's amazing to see what he's built and what he's attempted to build and what he may yet build.

[1668] So with Tesla, with SpaceX, with...

[1669] Yeah, no, I'm a fan of

[1670] almost all of that.

[1671] I mean, there are wrinkles to a lot of that, you know, or some of that.

[1672] All humans are full of wrinkles.

[1673] There's something very Trumpian about how he's acting on Twitter.

[1674] I mean, Twitter... I don't think Twitter's great.

[1675] He does, he thinks Twitter's great.

[1676] He bought the place because he thinks it's so great.

[1677] I think Twitter's driving him crazy, right?

[1678] I think he's needlessly complicating his life and harming his reputation and creating a lot of noise and harming a lot of other people.

[1679] I mean, so, the thing that I objected to with him on Twitter is not that he bought it and made changes to it.

[1680] Again, I remain agnostic as to whether or not he can improve the platform.

[1681] It was how he was personally behaving on Twitter, not just toward me, but toward the world.

[1682] I think when you, you know, forward an article about Nancy Pelosi's husband being attacked, not as he was by some lunatic, but claiming it's just some gay tryst gone awry, right?

[1683] That's not what it seems.

[1684] And you link to a website that previously claimed that Hillary Clinton was dead and that a body double was campaigning in her place.

[1685] That thing was exploding in Trumpistan as a conspiracy theory, right?

[1686] And it was having its effect.

[1687] And it matters that he was signal boosting it in front of 130 million people.

[1688] And so it is with saying that your, you know, your former employee, Yoel Roth, is a pedophile, right?

[1689] I mean, it's like that has real consequences.

[1690] It appeared to be complete bullshit.

[1691] And now this guy's getting inundated with death threats, right?

[1692] And Elon, all that's totally predictable, right?

[1693] And so he's behaving quite recklessly.

[1694] And there's a long list of things like that that he's done on Twitter.

[1695] It's not ethical.

[1696] It's not good for him. It's not

[1697] good for the world.

[1698] It's not serious.

[1699] It's just a very adolescent relationship to real problems in our society.

[1700] And so my problem with how he's behaved is that he's purported to touch real issues by turns, like, okay, do I give the satellites to Ukraine or not?

[1701] Do I minimize their use of them or not?

[1702] Should I publicly worry about World War III or not, right?

[1703] He's doing this shit on

[1704] Twitter, right?

[1705] And at the same moment, he's doing these other very impulsive, ill -considered things, and he's not showing any willingness to really clean up the mess he makes.

[1706] He brings Kanye on, knowing he's an anti-Semite who's got mental health problems, and then kicks him off for a swastika, which I probably wouldn't have kicked him off for.

[1707] Like, even... can you really kick people off for swastikas?

[1708] Is that something that you get banned for? I mean, are you a free speech absolutist if you can't let a swastika show up?

[1709] I'm not even sure that's an enforceable term of service, right?

[1710] There are moments to use swastikas that are not conveying hate and not raising the risk of violence.

[1711] Clip that?

[1712] Yeah.

[1713] Anyway, so much of what he's doing... given that, again, scale matters.

[1714] He's doing this in front of 130 million people.

[1715] That's very different than a million people.

[1716] And that's very different than 100,000 people.

[1717] And so when things went off the tracks with Elon, he was doing this about COVID.

[1718] And again, this was a situation where I tried to privately mitigate a friend's behavior.

[1719] And it didn't work out very well.

[1720] Did you try to correct him sort of highlighting things he might be wrong on?

[1721] Yeah.

[1722] Or did you use the Lex Power Love method?

[1723] I should write like a pamphlet for Sam Harris.

[1724] No, but it was totally coming from a place of love because I was concerned about his reputation.

[1725] I was concerned about what he, I mean, there was a twofold concern.

[1726] I could see what was happening with the tweet.

[1727] I mean, he had this original tweet that was, I think it was panic over COVID is dumb or something like that, right?

[1728] This is in March, this is early March 2020.

[1729] Oh, super early days.

[1730] Super early.

[1731] When nobody knew anything, but we knew, we saw what was happening in Italy, right?

[1732] It was totally kicking off.

[1733] God, that was a wild time.

[1734] That was the time of the toilet paper panic.

[1735] It was totally wild.

[1736] But that became the most influential tweet on Twitter for that week.

[1737] I mean, it had more engagement than any other tweet, more than any crazy thing Trump was tweeting.

[1738] I mean, it went off, again, it was just a nuclear bomb of information.

[1739] And I could see that people were responding to it, like, wait a minute, here's this genius technologist who must have inside information about everything, right?

[1740] Surely he knows something that is not on the surface about this pandemic. And they were reading into it a lot of information that I knew wasn't there, right?

[1741] And at the time, I didn't think he had any reason to be suggesting that.

[1742] I think he was just firing off a tweet, right?

[1743] So I reached out to him in private, and, I mean, because it was a private text conversation, I won't talk

[1744] about the details. But I'm just saying that's a case, you know, among the many cases of friends who have public platforms and who did something that I thought was dangerous and ill-considered, where I reached out in private and tried to help, genuinely help, because I thought it was harmful in every sense, because it was being misinterpreted. It's like, okay, you can say that panic over anything is dumb, fine.

[1745] but this was not how this was landing.

[1746] This was like non-issue, conspiracy, there's going to be no COVID in the U.S., it's going to peter out, it's just going to become a cold.

[1747] I mean, that's how this was getting received, whereas at that moment it was absolutely obvious how big a deal this was going to be, or that it was, at minimum, going to be a big deal.

[1748] I don't know if it was obvious, but it was obvious that it was a significant probability that it could be a big deal.

[1749] I remember in March, it was unclear, like, how big, because there were still some stories of, like, the big concern being that the hospitals might overfill, but it's going to die out in, like, two months or something.

[1750] Yeah, we didn't, no, but it was, there was no way we weren't going to have tens of thousands of deaths at a minimum at that point.

[1751] And it was totally rational to be worried about hundreds of thousands.

[1752] And when Nicholas Christakis came on my podcast very early, you know, he predicted quite confidently that we would have about a million people dead in the U.S., right?

[1753] And that didn't seem, you know, I think it was appropriately hedged, but I mean, it was still, just like, okay, you just look at it, we were just kind of riding this exponential, and it's going to be, you know, it would be very surprising not to have that order of magnitude, to have something much, much less.
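
The arithmetic behind "riding this exponential" can be sketched in a few lines. The doubling time, starting count, and horizon below are illustrative assumptions for the sketch, not epidemiological estimates:

```python
# Unmitigated exponential growth: why small early case counts implied a
# very large event. All numbers here are illustrative assumptions.

def projected_cases(initial_cases, doubling_days, horizon_days):
    """Cases after `horizon_days` if they double every `doubling_days`."""
    return initial_cases * 2 ** (horizon_days / doubling_days)

# 1,000 cases doubling every 3 days is 2**20 growth over 60 days:
cases_day_60 = projected_cases(initial_cases=1_000,
                               doubling_days=3,
                               horizon_days=60)
# That is on the order of a billion infections if nothing slowed the
# spread, which is why "tens of thousands of deaths at a minimum" was
# the conservative end of the range.
```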

[1754] And so anyway, I mean, again, to close the story on Elon, I could see how this was being received, and I tried to get him to walk that back.

[1755] And then we had a fairly long and detailed exchange on this issue.

[1756] And so that intervention didn't work.

[1757] And it was not done, you know, I was not an asshole, I was just concerned, you know, for him, for the world. And, you know, and then there are other relationships where I didn't take the, again, that's an example where taking the time privately didn't work, right.

[1758] There are other relationships where I thought, okay, this is just going to be more trouble than it's worth, and I just ignored it, you know, and there's a lot of that.

[1759] And, frankly, I'm not comfortable with how this has all netted out, because, frankly, I'm not comfortable with how much time in this conversation we've spent talking about these specific people.

[1760] Like, what good is it for me to talk about Elon or Bret or anyone? I think there's a lot of good, because those friendships, listen, as a fan, these are the conversations I loved as a fan, and it feels like COVID has robbed the world of these conversations. Because you were exchanging back and forth on Twitter, but that's not what I mean by conversations. I mean long-form discussions, like a debate about COVID, like a normal debate. But there's no, Elon and I shouldn't be debating COVID. You should be, here's the thing, with humility, like basically saying, we don't really know, like the Rogan method.

[1761] We're just a bunch of idiots.

[1762] Like, one is an engineer, you're a neuroscientist, but, like, just kind of, okay, here's the evidence, and be like normal people.

[1763] That's what everybody was doing.

[1764] The whole world was like trying to figure out what the hell, what?

[1765] Yeah, but the issue was that, at the moment I had this collision with Elon, certain things were not debatable, right?

[1766] It was just absolutely clear where this was going.

[1767] It wasn't clear how far it was going to go or how quickly we would mitigate it, but it was absolutely clear that it was going to be an issue, right?

[1768] The train had come off the tracks in Italy.

[1769] We knew we weren't going to seal our borders.

[1770] There were already cases, you know, cases known to many of us personally in the U.S. at that point.

[1771] And he was operating by a very different logic that I couldn't engage with.

[1772] Sure, but that logic represents a part of the population, and there's a lot of interesting topics that have a lot of uncertainty around them, like the effectiveness of masks.

[1773] Yeah, but no, but where things broke down was not at the point of, oh, there's a lot to talk about, a lot to debate, this is all very interesting, and who knows what's what.

[1774] It broke down very early at this is, you know, there's nothing to talk about here.

[1775] It's like either there's a water bottle on the table or there isn't, right?

[1776] Like, well, technically there's only one-fourth of a water bottle.

[1777] So what defines a water bottle?

[1778] Is it the water?

[1779] Is it the bottle?

[1780] What I'm giving you as an example is worth a conversation.

[1781] This is difficult, because we had an exchange in private, and I want to honor not exposing the details of it.

[1782] But, you know, the details convinced me that there was not a follow-up conversation on that topic.

[1783] On this topic.

[1784] That said, I hope, and I hope to be part of helping that happen, that the friendship is rekindled, because one of the topics I care a lot about is artificial intelligence, and you've had great public and private conversations about this topic.

[1785] Yeah, and Elon was very formative in my taking that issue seriously.

[1787] I mean, he and I went to that initial conference in Puerto Rico together, and it was only because he was going, and I found out about it through him, and I just rode his coattails to it, you know, that I got sort of dropped into that side of the pool to hear about these concerns at that point.

[1788] It would be interesting to hear how your concern has evolved with the coming out of ChatGPT and these new large language models that are fine-tuned with reinforcement learning and seem to be able to do some incredible human-like things.

[1789] There's two questions.

[1790] One, how has your concern in terms of AGI and superintelligence evolved?

[1791] And how impressed are you with ChatGPT, as a student of the human mind and mind in general?

[1792] Well, my concern about AGI is unchanged.

[1793] So I've spoken, talked about it a bunch on my podcast, but I, you know, I did a TED talk in 2016, which was the kind of summary of what that conference and, you know, various conversations I had after that did to my brain on this topic.

[1796] Basically, that once superintelligence is achieved, there's a takeoff, it becomes exponentially smarter, and in a matter of time, we're just ants and they're gods.

[1797] Well, yeah, unless we find some way of permanently tethering a superintelligent, self-improving AI to our value system.

[1798] And I, you know, I don't believe anyone has figured out how to do that or whether that's even possible in principle.

[1799] I mean, I know people like Stuart Russell, who I just had on my podcast, are...

[1800] Oh, really?

[1801] Have you released it?

[1802] Haven't released it, yeah.

[1803] Oh, great.

[1804] He's been on previous podcasts, but we just recorded this week.

[1806] Because you haven't done an AI podcast in a while, so it's great.

[1807] Yeah, it's great.

[1808] He's a good person to talk about alignment with.

[1809] Yeah, so Stuart, I mean, Stuart has been, you know, probably more than anyone, my guru on this topic.

[1810] I mean, like, you're just reading his book and doing, I think I've done two podcasts with them at this point.

[1811] I think it's called the control problem or something like that.

[1812] His is, his book is Human Compatible.

[1813] Human compatible.

[1814] Yeah, he talks about the control problem.

[1815] And yeah, so I just think the idea that we can define a value function in advance that permanently tethers a self-improving superintelligent AI to our values as we continue to discover them, refine them, extrapolate them in an open-ended way, I think that's a tall order, and I think there must be many more ways of designing superintelligence that is not aligned in that way and is not ever approximating our values in that way.

[1816] So, I mean, Stuart's idea, to put it in a very simple way, is that he thinks you don't want to specify the value function up front.

[1817] You don't want to imagine you could ever write the code in such a way as to admit of no loophole.

[1818] You want to make the AI uncertain as to what human values are, and perpetually uncertain, and always trying to ameliorate that uncertainty by hewing more and more closely to what our professed values are.

[1819] So it's always interested in us saying, oh, no, no, that's not what we want.

[1820] That's not what we intend.

[1821] Stop doing that.

[1822] No matter how smart it gets, all it wants to do is more perfectly approximate human values.
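
The general idea being described here, an agent kept uncertain about human values that defers to corrections, can be caricatured in a few lines of code. This is a toy sketch only, not Russell's actual formulation; the candidate value models, numbers, and update rule are all illustrative assumptions:

```python
# Toy sketch: an agent that doesn't know which reward function the human
# has, keeps a belief over candidates, and treats "stop doing that" as
# evidence about human values. All names and numbers are illustrative.

CANDIDATE_VALUES = {
    "clean_quickly":   {"vacuum": 1.0, "repave_driveway": 3.0,  "wait": 0.0},
    "protect_insects": {"vacuum": 0.2, "repave_driveway": -1.0, "wait": 0.1},
}

class UncertainAgent:
    def __init__(self):
        # Uniform prior: the agent starts maximally unsure of human values.
        n = len(CANDIDATE_VALUES)
        self.belief = {name: 1.0 / n for name in CANDIDATE_VALUES}

    def expected_reward(self, action):
        # Average each action's value over the agent's uncertainty.
        return sum(p * CANDIDATE_VALUES[name][action]
                   for name, p in self.belief.items())

    def choose(self, actions):
        return max(actions, key=self.expected_reward)

    def observe_correction(self, discouraged_action):
        # "No, stop doing that" down-weights value models that rate the
        # discouraged action highly (a crude Bayesian update).
        for name in self.belief:
            if CANDIDATE_VALUES[name][discouraged_action] > 0.5:
                self.belief[name] *= 0.1  # down-weight, don't eliminate
        total = sum(self.belief.values())
        self.belief = {k: v / total for k, v in self.belief.items()}

agent = UncertainAgent()
actions = ["vacuum", "repave_driveway", "wait"]
first = agent.choose(actions)                # picks "repave_driveway"
agent.observe_correction("repave_driveway")  # the human objects
second = agent.choose(actions)               # now picks "vacuum"
```

The point of the sketch is the mechanism: the agent never locks in a value function, so a correction always has leverage over its future behavior.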

[1823] I think there are a lot of problems with that, you know, at a high level.

[1824] I'm not a computer scientist, so I'm sure there are many problems at a low level that I don't understand.

[1825] Like how to force a human into the loop always, no matter what.

[1826] There's that, and, like, which humans get a vote, and just what do humans value, and what is the difference between what we say we value and our revealed preferences. Which, I mean, if you were a superintelligent AI that could look at humanity now, I think you could be forgiven for concluding that what we value is driving ourselves crazy with Twitter, and living perpetually on the brink of nuclear war, and, you know, just watching hot girls in yoga pants on TikTok again and again and again.

[1827] And you're saying that is not?

[1828] This is all revealed preference.

[1829] And it's what is an AI to make of that?

[1830] And what should it optimize?

[1831] So, part of, this is also Stuart's observation, that one of the insidious things about, like, the YouTube algorithm is that it's not that it just caters to our preferences.

[1832] It actually begins to change us in ways so as to make us more predictable.

[1833] Like, it finds ways to make us a better reporter of our preferences, and to trim our preferences down so that it can further train up that signal.

[1835] So the main concern is that most of the people in the field seem not to be taking intelligence seriously.

[1836] As they design more and more intelligent machines and as they profess to want to design true AGI, they're not, again, they're not spending the time that Stuart is spending trying to figure out how to do this safely, above all.

[1837] They're just assuming that these problems are going to solve themselves as we make that final stride into the end zone, or they're saying very, you know, Pollyannaish things like, you know, an AI would never form a motive to harm humans, like, why would it ever form a motive to be malicious toward humanity, right, unless we put that motive in there, right?

[1838] And that's not the concern.

[1839] And the concern is that in the presence of vast disparities in competence, and certainly in a condition where the machines are improving themselves, they're improving their own code, they could be developing instrumental goals that are antithetical to our well-being without any intent to harm us.

[1840] It's analogous to what we do to every other species on Earth.

[1841] I mean, you and I don't consciously form the intention to harm insects on a daily basis, but there are many things we could intend to do that would, in fact, harm insects because, you know, you decide to repave your driveway or whatever you're doing.

[1842] You're just not taking the interest of insects into account because they're so far beneath you in terms of your cognitive horizons.

[1843] And so the real challenge here is if you believe that intelligence, you know, scales up on a continuum toward heights that we can only dimly imagine.

[1844] And I think there's every reason to believe that.

[1845] There's no reason to believe that we're near the summit of intelligence.

[1846] And you can, you know, define, maybe there are some forms of intelligence for which this is not true, but for many relevant forms, you know, like the top 100 things we care about cognitively, I think there's every reason to believe that many of those things, most of those things, are a lot like chess or Go, where once the machines get better than we are, they're going to stay better than we are. Although, I don't know if you've caught the recent thing with Go, and this actually came out of Stuart's lab.

[1847] Yeah, yeah.

[1848] One time a human beat a machine.

[1849] Yeah, they found a hack for that.

[1850] But anyway, ultimately, there's going to be no looking back.

[1851] And then the question is, what do we do in relationship to these systems that are more competent than we are in every relevant respect?

[1852] Because it will be a relationship.

[1853] The people who think we're just going to figure this all out, you know, without thinking about it in advance, that these solutions are just going to find themselves, seem not to be taking the prospect of really creating autonomous superintelligence seriously.

[1854] Like, like, what does that mean?

[1855] It's every bit as independent and ungovernable, ultimately, as us having created, I mean, just imagine if we created a race of people that were 10 times smarter than all of us.

[1856] Like, how would we live with those people?

[1857] They're 10 times smarter than us, right?

[1858] They begin to talk about things we don't understand.

[1859] They begin to want things we don't understand.

[1860] They begin to view us as obstacles to them, to their solving those problems or gratifying those desires.

[1861] We become the chickens or the monkeys in their presence.

[1862] And I think that, but for some amazing solution of the sort that Stuart is imagining, that we could somehow anchor their reward function permanently, no matter how intelligence scales.

[1863] I think it's really worth worrying about this.

[1864] I do buy the sci -fi notion that this is an existential risk if we don't do it well.

[1865] I worry that we don't notice it.

[1866] I'm deeply impressed with ChatGPT, and I'm worried that these language models will become superintelligent, because they're basically trained on the collective intelligence of the human species, and then they will start controlling our behavior, if they're integrated into our algorithms, the recommender systems, and then we just won't notice that there's a superintelligent system that's controlling our behavior.

[1867] Well, I think that's true even before, far before superintelligence, even before general intelligence.

[1868] I mean, I think just the narrow intelligence of these algorithms and of what something like, you know, ChatGPT can do, I mean, it's far short of it developing its own goals that are at cross purposes with ours. Just the unintended consequences of using it in the ways we're

[1869] going to be incentivized to use it, and, you know, the money to be made from scaling this thing, and what it does to our information space and our sense of just being able to get to ground truth on any facts, it's, yeah, it's super scary. Do you think it's a giant leap in terms of development towards AGI, ChatGPT, or is this just an impressive little toolbox?

[1870] So, like, when do you think the singularity is coming?

[1871] Or does it not matter, if it's coming eventually?

[1872] I have no intuitions on that front, apart from the fact that if we continue to make progress, it will come.

[1873] So it's just, you just have to assume we continue to make progress.

[1874] There are only two assumptions.

[1875] You have to assume substrate independence.

[1876] So there's no reason why this can't be done in silico.

[1877] It's just, we can build arbitrarily intelligent machines; there's nothing magical about having this done in the wetware of our own brains.

[1879] I think that is true, and I think that's scientifically parsimonious to think that that's true.

[1880] And then you just have to assume we're going to keep making progress.

[1881] It doesn't have to be any special rate of progress, doesn't have to be Moore's law, it can just be, we just keep going.

[1882] At a certain point, we're going to be in relationship to minds, leaving consciousness aside.

[1883] I don't have any reason to believe that they'll necessarily be conscious by virtue of being super intelligent, and that's its own interesting ethical question.

[1884] But leaving consciousness aside, they're going to be more competent than we are.

[1885] And then that's like, you know, the aliens have landed.

[1886] You know, that's literally, that's an encounter with, again, leaving aside the possibility that something like Stuart's path is actually available to us.

[1888] But it is hard to picture, if what we mean by intelligence, all things considered, is truly general, and if that scales and begins to build upon itself, how you maintain that perfect, slavish devotion until the end of time in those systems.

[1889] The tether to humans?

[1890] Yeah.

[1891] I think my gut says that that tether is not impossible, there are a lot of ways to do it.

[1892] So it's not this increasingly impossible problem.

[1893] Right.

[1894] So, you know, as you know, I'm not a computer scientist, so I have no intuitions about just algorithmically how you would approach that and what's possible.

[1896] My main intuition is maybe deeply flawed, but the main intuition is based on the fact that most of the learning is currently happening on human knowledge.

[1897] So even ChatGPT is just trained on human data.

[1898] Right.

[1899] I don't see where the takeoff happens where you completely go above human wisdom.

[1900] The currently impressive aspect of ChatGPT is that it's using the collective intelligence of all of us.

[1901] Well, from what I've gleaned, again, from people who know much more about this than I do, I think we have reason to be skeptical that these techniques of deep learning are actually going to be sufficient to push us into AGI.

[1902] So it's just, they're not generalizing in the way they need to, they're certainly not learning like human children.

[1903] And so they're brittle in strange ways.

[1904] It's not to say that the human path is the only path, you know, and maybe we might learn better lessons by ignoring the way brains work, but we know that they don't generalize and use abstraction the way we do.

[1905] And so, they have strange holes in their competence.

[1906] But the size of the holes is shrinking every time.

[1907] And that's, so the intuition starts to slowly fall apart.

[1908] You know, the intuition is like, surely it can't be this simple to achieve superintelligence.

[1910] But it's becoming simpler and simpler.

[1911] So I don't know.

[1912] The progress is quite incredible.

[1913] I've been extremely impressed with ChatGPT and the new models.

[1914] And there's a lot of financial incentive to make progress in this regard.

[1915] So we're going to be living through some very interesting times.

[1916] When I mentioned that I'm going to be talking to you, a lot of people brought up this topic, probably because Eric Weinstein talked to Joe Rogan recently and said that he and you were contacted by folks about UFOs.

[1917] Can you clarify the nature of this contact, what you were contacted about?

[1918] I've got very little to say on this.

[1919] I mean, he has much more to say.

[1920] I think he went down this rabbit hole further than I did, which wouldn't surprise anyone.

[1921] He's got much more of a taste for this sort of thing than I do.

[1922] But I think we're contacted by the same person.

[1923] And it wasn't clear to me who this person was or how this person got my cell phone number.

[1924] They didn't seem, it didn't seem like we were getting punked.

[1925] I mean, the person seemed credible to me. And they were talking to you about the release of different videos on UFOs?

[1926] Yeah, and this is when there was a flurry of activity around this.

[1927] So there was a big New Yorker article on UFOs.

[1928] And there were rumors of congressional hearings, I think, coming, and there were the videos that were being debunked or not.

[1929] And so this person contacted both of us, I think, around the same time.

[1930] And I think he might have contacted Rogan or others.

[1931] Eric is just the only person I've spoken to about it, I think, who I know was contacted.

[1932] And then what happened is the person kept, you know, writing a check that he didn't cash.

[1934] Like, he kept saying, okay, next week, I'm going to, you know, I understand this is sounding spooky and, you know, you have no reason to really trust me. But next week, I'm going to, I'm going to put you on a Zoom call with people who you will recognize and they're going to be, you know, former heads of the CIA and, you know, people who just, you're going to, within five seconds of being on the Zoom call, you'll, you'll know this is not a hoax.

[1935] And I said, great, just let me know, just send me the Zoom link, right?

[1936] And that happened maybe three times, you know. There was just one phone conversation, and then it was just texts, you know, just a bunch of texts.

[1937] And I think Eric spent more time with this person, and I haven't spoken to him about it.

[1938] I know he spoke about it publicly, but, um, so, you know, it's not that my bullshit detector ever really went off in a big way.

[1939] It's just the thing never happened, and so I lost interest.

[1941] So you made a comment, which is interesting, and which I really appreciate, that you ran the thought experiment of saying, okay, maybe we do have alien spacecraft, or just the thought experiment that aliens did visit.

[1942] Yeah.

[1943] And then this is a very kind of nihilistic, sad thought that it wouldn't matter.

[1944] It wouldn't affect your life.

[1945] Can you explain that?

[1946] Well, no, I think many people noticed this.

[1947] I mean, this was a sign of how crazy the news cycle was at that point, right?

[1948] Like, we had COVID, and we had Trump, and I forget when the UFO thing was really kicking off, but it just seemed like no one had the bandwidth to even be interested in this.

[1949] It's like I was amazed to notice in myself that I wasn't more interested in figuring out what was going on.

[1950] It's like, and I considered, okay, wait a minute, this is, if this is true, this is the biggest story in anyone's lifetime.

[1951] I mean, contact with alien intelligence is by definition the biggest story in anyone's lifetime in human history.

[1952] Why isn't this just totally captivating?

[1953] And not only was it not totally captivating, it was just barely rising to the level of my being able to pay attention to it.

[1954] And I view that, I mean, one, as, to some degree, an understandable defense mechanism against the bogus claims that have been made about this kind of thing in the past.

[1955] You know, the general sense is it's probably bullshit, or it probably has some explanation that is, you know, purely terrestrial and not surprising.

[1956] And there is somebody, what's his name, is it Mick West? I forget. Is it a YouTube... Yeah, yeah, he debunked stuff. Yeah, I mean, I have since seen some of those videos. I mean, now this is going back still at least a year, but some of those videos seem like fairly credible debunkings of some of the optical evidence, and I'm surprised we haven't seen more of that. Like, there was a fairly credulous 60 Minutes piece that came out around that time, looking at some of that video, and it was the very video that he was debunking on YouTube, and his video only had like 50,000 views on it or whatever.

[1957] But again, it seemed like a fairly credible debunking.

[1958] I haven't seen debunkings of his debunkings, but...

[1959] I think there are, but he's basically saying that there are possible explanations for it.

[1960] Right.

[1961] And usually in these kinds of contexts, if there's a possible explanation, even if it seems unlikely, it's going to be more likely than an alien civilization visiting us.

[1962] Yeah, so the extraordinary claims require extraordinary evidence principle, which I think is generally true.

[1963] Well, with aliens, I think generally there should be some humility about what they would look like when they show up.

[1964] I tend to think they're already here.

[1965] The amazing thing about this AI conversation, though, is that we're talking about a circumstance where we would be designing the aliens, and there's every reason to believe that eventually this is going to happen.

[1966] I'm not at all skeptical about the coming reality of the aliens, and we're going to build them.

[1967] Now, here's the thing.

[1968] Does this apply to when superintelligence shows up?

[1969] Will this be trending on Twitter for a day?

[1970] And then we'll go on to complain about something Sam Harris once again said on his podcast the next day.

[1971] You tend to trend on Twitter, even though you're not on Twitter, which is great.

[1972] Yeah.

[1973] I haven't noticed.

[1974] I mean, I did notice when I was on, but...

[1975] Did you have this concern about AGI, basically, the same kind of thing, that we would just look the other way?

[1976] Is there something about this time where even World War III, which has been thrown around very casually, concerningly so, even that, the news cycle wipes away?

[1977] Yeah.

[1978] Well, I think we have this general problem that we can't make certain information, even, you know, unequivocally certain information, emotionally salient.

[1980] Like, we respond quite readily to certain things.

[1981] I mean, as we talked about, we respond to the little girl who fell down a well.

[1982] I mean, that just, that gets 100% of our emotional resources.

[1983] But the abstract probability of nuclear war, right, even a high probability, even an intolerable probability, even if we put it at 30%, right?

[1984] You know, it's just like, that's Russian roulette with a gun with three chambers, and, you know, it's aimed, not only at your head, but your kid's head and everyone's kid's head, and it's just there 24 hours a day.

[1985] I mean, this is pre-Ukraine, I think the people who have made it their business, you know, professionally, to think about the risk of nuclear war and to mitigate it, people like Graham Allison or William Perry, I mean, I think they were putting, like, the ongoing risk.

[1986] I mean, just the risk that we're going to have a proper nuclear war at some point in the next generation,

[1987] people were putting it at, you know, something like 50%, right?

[1988] They were living with this sort of sword of Damocles over our heads.

[1989] Now, you might wonder whether anyone can have reliable intuitions about the probability of that kind of thing, but the status quo is truly alarming.

[1990] I mean, we've got, you know, we've got ICBMs, I mean, leave aside smaller exchanges and, you know, tactical nukes and how we could have a world war, you know, based on incremental changes.

[1991] We've got the biggest bombs aimed at the biggest cities in both directions, and it's old technology, right?

[1992] And it's, you know, vulnerable to some lunatic deciding to launch, or misreading, you know, bad data.

[1993] And we know we've been saved from nuclear war, I think at least twice by, you know, Soviet submarine commanders deciding, I'm not going to pass this up the chain of command, right?

[1994] It's like, this is almost certainly an error, and it turns out it was an error.

[1995] And it's like, and we need people to, I mean, in that particular case, he saw what seemed like five missiles launched from the U.S. to Russia.

[1996] And he reasoned, if America was going to engage in a first strike, they'd launch more than five missiles, right?

[1997] So this has to be fictional.

[1998] And then he waited long enough to decide that it was fictional.

[1999] But the probability of a nuclear war happening by mistake or some other species of inadvertence, you know, misunderstanding, technical malfunction, that's intolerable.

[2000] Forget about the intentional use of it by people who are driven crazy by some ideology.

[2001] And more and more technologies enable a kind of scale of destruction.

[2002] And misinformation plays into this picture in a way that is especially scary.

[2003] I mean, once you can get a deepfake of any current president of the United States claiming to have launched a first strike, you know, and just, you know, send that everywhere.

[2004] But that could change the nature of truth, and then, that might change the engine we have for skepticism, sharpen it, the more you have deepfakes.

[2005] Yeah, and we might have AI and digital watermarks that help us.

[2006] Maybe we'll not trust any information that hasn't come through specific channels, right?

[2007] I mean, so in my world, it's like, I no longer feel the need to respond to anything other than what I put out in my channels of information.

[2008] It's like there's so much, there are so many people who have clipped stuff of me that shows the opposite of what I was actually saying in context.

[2009] I mean, people have, like, re-edited my podcast audio to make it seem like I said the opposite of what I was saying.

[2010] It's like, unless I put it out, you know, you can't be sure that I actually said it, you know.

[2011] I mean, it's just, but I don't know what it's like to live like that for all forms of information.

[2012] And, I mean, strangely, I think it may require a greater siloing of information in the end.

[2013] You know, it's like it's, we're living through this sort of Wild West period.

[2014] where everyone's got a newsletter and everyone's got a blog and everyone's got an opinion.

[2015] But once you can fake everything...

[2016] There might be a greater value for expertise.

[2017] Yeah.

[2018] For experts, but a more rigorous system for identifying who the experts are.

[2019] Yeah, or just knowing that, you know, it's going to be an arms race to authenticate information.

[2020] So it's like, you can never trust a photograph unless it has been vetted by Getty Images, because only Getty Images has the resources to authenticate the provenance of that photograph and attest that it hasn't been meddled with by AI.

[2021] And again, I don't even know if that's technically possible.

[2022] And maybe whatever the tools available for this will be commodified and the cost will be driven to zero so quickly that everyone will be able to do it.

[2023] It could be like encryption.
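
The authentication idea being discussed can be illustrated with a toy example. Real provenance systems use asymmetric signatures and certificate chains; this symmetric HMAC sketch, with a made-up publisher key, is just to show the tamper-detection mechanism:

```python
# Toy sketch of content authentication: a publisher computes a keyed tag
# over an image's bytes so that later meddling is detectable. Illustrative
# only; real provenance uses asymmetric signatures, not a shared key.

import hmac
import hashlib

PUBLISHER_KEY = b"illustrative-secret-key"  # assumption: held only by the publisher

def sign_content(content: bytes) -> str:
    """Publisher computes a tag when the content is first released."""
    return hmac.new(PUBLISHER_KEY, content, hashlib.sha256).hexdigest()

def verify_content(content: bytes, tag: str) -> bool:
    """Anyone holding the key can check the content wasn't altered."""
    expected = hmac.new(PUBLISHER_KEY, content, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

photo = b"\x89PNG...original pixel data"
tag = sign_content(photo)
assert verify_content(photo, tag)                   # untouched: verifies
assert not verify_content(photo + b" edit", tag)    # meddled with: fails
```

The interesting property, as with encryption, is the asymmetry of cost: verifying is cheap for everyone, while forging a valid tag without the key is computationally infeasible.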

[2024] And it would be proven and tested most effectively first, of course, as always in porn.

[2025] Which is where most of human technology innovation happens first.

[2026] Well, I have to ask because Ron Howard, the director, asked us on Twitter.

[2027] Since we're talking about the threat of nuclear war and otherwise, he asked, I'd be interested in both your expectations for human society if/when we move beyond Mars.

[2028] Will those societies be industrial based?

[2029] How will it be governed?

[2030] How will criminal infractions be dealt with?

[2031] When you read or watch sci-fi, what comes closest to sounding logical? Do you think about our society beyond Earth, if we colonize Mars, if we colonize space? Yeah, well, I think I have a pretty humbling picture of that, because we're still going to be the apes that we are. So when you imagine colonizing Mars, you have to imagine the first fistfight on Mars. Yeah. You have to imagine the first murder on Mars. Also infidelity. Yeah, the first extramarital affairs on Mars, right?

[2032] So it's going to get really homely and boring really fast, I think.

[2033] You know, it's like only the space suits or the other exigencies of just living in that atmosphere or lack thereof will limit how badly we can behave on Mars.

[2034] But do you think most of the interaction would be still in meat space versus digital?

[2035] Do you think we're living through a transformation of a kind where we're going to be doing more and more interaction in digital space?

[2036] Like everything we've been complaining about Twitter, is it possible that Twitter is just the early days of a broken system that's actually giving birth to a better working system that's ultimately digital?

[2037] I think we're going to experience a pendulum swing back into the real world.

[2038] I mean, I think many of us are experiencing that now anyway.

[2039] I mean, just wanting to have face -to -face.

[2040] encounters and to spend less time on our phones and less time online.

[2041] I think, you know, maybe everyone isn't going in that direction, but I do notice it myself and I notice, I mean, once I got off Twitter, then I noticed the people who were never on Twitter, right?

[2042] Basically, I mean, I have a lot of friends who were never on Twitter.

[2043] Yeah.

[2044] And they actually never understood what I was doing on Twitter.

[2045] It's like, it wasn't that they were seeing it and then reacting to it, they just didn't know. It's like, I'm not on Reddit either, but I don't spend any time thinking about not being on Reddit, right?

[2046] It's like I'm just not on Reddit.

[2047] So you think the pursuit of human happiness is better achieved, more effectively achieved outside of Twitter world?

[2048] Well, I think all we have is our attention in the end, and we just have to notice what these various tools are doing to it.

[2049] And it's just, it became very clear to me that it was an unrewarding use of my attention.

[2050] Now, it's not to say there isn't some digital platform that's conceivable that would be useful and rewarding.

[2051] But, yeah, I mean, we just have, you know, our life is doled out to us in moments.

[2052] And we're continually solving this riddle of what is going to suffice to make this moment

[2053] engaging and meaningful and aligned with who I want to be now and how I want the future to look, right?

[2054] We have this tension between being in the present and becoming in the future.

[2055] And, you know, it's a seeming paradox.

[2056] Again, it's not really a paradox, but I can see it.

[2057] Like, I do think the ground truth for personal well -being is to find a mode of being where, you can pay attention to the present moment, and this is meditation by another name, you can pay attention to the present moment with sufficient gravity that you recognize that just consciousness itself in the present moment, no matter what's happening, is already a circumstance of freedom and contentment and tranquility.

[2058] Like you can be happy now before anything happens.

[2059] Before this next desire gets gratified, before this next problem gets solved, there's this kind of ground truth that you're free, that consciousness is free and open and unencumbered by really any problem until you get lost and thought about all the problems that may yet be real for you.

[2060] So the ability to catch and observe consciousness, that in itself is a source of happiness.

[2061] Without being lost in thought.

[2062] And so this happens haphazardly for people who don't meditate, because they find something in their life that's so captivating, it's so pleasurable, it's so thrilling.

[2063] It can even be scary, but even being scared is captivating.

[2064] It gets their attention, right?

[2065] Whatever it is.

[2066] Like, you know, Sebastian Junger wrote a great book about people's experience in war.

[2067] You know, it's like, like, strangely, it can be the best experience anyone's ever had because everything, it's like, only the moment matters, right?

[2068] Like the bullet is whizzing by your head.

[2069] You're not thinking about your 401k or that thing that you didn't say last week to the person you shouldn't have been talking about.

[2070] You're not thinking about Twitter.

[2071] It's like you're just fully immersed in the present moment.

[2072] Meditation is the only way, I mean, that word can mean many things to many people, but what I mean by meditation is simply the discovery that there is a way to engage the present moment directly, regardless of what's happening.

[2073] You don't need to be in a war.

[2074] You don't need to be having sex.

[2075] You don't need to be on drugs.

[2076] You don't need to be surfing.

[2077] There doesn't have to be a peak experience.

[2078] It can be completely ordinary, but you can recognize that in some basic sense, there's only this, and everything else is something you're thinking.

[2079] You're thinking about the past.

[2080] You're thinking about the future.

[2081] And thoughts themselves have no substance, right?

[2082] It's fundamentally mysterious that any thought ever really commandeers your sense of who you are and makes you anxious or afraid or angry or whatever it is.

[2083] And the more you discover that, the half-life of all these negative emotions that blow all of us around gets much, much shorter, right?

[2084] And you can literally just, you know, the anger that would have kept you angry for hours or days lasts, you know, four seconds because you just, the moment it arises, you recognize it and you can get off that.

[2085] You can decide, at minimum, you can decide whether it's useful to stay angry at that moment.

[2086] And, you know, obviously it usually isn't.

[2087] And the illusion of free will is one of those thoughts.

[2088] Yeah.

[2089] It's all just happening, right?

[2090] Like, even the mindful and meditative response to this is just happening.

[2091] It's like even the moments where you recognize it or don't recognize it are just happening.

[2092] It's not that this does open up a degree of freedom for a person, but it's not a freedom that gives any motivation to the notion of free will.

[2093] It's just a new way of being in the world.

[2094] Is there a difference between intellectually knowing free will as an illusion and really experiencing it?

[2095] What's the longest you've been able to experience, the escape the illusion of free will?

[2096] Well, it's always obvious to me when I pay attention.

[2097] I mean, whenever I'm mindful, the term of jargon, you know, in the Buddhist and increasingly, you know, outside the Buddhist context is mindfulness, right?

[2098] But there are sort of different levels of mindfulness and there's different degrees of insight into this.

[2099] But yes, what I'm calling evidence of the lack of free will and the lack of the self, I mean, they're two sides of the same coin.

[2100] There's a sense of being a subject.

[2101] in the middle of experience, to whom all experience refers, the sense of I, the sense of me. And that's almost everybody's starting point when they start to meditate, and that's almost always the place people live most of their lives from.

[2102] I do think that gets interrupted in ways that go unrecognized.

[2103] I think people are constantly losing the sense of I.

[2104] They're losing the sense of subject, object, distance, but they're not recognizing it.

[2105] And meditation is the...

[2106] mode in which you can recognize, you can both consciously precipitate it, you can look for the self and fail to find it, and then recognize its absence.

[2107] And that's just the flip side of the coin of free will.

[2108] I mean, the feeling of having free will is what it feels like to feel like a self who's thinking his thoughts and doing his actions and intending his intentions.

[2109] and the man in the middle of the boat who's rowing, that's the false starting point.

[2110] When you find that there's no one in the middle of the boat, right?

[2111] Or in fact, there's no boat.

[2112] There's just the river.

[2113] There's just the flow of experience.

[2114] And there's no center to it.

[2115] And there's no place from which you would control it.

[2116] Again, even when you're doing things, this does not negate the difference between voluntary and involuntary behavior.

[2117] It's like I can voluntarily reach for this.

[2118] But when I'm paying attention, I'm aware that everything is just happening. Like, the intention to move is just arising, right? And I don't know why it didn't arise a moment before or a moment later, or, you know, 50% stronger or weaker, so as to be ineffective or to be doubly effective, where I lurch for it versus move slowly. I can never run the counterfactuals.

[2119] All of this opens the door to an even more disconcerting picture along the same lines, which subsumes this conversation about free will, and it's the question of whether anything is ever possible.

[2120] Like, what if, this is a question, I haven't thought a lot about it, but it's been a few years.

[2121] I've been kicking this question around.

[2122] So, I mean, what if only the actual is possible?

[2123] See, we live with this feeling of possibility.

[2124] We live with a sense that, let me take, so, you know, I have two daughters.

[2125] I could have had a third child, right?

[2126] So what does it mean to say that I could have had a third child?

[2127] You don't have kids, I don't think.

[2128] So.

[2129] Not that I know of.

[2130] Yes.

[2131] So the possibility might be there.

[2132] So what do we mean when we say you could have had a child or you might have a child in the future?

[2133] Like what is the space in reality?

[2134] What's the relationship between possibility and actuality and reality?

[2135] Is there a reality in which non -actual things are nonetheless real?

[2136] And so we have other categories of like non -concrete things.

[2137] We have things that don't have spatial, temporal dimension, but they nonetheless exist.

[2138] So like, you know, the integers, right?

[2139] So numbers.

[2140] There's a reality, there's an abstract reality to numbers.

[2141] And this is philosophically interesting to think about these things.

[2142] So they're not like, in some sense, they're real, and they're not merely invented by us, they're discovered because they have structure that we can't impose upon them, right?

[2143] They're not fictional characters like, you know, Hamlet and Superman, which also exist in some sense, but at the level of our own fiction and abstraction. There are true and false statements you can make about Hamlet, and true and false statements you can make about Superman, because the fictional worlds we've created have a certain kind of structure.

[2144] But again, this is all abstract.

[2145] It's all abstractable from any of its concrete instantiation.

[2146] It's not just in the comic books and just in the movies.

[2147] It's in our, you know, ongoing ideas about these characters.

[2148] But natural numbers or the integers don't function quite that way.

[2149] I mean, they're similar, but they also have a structure that's purely a matter of discovery.

[2150] You can't just make up whether numbers are prime. If you give me two integers of a certain size, you mention two enormous integers, and I say, okay, well, between those two integers there are exactly 11 prime numbers, right?

[2151] That's a very specific claim about which I can be right or wrong, and whether or not anyone knows I'm right or wrong.
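A claim like this is mechanically checkable, which is the point being made: it is determinately true or false whether or not anyone has ever verified it. A toy Python sketch (not from the conversation; the function names and the 100–200 example are invented for illustration):

```python
def is_prime(n: int) -> bool:
    """Trial-division primality test; fine for modestly sized n."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    f = 3
    while f * f <= n:
        if n % f == 0:
            return False
        f += 2
    return True

def primes_between(a: int, b: int) -> int:
    """Count primes strictly between a and b."""
    return sum(1 for n in range(a + 1, b) if is_prime(n))

# "Between those two integers there are exactly K primes" is a
# definite fact, whether or not anyone has ever computed it:
print(primes_between(100, 200))  # 21 primes between 100 and 200
```

The count is fixed by the structure of the integers themselves, not by anyone's knowledge of it.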

[2152] It's like, there's just a domain of facts there, but it's an abstract reality that relates in some way that's philosophically interesting, you know, metaphysically interesting, to what we call real reality, the spatio-temporal order, the physics of things.

[2153] But possibility, at least in my view, occupies a different space.

[2154] And this is something, again, my thoughts on this are pretty inchoate.

[2155] And I think I need to talk to a philosopher of physics and/or a physicist about how this may interact with things like the many worlds interpretation of quantum mechanics.

[2156] Yeah, that's an interesting, right, exactly.

[2157] So I wonder if discoveries in physics, like further proof or more concrete proof that the many worlds interpretation of quantum mechanics has some validity, would start to change things completely.

[2158] But even if that's just more actuality.

[2159] So if I took that seriously.

[2160] Ah, sure.

[2161] That's a case of it, and in truth that happens even if the many worlds interpretation isn't true; just imagine we have a physically infinite universe.

[2162] The implication of infinity is such that things will begin to repeat themselves, you know, the farther you go in space, right?

[2163] So, you know, if you just head out in one direction, eventually you're going to meet two people just like us having a conversation just like this, and you're going to meet them an infinite number of times in every, you know, infinite variety of permutation slightly different from this conversation, right?

[2164] So, I mean, infinity is just so big that our intuitions of probability completely break down.

[2165] But what I'm suggesting is maybe probability isn't a thing, right?

[2166] Maybe there's only actuality.

[2167] Maybe there's only what happens.

[2168] And at every point along the way, our notion of what could have happened or what might have happened is just that.

[2169] It's just a thought about what could have happened or might have happened.

[2170] So it's a fundamentally different thing.

[2171] If you can imagine a thing that doesn't make it real.

[2172] Because that's where that possibility exists, in your imagination, right?

[2173] Yeah, and possibility itself is a kind of spooky idea because it too has a sort of structure, right?

[2174] So like if I'm gonna say, you know, you could have had a daughter, right, last year.

[2175] So we're saying that's possible, but not actual, right?

[2176] That is a claim.

[2177] There are things that are true and not true about that daughter, right?

[2178] Like, it has a kind of structure.

[2179] It's like...

[2180] I feel like there's a lot of fog around the possibility.

[2181] It feels like almost like a useful narrative.

[2182] But what does it mean?

[2183] So, like, what does it mean...

[2184] If we say, you know, I just did that, but it's conceivable that I wouldn't have done that, right?

[2185] Like, it's possible that I just threw this cap, but...

[2186] Right.

[2187] I might not have done that.

[2188] So you're taking it very temporally close to the original, like what would appear as a decision?

[2189] Whenever we're saying something's possible, but not actual, right?

[2190] Like this thing just happened, but it's possible that it wouldn't have happened or that it would have happened differently.

[2191] In what does that possibility consist?

[2192] Like, where is that?

[2193] For that to be real, for the possibility to be real, what claim are we making about the universe?

[2194] Well, isn't that an extension of the idea that free will is an illusion, that all we have is actuality, that the possibility is an illusion?

[2195] Yeah, I'm just extending it beyond human action.

[2196] This goes to the physics of things.

[2197] This is just everything.

[2198] We're always telling ourselves a story that includes possibility.

[2199] Possibility is really compelling for some reason.

[2200] Well, yeah, I mean, this could sound just academic, but every backward-looking regret or disappointment and every forward-looking worry is completely dependent on this notion of possibility.

[2201] Like every regret is based on the sense that something else, I could have done something else, something else could have happened.

[2202] And every disposition to worry about the future is based on the feeling that there's this range of possibilities.

[2203] It could go either way.

[2204] And, you know, whether or not there's such a thing as possibility, I'm convinced that worry is almost never psychologically appropriate, because the reality is, in any given moment, either you can do something to solve the problem you're worried about or not.

[2205] So if you can do something, just do it.

[2206] you know, and if you can't, your worrying is just causing you to suffer twice over, right?

[2207] You're going to, you know, you're going to get the medical procedure next week anyway.

[2208] How much time between now and next week do you want to spend worrying about it, right?

[2209] The worry doesn't accomplish anything.

[2210] How much do physicists think about possibility?

[2211] Well, I think about it in terms of probability more often, but probability just describes, and again, this is a place where I might be out of my depth and need to talk to somebody to debunk this, but... Do therapy with a physicist.

[2212] Yeah, but probability, it seems, just describes a pattern of actuality that we've observed, right?

[2213] I mean, we have, there are certain things we observe, and those are the actual things that have happened, and we have this additional story about probability.

[2214] I mean, we have the frequency with which things have happened in the past.

[2215] You know, I can flip a fair coin, and I know in the abstract, I have a belief, that in the limit those tosses should converge on 50% heads and 50% tails. I have a story as to why it's not going to be exactly 50% within any arbitrary time frame. But in reality, all we ever have are the observed tosses, right? And then we have an additional story that, oh, it came up heads, but it could have come up tails.
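The frequency convergence being described here is easy to see in simulation (a toy sketch, not anything from the conversation; the function name and seed are invented for illustration):

```python
import random

def heads_frequency(tosses: int, seed: int = 0) -> float:
    """Observed fraction of heads over a run of fair-coin tosses."""
    rng = random.Random(seed)  # seeded so the run is reproducible
    heads = sum(rng.random() < 0.5 for _ in range(tosses))
    return heads / tosses

# The observed frequency drifts toward 0.5 as the run grows, but
# any finite run is just the sequence of tosses that actually happened.
for n in (10, 1_000, 100_000):
    print(n, heads_frequency(n))
```

Note that the 50% figure never appears in any individual run; it is the extra story we tell about the pattern of actual outcomes.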

[2216] Why do we think that about that last toss?

[2217] And what are we claiming is true about the physics of things if we say it could have been otherwise?

[2218] I think we're claiming that probability is true.

[2219] It just allows us to have a nice model about the world.

[2220] gives us hope about the world.

[2221] It seems that possibility has to be somewhere to be effective.

[2222] It's a little bit like what's happening with the laws of nature. There's something metaphysically interesting about the laws of nature, too, because the laws of nature do their work on the world, right?

[2223] We see their evidence.

[2224] But they're not reducible to any specific set of instances, right?

[2225] So there's some structure there, but the structure isn't just a matter of the actual things. We have the actual billiard balls that are banging into each other, and all of that actuality can be explained by what actual things are actually doing. But then we have this notion that, in addition to that, we have the laws of nature that are explaining this. But how are the laws of nature an additional thing, in addition to just the actual things that are causally effective? And if they are an additional thing, how are they effective if they're not among the actual things that are just actually banging around? Yeah. And so to some degree, possibility has to be hiding somewhere for the laws of nature to be possible. Like, for anything to be possible, it has to have a closet somewhere where all the possibility is. It has to be attached to something. So, I mean, you don't think many worlds is that? Because in many worlds it still exists. Well, because we're in this strand of that multiverse.

[2226] Yeah.

[2227] So it's still, still you have just a local instance of what is actual.

[2228] Yeah.

[2229] And then if it proliferates elsewhere where you can't be affected by it.

[2230] Many worlds says you can't really connect with the other.

[2231] Yeah.

[2232] And so many worlds is just a statement that basically everything that can happen happens somewhere.

[2233] Yeah.

[2234] You know, and that's, I mean, maybe that's not an entirely kosher formulation

[2235] of it, but it seems pretty close.

[2236] So there's whatever happens, right?

[2237] In fact, relativistically, you know, Einstein's original notion of a block universe seems to suggest this.

[2238] And it's been a while since I've been in a conversation with a physicist where I've gotten a chance to ask about the standing of this concept in physics currently.

[2239] I don't hear it discussed much, but the idea of a block universe is that, you know, space-time exists as a totality.

[2240] and our sense that we are traveling through space-time, where there's a real difference between the past and the future, that that's an illusion of just, you know, the weird slice we're taking of this larger object.

[2241] But on some level, it's like, you know, you're reading a novel, the last page of the novel exists just as much as the first page when you're in the middle of it.

[2242] And, you know, if we're living in anything like that, then there's no such thing as possibility.

[2243] There would seem to be just what is actual.

[2244] So as a matter of our experience, moment to moment, I think it's totally compatible with that being true, that there is only what is actual.

[2245] And that sounds, to the naive ear, like it would be depressing and disempowering and confining, but it's anything but. It's actually a circumstance of pure discovery.

[2246] Like, you have no idea what's going to happen next, right?

[2247] You don't know who you're going to be tomorrow.

[2248] You're only by tendency seeming to resemble yourself from yesterday.

[2249] I mean, there's way more freedom in all of that than it seems true to many people.

[2250] And yet, the basic insight is that the real freedom is the recognition that you're not in control of anything.

[2251] Everything is just happening, including your thoughts and intentions and moves.

[2252] So life is a process of continuous discovery.

[2253] You're part of the universe.

[2254] You are just this.

[2255] I mean, it's the miracle that the universe is illuminated to itself as itself where you sit.

[2256] And you're continually discovering what your life is.

[2257] And then you have this layer at which you're telling yourself a story that you already know what your life is, and you know exactly who you should be and what's about to happen, or you're struggling to form a confident opinion about all of that. And yet there is this fundamental mystery to everything, even the most familiar experience. We're all NPCs in a most marvelous video game. Maybe, although my sense of gaming does not run as deep as to know what I'm committing to there. A non-player character. Yeah, non-player. Oh, wow. Yes. You're more of a Mario Kart guy. Yeah, yeah. I was an original video gamer, but it's been a long time. I mean, I was there for Pong. I remember when I saw the first Pong in a restaurant, I think it was like Benihana's or something. They had a Pong in a table, and that was... Isn't it amazing that...

[2258] It was an amazing moment when you...

[2259] You, Sam Harris, might live from Pong to the invention and deployment of a super -intelligent system.

[2260] Yeah, well, that will have happened fast, if it happens anytime in my lifetime.

[2261] From Pong to AGI.

[2262] Yeah.

[2263] What kind of things do you do purely for fun that others might consider a waste of time?

[2264] Purely for fun.

[2265] Because meditation doesn't count because most people would say that's not a waste of time.

[2266] Is there something like Pong?

[2267] That's a deeply embarrassing thing you would never admit? I don't think... Well, I mean, once or twice a year I will play a round of golf, which many people would find embarrassing. They might even find my play embarrassing. But it's fun. Do you find it embarrassing? No, I mean, I love golf. It just takes way too much time, so I can only squander a certain amount of time on it. I do love it. It's a lot of fun. But you have no control over your actual performance. You're ever discovering. I do have control over my mediocre performance, but I don't have enough control as to make it really good.

[2268] But happily, I don't, I'm in the perfect spot because I don't invest enough time in it to care how I play.

[2269] So I just have fun when I play.

[2270] Well, I hope there'll be a day where you play a round of golf with the former president, Donald Trump.

[2271] And I would love to be there.

[2272] I would bet on him if we played golf.

[2273] I'm sure he's a better golfer.

[2274] Amidst the chaos of human civilization in modern times, as we've talked about...

[2275] What gives you hope about this world?

[2276] in the coming year, in the coming decade, in the coming 100 years, maybe a thousand years?

[2277] What's the source of hope for you?

[2278] Well, it comes back to a few of the things we've talked about.

[2279] I mean, I think I'm hopeful.

[2280] I know that most people are good and are mostly converging on the same core values, right?

[2281] It's like we're not surrounded by psychopaths.

[2282] And the thing that finally convinced me to get off Twitter was how different life was seeming through the lens of Twitter.

[2283] It's like I just got the sense that there's way more psychopaths or effective psychopaths than I realized.

[2284] And then I thought, okay, this isn't real.

[2285] This is either a strange context in which actually decent people are behaving like psychopaths, or it's, you know, a bot army or something that I don't have to take seriously.

[2286] So, yeah, I just think most people, if we can get the incentives right, I think there's no reason why we can't really thrive collectively.

[2287] So there's enough wealth to go around; there's no effective limit, I mean, again, within the limits of what's physically possible, but we're nowhere near the limit on abundance.

[2288] You know, forget about going to Mars, on this one rock, right?

[2289] It's like we could make this place incredibly beautiful and stable if we just did enough work to solve some, you know, rather longstanding political problems.

[2290] The problem of incentives.

[2291] So to you, the basic characteristics of human nature are such that we'll be okay if the incentives are okay.

[2292] We'll do pretty good.

[2293] I'm worried about the asymmetry, which is that it's easier to break things than to fix them.

[2294] It's easier to light a fire than to put it out.

[2295] And I do worry that as technology gets more and more powerful, it becomes easier for the minority who want to screw things

[2296] up to effectively screw things up for everybody, right?

[2297] So it's easier.

[2298] It's like a thousand years ago, it was simply impossible for one person to derange the lives of millions, much less billions.

[2299] Now that's getting to be possible.

[2300] So on the assumption that we're always going to have a sufficient number of crazy individuals or malevolent individuals, we have to figure out that asymmetry somehow.

[2301] And so there's some cautious exploration of emergent technology that we need to get our head screwed on straight about.

[2302] So, gain-of-function research.

[2303] Like just how much do we want to democratize, you know, all the relevant technologies there?

[2304] You know, do you really want to give everyone the ability to order nucleotides in the mail and give them the blueprints for viruses online.

[2305] because, you know, you're a free speech absolutist and you think all PDFs need to be exportable everywhere?

[2306] So this is where, yeah, there are limits. Many people are confused about my take on free speech because I've come down on the unpopular side of some of these questions, but my overriding concern is that in many cases I'm worried about the free speech of individual businesses or individual platforms or individual media people to decide that they don't want to be associated with certain things, right?

[2307] So like if you own Twitter, I think you should be able to kick off the Nazi you don't want to be associated with because it's your platform, you own it, right?

[2308] That's your free speech, right?

[2309] That's the side of my free speech concern for Twitter, right?

[2310] It's not that every Nazi has a right to algorithmic speech on Twitter.

[2311] I think if you own Twitter, whether it's just Elon or, you know, in the world where it wasn't Elon, just the people who own Twitter and the board and the shareholders and the employees, these people should be free to decide what they want to promote or not.

[2312] I view them as publishers more, you know, more than as platforms in the end.

[2313] And, um, that has other implications.

[2314] But I do worry about this problem of misinformation and, you know, algorithmically and otherwise, you know, supercharged misinformation.

[2315] And I do think we're at a bottleneck now.

[2316] I mean, I guess it could be the hubris of every present generation to think that their moment is especially important.

[2317] But I do think, with the emergence of these technologies, we're at some kind of bottleneck where we really have to figure out how to get this right.

[2318] And if we do get this right, if we figure out how to not drive ourselves crazy by giving people access to all possible information and misinformation at all times, then there's no limit to how happily we could collaborate with billions of creative, fulfilled people.

[2319] You know, it's just.

[2320] And trillions of robots.

[2321] Some of them sex robots, but that's another topic.

[2322] Robots that have, are running the right algorithm, whatever that algorithm is.

[2323] Whatever you need in your life to make you happy.

[2324] Sam, the first time we talked is one of the huge honors of my life.

[2325] I've been a fan of yours for a long time.

[2326] The few times you were respectful but critical of me meant the world.

[2327] And thank you so much for helping me and caring enough and caring enough about the world and for everything you do.

[2328] But I should say that the few of us that try to put love into the world on Twitter miss you on Twitter. But enjoy yourselves. Don't break anything. Have a good party without me. Very happy to do this. Thanks for the invitation. Thank you. Great to see you again. Thanks for listening to this conversation with Sam Harris. To support this podcast, please check out our sponsors in the description. And now, let me leave you with some words from Martin Luther King Jr.: "Love is the only force capable of transforming an enemy into a friend." Thank you for listening.

[2329] I hope to see you next time.