#264 – Tim Urban: Elon Musk, Neuralink, AI, Aliens, and the Future of Humanity

Lex Fridman Podcast #264


Full Transcription:

[0] The following is a conversation with Tim Urban, author and illustrator of the amazing blog called Wait, But Why.

[1] And now, a quick few second mention of each sponsor.

[2] Check them out in the description.

[3] It's the best way to support this podcast.

[4] First is Audible, an audiobook service I use and love.

[5] Second is Paperspace, a platform I use to train and deploy machine learning models.

[6] Third is Coinbase, a platform I use to buy cryptocurrency.

[7] Fourth is InsideTracker, a service I use to track my biological data, and fifth is NetSuite, business software for managing HR, financials, and other details.

[8] So the choice is knowledge, computation, health, or finances.

[9] Choose wisely, my friends.

[10] And now, onto the full ad reads.

[11] As always, no ads in the middle.

[12] I try to make these interesting, but if you skip them, please still check out the sponsors.

[13] I enjoy their stuff.

[14] Maybe you will too.

[15] This episode is brought to you by Audible, an audiobook service that has given me hundreds, if not thousands of hours of education, wisdom, knowledge, joy.

[16] What else can I say?

[17] All of that, because I get to listen to audiobooks that I get from Audible.

[18] Many of the books I've mentioned on this podcast were ones I listened to with Audible.

[19] Examples include The Ascent of Money by Niall Ferguson, Your Inner Fish by Neil Shubin, about evolution, The New Tsar by Stephen Lee Myers, which I think is the best, or let's say the most objective, work on Vladimir Putin that I've read to date, at least in English.

[20] And of course, the book that I've mentioned way too many times, The Rise and Fall of the Third Reich by William Shirer.

[21] I think it's over 50 hours long and one hell of a crazy ride through the darkest moments of human history.

[22] To get a discount, visit audible.com slash Lex or text LEX to 500-500.

[23] They have thousands of titles to choose from, so visit audible.com slash Lex.

[24] Now, that's audible.com slash Lex.

[25] This show is also brought to you by Paperspace Gradient, which is a platform that lets you build, train, and deploy machine learning models of any size and complexity.

[26] I love how powerful and intuitive

[27] it is.

[28] I'm likely going to use Paperspace for a couple of machine learning experiments I'm doing as part of an upcoming video.

[29] Fast.ai.

[30] Of course, I highly recommend using it.

[31] It's run by Jeremy Howard.

[32] He's a brilliant guy, brilliant educator, brilliant researcher.

[33] I highly recommend him.

[34] The machine learning world is full of people who kind of lean on hype.

[35] Jeremy Howard is somebody that is legit.

[36] You want to follow people who are legit.

[37] You can host notebooks on there.

[38] You can swap out the compute instances at any time: start on a small-scale GPU instance, or even a CPU instance, and then swap out once your compute needs increase.

[39] I'm really excited about what they're calling Workflows, which provides a way to automate machine learning pipelines on top of Gradient compute infrastructure.

[40] It makes it really easy to build a production app, because all the orchestration is reduced to a simple configuration file, a YAML file.

[41] To give Gradient a try, visit gradient.run slash Lex

[42] and use the sign-up link there.

[43] You get 15 bucks in free credits, which you can use to power your next machine learning application.

[44] That's gradient.run slash Lex.

[46] This show is also brought to you by Coinbase, which is a trusted and easy-to-use platform to buy, sell, and spend cryptocurrency.

[47] I use it, and I love it.

[48] You can buy Bitcoin, Ethereum, Cardano, and Dogecoin, and all the most popular digital currencies.

[49] I believe all of those cryptocurrencies are associated with folks that have been on this podcast. It's a great way to try crypto.

[50] It's a great way to diversify your portfolio.

[51] I know it might be a silly thing, but the actual interface of the website is probably my favorite part about Coinbase.

[52] It's just so intuitive and easy to use and it looks clean, crisp.

[53] It's surprising to me how many financial related websites just have terrible UI.

[54] I don't get it, but that's why you have to give Coinbase props for doing it well.

[55] Anyway, go to coinbase.com slash Lex.

[56] For a limited time, new users can get $5 in free Bitcoin when you sign up today at coinbase.com slash Lex.

[57] This show is also brought to you by Inside Tracker, a service I use to track biological data.

[58] They have a bunch of plans, most of which include blood tests.

[59] They give you a lot of information that you

[60] can then make decisions based on. They have algorithms that analyze your blood data, DNA data, and fitness tracker data to provide you with a clear picture of what's going on inside you and to offer you science-backed recommendations for positive diet and lifestyle changes.

[61] I love this idea.

[62] It feels like the future.

[63] It's obvious to me that decisions for your health, for your lifestyle should be made based on data that comes from your body.

[64] As opposed to data, you know, that comes from really good research studies; that's ultimately population data.

[65] They're supposed to be representative of the general population.

[66] You're not the general population.

[67] You're unique.

[68] Each one of us is unique.

[69] So our health and lifestyle decisions should be data driven based on our own body.

[70] For a limited time, you get 25% off

[71] the entire InsideTracker store, if you go to insidetracker.com slash Lex.

[72] That's insidetracker.com slash Lex.

[73] This show is also brought to you by NetSuite.

[74] NetSuite allows you to manage financials, HR, inventory, e-commerce, and many more business-related details, all in one place.

[75] Running a company is hard.

[76] This is something I often think about.

[77] Do I really want to take on this giant, beautiful mess?

[78] It's not just about the ideas.

[79] It's not just about the research

[80] or the engineering at the core.

[81] It's all the other pieces.

[82] So you should definitely use the best tools for the job for those messy pieces, especially the ones that include humans.

[83] Anyway, right now, special financing is back.

[84] Head to netsuite.com slash Lex to get the one-of-a-kind financing program.

[85] That's netsuite.com slash Lex.

[86] netsuite.com slash Lex.

[87] This is the Lex Fridman Podcast, and here is my conversation

[88] with Tim Urban.

[89] You wrote a Wait But Why blog post about the big and the small, from the observable universe to the atom.

[90] What world do you find most mysterious or beautiful?

[91] The very big or the very small?

[92] The very small seems a lot more mysterious.

[93] And I mean, the very big, I feel like we kind of understand.

[94] I mean, not the very, very big, not the multiverse, if there is a multiverse, not anything outside of the observable universe.

[95] But the very small, I think we really have no idea what's going on, or very, you know, much less idea.

[96] But I find that, so I think the small is more mysterious, but I think the big is sexier.

[97] I just cannot get enough of the bigness of space and the farness of stars.

[98] And it just continually blows my mind.

[99] I mean, still, the vastness of the observable universe has

[100] the mystery that we don't know what's out there.

[101] We know how it works, perhaps.

[102] Like, general relativity can tell us how the movement of bodies works, how they're born, all that kind of things.

[103] But, like, how many civilizations are out there?

[104] How many, like, what are the weird things that are out there?

[105] Oh, yeah, life, well, extraterrestrial life is a true mystery, the most tantalizing mystery of all.

[106] But that's, like, our size.

[107] So maybe it's that the actual, the big and the small, are really

[108] cool, but it's actually the things that are potentially our size that are the most tantalizing.

[109] Potentially our size is probably the key word.

[110] Yeah, I mean, I wonder how small intelligent life could get.

[111] Probably not that small.

[112] And I assume that there's a limit that you're not going to, I mean, you might have like a whale, blue whale size intelligent being.

[113] That would be kind of cool.

[114] But I feel like we're in the range of order of magnitude smaller and bigger than us for life.

[115] But maybe not.

[116] Maybe you could have some giant life form.

[117] It just seems like, I don't know.

[118] There's got to be some reason that anything intelligent is between, kind of, like, a little tiny rodent, a finger monkey, up to a blue whale on this planet.

[119] I don't know.

[120] Maybe when you change the gravity and other things.

[121] Well, you could think of life as a thing of self -assembling organisms and they just get bigger and bigger and bigger.

[122] Like there's no such thing as a human being.

[123] A human being is made up of a bunch of tiny organisms working together.

[124] And we somehow envision that as one entity because it has constant

[125] consciousness.

[126] But maybe it's just organisms on top of organisms.

[127] Organisms all the way down, turtles all the way down.

[128] So like Earth can be seen as an organism for people, for alien species that's very different.

[129] Like, why is the human the fundamental entity that is living?

[130] And then everything else is just either a collection of humans or components of humans.

[131] I think of it kind of is if you think about, I think of like an emergence elevator.

[132] And so you've got, an ant is on one floor, and then the ant colony is, you know, a floor above.

[133] Or maybe there's even units within the colony that's one floor above and the full colony is two floors above.

[134] And to me, I think that it's the colony that is the closest to being the animal.

[135] It's like the individual thing that competes with others while the individual ants are like cells in the animal's body.

[136] We are more like a colony in that regard.

[137] But the humans are weird because we kind of, I think of if emergence happens in an emergence tower, where you've got kind of, you know, as I said, cells and then humans and communities and societies, ants are very specific.

[138] You know, the individual ants are always cooperating with each other for the sake of the colony.

[139] So the colony is this unit that is, that is the competitive unit.

[140] Humans can kind of go, we take the elevator up and down an emergence tower psychologically.

[141] Sometimes we are individuals that are competing with other individuals, and that's where our mindset is.

[142] And then other times, we get in this crazy zone, you know, a protest or a sporting event.

[143] And you're just, you know, you're just chanting and screaming and doing the same hand motions with all these other people.

[144] And you feel like one.

[145] You feel like one, you know, and you'd sacrifice yourself.

[146] And now that's what, you know, soldiers.

[147] And so our brains can kind of psychologically go up and down this elevator in an interesting way.

[148] Yeah.

[149] I wonder how much of that is just the

[150] narrative we tell ourselves.

[151] Maybe we are just like an ant colony.

[152] We're just collaborating always, even in our stories of individualism, of, like, the freedom of the individual, this kind of isolation, lone-man-on-an-island kind of thing. We're actually all part of this giant network. Maybe one of the things that makes humans who we are is that we're probably deeply social: the ability to maintain not just the single human intelligence, but, like, a collective intelligence.

[153] And so this feeling of being an individual is just because we woke up at this level of the hierarchy.

[154] So we make it special, but we very well could be just part of the ant colony.

[155] This whole conversation, I'm either going to be doing a Shakespearean analysis of your Twitter, your writing, or very specific statements that you've made.

[156] So you've written answers to a mailbag of questions.

[157] The questions are amazing, the ones you've chosen, and your answers are amazing.

[158] So on this topic of the big and the small, somebody asked, are we bigger than we are small or smaller than we are big?

[159] Who's asking these questions?

[160] This is really good.

[161] You have amazing fans.

[162] Okay.

[163] So where do we sit at this level of the very small to the very big?

[164] So are we bigger or are we small?

[165] Are we bigger than we are small?

[166] I think it depends on what we're asking here.

[167] If we're talking about the biggest thing that we kind of can talk about without just imagining is the observable universe, the Hubble sphere.

[168] And that's about 10 to the 26th meters in diameter.

[169] The smallest thing we talk about is a Planck length.

[170] But you could argue that that's kind of an imaginary thing.

[171] But that's 10 to the negative 35.

[172] Now, conveniently, we're about 10 to the 1.

[173] Not quite, 10 to the 0.

[174] We're about 10 to the 0 meters long.

[175] So it's easy because you can just look and say, okay, well, for example, atoms are like 10 to the negative 15th or 10 to the negative 16th meters across, right?

[176] If you go up 10 to the 15th or 10 to the 16th, right, so an atom is to us as we are to this, you get to, like, nebulas, smaller than a galaxy and bigger than the biggest star.

[177] So we're right in between nebula and an atom.

[178] Now, if you want to go down to quark level, you might be able to get up to galaxy level.

[179] When you go up to the observable universe, you're getting down on the small side to things that we, I think, are mostly theoretically imagining are there and hypothesizing are there.

[180] So I think as far as real world objects that we really know a lot about, I would say we are smaller than we are big.

[181] But if you want to go down to the Planck length, then very quickly we're bigger than we are small.

[182] If you think about strings.

[183] Yeah, string, exactly, with string theory and so on.

[184] That's interesting.

[185] But I think, like you answer, no matter what, we're kind of middle -ish.

[186] Yeah.

[187] I mean, here's something cool.

[188] If a neutrino is a human, and again, for a neutrino, size doesn't really make sense.

[189] It's not really a size.

[190] But when we talk about the size of these neutrinos, I mean, if a neutrino is a human, a proton is the sun.

[191] So that's like, I mean, a proton is real small, like, really small.

[192] And so, yeah, the small gets like crazy small very quickly.
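[Editor's note: to make the "middle-ish" claim concrete, here is a minimal back-of-the-envelope sketch in Python using the round figures quoted above, including the atom size as stated in the conversation (atoms are more commonly quoted nearer 10^-10 m):]

```python
import math

# Rough sizes in meters, as quoted in the conversation (orders of magnitude only).
OBSERVABLE_UNIVERSE = 1e26   # diameter of the Hubble sphere
HUMAN = 1e0                  # a person is roughly a meter in scale
ATOM = 1e-15                 # the figure used above; commonly quoted nearer 1e-10
PLANCK_LENGTH = 1e-35

# How many orders of magnitude (decades) separate us from each end?
decades_up = math.log10(OBSERVABLE_UNIVERSE / HUMAN)   # ~26 decades above us
decades_down = math.log10(HUMAN / PLANCK_LENGTH)       # ~35 decades below us

print(f"decades up to the observable universe: {decades_up:.0f}")
print(f"decades down to the Planck length: {decades_down:.0f}")

# Against real, well-understood objects we are roughly centered:
# ~15-16 decades up to nebulae, ~15-16 decades down to atoms (by these figures).
# Against the Planck length there are more decades below us (35) than above (26),
# which is the sense in which "we're bigger than we are small."
```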

[193] Let's talk about aliens.

[194] We already mentioned it.

[195] Let's start just with the basics.

[196] What's your intuition as of today?

[197] This is a thing that could change day by day.

[198] But how many alien civilizations are out there?

[199] Is it zero?

[200] Is it a handful?

[201] Is it almost endless?

[202] Like, the, the universe, the observable

[203] universe, or the universe, is teeming with life?

[204] If I had a gun to my head and I had to take a guess,

[205] I would say it's teeming with life.

[206] I would say there is.

[207] I think, running a Monte Carlo simulation, there's this paper by Anders Sandberg and Drexler and a few others from a couple years ago, I think you probably know about it.

[208] I think the mean, you know, running through a randomized Drake equation multiplication, you ended up with 27 million as the mean number of intelligent civilizations in the galaxy, in the Milky Way alone.

[209] And so then if you go outside the Milky Way, that would turn into trillions.

[210] That's the mean.

[211] Now, what's interesting is that there's a long tail because they believe some of these multipliers in the Drake equation.

[212] So, for example, the probability that life starts in the first place, they

[213] think that the kind of range that we use for that variable is way too small, and that's constraining our possibilities. And if you actually extend it to, you know, some crazy number of orders of magnitude, like 200, which they think that variable should span, you get this long tail where, I forget the exact number, but it's like a third or a quarter of the total outcomes have us alone. So, you know, I think a sizable percentage has us as the only intelligent life in the galaxy, but you can keep going.

[214] And I think there's, like, you know, a non-zero, like, legitimate amount of outcomes there where the only life in the observable universe at all is on Earth.
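[Editor's note: for the curious, a toy version of the randomized Drake-equation calculation described here. The priors below are illustrative stand-ins, not the actual distributions fit by Sandberg, Drexler, and Ord; the point is only to show how a huge mean can coexist with a sizable chance that we are alone.]

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000  # Monte Carlo draws

def loguniform(lo, hi, size):
    """Sample uniformly in log10 space between lo and hi."""
    return 10 ** rng.uniform(np.log10(lo), np.log10(hi), size)

# Hypothetical log-uniform priors over the Drake-equation factors.
R_star = loguniform(1, 100, N)      # star formation rate (stars/year)
f_p = loguniform(0.1, 1, N)         # fraction of stars with planets
n_e = loguniform(0.1, 1, N)         # habitable planets per such star
f_l = loguniform(1e-30, 1, N)       # probability life starts (the long-tail factor)
f_i = loguniform(1e-3, 1, N)        # life -> intelligence
f_c = loguniform(1e-2, 1, N)        # intelligence -> detectable civilization
L = loguniform(1e2, 1e8, N)         # detectable lifetime (years)

n_civs = R_star * f_p * n_e * f_l * f_i * f_c * L

print("mean civilizations per galaxy:", n_civs.mean())
print("median:", np.median(n_civs))
print("fraction of draws with < 1 civilization (alone):", (n_civs < 1).mean())
```

With a wide prior on the life-starts factor, the mean is dominated by a few optimistic draws, while a large fraction of outcomes still has fewer than one civilization per galaxy, which is the long-tail point described above.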

[215] I mean, it seems incredibly counterintuitive.

[216] It seems like, you know, when you mentioned that, people think, you're, you know, you must be an idiot because, you know, if you picked up one grain of sand on a beach and examined it and you found all these little things on it, it's like saying, well, maybe this is the only one that happened.

[217] And it's like, probably not.

[218] They're probably, most of the sand, probably, or a lot of the sand, right?

[219] So, on the other hand, we don't see anything.

[220] We don't see any evidence, you know, which, of course, people would say that the people who scoff at the concept that were potentially alone, they say, well, of course, there's lots of reasons we wouldn't have seen anything, and they can go list them.

[221] And they're very compelling.

[222] But we don't know.

[223] And the truth is, if this were a freak thing, I mean, if this were a completely freak thing that happened

[224] here, whether it's life at all or just getting to this level of intelligence, that species, whoever it was, would think there must be lots of us out there, and they'd be wrong.

[225] So just being, again, using the same intuition that most people would use, I'd say there's probably lots of other things out there.

[226] Yeah, and you wrote a great blog post about it, but to me the two interesting reasons that we haven't been in contact.

[227] I, too, have an intuition that the universe is teeming with life.

[228] So one interesting is around the great filter.

[229] So the great filter is either behind us or in front of us.

[230] So the reason that's interesting is you get to think about what kinds of things ensure the survival of an intelligent civilization, or lead to the destruction of an intelligent civilization.

[231] That's a very pragmatic, very important question to always be asking.

[232] And we'll talk about some of those.

[233] And then the other one is I'm saddened by the possibility that there could be aliens communicating with us all the time.

[234] In fact, they may have visited and we're just too dumb to hear it, to see it.

[235] Like the idea that the kind of life that can evolve is just the range of life that can evolve is so large that our narrow view of what is life and what is intelligent life is preventing us from having communication with them.

[236] But then they don't seem very smart, because if they were trying to communicate with us, surely, if they were superintelligent... I'm sure, if there's lots of life, we're not that rare; we're not some crazy, weird species that hears and has different kinds of ways of perceiving signals.

[237] So they would probably be able to, you know, if you really wanted to communicate with an earth -like species, with a human -like species, you would send out all kinds of things.

[238] You send out radio waves and you send out gravity waves and lots of things.

[239] So if they're communicating in a way, they're trying to communicate with us.

[240] And it's just we're too dumb to perceive the signals.

[241] It's like, well, they're not doing a great job of considering the primitive species we might be.

[242] So I don't know.

[243] I think if a superintelligent species wanted to get in touch with us and had the capability to,

[244] I think probably they would.

[245] Well, they may be getting in touch with us.

[246] They're just getting in touch with the thing that we humans are not understanding that they're getting in touch with us with.

[247] I guess that's what I was trying to say.

[248] There could be something about Earth that's much more special than us humans.

[249] Like the nature of the intelligence that's on Earth or the thing that's of value and that's curious and that's complicated and fascinating and beautiful might be something that's not just like tweets.

[250] Okay, like English language that's interpretable or any kind of language or any kind of signal, whether it's gravity or radio signal that humans seem to appreciate.

[251] Why not the actual, it could be the process of evolution itself.

[252] There could be something about the way that Earth is breathing, essentially, through the creation of life and this complex growth of life.

[253] There's, like, it's a whole different, way to view organisms and view life that could be getting communicated with and we humans are just a tiny fingertip on top of that intelligence and the communication is happening with the main mothership of earth versus us humans that seem to treat ourselves as super important and we're missing the big picture i mean it sounds crazy but our understanding of what is intelligent of what is life what is consciousness is very limited and it seems to be and just being very suspicious it seems to be awfully human -centric like this story it seems like the progress of science is you know constantly putting humans down on the importance like on the cosmic importance the ranking of how big we are how important we are that seems to be the more we discovered that's what's happening and I think science is very young.

[254] And so I think eventually we might figure out that there's something much, much bigger going on.

[255] The humans are just a curious little side effect of the much bigger thing.

[256] That's what, I mean, that, as I'm saying, it just sounds insane.

[257] Well, it sounds a little, like, religious.

[258] It sounds like a spiritual.

[259] It gets to that realm where there's something that more than meets the eye.

[260] Well, yeah, but not.

[261] So religious and spiritual often have this kind of woo-woo characteristic, and people write books about them and then go to wars over whatever the heck is written in those books.

[262] I mean more like it's possible that collective intelligence is more important than individual intelligence, right?

[263] It's the ant colony.

[264] What's the primal organism?

[265] Is it the ant colony or is it the ant?

[266] Yeah, I mean, humans, just like any individual ant, can't do shit, but the colony can build these incredible structures and has this intelligence.

[267] And we're exactly the same.

[268] I mean, you know, you know, the famous thing that, you know, no one, no human knows how to make a pencil.

[269] Have you heard this?

[270] No. Basically, I mean, this is great.

[271] There's not a single human out there who knows how to make a pencil; each of us alone has absolutely no idea how.

[272] So you have to think about, you have to get the wood, the paint, the different chemicals that make up the yellow paint.

[273] The eraser is a whole other thing.

[274] The metal has to be mined from somewhere and then the graphite, whatever that is.

[275] And there's not one person on earth who knows how to kind of collect all those materials and create a pencil.

[276] But together, that's child's play.

[277] It's just one of the easiest things.

[278] So, you know, the other thing I like to think about, I actually put this as a question on the blog once.

[279] There's a thought experiment, and I actually want to hear what you think.

[280] So if a witch, kind of a dickish witch comes around and she says, I'm going to cast a spell on all, all of humanity, and all material things that you've invented are going to disappear all at once.

[281] So suddenly we're all standing there naked.

[282] There's no buildings.

[283] There's no cars and boats and ships and no mines, nothing, right?

[284] It's just the Stone Age Earth and a bunch of naked humans, but we're all the same.

[285] We have the same brain.

[286] So we all know what's going on.

[287] And we all got a note from her, so we understand the deal.

[288] And she says, she communicated to every human, here's the deal.

[289] You lost all your stuff.

[290] You

[291] guys need to make one working iPhone 13.

[292] And if you make one working iPhone 13 that could pass in the Apple Store today, you know, in your previous world, for an iPhone 13, then I will restore everything.

[293] How long do you think?

[294] And so everyone knows, this is the mission.

[295] We're all aware of the mission.

[296] Everyone, all humans.

[297] How long would it take us?

[298] That's a really interesting question.

[299] So obviously, if you do a random selection of 100 or 1,000 humans within the population, I think you're screwed trying to make that iPhone.

[300] I tend to believe that there's fascinating specialization among the human civilization.

[301] Like, there's a few hackers out there that can, like, solo build an iPhone.

[302] But with what materials?

[303] So no materials whatsoever.

[304] It has to, I mean, it's virtually, I mean, okay, you have to build factories.

[305] I mean, to fabricate.

[306] Okay.

[307] And how are you going to mine them?

[308] You know, you got to mine the materials where you don't have any cranes.

[309] You don't have any, you know.

[310] Okay, you 100 % have to have the, everybody's naked.

[311] Everyone's naked and everyone's where they are.

[312] So you and I would currently be naked, sitting on the ground in what used to be Manhattan.

[313] So, no grass? No, a grassy island.

[314] Yeah.

[315] So you need a naked Elon Musk type character to then start building a company.

[316] So you have to have a large company then.

[317] Right.

[318] It doesn't even know where he, you know, where is everyone?

[319] You know, oh, shit, how am I going to find other people I need to find you?

[320] But we have all the knowledge of.

[321] Yeah, everyone has the knowledge that's in

[322] their current brains.

[323] I've met some legit engineers.

[324] Crazy polymath people.

[325] But the actual labor of it... because you said, because, like, the original Mac, like the Apple II, that can be built.

[326] But even that, you know, even that's going to be tough.

[327] Well, I think part of it is a communication problem.

[328] If you could suddenly have, you know, if everyone had a walkie-talkie and there were, you know, a couple, you know, 10 really smart people designated as leaders.

[329] They could say, okay, I want, you know, everyone who can do this to walk west, you know, until you get to this little hub and everyone else, you know, and they could actually coordinate, but we don't have that.

[330] So it's like, people just, you know... and then what I think about is, you've got some people that are trying to organize, and you'll have a little community where a couple hundred people have come together, and then a couple thousand have organized, and they designate one person, you know, as the leader, and then they have sub-leaders, and, okay, we have a start here.

[331] We have some organization.

[332] You're also going to have some people that say, good.

[333] Humans were a scourge upon the earth and this is good.

[334] And they're going to try to sabotage.

[335] They're going to try to murder the people who, you know, know what they're talking about.

[336] The elite that possessed the knowledge.

[337] Well, and so, maybe everyone's hopeful, and, you know, we're all civilized and hopeful for the first 30 days or something.

[338] And then things start to fall off.

[339] People start to lose hope.

[340] And there's new kinds of, you know, new kinds of governments popping up, you know, new kinds of societies, and they don't play nicely with the other ones.

[341] And I think very quickly, I think a lot of people would just give up and say, you know what, this is it.

[342] We're back in the Stone Age.

[343] Let's just create, you know, an agrarian society.

[344] We also don't know how to farm.

[345] No one knows how to farm.

[346] There's like, even the farmers, you know, a lot of them are relying on their machines.

[347] And so we're also going to have mass starvation.

[348] And that, you know, when you're trying to organize, a lot of people are, you know, coming in with, you know, spears they've fashioned and trying to murder everyone who has food.

[349] That's an interesting question: given today's society, how much violence would there be?

[350] We've gotten softer or less violent.

[351] And you don't have weapons.

[352] So that's something.

[353] We have really primitive weapons now.

[354] But also, we have a kind of ethics where murder is bad.

[355] Right.

[356] It used to be less, like, human life was less valued.

[357] In the past, so murder was more okay, like ethically.

[358] But in the past, they also were really good at figuring out how to have sustenance.

[359] They knew how to get food and water because they, they, so we have no idea.

[360] Like the ancient hunter -gatherer societies would laugh at what's going on here.

[361] They'd say, you don't know what you're, none of you know what you're doing.

[362] Yeah.

[363] And also, the amount of people, feeding this amount of people in a very, in a Stone Age, you know, civilization, that's not going to happen. So New York and San Francisco are screwed. Well, whoever's not near water is really screwed. So, you want to be near a river, a fresh-water river, and, you know... anyway, it's a very interesting question. And what it does, this and the pencil, it makes me, um, feel so grateful and, like, excited, like, man, our civilization is so cool. And this is, talk about collective intelligence, humans did not build any of this. Collective humans did. Collective humans is a superintelligent, you know, being that, especially over long periods of time, can do such magical things.

[364] And we just get to be born.

[365] When I go on, when I'm working and I'm hungry, I just go click, click, click and like a salad's coming.

[366] The salad arrives.

[367] If you think about the incredible infrastructure that's in place for that: the internet, you know; the electricity, first of all, that's just powering the things; you know, the amount of structures that had to be created for that electricity to be there.

[368] And then you've got the, of course, the internet.

[369] And then you have this system where delivery drivers, they're riding bikes that were made by someone else.

[370] And they're going to get the salad and all those ingredients came from all over the place.

[371] I mean, it's just... I like thinking about these things because, um, it makes me feel just so grateful.

[372] I'm like, man, it would be so awful if we didn't have this, and people who didn't have it would think this was such magic we live in. And we do.

[373] And like, cool, that's fun.

[374] Yeah, one of the most amazing things, when I showed up, I came here at 13 from the Soviet Union, and the supermarket, people don't really realize that, but the abundance of food, it's not even... so bananas were the thing I was obsessed with.

[375] I just ate bananas every day for many, many months, because I hadn't had bananas in Russia.

[376] And the fact that you can have as many bananas as you want, plus they were, like, somewhat inexpensive relative to the other food.

[377] The fact that you can somehow have a system that brings bananas to you without having to wait in a long line, all of those things.

[378] It's magic.

[379] I mean, also, imagine, so first of all, the ancient hunter-gatherers.

[380] You know, you picture the mother gathering and eating all this fresh food.

[381] No. So, you know what an avocado used to look like?

[382] It was a little, like, a sphere.

[383] Yeah.

[384] And the fruit of it, the actual avocado part, was like a little tiny layer around this big pit that took up almost the whole volume.

[385] We've made crazy, like, robot avocados today that have nothing to do with, like, what they were. So, same with bananas, these big, sweet, you know, and not infested with bugs... you know, they used to eat the shittiest food, and they're eating, you know, uncooked meat, or maybe they cook it, and it's gross, and things rot. So you go to the supermarket, and it's just, it's like crazy, super-engineered cartoon fruit and food. And then it's all this processed food, which, you know, we complain about in our setting: oh, you know, we process too much.

[386] That's a, this is a good problem.

[387] I mean, if you imagine what they would think, my God, a cracker, you know, how delicious a cracker would taste to them.

[388] You know, candy, you know, pasta and spaghetti, so they never had anything like this.

[389] And then you have, from all over the world, I mean, things that are grown all over the place, all here in nice little racks, organized, and on a middle-class salary you can afford anything you want.

[390] I mean, it's, again, just like incredible gratitude.

[391] Like, ah, yeah.

[392] And the question is, how resilient is this whole thing?

[393] I mean, this is another, darker version of your question: if we keep all the material possessions we have, but we start knocking out some percent of the population, how resilient is the system that we built up? Or, if we rely on other humans and the knowledge built up in the past, the distributed nature of knowledge, how much does it take, how many humans need to disappear, for us to be completely lost?

[394] Well, to go off one thing: Elon Musk says that he has this number, a million, in mind as the order of magnitude of people you need to be on Mars to truly be multi-planetary.

[395] Multi-planetary doesn't mean, you know, like when Neil Armstrong, you know, goes to the moon, they call it a great leap for mankind.

[396] Yeah.

[397] It's not a great leap for anything.

[398] It is a great achievement for mankind.

[399] And I always like to think about, if the first fish to kind of go on land just kind of went up and gave the shore a high five and went back into the water, that's not a great leap for life.

[400] That's a great achievement for that fish.

[401] And there should be a little statue of that fish and it's, you know, in the water and everyone should celebrate the fish.

[402] But it's, but when we talk about a great leap for life, it's permanent.

[403] It's something that now, from now on, this is how things are.

[404] So this is part of why I get so excited about.

[405] Mars, by the way, is because you can count on one hand

[406] the number of great leaps that we've had, you know: like, no life to life, and single cell, or simple cell, to complex cell, and single-cell organisms to animals, to, you know, multi-cell animals, and then ocean to land, and then one planet to two planets.

[407] Anyway, diversion.

[408] But the point is that we are officially, that leap for all of life, you know, has happened once the ships could stop coming from Earth, because there's some horrible, catastrophic World War III and everyone dies on Earth, and they're fine, and they can turn that certain X number of people into a seven-billion, you know, population that's thriving just like Earth's.

[409] They can build ships.

[410] They can come back and recolonize Earth because now we are officially multi -planetary where it's a self -sustaining.

[411] He says a million people is about what he thinks.

[412] Now, that might be a specialized group.

[413] That's a very specifically, you know, selected million, a very, very skilled million people, not just, maybe, the average million on Earth.

[414] But I think it depends what you're talking about. So, one million is one seven-thousandth, one eight-thousandth of the current population.

[415] I think you need a very, very, very small fraction of humans on Earth to get by.

[416] Obviously, you're not going to have the same thriving civilization if you get too small a number, but it depends who you're killing off, I guess, is part of the question.

[417] Yeah.

[418] If you killed off half of the people just randomly right now, I think we'd be fine.

[419] It would be obviously a great, awful tragedy.

[420] I think if you killed off three quarters of all people randomly, just three out of every four people drops dead.

[421] I think we'd have, obviously, the stock market would crash.

[422] We'd have a rough patch, but I can almost assure you that the species would be fine.

[423] Well, because the million number, like you said, it is specialized.

[424] So I think because you have to do this, you have to basically do the iPhone experiment.

[425] Like, literally, you have to be able to manufacture computers.

[426] Yeah, everything.

[427] If you're going to have... being self-sustaining means, you know, any major important skill, any important piece of infrastructure on Earth, can be built there, you know, just as well.

[428] It would be interesting to list out what are the important things, what are the important skills?

[429] Yeah, I mean, you have to feed everyone.

[430] So, you know, mass farming, things like that, you have to, you have mining these questions.

[431] It's like, the materials might be, I don't know, five miles, two miles underground.

[432] I don't know what that's actual.

[433] But, like, it's amazing to me just that these things got built in the first place.

[434] And, you know, no one built the mine that we're getting stuff for the iPhone from, probably, for the iPhone. You know, or in general, early mining, you know, was for... I assume that in the Industrial Revolution, when we realized, oh, fossil fuels, we want to extract this magical energy source, I assume that, like, mining took a huge leap, without knowing very much about this.

[435] I assume that, like, mining took a huge leap without knowing very much about this.

[436] I think, you know, you're going to need, you need mining.

[437] You're going to need a lot of electrical engineers.

[438] If you're going to have a civilization like ours, and of course you could have oil lanterns, we could go way back, but if you're trying to build our today thing, you're going to need energy and electricity, and mines that can bring materials, and then you're going to need a ton of plumbing and everything that entails.

[439] And like you said, food, but also the manufacturing, so, like, turning raw materials into something useful, that whole thing: factories, some supply chain, transportation. Right.

[440] You know, I mean, you think about, when we talk about like world hunger, one of the major problems is, you know, there's plenty of food.

[441] And by the time it arrives, most of it's gone bad in the truck, you know, in a kind of an impoverished place.

[442] So it's like, you know, we take, again, we take it so for granted.

[443] All the food in the supermarket is fresh.

[444] It's all there.

[445] Which always stresses me: if I were running a supermarket, I would always be so, like, miserable about things going bad on the shelves.

[446] Or if you don't have enough, that's not good.

[447] But if you have too much, it goes bad anyway.

[448] There would be entertainers, too.

[449] Like, somebody would have a YouTube channel that's running on Mars.

[450] There is something different about a civilization on Mars and Earth existing versus, like, a civilization in the United States versus Russia and China.

[451] Like, that's a different, fundamentally different distance, like, philosophically.

[452] Will it be, like, fuzzy?

[453] We know there'll be, like, a reality show on Mars that everyone on Earth is obsessed with.

[454] And, you know, I think if people are going back and forth, enough, then it becomes fuzzy, it becomes like, oh, our friends on Mars, and there's like this Mars versus Earth, you know, like, and it become like fun tribalism.

[455] I think if people rarely go back and forth, and they're really there for good, I think you get kind of, like, oh, we hate them, you know, a lot of, like, us-versus-them stuff going on.

[456] There could be also war in space for territory.

[457] As the first colonies happen, China, Russia, or whoever, the European, different European nations, Switzerland finally gets their act together and starts wars.

[458] They're supposed to stay out of it all.

[459] Yeah, there's all kinds of crazy geopolitical things that like we have not even, no one's really even thought about too much yet that like, that could get weird.

[460] Think about the 1500s, when it was suddenly, like, a race to, like, you know, colonize or capture land or discover new land that hadn't been, you know... so there were these new frontiers.

[461] There's not really, you know, the land is not, you know... the thing about Crimea was, like, this huge thing, because this tiny peninsula switched hands.

[462] That's how, like, optimized everything has become.

[463] Everything is just, like, really stuck.

[464] Mars is a whole new world of, like, you know, territory, finding, naming things.

[465] And it's a chance for new kind of governments, maybe, or maybe it's just the colonies of these governments so we don't get that opportunity.

[466] I think it would be cool if there's new country as being, you know, totally new experiments.

[467] And that's fascinating because Elon talks exactly about that.

[468] And I believe that very much.

[469] Like, that should be, like, from the start, they should determine their own sovereignty; like, they should determine their own thing. There was one modern democracy in the late 1700s: the US. I mean, it was the only modern democracy, and now, of course, there's hundreds, or dozens, many dozens. But I think part of the reason that was able to start, I mean, it's not that people didn't have the idea, people had the idea, it was that they had a clean slate, a new place, you know, and they suddenly were, you know...

[470] And it's, the, the U .S. founders actually had the opportunity, that fantasy, they were like, we can do it, let's make, okay, what's the perfect country?

[471] And they tried to make something.

[472] Sometimes progress is, it's not held up by our imagination.

[473] It's held up by just, there's no, you know, blank canvas to try something on.

[474] Yeah, it's an opportunity for a fresh start.

[475] You know, the funny thing about the conversation we're having is that it's not often had.

[476] I mean, even by Elon, he's so focused on Starship and actually putting the first human on Mars.

[477] I think thinking about this kind of stuff is inspiring.

[478] It makes his dream.

[479] It makes us hope for the future.

[480] And it makes us... somehow, thinking about a civilization on Mars is helping us think about the civilization here on Earth,

[481] Yeah.

[482] how we should run it.

[483] What do you think are, like, in our lifetime?

[484] Are we going to... I think for any effort that goes to Mars, the goal is in this decade.

[485] Do you think that's actually going to be achieved?

[486] I have a big bet, $10,000, with a friend, from when I was drunk in an argument.

[487] This is great.

[488] That the Neil Armstrong of Mars, whoever he or she may be, will set foot by the end of 2030.

[489] Now, this was probably 2018 when I had this argument.

[490] So, like, what, a human has to touch Mars by, by the end of, 20-

[491] 30.

[492] Oh, by the year 2030.

[493] Yeah, by January 1st, 2031.

[494] Yeah.

[495] So, um, did you agree on the time zone or what?

[496] No, no. Yeah.

[497] If it's coming on that exact day, that's going to be really stressful.

[498] But, um, but anyway, I, I think that there will be.

[499] That was 2018.

[500] I was more confident then.

[501] Um, I think it's going to be around this time.

[502] I mean, I still won the general bet because his point was, you are crazy.

[503] This is not going to happen in our lifetime.

[504] They're not for many, many decades.

[505] And I said, you're wrong.

[506] You don't know what's going on in SpaceX.

[507] I think if the world depended on it, SpaceX probably could... I mean, I don't know this, but I think the tech is almost there.

[508] Like, I don't think, of course, it's delayed many years by safety.

[509] So they first want to send a ship, you know, around Mars and they want to land a cargo ship on Mars.

[510] And there's the moon on the way too.

[511] Yeah, yeah.

[512] There's a lot.

[513] But I think the moon, a decade before, seemed like magical tech that humans didn't have.

[514] This is like, no, we can, we can do this. And it's totally conceivable. You've seen Starship; it is an interplanetary colonial, or interplanetary transport, like, system.

[515] That's what they used to call it.

[516] The SpaceX, the way they do it is every time they do a launch, something fails, usually, you know, when they're testing, and they learn a thousand things, the amount of data they get, and they improve.

[517] So it's like they've moved up, like, eight generations with each one.

[518] Anyway, so it's not inconceivable that pretty soon they could send a starship to Mars and land it.

[519] There's just no good reason.

[520] I don't think that they couldn't do that.

[521] And so if they could do that, they could, in theory, send a person to Mars pretty soon.

[522] Now, taking off from Mars and coming back, again, I don't think anyone wants to be on that voyage today, because, you know, it's still amateur hour here in getting that perfect.

[523] I don't think we're too far away now.

[524] The question is, so every 26 months, Earth laps Mars.

[525] It's, like, the synodic orbit, or whatever it's called, the synodic period.

[526] 26 months.

[527] So it's right now, like in the evens, like 2022 is going to have one of these, late 2024.

[528] So people could, this was the earliest estimate I heard.

[529] Elon said, maybe we can send people to Mars in 2024, you know, to land in late 2024, early 2025.

[530] That is not going to happen because that included 2022 sending a cargo ship to Mars, maybe even one in 2020.

[531] And so I think they're not quite on that schedule.

[532] But to win my bet,

[533] 2027, I have a chance, and 2029 I have another chance.
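[Editor's note: the 26-month figure is the Earth-Mars synodic period, which follows from the standard relation, shown here as a quick check with round orbital periods:]

\[
\frac{1}{T_{\text{syn}}} = \frac{1}{T_{\text{Earth}}} - \frac{1}{T_{\text{Mars}}} = \frac{1}{365.25\ \text{days}} - \frac{1}{687\ \text{days}} \;\Rightarrow\; T_{\text{syn}} \approx 780\ \text{days} \approx 25.6\ \text{months},
\]

hence launch windows roughly every other year (2022, late 2024, early 2027, 2029), matching the windows mentioned above.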

[534] Nice.

[535] We're not very good at backing up and seeing the big picture.

[536] We're very distracted by what's going on today and what we can believe because it's happening in front of our face.

[537] There's no way that humans are going to be landing on Mars and it's not going to be the only thing everyone is talking about, right?

[538] I mean, it's going to be like the moon landing, but an even bigger deal, going to another planet, right?

[539] And for it to start a colony, not just to, again, high five and come back.

[540] So this is like the 2020s, maybe the 2030s, is going to be the new 1960s.

[541] We're going to have a space decade.

[542] I'm so excited about it.

[543] And again, it's one of the great leaps for all of life happening in our lifetimes.

[544] Like, that's wild.

[545] To paint a slightly cynical possibility, which I don't see happening.

[546] But I just want to sort of put value on leadership.

[547] I think it wasn't obvious that the moon landing would be so exciting for all of human civilization.

[548] Some of that had to do with the right speeches, with the space race.

[549] Like, space, depending on how it's presented,

[550] it can be boring.

[551] I don't, I don't think it's been that so far, but actually, I agree.

[552] I think space is quite boring right now.

[553] No, no, SpaceX is super, but, like, 10 years ago, space...

[554] Yeah.

[555] Some writer, I forget who wrote, it's like the best magic trick in the show happened at the beginning, and now they're starting to do this like easy, you know, it's like, you can't go in that direction.

[556] And the line that this writer said is, like, watching astronauts go up to the space station after watching the moon landings is like watching Columbus sail to Ibiza.

[557] It's just like, you know, everything is so impractical.

[558] You're going up to the space station not to explore, but to do science experiments in microgravity.

[559] And you're sending rockets up, you know... mostly, here and there there's a probe, but mostly you're sending up satellites, you know, for DirecTV or whatever it is.

[560] It's kind of, like, lame Earth-industry, you know, usage.

[561] So I agree with you, space is boring there.

[562] The first human setting foot on Mars, that's got to be a crazy global

[563] event.

[564] I can't imagine it not being.

[565] Maybe you're right.

[566] Maybe I'm taking for granted of the speeches and the space race.

[567] I think the value of, I guess what I'm pushing is the value of people like Elon Musk and potentially other leaders that hopefully step up is extremely important here.

[568] Like, I would argue without the publicity of SpaceX, it's not just the ingenuity of SpaceX, but like what they've done publicly by having a figure that tweets and all that kind of stuff like that, that's a source of inspiration.

[569] Totally.

[570] NASA wasn't quite able to pull that off with the Shuttle.

[571] That's one of his two reasons for doing this.

[572] SpaceX exists for two reasons.

[573] One, life insurance for the species.

[574] If we're, you know... I always think about it this way: if you're an alien on some faraway planet and you're rooting against humanity, and you win the bet if humanity goes extinct.

[575] You do not like SpaceX.

[576] You do not want them to have their eggs in two baskets now.

[577] Yeah.

[578] Um, you know, yeah, sure, obviously, you know, you could have something that kills everyone on both planets, some AI war or something.

[579] But the point is, obviously it's good for our chances, our long-term chances, to have, you know, two self-sustaining civilizations going on.

[580] The second reason, which he values, I think, just as highly, is that it's the greatest adventure in history, you know, going multi-planetary.

[581] And, you know, people need some reason to wake up in the morning, and, um, it'll just be this, hopefully, great uniting event too.

[582] I mean, I'm sure, in today's nasty, awful political environment, which is like a whirlpool that sucks everything into it...

[583] I mean, you name a thing, and it's become a nasty political topic.

[584] So I hope, I hope that space can, you know, Mars can just bring everyone together.

[585] But, you know, it could become this hideous thing where it's, you know, oh, you know, billionaire, some annoying storyline gets built.

[586] So half the people think that anyone who's excited about Mars is, you know, evil or something.

[587] Yeah.

[588] Anyway, I hope it is super exciting.

[589] So far, space has been a uniting, inspiring thing.

[590] And in fact, especially during this time of a pandemic, a commercial entity putting humans into space for the first time was just one of the only big sources of hope.

[591] Totally.

[592] And awe, just like watching this huge skyscraper go up in the air, flip over, come back down and land.

[593] And, I mean, it just makes everyone just want to sit back and clap and kind of like, you know, the way I look at something like SpaceX is it makes me proud to be a human.

[594] And I think it makes a lot of people feel that way.

[595] It's like, good for our self-esteem.

[596] It's like, you know what?

[597] We're pretty, you know, we have a lot of problems, but like, we're kind of awesome.

[598] Yeah, we're awesome.

[599] And if we can put people on Mars, you know, sticking an Earth flag on Mars, like, damn, you know, we should be so proud of our, like, little family here.

[600] Like, we did something cool.

[601] And by the way, I've made it clear to SpaceX people.

[602] including Elon, many times, and it's, like, a once-a-year reminder, that if they want to make this more exciting, they send the writer to Mars, you know, I'd go on the thing.

[603] And I'll blog about it.

[604] So I'm just, you know, continuing to throw this out there.

[605] On which?

[606] I'm trying to get them to send me to Mars.

[607] I understand that.

[608] I just want to clarify on which trip does the writer want to go.

[609] I think my dream one, to be honest, would be like the, you know, like the Apollo 8, where they just looped around the moon and came back.

[610] Because landing on Mars would give you a lot of good content to write about. Great content, right. I mean, the amount of kind of high-minded, you know... and so I would go in the thing and I would blog about it, and I'd be in microgravity, so I'd be bouncing around my little space. They can just send me in a Dragon; they don't need to do a whole Starship. And I would bounce around, and I would get to... and I've always had a dream of going to, like, one of those nice jails for a year. Yes. Because I'd just have nothing to do besides, like, read books, and no responsibilities and no social plans.

[611] So this is the ultimate version of that.

[612] Anyway, it's a side topic, but I think it would be.

[613] But also, I mean, to be honest, if you land on Mars, it's epic, and then if you die there after, like, finishing your writing, it would be just that much more powerful for the impact.

[614] Yeah, but then I'm gone, and I don't even get to, like, experience the publication of it, which is the whole point.

[615] Well, some of the greatest writers in history didn't get a chance to experience the publication of their greatest works.

[616] I know.

[617] I don't really think that.

[618] I think, like, I think back to Jesus, and I'm like, oh, man, that guy really, like, crushed it, you know?

[619] But then if you think about it, it doesn't... like, you could literally die today and then become the next Jesus, like, 2,000 years from now, in this civilization where, you know, they're, like, magical in the clouds, and they're worshipping you.

[620] They're worshipping Lex, and, like, that sounds like your ego probably would be like, wow, that's pretty cool, except it's irrelevant to you, because you never even knew it happened.

[621] This feels like a Rick and Morty episode.

[622] It does.

[623] It does.

[624] Okay.

[625] You've talked to Elon quite a bit.

[626] You've written about him quite a bit.

[627] It'd be cool to hear you talk about what are your ideas of the magic sauce as you've written about with Elon.

[628] What makes him so successful, his style of thinking, his ambition, his dreams, the people he connects with, the kinds of problems he tackles, are there comments you can make about what makes him special?

[629] I think that obviously there's a lot of

[630] things that he's very good at.

[631] He's obviously super intelligent.

[632] His heart is very much in, like, I think the right place.

[633] Like, you know, I really, really believe that.

[634] Like, and I think people can sense that.

[635] You know, he just doesn't seem like a grifter of any kind.

[636] He's truly trying to do these big things for the right reasons.

[637] And he's obviously crazy, ambitious and hardworking, right?

[638] Not everyone is.

[639] Some people are as talented and have cool visions, but they just don't want to spend their life that way.

[640] But none of those alone is what makes Elon, Elon.

[641] I mean, if it were, there'd be more of him because there's a lot of people that are very smart and smart enough to accumulate a lot of money and influence and they have great ambition and they have, you know, their hearts in the right place.

[642] To me, the very unusual quality he has is that he's sane in a way that almost every human is crazy.

[643] What I mean by that is we are programmed to trust conventional wisdom over our own reasoning.

[644] For good reason, if you go back 50 ,000 years and conventional wisdom says, you know, don't eat that berry, you know, or this is the way you tie a spearhead to a spear.

[645] And you're thinking, I'm smarter than that.

[646] Like, you're not, you know; that comes from the accumulation of life experience, the accumulation of observation and experience over many generations.

[647] And that's a little mini version of the collective superintelligence.

[648] It's like, you know, the equivalent of making a pencil today: people back then, the conventional wisdom had this knowledge that no human could ever accumulate alone.

[649] So we're very wired to trust it.

[650] Plus, the secondary thing is the people who, you know, worship the mountain as their God, right?

[651] And the mountain determines their fate.

[652] That's not true, right?

[653] And the conventional wisdom's wrong there, but believing it was helpful to survival because you were part of the crowd and you stayed in the tribe.

[654] And if you started to, you know, insult the mountain god and say that's just a mountain, it's not, you know, you didn't fare very well, right?

[655] So for a lot of reasons, it was a great survival trait to just trust what other people said and believe it.

[656] And truly, obviously, you know, the more you really believed it, the better.

[657] Today, conventional wisdom sits in a rapidly changing world and a huge, giant society, and our brains are not built to understand that.

[658] They have a few settings, you know, and none of them is, you know, 300 million person society.

[659] So your brain is basically treating a lot of things like a small tribe, even though they're not.

[660] And it's treating conventional wisdom as, you know, very wise in a way that it's not.

[661] If you think about it this way: picture, like, a bucket that's not moving very much, moving like a millimeter a year.

[662] And so it has time to collect a lot of water in it.

[663] That's like conventional wisdom in the old days when very few things change.

[664] Like, your ten-times-great-grandmother probably lived a similar life to you, maybe on the same piece of land.

[665] And so old people really knew what they were talking about.

[666] Today, the bucket's moving really quickly.

[667] And so, you know, the wisdom doesn't accumulate, but we think it does because our brain settings don't have the, you know, quickly moving bucket setting.

[668] So my grandmother gives me advice all the time.

[669] And I have to decide: is this the kind of advice that still applies?

[670] So there are certain things that are not changing, like relationships and love and loyalty and things like this.

[671] Her advice on those things, I'll listen to it all day.

[672] She's one of the people who said, you've got to live near your people you love.

[673] Live near your family, right?

[674] I think that is like tremendous wisdom, right?

[675] That is wisdom, because that happens to be something that doesn't change from generation to generation.

[676] For now.

[677] Right.

[678] Right, all right, for now.

[679] Right, so I'll be the idiot, telling my grandkids to live near the people they love.

[680] Exactly.

[681] They'll actually be in the metaverse, being like, exactly.

[682] It doesn't matter.

[683] And I'm like, it's not the same when you're not in person.

[684] They're going to say it's exactly the same.

[685] And they'll also be thinking to me with their Neuralink, and I'm going to be like, slow down.

[687] I don't understand what you're saying.

[688] You just talk like a normal version.

[689] Anyway, so my grandmother then says, you know, I don't know about this writing you're doing.

[690] You should go to law school and, you know, you want to be secure.

[691] And that's not good advice for me, you know, given the world I'm in and what I like to do and what I'm good at, that's not the right advice.

[692] But the world is totally different; she's in a different world.

[693] So she became wise for a world that's no longer here, right?

[694] Now, if you think about that, when we think about conventional wisdom, it's a little like my grandmother, except it's not, maybe, you know, 60 years outdated, like her software.

[695] Conventional wisdom is maybe 10 years outdated, sometimes 20.

[696] So anyway, I think that we all continually don't have the confidence in our own reasoning when it conflicts with what everyone else thinks, with what seems right.

[697] We don't have the guts to act on that reasoning for that reason, right?

[698] You know, and so there are so many Elon examples.

[699] I mean, just from the beginning: building Zip2, which was his first company.

[700] And it was internet advertising at a time when the internet was brand new, kind of like how people think of the metaverse today.

[701] And people would be like, you know, we facilitate internet advertising.

[702] People were saying, yeah, people are going to advertise on the internet?

[703] Yeah, right.

[704] Actually, it wasn't that he was magical and saw the future; it's that he looked at the present, looked at what the internet was, and thought about, you know, the obvious advertising opportunity this was going to be.

[705] It wasn't rocket science.

[706] It wasn't genius.

[707] I don't believe.

[708] I think it was just seeing the truth.

[709] And when everyone else is laughing, saying, well, you're wrong.

[710] I mean, I did the math and here it is, right?

[711] Next company, you know, X.com, which eventually became PayPal. People said, oh, yeah, people are going to put their financial information on the internet?

[712] No way.

[713] To us, it seems so obvious.

[714] If you went back then, you would probably feel the same where you'd think that is a fake company.

[715] It's just obviously not a good idea.

[716] He looked around and said, you know, I see where this is going.

[717] And so, again, he could see where it was going because he could see what it was that day.

[718] And not what conventional wisdom said; conventional wisdom was still a bunch of years behind.

[719] SpaceX is the ultimate example.

[720] A friend of his apparently compiled a video montage of rockets blowing up to show him this is not a good idea.

[721] But even just the bigger picture: the amount of billionaires who have, like, thought, I'm going to start launching rockets, and, you know, the amount that have failed.

[722] I mean, conventional wisdom said this is a bad endeavor.

[723] He was putting all of his money into it.

[724] Landing rockets was another thing, you know, well, here's the classic kind of way we reason, which is, if this could be done, NASA would have done it a long time ago because of the money it would save.

[725] If this could be done, the Soviet Union would have done it back in the '60s.

It's obviously something that can't be done.

[727] And the math on the back of his envelope said, well, I think it can be done.

[728] And so he just did it.

[729] So in each of these cases, I think actually in some ways Elon gets too much credit; you know, people think it's his Einstein intelligence, or that he can see the future.

[730] He has incredible guts.

[731] He's so, you know, courageous.

[732] I think if you actually are looking at reality, and you're just assessing probabilities, and you're ignoring all the noise, which is so often wrong, right?

[733] And you just, then you just have to be, you know, pretty smart and, you know, pretty courageous.

[734] And you have to have this magical ability to be sane and trust your reasoning over conventional wisdom because your individual reasoning, you know, part of it is that we see that we can't build a pencil.

[735] We can't build, you know, the civilization on our own, right?

[736] So we kind of kowtow to the collective, for good reason, but this is different when it comes to what's possible. You know, the Beatles were doing their kind of Motown-y chord patterns in the early '60s, and they were doing what was normal.

[737] They were doing what clearly worked: this kind of sound is a hit.

[738] Then they started getting weird because they were so popular, they had this confidence to say, let's just, we're going to start just experimenting.

[739] And it turns out that like, if all these people are in this like one groove together doing music, and it's just like there's a lot of land over there.

[740] And it seems like, you know, I'm sure the managers and all the record execs would say, no, you have to be here.

[741] This is what sells.

And it's just not true.

[743] So I think that's what Elon is. The term for this, which Elon actually likes to use, is reasoning from first principles, the physics term.

[744] First principles are your axioms.

[745] And physicists, they don't say, well, what do people think?

[746] No, they say, what are the axioms?

[747] Those are the puzzle pieces.

[748] Let's use those to build a conclusion.

[749] That's our hypothesis.

[750] Now let's test it, right?

[751] And they come up with all kinds of new things constantly by doing that.

[752] If Einstein was assuming conventional wisdom was right, he never would have even tried to create something that really disproved Newton's laws.

[753] And the other way to reason is reasoning by analogy, which is a great shortcut.

[754] It's when we look at other people's reasoning and we kind of photocopy it into our head, we steal it.

[755] So reasoning by analogy, we do all the time.

[756] And it's usually a good thing.

[757] I mean, it takes a lot of mental energy and time to reason from first principles.

[758] It's actually, you know, you don't want to reinvent the wheel every time, right?

[759] You want to often copy other people's reasoning most of the time.

[760] And I, you know, most of us do it most of the time.

[761] And that's good.

[762] But there are certain moments when, forget for a second, like, succeeding in the world like Elon does: who are you going to marry?

[763] Where are you going to settle down?

[764] How are you going to raise your kids?

[765] How are you going to educate your kids?

[766] How should you educate yourself?

[767] What kind of career path are you going to take?

[768] These moments, these are what you look back on on your deathbed; these few choices really define your life.

[769] Those should not be reasoned by analogy.

[770] You should absolutely try to reason from first principles.

[771] And Elon, not just, by the way, in his work, but in his personal life.

[772] I mean, if you just look at the way he's on Twitter, it's not how you're supposed to be when you're a super famous, you know, industry titan.

[773] You're not supposed to just be silly on Twitter and do memes and get in little quibbles with people.

[774] He just does things his own way, regardless of what you're supposed to do, which sometimes serves him and sometimes doesn't, but I think it has taken him where it has taken him.

[775] Yeah, I mean, I probably wouldn't describe his approach to Twitter as first principles, but I guess it has the same element.

[776] I think it is.

[777] Well, first of all, I will say that with a lot of tweets, people think, oh, like, he's going to be done after that.

[778] He's fine.

[779] He's, you know, he just won Time Man of the Year.

[780] Like, whatever it is, it's not sinking him.

[781] And I think, you know, it's not that I think this is like super reasoned out.

[782] I think that, you know, Twitter is his silly side.

[783] But I think that he, with his reasoning, did not feel like there was a giant risk in just being his silly self on Twitter, when a lot of billionaires would say, well, no one else is doing that.

[784] Yes.

[785] So it must be a good reason, right?

[786] Well, I've got to say that he inspires me, that it's okay to be silly.

[787] Totally.

[788] On Twitter.

[789] But, yeah, you're right.

[790] The big inspiration is the willingness to do that when nobody else is doing it.

[791] Yeah.

[792] And I think about all the great artists, you know, all the great inventors and entrepreneurs, almost all of them, they had a moment when they trusted their reasoning.

[793] I mean, Airbnb was 0 for 60 with VCs.

[794] A lot of people would say, obviously they know something we don't, right? But they didn't. They said, I think they're all wrong. I mean, that takes some kind of different wiring in your brain. And that's both for big-picture and, like, detailed engineering problems. It's fun to talk to him. It's fun to talk to Jim Keller, who's a good example of this kind of thinking, about, like, manufacturing, how to get costs down. They always talk about SpaceX rockets this way, they talk about manufacturing this way: like, cost per pound or per ton to get to orbit, or something like that. That's the reason, we need to get the cost down. It's a very kind of raw-materials, yeah, like, just very basic way of thinking. First principles, really. Yeah, and the first principles are, like, the price of raw materials and gravity, you know, and wind. I mean, these are your first principles, and fuel. Henry Ford, you know, what made Henry Ford blow up as an entrepreneur? The assembly line, right?

[795] I mean, he thought for a second and said, this isn't how manufacturing is normally done, you know, it's normally done this way, but I think this is a different kind of product.

[796] And that's what changed it.

[797] Because, you know, and then what happens is when someone reasons from first principles, they often fail, you know, you're going out into the fog with no conventional wisdom to guide you.

[798] But when you succeed, what happens is that everyone else turns and says, wait, what are they doing?

[799] What are they doing?

[800] And they all flock over.

[801] Look at the iPhone.

[802] The iPhone, you know, Steve Jobs was famously good at reasoning from first principles, because that guy had crazy self-confidence.

[803] He just said, you know, if I think this is right, that's it. I mean, I don't know how he did that.

[804] And I don't think Apple can do that anymore.

[805] I mean, they lost that.

[806] That one brain's ability to do that made it a totally different company, even though there are tens of thousands of people there. And now, I'm giving a lot of credit to Steve Jobs, but of course it was a team at Apple. They didn't look at the flip phones and say, okay, well, let's make a keyboard that's, like, clicky and, you know, a really cool Apple-y keyboard.

[807] They said, what should a mobile device be?

[808] You know, what are the axioms?

[809] What are the axioms here?

[810] And none of them involved a keyboard necessarily.

[811] And by the time they pieced it together, there were no keyboards.

[812] It didn't make sense.

[813] Everyone suddenly is going, wait, what are they doing?

[814] What are they doing?

[815] And now every phone looks like the iPhone.

[816] I mean, that's how it goes.

[817] You tweeted, what's something you've changed your mind about?

[818] That's the question you've tweeted.

[819] Elon replied, brain transplants. Sam Harris responded, nuclear power.

[820] There's a bunch of people with cool responses there.

[821] In general, what are your thoughts about some of the responses and what have you changed your mind about, big or small, perhaps in doing the research for some of your writing?

[822] So I'm writing right now, just finishing a book on kind of why our society is such a shit place at the moment, just polarized.

[823] and, you know, we have all these gifts like we're talking about, just the supermarket.

[824] You know, we have this exploding technology.

[825] Fewer and fewer people are in poverty.

[826] You know, Louis C.K. likes to say, you know, everything's amazing and no one's happy, right?

[827] But it's really an extreme moment right now where it's like hate is on the rise, like crazy things, right?

[828] And if I could interrupt briefly, you did tweet that you just wrote the last word.

[829] I sure did.

[830] And then there's some hilarious asshole who said, now you just have to work on all the ones in the middle.

[831] Yeah, I earned that.

[832] I mean, when you, when you earn a reputation as a, as a tried and true procrastinator, you're just going to get shit forever, and that's fine.

[833] I accept my fate there.

[834] So do you mind sharing a little bit more about the details of what you're writing?

[835] So, uh, how do you approach this question about the state of society?

[836] I wanted to figure out what was going on because, um, what I noticed was a bad trend.

[837] It's not that, you know, things are bad.

[838] It's that things are getting worse in certain ways.

[839] Not in every way.

[840] And if you look at Max Roser's stuff, you know, he comes up with all these amazing graphs.

[841] This is what's weird is that things are getting better in almost every important metric you can think of, except the amount of people who hate other people in their own country and the amount of people that hate their own country, the amount of Americans that hate America is on the rise, right?

[842] The amount of Americans that hate other Americans is on the rise.

[843] The amount of Americans that hate the president is on the rise; all these things are on a very steep rise.

[844] So what the hell?

[845] What's going on?

[846] Like there's something causing that.

[847] It's not that, you know, a bunch of new people were born who were just dicks.

[848] It's that something is going on.

[849] So I think of it as a very simple, oversimplified equation: human behavior.

[850] That's the output, and I think the two inputs are human nature and environment, right?

[851] And this is basic, you know, super, super kindergarten level.

[852] like, you know, animal behavior.

[853] But I think it's worth thinking about: you've got human nature, which is not changing very much, right?

[854] And then you got, you throw that nature into a certain environment and it reacts to the environment, right?

[855] It's shaped by the environment.

[856] And then eventually what comes out is behavior, right?

[857] Human nature is not changing very much, but suddenly we're behaving differently, right?

[858] We are, again, you know, look at the polls.

[859] Like, it used to be that the president, you know, was liked by, I don't remember the exact numbers, but, you know, 80% or 70% of their own party and, you know, 50% of the other party.

[861] And now it's like 40% of their own party and 10% of the other party.

[862] And it's not that the presidents are getting worse, and maybe some people would argue that they are, but, you know, there have been a lot of idiot presidents throughout history. What's going on is that something in the environment is changing, and you're seeing that as a change in behavior.

[863] An easy example here is that, you know, by a lot of metrics, racism is becoming less and less of a problem.

[864] You know, it's hard to measure, but there's metrics like, you know, how upset would you be if your kid married someone of another race?

[865] And that number is plummeting.

[866] But racial grievance is skyrocketing, right?

[867] There's a lot of examples like this.

[868] So I wanted to look around and say, and the reason I took it on, the reason I don't think this is just an unfortunate trend, unpleasant trend, that hopefully we come out of, is that all this other stuff I like to write about, all this future stuff, right?

[869] And it's this magical stuff.

[870] I always think of this.

[871] I'm very optimistic in a lot of ways.

[872] And I think that our world would be a utopia, would seem like actual heaven.

[873] Like whatever Thomas Jefferson was picturing as heaven, other than maybe the eternal life aspect, I think that if he came to 2021 U.S., it would be better.

[874] It's cooler than heaven.

[875] We live in a place that's cooler than 1700s heaven.

[876] Again, other than the fact that we still die.

[877] Now, I think that future world actually probably would have, quote, eternal life.

[878] I don't think anyone wants eternal life, actually, even if people think they do.

[879] Eternal is a long time.

[880] But I think you'd have the choice to die when you want.

[881] Maybe we're uploaded.

[882] Maybe we can refresh our bodies.

[883] I don't know what it is.

[884] But the point is, I think about that utopia.

[885] And I do believe that, like, if we don't botch this, we'd be heading towards somewhere that would seem like heaven, maybe in our lifetimes.

[886] Of course, things could go wrong. Now think about the trends here.

[887] Just like the 20th century would seem like some magical utopia to someone from the 16th century.

[888] The bad things in the 20th century were kind of the worst things ever in terms of just absolute magnitude.

[890] You know, World War II, you know, the biggest genocides ever.

[891] You've got, you know, maybe climate change.

[892] If it is the existential threat that many people think it is, I mean, we never had an existential threat on that level before.

[893] I mean, so the good is getting better and the bad is getting worse.

[894] And so when I think about the future, I think of us as some kind of big, you know, long canoe as a species, a five-million-mile-long canoe, each of us sitting in a row, and we each have one oar, and we can paddle on the left side or the right side. And what we know is there's a fork up there somewhere, the river forks, and there's a utopia on one side and a dystopia on the other side. And I really believe that we're probably not headed for just an okay future; it's just the way tech is exploding, like, it's probably going to be really good or really bad. The question is, which side should we be rowing on? We can't see up there, right? But it really matters. So I'm writing about this future stuff, and I'm saying, none of this matters if we're squabbling our way into kind of like a civil war right now.

[895] So what's going on?

[896] So it's a really important problem to solve.

[897] What are your sources of hope in this?

[898] So like how do you steer the canoe?

[899] One of my big sources of hope, and this is my answer to what I changed my mind on, is, I think I always knew this, but it's easy to forget it.

[900] Our primitive brain does not remember this fact, which is that I don't think there are very many bad people.

[901] Now, you say bad, you know, are there selfish people? Most of us, I think... if you think of people, you know, digital language is ones and zeros, and our primitive brain very quickly can get into the land where everyone's a one or a zero. Our tribe, we're all ones, you know, we're perfect; I'm perfect, my family is. That other family, that other tribe, they're zeros, and you dehumanize them, right? These people are awful. So zero is dehumanizing; you know, zero is not a human place. And no one's a one either; if you think you are, you're dehumanizing yourself. But when we get into this land, I call it political Disney World.

[902] Because Disney movies are good guys versus bad guys.

[903] Scar is totally bad and Mufasa is totally good, right?

[904] You don't see Mufasa's character flaws.

[905] You don't see Scar's upbringing that made him like that, that humanizes him.

[906] No, lionizes him, whatever.

[907] You are...

[908] Well done.

[909] Yeah.

[910] Mufasa is a one and Scar's a zero.

[911] Very simple.

[912] So political Disney world is a place, a psychological place that all of us have been in.

[913] And it can be religious Disney World.

[914] It can be national Disney World.

[915] It can be war Disney World, whatever it is, but it's a place where we fall into this delusion that there are protagonists and antagonists, and that's it, right?

[916] That is not true.

[917] We are all 0.5s, or maybe 0.6s to 0.4s.

[918] On one hand, I don't think there are that many really great people, frankly.

[919] I think if you get into it, you know, most of us have, if you get really into our most shameful memories, the things we've done that are worst, the most shameful thoughts, the deep selfishness that some of us have in areas we wouldn't want to admit, right?

[920] Most of us have a lot of unadmirable stuff.

[921] Right. On the other hand, if you actually got into, really got into someone else's brain, and you looked at their upbringing, you looked at the trauma that they've experienced, and then you looked at the insecurities they have, and you look at all of it... If you assembled the highlight reel of your worst moments, the meanest things you've ever done, the worst days, the most selfish moments, the time, you know, you stole something, whatever, people would think, wow, Lex is an awful person. If you did a montage of your best moments, people would say, oh, he's a god, right? But of course we all have both of those. So I started to really try to remind myself that everyone's a point five, right?

[922] And point fives are all worthy of criticism and we're all worthy of compassion.

[923] And the thing that makes me hopeful is that I really think that there's a bunch of point fives and point fives are good enough that we should be able to create a good society together.

[924] There's a lot of love in every human.

[925] And I think there's more love in humans than hate.

[926] You know, I always remember this moment, and this is a weird anecdote, but I'm a Red Sox fan, Boston Red Sox baseball.

[927] And Derek Jeter is who we hate the most.

[928] He's on the Yankees.

[929] Yes.

[930] And hated, right?

[931] Jeter, right?

[932] It was his last game in Fenway.

[933] He's retiring.

[934] And he got this rousing standing ovation.

[935] And I almost cried.

[936] And it was like, what is going on?

[937] We hate this guy.

[938] But actually, there's so much love in all humans.

[939] You know, it felt so good to just give a huge cheer to this guy we hate because it's like this moment of like a little fist pound being like, of course, we all actually love each other.

[940] And I think there's so much of that.

[941] And so the thing that I think I've come around on is, I think that we are in an environment that's bringing out really bad stuff.

[942] I don't think it's the people; if I thought it was the people, I would be less hopeful.

[943] Like, if I thought it was human nature, I'd be, you know, be more upset.

[944] There are two variables here, or rather, there's a fixed variable.

[945] There's a constant, which is human nature, and there's the independent variable, environment, and behavior is the dependent variable.

[946] And the thing that I think is bad is the independent variable, the environment.

[947] Which means I think the environment can get better.

[948] And there's a lot of things I can go into about why the environment I think is bad.

[949] But I have hope because I think the thing that's bad for us is something that can change.

[950] The first-principles idea here is that most people have the capacity to be a 0.7 to 0.9 if the environment is properly calibrated with the right incentives.

[951] I think that, well, yeah, if we're all 0.5s, I think that environments can bring out our good side.

[952] You know, yes, maybe we're all on some kind of distribution, and the right environment can bring out our higher sides.

[953] And I think in a lot of ways you could say it has.

[954] I mean, in the U.S. environment, we take for granted the liberal laws and liberal environment that we live in. I mean, like, in New York City, right, if you walk down the street and you, like, assault someone, A, if anyone sees you, they're probably going to yell at you.

[955] You might get your ass kicked by someone for doing that.

[956] You also might end up in jail, you know, because of the security cameras, and there's just norms.

[957] You know, we're all trained.

[958] That's what awful people do, right?

[959] So it's not that human nature doesn't have it in it to be like that.

[960] It's that this environment we're in has made that a much, much, much smaller experience for people.

[961] There's so many examples like that where it's like, man, you don't realize how much of the worst human nature is contained by our environment.

[962] But I think that, you know, a rapidly changing environment is what we have right now. Social media starts.

[963] I mean, what a seismic change to the environment.

[964] There's a lot of examples like that; a rapidly changing environment can create rapidly changing behavior.

[965] And wisdom sometimes can't keep up.

[966] And so we, you know, we can really kind of lose our grip on some of the good behavior.

[967] Were you surprised by Elon's answer about brain transplants or Sam's about nuclear power?

[968] Or did anything else jump out?

[969] On Sam's, I have a friend, Isabelle Boemeke, who's a nuclear power, you know, influencer.

[970] I've become very convinced, and I've not done my deep dive on this.

[971] But here, in this case, this is reasoning by analogy.

[972] The amount of really smart people I respect who all, who seem to have dug in, who all say nuclear power is clearly a good option.

[973] It's obviously emission-free, and, you know, the concerns about meltdowns and waste, they say, are completely overblown.

[974] So judging from those people, secondary knowledge here, I will say, I'm a strong advocate.

[975] I haven't done my own deep dive yet, but it does seem a little bit odd that you've got people who are so concerned about climate change who are somehow anti-nuclear power; it seems like it's kind of an ideology where nuclear power doesn't fit, rather than a rational fear of climate change.

[976] I personally am uncomfortably reasoning by analogy with climate change.

[977] I actually have not done a deep dive myself.

[978] Me neither.

[979] Because it's so, man, it seems like a deep dive.

[980] And my reasoning by analogy there currently has me thinking it's a truly existential thing, but feeling hopeful.

[981] So let me, this is me speaking, and this is speaking from a person who's not done the deep dive.

[982] I'm a little suspicious of the amount of fearmongering going on.

[983] Especially over the past couple of years, I've gotten uncomfortable with fearmongering in all walks of life.

[984] There's way too many people interested in manipulating the populace with fear.

[985] And so I don't like it.

[986] I should probably do a deep dive, because to me, well, the big problem with the fearmongering around climate change or whatever is that it also grows the skepticism in science broadly.

[987] So I need to make sure I do that deep dive.

[988] I have listened to a few folks who kind of criticize the fearmongering and all those kinds of things, but they're few and far between.

[989] And so it's like, all right, what is the truth here?

[990] And it feels lazy.

[991] But it also feels like it's hard to get to the truth; like, there's a lot of kind of activist talk about the idea versus, like, sources of objective, calm, first-principles-type reasoning.

[992] Like, one of the things, I know it's supposed to be a very big problem, but when people talk about catastrophic effects of climate change, I haven't been able to see really great, deep analysis of what that looks like in 10, 20, 30 years: rising sea levels.

[993] What are the models of how that changes human behavior, society?

[994] What are the things that happen?

[995] There's going to be constraints on the resources and people are going to have to move around.

[996] This is happening gradually.

[997] Are we going to be able to respond to this?

[998] How would we respond to this?

[999] What are the best models for how everything goes wrong?

[1000] Again, this is a question I keep starting to ask myself without doing any research, like, motivating myself to get to this deep dive that I feel is deep; just watching people not do a great job with that kind of modeling with the pandemic, and sort of being caught off guard, and wondering, okay, if we're not good with this pandemic, how are we going to respond to other kinds of tragedies?

[1001] So this is part of why I wrote the book, because I said, we're going to have more and more of these, like, big collective, what should we do here situations, you know, whether it's, how about when, you know, we're probably not that far away from people being able to go and decide the IQ of their kid or, like, you know, make a bunch of embryos and actually, you know, pick the highest IQ.

[1002] What can possibly go wrong?

[1003] Yeah.

[1004] And also, like, imagine the political sides of that, and, like, at first something only wealthy people can afford, and just the nightmare, right?

[1005] We need to be able to have our wits about us as a species, where we can actually get into a topic like that and get to a place where the collective brain can be smart.

[1006] I think that there are certain topics where I think of this, and this is, again, another simplistic model, but I think it works, is that there's a higher mind and a primitive mind, right?

[1007] You can, you know, in your head.

[1008] And these team up with others.

[1009] The higher mind is more rational and puts out ideas that it's not attached to.

[1010] And so it can, it can change its mind easily.

[1011] It's just an idea, and the higher mind can get criticized.

[1012] Their ideas can get criticized, and it's no big deal.

[1013] And so when the higher minds team up, it's like all these people in the room, like throwing out ideas and kicking them, and one idea goes out, and everyone criticizes it, which is, like, you know, shooting bows and arrows at it.

[1014] And with the true idea, you know, the arrows bounce off.

[1015] And so, okay, it rises up.

[1016] And the other ones get shot down.

[1017] So it's this incredible system.

[1018] This is what a good science institution is: you know, someone puts out a thing, and criticism arrows come at it.

[1019] And, you know, most of them fall, and the needles in the haystack end up rising up, right?

[1020] Incredible mechanism.

[1021] So what's happening is a bunch of flawed, medium scientists are creating a superintelligence.

[1022] Then there's the primitive mind, which, you know, is the more limbic systemy part of our brain.

[1023] It's the part of us that is very much not living in 2021.

[1024] It's living many tens of thousands of years ago.

[1025] And it does not treat ideas like this separate thing.

[1026] It identifies with its ideas.

[1027] It only gets involved when it finds an idea sacred.

[1028] It starts holding an idea sacred, and it starts identifying with it.

[1029] So what happens is they team up too.

[1030] And so when you have a topic that really rouses a bunch of primitive minds, the primitive minds quickly team up, and they create an echo chamber where suddenly no one can criticize this idea.

[1031] And in fact, if it's powerful enough, people outside the community, no one can criticize it.

[1032] We will get your paper retracted.

[1033] We will get you fired, right?

[1034] That's not higher mind behavior.

[1035] That is crazy primitive mind.

[1036] And so now what happens is the collective becomes dumber than an individual, dumber than a single reasoning individual.

[1037] This collective is suddenly attached to this sacred scripture, this idea, and they will not change their mind.

[1038] And they get dumber and dumber.

[1039] And so climate change, what's worrisome is that climate change has in many ways become a sacred topic, where if you come up with a nuanced thing, you might get branded a denier.

[1040] Yes.

[1041] So there goes the superintelligence.

[1042] All the arrows, no arrows can be fired.

[1043] But if you get called a denier, that's a social penalty for firing an arrow at a certain orthodoxy.

[1044] Right.

[1045] And so what's happening is the big brain gets like frozen, right?

[1046] And it becomes very stupid.

[1047] Now, you can also say that about a lot of other topics right now.

[1048] You know, you just mentioned another one.

[1049] I forget what it was.

[1050] But that's also kind of like this.

[1051] The world of vaccines.

[1052] Yeah, yeah, COVID.

[1053] Okay.

[1054] And here's my point from earlier, which is that what I see is that the political divide has, like, a whirlpool that's pulling everything into it.

[1056] And in that whirlpool, thinking is done with the primitive-mind tribes.

[1057] And so I get, you know, okay, obviously something like race, right?

[1058] That makes sense.

[1059] Also right now, the topic of race, for example, or gender, these things are in the whirlpool.

[1060] But that at least is like, okay, that's something that the primitive mind would always get really worked up about.

[1061] You know, it taps into like our deepest kind of like primal selves.

[1062] COVID, you know, I mean, COVID is like this in a way too, but climate change, that should just be something where our rational brains are like, let's solve this complex problem.

[1063] But the problem is that it's all gotten sucked into the red-versus-blue whirlpool.

[1064] And once that happens, it's in the hands of the primitive minds.

[1065] And we're losing our ability to be wise together, to make decisions.

[1066] It's like the big species brain or the big American brain is like drunk at the wheel right now.

[1067] And we're going into our future with more and more big technologies.

[1068] Scary things.

[1069] We have to make big, right decisions, and, you know, we're getting dumber as a collective, and that's part of this environmental problem.

[1070] So within the space of technologists and the space of scientists, we should allow the arrows.

[1071] That's one of the saddest things to me, like with the scientists, I've seen arrogance.

[1072] There's a lot of mechanisms that maintain the tribe.

[1073] It's the arrogance.

[1074] It's how you built up this mechanism that defends, this wall that defends against the arrows.

[1075] It's arrogance, credentialism, like, just ego, really.

[1076] And then it just protects you from actually challenging your own ideas, this ideal of science that makes science beautiful.

[1077] In a time of fear, and in a time of division created perhaps by politicians that leverage the fear, it, like you said, makes the whole system dumber.

[1078] The science system dumber, the tech developer system dumber, if they don't allow the challenging of ideas.

[1079] What's really bad is that, like, in a normal environment, you're always going to have echo chambers.

[1080] So what's the opposite of an echo chamber?

[1081] I created a term for it, because I think we need it: an idea lab, right?

[1082] It's like people act like scientists; even if they're not doing science, they just treat their ideas like science experiments.

[1083] And they toss them out there and everyone disagrees.

[1084] And disagreement is like the game.

[1085] Everyone likes to disagree, you know, on a certain text thread where, it's almost like someone throws something out, and it's just an impulse with the rest of the group to say, I think you're being, like, overly general there.

[1086] Or I think like, aren't you kind of being, I think that's like your bias showing.

[1087] And it's like, no one's getting offended, because it's like we're all just messing around.

[1088] We all, of course, respect each other, obviously.

[1089] We're just, you know, trashing each other's ideas, and then the whole group becomes smarter.

[1090] You're always going to have idea labs and echo chambers, right, in different communities.

[1091] And most of us participate in both of them, you know; maybe your marriage is a great idea lab.

[1092] You love to disagree with your spouse. But maybe in this group of friends, or with your family at home, you know, in front of that sister, you do not bring up politics, because she enforces it, and when that happens, her bullying is forcing the whole room to be an echo chamber to appease her. Now, what scares me is that usually you have these things existing kind of in bubbles, and they each have their natural defenses against each other. So an echo chamber person stays in their echo chamber; they will cut you out, they don't like to be friends with people who disagree with them.

[1093] You notice that.

[1094] They will cut you out.

[1095] They'll cut out their parents if they vote for Trump or whatever, right?

[1096] So that's how they do it.

[1097] They will say, I'm going to stay inside of an echo chamber safely.

[1098] So my ideas, which I identify with, because my primitive mind is doing the thinking, are not going to ever have to get challenged because it feels so scary and awful for that to happen.

[1099] But if they leave and they go into an idea lab environment, people are going to disagree.

[1100] And the person is going to try to bully them.

[1101] They're going to say, that's really offensive.

[1102] And people are going to say, no, it's not.

[1103] And they're going to, they're going to immediately say these people are assholes, right?

[1104] So the echo chamber person doesn't have much power once they leave the echo chamber.

[1105] Likewise, the idea lab person, they have this great environment, but if they go into an echo chamber and they do that, they will get kicked out of the group; they will get branded as something, you know, a denier or a racist, a right-winger, a radical, these nasty words.

[1106] The thing that I don't like right now is that the echo chambers have found ways to forcefully expand into places that normally have a pretty good immune system against echo chambers, like universities, like science journals, places where usually there's a strong idea lab culture. Veritas, you know, that's an idea lab slogan.

[1107] What happens is that a lot of people have found a way to actually go outside their thing and keep their echo chamber by making sure that everyone is scared, because they can punish anyone, whether you're in their community or not. That's all brilliantly put. When's the book coming out? The idea is June, July; we're not quite sure yet. Okay, I can't wait. Thanks. Awesome. Do you have a title yet, or can you not talk about that? Still working on it. Okay. If it's okay, just a couple of questions from the mailbag. I just love these; I would love to hear you riff on these. So one is about film and music. The question goes: why do we prefer to watch a film

[1108] we haven't watched before, but want to listen to songs that we have heard hundreds of times?

[1109] This question and your answer really started to make me think, like, yeah, that's true.

[1110] That's really interesting.

[1111] Like, we draw that line somehow.

[1112] So what's the answer?

[1113] So I think, let's use these two minds again.

[1114] I think that when your higher mind is the one taking something in, it's really interested: you know, what are the lyrics, or I'm going to learn something, you know, reading a book or whatever; the higher mind is trying to get information.

[1115] And once it has it, there's no point in listening to it again.

[1116] It has the information.

[1117] You know, your rational brain is like, I got it.

[1118] But when you eat a good meal or have sex or whatever, that's something you can do again and again because it actually your primitive brain loves it, right?

[1119] And it never gets bored of things that it loves.

[1120] So I think music is a very primal thing.

[1121] I think music goes right into our primitive brain.

[1122] And, you know, of course, it's a collaboration.

[1123] You know, your rational brain is absorbing the actual message.

[1124] But I think it's all about emotions and even more than emotions.

[1125] It literally, like, you know, music taps into like some very, very deep, you know, primal part of us.

[1126] And so when you hear a song once, even some of your favorite songs, the first time you heard it, you were like, I guess that's kind of catchy.

[1127] Yeah.

[1128] And then you end up loving it on the 10th listen.

[1129] But sometimes you don't even like a song at first.

[1130] You're like, oh, this song sucks.

[1131] But suddenly you find yourself on the 40th time because it's on the radio all the time, just kind of being like, oh, I love this song.

[1132] And you're like, wait, I don't, I hated this song.

[1133] And what's happening is that the sound is actually, the music is actually carving a pathway in your brain.

[1134] And it's a dance.

[1135] And when your brain knows what's coming, it can dance, it knows the steps.

[1136] So your brain is actually dancing with the music.

[1137] And it knows the steps and it can anticipate.

[1138] And so there's something about knowing, having memorized the song, that makes it incredibly enjoyable to us.

[1139] But when we hear it for the first time, we don't know where it's going to go.

[1140] We're like an awkward dancer.

[1141] We don't know the steps.

[1142] And your primitive brain can't really have that much fun yet.

[1143] That's how I feel.

[1144] And movies, that's less primitive.

[1145] That's the story you're taking in.

[1147] But a really good movie that we really love, often we will watch it like 12 times.

[1148] You know, and still like it.

[1149] Not that many, but versus...

[1150] If you're watching a talk show, right, or if you're listening to one of your podcasts, as a perfect example, there are not many people that will listen to one of your podcasts, no matter how good it is, 12 times.

[1151] Because it's, once you've got it, you got it.

[1152] It's a form of information that's very higher mind focused.

[1153] Well, you know, the funny thing is there are people that listen to a podcast episode many, many times.

[1154] And often I think the reason for that is not the information; it's the chemistry, it's the music of the conversation.

[1155] Yeah.

[1156] So it's not the actual information.

[1157] It's the art of it they like.

[1158] Yeah, they'll fall in love with some kind of person, some weird personality, and they'll just be captivated by the beat of that kind of person.

[1159] Or like a stand-up comic.

[1160] I've watched, like, certain things, like, episodes, like, 20 times, even though, you know.

[1161] I have to ask you about the wizard hat.

[1162] You've had a blog post about Neuralink.

[1163] I got a chance to visit Neuralink a couple of times, hanging out with those folks.

[1164] That was one of the pieces of writing you did that, like, changes culture and changes the way people think about a thing.

[1166] The ridiculousness of your stick figure drawings, it's somehow, you know, it's like calling the origin of the universe the Big Bang.

[1167] It's a silly title, but it somehow sticks as the representative of that thing.

[1168] In the same way, the wizard hat for Neuralink somehow was a really powerful way to explain it.

[1169] You actually proposed that the Man of the Year cover of Time should be...

[1170] One of my drawings.

[1171] One of your drawings.

[1172] In general, yes.

[1173] It's an outrage that it wasn't.

[1174] It was.

[1175] Okay.

[1176] So, all those years later, what are your thoughts about Neuralink?

[1177] Do you still find this idea exciting?

[1178] Like, what excites you about it? Is it the big long-term philosophical things?

[1179] Is it the practical things?

[1180] Do you think it's super difficult to do on the neurosurgery side and the materials engineering, the robotics side?

[1181] Or do you think it's the machine learning side for the brain-computer interface,

[1182] where they get to learn about each other, all that kind of stuff.

[1183] I would just love to get your thoughts, because you're one of the people that really considered this problem of brain-computer interfaces, really studied it.

[1184] I mean, I'm super excited about it.

[1185] I really think it's actually Elon's most ambitious thing, more than colonizing Mars, because that's just a bunch of people going somewhere, even though it's somewhere far.

[1186] Neuralink is changing what a person is, eventually.

[1187] Now, I think that Neuralink engineers and Elon himself would all be the first to admit that it is a maybe, whether they can achieve their goals here.

[1188] I mean, what they're eventually trying to do is so crazy ambitious. Of course, in the interim, they have a higher probability of accomplishing smaller things, which are still huge, like basically solving paralysis, you know, strokes, Parkinson's, things like that.

[1189] I mean, it can be unbelievable.

[1190] And, you know, even for anyone who doesn't have one of these conditions, everyone should be very happy about this kind of help with different disabilities.

[1191] But the grand goal is this augmentation, where you take someone who's totally healthy and you put a brain-machine interface in anyway, to give them superpowers.

[1192] You know, the possibilities if they can do this, if they can really... So, you know, they've already shown that they are for real.

[1193] They've created this robot.

[1194] Elon talks about how it should be like LASIK, where it shouldn't be something that needs a surgeon.

[1195] This shouldn't just be for rich people who have waited in line for six months.

[1196] It should be for anyone who can afford LASIK, and eventually, hopefully, something that is covered by insurance, or something that anyone can do.

[1197] Something this big a deal should be something that anyone can afford eventually.

[1198] And when we have this, again, I'm talking about a very advanced phase down the road.

[1199] So maybe a less advanced phase, just to start there: if you think about when you listen to music, when you listen to a song, what's happening? Do you actually hear the sound?

[1200] Well, not really.

[1201] It's that the sound is coming out of the speaker.

[1202] The speaker is vibrating.

[1203] It's vibrating air molecules.

[1204] Those air molecules, you know, get vibrated all the way to your head, a pressure wave.

[1205] And then it vibrates your eardrum.

[1206] Your eardrum is really the speaker now in your head, which then vibrates bones and fluid, which then stimulates neurons in your auditory cortex, which gives you the perception that you're hearing sound.

[1207] Now, if you think about that, do we really need to have a speaker to do that?

[1208] You could just somehow, if you had a little tiny thing that could vibrate eardrums, you could do it that way.

[1209] That seems very hard.

[1210] But really, you can go to the very end: the thing that really needs to happen is your auditory cortex neurons need to be stimulated in a certain way.

[1211] If you have a ton of Neuralink electrodes in there, and they get really good at stimulating things, you could play a song in your head that you hear, that is not playing anywhere.

[1212] There's no sound in the room, but you hear it, and no one else could.

[1213] It's not like they can get close to your head and hear it.

[1214] There's no sound.

[1215] They could not hear anything, but you hear sound.

[1216] You can turn it up: so you open your phone, you have the Neuralink app.

[1217] You open the Neuralink app, you know, and, so basically you can open your Spotify and you can play to your speaker, you can play to your computer, you can play right out of your phone to your headphones, or, you have a new one.

[1218] You can play into your brain.

[1219] And this is one of the earlier things.

[1220] This is, you know, something that seems like really doable.

[1221] So, you know, no more headphones.

[1222] I always think that's so annoying because I can leave the house with just my phone, you know, and nothing else, or even just Apple Watch, but there's always this one thing.

[1223] I'm like, and headphones.

[1224] You do need your headphones, right?

[1225] So I feel like, you know, that'll be the end of that.

[1226] But there are so many things, and you keep going: the ability to think together, you know, you can talk about, like, super brains.

[1227] I mean, one of the examples Elon uses is the low bandwidth of speech.

[1228] If I go to a movie and I come out of a scary movie and you say, how was it?

[1229] I said, oh, it was terrifying.

[1230] Well, what did I just do?

[1231] I just gave you, I had five buckets I could have given you.

[1232] One was horrifying, terrifying, scary, eerie, creepy, whatever.

[1233] That's about it.

[1234] And I had a much more nuanced experience than that.

[1235] And I don't, all I have is, you know, these words, right?

[1236] And so instead I just hand you the bucket.

[1237] I put the stuff in the bucket and give it to you.

[1238] But all you have is the bucket.

[1239] You just have to guess what I put into that bucket.

[1240] All you can do is look at the label of the bucket and say, well, when I say terrifying, here's what I mean.

[1241] So the point is it's very lossy.

[1242] I had all this nuanced information of what I thought of the movie.

[1243] And I'm sending you a very low-res package, and you're going to now guess what the high-res thing looked like.

[1244] That's language in general.

[1245] Our thoughts are much more nuanced.

[1246] We can think to each other.

[1247] We can do amazing things.

[1248] We could, A, have a brainstorm that doesn't feel like, oh, we're talking in each other's heads.

[1249] It's not just that I hear your voice.

[1250] No, no, no words are being said internally or externally.

[1251] The two brains are literally collaborating.

[1252] It's something.

[1253] It's a skill.

[1254] I'm sure we'd have to get good at it.

[1255] I'm sure young kids will be great at it and old people would be bad.

[1256] But you think together, and together you're like, ah, we had a joint epiphany.

[1257] And now how about eight people in a room doing it, right?

[1258] So it gets, you know, there's other examples.

[1259] How about when you're a dress designer or a bridge designer?

[1260] And you want to show people what your dress looks like.

[1261] Well, right now, you got to sketch it for a long time.

[1262] Here, just beam it onto the screen from your head.

[1263] So you can picture it.

[1264] If you can picture a tree in your head, well, suddenly whatever's in your head can be pictured.

[1265] So we'll have to get very good at it.

[1266] Right.

[1267] And it'll take a skill, right?

[1268] You know, you're going to have to learn it, but the possibilities.

[1269] My God, talk about, like, I feel like if that works, if we really do have that as something, I think it'll almost be like a new AD/BC line.

[1270] It's such a big change that the idea of like anyone living before everyone had brain machine interfaces is living in like before the common era.

[1271] It's that level of like big change if it can work.

[1272] Yeah, and like replay of memories, just replaying stuff in your head.

[1273] Oh my God, yeah.

[1274] And copying, you know, you can hopefully copy memories onto other things, and you don't have to just rely on your wet circuitry. It does make me sad, because you're right, the brain is incredibly neuroplastic, and so it can adjust, it can learn how to do this. I think it'll be a skill, but probably you and I will be too old to truly learn it well. Maybe we can get there, there'll be great trainings, you know, I'm spending the next three months in one of the Neuralink trainings. But it'll still be a bit of a grandpa thing. This is, you know, I was saying, how am I going to be old? I'm like, no, I'm going to be great at the new phones. It's not going to be the phones. It's going to be that the kid's going to be thinking to me, and I'm going to be like, can you just talk, please?

[1275] And they're going to be like, okay, I'll just talk.

[1276] And they're going to, so that'll be the equivalent of, you know, yelling to your grandma today.

[1277] I really suspect.

[1278] I don't know what your thoughts are, but I grew up in a time when physical contact, interaction was valuable.

[1279] I just feel like that's going to go the way that's going to disappear.

[1280] Why?

[1281] I mean, is there anything more intimate than thinking with each other?

[1282] I mean, that's, you talk about, you know, once we're all doing that, it might feel like, man, everyone was so isolated from each other before.

[1283] Sorry.

[1284] So I didn't say that intimacy disappears.

[1285] I just meant physical, having to be in the same room, having to touch each other.

[1286] But if it is important, won't there be whole waves of people who start to say, you know, there are all these articles that come out about how, in our metaverse, we've lost something important.

[1287] And then, first the hippies start doing it.

[1288] And then eventually it becomes this big wave, and now everyone's doing it. If something truly is lost, won't we recover it?

[1289] Well, I think from first principles, all of the components are there to engineer intimate experiences in the metaverse, or in cyberspace. And so to me, I don't see anything profoundly unique to the physical experience. Like, I don't understand. But then why are you saying there's a loss there? No, I just said, because, well, it's a loss for me personally. So then you do think there's something unique in the physical experience? For me, because I was raised with it. Oh, yeah, yeah, yeah.

[1290] So anything you're raised with, you fall in love with.

[1291] Like, people in this country came up with baseball.

[1292] I was raised in the Soviet Union.

[1293] I don't understand baseball.

[1294] I like it, but I don't love it the way Americans love it.

[1295] Because a lot of times they went to baseball games with their father, and there's that family connection.

[1296] There's a young kid dreaming about, I don't know, becoming an MLB player himself.

[1297] I don't know, something like that, but that's what you're raised with, obviously, is really important.

[1298] But I mean, fundamentally to the human experience, listen, we're doing this podcast in person.

[1299] So clearly, I still value it.

[1300] But it's true.

[1301] If this were, obviously through a screen, we all agree, that's not the same.

[1302] Yeah, it's not the same.

[1303] But if this were some, you know, we had contact lenses on, and maybe Neuralink in play.

[1304] Again, forget the devices; even something as cool as a contact lens, that's all old school.

[1305] Yeah.

[1306] Once you have the brain machine interface, it'll just be projection.

[1307] of, you know, it'll take over my visual cortex.

[1308] My visual cortex will get put into a virtual room and so will yours.

[1309] So we will really hear and see as if we're there, and you won't have any masks, no VR mask needed.

[1310] And at that point, it really will feel like you'll forget.

[1311] You'll say, did we meet physically or not?

[1312] It'd be so unimportant, you won't even remember.

[1313] And you're right.

[1314] This is one of those shifts in society that changes everything.

[1315] But romantically, people still need to be together.

[1316] There's a whole set of, like, physical things in a relationship that are needed, you know, like...

[1317] Like what?

[1318] Like sex?

[1319] Sex, but also just, like, that there's pheromones.

[1320] Like, there's, the physical touch is such a...

[1321] That's like music.

[1322] It goes to such a deeply primitive part of us that what physical touch with a romantic partner does, that I think that...

[1323] So I'm sure there'll be a whole wave of people whose new thing is that you're romantically involved with people you've never actually been in person with, but...

[1324] And I'm sure there'll be things where you can actually smell.

[1325] What's in the room, and you can touch. Yeah, but I think that'll be one of the last things to go. I think it'd be a while before people feel like there's nothing lost by not being there. It's very difficult to replicate the human interaction. Although sex, also, again, not to get too weird, but you could have a thing where, let's just do a massage because it's less awkward, although everyone is still imagining sex. A masseuse could massage a fake body and you could feel whatever's happening, right?

[1326] So you're lying down in your apartment alone, but you're feeling a full...

[1327] There'll be the new, like, YouTube or live streaming, where it's one masseuse massaging one body, but a thousand people are experiencing it.

[1328] Exactly, right now, think about it.

[1329] Right now, you know, Taylor Swift doesn't play for one person and then go around to every one of her fans to play again, same with a book, right?

[1330] You do it and it goes everywhere, so it'll be the same idea.

[1331] You've written and thought a lot about AI, AI safety specifically. You've mentioned you're actually starting a podcast, which is awesome. You're so good at talking, so good at thinking, so good at being weird in the most beautiful of ways. But you've been thinking about this AI safety question. Where today does your concern lie, for the near future, for the long-term future? Quite a bit of stuff has happened, including with Elon's work with Tesla Autopilot.

[1332] There's a bunch of amazing robots with Boston Dynamics, and everyone's favorite vacuum robot, iRobot, Roomba, and then there's obviously the applications of machine learning for recommender systems in Twitter, Facebook, and so on, and, you know, face recognition for surveillance, all these kinds of things are happening.

[1333] Just a lot of incredible use of, not the face recognition, but the incredible use of deep learning, machine learning to capture information about people and try to recommend to them what they want to consume next.

[1334] Some of that can be abused, some of that can be used for good, like for Netflix or something like that.

[1335] What are your thoughts about all this?

[1336] Yeah, I mean, I really don't think humans are very smart, all things considered.

[1337] I think we're like limited.

[1338] And we're not, we're dumb enough that we're very easily manipulable.

[1339] Not just like, oh, like, our emotions, people can, you know, our emotions can be pulled like puppet strings.

[1340] I mean, again, I do look at what's going on in political polarization now, and I see a lot of puppet-string emotions happening.

[1341] So, yeah, there's a lot to be scared of, for sure, like, very scared of.

[1342] I get excited about a lot of things, very specific things.

[1343] Like, one of the things I get excited about is, like, so the future of wearables, right?

[1344] Again, I think that, oh, the Fitbit around my wrist,

[1345] you know, the Whoop, is going to seem really hilariously old school in 20 years.

[1346] Like with Neuralink.

[1347] We're wearing like a big bracelet, right?

[1348] It's going to turn into little sensors in our blood, probably, things that are going to be collecting a hundred times more data than it collects now, more nuanced data, more specific to our body.

[1349] And it's going to be, you know, super reliable, but that's the hardware side.

[1350] And then the software is going to be, this is, I've not done my deep dive.

[1351] This is all speculation, but the software is going to get really good, and this is the AI component.

[1352] And so I get excited about specific things like that.

[1353] Like, think about if hardware were able to collect, first of all, your whole genome, and we knew a lot more about what a genome sequence means, because you can collect your genome now, and we just don't have much to do with that information yet.

[1354] As AI gets, so now you have your genome, you've got what's in your blood at any given moment, all the levels of everything, right?

[1355] You have the exact width of your heart arteries.

[1356] At any given moment, you've got all the, all the viruses that ever visited your body because there's a trace of it.

[1357] So you have all the pathogens, all the things that, like, you should be concerned about health -wise and might have threatened you or you might be immune from all of that kind of stuff.

[1358] Also, of course, it knows how fast your heart is beating, and it knows exactly the amount of exercise you get.

[1359] It knows your muscle mass and your weight and all that.

[1360] but it also maybe can even know your emotions.

[1361] I mean, if emotions, you know, what are they?

[1362] You know, where do they come from?

[1363] Probably pretty obvious chemicals once we get in there.

[1364] So again, Neuralink can be involved here, maybe, in collecting information.

[1365] You know, because right now you have to do the thing.

[1366] What's your mood right now?

[1367] And it's hard to even assess, you know, and you're in a bad mood.

[1368] It's hard to even.

[1369] But by the way, just as a shout-out, Lisa Feldman Barrett, who's a neuroscientist at Northeastern, a few years ago wrote a whole book saying our expression of emotions has nothing to do with the experience of emotions.

[1370] So you really actually want to be measuring.

[1371] That's exactly.

[1372] And you can tell because one of these apps pops up and says, you know, how do you feel right now?

[1373] Good, bad.

[1374] I'm like, I don't know.

[1375] Like, I feel bad right now because the thing popping up reminded me that I'm procrastinating, because I was on my phone.

[1376] You know, like that's not my emotion.

[1377] So I think it will probably be able to get all this info, right?

[1378] Now the AI can go to town.

[1379] Think about when the AI is really good at this.

[1380] And it knows your genome. I want the AI to just tell me what to do. Okay, so how about this?

[1381] Now imagine attaching that to a meal service, right?

[1382] And the meal service has everything, you know, all the, you know, million ingredients and supplements and vitamins and everything.

[1383] And I give the, I tell the AI my broad goals.

[1384] I want to gain muscle or I want to, you know, maintain my weight, but I want to have more energy or whatever.

[1385] Or I just want to be very healthy. Obviously, everyone wants the same, like, 10 basic things: you want to avoid cancer, you want to age slower, various things. So now the AI has my goals, and a little thing pops up, like, beep beep, in 15 minutes you're going to eat, because it knows that's the right time for my body to eat. Fifteen minutes later, a little slot opens in my wall where a drone has come from the food factory and dropped the perfect meal for that moment, for me, for my mood, for my genome, for my blood contents.

[1386] And it's because it knows my goals.

[1387] So, you know, it knows I want to feel energy at this time and then I want to wind down here.

[1388] So those things, you have to tell it.

[1389] Well, plus the pleasure thing; it knows what kinds of components of a meal you've enjoyed in the past, so it can assemble the perfect meal.

[1390] Exactly.

[1391] It knows you way better than you know yourself, better than any human could ever know you.

[1392] And a little thing pops up.

[1393] You still have some choice, right?

[1394] So it pops up and it says, like, coffee, because it knows my cutoff. It says I can have coffee for the next 15 minutes only, because it knows how long it stays in my system.

[1395] It knows what my sleep is like when I have it too late.

[1396] It knows I have to wake up at this time tomorrow because that's my calendar.

[1397] And so this is, I think, something that humans are wrong about: most people will hear this and be like, that sounds awful.

[1398] That sounds dystopian.

[1399] No, it doesn't.

[1400] It sounds incredible.

[1401] And if we all had this, we would not look back and be like, I wish I was like making awful choices every day, like I was in the past.

[1402] And then these aren't important decisions anymore. Your important decision-making energy, your important focus and attention can go to your kids, and your work, and helping other people, and things that matter. So when I think about personal lifestyle stuff like that, I really love thinking about it. I think it's going to be very exciting, and I think we'll all be so much healthier. When we look back at today, one of the things that's going to look so primitive is the one-size-fits-all thing, like reading generic advice about keto. Each genome is going to have very specific, unique advice coming from AI. Yeah, the customization that's enabled by the collection of data and the use of AI. A lot of people think of the worst-case scenario, that data being used by authoritarian governments to control you, all that kind of stuff. They don't think about what's most likely: especially in a capitalist society, it's most likely going to be used as part of a competition to get you the most delicious and healthy meal possible, as fast as possible.

[1403] Yeah, so the world will definitely be much better with the integration of data.

[1404] But of course, you want transparency and honesty about how that data is misused, and that's why it's important to have free speech and for people to speak out when some bullshit is being done by companies.

[1405] That we need to have our wits about us as a society.

[1406] Like, this is one thing.

[1407] Free speech is the mechanism by which the big brain can think, can think for itself.

[1408] can think straight, can see straight.

[1409] When you take away free speech, when any topic that's political becomes treacherous to talk about.

[1410] So forget the government taking away free speech.

[1411] If the culture penalizes nuanced conversation about any topic that's political, and the politics is so all-consuming, and it's such an incredible market for media to polarize people and to bring any topic it can into that,

[1412] and get people hooked on it as a political topic, then we become a very dumb society.

[1413] So free speech goes away as far as it matters.

[1414] People like to say, oh, you don't even know what free speech is.

[1415] You know, your free speech is not being violated.

[1416] It's like, no, you're right.

[1417] My First Amendment rights are not being violated.

[1418] But the culture of free speech, which is the second ingredient of two.

[1419] You need the First Amendment, and you need the culture of free speech, and now you have free speech.

[1420] And the culture is much more specific.

[1421] You obviously can have a culture that penalizes people, right?

[1422] Now take any topic that has to do with some very sensitive subjects: police shootings, or what's going on in K-12 schools, or even climate change. Take any of these, and the First Amendment's still there; you're not going to get arrested no matter what you say. But the culture of free speech is gone, because you can be destroyed, your life can be over, as far as it matters, if you say the wrong thing.

[1423] But in a really vigorous culture of free speech, you get no penalty at all even for saying something super dumb.

[1424] People will say, like, people will laugh and be like, well, that was like kind of hilariously offensive and like, not at all correct.

[1425] Like, you know, you're wrong.

[1426] And here's why.

[1427] But no one's like mad at you.

[1428] Now the brain is thinking at its best.

[1429] The IQ of the big brain is like as high as it can be in that culture.

[1430] And the culture where you say something wrong and people say, oh, wow, you've changed.

[1431] Oh, wow.

[1432] Like, look, these are his real colors.

[1433] The big brain is dumb.

[1434] You still have mutual respect for each other.

[1435] So, like, you don't think lesser of others when they say a bunch of dumb things.

[1436] You know it's just the play of ideas, but you still have respect.

[1437] You still have love for them.

[1438] Because I think the worst case is when you have a complete, free, like, anarchy of ideas where it's, like, everybody lost hope that something like a truth can even be converged towards.

[1439] Like, everybody has their own truth.

[1440] Then it's just chaos.

[1441] Like, if you have mutual respect.

[1442] and a mutual goal of arriving at the truth and the humility that you want to listen to other people's ideas and a forgiveness that other people's ideas might be dumb as hell.

[1443] That doesn't mean they're lesser beings, all that kind of stuff.

[1444] But that's like a weird balance to strike.

[1445] Right now people are being trained, little kids, college students, being trained to think the exact opposite way, to think that there's no such thing as objective truth, which is, you know, the N on the compass for every thinker.

[1446] Doesn't mean we're, you know, necessarily on our way or refining, but we're all aiming in the same direction.

[1447] We all believe that there's a place we can eventually get closer to.

[1448] No objective truth, you know, teaching them that disagreement is bad, that it's violence.

[1449] You know, it can quickly sound like you're just going on a political rant with this topic, but it's really bad.

[1450] It's genuinely the worst. If I had my own country, I would teach kids some very specific things that this is doing the exact opposite of. And it sucks, it sucks. Speaking of a way to escape this: you've tweeted about 30 minutes of reading a day, and it's cool to think about reading as a habit, something that accumulates. You said 30 minutes of reading a day equals 1,000 books in 50 years. I love thinking about this, like chipping away at the mountain. Can you expand on that, the habit of reading? How do you recommend people read? Yeah, yeah. I mean, it's incredible: if you do a little of something every day, it compiles, it compiles. You know, I always think about the people who achieve these incredible things in life, these great, famous, legendary people. They had the same number of days that you do, and it's not like they were doing magical days. They just got a little done every day, and that adds up to a monument. They're putting one brick in a day.

[1451] Eventually they have this building, this legendary building.

[1452] So you can take writing: say there's two aspiring writers.

[1453] And one doesn't ever write, manages zero pages a day.

[1454] And the other one manages to do two pages a week, right?

[1455] Not very much.

[1456] One does zero pages a week, the other two pages a week.

[1457] 98 % of both of their time is the same.

[1458] The other person, with just two percent of their time, is doing one other thing.

[1459] one year later, they have written, they write two books a year.

[1460] This prolific person, you know, in 20 years, they've written 40 books.

[1461] They're one of the most prolific writers of all time.

[1462] They write two pages a week.

[1463] Sorry, that's not true.

[1464] That was two pages a day.

[1465] Okay, two pages a week, you're still writing about a book every two years.

[1466] So in 20 years, you've still written 10 books, also prolific writer, right?

[1467] Huge, massive writing career.

[1468] You write two pages every Sunday morning.

[1469] The other person has the same exact week, and they don't do that Sunday morning thing.

[1470] They are a wannabe writer.

[1471] They always said they could write.

[1472] They talk about how they used to be here.

[1473] And nothing happens, right?
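(For what it's worth, the corrected arithmetic here checks out. A quick worked version, assuming a roughly 200-page book; the page count is an illustrative assumption, not something stated in the conversation:)

```python
# Two pages a week, assuming a ~200-page book (an illustrative assumption).
pages_per_week = 2
pages_per_year = pages_per_week * 52      # 104 pages per year
years_per_book = 200 / pages_per_year     # ~1.9 years per book
print(round(20 / years_per_book))         # ~10 books in 20 years
```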

[1474] So it's inspiring, I think, for a lot of people who feel frustrated.

[1475] They're not doing anything.

[1476] So reading is another example, comparing someone who, you know, doesn't read,

[1477] and someone who's a prolific reader. You know, I always think about, like, the Tyler Cowen types.

[1478] I'm like, how the hell do you read so much?

[1479] It's infuriating, you know.

[1480] Or like James Clear puts out his like his 10 favorite books of the year.

[1481] Twenty. His 20 favorite books of the year.

[1482] I'm like, your 20 favorites?

[1483] Like, I'm trying to just read 20 books.

[1484] Like, that would be an amazing year.

[1485] So, um, but the thing is, they're not doing something crazy and magical.

[1486] They're just reading a half hour a night.

[1487] You know, if you read a half hour a night, the calculation I came to is that you can read a thousand books in 50 years.

[1488] So if someone who's 80 and they've read a thousand books, you know, between 30 and 80, they are extremely well read.

[1489] They can, they can delve deep into many nonfiction areas.

[1490] They can be, you know, an amazing fiction reader, avid fiction reader.

[1491] And again, that's a half hour a day.

[1492] Some people can do an hour: a half hour in the morning with an audiobook, a half hour at night in bed.

[1493] Now they've read 2 ,000 books.
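(A rough sanity check of the thousand-books figure, assuming an average book takes about nine hours of reading or listening time; the nine-hour figure is my assumption, not stated in the conversation. Doubling to an hour a day doubles the result to roughly 2,000, matching the figure above.)

```python
# 30 minutes a day for 50 years, assuming ~9 hours per average book.
total_hours = 30 / 60 * 365 * 50    # 9,125 hours of reading
print(round(total_hours / 9))       # ~1014 -- roughly a thousand books
```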

[1494] So I think it's, it's just, it's motivating.

[1495] And you realize that a lot of times, when you think about the people who are doing amazing things and you're not, you think there's a bigger gap between you and them than there really is.

[1496] I, on the reading front.

[1497] I'm a very slow reader, which is just a very frustrating fact about me. But I'm faster with audiobooks.

[1498] And I also, I just, you know, I'll just, it's just hard to get myself to read.

[1499] But I've started doing audiobooks and I'll wake up, throw it on, do it in the shower, brushing my teeth, you know, making breakfast, dealing with the dogs, things like that, whatever, until I sit down.

[1500] And that's, I can read, I can read a book every week, book every 10 days at that clip.

[1501] And suddenly I'm this big reader because I'm just while doing my morning stuff, I have it on.

[1502] And also it's this fun that makes the morning so fun.

[1503] I'm like having a great time the whole morning.

[1504] So I'm like, oh, I'm so into this book.

[1505] So I think that, you know, audiobooks is another amazing gift to people who have a hard time reading.

[1506] I find that that's actually an interesting skill.

[1507] I do audiobooks quite a bit.

[1508] Like, it's a skill to maintain, at least for me, probably because of the kind of books I read, which is often, like, history, where there's a lot of content.

[1509] and if you miss parts of it, you miss out on stuff.

[1510] And so it's a skill to maintain focus, at least for me. Well, the 10 -second back button is very valuable.

[1511] Oh, interesting.

[1512] So if I get lost, sometimes the book is so good that I'm thinking about what the person just said, and the skill for me is just remembering to pause.

[1513] And if I don't, no problem, just back, back, back, just three quick backs.

[1514] So, of course, it's not that efficient, but it's, I do the same thing when I'm reading.

[1515] I'll read a whole paragraph and realize I was tuning out, you know?

[1516] You know, I haven't actually even considered to try that.

[1517] I've been so hard on myself maintaining focus because you do get lost in thought.

[1518] Maybe I should try that.

[1519] Yeah, and when you get lost in thought, by the way, you're processing the book.

[1520] That's not wasted time.

[1521] That's your brain really categorizing and cataloging what you just read and like.

[1522] Well, there are several kinds of thoughts, right?

[1523] There's thoughts related to the book and there's a thought that it could take you elsewhere.

[1524] Well, I find that if I am continually thinking about something else, I just say I'm not, I just pause the book.

[1525] Yeah, especially in the shower or something; that's sometimes when really great thoughts come up.

[1526] I'm having all these thoughts about other stuff.

[1527] I'm saying, clearly my mind wants to work on something else.

[1528] I'll just pause it.

[1529] Quiet, Dan Carlin.

[1530] I'm thinking about something else right now.

[1531] Exactly, exactly.

[1532] Also, you can, things like you have to head out to the store.

[1533] Like, I'm going to read 20 pages on that trip.

[1534] Just walking back and forth.

[1535] Going to the airport.

[1536] I mean, flights, you know, the Uber, and then you're walking through the airport.

[1537] You're standing in the security line.

[1538] I'm reading the whole time.

[1539] Like, I know this is not groundbreaking.

[1540] People know what audio books are, but I think that more people should probably get into them than do.

[1541] I know a lot of people, they have this stubborn kind of thing.

[1542] Like, I like to have the paper book.

[1543] And sure, but like, it's pretty fun to be able to read.

[1544] To this day, I listen to a huge number of audiobooks and podcasts, but the most impactful experiences for me are still reading.

[1545] And I read very, very slow.

[1546] And it's very frustrating when, like, you go to these websites, like, that estimate how long a book takes on average, those are always annoying.

[1547] They assume like a page a minute, when I read, at best, a page every two minutes.

[1548] At best, when you're like really like actually not pausing.

[1549] It's just my ADD; it's hard to keep focusing.

[1550] And I also like to really absorb.

[1551] So on the other side of things, when I finish a book, 10 years later I'll be like, you know that scene when this happens? And another friend who read it will be like, what?

[1552] I don't remember any like details.

[1553] I'm like, oh, I can tell you the entire thing.

[1554] So I absorbed the shit out of it.

[1555] But I don't think it's worth it to, like, have read so much less in my life.

[1556] So in terms of going to the airport, you know, in these filler moments of life, I do a lot of, there's an app called Anki.

[1557] I don't know if you know about it.

[1558] It's a spaced repetition app.

[1559] So there are all of these facts: when I read, I write something down if I want to remember it.

[1560] And you review it.

[1561] And the things you remember take longer and longer to come back up.

[1562] It's like flashcards, but a digital app.

[1563] It's called Anki. I recommend it to a lot of people.

[1564] There's a huge community of people that are just, like, obsessed with it.

[1565] A -N -K -E?

[1566] A -N -K -I.

[1567] So this is extremely well -known app and idea, like, among students who are, like, medical students, like, people that really have to study.

[1568] Like, this is not, like, fun stuff.

[1569] They really have to memorize a lot of things.

[1570] They have to remember them well.

[1571] They have to be able to integrate them with a bunch of ideas.

[1572] And I find it to be really useful.

[1573] For, like, when you read history. If you think about particular factoids, it'd probably be extremely useful for you. Actually, that's an interesting thought, because you talked about opening up a trillion tabs and reading things.

[1574] You know, you probably want to remember some facts you read along the way.

[1575] Like, you might remember, okay, this thing I can't directly put into the writing, but it's a cool little factoid.

[1576] I want to store that in there.

[1577] And that's what I go, Anki, drop it in.

[1578] Oh, you can just drop it in?

[1579] Yeah.

[1580] You drop in a line of a podcast or like a video?

[1581] Well, no. I guess I can type it, though.

[1582] So, yes.

[1583] So Anki, there's a bunch of these; the idea is called spaced repetition.

[1584] There's a bunch of apps that are much nicer than Anki.

[1585] Anki is the ghetto like Craigslist version, but it has a giant community because people are like, we don't want features.

[1586] We want a text box.

[1587] Like, it's very basic, very stripped down.

[1588] So you can drop in stuff.

[1589] You can drop in.

[1590] That sounds really, I can't believe I have not come across this.

[1591] Actually, once you look into it, you realize, how have I not come across this?

[1592] You are the person.

[1593] I guarantee you you'll probably write a blog about it.

[1594] I can't believe you actually haven't heard of Anki.

[1595] Well, it's also just like, it's your people too.

[1596] And my people say, what do you write about?

[1597] Literally anything I find interesting.

[1598] And so for me, once you start a blog, like your entire worldview becomes, would this be a good blog post?

[1599] I mean, that's the lens I see everything through.

[1600] But I'm constantly coming across something, or just a tweet, something that I'm like, oh, I need to share this with my readers.

[1601] My readers to me are, like, my friends, who I'm like, oh, I need to tell them about this.

[1602] And so I feel like just a place to, I mean, I collect things in a document right now, if it's, like, really good.

[1603] But it's the little factoids and stuff like that, I think, especially if I'm learning something.

[1604] So the problem, when you save stuff like tweets and all that kind of stuff, is you also need to couple that with a system for review, because what Anki does is literally determine that for me. I don't have to do anything.

[1605] There's this giant pile of things I've saved and it brings up to me, okay, here's, I don't know, when Churchill did something, right?

[1606] I'm reading about World War II a lot now.

[1607] Like a particular event, here's that.

[1608] Do you remember when, what year that happened?

[1609] And you say yes or no; you get to see the answer, and you get to self-evaluate how well you remembered that fact.

[1610] And if you remember, well, it'll be another month before you see it again.

[1611] If you don't remember, it'll bring it up again.
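(For the curious, a minimal sketch of the kind of scheduling being described, loosely in the spirit of the SM-2 family of algorithms Anki builds on; the specific intervals and multipliers are illustrative assumptions, not Anki's actual parameters.)

```python
# Minimal sketch of spaced-repetition scheduling: remembered cards wait
# longer and longer; forgotten cards come right back the next day.
from dataclasses import dataclass

@dataclass
class Card:
    fact: str
    interval_days: float = 1.0  # how long until the card is shown again
    ease: float = 2.5           # growth factor for well-remembered cards

def review(card: Card, remembered: bool) -> Card:
    """Self-evaluate a card and reschedule it."""
    if remembered:
        card.interval_days *= card.ease        # 1 day -> 2.5 -> ~6 -> ~16 -> ~39
    else:
        card.interval_days = 1.0               # bring it back up tomorrow
        card.ease = max(1.3, card.ease - 0.2)  # this card is harder than average
    return card

# Usage: a fact you keep remembering quickly stretches toward monthly reviews.
card = Card("1940: Churchill becomes Prime Minister")
for _ in range(4):
    card = review(card, remembered=True)
print(round(card.interval_days))  # ~39 days between reviews after 4 successes
```

The design point is exactly the one being made here: remembered material spaces itself out on its own, forgotten material returns immediately, and you never have to decide what to review.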

[1612] That's a way to review tweets, to review concepts.

[1613] Yeah.

[1614] And it offloads the kind of the process of selecting which parts you're supposed to review or not.

[1615] And you can grow that library.

[1616] I mean, obviously, medical students use it for tens of thousands of facts.

[1617] It just gamifies it, too.

[1618] It's like you can passively sit back,

[1619] and the thing will make sure you eventually learn it all. You don't have to be the executive running the memorization program; someone else is handling it. Yeah, I would love to hear about you trying it out. Spaced repetition is the idea, and there are a few other apps, but Anki's the big one. I totally want to try it. You've written and spoken quite a bit about procrastination. You, like many other people, suffer, in quotes, from procrastination. How do we avoid procrastination?

[1620] I don't think the suffer is in quotes.

[1621] I think that's a huge part of the problem is that it's treated like a silly problem.

[1622] People don't take it seriously as a dire problem, but it can be.

[1623] It can ruin your life.

[1624] We talked about

[1625] the compiling concept, you know: if you write two pages a week, you write a book every two years.

[1626] You're a prolific writer, right?

[1627] And the difference between, you know, again, it's not that that person's working so hard, it's that they have the ability to, when they commit to something, like on Sunday mornings, I'm going to write two pages.

[1628] That's it.

[1629] They respect the part of them that made that decision; it's a respected character in their brain.

[1630] And they say, well, that's, I decided it, so I'm going to do it.

[1631] The procrastinator won't do those two pages.

[1632] That's just exactly the kind of thing.

[1633] The procrastinator will keep on their list and they will not do.

[1634] But it doesn't mean they're any less talented than the writer who does the two pages.

[1635] It doesn't mean they wanted any less.

[1636] Maybe they want it even more.

[1637] And it doesn't mean that they wouldn't be just as happy having done it as the writer who does it.

[1638] So what they're missing out on: picture a writer who writes 10 books, bestsellers, and goes on these book tours, and is just so gratified with their career. And think about what the other person is missing who does none of that, right?

[1639] So that is a massive loss, a massive loss.

[1640] And it's because the internal mechanism in their brain is not doing what the other person's is,

[1641] in that they don't have the respect for the part of them that made the choice; they feel like it's someone they can disregard.

[1642] And so, to me, it's in the same boat as someone who is obese because their eating habits make them obese over time, or their exercise habits. That's a huge loss for that person.

[1643] That person has, you know, the health problems, and it's just probably making them miserable.

[1644] And it's self -inflicted, right?

[1645] It's self-defeating, but that doesn't make it an easy problem to fix just because you're doing it to yourself.

[1646] So to me, procrastination is another one of these where you are the only person in your own way.

[1647] You are, you know, you are failing at something or not doing something that you really want to do.

[1648] You know, it doesn't have to be work.

[1649] Maybe you want to get out of that marriage; it hits you, you realize you shouldn't be in this marriage.

[1650] You should get divorced.

[1651] And you wait 20 extra years before you do it, or you don't do it at all.

[1652] That is, you know, you're not living the life that you know you should be living, right?

[1653] And so I think it's fascinating.

[1654] Now, the problem is it's also a funny problem because there's short -term procrastination, which I talk about as, you know, the kind that has a deadline.

[1655] Now, this is when I bring in the different characters: there's the panic monster, who comes into the room, and that's when the procrastinator actually, you know, there's different levels.

[1656] There's the kind that, even when there's a deadline, they've stopped panicking.

[1657] They just, they've given up and they, they really have a problem.

[1658] Then there's the kind that when there's a deadline, they'll do it, but they'll wait to the last second.

[1659] Both of those people, I think, have a huge problem once there's no deadline.

[1660] Because for most of the important things in life, there's no deadline: changing your career, becoming a writer when you never have been before, getting out of your relationship, or making the changes you need to make in order to get into a relationship.

[1661] There's another one: launching a startup.

[1662] Launching a startup, right?

[1663] Or once you've launched a startup, firing someone that needs to be fired, right?

[1664] Yes.

[1665] I mean, going out for fundraising. You know, there are so many moments when the big change that you know you should be making, that would completely change your life if you just did it, has no deadline.

[1666] It just has to be coming from yourself.

[1667] And I think that a ton of people

[1668] have a problem where they live under this delusion that, you know, I'm going to do that.

[1669] I'm definitely going to do that, you know, but not this week, not this month, not today, because whatever.

[1670] And they make this excuse again and again, and it just sits there on their list, collecting dust.

[1671] And so, yeah, to me, it is very real suffering.

[1672] And the fix isn't just fixing the habits.

[1673] I'm still working on the fix, first of all.

[1674] Okay, so say you have a boat that sucks, and it's leaking, and it's going to sink.

[1675] You can fix it with duct tape for a couple of rides, or one ride or whatever.

[1676] That's not really fixing the boat, but it can get you by.

[1677] So there's duct tape solutions.

[1678] To me, so the panic monster is the character that rushes into the room once the deadline gets too close or once there's some scary external pressure, not just from yourself.

[1679] And that's a huge aid to a lot of procrastinators.

[1680] Again, there's a lot of people who won't, you know, do that thing.

[1681] They won't write that book they've been wanting to write.

[1682] But there's way fewer people who will not show up to the exam.

[1683] You know, most people show up to the exam.

[1684] So that's because the panic monster is going to freak out if they don't.

[1685] So you can create a panic monster.

[1686] If you want to, you know, you really want to write music, you really want to become a singer -songwriter.

[1687] Well, book a venue.

[1688] Tell 40 people about it and say, hey, on this day two months from now, come and see me, I'm going to play you some of my songs.

[1689] You now have a panic monster, and you're going to write songs.

[1690] You're going to have to, right?

[1691] So there's duct tape things.

[1692] You know, you can do things.

[1693] You know, people do, I've done a lot of this with a friend, and I say, if I don't get X done by a week from now, I have to donate a lot of money somewhere I don't want to donate.

[1694] And that's, you would put that in the category of duct tape solutions.

[1695] Yeah.

[1696] Because it's not, why do I need that, right?

[1697] If I really have solved this, this is something I want to do for me. It's selfish.

[1698] I just literally want to be selfish here and do the work I need to do to get the goals I want, right?

[1699] All the incentives there should be in the right place.

[1700] And yet, if I don't say that, it'll be a week from now and I won't have done it.

[1701] Something weird is going on.

[1702] There's some resistance.

[1703] There's some force that is preventing me, that is in my own way, right?

[1704] And so doing something where I have to pay all this money, okay, now I'll panic and do it.

[1705] So that's duct tape.

[1706] Fixing their boat is something where I don't have to do that.

[1707] I'll just do the things that I intend to; again, I'm not talking about a super crazy work ethic.

[1708] Just like, for example, okay, I have a lot of examples because I have a serious problem that I've been working on.

[1709] And in some ways, I've gotten really successful at solving it.

[1710] In other ways, I'm still floundering.

[1711] Yeah, the world's greatest duct taper.

[1712] Yes.

[1713] Well, I'm pretty good at duct taping.

[1714] I probably could be even better, and I'm like, and I'm, and I'm, you're procrastinating and becoming a better duct tape breathing.

[1715] Literally, like, yes, there's nothing, I won't.

[1716] So, here's what I know I should do as a writer, right?

[1717] It's very obvious to me, is that I should wake up.

[1718] It doesn't have to be crazy.

[1719] I don't have to be up at six a.m. or anything insane; I'm not going to be one of those crazy people with the 5:30 jogs.

[1720] I'm going to wake up at whatever, you know, 7:38, 8:30, and I should have a block, like, just say nine to noon,

[1721] where I get up and I just really quick, make some coffee, and write.

[1722] It's obvious because all the great writers in history did exactly that.

[1723] Some of them have done that.

[1724] That's common.

[1725] There's some, I like these writers,

[1726] who do the late night sessions.

[1727] But most of them,

[1728] most of them do a morning session.

[1729] Most writers write in the morning, and there's a reason.

[1730] I don't think I'm different than those people.

[1731] It's a great time to write.

[1732] You're fresh, right?

[1733] Your ideas from dreaming have kind of collected.

[1734] You have all these new answers that you didn't have yesterday, and you can just go.

[1735] But more importantly, if I just had a routine where I wrote from nine to noon on weekdays, every week would have a minimum of 15 focused hours of writing, which doesn't sound like a lot, but it's a lot.

[1736] Fifteen, no, this is no joke.

[1737] This is, you know, you're not, your phone's away.

[1738] You're not talking to anyone.

[1739] You're not opening your email.

[1740] You are focused, writing for three hours, five days.

[1741] That's a big week for most writers, right?

[1742] So now what's happening is that every weekday is a minimum of a B, the grade I'll give myself.

[1743] I know an A might be, you know, wow, I really just got into a flow and wrote for six hours and had, you know, great.

[1744] But it's a minimum of a B. I can keep going if I want.

[1745] And every week is a minimum of a B; that's 15 hours.

[1746] Right.

[1747] And we just talked about compiling.

[1748] This is the two pages a week.

[1749] If I just did that every week, I'd achieve all my writing goals in my life.

[1750] And yet I wake up, and most days, either I've been revenge-procrastinating late at night and gone to bed way too late, then woken up later and gotten on a bad schedule, I just fall into these bad schedules, or I'll wake up and say I was going to do a few emails, and I'll open it up, and suddenly I'm texting, and I'll make a phone call and be on phone calls for three hours.

[1751] It's always something.

[1752] Yeah, yeah.

[1753] Or I'll start writing and then hit a little bit of a wall, but because there's no "this is a sacred writing block," I'll just hit the wall and say, well, this is icky, and go do something else.

[1754] So, duct tape: what I've done is, Wait But Why has one employee, Alicia.

[1755] She's the manager of lots of things.

[1756] That's her role.

[1757] She truly does lots of things.

[1758] And one of the things we started doing is either she comes over and sits next to me where she can see my screen from 9 to noon.

[1759] That's all it takes.

[1760] The thing about procrastinators is usually they're not kicking and screaming,

[1761] I don't want to do this.

[1762] It's the feeling of, you know, in the old days when you had to go to class, your lunch block is over and it's like, oh, shit, I have class in five minutes.

[1763] Or it's Monday morning.

[1764] You go, oh, yeah.

[1765] But, you know, you go.

[1766] You say, okay, and then you get to class and it's not that bad once you're there, right?

[1767] You know, you have a trainer and he says, okay, next set.

[1768] And you go, okay, and you do it.

[1769] That's all it is.

[1770] It's someone, some external thing being like, okay, I have to do this.

[1771] And then you have that moment of like, it sucks, but I guess I'll do it.

[1772] If no one's there, though, the problem with the procrastinator is they don't have that person in their head.

[1773] Other people, I think were raised with a sense of shame if they don't do stuff.

[1774] And that stick in their head is hugely helpful.

[1775] I don't really have that.

[1776] And so anyway, Alicia is sitting there next to me. She's doing her own work, but she can see my screen, and she, of all people, knows exactly what I should be doing and what I shouldn't be doing.

[1777] That's all it takes.

[1778] The shame of just having her see me sitting there not working would just be too weird and too embarrassing.

[1779] So I get it done, and it's amazing.

[1780] It's like game changer for me. So duct tape can solve.

[1781] Sometimes duct tape is enough, but I'm still trying to figure out: what is going on?

[1782] Yeah.

[1783] I think part of it

[1784] is that we are actually wired this way.

[1785] I think I'm being a very sane human, actually, is what's happening.

[1786] Well, sane is not the right word.

[1787] I'm being a natural human; we are not programmed to sit there and do homework of a certain kind where we get the results six months later.

[1788] We're supposed to conserve energy, fulfill our needs as we need them, and do immediate things.

[1789] And we're overriding our natural ways when we wake up and get to it.

[1790] And I think sometimes because the pain, I think a lot of times we're just avoiding suffering and for a lot of people, the pain of not doing it is actually worse because they feel shame.

[1791] So if they don't get up early, take a jog, and get to work, they'll feel like a bad person.

[1792] And that is worse than doing those things.

[1793] And then it becomes a habit eventually and it becomes just easy automatic.

[1794] It becomes I do it because that's what I do.

[1795] But I think that if you don't have a lot of shame, necessarily, the pain of doing those things is worse in the immediate moment than not doing it. And yet.

[1796] But I think there's this feeling that you've captured with your body language, the "I don't want to do another set" feeling. The people I've seen that are good at not procrastinating are the ones that have trained themselves so that the moment they would be having that feeling...

[1797] They just, it's like Zen, like Sam Harris style, Zen, you don't experience that feeling.

[1798] Yeah.

[1799] You just march forward.

[1800] Like, I talked to Elon about this a lot, actually, offline.

[1801] It's like he doesn't have this.

[1802] No, purely not.

[1803] The way he talks about it, and the way I think about it, is it's like you just pretend you're a machine running an algorithm.

[1804] Like, you know this, you should be doing this.

[1805] Not because somebody told you or so on, this is probably the thing you want to do.

[1806] Like, look at the big picture of your life and just run the algorithm.

[1807] Like, ignore your feelings, just run the algorithm.

[1808] Just framing, frame it differently.

[1809] Yeah.

[1810] You know, yeah, you can frame it as like, it can feel like homework.

[1811] or it can feel like you're living your best life or something when you're doing your work.

[1812] Yeah.

[1813] And maybe you reframe it.

[1814] But I think ultimately is whatever reframing you need to do, you just need to do it for a few weeks.

[1815] And that's how the habit is formed and you stick with it.

[1816] Like I've, I'm now on a kick where I exercise every day.

[1817] It doesn't matter what that exercise is.

[1818] It's not serious.

[1819] It could be 200 push -ups.

[1820] But the thing is, like, I make sure I exercise every day, and it's become way, way easier because of the habit.

[1821] And at least with exercise, because it's easier to replicate that feeling,

[1822] I don't allow myself to go, like, I don't feel like doing this.

[1823] Right.

[1824] Well, I think about that, even just like little things.

[1825] Like, I brush my teeth before I go to bed.

[1826] And it's just a habit.

[1827] Yeah.

[1828] And it is effort.

[1829] Like, if it were something else, I would be like, I don't want to go to the bathroom

[1830] and do that.

[1831] I just want to, like, I'm just going to lie down right now.

[1832] But it doesn't even cross my mind.

[1833] It's just like that I just robotically go and do it.

[1834] And it almost has become like a nice routine.

[1835] It's like, oh, this part of the night.

[1836] You know, morning routine stuff for me is kind of just, like, automated.

[1837] It's funny because you don't like go, like I don't think I've skipped many days.

[1838] I don't think I skipped any days brushing my teeth.

[1839] Like unless I didn't have a toothbrush like I was in the woods or something.

[1840] And what is that?

[1841] Because it's annoying.

[1842] To me, the character that makes me procrastinate is the instant gratification monkey.

[1843] That's what I've labeled him, right?

[1844] And there's the rational decision maker and the instant gratification monkey and these battle with each other.

[1845] But for procrastinator, the monkey wins.

[1846] I think, you know, you read about this kind of stuff; I think that this kind of more primitive brain is always winning.

[1847] And in the non-procrastinator, that primitive brain is on board for some reason and isn't resisting.

[1848] So, but when I think about brushing my teeth, it's like the monkey doesn't even think there's an option to not do it.

[1849] So it doesn't even, like, get, there's no hope.

[1850] The monkey has no hope there.

[1851] So it doesn't even, like, get involved.

[1852] And it's just like, yeah, yeah, yeah, no, we have to.

[1853] Just, like, kind of, like, robotically, just like, you know, it's kind of like Stockholm syndrome, just like, oh, no, no, no, I have to do this.

[1854] It doesn't even, like, wake up.

[1855] It's like, yeah, we're doing this now.

[1856] For other things, the monkey's like, ooh, no, no, no, most days I can win this one.

[1857] And so the monkey puts up that, like, fierce resistance.

[1858] And it's like, it's a lot of it's, like, the initial transition.

[1859] So I think of it as, like, jumping in a cold pool, where I will spend the whole day pacing around the side of the pool in my bathing suit, just being like, I don't want to have that one second when you first jump in and it sucks.

[1860] And then once I jump in, once I start writing, suddenly I'm like, oh, this isn't so bad.

[1861] Okay, I'm kind of into it.

[1862] And then sometimes you can't tear me away; I suddenly get into a flow.

[1863] So it's like once I get into cold water, I don't mind it.

[1864] But I will spend hours standing around the side of the pool.

[1865] And by the way, I do this in a more literal sense.

[1866] When I go to the gym with a trainer, in 45 minutes, I do a full, full -ass workout.

[1867] And it's not because I'm having a good time, but it's because it's that, oh, okay, I have to go to class feeling, right?

[1868] But when I go to the gym alone, I will literally do a set and then dick around on my phone for 10 minutes before the next set.

[1869] And I'll spend over an hour there and do way less.

[1870] So it is the transition.

[1871] Once I'm actually doing the set, I'm never like, I don't want to stop in the middle.

[1872] And now it's just like, I'm going to do this.

[1873] And I feel happy I just did it.

[1874] So there's something about transitions that is very hard. That's why procrastinators are late to a lot of places.

[1875] I will procrastinate getting ready to go to the airport, even though I know I should leave at three so I can not be stressed.

[1876] I'll leave at 3:36 and now be super stressed.

[1877] Once I'm on the way to the airport, immediately, I'm like, why didn't I do this earlier?

[1878] Now I'm back on my phone doing what I was doing.

[1879] I just had to get in the damn car or whatever.

[1880] So, yeah, there's some very, very odd.

[1881] Irrational, yeah. Like, I was waiting for you to come, and you said that you're running a few minutes late, and I was like, I'll go get you a coffee, because I can't possibly be the one who's early, right? I don't understand it. I'm always late to stuff, and I know it's disrespectful in the eyes of a lot of people. I can't help it. It's not like I don't care about the people; often, like for this conversation, I'd be preparing more, right? I obviously care about the person, but it gets interpreted differently. I mean, there are some people that show up late because they kind of like that quality in themselves, and that's a dick move, right? There are a lot of those people. But more often it's someone who shows up frazzled, and they feel awful, and they're furious at themselves. Yeah, so regretful. Exactly, I mean, that's me. And also, all you have to do is look at those people alone running through the airport, right? They're not being disrespectful to anyone there. They just inflicted this on themselves. This is hilarious.

[1882] You've tweeted a quote by James Baldwin saying, quote, I imagine one of the reasons people cling to their hates so stubbornly is because they sense once hate is gone, they will be forced to deal with the pain.

[1883] What has been a painful but formative experience in your life?

[1884] What's the flavor, the shape of your pain that fuels you?

[1885] I mean, honestly, the first thing that jumped to mind is my own battles against myself to get my work done, because it affects everything. I just took five years on this book, and granted, it's a beast, it probably needed two or three years, but it didn't need to take five. And that was a lot of, not just that I'm not working, it's that I'm over-researching, I'm adding in things I shouldn't because I'm a perfectionist, you know, being a perfectionist about, like, oh, well, I learned that, now I want to get it in there, when I know I'm going to end up cutting it later. Or I over-outline, you know, trying to get it perfect when I know that's not possible.

[1886] Just making a lot of immature kind of mistakes. And, like, I'm not actually that much of a writing amateur.

[1887] I've written, including my old blog.

[1888] I've been a writer for 15 years.

[1889] I know what I'm doing.

[1890] I could advise other writers really well.

[1891] And yet, I do a bunch of amateur things, and I know while I'm doing them that I'm being an amateur.

[1892] So, A, it hurts the actual product.

[1893] And B, it wastes your precious time.

[1894] And C, when you're mad at yourself, when you're in a negative, you know, self-defeating spiral, you'll almost inevitably be less good to others.

[1895] Like, you know, early on in my now marriage, one of the things we always used to do is I would plan mystery dates.

[1896] You know, New York City, great, great place for this.

[1897] I'd find some weird little adventure for us, you know, it could be anything.

[1898] And I wouldn't tell her what it was.

[1899] I said, I'm reserving you for Thursday night, you know, at seven, okay?

[1900] And it was such a fun part of our relationship.

[1901] Started writing this book and got into a really bad, you know, personal space where it was like, in my head, I was like, I can't do anything until this is done.

[1902] You know, like, no. And I just stopped like ever valuing like joy of any kind.

[1903] Like I was like, no, no, that's when I'm done.

[1904] And that's a trap, because I always think it's going to be six months away.

[1905] But actually five years later, I'm like, wow, I really wasn't living fully.

[1906] And five years is a lot. We don't live very long.

[1907] Like, you talk about your prime decades, that's like a sixth of my prime years. Like, wow, that's a huge loss. So to me, that was excruciating. And it was a bad, um, pattern, a very unproductive, unhelpful pattern for me, which is: I'd wake up in the morning in this great mood. Great mood every morning, wake up thrilled to be awake, I have the whole day ahead of me, I'm going to get so much work done today, but, you know, first I'm going to do all these other things, and it's all going to be great. And then I end up kind of failing for the day with those goals, sometimes miserably, sometimes only partially. And then I get in bed, probably a couple hours later than I want to, and that's when all the reality hits me. Suddenly, so much regret, so much anxiety, furious at myself, wishing I could take a time machine back three months, six months, a year, or even just to the beginning of that day.

[1908] And just tossing and turning now, I mean, this is a very bad place.

[1909] That's what I said, suffering.

[1910] Procrastinators suffer in a very serious way.

[1911] So look, I, you know, I know this probably sounds like a lot of like first world problems.

[1912] And it is.

[1913] But it's real suffering as well.

[1914] So to me, it's painful because you're not being as good a friend or a spouse or whatever as you could be.

[1915] You're also not treating yourself very well.

[1916] You're usually not being very healthy in these moments, you know.

[1917] And you're not being, I'm not being good to my reader.

[1918] So it's just a lot of this.

[1919] And it's like, it feels like it's

[1920] one small tweak away.

[1921] Sometimes it's like, that's what I said.

[1922] It's like, you suddenly are just doing that nine to twelve, and you get in that habit, everything else falls into place.

[1923] All of this reverses.

[1924] So I feel hopeful, but I haven't fixed the boat yet.

[1925] I have some good duct tape.

[1926] And you also don't want to romanticize it, because it is true that some of the greats in history, especially writers, suffered from all the same stuff.

[1927] Like, they weren't quite able to fix it. I mean, they might only write for two or three hours a day, but the rest of the day is often spent, you know, kind of tortured.

[1928] Well, right.

[1929] This is the irrational thing.

[1930] This goes for a lot of people's jobs, people especially who work for themselves.

[1931] You'd be shocked how much you could do. Wake up at nine or eight or seven or whatever.

[1932] Get to work and stop at one, but you're really focused in those hours.

[1933] One or two.

[1934] And do 25 really focused hours of productive stuff a week.

[1935] And then there's 112 waking hours in the week, right?

[1936] So we're talking about 80-something hours

[1937] of free time.

[1938] You can live, you know, if you're just really disciplined about the, you know, yin and yang of your time. That's my goal: black and white time.

[1939] Really focused time, and then totally, like, clean-conscience free time.

[1940] Right now I have neither.

[1941] It's a lot of gray.

[1942] It's a lot of, I should be working, but I'm not, oh, I'm wasting this time.

[1943] This is bad.

[1944] And that's just massive.

[1945] So if you can just get really good at, uh, the black and the white, so you just wake up and it's just like full work.

[1946] And then I think a lot of people could have like all this free time.

[1947] But instead, I'll do those same three hours.

[1949] It's like you said, I'll do them really late at night or whatever after having tortured myself the whole day and not had any fun.

[1950] It's not like I'm having fun.

[1951] I call it the dark playground, by the way, which is where you are when you know you should be working, but you're doing something else.

[1952] You're doing something fun on paper, but it feels awful.

[1953] And so, yeah, I spent a lot of time in the dark playground.

[1954] And you know, you shouldn't be doing it and you still do it.

[1955] And yeah.

[1956] It's not clean conscience, it's bad.

[1957] It's toxic.

[1958] And I think there's something about, you know, you're draining yourself all the time. And if you just did your focused hours, and then if you actually have good, clean fun, and fun can be anything: reading a book, hanging out with someone, going and doing something cool in the city, you know, that is critical. You're recharging some part of your psyche there, and I think it makes it easier to actually work the next day. And I say this from experience, from when I have had, you know, good stretches. You know what it is? It's like one part of your brain is fist pounding the other part, like, we treat ourselves well, is how you internally feel. And it's like, yeah, now it's work time, and then later you're like, now it's play time, and then it's like, okay, back to work. And you're in this very healthy, like, parent-child relationship in your head, versus this constant conflict where the kid doesn't respect the parent and the parent hates the kid. And you're right, it always feels like it's one fix away, so there's hope. I mean, so much of what you said just rings so true. I guess I have the same kind of hope, but, you know, this podcast is very regular.

[1959] I mean, I'm impressed.

[1960] And I think there is a bit of a duct tape solution here, because it's always easy to schedule stuff for the future for myself, right?

[1961] Because that's future Tim, and future Tim is not my problem.

[1962] So I'll schedule all kinds of shit for future Tim, and then I will not do it.

[1963] But in this case, you can schedule podcasts and you have to show up.

[1964] Yeah, you have to show up.

[1965] Right.

[1966] It seems like a good medium for a procrastinator.

[1967] But this is not my, this is what I do for fun.

[1968] I know, but at least this is the kind of thing, especially if it's not your main thing.

[1969] Especially if it's not your main thing, it's the kind of thing that you would dream of doing and want to do and never do.

[1970] And I feel like your regular production here is a sign that something is working, at least in this regard.

[1971] Yeah, in this regard, but this, I'm sure you have this same kind of thing with the podcast.

[1972] In fact, because you're going to be doing the podcast, it's possible the podcast becomes what the podcast is for me. This is you procrastinating.

[1973] If you think about being 80, and if you can get into that person's head and look back and be like, just deep regret, you know, yearning, you would do anything to just go back and have done this differently, that is desperation.

[1974] It's just you don't feel it yet.

[1975] It's not in you yet.

[1976] The other thing you could do is, if you have a partner, if you partner with someone, now you could say, we meet these 15 hours every week.

[1977] And at that point, you're going to get it done.

[1978] So working with someone can help.

[1979] Yeah, that's why they say, like, a co-founder is really powerful for many reasons, but that's kind of one of them.

[1980] Because for the startup case, unlike writing perhaps, it's really like a hundred-hour-plus thing.

[1981] Like, once you really launch, you go all in.

[1982] Like, everything else just disappears.

[1983] Like, you can't even have a hope of a balanced life for a little bit.

[1984] So there, a co-founder really helps.

[1985] That's the idea.

[1986] You're one of the most interesting people on the internet.

[1987] So as a writer, you look out into the future, do you dream about certain things you want to still create?

[1988] Are there projects that you want to write?

[1989] Are there movies you want to write or direct?

[1990] Endless.

[1991] So it's just an endless sea of ideas.

[1992] No, there's a specific list of things that really excite me, but it's a big list, and I know I'll never get through them all.

[1993] And that's part of why the last five years really bothered me. You know, when I feel like I'm not moving as quickly as I could, it bothers me, because I have so much genuine excitement to try so many different things, and I get so much joy from finishing things.

[1994] I don't like doing things, but a lot of writers are like that.

[1995] Publishing something is hugely joyful and makes it all worth it, you know, or just finishing something you're proud of, putting it out there, and having people appreciate it.

[1996] It's like the best thing in the world, right?

[1997] You know, every kid makes a little bargain with themselves, has a little, you know, dream or something.

[1998] And I feel like when I make something, you know, for me it's been mostly writing, and I feel proud of it and I put it out there, I feel like, again, I'm, like, fist pounding my seven-year-old self. Like, I owe it to myself to do certain things, and I just did one of the things I owed. I just paid off some debt to myself. I owed it and I paid it, and it feels great. You just feel a lot of inner peace when you do. So the more things I can do, you know, and I just have fun doing it, right?

[1999] And so, for me, that includes a lot more writing.

[2000] I just, you know, short, well, not short blog posts.

[2001] I write very long blog posts.

[2002] But basically, short writing in the form of long blog posts is a great medium.

[2003] I love that medium.

[2004] I want to do a lot more of that.

[2005] Books yet to be seen.

[2006] I'm going to do this.

[2007] And I'm going to have another book.

[2008] That I'm going to do right after.

[2009] And we'll see if I like those two.

[2010] And if I do, I'll do more.

[2011] Otherwise, I won't.

[2012] But I also want to try other mediums.

[2013] I want to, um, I did a little travel series once.

[2014] I love doing that.

[2015] I want to do, you know, more of that.

[2016] Almost like a little

[2017] vlog? No, it was, um, I let readers in a survey pick five countries they wanted me to go to. That's awesome. And they picked, they sent me to weird places. I went to, uh, Siberia, I went to Japan, I went from there, this is all in a row, to Nigeria, from there to Iraq, and from there to Greenland, and then I went back to New York. Uh, like two weeks in each place. Um, and in each one I got to, you know, have some weird experiences.

[2018] I tried to, like, really dig in and have, like, you know, some interesting experiences.

[2019] And then I wrote about it.

[2020] And I taught readers a little bit about the history of these places.

[2021] And it was just, I love doing that.

[2022] I love, right.

[2023] So, you know, and I'm like, oh, man, like, I haven't done one of those in so long.

[2024] And then, um, then I have a big, like, desire to do fictional stuff.

[2025] Like, I want to write a sci-fi at some point.

[2026] And I would love to, um, write a musical.

[2027] That's actually what I was doing before Wait But Why.

[2028] I was with a partner, Ryan Langer, um, we were halfway through a musical, and he got tied up with his other musical, and Wait But Why started taking off, and we just haven't gotten back to it.

[2029] But it's such a fun medium.

[2030] It's such a silly medium, but it's so fun.

[2031] So you think about all of these mediums on which you can be creative and create something, and you like the variety of it.

[2032] Yeah, it's just that if there's a chance that on a new medium I could do something good, I want to do it.

[2033] I want to try it.

[2034] It sounds like so gratifying and so fun.

[2035] I think it's fun to just watch you.

[2036] actually sample these.

[2037] So I can't wait for your podcast.

[2038] I'll be listening to all of them.

[2039] I mean, that's a cool medium to see, like, where it goes.

[2040] The cool thing about podcasting and making videos, especially with a super creative mind like yours, is that you don't really know what you're going to make of it until you try it.

[2041] Yeah, podcasts I'm really excited about, but I'm like, I like going on other people's podcasts.

[2042] Yeah.

[2043] And I never try to have anyone on. So with every medium, there are the challenges of how the sausage is made.

[2044] So, like, the challenges of, the challenge of action, yeah. But it's also, I'll go on, like, as you know, long-ass monologues, and you can't do that if you're the interviewer, like, you're not supposed to do that as much. So I have to, like, rein it in, and, um, that might be hard, but we'll see. You could also do solo-type stuff. Yeah, maybe I'll do a little of each. You know, it's funny, I mean, some of my favorites are more like solo, but there's, like, a sidekick, so you're having a conversation, but you're, like, friends, but it's really you ranting, which I think you'd be extremely good at. It's funny, yeah. Or even if it's 50-50, that's fine, like if it's just a friend who I want to really riff with. Um, I just don't like interviewing someone, which I won't, that's not what the podcast will be. But I can't help it. I've tried moderating panels before, and I cannot help myself, I have to get involved, and no one likes a moderator who's too involved. It's very unappealing. So, um, you know, interviewing someone, I'm like, I can't, it's just not my thing. I can grill someone, that's different, that's my curiosity being like, wait, how about this?

[2045] And I interrupt them, and I'm trying to.

[2046] I see the way your brain works.

[2047] It's hilarious.

[2048] It's awesome.

[2049] It's like lights up with fire and excitement.

[2050] Yeah, actually, I love listening.

[2051] I like watching people.

[2052] I like listening to people.

[2053] Yeah.

[2054] So this is like me right now, just listening to a podcast.

[2055] This is me listening to your podcast.

[2056] I love listening to a podcast because then it's not even like, but once I'm in the room, I suddenly can't help myself, like jumping in.

[2057] Okay.

[2058] Big last ridiculous question.

[2059] What is the meaning of life?

[2060] The meaning of like an individual life?

[2061] your existence here on earth, or maybe broadly, this whole thing we got going on, descendants of apes, busily creating.

[2062] Yeah, well, there's, yeah, for me, I feel like I want to be around as long as I can.

[2063] If I can do some kind of crazy life extension or upload myself, I'm going to, because who doesn't want to see how cool the year 3000 is?

[2064] You did say mortality was not appealing.

[2065] No, it's not appealing at all to me. Now, mortality is ultimately appealing, as I said, because no one wants eternal life, I believe,

[2066] if they understood what eternity really was.

[2067] And I did a post on Graham's number, and I was like, okay, no one wants to live that many years.

[2068] But I'd like to choose.

[2069] I'd like to say, you know, I'm truly over it now. And, you know, at that point, our whole society would have, like...

[2070] We'd have a whole process of someone signing off.

[2071] And, you know, it would be beautiful, and it wouldn't be sad.

[2072] Well, I think you'd be super depressed by that point.

[2073] Like, who's going to sign off when they're doing pretty good?

[2074] Maybe, maybe, yes.

[2075] Okay, maybe it's dark.

[2076] But the point is, if I'm happy, I can stay around. You know, I'm thinking 50

[2077] centuries sounds great.

[2078] I don't know if I want more than that.

[2079] 50 sounds like the right number.

[2080] And so if you're thinking, if you would sign up for 50 if you had a choice, one is what I get.

[2081] That is bullshit.

[2082] If you want, if you're somebody who wants 50, one is a hideous number, right?

[2083] Anyway, so for me personally, I want to be around as long as I can.

[2084] And then honestly, the reason I love writing, the thing that I love most is like, is like a warm, fuzzy connection with other people, right?

[2085] And that can be my friends and it can be readers.

[2086] And that's why I would never want to be like a journalist, where the personality is hidden behind the writing, or even a biographer. You know, there are a lot of people who do that as great writers, but I like to personally connect.

[2087] And if I can take something that's in my head and other people can say, oh my God, I think that too.

[2088] And this made me feel so much better.

[2089] It made me feel seen.

[2090] Like that feels amazing.

[2091] And I just feel like we're all having such a weird common experience on this one little rock in this one little moment of time.

[2092] We're this weird, these weird four -limbed beings.

[2093] And we're all the same.

[2094] And it's like, we all share the human experience. So I feel like so many of us suffer in the same ways, and we're all going through a lot of the same things. And to me, it is very... if I were on my deathbed and I felt like I had a ton of human connection, and, like, shared a lot of common experience, and made a lot of other people feel, uh, not alone. Do you feel that as a writer? Do you, like, hear and feel the inspiration, like, all the people that you make smile and all the people you inspire?

[2095] Honestly, not enough. Sometimes, you know, when we do an in-person event and I, you know, meet a bunch of people, it's incredibly gratifying.

[2096] Or, you know, you get emails. But I think it is easy to forget how many people are out there, because you're just sitting there alone typing.

[2097] Yeah.

[2098] Dealing with your procrastination.

[2099] But that's why publishing is so gratifying because that's the moment when all this connection happens.

[2100] Yeah.

[2101] And especially, if I had to put my finger on it, it's having a bunch of people who feel lonely in their existence, you know, all connect, right?

[2102] So that, if I do a lot of that. And that includes, of course, actually spending, you know, a lot of really high-quality time with friends and family, and making the whole thing, as heartbreaking as mortality and life can be, making the whole thing fun, so at least we can laugh at ourselves together while going through it.

[2103] Yeah.

[2104] And that to me is that, yeah.

[2105] And then your last blog post will be written from Mars as you get the bad news that you're not able to return because of the malfunction in the rocket.

[2106] Yeah.

[2107] I would like to go to Mars and like go there for a week.

[2108] and be like, yeah, here we are, and then come back.

[2109] No, I know that's what you want.

[2110] Staying there, yeah.

[2111] And that's fine, by the way.

[2112] If I, yeah, so you're picturing me alone on Mars as the first person there, and then it malfunctions.

[2113] Right, no, you were supposed to return, but it malfunctions.

[2114] And then there's, so it's both the hope, the awe that you experience, which is how the blog post starts.

[2115] And then it's the overwhelming, like, feeling of existential dread.

[2116] But then it returns to, like, the love of humanity.

[2117] Well, that's the thing is if I could be writing and actually like writing something that people would read back on Earth, it would make it feel so much better.

[2118] You know, if I were just alone and no one was going to realize what happened.

[2119] No, no, you get to write.

[2120] Yeah, no, it's perfect.

[2121] Also, that would bring out great writing.

[2122] Yeah, I think so.

[2123] On your deathbed on Mars alone.

[2124] I think so.

[2125] Yeah.

[2126] Well, that's exactly the future, I hope for you, Tim.

[2127] All right, this was an incredible conversation.

[2128] You're a really special human being, Tim.

[2129] Thank you so much for spending your really valuable time with me. I can't wait to hear your podcast.

[2130] I can't wait to read your next blog post, which you said in a Twitter reply.

[2131] You'll get more after the book, so add that to the long list of ideas to procrastinate on.

[2132] Tim, thanks so much for talking today, man. Thank you.

[2133] Thanks for listening to this conversation with Tim Urban.

[2134] To support this podcast, please check out our sponsors in the

[2135] description.

[2136] And now, let me leave you with some words from Tim Urban himself.

[2137] Be humbler about what you know, more confident about what's possible, and less afraid of things that don't matter.

[2138] Thanks for listening and hope to see you next time.