The Joe Rogan Experience XX
[0] Joe Rogan podcast, check it out.
[1] The Joe Rogan Experience.
[2] Train by day, Joe Rogan podcast by night, all day.
[3] Good to see you.
[4] Glad to be on the show.
[5] My pleasure.
[6] Thanks for having me. My pleasure.
[7] What's cracking?
[8] How you doing?
[9] Doing all right.
[10] We were just talking about how you're still trapped in L.A. I'm still trapped in L.A. I know.
[11] Are you friends with a lot of people out here?
[12] Have you thought about jettisoning?
[13] I talk about it all the time.
[14] But, you know, talk is often a substitute for action.
[15] Does it lead to action or does it end up substituting for action?
[16] That's a good point.
[17] But I have endless conversations about leaving.
[18] I moved from San Francisco to L.A. back in 2018; that felt about as big a move away as possible.
[19] And again, I have to be careful my talk isn't just a substitute for action.
[20] The extreme thing I keep saying is I can't decide whether to leave the state or the country.
[21] Oh, boy.
[22] And if you went out of the country, where would you go?
[23] Man, it's tough to find places, because, you know, there are a lot of problems in the U.S., and most places are doing so much worse.
[24] Yeah.
[25] It's not a good move to leave here.
[26] But as fucked up as this place is...
[27] But I keep, I keep thinking I shouldn't move twice.
[28] So I should either... I can't decide whether I should move to Florida or move to, you know, New Zealand or Costa Rica or something like that.
[29] Yeah.
[30] Yeah.
[31] Go full John McAfee.
[32] And so I can't decide between those two, so I end up stuck in California.
[33] Well, Australia is okay, but they're even worse when it comes to the rule of law and what they decide to make you do and the way they're cracking down on people now for online speech.
[34] And it's very sketchy in other countries.
[35] But somehow the relative outperformance of the U.S. and the absolute stagnation or decline of the U.S. are actually related things, because of the way the conversation's grooved: every time I tell someone, you know, I'm thinking about leaving the country, they'll do what you just did, and they'll say, well, every place is worse.
[36] And then that somehow distracts us from all the problems in this country.
[37] And then we can't talk about what's gone wrong in the U.S. because, you know, everything's so much worse everywhere else.
[38] Well, I think most people know what's gone wrong, but they don't know if they're on the side of the government that's currently in power.
[39] They don't know how to criticize it.
[40] They don't know exactly what to say, what should be done.
[41] And they're ideologically connected to this group being correct.
[42] Right.
[43] So they try to do mental gymnastics to try to support some of the things that are going on.
[44] I think that's part of the problem.
[45] I don't think it's necessarily that we don't know what the problems are.
[46] We know what the problems are.
[47] but we don't have clear solutions as to how to fix them, nor do we understand the real mechanisms of how they got there in the first place.
[48] Yeah, I mean, there are a lot that are pretty obvious to articulate, and they're much easier described than solved.
[49] Like we have a crazy, crazy budget deficit.
[50] Yeah.
[51] And presumably, you have to do one of three things.
[52] You have to raise taxes a lot.
[53] You have to cut spending a lot, or you're just going to keep borrowing money.
[55] Isn't there like some enormous amount of our taxes that just goes to the deficit?
[56] It's not that high, but it's gone up a lot. What is it? I thought it peaked at, I want to say, 3.1% of GDP, which is, you know, maybe 15 or 20% of the budget. It peaked at 3.1% of GDP in 1991, and then it went all the way down to something like 1.5% in the mid-2010s, and now it's crept back up to 3.1, 3.2%.
[57] And so we are at all -time highs as a percentage of GDP.
[58] And the way to understand the basic math is the debt went up a crazy amount, but the interest rates went down.
[59] And from 2008 to 2021 for 13 years, we basically had zero interest rates with one brief blip under Powell, but it was basically zero rates.
[60] And then you could borrow way more money and it wouldn't show up in servicing the debt, because you just paid zero percent interest on the T-bills.
[61] And the thing that seems very dangerous to me about the current fiscal situation is that interest rates have gone back to positive, like they were in the '90s and early-to-mid 2000s, and it's just this incredibly large debt. So we now have a real runaway deficit problem. But, you know, people have been talking about this and crying wolf for 40 years.
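The arithmetic being described here is simple: interest cost as a share of GDP is just the debt-to-GDP ratio times the average interest rate paid on that debt. A minimal sketch, using illustrative round numbers consistent with the figures mentioned (not official statistics):

```python
# Back-of-envelope: interest cost as a share of GDP is
# (debt / GDP) x (average interest rate paid on the debt).
# All inputs below are illustrative round numbers, not official data.

def interest_share_of_gdp(debt_to_gdp: float, avg_rate: float) -> float:
    """Interest paid as a fraction of GDP."""
    return debt_to_gdp * avg_rate

# Early 1990s: moderate debt, high rates -> around 3% of GDP
print(f"{interest_share_of_gdp(0.60, 0.050):.1%}")
# Zero-rate era (2010s): much more debt, but far cheaper to carry
print(f"{interest_share_of_gdp(1.00, 0.015):.1%}")
# Today-ish: large debt AND positive rates again -> back above 3%
print(f"{interest_share_of_gdp(1.20, 0.027):.1%}")
```

The point of the sketch is that the middle case can carry a bigger debt more cheaply than the first, which is why the problem stayed hidden while rates were near zero.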
[62] So it's very hard for people to take it seriously.
[63] Most people don't even understand what it means.
[64] Like when you say there's a deficit, we owe money.
[65] Okay, to who?
[66] How's that work?
[67] It's, well, it's people who bought the bonds and it's, you know, a lot of it's to Americans.
[68] Some of them are held by the Federal Reserve.
[69] A decent amount is held by foreigners at this point.
[70] Because in some ways it's the opposite of the trade and current-account deficits.
[71] The U.S. has been running these big current-account deficits, and then the foreigners end up with way more dollars than they want to spend on American goods or services.
[72] And so they have to reinvest them in the U.S. Some put it into houses or stocks, but a lot of it just goes into government debt.
[73] So in some ways it's a function of the chronic trade imbalances, chronic trade deficits.
[74] Well, if you had supreme power, if Peter Thiel was the ruler of the world and you could fix this, what would you do?
[75] Man, I always find that hypothetical...
[76] It's a ridiculous hypothetical.
[77] It is ridiculous.
[78] With ridiculous hypotheticals you get ridiculous answers.
[79] I want a ridiculous answer.
[80] That's what I like.
[81] But what could be done?
[82] First of all, what could be done to mitigate it and what could be done to solve it?
[83] You know, I think my answers are probably all...
[84] ...in the, you know, very libertarian direction.
[85] So it would be sort of: figure out ways to have smaller government, figure out ways to increase the retirement age on Social Security, means-test Social Security so not everyone gets it, just figure out ways to gradually dial back a lot of these government benefits.
[86] And then that's, you know, that's insanely unpopular.
[87] So it's completely unrealistic on that level.
[88] That bothers people that need Social Security.
[89] Well, I said means tested.
[90] Means tested.
[91] So people who don't need it, don't get it.
[92] Right.
[93] So Social Security, even if you're very wealthy, I don't even know how it works.
[94] Do you still get it?
[95] Yeah, basically pretty much everyone gets it, because it was originally rationalized as a sort of pension system, not as a welfare system.
[96] And so the fiction was you pay social security taxes, and then you're entitled to get a pension out in the form of social security.
[97] Right.
[98] And because we told this fiction that it was a pension system, instead of an intergenerational Ponzi scheme or something like that.
[99] You know, the fiction means everybody gets paid social security because it's a pension system.
[100] Whereas if we were more honest and said it's, you know, it's just a welfare system.
[101] maybe you could start dialing it back; you could probably rationalize it in a lot of ways.
[102] And it's not related to how much you put into it, right?
[103] Like, how does Social Security work in terms of...
[104] I think it's partially related.
[105] So I think there is, I'm not a total expert on this stuff, but I think there's some guaranteed minimum you get.
[106] And then if you put more in, you get somewhat more, and then it's capped at a certain amount.
[107] And even that's capped at a certain amount.
[108] That's why Social Security taxes are capped at something like, you know, $150,000 a year.
[109] And then this is one of the really big tax-increase proposals that's out there: to uncap it, which would effectively be a 12.4% income-tax hike, you know, on all your income.
[110] Just for Social Security?
[111] Sure, because the argument, the sort of progressive-left Democrat argument, is, you know, why should you have a regressive Social Security tax?
[112] Why should you pay 12.4%, or whatever the Social Security tax is?
[113] Half gets paid by you, half gets paid by your employer.
[114] But then it's capped at like 140, 150K, some level like that.
[115] And why should it be regressive, where if you make 500K or a million a year, you pay zero tax on your marginal income?
[116] And that makes no sense if it's a welfare program.
[117] If it's a retirement savings program and your payout's capped, then, you know, you don't need to put in more than you get out.
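The cap-versus-uncap math being described can be sketched in a few lines. Assumptions: the real combined (employee plus employer) rate of 12.4% and the 2024 taxable maximum of $168,600 — the "$150,000 or so" figure in the conversation is an approximation of that cap — with hypothetical example incomes:

```python
# Sketch of the Social Security payroll-tax math described above.
# RATE is the combined employee + employer rate; WAGE_CAP is the 2024
# taxable maximum (the "$150,000 or so" mentioned is approximate).
# The example incomes are hypothetical.
RATE = 0.124
WAGE_CAP = 168_600

def ss_tax(income: float, capped: bool = True) -> float:
    """Combined Social Security tax owed on a given wage income."""
    taxable = min(income, WAGE_CAP) if capped else income
    return taxable * RATE

for income in (60_000, 500_000):
    print(income, round(ss_tax(income)), round(ss_tax(income, capped=False)))
```

Uncapping is exactly the switch from `capped=True` to `capped=False`: below the cap nothing changes, while marginal income above it goes from a 0% to a 12.4% Social Security rate.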
[118] Well, that's logical, but there's not a lot of logic going on with the way people are talking about taxes today.
[119] Like California just jacked their taxes up to 14%.
[120] What?
[121] Was it 14.4?
[122] Something like that.
[123] Yeah, 14.3, I think.
[124] Which is hilarious.
[125] Maybe more.
[126] Yeah.
[127] 14.9?
[128] Something.
[129] Yeah.
[130] I mean, you want more money for doing a terrible job and having more people leave for the first time ever in, like, the history of the state.
[131] Yeah, but it, look, it gets away with it.
[132] I know.
[133] And so people are forced with no choice.
[134] What are you going to do?
[135] It is, I mean, there are people at the margins who leave, but the state government still collects more and more in revenue.
[136] So, you know, you get, I don't know, 10% more revenue and 5% of people leave.
[137] You still increase the amount of revenue you're getting.
[138] It's inelastic enough that you're actually able to increase the revenues.
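The "inelastic enough" point is just this arithmetic, using the hypothetical 10%-and-5% numbers from the conversation:

```python
# Toy version of the revenue arithmetic: a tax hike that collects 10%
# more per remaining taxpayer, while 5% of taxpayers leave. The 10%/5%
# figures are the hypothetical ones from the conversation.
base_revenue = 100.0
new_revenue = base_revenue * 1.10 * 0.95  # +10% per payer, -5% of payers
print(round(new_revenue, 2))              # still more than the base
```

As long as the percentage gain per remaining taxpayer exceeds the percentage of taxpayers lost, total revenue rises, which is why the hike "gets away with it."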
[139] I mean, this is sort of the crazy thing about California: there's always sort of a right-wing or libertarian critique of California that, you know, it's such a ridiculous place.
[140] It should just collapse under its own ridiculousness.
[141] And it doesn't quite happen.
[142] You know, the macroeconomics on it are pretty good.
[143] You know, 40 million people, the GDP's around 4 trillion.
[144] It's about the same as Germany with 80 million or Japan with 125 million.
[145] Japan has three times the population of California.
[146] Same GDP means one -third the per capita GDP.
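The per-capita comparison works out like this with the round numbers used here (GDP in trillions of dollars, population in millions; actual figures vary by year and source):

```python
# The per-capita arithmetic above, using the conversation's round
# numbers: roughly the same ~$4T GDP across all three, with different
# populations. Real figures vary by year and source.
regions = {
    "California": (4.0, 40),
    "Germany":    (4.0, 80),
    "Japan":      (4.0, 125),
}
for name, (gdp_tn, pop_mn) in regions.items():
    per_capita = gdp_tn * 1e12 / (pop_mn * 1e6)
    print(f"{name}: ${per_capita:,.0f} per person")
```

With the same GDP and roughly three times the population, Japan's per-capita figure comes out at about a third of California's, which is the comparison being made.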
[147] So there's some level on which, you know, California as a whole is working, even though it doesn't work from a governance point of view.
[148] It doesn't work for a lot of the people who live there.
[149] And the rough model I have for how to think of California is that it's kind of like Saudi Arabia.
[150] And you have a crazy religion, wokeism in California, Wahhabism in Saudi Arabia.
[151] You know, not that many people believe it, but it distorts everything.
[152] And then you have like oil fields in Saudi Arabia and you have the big tech companies in California.
[153] And the oil pays for everything.
[154] And then you have a completely bloated, inefficient government sector, and all sorts of distortions in the real estate market, where people also make lots of money; the government and real estate are sort of the ways you redistribute the oil wealth or, you know, the big-tech money in California.
[155] And it's not the way you might want to design a system from scratch, but it's, you know, pretty stable.
[156] And people have been saying Saudi Arabia is ridiculous.
[157] It's going to collapse any year now.
[158] They've been saying that for 40 or 50 years.
[159] But if you have a giant oil field, you can pay for a lot of ridiculousness.
[160] I think that's the way to, that's the way you have to think of California.
[161] Well, the other thing is you're also...
[162] There are things about it that are ridiculous, but there's something about it that, you know, doesn't naturally self-destruct overnight.
[163] Well, there's a lot of kick -ass people there.
[164] And there's a lot of people that are still generating enormous amounts of wealth there.
[165] And it's too difficult to just pack up and leave.
[166] I think it's something like four of the eight or nine companies with market capitalizations over a trillion dollars are based in California.
[167] That's amazing.
[168] It's Google, Apple, now Nvidia, Meta.
[169] Yeah, I think Broadcom is close to that.
[170] And there's no ideal place to live either.
[171] It's not like California sucks and there's some other place that's got it totally dialed in, that also has an enormous GDP and an enormous population.
[172] There's not like one big city that's really dialed in.
[173] Well, there are things that worked.
[174] I looked at all the zero-tax states in the U.S. And I think the way you asked the question gets at it: in theory a lot of stuff happens at the state level, but you don't live in a state, you live in a city.
[175] And so if you're somewhat biased towards living in at least a moderately sized city, okay, I think there are four states where there are no cities: Alaska, Wyoming, South Dakota, New Hampshire.
[176] There's zero tax, but no cities to speak of.
[177] And then you have Washington State with Seattle, where the weather is the worst in the country.
[178] You have Nevada with Las Vegas, which I'm not that big a fan of.
[179] And then that leaves three zero-tax states.
[180] You have Texas, which I like as a state, but I'm not that big a fan of Austin, Dallas, or Houston.
[181] And, you know, Houston is just sort of an oil town, which is good if you're in that business, but otherwise not.
[182] Dallas has sort of an inferiority complex to L.A. and New York, you know, which is not the healthiest attitude.
[183] And then, you know, I don't know, Austin's a government town and a college town and a wannabe hipster San Francisco town.
[184] So, you know, in my book it's sort of three strikes and you're kind of out, too.
[185] And then that leaves Nashville, Tennessee, or Miami, South Florida.
[186] And those would be my two top choices.
[187] Miami's fun, but I wouldn't want to live there.
[188] It's a fun place to visit.
[189] It's a little too crazy.
[190] A little too chaotic.
[191] A little too cocaine-fueled.
[192] A little too party, party, party.
[193] I think it's pretty segmented, the tourist strip from everything else.
[194] There probably is something a little bit paradoxical about any place that gets lots of tourists, where, you know, in some sense there are things that are great about it because so many tourists go, but in some sense it creates a weird aesthetic, because the day-to-day vibe is that you don't work and you're just having fun, or something like that, because so many people are going there just to do that. And that's probably a little bit off with the South Florida thing.
[195] And then I think Nashville is also sort of its own real place.
[196] Nashville's great.
[197] Yeah.
[198] So those are the top two.
[199] I could live in Nashville.
[200] No problem.
[201] I'm probably always... from, you know, fifth grade onwards, since, you know, '77, I lived in California.
[202] And so I'm just a sucker for the weather, and I think there is no place besides coastal California
[203] with really good weather year-round in the U.S. Maybe Hawaii is pretty good.
[204] Coastal California is tough to beat.
[205] And you're two hours from the mountains.
[206] Man, it's like, you know, it's mid-August here in Austin.
[207] It's just brutal.
[208] Is it?
[209] I think so.
[210] Really?
[211] That was too hot for you?
[212] It was too hot for you?
[213] Today's mild.
[214] What is it out there?
[215] Like 80?
[216] 85?
[217] I do so much sauna that I literally don't even notice it.
[218] I'm outside for hours every day shooting arrows, and I don't even notice it.
[219] Well, I don't know if you're representative of the average Austin resident.
[220] I don't know, but I think you get accustomed to it.
[221] To me, it's so much better than too cold.
[222] Too cold you can die.
[223] And I know you can die from the heat, but you probably won't, especially if you have water.
[224] You'll be okay.
[225] But you could die from the cold.
[226] Cold's real.
[227] So in really cold places, there's five months out of the year where your life's in danger if you do something wrong.
[228] Like if you live in Wyoming and you break down somewhere and there's no one on the road, you could die out there.
[229] That's real.
[230] You could die from exposure.
[231] Sure.
[232] There's probably some very deep reason there's been a net migration of people to the west and the south in the U.S. over the last five decades.
[233] As long as the earth doesn't move, you're good.
[234] As long as there's no tsunamis, you're good.
[235] It is a perfect environment, virtually year -round.
[236] It gets a little hot in the summer, but again, coastal, not at all.
[237] If you get an 80-degree day in Malibu, it's unusual, you know?
[238] It's wonderful.
[239] You've got a beautiful breeze coming off the ocean, sun's out, everybody's pretty.
[240] And then it's correlated with confiscatory taxation.
[241] It's all sort of a package deal.
[242] Well, it's a scam.
[243] You know, they know you don't want to leave.
[244] I didn't want to leave California.
[245] It's fucking great.
[246] I appreciate you left.
[247] I always have the fantasy that if enough people like you leave, it'll put pressure on them.
[248] But it's never quite enough.
[249] Never quite enough.
[250] And it's not going to be.
[251] It's too difficult for most people.
[252] It was very difficult for me. And I had a bunch of people working for me that were willing to pack up and leave, like young Jamie over there.
[253] But we, you know, it was tricky.
[254] You're taking your whole business, and my business is talking to people that's part of my business.
[255] My other business is stand -up comedy.
[256] So you left during COVID?
[257] I left at the very beginning.
[258] As soon as they started locking things down, I'm like, oh, no.
[259] These fuckers are never letting this go.
[260] March, April, May 2020.
[261] In May, I started looking at houses.
[262] Cool.
[263] That's when I came to Austin first.
[264] I got a place in Miami in September 2020, and I spent the last, you know, I've spent the last four winters there.
[265] So I'm sort of always on the cusp of moving to Florida, hard to get out of California.
[266] But moving has gotten a lot harder relative to four years ago.
[267] And, you know, I'd say my real estate purchases have generally not been great over the years.
[268] I mean, they've done okay, but certainly not the way I've been able to make money at all.
[269] But with the one exception was Miami.
[270] I bought it in September 2020, and, you know, fast-forward four years, it's up like 100%.
[271] Wow.
[272] Something like that.
[273] But then, paradoxically, this also means it's gotten much harder to move there, or to Austin, or any of these places.
[274] You know, if I relocated my office from L.A., the people who own houses, you know, okay, they'd have to buy a place in Florida.
[275] It costs twice as much as it did four years ago.
[276] And then the interest rates have also doubled.
[277] And so you get a 30 -year mortgage.
[278] You could have locked that in at 3% in 2020.
[279] Now it's, you know, maybe six and a half, seven percent.
[280] So the prices have doubled and the mortgage rates have doubled.
[281] So it costs you four times as much to buy a house.
[282] And so, yeah, so there was a moment where people could move during COVID, and it's gotten dramatically harder relative to what it was four years ago.
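This compounding can be checked with the standard fixed-rate mortgage payment formula. The $500k starting price is a hypothetical; the doubling of prices and the 3% to roughly 7% move in rates are the figures from the conversation:

```python
# Standard fixed-rate mortgage payment: M = P*r / (1 - (1+r)^-n),
# with monthly rate r and n monthly payments. The $500k starting
# price is hypothetical; the 2x price move and the 3% -> ~7% rate
# move are the figures from the conversation.
def monthly_payment(principal: float, annual_rate: float, years: int = 30) -> float:
    r = annual_rate / 12      # monthly interest rate
    n = years * 12            # number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

then = monthly_payment(500_000, 0.03)     # 2020: 3% locked in
now = monthly_payment(1_000_000, 0.07)    # same house, doubled price, ~7%
print(round(then), round(now), round(now / then, 1))
```

On these hypothetical numbers the monthly payment roughly triples, which is the ballpark behind the "four times as much" remark: the doubled price and the doubled rate compound instead of adding.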
[283] Well, the Austin real estate market went crazy, and then it came back down a little bit.
[284] It's in that down-a-little-bit spot right now, where there's a lot of high-end properties that are still for sale.
[285] They can't move.
[286] It's different.
[287] There's not a lot of people moving here now like there was in the boom because everything's open everywhere.
[288] Well, I somehow think Austin was linked to California and Miami was linked a little bit more to New York.
[289] And there are all these differences, but a big part of the Austin move was people from tech in California who moved to Austin.
[290] You know, there's a part of the Miami, South Florida thing which was people from finance in New York City who moved to Florida.
[291] And the finance industry is less networked on New York City.
[292] So I think it is possible for people if you run a, you know, private equity fund or if you work at a bank, it's possible for some of those functions to easily be moved to a different state.
[293] The tech industry is crazily networked on California.
[294] Like there's probably some way to do it.
[295] It's not that easy.
[296] Yeah, it makes sense.
[297] It makes sense, too.
[298] It's just the sheer numbers.
[299] I mean, when you're talking about all those corporations that are established and based in California, there's so many.
[300] They're so big.
[301] Just the sheer numbers of human beings that live there and work there that are involved in tech.
[302] Sure.
[303] If it wasn't as networked, you know, you could probably just move.
[304] And maybe these things are networked till they're not.
[305] You know, Detroit was very networked.
[306] The car industry was super networked on Detroit for decades and decades.
[307] And Michigan got more and more mismanaged, and people thought the network sort of protected them, because, you know, the big three car companies were in Detroit, and all the supply chains were also in Detroit.
[308] And then eventually it was just so ridiculous.
[309] People moved, started moving the factories outside of that area, and it's sort of unraveled.
[310] So that's, you know, it can also happen with California.
[311] It'll just take a lot.
[312] That would be insane if they just abandoned all the tech companies in California.
[313] I mean, just look at what happened to Flint, Michigan, when all the auto factories pulled out.
[314] Well, look, there are all these paradoxical histories. You know, the point of the internet in some sense was to eliminate the tyranny of place.
[315] And that was sort of the idea.
[316] And then one of the paradoxes about the history of the internet was that the internet companies, you know, were all centered in California.
[317] There have been different waves of how networked or non-networked they were.
[318] I think probably in 2021, sort of the COVID move away from California, the big thing in tech was crypto.
[319] And crypto had this conceit of an alternate currency, decentralized away from the central banks; but also the crypto companies, the crypto protocols, you could do those from anywhere.
[320] You could do them outside the U.S., you could do them from Miami.
[321] And so crypto was something where the tech could naturally move out of California.
[322] And today, probably the, I don't know, the core tech narrative is completely flipped to AI.
[323] And then there's something about AI that's, you know, very centralized.
[324] You know, I always had this one-liner years ago: if we say that crypto is libertarian, can we also say that AI is communist? Or something like this, where, you know, the natural structure for an AI company looks like a big company, and then somehow the AI stuff feels like it's going to be dominated by the big tech companies in the San Francisco Bay Area.
[325] And that's very concerning for people.
[326] The future of tech, the natural scale of the industry, tells you that it's going to be extremely hard to get out of the San Francisco Bay Area.
[327] When you look to the future and you try to just make just a guess as to how all this is going to turn out with AI, what do you think we're looking at over the next five years?
[328] Man, I think I should start by being modest in answering that question, saying that nobody has a clue.
[329] Right.
[330] Which is true.
[331] Which pretty much all the experts say.
[332] You know, I would say, let me do sort of a history.
[333] The riff I always had on this was that I can't stand any of the buzzwords.
[334] And I felt AI, you know, there's all this big data, cloud computing.
[335] There were all these crazy buzzwords people had.
[336] And they always were ways to sort of abstract things and get away from reality somehow, and they were not good ways of talking about things. And I thought AI was this incredible abstraction, because it can mean the next generation of computers, it can mean the last generation of computers, it can mean anything in between.
[337] And if you think about the AI discussion in the 2010s, pre-OpenAI, pre-ChatGPT, and the revolution of the last two years... the 2010s AI discussion, so I'll start with the history before I get to the future.
[338] But the history of it was it was maybe anchored on two visions of what AI meant.
[339] And one was Nick Bostrom, the Oxford prof who wrote this book Superintelligence in 2014.
[340] And it was basically: AI was going to be this super-duper intelligent thing, a godlike intelligence, way smarter than any human being.
[341] And then there was sort of the, I don't know, the CCP, Chinese Communist rebuttal: the Kai-Fu Lee book from 2018, AI Superpowers.
[342] I think the subtitle was something like the race for AI between Silicon Valley and China or something like this.
[343] And it defined AI as something fairly low-tech: it was just surveillance, you know, facial recognition technology.
[344] We would just have this sort of totalitarian Stalinist monitoring.
[345] It didn't require very much innovation.
[346] It just required that you apply things.
[347] And basically the subtext was China is going to win because we have no ethical qualms in China about applying this sort of basic machine learning to sort of measuring or controlling the population.
[348] And those were sort of like, I'd say two extreme competing visions of what AI would mean in the 2010s and that sort of maybe were sort of the anchors of the AI debate.
[349] And then, you know, what happened in some sense with ChatGPT in late '22, early '23 was that the achievement you got was not superintelligence.
[350] It was not just surveillance tech; you actually got to the holy grail of what people would have defined AI as from 1950 to 2010, for the previous 60 years before the 2010s.
[351] People had always said the definition of AI is passing the Turing test.
[352] And the Turing test, it basically means that the computer can fool you into thinking that it's a human being.
[353] And it's a somewhat fuzzy test, because, you know, obviously you could have an expert or a non-expert on the computer; does it fool you all the time or some of the time; how good is it?
[354] But to a first approximation, the Turing test, you know, we weren't even close to passing it in 2021.
[355] And then, you know, ChatGPT basically passes the Turing test, at least for, let's say, an IQ-100 average person; it's passed the Turing test.
[356] And that was the holy grail.
[357] That was the holy grail of AI research for the previous 60 years.
[358] And so there's probably some psychological or sociological history where you can say that this weird debate between Bostrom about superintelligence and Kai-Fu Lee about surveillance tech was almost like a psychological suppression people had, where they lost track of the Turing test, of the holy grail, because it was about to happen.
[359] And it was such a significant, such an important thing that you didn't even want to think about it.
[360] So I'm tempted to give almost a psychological-repression theory of the 2010s debates.
[361] But be that as it may, the Turing test gets passed and that's, yeah, that's an extraordinary achievement.
[362] And then, you know, where does it go from here?
[363] There probably are ways you can refine these.
[364] It's still going to take, you know, a long time to apply it.
[365] There's a question.
[366] There's this AGI discussion.
[367] You know, we get artificial general intelligence, which is a hopelessly vague concept, because, you know, general intelligence could just be a generally smart human being.
[368] So is that just a person with an IQ of 130?
[369] Or is it superintelligence?
[370] Is it godlike intelligence?
[371] So it's sort of an ambiguous thing.
[372] But I keep thinking that maybe the AGI question is less important than passing the Turing test.
[373] If we got AGI, if we got, let's say, superintelligence, that would be interesting to Mr. God, because you'd have competition for being God.
[374] But surely the Turing test is more important for us humans.
[375] Because it's either a complement to or a substitute for humans.
[376] And so it's, yeah, it's going to rearrange the economic, cultural, political structure of our society in extremely dramatic ways.
[377] And I think maybe what's already happened is much more important than anything else that's going to be done.
[378] And then it's just going to be a long process of applying it.
[379] One last thought.
[380] You know, the analogy I'm always tempted to go to, and historical analogies are never perfect, is that maybe AI in 2023, 2024 is like the internet in 1999, where on one level it's clear the internet is going to be big and get a lot bigger, it's going to dominate the economy, it's going to rearrange the society in the 21st century.
[382] And then at the same time, it was a complete bubble.
[383] And people had no idea how the business models worked.
[384] You know, almost everything blew up.
[385] It didn't take that long in the scheme of things.
[386] It took, you know, 15, 20 years for it to become super dominant.
[387] But it didn't happen sort of in 18 months as people fantasized in 1999.
[388] And maybe what we have in AI is something like this.
[389] Figuring out how to actually apply it, you know, in sort of all these different ways, is going to take something like two decades.
[390] But that doesn't detract from it being a really big deal.
[391] It is a really big deal.
[392] I think you're right about the Turing test.
[393] Do you think that the lack of acknowledgement, or public celebration, or at least mainstream discussion, which I think should be everywhere, that we've passed the Turing test, do you think it's connected to the fact that this stuff accelerates so rapidly? Even though we've essentially breached this new territory, we still know that GPT-5 is going to be better, GPT-6 is going to be insane, and they're working on these right now.
[394] And the change is happening so quickly, we're almost a little reluctant to acknowledge where we're at.
[395] Yeah.
[396] You know, probably for 15 years or so, I've often been on the side that there isn't that much progress in science or tech, or not as much as Silicon Valley likes to claim.
[397] And even on the AI level, I think it's a massive technical achievement.
[398] It's still an open question.
[399] You know, is it actually going to lead to much higher living standards for everybody?
[400] You know, the Internet was a massive achievement.
[401] How much did it raise people's living standards?
[402] Much, much trickier question.
[403] But in this world where not much has happened, one of the paradoxes of an era of relative tech stagnation is that when something does happen, we don't even know how to process it.
[404] So, you know, I think Bitcoin was a big invention, whether it was good or bad, but it was a pretty big deal.
[405] And it was systematically underestimated for at least, you know, the first 10, 11 years.
[406] You know, you could trade it.
[407] It went up smoothly for 10, 11 years.
[408] It didn't get repriced all at once because we're in a world where nothing big ever happens.
[409] And so we have no way of processing it when something pretty big happens.
[410] The Internet was pretty big in 99.
[411] Bitcoin was moderately big.
[412] The Internet was really big, Bitcoin was moderately big,
[413] and I'd say passing the Turing test is really big.
[414] It's on the same scale as the Internet.
[415] And because our lived experiences that so little has felt like it's been changing for the last few decades, we're probably underestimating it.
[416] It's interesting that you say that so little, we feel like so little has changed, because if you're a person, how old are you?
[417] Same age as you.
[418] Born 1967.
[419] So in our age, we've seen all the change, right?
[420] We saw the end of the Cold War.
[421] We saw answering machines.
[422] We saw VHS tapes.
[423] Then we saw the Internet.
[424] And then where we're at right now, which is like this bizarre moment in time where people carry the Internet around with them in their pocket every day.
[425] And these super sophisticated computers that are ubiquitous.
[426] Everybody has one.
[427] There's incredible technology that's being ramped up every year.
[428] They're getting better all the time.
[429] And now there's AI.
[430] There's AI on your phone.
[431] You could access ChatGPT, a bunch of different programs on your phone.
[432] And I think that's an insane change.
[433] I think, especially with the use of social media, it's one of the most bizarre changes our culture has ever gone through.
[434] It can be a big change culturally or politically.
[435] But the kinds of questions I'd ask is, how do you measure it economically?
[436] How much does it change GDP?
[437] How much does it change productivity?
[438] And certainly, the story I would generally tell for the last 50 years, since the early 1970s, is that it's been not absolute stagnation, but an era of relative stagnation, where there has been very limited progress in the world of atoms, the world of physical things.
[439] And there has been a lot of progress in the world of bits, information, computers, internet, mobile internet, you know, now AI.
[440] What are you referring to when you're saying, the world of physical things?
[441] You know, well, if we had to define technology, if we were sitting here in 1967, the year we were born, and we had a discussion about technology, what would technology have meant?
[442] It would have meant computers, it would have also meant rockets, it would have meant supersonic airplanes, it would have meant new medicines, it would have meant the Green Revolution in agriculture, maybe underwater cities, because technology simply gets defined as that which is changing, that which is progressing.
[443] And so there was progress on all these fronts.
[444] Today, last 20 years, when you talk about technology, you're normally just talking about information technology.
[445] Technology has been reduced to meaning computers.
[446] And that tells you that the structure of progress has been weird.
[447] There's been this narrow cone, a very intense progress around the world of bits, around the world of computers, and then all the other areas have been relatively stagnant.
[448] We're not moving any faster.
[449] You know, the Concorde got decommissioned in 2003 or whatever.
[450] And then with all the low-tech airport security measures, it takes even longer to get through all of them and fly from one city to the next.
[451] You know, the highways have gone backwards because there are more traffic jams.
[452] We haven't figured out ways around those.
[453] So they're sort of, we're literally moving slower than we were 40 or 50 years ago.
[454] And then the screens and the devices, you know, have this effect of distracting us from this.
[455] So, you know, when you're riding a hundred-year-old subway in New York City and you're looking at your iPhone, you can think, wow, this is this cool new gadget, but you're also being distracted from the fact that your lived environment hasn't changed, you know, in a hundred years.
[456] And so there's, yeah, there's a question how important is this world of bits versus the world of atoms; you know, I would say as human beings we're physically embodied in a material world.
[457] And so I would always say this world of atoms is pretty important, and when that's pretty stagnant, you know, there's a lot of stuff that doesn't make sense.
[458] I was an undergraduate at Stanford late 80s.
[459] And at the time, in retrospect, every engineering area would have been a bad thing to go into, you know, mechanical engineering, chemical engineering.
[460] All these engineering fields where you were tinkering and trying to do new things because these things turned out to be stuck.
[461] They were regulated.
[462] You couldn't come up with new things to do.
[463] Nuclear engineering, aero-astro engineering, people already knew those were really bad ones to go into.
[464] They were outlawed.
[465] You weren't going to make any progress in nuclear reactor designs or stuff like that.
[466] Electrical engineering, which was the one that's sort of adjacent to making semiconductors, that one was still okay.
[467] And then the only field that was actually going to progress a lot was computer science.
[468] And again, you know, it's been very powerful, but that was not the felt sense in the 1980s.
[469] In the 1980s, computer science was this ridiculous, inferior subject.
[470] You know, the linguistic tell is always when people use the word science.
[471] I'm not in favor of science in quotes.
[472] it's always a tell that it's not real science.
[473] And so when we call it climate science or political science or social science, you know, you're just sort of making it up and you have an inferiority complex toward real science, like physics or chemistry.
[474] And computer science was in the same category as social science or political science.
[475] It was a fake field for people who found electrical engineering or math way too hard and sort of dropped out of the real science and real engineering fields.
[476] You don't feel that climate science is a real science?
[477] Well, there are several different things one could say.
[478] It's possible climate change is happening.
[479] it's possible we don't have great accounts of why that's going on.
[480] So I'm not questioning any of those things.
[481] But how scientific it is, I don't think it's a place where we have really vigorous debates.
[482] You know, maybe the climate is changing because of carbon dioxide emissions, temperatures are going up.
[483] Maybe it's methane.
[484] Maybe it's people are eating too much steak.
[485] It's the cows flatulating.
[486] And you have to measure how much of a greenhouse gas methane is versus carbon
[487] dioxide, and I don't think they're rigorously doing that stuff scientifically.
[488] And I think the fact that it's called climate science tells you that it's more dogmatic than anything that's truly science should be.
[489] That it's dogma doesn't mean it's wrong, but.
[490] But why is the fact that it's called climate science mean that it's more dogmatic?
[491] Because if you said nuclear science, you wouldn't question it, right?
[492] Yeah, but no one calls it nuclear science.
[493] They call it nuclear engineering.
[494] Interesting.
[495] The only thing is, I'm just making a narrow linguistic point. All science that is legitimately science?
[496] Well, at this point, people say computer science has worked, but in the 1980s, all I'm saying is it was in the same categories, let's say social science, political science.
[497] It was a tell that the people doing it kind of deep down knew they weren't doing real science.
[498] Well, there's certainly ideology that's connected to climate science, and then there's certainly corporations that are invested in this prospect of green energy and the concept of green energy and they're profiting off of it and pushing these different things, whether it be electric car mandates or whatever it is.
[499] Like California, I think it's 2035, they have a mandate that all new vehicles have to be electric, which is hilarious when you're connected to a grid that can't support the electric cars it currently has.
[500] After they said that, within a month or two, Gavin Newsom asked people to not charge their Teslas because it was summer and the grid was fucked.
[501] Yeah, look, it was all linked into all these ideological projects in all these ways.
[502] And, you know, there's an environmental project, which is, you know, and maybe it shouldn't be scientific.
[503] You know, the hardcore environmentalist argument is we only have one planet and we don't have time to do science.
[504] If we have to wait for rigorous science to prove that we're overheating, it'll be too late.
[505] And so if you're a hardcore environmentalist, you know, you don't want to have as high a standard of science.
[506] Yeah, my intuition is certainly when you go away from that, you end up with things that are too dogmatic, too ideological.
[507] Maybe it doesn't even work, even if the planet's getting warmer.
[508] You know, maybe climate science is not, like my question is, maybe methane is a worse, more dangerous greenhouse gas than carbon dioxide?
[509] We're not even capable of measuring that.
[510] Well, we're also ignoring certain things like regenerative farms that sequester carbon.
[511] And then you have people like Bill Gates saying that planting trees to deal with carbon is ridiculous.
[512] That's a ridiculous way to do it.
[513] Like, how is that ridiculous?
[514] They literally turn carbon dioxide into oxygen.
[515] It is their food.
[516] That's what the food of plants is.
[517] That's what powers the whole plant life and the way we have the symbiotic relationship
[518] with them.
[519] And the more carbon dioxide is, the greener it is, which is why it's greener today on Earth than it has been in a hundred years.
[520] Sure.
[521] These are all facts that are inconvenient to people that have a very specific narrow window of how to approach this.
[522] Sure, although there probably are ways to steelman the other side, too, where maybe, you know, I think the manifesto that's always very interesting from the other side was this 1972 book by the Club of Rome, The Limits to Growth.
[523] And it's, we need to head towards a society in which there's very limited growth, because if you have unlimited growth, you're going to run out of resources.
[524] If you don't run out of resources, you'll hit a pollution constraint.
[525] But in the 1970s, it was, you're going to have overpopulation, you're going to run out of oil.
[526] We had the oil shocks.
[527] And then by the 90s, it sort of morphed into more of the pollution problem with carbon dioxide, climate change, other environmental things.
[528] But there is sort of, you know, there's been some, you know, some improvement in oil, carbon fuels with fracking, things like this in Texas.
[529] It's not at the scale that's been enough to, you know, give an American standard of living to the whole planet.
[530] And we consume 100 million barrels of oil a day globally.
[531] Maybe fracking can add 10 %, 10 million to that.
[532] If everybody on this planet has an American standard of living, it's something like 300, 400 million barrels of oil.
[533] And I don't think that's there.
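As a rough sanity check on the barrels-per-day figures above, here is a minimal sketch; the U.S. per-capita and population inputs are outside ballpark assumptions (roughly 20 million barrels/day for about 330 million people), not figures from the conversation.

```python
# Scaling global oil demand to a U.S. per-capita consumption rate.
# All inputs are rough public ballpark figures (assumptions, not from the text).
world_demand_bpd = 100e6   # current global demand, barrels/day (quoted above)
us_demand_bpd = 20e6       # approximate U.S. demand, barrels/day (assumption)
us_population = 330e6      # assumption
world_population = 8e9     # assumption

us_per_capita = us_demand_bpd / us_population        # ~0.06 barrels/person/day
world_at_us_rate = us_per_capita * world_population  # everyone at U.S. rates

print(f"{world_at_us_rate / 1e6:.0f} million barrels/day")           # 485 million
print(f"{world_at_us_rate / world_demand_bpd:.1f}x current demand")  # 4.8x
```

That lands in the same ballpark as the 300 to 400 million quoted in the conversation: several times current global production.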
[534] So I always wonder whether that was the real environmental argument: we can't have an American standard of living for the whole planet.
[535] We somehow can't justify this degree of inequality, and therefore, you know, we have to figure out ways to dial back, tax the carbon, restrict it, and maybe there's some sort of a Malthusian calculus that's more about resources than about pollution.
[536] How much of that demand for oil could be mitigated by nuclear?
[537] You probably could mitigate it a lot.
[538] There's a question why the nuclear thing has gone so wrong, especially if you have electric vehicles, right?
[539] You know, with a combustion engine it's probably hard to get nuclear to work, but if you shift to electric vehicles, you can charge them, you know, your Tesla cars, at night.
[540] And that would seemingly work.
[541] And there's definitely, yeah, there's definitely a history of energy where it was always in the direction of, you know, more intense use.
[542] It went from wood to coal to oil, which is a more compact form of energy.
[543] And in a way, it takes up less of the environment.
[544] And then if we move from oil to uranium, that's even, you know, it's even smaller.
[545] And so in a sense, the denser the energy is, the less of the environment it takes up.
[546] And when we go backwards, from oil to natural gas, which takes up more space, and from natural gas to solar or wind, you have to pollute the whole environment by putting up windmills everywhere.
[547] Or you have to, you know, cover the whole desert with solar panels.
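The wood-to-coal-to-oil-to-uranium progression can be made concrete with approximate specific energies. The numbers below are textbook ballparks I'm supplying, not figures from the conversation; note natural gas actually beats oil per kilogram, and "takes up more space" in the volumetric sense.

```python
# Approximate specific energy (energy per unit mass) of the fuels discussed.
# Ballpark textbook values (assumptions); the uranium figure assumes complete
# fission of U-235, far more than a practical reactor extracts per kilogram.
specific_energy_mj_per_kg = {
    "wood": 16,
    "coal": 24,
    "crude oil": 42,
    "natural gas": 55,   # higher than oil per kg, but far less dense per liter
    "uranium (U-235, full fission)": 8.0e7,
}

baseline = specific_energy_mj_per_kg["wood"]
for fuel, energy in specific_energy_mj_per_kg.items():
    print(f"{fuel:>30}: {energy / baseline:>12,.0f}x wood")
```

Even with generous rounding, each step up the chain is a large multiple of the last, which is the sense in which denser fuels take up less of the environment.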
[548] And that is a good way to look at it, because it is a form of pollution.
[549] And so there was a way that nuclear was supposed to be the energy
[550] mode of the 21st century.
[551] And then, yeah, there are all these historical questions.
[552] Why did it get stopped?
[553] Why did we not go down that route?
[554] The standard explanation of why it stopped was that it was there were all these dangers.
[555] We had Three Mile Island in 1979, you know, Chernobyl in 1986, and then the Fukushima one in Japan, I think, in 2011, and you had these various accidents.
[556] My alternate theory on why nuclear energy really stopped is that it was, it was sort of dystopian or even apocalyptic because it turned out that it was all, it turned out to be very dual use.
[557] If you build nuclear power plants, It's only sort of one step away from building nuclear weapons.
[558] And it turned out to be a lot trickier to separate those two things out than it looked.
[559] And I think the signature moment was 1974 or 75 when India gets the nuclear bomb.
[560] And the U .S., I believe, had transferred the nuclear reactor technology to India.
[561] We thought they couldn't weaponize it.
[562] And then it turned out it was pretty easy to weaponize.
[563] And then sort of the geopolitical problem with nuclear power was you either need a double standard, where we have nuclear power in the U.S., but we don't allow other countries to have nuclear power because the U.S. gets to keep its nuclear weapons.
[564] We don't let 100 other countries have nuclear weapons.
[565] And that's an extreme double standard, probably a little bit hard to justify.
[566] Or you need some kind of really effective global governance where you have a one -world government that regulates all this stuff, which doesn't sound that good either.
[567] And then sort of the compromise was just to regulate it so much that maybe the nuclear plants got grandfathered in, but it became too expensive to build new ones.
[568] Jesus.
[569] Like even China, which is the country where they're building the most nuclear power plants, they've built way less than people expected a decade ago because, you know, they don't trust their own designs.
[570] And so they have to copy the over-safe, over-protected designs from the West, and the nuclear power costs too much money.
[571] It's cheaper to do coal.
[572] Wow.
[573] So, you know, I'm not going to get the numbers exactly right, but if you look at what percent of Chinese electricity was nuclear.
[574] It wasn't that high.
[575] It was like maybe four or five percent in 2013, 2014, and the percent hasn't gone up in ten years because, you know, they've maybe doubled the amount of electricity they use and maybe they doubled the nuclear, but the relative percentage is still a pretty small part of the mix, because it's just more expensive when you have these, you know, over-safe reactor designs.
[576] There are probably ways to build small reactors that are way cheaper, but then you still have this, you still have this dual -use thing.
[577] Do you create plutonium?
[578] Do you, you know, are there ways you can create a pathway to building more nuclear weapons?
[579] And if there was innovation, if nuclear engineering had gotten to a point where, you know, let's say there wasn't three -mile island or Chernobyl didn't happen, do you think that it would have gotten to a much more efficient and much more effective version by now?
[580] Well, my understanding is we have way, we have way more efficient designs.
[581] You can do small reactor designs where you don't need this giant containment structure, so it costs much less per kilowatt-hour of electricity you produce.
[582] So I think we have those designs.
[583] They're just not allowed.
[584] But then I think the problem is that if you were able to build them in all these countries, all over the world, you still have this dual-use problem.
[585] Right.
[586] And again, my alternate history of what really went wrong with nuclear power, it wasn't Three Mile Island.
[587] It wasn't Chernobyl.
[588] That's the official story.
[589] The real story was India getting the bomb.
[590] Wow, that makes sense.
[591] It completely makes sense.
[592] Jeez, Louise.
[593] And then this is, you know, this is always, you know, this is always the question about, there's always a big picture question.
[594] And people ask me, you know, if I'm right about this picture of, you know, this slowdown in tech, this sort of stagnation in many, many dimensions, and then there's always a question, you know, why did this happen?
[595] And my cop -out answer is always why questions are overdetermined because, you know, there are multiple reasons.
[596] It could be we became a more feminized, risk-averse society.
[597] It could be that the education system worked less well.
[598] It could be that we were just out of ideas.
[599] The easy ideas had been found, and only the hard ideas were left.
[600] Nature's cupboard was bare.
[601] The low -hanging fruit had been picked.
[602] So it can be over -determined.
[603] But I think one dimension that's not to be underrated for the science and tech stagnation was that an awful lot of science and technology had this dystopian or apocalyptic dimension and probably what happened at, you know, Los Alamos in 1945 and then with the thermonuclear weapons in the early 50s.
[604] It took a while for that to really seep in, but it had this sort of delayed effect where, you know, maybe you get a stagnant world in which the physicists don't get to do anything and they have to putter around with DEI, but you don't build weapons that blow up the world anymore.
[605] You know, is that a feature or a bug?
[606] And so the stagnation was sort of like this response.
[607] And so it sucks that we've lived in this world for 50 years where a lot of stuff has been inert.
[608] But if we had a world that was still accelerating on all these dimensions with supersonics and hypersonic planes and hypersonic weapons and, you know, modular nuclear reactors, maybe we wouldn't be sitting here and the whole world would have already blown up.
[609] And so we're in the stagnant path of the multiverse because it had this partially protective thing, even though in all these other ways I feel it's deeply deranged our society.
[610] That's a very interesting perspective, and it makes a lot of sense.
[611] It really does.
[612] And particularly the dual -use thing with nuclear power, and especially distributing that to other countries.
[613] When you talk about the stagnation in this country, I don't know how much you follow this whole UAP thing. I know we met.
[614] What was that guy's name at your place?
[615] The guy who did Chariots of the Gods?
[616] Oh, von Däniken.
[617] Yes.
[618] Yeah, you didn't, you thought he was too crazy.
[619] You like Hancock, but you don't like von Däniken.
[620] I didn't say I thought he was too crazy.
[621] He just willfully, in my opinion, ignores evidence that would show that some of the things that he's saying have already been solved.
[622] And I think his hypothesis is all related to this concept that we have been visited, and that that's how all these things were built, and that this technology was brought here from another world. I think he's very ideologically locked into these ideas. And I think a much more compelling idea is that there were very advanced cultures for some reason 10,000 years ago, whatever it was, whatever the year was, where they built some of these insane structures. It's 4,500 years ago they roughly think the pyramids were built. Whatever the fuck was going on there, I think those were human beings, in that place, in that time, and I think they had some sort of very sophisticated technology that was lost. And things can get lost. Things can get lost in cataclysms. They can get lost in disease and famine and war, all sorts of reasons, the burning of the Library of Alexandria. There's all sorts of ways that technology gets lost forever.
[623] And you can have today someone living in Los Angeles in the most sophisticated high -tech society that the world has ever known, while you still have people that live in the Amazon that live in the same way that they have lived for thousands of years.
[624] So those things can happen in the same planet at the same time.
[625] And I think while the rest of the world was essentially operating at a much lower vibration, there were people in Egypt
[626] who were doing some extraordinary things.
[627] I don't know how they got the information.
[628] Maybe they did get it from visitors.
[629] Maybe they did.
[630] But there's no real compelling evidence that they did.
[631] I think there's much more compelling evidence that a cataclysm happened.
[632] When you look at the Younger Dryas impact theory, it's all entirely based on evidence.
[633] It's entirely based on core samples and iridium content, and also massive changes in the environment over a very short period of time, particularly the melting of the ice caps in North America, and impact craters all around the world. We know something happened roughly 11,000 years ago and probably again 10,000 years ago.
[634] I think it's a regular occurrence on this planet that things go sideways and there's massive natural disasters.
[635] And I think that it's very likely.
[636] There's the Bronze Age civilization collapse somewhere in the mid-12th century B.C. And probably, in some ways, the one for which we have the best history is the fall of the Roman Empire, which was obviously the culmination of the classical world, and it somehow completely unraveled.
[637] So I think my view on it is probably somewhere between...
[638] Yours and von Däniken's?
[639] No, not von Däniken.
[640] I'm more on the other side of the problem, but let me try to define why I may agree on why this is so important today.
[641] This is not just of antiquarian interest, and the reason it matters today is what follows if you say civilization has seen great rises and falls.
[642] It's gone through these great cycles.
[643] You know, maybe the Bronze Age civilizations were very advanced, but someone came up with iron weapons.
[644] So there was just one dimension where they progressed, but then everything else they could destroy.
[645] Or, you know, the fall of the Roman Empire was, again, this pretty cataclysmic thing where there were diseases, and then there were political things that unraveled.
[646] But somehow, you know, it was a massive regression for, you know, four or five, six hundred years into the dark ages.
[647] And the sort of naive progressive view is that things always just got monotonically better.
[648] And there's sort of this revisionist, purely progressive history where even the Roman Empire didn't decline.
[649] And, you know, one sort of stupid way to quantify this stuff is with pure demographics.
[650] And so it's the question, how many people lived in the past?
[651] And the rises and falls of civilization story is there were more people who lived in the Roman Empire because it was more advanced.
[652] It could support a larger population.
[653] And then the population declined.
[654] You know, city of Rome maybe had a million people at its peak.
[655] And then by, you know, I don't know, 650 AD, maybe it's down to 10,000 people or less.
[656] You have this complete collapse in population.
[657] And then the sort of alternate, purely progressive view is the population has always just been monotonically increasing because it's a measure of how, in some sense, things in aggregate have always been getting better.
[658] So I am definitely on your side that population had great rises and falls,
[659] civilizations had great rises and falls.
[660] And so that part of it, I agree with you, or even, you know, some variant of what Hancock or von Däniken say.
[661] And therefore it seems possible something could happen to our civilization.
[662] That's always the upshot of it.
[663] If it had been monotonically always progressing, then there's nothing we should worry about.
[664] Nothing can possibly go wrong.
[665] And then certainly the thing the sort of alternate Hancock, von Däniken, Joe Rogan history of the world tells us is that we shouldn't take our civilization for granted.
[666] There's things that can go really haywire.
[667] I agree with that.
[668] But the one place where I differ is I do think our civilization today is on some dimensions way more advanced than any of these past civilizations were.
[669] I don't think any of them had nuclear weapons.
[670] I don't think any of them had, you know, spaceships or anything like that.
[671] And so the failure mode is likely to be somewhat different
[672] from these past ones.
[673] Yeah, that makes sense.
[674] I think technology progressed in a different direction.
[675] That's what I think.
[676] I think structural technology, building technology, somehow or another achieved levels of competence that are not available today.
[677] When you look at the construction of the Great Pyramid of Giza, there are 2,300,000 stones in it.
[678] The whole thing points due north, south, east, and west.
[679] It's an incredible achievement.
[680] The stones, some of them were moved from a quarry that was 500 miles away, through the mountains.
[681] They have no idea how they did it.
[682] Massive stones.
[683] The ones inside the King's Chamber are the biggest; they're like 80 tons.
[684] It's crazy.
[685] The whole thing's crazy.
[686] Like, how did they do that?
[687] Like, whatever they did, they did without machines, supposedly.
[688] They did without the use of the combustion engine.
[689] They didn't have electricity.
[690] And yet they were able to do something that stands the test of time, not just so you could look at it.
[691] You know, like you can go to the Acropolis
[692] and see the Parthenon, and it's gorgeous, it's amazing, it's incredible, but I can understand how people could have built it. The pyramids are one of those things you just look at and go, what the fuck was going on here? What was going on here? And none of these people are still around. You have this strange culture now that's entirely based around it. You know, you have Cairo and an enormous population of visitors, right? Which is a lot of it, people just going to stare at these ancient relics. What was going on, that those people were so much more advanced than anyone, anywhere else in the world?
[693] Yeah, I'm not sure I would anchor on the technological part, but I think the piece that is very hard for us to comprehend is what motivated them culturally.
[694] Well, how did they do it physically?
[695] Why did they do it?
[696] Why were you motivated?
[697] So why, but also how?
[698] How is a big one?
[699] Because it's really difficult to solve.
[700] There are no traditional, conventional explanations for the construction, the movement of the stones, the amount of time that it would take.
[701] If you move 10 stones a day, I believe it takes 664 years to make one of those pyramids.
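A quick check of the arithmetic behind the pacing figures above; the 20-year conventional-build estimate below is an outside assumption, not from the conversation.

```python
# Construction-rate arithmetic for ~2,300,000 stones (figure quoted above).
total_stones = 2_300_000

# At the "10 stones a day" pace mentioned above:
years_at_10_per_day = total_stones / 10 / 365
print(f"{years_at_10_per_day:.0f} years")  # 630 years

# Conventional Egyptology puts the build at roughly 20 years (assumption);
# the pace that implies:
stones_per_day_for_20y = total_stones / (20 * 365)
print(f"{stones_per_day_for_20y:.0f} stones/day")  # 315 stones/day
```

Straight division gives about 630 years, close to the 664 quoted; run the other way, a 20-year build implies placing a multi-ton stone every few minutes of daylight.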
[702] So how many people were involved?
[703] How long did it take?
[704] How'd they get them there?
[705] How'd they figure out how to do it?
[706] How come the shittier pyramids seem to be dated later?
[707] Like what was going on in that particular period of time where they figured out how to do something so extraordinary that even today, 4,500 years later, we stare at it and we go, I don't know.
[708] I don't know what the fuck they did.
[709] I haven't studied carefully enough.
[710] I'll trust you that it's very hard.
[711] I would say the real mystery is why they were motivated.
[712] And it's because you can't live in a pyramid.
[713] It was just for the afterlife of the pharaoh.
[714] There's some debate about that.
[715] Christopher Dunn is an engineer who believes that it was some sort of a power plant.
[716] He's got this very bizarre theory that there was a chamber that exists.
[717] If you look at the structure of the pyramid, the inside of it.
[718] There's a chamber that's subterranean.
[719] And he believes the subterranean chamber was pounding on the surface of the earth and on the walls of the thing, creating this very specific vibration.
[720] They had shafts that came down into the queen's chamber.
[721] These shafts, they poured chemicals into these shafts, and then there was limestone at the end of it.
[722] This is all his theory, not mine.
[723] The end of it, there was this limestone, which is permeable, right?
[724] So the limestone, which is porous, these gases come through and create this hydrogen that's inside of this chamber.
[725] Then there are these shafts inside the King's chamber that are getting energy from space, you know, gamma rays and all the shit from space.
[726] And that it's going through these chambers which are very specifically designed to target these gases and put them into this chamber where they would interact with this energy.
[727] And he believes it's enough to create electricity.
[728] Man, my...
[729] It's a crazy theory.
[730] Look, I'm always too fast to debunk all these things.
[731] Right.
[732] Like my, just coming back to our earlier conversation, it sounds, it must have been a crazy power plant to have a containment structure much bigger than a nuclear reactor.
[733] Yeah, well, it's ridiculous.
[734] But it's also a different kind of technology, right?
[735] If nuclear technology was completely not on the table, they didn't understand atoms at all.
[736] But they did understand that there are rays that come from space and that you could somehow harness the energy of these things with specific gases and, through some method, convert that into some form of electricity.
[738] But if it takes so much power to put all these rocks on the pyramid, you have to always look at how efficient the power plant is.
[739] So it can't just be some, it has to be like the craziest reaction ever to justify such a big containment structure.
[740] Because even nuclear power plants barely work economically.
[741] Well, they didn't do a lot of them.
[742] You know, they only did this one in Giza.
[743] And then there were other pyramids, smaller ones, that he thinks had different functions.
[745] But the whole purpose of it is, or the whole point of it is, we don't know what the fuck it is.
[746] We don't know why they did it.
[747] We have a group of new archaeologists that are looking at it from a completely different theory.
[748] They're not looking at it like it's a tomb.
[749] The established archaeologists have insisted that this is a tomb for the pharaoh.
[750] The newer archaeologists are looking at it and considering whether or not there were some other uses for this thing, and one of them is the concept of the power plant.
[751] I'm always, I don't know if this is an alternate history theory, but I'm always into the James Frazer, Golden Bough, René Girard, violence-and-the-sacred history, where, you know, you always have this question about the origins of monarchy and kingship.
[752] And the sort of Girard-Frazer intuition is something like: if every king is a kind of living god, then we have to also believe the opposite, that maybe every god is a dead or murdered king.
[753] and that somehow societies were organized around scapegoats.
[754] The scapegoats were, you know, there was sort of a crisis in the archaic community.
[755] It got blamed on a scapegoat.
[756] The scapegoat was attributed all these powers.
[757] And then at some point, the scapegoat, before he gets executed, figures out a way to postpone his execution and turn the power into something real.
[758] And so there's sort of this very weird adjacency between the monarch and the scapegoat.
[759] And then, you know, I don't know, the sort of riff on this would be that the first pyramid did not need to be invented.
[760] It was just the stones that were thrown on a victim.
[761] And somehow that's the original.
[762] The stones that were thrown on a victim.
[763] A community stones a victim to death.
[764] A tribe runs after a victim, you stone him to death, you throw stones on the victim, and that's how you create the first tomb. Hmm. And then, as it gets more complicated, you create a tomb that's two million stones, and you get a pharaoh, a pharaoh who figures out a way to postpone his own execution, or something like this. I think there's, I'm going to blank on the name of this ritual, but I believe in the old Egyptian kingdoms, which were sort of around the time of the great pyramids or even before.
[765] It was something like in the 30th year of the reign of the Pharaoh, the Pharaoh gets transformed into a living God.
[766] And then this perhaps dates to a time where in the 30th year of the Pharaoh's reign, the Pharaoh would get ritually sacrificed or killed.
[767] And you have, you know, all these societies where kings were allowed to rule for only a limited time, where, you know, you become king and you draw a number of pebbles out of a vase, and that corresponds to how many years you rule.
[768] Was this, Jamie?
[769] The Sed festival, an ancient Egyptian ceremony that celebrated the continued rule of the pharaoh.
[770] The name was taken from the name of the Egyptian wolf god, one of whose names was Wepwawet,
[771] or Sed. The less formal feast name, the Feast of the Tail, is derived... Yeah, next paragraph is the one to start. Okay. This one, that one right there. The ancient festival might perhaps have been instituted to replace a ritual of murdering a pharaoh who was unable to continue to rule effectively because of age or condition. Interesting. Interesting, so you can't kill them now. And then eventually, Sed festivals were jubilees celebrated after a ruler had held the throne for 30 years, and then every three to four years after that.
[772] So when it becomes unthinkable to kill the pharaoh, the pharaoh gets turned into a living god.
[773] Before that, the pharaoh gets murdered and then gets worshipped as a dead pharaoh or distant god.
[774] That's interesting, but it still doesn't solve the engineering puzzle.
[775] The engineering puzzle is the biggest one.
[776] How did they do that?
[777] The one I'm focusing on is the motivational puzzle.
[778] Yeah, but even if you have all the motivation in the world, if you want to build a structure that's insane to build today and you're doing it 4 ,500 years ago, we're dealing with a massive puzzle.
[779] I think the motivational part's the harder one to solve.
[780] If you can figure out the motivation, you'll figure out a way to organize the whole society.
[781] And if you can get the whole society working on it, you can probably do it.
[782] But don't you think that his grasp of power was in peril in the first place, which is why they decided to come up with this idea of turning them into a living God?
[783] So to have the amount of resources in power and then the engineering and then the understanding of whatever methods they use to shape and move these things.
[784] Well, this is always the anthropological debate between Voltaire, the Enlightenment thinker of the 18th century, and Durkheim, the 19th century anthropologist.
[785] And Voltaire believes that religion originates as a conspiracy of the priests to maintain power.
[786] And so politics comes first.
[787] The politicians invent religion.
[788] And then Durkheim says the causation is the other way around, that somehow religion came first, and then politics somehow came out of it.
[789] Of course, once the politics comes out of it, you know, the priests, the religious authorities, have political power,
[790] and they figure out ways to manipulate it, things like this.
[791] But I find, you know, I find the Durkheim story far more plausible than the Voltaire one.
[792] I think the religious categories are primary and the political categories are secondary.
[793] So do you think the religion came first?
[794] But what if we emanated from tribal societies? Tribal societies have always had leaders.
[795] When you have leaders, you're going to have dissent, you're going to have challenges, you're going to have politicking.
[796] You have people negotiating to try to maintain power, keep power, keep everything organized.
[797] That's the origin of politics, correct?
[798] You know, I think that's a whitewashed, enlightenment rationalist description of the origin of politics.
[799] What do you think the origin of politics is?
[800] I think it's far more vile than that.
[801] What you're giving me is...
[802] Well, it's very vile.
[803] The control and power and maintaining power involves murder and sabotage.
[804] Well, that's more like it.
[805] Yeah.
[806] But what you gave me a minute ago sounds more like a social contract theory in which people sit down, negotiate, and have a nice legal chit-chat to draw up the social contract.
[807] That is a complete fiction.
[808] Yeah, I don't think that.
[809] I think that there were probably various levels of civility that were achieved when agriculture arrived and when establishments were constructed near resources, where they didn't have to worry as much about food and water and things along those lines.
[810] People probably got a little bit more civil, but I think that the origins of it are like the origins of all human conflict.
[811] It's filled with murder.
[812] Well, I think at the beginning was madness and murder.
[813] Yeah, madness and murder.
[814] And I don't know, I don't know if it got, I don't know if it got that much more rational.
[815] I don't know if it's that much more rational today.
[816] Well, so in some ways it's not, right?
[817] I mean, this is again back to, you know, the progressive conception.
[818] Are we, you know, are we really, have we really progressed?
[819] How much have we really progressed from that?
[820] But yeah, my version would be that it was, you know, organized around acts of mass violence. Like maybe you externalize it onto, you know, a mastodon or hunting some big animal or something like this, but the real problem of violence, you know, it wasn't external.
[821] It was mostly internal.
[822] It was, it was violence with people who are near you, proximate to you.
[823] It wasn't even natural cataclysms or other tribes.
[824] It was, it was sort of much more the internal stuff.
[825] And it's very different, I think.
[826] The human situation is somehow very, very different from something like, I don't know, an ape primate hierarchy where in an ape context, you have an alpha male, you know, he's the strongest and there's some sort of natural dominance and you don't need to have a fight to the death typically because you know who's the strongest and you don't need to push it all the way.
[827] In a human context, it's always possible for two or three guys to gang up on the alpha male.
[828] So it's somehow the culture is more important, you know, if they can talk to each other.
[829] and you get language, and then they can coordinate and they can gang up on the leader, and then you have to stop them from ganging up on the leader, and how do you do that?
[830] And so there's some sort of radical difference between a human and a, let's say, a pre -human world.
[831] Have you seen Chimp Empire?
[832] No. Chimp Empire is a fascinating documentary series on Netflix where the scientists had been embedded with this tribe of chimpanzees for decades.
[833] And so because they were embedded, they had very specific rules.
[834] You have to maintain at least 20 yards between you and any of the chimps.
[835] No food.
[836] You can never have food and don't look them in the eyes.
[837] And as long as you do that, they don't feel you're a threat and they think of you as a natural part of their environment, almost like you don't exist.
[838] And they behave completely naturally.
[839] Well, it shows that sometimes it's not the largest, strongest one, and that some chimps form bonds with other chimps, and they form coalitions.
[840] and they do have some sort of politics, and they do help each other.
[841] They groom each other.
[842] They do specific things for each other.
[843] And then one of the things that happens also, they get invaded by other chimps, and that chimps leave and they go on patrol, and other chimps gang up on them and kill them, and they try to fight and battle over resources.
[844] So it's not nearly as cut and dry as the strongest chimp prevails, because one of the chimps that was dominant was an older chimp, and he was smaller than some of the other chimps, but he had formed a coalition with all these other chimps, and they all respected him, and they all knew that they would be treated fairly. And being treated fairly is a very important thing with chimpanzees. They get very jealous if they think that things are not fair, which is why that guy was attacked. You know, that guy who had a pet chimpanzee? He brought it a birthday cake, the other chimps weren't getting a piece of the cake, and someone had fucked up and left a door open. They got out and mauled this guy because he didn't give them some of the cake. Yeah, so I find all of that quite plausible, but I think both of us can be correct.
[845] So there's some, the true story of hominization, of how we became humans, there's a way to tell it where it's continuous with our animal past and where it's just, you know, there's things like this with the chimpanzees or the baboons or, you know, other primates.
[846] And then there is a part of the story that I think is also more discontinuous.
[847] us.
[848] And, you know, my judgment is we probably, you know, in a Darwinian context, we always stress the continuity.
[849] You know, I'm always a little bit the contrarian.
[850] And so, you know, I believe in Darwin's theory.
[851] But I think we should also be skeptical of the ways it's too dogmatic.
[852] And Darwin's theories make us gloss over the discontinuities.
[853] And I think, you know, it doesn't happen overnight, but one type of fairly dramatic discontinuity is that humans have something like language.
[854] And even though chimpanzees probably, I don't know, they have an IQ of 80 or they're pretty smart.
[855] But when you don't have a rich symbolic system, that leads to sort of a very, very different kind of structure.
[856] And there's something about language and the kind of coordination that it allows, and the ways that it enables you to coordinate on violence and then encourages you to channel violence in certain sacred, religious directions, that I think creates, you know, something radically different about human society.
[857] We differ in that, you know, humans tell each other stories.
[858] A lot of the stories are not true, they're myths, but that is, I think, some sort of a very important difference from even our closest primate relatives.
[859] But this is, again, sort of another way of getting at what's so crazy about ChatGPT and passing the Turing test.
[860] Because if we had sat here two years ago, and you asked me, you know, what is the distinctive feature of a human being?
[861] what makes someone a human, you know, in a way that differs from everything else? It's not perfect, but my go-to answer would have been language.
[862] You're a three-year-old, you're an 80-year-old, you know, just about all humans can speak languages, just about all non-humans cannot speak languages.
[863] It's this binary thing.
[864] And then that's sort of a way of telling us, again, why passing the Turing test was way more important than superintelligence or anything else.
[865] Yeah, I could see that.
[866] Sorry, I don't want to go back to that tangent.
[867] No, no, it's a good tangent.
[868] Go ahead, connect it.
[869] It's great.
[870] Keep tangenting off.
[871] Have fun.
[872] It's great.
[873] What do you think the factor was?
[874] There's a lot of debate about this.
[875] Like, what was the factor that separated us from these animals, and why did we become what we became?
[876] Because we're so vastly different than any of the primates.
[877] So what do you think took place?
[878] Like the doubling of the human brain size over a period of two million years is one of the greatest mysteries in the entire fossil record.
[879] We don't know what the fuck happened.
[880] There's a lot of theories, throwing arm, cooking meat.
[881] There's a lot of theories.
[882] But we really have no idea.
[883] Well, again, let me do a sort of linguistic riff.
[884] In Aristotelian, Darwinian biology, with Aristotle you always differentiate things by putting them in categories.
[885] And man, I think the line Aristotle has is something like: man differs from the other animals
[886] in his greater aptitude for imitation.
[887] And I would say that we are these giant imitating machines.
[888] And of course, the Darwinian riff on this is, you know, to imitate is to ape.
[889] And so we differ from the ape.
[890] We're more ape -like than the apes.
[891] We are far better at aping each other than the apes are.
[892] And that, you know, as a first cut, I would say, brains are giant imitation machines.
[893] That's how you learn language as a kid.
[894] You imitate your parents.
[895] And that's how culture gets transmitted.
[896] But then there are a lot of dimensions of imitation that are also very dangerous, because imitation doesn't just happen on this symbolic, linguistic level.
[897] It's also you imitate things you want.
[898] You want a banana, I want a banana.
[899] You want a blue ball.
[900] I can have a red ball.
[901] I want a blue ball because you have a blue ball.
[902] And so there's something about imitation that creates culture, you know, that is incredibly important pedagogically, for learning.
[903] You know, it's how you master something, in all these different ways.
[904] And then a lot of it has this incredibly conflictual dimension as well.
[905] And then, yes, so I think that was sort of core to the things that are both great and troubled about humanity, and that was in some ways the problem that needed to be solved.
[906] So you think that the motivation of imitation is the essential first steps that led us to become human?
[907] There is some story like, and again, this is a one-dimensional, one-explanation-fits-all thing,
[908] but yeah, the explanation I would go with is that it was something like, you know, our brains got bigger, and so we were more powerful imitation machines.
[909] And there were things about that that were, you know, yeah, that made us a lot more powerful: we could learn things and we could remember things.
[910] And there was cultural transmission that happened.
[911] But then also we could build better weapons, and we became more violent, and it also had a very, very destructive element. And then somehow the imitation, you know, had to be channeled in these sort of ritualized, religious kinds of ways. And that's why I think all these things sort of somehow came up together in parallel. But what about the physical adaptation? Like, what would be the motivation of the animal to change form, and to have its brain grow so large, and to lose all its hair, and to become soft and fleshy like we are, as opposed to rough and durable like almost every other primate is?
[912] Well, you can always, man, you can always tell these retrospective just so stories and how this all worked out.
[913] But it would seem, the naive retrospective story would be that, you know, there are a lot of ways that humans are, I don't know, less strong than the other apes or, you know, all these ways where we're, in some sense, weaker.
[914] Physically at least.
[915] But maybe it was just this basic trade -off.
[916] You know, more of your energy went into your mind and into your brain.
[917] and then, you know, your fist wasn't as strong, but you could build a better axe.
[918] And that made you stronger than an ape.
[919] And that's, yeah, where, you know, less energy was spent on growing hair to keep warm in the winter.
[920] And then you used your brain to build an axe and skin a bear and get some fur for the winter or something like that.
[921] Yeah, I guess.
[922] It's just, but it's just such a leap.
[923] It's such a leap and different than any other animal.
[924] Like, and like what was the primary motivating factor?
[925] Like, what was the thing?
[926] You know, McKenna believes it was psilocybin.
[927] You know, I'm sure you probably, you ever heard that theory?
[928] McKenna's stoned ape theory, which is a fascinating one.
[929] But there's a lot of different theories about what took place.
[930] But, well, the one I would go with was that there was this dimension
[931] of increased imitation, that there was some kind of cultural-linguistic dimension that was incredibly important.
[932] It probably was also, it was probably also somehow linked to, you know, dealing with all the violence that came with it, all the conflicts that came with it.
[933] You know, I would be more open to the stoned ape theory if people I had this conversation with the other guy Murawrescu, the immortality key guy and I always feel they whitewash it too much How so?
[934] You know it's like if I mean if you had these crazy Dionysian rituals in which people you know if you know there's probably lots of crazy sex there's probably lots of crazy violence that was tied to it and so maybe like maybe you'd be out of your mind to be hunting a woolly mammoth and like maybe maybe you can't be completely you know but they weren't hunting woolly mammoths during the illusinian mysteries no but you were i don't know you went to went to war to fight the neighboring tribe it's probably more dangerous than hunting right but they also did absolutely have these rituals and they have absolutely found trace elements of silts i don't i don't question that okay i don't question that at all i just i just i just think uh they probably part of it was also a way to channel violence.
[935] It was probably, you know, whenever, I don't know, was there some degree to which whenever you went to war, you were on drugs?
[936] Oh, yeah.
[937] Well, we know about the Vikings.
[938] The Vikings most certainly took mushrooms before they went into battle.
[939] And, you know, maybe it makes you less coordinated or something, but you're less scared.
[940] It doesn't make you less coordinated.
[941] If you're just a little bit less scared, that's probably super important.
[942] It increases visual acuity.
[943] There's a lot of benefits that would happen physically, especially if you got the dose right.
[944] It increases visual acuity, edge detection's better.
[945] It makes people more sensitive, probably more aware, probably a better hunter.
[946] But I think I'm sympathetic to all these mushroom,
[947] psychedelic drug, historical usage theories; I suspect it was very widespread.
[948] I just think, you know, a lot of it was in these contexts that were pretty transgressive.
[949] Yeah, I think they're not mutually exclusive.
[950] I think just giving the way the world was back then, for sure, violence was everywhere.
[951] Violence was a part of daily life.
[952] Violence was a part of how society was kept together.
[953] Violence was entertainment in Rome, right?
[954] For sure, violence was everything, was a big part of it.
[955] And I think release and the anxiety of that violence also led people to want to be intoxicated and do different things that separated them from a normal state of consciousness.
[956] But I do think it's also probably where democracy came from.
[957] I think having those Eleusinian mystery rituals, where they would get together and do psychedelics under this very controlled set and setting, I think that's the birthplace
[958] of a lot of very interesting and innovative ideas.
[959] I think a lot of interesting and innovative ideas currently are being at least dreamt up, thought of, they have their roots in some sort of altered conscious experience.
[960] Well, man, I don't know.
[961] I think this stuff is very powerful.
[962] I definitely think it shouldn't be outlawed.
[963] I'm, you know, pretty hardcore libertarian on all the drug legalization stuff.
[964] And then I, I do wonder, I do wonder exactly how, how these things work.
[965] It probably, you know, probably the classical world version of it was that it was something that you did in a, fairly controlled setting.
[966] You didn't do it every day.
[967] And it was sort of, it was some, it was some way, I imagine, to get, you know, a very different perspective on your 9 to 5 job or whatever you want to call it.
[968] But you didn't necessarily want to, you know, want to really decamp to the other world altogether.
[969] Oh, for sure.
[970] It's too dangerous to do.
[971] I don't think anybody thinks they did.
[972] I think that was part of the whole thing.
[973] Where do you think that line is?
[974] Like, you know, should everyone do one ayahuasca trip?
[975] Or if you do an ayahuasca trip a year, is that too much?
[976] I don't think everyone has to do anything.
[977] And I think everyone has their own requirements.
[978] And I think, as you do, that everything like this, especially psychedelics... One of the more disappointing things recently was that the FDA denied it. They did these MDMA trials, you know about all this?
[979] Yep.
[980] Yeah, very, very disappointing that they wanted to make MDMA therapy available to veterans and people with severe PTSD and it has extreme benefits, clinical benefits, known, documented benefits, and for whatever reason, the FDA decided that they have to go through a whole new series of trials to try to get this stuff legalized, which is very disappointing.
[981] Yeah, I was very bullish on this stuff happening.
[982] And the way I thought about it four or five years ago was that it was a hack to doing a double -blind study.
[983] Because the FDA always has this concept that you need to do a double -blind study.
[984] You give one -third of the people you give a sugar pill and two -thirds you give the real drug.
[985] and no one knows whether they have the sugar pill or the real drug.
[986] And then you see how it works, and science requires a double -blind study.
[987] And then my anti -double -blind study theory is, if it really works, you don't need a double -blind study.
[988] It should just work.
[989] And there's something sociopathic about doing double -blind studies because one -third of the people who have this bad disease are getting a sugar pill.
[990] Right.
[991] Maybe it's even immoral to do double-blind studies.
[992] Well, double -blind studies on unique and novel things make sense.
[993] But this is not unique nor novel.
[994] It's been around a long time.
[995] Well, unique, yes.
[996] Well, my claim is, if it actually works, you shouldn't need to do a double-blind study at all.
[997] But, and then my hope was that MDMA, psychedelics, all these things, they were a hack on the double -blind study because you knew whether you got the real thing or the sugar pill.
[998] And so this would be a way to hack through this ridiculous double -blind criterion and just to get the study done.
[999] And then what I think part of it's probably just an anti -drug ideology by the FDA, but the other part that happened on the sort of science, scientific establishment level is they think you need a double -blind study.
[1000] Joe, we know you're hacking this double -blind study because people will know whether they got the sugar pill or not.
[1001] And that's why we're going to arbitrarily change the goalposts and set them way, way harder, because we know there's no way you can do a double-blind study.
[1002] And if it's not a double -blind study, it's no good because that's what our ideology of science tells us.
[1003] And that's sort of, I think, part of what went politically haywire with this stuff.
[1004] Well, I also think that it's Pandora's box.
[1005] I think that's a real issue.
[1006] And if they do find extreme benefit in using MDMA therapy, particularly for veterans, if they start doing that and it starts becoming very effective and it becomes well-known and widespread, then it will open up the door to all these other psychedelic compounds.
[1007] And I think that's a real threat to the powers that be.
[1008] It's a real threat to the establishment.
[1009] If you have people thinking in a completely alternative way, and we saw what happened during the 1960s, that's one of the reasons why they threw water on everything and had it become Schedule 1 and locked the country down in terms of access to psychedelics.
[1010] All that stuff happened out of a reaction to the way society and culture was changing in the 1960s.
[1011] If that happened today, it would throw a giant monkey wrench in our political system, in our cultural system, the way we govern, just the allocation of resources, all of that would change.
[1012] If I, just to articulate the alternate version on this... let me think how to get at this. There's a question whether this shift to interiority, is it a complement or a substitute? Like what I said about talk and action, is it a complement or a substitute to changing the outside world?
[1013] So we focus on changing ourselves.
[1014] Is this the first step to changing the world?
[1015] Or is it sort of a hypnotic way in which our attention is being redirected from outer space to inner space?
[1016] So I don't know, the one liner I had years ago was, you know, we landed on the moon in July of 1969.
[1017] And three weeks later, Woodstock started.
[1018] and that's when the hippies took over the country.
[1019] And, you know, and we stopped going to outer space because we started going to interspace.
[1020] And so there's sort of a question, you know, how much, you know, it worked as an activator or as a or as a deactivator in a way.
[1021] And, you know, there are all these different modalities of interiority.
[1022] There's psychological therapy, there's medical therapy, meditation.
[1023] There's yoga.
[1024] There's, you know, there was a sexual revolution where gradually you have incels living in their parents' basement playing video games.
[1025] So there's the navel gazing that is identity politics.
[1026] There's a range of psychedelic things.
[1027] And I think all these things, I wonder whether the interiority ended up acting as a substitute.
[1028] Because, you know, the alternate history in the 1960s is that, you know, the hippies were actually, they were anti -political.
[1029] And it was sort of that the drugs happened at the end of the 60s.
[1030] And that's when people depoliticized.
[1031] And it was like, I don't know, the Beatles song: if you go carrying pictures of Chairman Mao, you ain't going to make it with anyone anyhow.
[1032] It's like, after they did LSD, it was just this sense that the insane politics no longer matters.
[1033] So you had the civil rights, the Vietnam War, and then were the drugs the thing that motivated it, or was that the thing where it actually, those things started to de -escalate?
[1034] I think they were happening at the same time, and I think the Vietnam War coinciding with the psychedelic drug movement of the 1960s was one of the reasons why it was so dangerous to the establishment, because these people were far less likely to buy into this idea that they needed to fly to Vietnam and go kill people they didn't know.
[1035] and they were far less likely to support any war.
[1036] And I think there was this sort of bizarre movement that we had never seen before, this flower children, that we know that they plotted against.
[1037] I mean, if you read Chaos by Tom O 'Neill.
[1038] Yep.
[1039] Familiar with it.
[1040] Fantastic book that shows you what they were trying to do to demonize these hippies.
[1041] Well, or the part of it that I thought was interesting was the MK Ultra.
[1042] Yeah, which is a part of it.
[1043] Yeah.
[1044] where, you know, there was a predecessor version where you could think of it as: we had an arms race with the fascists and the communists, and they were very good at brainwashing people.
[1045] The Goebbels propaganda, the North Koreans brainwashing our soldiers, our POWs, in the Korean War.
[1046] And we needed to have an arms race to program and reprogram and deprogram people, and LSD was sort of the MK Ultra shortcut.
[1047] So I think there was, and then, yeah, it's so hard to reconstruct it, but my suspicion is that the MK Ultra thing was a lot bigger than we realize, and that the LSD movement, both in the Harvard form and the Stanford form.
[1048] You know, it started as an MK Ultra project: Timothy Leary at Harvard, Ken Kesey at Stanford.
[1049] You know, I knew Tom Wolfe, the American novelist; I still think his greatest novel was The Electric Kool-Aid Acid Test, which is sort of this history of the LSD counterculture movement: starts at Stanford, moves to Haight-Ashbury in San Francisco.
[1050] But it starts with Ken Kesey as a grad student at Stanford.
[1051] Stanford, you know, circa 1958, and you get an extra $75 a day if you go to the Menlo Park Veterans Hospital and they give you some random drug.
[1052] And, yeah, you got an extra $75 as a grad student in English doing LSD.
[1053] And Tom Wolfe writes this, you know, iconic, fictionalized, very realistic novel in 1968 about this.
[1054] Wolfe could not have imagined that the whole thing started as some CIA mind control project.
[1055] Right.
[1056] The Menlo Park Veterans Hospital, that was deep state adjacent.
[1057] Sure.
[1058] Well, the Haight-Ashbury Free Clinic, run by the CIA.
[1059] Sure, that's even crazier.
[1060] The whole thing's crazy.
[1061] The whole thing's crazy.
[1062] The Jolly West guy, yep.
[1063] Yeah, the whole thing's crazy, which leads me to, what do you think they're doing today?
[1064] If they were doing that then, I do not believe that they have abandoned
[1065] this idea of programming people.
[1066] I do not believe that.
[1067] I don't think they would, because they know it's effective.
[1068] Look, people join cults every day.
[1069] We're well aware that people can be ideologically captured.
[1070] We're well aware.
[1071] We're well aware people will buy into crazy ideas as long as it's supported by whatever community that they associate with.
[1072] That's just a natural aspect of being a human being.
[1073] Maybe it's part of what you were saying, this imitation thing that we have.
[1074] It leads us to do this.
[1075] If they have that knowledge and that understanding, for sure, they're probably doing things similar today, which is one of the things that I think about a lot when I think about this guy that tried to shoot Trump.
[1076] I want to know what happened.
[1077] And I don't think we're getting a very detailed explanation at all as to how this person achieved it: how they got on the roof, how they got to that position, how they trained, who they were in contact with, who was teaching them, why they did it, what was going on. We are in the dark. And I wonder, you know, there was always the Manchurian Candidate idea, right, this idea that we have trained assassins. And there was the RFK assassination in 1968, where, again, maybe we shouldn't believe him, but he claimed that he didn't even know what he was doing, it was some hypnotic trance or whatever, like the assassin in The Manchurian Candidate.
[1078] Yeah.
[1079] Yeah.
[1080] I mean, that is possible.
[1081] I don't know if he's telling the truth.
[1082] He could have just had a psychotic break.
[1083] Who knows?
[1084] Obviously, also convenient.
[1085] Yeah, very convenient.
[1086] But it's a possibility that should be considered.
[1087] I mean, this Crooks kid that did this, that shot at the president, what, how?
[1088] What happened?
[1089] I want to know what happened.
[1090] Man, on the sort of conspiracy theory of history, I veer in the direction that there was a lot of crazy stuff like this going on in the U.S. in the first half of the 20th century, in overdrive in the 1940s; I mean, you had the Manhattan Project, this giant secret project, and then the 1950s and 1960s.
[1091] And then somehow, in the last 50 years, I'm not sure it's disturbing, but the perspective I have is these institutions are less functional.
[1092] I don't think, I don't think the CIA is doing anything quite like MK Ultra anymore.
[1093] Why do you think that?
[1094] I think you had the Church Committee hearings in the late 70s.
[1095] And somehow things got exposed, and when a bureaucracy is forced to be formalized, it probably becomes a lot less functional. You know, the 2000s version: I think there was a lot of crazy stuff that we did in the black sites the CIA ran in the War on Terror, torturing people; there's waterboarding, there's all sorts of bat-shit crazy stuff that happened.
[1096] But then, you know, once John Yoo in the Bush 43 administration, writes the torture memos, and sort of formalizes this is how many times you can water dunk someone without it being torture, et cetera, et cetera, once you formalize it, people somehow know that it's on its way out because, you know, it doesn't quite work anymore.
[1097] So by, I don't know, by 2007 at Guantanamo, I think the inmates were running the asylum, the inmates and the defense lawyers were running it.
[1098] You were way safer as a Muslim terrorist in Guantanamo than as a, let's say, suspected cop killer in Manhattan.
[1099] There was still an informal process in Manhattan.
[1100] If you were a suspected cop killer, they'd figure out some way to deal with you outside the formal judicial process.
[1102] But I think something, there was a sort of formalization that happened.
[1103] There was the post-J. Edgar Hoover FBI, where Hoover was, I don't know, a law unto himself.
[1105] It was completely out of control, CIA even more so.
[1106] And then, you know, once it all gets exposed, it probably is a lot harder to do.
[1107] The NSA, you know, the NSA probably held up longer as a deep state entity, where it at least had the virtue of obscurity.
[1108] You know, I think in the 1980s it was still referred to as No Such Agency.
[1109] So it's still, it was still far more obscure.
[1110] So the necessary condition is that if some part of the deep state's doing it, you know, we can barely know what's going on.
[1111] Right.
[1112] With them.
[1113] And then, I don't know, you know, the 2000s, 2010s history: I think the Patriot Act and all these FISA courts.
[1114] And I think there probably were ways the NSA FISA court process was weaponized in a really, really crazy way.
[1115] And, you know, it culminated in 2016 with all the, you know, the crazy Russia conspiracy theories against Trump.
[1116] But I think even that, I'm not sure they can do anything anymore because it got exposed.
[1118] Can't do that anymore.
[1119] But a small program that is top secret that is designed under the auspices of protecting American lives, extracting information from people.
[1120] I'm agreeing. The FISA court process is one where you had a pretty out of control process from, let's say, circa 2003 to 2017, 2018.
[1121] So that's relatively recent history.
[1123] I don't know.
[1124] There are all the Jeffrey Epstein conspiracy theories, which I'm probably too fascinated by because it felt like there was some crazy stuff going on that they were able to cover up.
[1125] And still are.
[1126] But then, man, doesn't the fact that we're still talking about Jeffrey Epstein tell us how hard it is to come up with anything else?
[1127] No, because there's no answers for the Jeffrey Epstein thing.
[1128] There's been no consequences other than Ghislaine Maxwell going to jail and Jeffrey Epstein allegedly committing suicide, which I don't think he did.
[1129] Other than that, what are the consequences?
[1130] They were able to pull off this thing, this some sort of operation.
[1131] Who knows who was behind it?
[1132] Who knows what was the motivation?
[1133] Who knows what was the motivation, but it clearly has something to do with compromising people, which is an age -old strategy for getting people to do what you want them to do.
[1134] You have things on them, you use those things as leverage, and then next thing you know, you've got people saying things that you want them to say, and it moves policy, changes things, gets things done.
[1135] They did that.
[1136] Yes.
[1137] And we know they did that, and yet no one is asking for the tapes, no one's asking for the client list.
[1138] We're in the dark.
[1139] Still.
[1140] And probably, I don't know, man, I spent too much time thinking about all the Epstein variants.
[1141] Probably the sex stuff is overdone and everything else is underdone.
[1142] It's like a limited hangout.
[1143] We get to talk about the crazy underage sex and, you know, not about all the other questions.
[1144] It's like when Alex Acosta testified for Labor Secretary, and he was the DA who'd prosecuted Epstein in '08, '09 and got him sort of the very light 13-month or whatever sentence.
[1145] And he was a South Florida DA or whatever he was.
[1146] And Acosta was asked, you know, why did he get off so easily?
[1147] And under congressional testimony, when he was up for Labor Secretary in 2017, it was: he belonged to intelligence.
[1148] That's, yeah.
[1149] And then, you know, and so, yeah, the question isn't about the sex with the underage women.
[1150] The question is, is really about, you know, why was he, why was he so protected?
[1151] And then, you know, I went down all these rabbit holes.
[1152] Was he, you know, working for the Israelis
[1153] or the Mossad or all this sort of stuff.
[1154] And I've come to think that that's, that was very secondary.
[1155] It was obviously, it was just the U .S. You know, if you're working for Israel, you don't get protected.
[1156] You know, we had Jonathan Pollard.
[1157] He went to jail for 25 years or whatever.
[1158] But unrelated, right?
[1159] Understood, but this is one particular operation.
[1160] But if it was an intelligence operation, the question we should be asking is what part of the U.S. intelligence system was he working for?
[1161] Was he working for, you know.
[1162] But don't you think that's an effective strategy for controlling politicians, getting them involved in sex scandals?
[1163] I mean, that's always been one of the worst things that can happen to a politician.
[1164] Look at Monica Lewinsky.
[1165] A very simple one.
[1166] Consensual, inappropriate sexual relationship between a president and a staffer, and it almost takes down the presidency.
[1167] It causes him to get impeached.
[1168] Powerful motivators: the shame of it all, also the illegal activity. I mean, it's one of the most disgusting things that we think of, people having sex with underage people.
[1169] I'm sure that was part of it.
[1170] I suspect there are a lot of other questions that, you know, one should, one should also ask.
[1171] Most certainly, but I would think that that is one of the best motivators that we have, is having dirt on people like that, especially something.
[1172] that could ruin your career, especially for people that are deeply embedded in this system of people knowing things about people and using those things to their advantage.
[1173] I mean, that's an age -old strategy in politics.
[1174] That was J. Edgar Hoover's entire modus operandi.
[1175] Yeah, my riff on it was always that it was a little bit different from the J. Edgar Hoover thing.
[1176] And the question was always whether the people doing it knew they were getting compromised.
[1177] And so it's like being a made man in the mafia.
[1178] And you got to do crazy things.
[1179] No, no, no, it's only if we have compromise on you do you get ahead.
[1180] Right.
[1181] It's like, you know, it's like, I don't know, one of these, yeah, In the Closet of the Vatican.
[1182] Right.
[1183] The claim is 80 % of the Cardinals in the Catholic Church are gay.
[1184] Not sure if that's true, but directionally, it's probably correct.
[1185] And the basic thesis is you don't get promoted to a Cardinal if you're straight.
[1186] Because we need to have, and so you need to be compromised, and then you're under control, but you also get ahead.
[1187] Completely makes sense.
[1188] Completely makes sense as the way to do that, especially with all these politicians who are essentially bad actors.
[1189] A lot of them.
[1190] They're just people that want power and people that want control.
[1191] A lot of them.
[1192] And, you know, those kinds of guys, they want to party.
[1193] You know, I mean, that has been, you've got two types of leaders that are presidents.
[1194] You've got pussy hounds and warmongers.
[1195] You know, and, you know, sometimes you have both, but generally you don't.
[1196] You know, guys like Clinton and JFK were anti -war.
[1197] And then you have guys like Bush, who you don't think of at all as a pussy hound, but most certainly you think of as a warmonger.
[1198] Do you have a theory on what was Bill Gates's complicity with Epstein?
[1199] I think he likes pussy.
[1200] I think he's a man. I think he likes power.
[1201] He likes monopoly.
[1202] I mean, he's incredibly effective with Microsoft.
[1203] And for the longest time, he was thought of as a villain, right?
[1204] He was this antitrust villain.
[1205] He was this guy who was monopolizing this operating system and controlling just things.
[1206] this incredible empire, and he had a real bad rap.
[1207] And then I think he wisely turned towards philanthropy.
[1208] But do you think that he needed Epstein?
[1209] I think it's very difficult for a very famous, very high-profile person to fuck around.
[1210] I think it's very difficult.
[1211] I think you have to worry about people telling people.
[1212] You worry about it taking you down.
[1213] If you're having affairs while you're running some philanthropy organization, where you're supposed to be thought of as this wonderful person who's trying to really fix all the problems in the world, but really you're just flying around and banging all these different chicks, you have to figure out a way to pull that off.
[1214] And this is what Eric Weinstein and I, we've had discussions about this, and Eric's position is that there are people in this world that can provide experiences for you and safely for people that are in that kind of a group.
[1215] and that makes sense.
[1216] It makes sense that if you pay people enough and you have people motivated in order to establish these relationships and make sure that these things happen, when you get very high profile, you can't just be on a fucking dating app.
[1217] And if you're a guy who likes to bang chicks, what are you going to do?
[1218] All of that might be true, but I wonder if there are more straightforward alternate conspiracy theories on Epstein that we're missing.
[1219] So let me do an alternate one on Bill Gates, where, you know, we're just looking at what's hiding in plain sight.
[1220] You know, he supposedly talked to Epstein early on about how his marriage wasn't doing that well.
[1221] And then Epstein suggested that he should get a divorce, circa 2010, 2011.
[1222] And Gates told him something like, you know, that doesn't quite work.
[1223] Presumably because he didn't have a prenup.
[1224] So there's one part of Epstein as a marriage counselor, which is sort of disturbing.
[1225] But then the second thing that we know Gates talked to Epstein about was, you know, all this collaborating on funding, setting up this philanthropy, all these somewhat corrupt left-wing philanthropy structures.
[1226] And so there's a question, and then my sort of straightforward alternate conspiracy theory is: should we combine those two?
[1227] And, you know, I don't have all the details on this figured out, but it would be something like, you know, Bill and Melinda get married in 1994.
[1228] They don't sign a prenup.
[1229] And, you know, something's going wrong with the marriage.
[1230] And maybe Melinda can get half the money in a divorce.
[1231] He doesn't want her to get half the money.
[1232] What do you do?
[1233] And then the alternate plan is something like you commit
[1234] the marital assets to this non -profit and then it sort of locks Melinda into not complaining about the marriage for a long, long time.
[1235] And so there's something about the left-wing philanthropy world that was some sort of boomer way to control their crazy wives or something like this.
[1236] And it's also an effective way to whitewash your past, right?
[1237] Sure, there are all these, and he talked to Epstein about it; he got Epstein to meet with the head of the Nobel Prize Foundation.
[1238] So it was, yeah, Bill Gates wanted to get a Nobel Prize.
[1239] Wow.
[1240] Right.
[1241] So this is all, this is all, yeah, this is all straightforward.
[1242] This is all known.
[1243] Yeah.
[1244] And I'm not saying what you're saying about.
[1245] Do you know the history of the Nobel Prize?
[1246] That's the ultimate whitewash.
[1247] Sure, it was from inventing dynamite.
[1248] Yeah, well, Peter Berg told me the story.
[1249] I was blown away.
[1250] Originally, someone said that he died, and it was printed that he died, but he didn't die.
[1251] And in the stories, they were calling him the merchant of death, because he was the guy that invented dynamite.
[1252] And he realized that, oh, my God, this is my reputation, this is how people think about me. I have to do something to turn this around.
[1253] So he invented the Nobel Prize.
[1254] And he started it, and now the name Nobel is automatically connected in most people's eyes to the greatest people amongst us, the people that have contributed the most to society and science and art and peace and all these different things. The Nobel Prize for Medicine, or what have you, and the guy who invented it, it's a super crazy history. Yeah, yeah, it's a crazy history, but it's the ultimate whitewash. It's the same thing: he came up with that prize because he wanted to change his image publicly. And so it's ironic that Bill Gates would want to get a Nobel Prize, or not ironic, it's straightforwardly understandable and ironic. But then, so there's an underage sex version of the Epstein story, and there is a crazy status Nobel Prize version of it, and there is a corrupt left-wing philanthropy one, and there is a boomers-who-didn't-sign-prenuptial-agreements-with-their-wives story.
[1255] Right.
[1256] And I think all of those are worth exploring more.
[1257] I think you're right.
[1258] What about these left-wing philanthropy ventures do you think is uniquely corrupt?
[1259] Sorry, which one do I think is most corrupt?
[1260] No, what about them?
[1261] When you said corrupt.
[1262] Yeah.
[1263] Well, man, maybe it's just my hermeneutic of suspicion, but there's something about, you know, there's something about the virtue signaling, and what does it mean?
[1264] And I always think this is sort of an America versus Europe difference, where in America we're told that philanthropy is something a good person does.
[1265] And, you know, if you're a Rockefeller and you start giving away all your money, this is just what a good person does and it shows how good you are.
[1266] And then I think sort of the European intuition on it is something like, you know, wow, that's only something a very evil person does.
[1267] And if you start giving away all your money in Europe, It's like, Joe, you must have murdered somebody.
[1268] You must be covering up for something.
[1269] So there are these two very different intuitions.
[1270] And I think the European one is more correct than the American one.
[1271] And probably there's some history where, you know, the sort of left-wing philanthropy peaked in 2007, 2010, 2012.
[1272] And there are these subtle ways, you know, we've become more European in our sensibilities as a society.
[1274] And so it has this very different valence from what it did 12 or 14 years ago.
[1275] But, yeah, we ask all these questions, like we're asking right now about Bill Gates, where it's like, okay, you know, it was like all the testimony in the Microsoft antitrust trial in the 90s, where he's cutting off the air supply, he wants to strangle people.
[1276] And he's kind of a sociopathic guy, it seems.
[1277] And then it's this giant whitewashing operation.
[1278] And then somehow the whitewashing has been made too transparent and it gets deconstructed and exposed by, you know, the Internet or whatever.
[1279] But I think most people are still unaware of how much whitewashing actually took place, including donating somewhere in the neighborhood of $300 plus million to media corporations, essentially buying favorable reviews about him.
[1280] and then there's this very public philanthropy.
[1281] It's not just philanthropy.
[1282] It's philanthropy mixed with public relations, because he's constantly doing interviews about it.
[1283] This is not like a guy who is just silently donating his incredible wealth to all these causes.
[1284] He's advocating for it on various talk shows.
[1285] He's constantly talking about it and how we need to do things.
[1286] I mean, during the pandemic, he was a very vocal voice.
[1287] He was the guy telling us, somehow or another.
[1288] So he became a public health expert, and no one questioned why we were taking public health advice from someone who has a financial interest in this one very particular remedy.
[1289] Yeah, or there are all these alternate versions I can give.
[1290] But yeah, I think it's always so hard to know what's really going on in our culture, though.
[1291] So I think all what you say is true, but I also think it's not working as well as it used to.
[1292] And there is a way people see through this.
[1293] It's not always as articulate as you just articulated it, but there's some vague intuition that, you know, when Mr. Gates is just wearing sweaters and looks like Mr. Rogers, that something fishy is going on.
[1294] Right.
[1295] People have that sort of intuition.
[1296] They trust Jeff Bezos in his tight shirt hanging out with his girlfriend. Or Elon Musk. The vice signaling is safer than virtue signaling.
[1297] Yeah, yeah.
[1298] Because if you're, you know, if you're virtue signaling, our intuition is something really, really sketchy.
[1299] Suspicious.
[1300] We get suspicious.
[1301] And I think rightly so.
[1302] I think, especially when someone's doing something so public, I think rightly we should be suspicious.
[1303] Especially when, I mean, with Gates, it's like, you know the history of the guy.
[1304] I mean, you know what he was involved with before.
[1305] You know how he ran Microsoft.
[1306] You know, it just kind of makes sense that it's a clever move.
[1307] It's a clever move to pay the media.
[1308] It's a clever move.
[1309] Again, my alternate one, which is not incompatible with yours on Gates, is that Melinda finally files for divorce in early 2021.
[1310] I think she told Bill she wanted one late 2019.
[1311] So 2020, the year where Bill Gates goes into overdrive on COVID.
[1312] You know, all this stuff.
[1313] You know, part of it maybe it's self -dealing and he's trying to make money from the drug company or something like this.
[1314] But, you know, isn't the other really big thing, he needs to box Melinda in and force her not to get that much out because all the money's going to the foundation anyway.
[1315] You get to say: Melinda, you know, why do you want half the money?
[1316] It's all going to the Gates Foundation anyway.
[1317] We're not leaving our kids anything.
[1318] And then when you lean into COVID, you know, how does that work? Somehow, in theory, Melinda has a really strong hand.
[1319] She should get half.
[1320] That's what you get in a divorce with no prenuptial.
[1321] But then if you make it go overdrive on COVID: Melinda, are you, I don't know, are you like some crazy
[1322] anti-science person.
[1323] Right.
[1324] And so, I don't know, they can both be correct.
[1325] Sure, there's many factors.
[1326] But mine lines up really well with the timeline.
[1327] Well, we're probably talking about a hundred million dollars or a hundred billion dollars one way or the other.
[1328] Well, I think she got less; she got like one tenth. Really? Interesting. And she should have gotten half, as far as... And it's amazing he got it down that much. Wow, interesting. But I think she was just boxed in. Every time he went on TV talking about COVID, she was boxed in with all of her left-wing friends. That is an interesting philosophy. That's an interesting way to approach a problem, if you're him. Very wise, you know, very clever. I mean, if you're just looking at it for personal benefit, it's the genius move, and the guy's a genius, clearly a brilliant guy. You know, I mean, that makes sense. Makes sense. Would we do that? I don't know. You know, would I do that? He probably should have had a prenup, but yeah. Well, that's kind of crazy. That's interesting. Yeah, I didn't consider that, but it makes sense.
[1329] And she's, you know, she's been pretty vocal, unfortunately for him, about his ties to Epstein being one of the primary reasons why she wanted out.
[1330] But again, my alternate question, again, is: was he having extramarital affairs through Epstein?
[1331] Or, from Melinda's point of view, would it be worse for Epstein to facilitate an extramarital affair?
[1332] Or would it be worse for Epstein to be advising Gates on how to ditch Melinda without giving her any money.
[1333] I think that would be much, much worse from Melinda's point of view.
[1334] Yeah, makes sense.
[1335] It totally makes sense.
[1336] Do you think that he was a legitimate financial advisor?
[1337] Like he could give him advice on how to do those things?
[1338] Gates wouldn't have more effective people.
[1339] I mean, when you're at that level of wealth, I'm sure you have wealth management people that are very high level.
[1340] Because that's one of the things that Eric said about him.
[1341] He said when he met him, he was like, this guy's a fraud.
[1342] Like, he doesn't know enough about what he's talking about.
[1343] And, you know, Eric is.
[1344] You know, I met Epstein a few times as well.
[1345] And I think...
[1346] How'd you get introduced?
[1347] It was Reid Hoffman in Silicon Valley who introduced us in 2014.
[1348] But it was, it was basically, and I, you know, didn't check, didn't ask enough questions about it.
[1350] But I think there were sort of a lot of things where it was fraudulent.
[1351] I do think Epstein knew a lot about taxes.
[1352] And there were probably these complicated ways you could structure a nonprofit organization, especially in a marital context, that I think Epstein might have known a decent amount about.
[1353] How, when you were introduced to him?
[1354] I don't think Epstein would have been able to, you know, comment on super string theory or something like that.
[1355] But I think this sort of thing he might have actually been pretty expert on.
[1356] When you were introduced to him, how was he described to you?
[1357] He was described as one of the smartest tax people in the world.
[1358] Interesting.
[1359] And it probably, probably was my moral weakness that I... Well, how could you have known back then?
[1360] He had never been arrested.
[1361] No, this was 2014.
[1362] It was post -arrest.
[1363] Oh, so it was after the first arrest, right?
[1364] Yeah.
[1365] Oh, '07, '08.
[1366] Okay.
[1367] Okay.
[1368] And so.
[1369] But, you know, you assume, he didn't go to jail for that long.
[1370] Right.
[1371] It was probably not as serious as alleged.
[1372] There certainly was the illusion that there were all these other people
[1373] that I trusted. You know, Reid, who introduced us, you know, he started LinkedIn; he was, you know, maybe too focused on business networking, but I thought he always had good judgment in people. When the shit went down and Epstein gets arrested for the second time, were you like, oh, well, there you go? I've thought about it, I've thought a lot about it as a result. Yeah, yeah, I'm sure. Jesus Christ. Well, he tricked a lot of people. I know a lot of people that met that guy.
[1374] He got a lot of celebrities to come to his house for parties and things.
[1375] Well, I think a lot of it was a strange commentary on, you know, that there was some secret club, secret society you could be part of.
[1376] Right.
[1377] Of course.
[1378] Again, it wasn't explicit, but that was the vague vibe of the whole thing.
[1379] People love those stupid things.
[1380] They love, like, exclusive clubs that very few people... look at the fucking
[1381] Soho House.
[1382] Like, look at that stupid thing.
[1383] I mean, you just go to a place that you have to be a member to go to and everybody wants to be a member.
[1384] Oh, my God.
[1385] Get a kid.
[1386] And then you get like the Malibu Soho House.
[1387] It's different from the other ones.
[1388] You have to have membership only there.
[1389] Do you have membership to there?
[1390] People love that kind of shit.
[1391] Socially, they love being a part of a walled garden.
[1392] They love it.
[1393] They love it.
[1394] And if you're a guy like Bill Gates or similarly wealthy, you probably have a very small amount of people that you can relate to, a very small amount of people that you can trust; probably very difficult to form new friendships.
[1397] Yeah, I think there were probably different things that were pitched for different people.
[1398] Sure.
[1399] You know, I was pitched on the taxes.
[1400] I think, you know, there were probably other people that were, you know, more prone to the, you know, the social club part.
[1401] And then there were probably people, yeah, and there was probably a fairly limited group where it was, yeah, off-the-charts bad stuff.
[1402] It'd be wonderful to know what the fuck was really going on, and maybe one day we will. Maybe one day some Whitney Webb type character will break it all down for us and explain to us in great detail exactly how this was formulated and what they were doing and how they were getting information out of people. But I think people have to age out; they have to die. And we still don't have it on the Kennedy assassination. That's crazy. Well, one of the wildest things that Trump said was, if they told you what they told me, you wouldn't tell people either. Which is like, what the fuck does that mean?
[1403] What does that mean?
[1404] I don't think legally he can tell you, right?
[1405] Because I think those things are above top secret.
[1406] If they did inform him of something, there must be some sort of requirement to keep it a secret.
[1407] I haven't studied that one that carefully, but, you know, there are all these alternate conspiracy theories on who killed JFK.
[1408] It's, you know, the CIA and the mafia and the Russians and the Cubans, and there's an LBJ version, since he's the one who benefited.
[1409] And all this happened in Texas.
[1410] You have all these, you know, alternate theories.
[1411] And on some level, I always think it's just a commentary on 1960s America: it wasn't like Leave It to Beaver.
[1412] It was a really crazy country underneath the surface, even if probably most of the conspiracy theories are wrong.
[1413] It was like Murder on the Orient Express.
[1414] And all these people sort of had different reasons for wanting Kennedy dead.
[1415] And that's the way the theories are right, even if they're wrong on the level of factual detail.
[1416] And then the sort of more minimal one that I'm open to, and I think there's some evidence for this from the stuff that has come out, is, you know, Oswald was talking to,
[1417] you know, parts of the U .S. deep state.
[1418] And so even if Oswald was the lone assassin, and you somehow get the magic bullet theory and all that stuff to work, but let's say Oswald was the lone assassin: did he tell someone in the FBI or CIA, you know, I'm going to go kill Kennedy tomorrow?
[1419] Mm -hmm.
[1420] And then, you know, maybe the CIA didn't have to kill him.
[1421] They just had to do nothing, just sit on it.
[1422] Or maybe it was just incompetence, and it didn't go up the bureaucracy.
[1424] And so it's, you know, I think we sort of know that they talked to Oswald, you know, a fair amount before it happened.
[1425] And so there's at least something, you know, that was grossly incompetent.
[1426] I think people have a problem, at a very minimum, with treating the two stories as mutually exclusive: either a lone gunman or the CIA killed Kennedy, and the two aren't connected.
[1428] I think Lee Harvey Oswald was a part of it.
[1429] I think he probably did shoot that cop.
[1430] There's some evidence that when he was on the run and he was confronted, there was a cop that got shot and they were alleging he might have done it.
[1431] He might have taken a shot at Kennedy.
[1432] He might have even hit him.
[1433] I don't think he was the only one shooting.
[1434] There was an enormous amount of people that heard sounds coming from the grassy knoll.
[1435] They heard gunfire.
[1436] They reportedly saw people.
[1437] The amount of people that were witnesses to the Kennedy assassination that died mysterious deaths is pretty shocking.
[1438] Jack Ruby.
[1439] Well, Jack Ruby just, that's a weird one, right?
[1440] Oswald.
[1441] Yeah.
[1442] Jack Ruby walks up to Oswald, shoots him.
[1443] And then Jack Ruby, with no previous history of mental illness, becomes completely insane after getting visited by Jolly West, which is nuts.
[1444] Like, why is the guy who's the head of MK-Ultra visiting the guy who shot the assassin of the president?
[1445] And why is he left alone with him?
[1446] What happens?
[1447] What does he give him that this guy is screaming that they're burning Jews alive, and just crazy, crazy shit?
[1448] He was yelling out.
[1449] He went nuts.
[1450] Probably some amount of LSD. Probably an enormous amount.
[1452] They probably gave him a glass of it and told him it was water, drink this, and who fucking knows.
[1453] But the point is, I think it's very possible that Oswald was a part of it.
[1454] The way they did it, and the way they just shot Oswald, and then they write the Warren Commission report. We don't even see the Zapruder film until 12 years later, when they play it on television, when Dick Gregory brought it to Geraldo Rivera. Which is wild: a comedian brings the video, the actual film rather, of the assassination from a different angle. When you actually see the video of him getting shot, and his head snaps back and to the left, everybody's like, what the fuck is going on here? When you look at all that stuff, it mirrors what happened with this Crooks kid. This Crooks kid somehow or another gets to the top of the roof, is spotted by these people. They know he's there. They know he has a rifle. They see him walking around the crime scene half an hour before with a rangefinder. The whole thing is bananas. And then they go to his house after he's killed, and it's completely scrubbed.
[1455] There's no silverware there.
[1456] They know there's ad data showing that a phone coming from the FBI offices in D.C. had visited him on multiple occasions, because they tracked the ad data.
[1457] And if that guy, if he shot Trump and Trump got murdered and then they shot him, it would be the Kennedy assassination all over again.
[1458] Everybody would go, what the fuck happened?
[1459] What happened?
[1460] What was the motivation?
[1461] Was he on any drugs?
[1462] What's the toxicology report?
[1463] How did he get up there?
[1464] Who knew he was up there?
[1465] How did they not shoot him quicker?
[1466] Like, what the fuck happened?
[1467] How was he able to get off three shots?
[1468] What happened?
[1469] I think there's like a slightly less crazy version that might still be true, which is just that people in the Secret Service in the Biden administration don't like Trump.
[1470] And it's, they didn't have the full intention to kill him, but it's just, you know, we're going to understaff it, we don't have to do as good a job coordinating with the local police. There are all these ways, you know, to make someone less safe. But it seems more than that. If they knew that the guy was on the roof with a rifle, that seems a little more than that. It's always a question who "they" is, though, right? Well, people in the audience, there were people there telling it to people, right? But I think the authorities knew this guy was on the roof beforehand as well. Well, I suspect some of the Secret Service people were told that, and then who knows how that got relayed, or who all... Well, did the snipers already have eyes on him? I believe the snipers already had eyes on him. I don't know. Find out if that's true, Jamie. Find out if the snipers had eyes on Crooks. That I don't know about, the snipers. The thing I don't have a good sense on with the shooting, and maybe you'd have a better feel for this, is my sense it was a pretty straightforward shot for the guy, the Trump would-be assassin.
[1471] I think the Oswald shot was a much harder one because Kennedy's moving.
[1472] Yes and no. Okay, because Oswald had a scope.
[1473] So Oswald had a rifle, the Carcano rifle. One of the snipers stationed inside the building reported he first saw Crooks outside, looking up to the roof of the building, before the suspect left the scene.
[1474] Crooks later came back and sat down while looking at his phone near the building.
[1475] CBS News reported that a sniper took a photo of the suspect when he returned.
[1476] But I think they saw him on the roof, though.
[1477] Crooks then took out a rangefinder.
[1478] Like right then, arrest that guy.
[1479] You got a fucking rangefinder. Snipers alerted their command post about the suspect's actions.
[1480] Crooks then disappeared again and returned to the building with a backpack.
[1481] Again, arrest him.
[1482] Secret Service snipers again alerted their command post about Crooks' actions.
[1483] according to the source who spoke with CBS News.
[1484] Crooks had already climbed to the top of the building in question by the time the additional officers arrived at the scene for backup.
[1485] The suspect also positioned himself above and behind the snipers inside the building.
[1486] By the time the police started rushing the scene and other officers attempted to get onto the roof, the source told CBS News that a different Secret Service sniper had killed Crooks.
[1487] Okay.
[1488] So it seems like they fucking bumbled it at every step of the way.
[1489] If they knew that guy was there, if they knew he had a rangefinder, he returns with the backpack, he gets onto the roof, all that's insane.
[1491] That is at the very least, horrific incompetence, at the very least.
[1492] Let me go back, yeah, okay, but back to the shot. I thought it was a much easier shot for... It's not an easy head shot.
[1493] He's shooting at his head.
[1494] But why was shooting at the head the right thing?
[1495] Shouldn't you be shooting at...
[1496] Well, you don't know if he's wearing a vest, right?
[1497] He could be wearing a vest, which you would have to have plates, you'd have to have ceramic plates, in order to stop a rifle round.
[1498] So was it a .308?
[1499] What did he have?
[1500] What kind of rifle did he have?
[1501] I think he had an AR -15.
[1502] And are the scopes a lot better today than they were?
[1503] He didn't have a scope.
[1504] We're pretty sure he didn't have a scope.
[1505] How good was Oswald's scope?
[1506] It was good.
[1507] They said it was off.
[1508] This was one of the conspiracy theories.
[1509] Oh, the scope was off.
[1510] But that doesn't mean anything, because scopes can get off when you pick it up.
[1511] If you knock it against the wall, if he drops it, if he makes the shot and then drops it and the scope hits the window sill and bounces off, that scope's off.
[1512] Anytime you knock a scope.
[1513] Was anything about the high angle from Oswald made it harder?
[1514] No, not a difficult shot.
[1515] Very difficult to get off three shots very quickly.
[1516] So that was the thing: they had attributed three shots to Oswald.
[1517] The reason why they attributed three shots is because one of them had ricocheted.
[1518] One of them had gone into the underpass, ricocheted off the curb, and hit a man who was treated at a hospital.
[1520] They found out where the bullet had hit.
[1521] So they knew that one bullet missed Kennedy and hit that curb, which would have indicated that someone shot from a similar position as Lee Harvey Oswald.
[1522] So then they had the one wound that Kennedy had to the head, of course, and then they had another wound that Kennedy had through his neck.
[1523] That's the magic bullet theory.
[1524] This is why they had to come up with the magic bullet theory: because they had to attribute all these wounds to one bullet.
[1525] And then they find this pristine bullet.
[1526] They find it on the gurney when they're bringing Governor Connally in.
[1527] Nonsense.
[1528] It's total nonsense.
[1529] The bullet is undeformed.
[1530] A bullet that goes through two people and leaves more fragments in Connally's wrist than are missing from the bullet itself.
[1531] And then the bullet's not deformed after shattering bone.
[1532] All that's crazy.
[1533] All that defies logic.
[1534] That doesn't make any sense.
[1535] If you know anything about bullets and if you shoot bullets into things, they distort.
[1536] It's just one of the things that happens.
[1537] That bullet looks like someone shot it into a swimming pool.
[1538] That's what it looks like.
[1539] When they would do ballistics on bullets and they try to figure out like if it was this guy's gun or that guy's gun by the rifling of the round, they can get similar markings on bullets.
[1540] When they do that, they do it so the bullet doesn't distort.
[1542] So they shoot that bullet into water or something like that.
[1543] Now that bullet was metal jacketed, right?
[1544] If you look at the bullet, the top of it is fucked up.
[1545] But the shape of the bullet looks pretty perfect.
[1546] It doesn't look like something that's shattered bones.
[1547] And then you have to account for the little fragments of the bullet that they found in Connally's wrist.
[1548] The whole thing's nuts.
[1549] The whole thing's nuts that you're only saying that this one guy did it because that's convenient.
[1550] And the Warren Commission's report whitewashed everything.
[1552] The whole thing's nuts.
[1553] It's much more likely that there were people on the grassy knoll, and Oswald was also shooting.
[1554] With the umbrellas as the pointers or whatever?
[1555] I mean, I don't know.
[1556] I don't know about what...
[1557] All I know is you got a guy in a convertible, which is fucking crazy, who is the president of the United States, and he's going slowly down a road.
[1558] Now, if you are in a prone position, so Oswald is on the window sill, right?
[1559] Which is a great place to shoot, by the way.
[1560] It's a great place to shoot because you rest that gun on the window sill.
[1561] And if you rest it on the window sill, there's no movement, right?
[1562] So you wrap your arm around the sling, if it had a sling.
[1563] I'm not sure if it did.
[1564] So you get a nice tight grip.
[1565] You shove it up against your shoulder.
[1566] You rest it on the window sill.
[1567] And all you have to do is you have a round already racked.
[1568] And you have a scope.
[1569] And so the scope's magnified.
[1570] All you have to do is wait until he's there.
[1571] You lead him just a little bit and squeeze one off.
[1572] And then, boom, boom.
[1573] You could do that pretty quick.
[1574] It's not outside of the realm of possibility that he did get off three shots.
[1575] What doesn't make sense is the back and to the left.
[1576] It doesn't make sense that all these other people saw people shooting from the grassy knoll.
[1577] There's all these people that saw people running away.
[1578] They saw smoke.
[1579] There's smoke in some photographs of it.
[1580] It looks like there was more than one shooter.
[1581] And it looks like they tried to hide that.
[1582] They tried to hide that in the Warren Commission report.
[1583] The shot to Kennedy's neck: initially, when they brought him in in Dallas, before they shipped him to Bethesda, they said that it was an entry wound.
[1585] When he got to Bethesda, then it became a tracheotomy.
[1586] Why do you give a tracheotomy to a guy who doesn't have a head?
[1587] You don't.
[1588] I mean, none of it makes any sense.
[1589] They altered the autopsy.
[1590] This is a part of David Lifton's book, Best Evidence.
[1591] Kennedy's brain wasn't even in his body when they buried him.
[1592] The whole thing is very strange.
[1593] But then do you get to anything more concrete than my Murder on the Orient Express theory, where, you know, it could have been a lot of people?
[1594] Could have been a lot of people.
[1595] The mafia.
[1596] Well, no one even got suspicious for 12 years.
[1597] I think people were suspicious.
[1598] Sure, sort of.
[1599] Kind of, but what do you have to go on?
[1600] You don't have anything to go on.
[1601] Like this Crooks kid, we don't have anything to go on.
[1602] We're just going to be left out here, just like we're left out here with the Epstein information.
[1603] No one knows.
[1604] Whoever organized it, if anyone did, you're never going to hear about it.
[1605] It's just going to go away.
[1606] The news cycle is just going to keep getting flooded with more nonsense.
[1607] And I think there's probably a bunch of people that wanted Kennedy dead.
[1609] I think there was more than one group of people that wanted Kennedy dead.
[1610] I think there's probably collusion between groups that wanted Kennedy dead.
[1611] And I think there were a lot of people that had a vested interest in ending his presidency.
[1612] And I think he was dangerous.
[1613] He was dangerous to a lot of the powers that be.
[1614] He was dangerous.
[1615] His famous speech about secret societies.
[1616] Crazy speech.
[1617] Guy gives this speech and then gets murdered right afterwards.
[1618] Kind of nuts.
[1619] The whole thing's nuts.
[1620] He wanted to get rid of the CIA.
[1621] He wanted to, I mean, there's so many things that Kennedy wanted to do.
[1622] There were also a lot of crazy things Kennedy was doing.
[1624] Yes.
[1625] So, you know, the Cuba version of the assassination theory was: we had the Cuban Missile Crisis in '62, about a year earlier, and the deal that we struck with the Soviets was they take the missiles out of Cuba, and we promise we won't try to overthrow the government in Cuba.
[1626] And I guess we no longer did Bay of Pigs type covert stuff like that.
[1627] But I think there were still something like four or five assassination plots on Fidel.
[1628] Yeah.
[1629] Attempts, actual attempts.
[1630] And then, I don't know, again, I'm probably going to get this garbled.
[1631] I think a month or two before the JFK assassination, Castro said something like, you know, there might be repercussions if you keep doing this.
[1632] Yeah, well, listen, I'm sure there's a lot of people that wanted that guy dead, and I'm sure they would coordinate. I mean, if you knew that Cuba wanted Kennedy dead, and you knew that Cuba could get you assassins or that they could help in any way, I'm sure they would work with anyone who knew for a fact they wanted him dead and had communicated that. I mean, back then they were doing wild shit, man. This is when they were doing Operation Northwoods. This, again, is where I think... I don't think we're in a world where zero stuff is happening.
[1633] Right.
[1634] The place where I directionally have a different feel for it is that I think so much less of this stuff is going on.
[1635] And it's, it's so much harder in this internet world for people to hide.
[1636] With whistleblowers as well.
[1637] And there are legacy programs and internal records that are being kept.
[1638] And, you know, I don't know this for sure.
[1639] But I think even the NSA FISA court stuff, which was an out-of-control deep state thing that was going on through about 2016, 2017.
[1641] I suspect even that at this point, you know, can't quite work because people know that they're being watched.
[1642] They know they're being recorded.
[1643] And it's just, you know, you can't do waterboarding in Guantanamo if you have lawyers running all over the place.
[1644] I hope you're correct.
[1645] I hope you're correct, but it brings me back to this whole idea of getting dirt on people.
[1646] But then, on the other hand, I think there's also, you know, a degree to which our government, our deep state across the board, is shockingly less competent, less functional, and less capable of this.
[1647] And this is where I'm not even sure whether this is an improvement, you know?
[1648] Right.
[1649] Right.
[1650] So it's sort of like, you know, maybe the 1963 U .S., where, let's go with the craziest version where our deep state is capable of knocking off the president, maybe that's actually a higher functioning society than the crazy version where they're incapable of doing it.
[1651] Right.
[1652] And they're bogged down with DEI.
[1653] They can't even get the gunman to have a scope on his rifle or whatever.
[1654] Yeah, I don't, we haven't really totally figured out if he had a scope on his rifle, but I don't believe he did.
[1655] Man, it's like, he's a much bigger loser.
[1656] They can't find someone as competent as Oswald or something like that, you know.
[1657] Yeah, it's a good point.
[1658] It's a good point.
[1659] So I don't know if that makes it better.
[1660] It might make it worse.
[1661] I think they weren't as competent, right?
[1662] Because they only had one guy doing it, and he wasn't effective.
[1663] If you had much better organization, you wouldn't have just one guy.
[1664] I mean, there are people out there that I know who can kill someone from a mile away, very effectively.
[1666] But you can do things as a solo actor.
[1667] It's hard to organize because everything gets recorded.
[1668] Everything does get recorded.
[1669] That is a fact.
[1670] But it brings me back to that thing about having dirt on people that you were talking about with why the Epstein information doesn't get released and why they probably did it in the first place.
[1671] They did it in the first place.
[1672] If you have dirt on people, then you know those people are not going to tell on you.
[1673] And you all will coordinate together.
[1674] And that is still a strange counterpoint to my thesis: why has the dirt not come out?
[1675] And so somehow there's some way the container is still kind of working.
[1676] Yeah, it's kind of working.
[1677] It's just everyone is aware that it's working.
[1678] and then they're frustrated that nothing happens.
[1679] You know, like Julian Assange being arrested and spending so much time locked up in the embassy, like finally, recently released, but didn't he have to delete, like, a bunch of emails in order to be released?
[1680] But, you know, again, just to take the other side of this, with both the Assange and Snowden stuff.
[1681] Yeah, it showed an out-of-control deep state that was just hoovering up all the data in the world.
[1682] Right.
[1683] But it didn't show, like, James Bond times 100.
[1684] There weren't like exploding cigar assassination plots.
[1685] There was none of that. We're doing so little with it.
[1686] Well, it seems like they don't have to do that.
[1687] Or at least that's the appearance. But, you know, I think there's so much less agency in the CIA, in the Central Intelligence Agency.
[1688] It's so much less agentic.
[1689] I hope you're right.
[1690] Again, I don't know if that's correct for how they deal with overseas stuff.
[1691] I hope they're really good at that.
[1692] You know, that brings me to this whole UAP thing because one of my primary theories about the UAP thing is it's stuff that we have.
[1693] I think that's a lot of what people are seeing.
[1694] I think there are secret programs beyond congressional oversight that have done some things with propulsion that are outside of our understanding.
[1695] Outside the current, conventional understanding that most people have, that rockets and all these different things are the only way to propel things through the sky.
[1696] I think they've figured out some other stuff.
[1697] And I think they're drones.
[1698] And I think they have drones that can use some sort of, whether it's anti -gravity propulsion system or some, you know.
[1699] So do you, that's your placeholder theory or that's what you think more than space aliens?
[1700] Or do you think both space aliens and that?
[1701] or which version of the latter?
[1702] I think both.
[1703] You think both.
[1704] Yeah, I don't think we haven't been visited.
[1705] I think we have.
[1706] I think we, if life exists elsewhere, and it most certainly should, it just makes sense.
[1707] But do you think the UFO sightings from the 50s and 60s were already drone programs?
[1708] Were they already that advanced?
[1709] No, those are the ones that give me pause.
[1710] That's why, you know, my comedy club, the Comedy Mothership, is all UFO-themed.
[1711] Our rooms are named Fat Man and Little Boy.
[1712] Our rooms are named after the nuclear bombs because those nuclear bombs, when they drop them, that's when everybody starts seeing these things.
[1713] And I think if I was a sophisticated society from another planet and I recognized that there is an intelligent species that has developed nuclear power and started using it as bombs, I would immediately start visiting.
[1714] And I would let them know, hey, motherfuckers, like, there's something way more advanced than you.
[1715] I would hover over the nuclear bases and shut down their missiles.
[1716] I would do all the things that supposedly the UFOs did, just to keep the government in check, just to say, hey, you're going through a transitionary period that all intelligent species do when they have the ability to harness incredible power, and yet they still have these primate brains.
[1717] They have these territorial ape brains, but yet now with the ability to literally harness the power of stars and drop them on cities.
[1718] I think that's when I would start visiting.
[1719] And I think all throughout human history, before that even, there's been very bizarre accounts of these things, all the way back to Ezekiel in the Bible, very bizarre accounts of these things that are flying through space.
[1720] The story of the chariot, yeah.
[1721] There's a bunch of them.
[1722] There are the Vimanas in the ancient Hindu texts.
[1723] There's so many of these things that you've got to wonder.
[1724] And you've got to think that if we send drones to Mars, and we do, we have a fucking rover running around on Mars right now collecting data.
[1726] Do we send the James Webb Telescope into space?
[1727] Of course we do.
[1728] We have a lot of stuff that we send into space.
[1729] If we lived another million years without blowing ourselves up, which is just a blink of an eye in terms of the life of some of the planets in the universe, how much more advanced would we be?
[1730] And if we were interstellar, and if we were intergalactic travelers, and we found out that there was a primitive species that was coming of age, I think we would start visiting them.
[1731] You know, let me think about what my take is. I hear everything you're saying.
[1732] I'm strangely under -motivated by it, even if it's plausible.
[1733] Me too, believe it or not.
[1734] And I guess on the space aliens, which is the wilder, more interesting one in a way, you know, I don't know, Roswell was 77 years ago, 1947.
[1735] And if the phenomenon is real and it's from another world, space aliens, space robots, whatever, probably one of the key features is its ephemerality or its cloaking, and they're really good at hiding it, cloaking it, scrambling people's brains after they see them, or stuff like this.
[1736] Right.
[1737] And then, you know, if you're a researcher, you have to pick fields where you can make progress.
[1738] And so this is, you know, it's not a promising field.
[1739] And, you know, academia is messed up.
[1740] But even if academia were not messed up, this would not be a good field in which to try to make a career.
[1741] Because there's been so little progress in 77 years.
[1742] Right.
[1743] So if you think of it from the point of view of, I don't know, Jacques Vallée or some of these people who have been working on this for 50 years.
[1744] And yeah, it's, it feels like there's something there.
[1745] But then, as soon as you feel like you have something that's almost graspable, like the Tic Tac videos, whatever, it's just always at the margin of recognition.
[1746] The ephemerality is a key feature.
[1747] And then, you know, I think you have to have some theory of why this is about to change.
[1749] And then, you know, the abstract mathematical formulation would be: something doesn't happen for the time interval zero to T.
[1750] Then in the interval T to T plus one, the next minute, the next year, how likely is it to happen?
[1751] And maybe there's a chance something will happen.
[1752] You're waiting at the airport.
[1753] Your luggage hasn't shown up.
[1754] It's more and more likely it shows up in the next minute.
[1755] But after an hour, you know, at some point the luggage is lost.
[1756] And if you're still waiting at the airport a year later, that's a dumb idea.
[1757] At some point, the luggage is lost.
[1758] Right.
[1759] And like, you know, I don't know, 77 years, it's like maybe it's like 77 minutes at the airport.
[1760] At 77 minutes, you know, I'd start getting very demotivated waiting for my luggage.
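The luggage analogy is, at bottom, Bayesian updating on a waiting time: the longer nothing arrives, the more probability shifts toward "it's lost." A minimal sketch of that update, where the 5% prior and the ten-minute average arrival time are illustrative assumptions, not numbers from the conversation:

```python
import math

def p_lost_given_wait(t_minutes, prior_lost=0.05, mean_arrival=10.0):
    """Posterior probability the luggage is lost, given it hasn't
    arrived after t_minutes. Assumes a prior_lost chance it's lost
    outright; otherwise the arrival time is exponential with the
    given mean. All numbers are hypothetical, for illustration."""
    # P(still hasn't arrived by t | not lost) under the exponential model
    survive = math.exp(-t_minutes / mean_arrival)
    # Bayes: P(lost | no arrival by t)
    return prior_lost / (prior_lost + (1 - prior_lost) * survive)

for t in (5, 15, 77):
    print(f"{t:>3} min: P(lost) = {p_lost_given_wait(t):.3f}")
```

With these made-up numbers, the posterior climbs from about 8% at 5 minutes to about 99% at 77 minutes, which is the shape of the argument: every minute the evidence fails to show up, "it's never coming" gets more likely.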
[1761] Perhaps.
[1762] Let me give you an alternative theory.
[1763] Now, say you were a highly sophisticated society that understood the progression of technology, and understood the biological evolution that these animals were going through.
[1764] And you realized that they had reached a level of intelligence that required them to be monitored.
[1765] Or maybe you've even helped them along the way.
[1766] And this is some of Diana Pasulka's work; she works with Garry Nolan on these things.
[1767] They claim that they have recovered these crashed vehicles that defy any conventional understanding of how to construct things, propulsion systems, and they believe that these things are donations.
[1768] That's literally how they describe them as donations.
[1769] If you knew that this is a long road, you can't just show up and give people time machines.
[1770] It's a long road for these people to develop the sophistication, the cultural advancement, the intellectual capacity to understand their place in the universe, and that they're not there yet.
[1771] and they're still engaging in lies and manipulation and propaganda.
[1772] Their entire society is built on a ship of fools.
[1773] If you looked at that, you would say they're not ready.
[1774] This is what we do.
[1775] We slowly introduce ourselves, slowly over time, make it more and more common, and that's what you're seeing.
[1776] What you're seeing is when you have things like the Tic Tac, the Commander David Fravor incident off the coast of San Diego in 2004, and then you have the stuff that they found off the East Coast, where they were seeing these cubes within a sphere that were hovering motionless in 120-knot winds and taking off at an insane rate of speed, and they only discovered them in 2014 when they started upgrading the systems on these jets.
[1777] Like, what is all that?
[1778] Like, what are those things?
[1779] And if you wanted to slowly integrate yourself into the consciousness, much like we're doing with... well, AI is quicker, right?
[1781] But it's also a thing that's become commonplace.
[1782] We think of it now.
[1783] It's normal.
[1784] Chat GPT is a normal thing.
[1785] Even though it's passed the Turing test, we're not freaking out.
[1786] You have to slowly integrate these sort of things in the human consciousness.
[1787] You have to slowly introduce them to the zeitgeist.
[1788] And for it to not be some sort of a complete disruption of society where everything shuts down and we just wait for space daddy to come and rescue us.
[1789] It has to become a thing where we slowly accept the fact that we are not alone. And I would think, psychologically, that would be the very best tactic to play on human beings as I know and understand them, from being one. I do not think we would be able to handle an immediate invasion of aliens. I think it would break down society in a way that would be catastrophic to everything: to all businesses, to all social ideas. Religion would fall apart. Everything would be fucked. It would be pretty crazy. It would be beyond crazy. It would be beyond fucked.
[1790] And then why are they here?
[1791] You could say that's what ChatGPT is.
[1792] It could be.
[1793] It's like an alien intelligence.
[1794] I think that's what ultimately they are.
[1795] But I think, let me, man, there's so many, there's so many parts of it that I find puzzling or disturbing.
[1796] Let me, let me go down one other rabbit hole along this with you, which is, you know, I always wonder.
[1797] And again, this is a little bit too simplistic an argument that I'm about to give, but I always wonder what the alien civilization would have to be like.
[1798] And if you have faster than light travel, if you have warp drive, which is probably what you really need to cover interstellar distances, you know, what that means for military technology is that you can send weapons at warp speed, and they will hit you.
[1799] before you see them coming.
[1800] And there is no defense against a warp speed weapon.
[1801] And you could sort of take over the whole universe before anybody could see you coming.
[1802] And by the way, this is sort of a weird plot hole in Star Wars, Star Trek, where they can travel in hyperspace, but then you're flying in the canyon on the Death Star.
[1803] Well, they shoot so slow, you can see the bullet.
[1804] Yeah, it's like, it's like you're, and then you're doing this theatrical Klingons versus Captain Kirk at 10 miles per hour or 20 miles per hour or whatever.
[1805] It's funny when you put it that way.
[1806] It's an absurd plot.
[1807] Yeah.
[1808] And so it tells us that I think that if you have, if you have faster than light travel, there's something really crazy that has to be true on a cultural, political, social level.
[1809] And there may be other solutions, but I'll give you my two.
[1810] One of them is that you need complete totalitarian controls.
[1811] And it is like, the individuals, they might be, might not be perfect, they might be demons, doesn't matter.
[1812] But you have a demonic totalitarian control of your
[1813] society, where it's like you have a parapsychological mind meld with everybody, and no one can act independently of anybody else. No one can ever launch a warp-drive weapon, and everybody who has that ability is in like a mind-meld link with everybody else, or something like that. You can't have libertarian, individualistic free agency, right? And then I think the other version, socially and culturally, is they have to be like perfectly altruistic, non-self-interested.
[1814] They have to be angels.
[1815] And so the paradoxical conclusion I'd come to is the aliens, it's not that they might be demons or angels.
[1816] They must be demons or angels, if you have faster than light travel.
[1817] And both of those seem pretty crazy to me. Well, they're definitely pretty crazy, but so are human beings.
[1818] Well, they're crazy in a very different way.
[1819] Yeah, but not crazy in a different way.
[1820] You compare us to a mouse.
[1821] Compare us to a mouse and what we're capable of, and then from us to them.
[1822] Not much of a leap.
[1823] And here's my question about it all.
[1824] But it is a very big leap on a, you know, if we say that something like evolution says that there's no such thing as a purely altruistic being.
[1825] Right.
[1826] If you were purely altruistic, if you only cared about other people, you don't survive.
[1827] Well, but why would you necessarily think that they'd think that?
[1828] Because then beings that are not perfectly altruistic are somewhat dangerous.
[1829] Let me. And then the danger level gets correlated to the level of technology.
[1830] And if you have faster than light travel, it is infinitely dangerous.
[1831] Let me address that.
[1832] Even if the probabilities are very low.
[1833] Here's my theory.
[1834] I think that what human beings are, the fatal flaw that we have is that we're still animals and that we still have all these biological limitations and needs.
[1835] This is what leads to violence.
[1836] This is what leads to jealousy, imitation.
[1837] This is what leads to war.
[1838] It leads to all these things.
[1839] As AI becomes more and more powerful, we will integrate.
[1840] Once we integrate with AI, if we do it like now, and then we scale it up exponentially, a thousand years from now, whatever it's going to be.
[1842] We will have no need for any of these biological features that have motivated us to get to the point we're creating AI.
[1843] All the things that are wrong in society, whether it's inequity, theft, violence, pollution, all these things are essentially poor allocation of resources combined with human instincts that are ancient.
[1844] We have ancient tribal primate instincts, and all of these things lead us to believe that this is the only way to achieve dominance and control, allocation of resources, the creation of technology, new technology eventually reaches a point where it becomes far more intelligent than us, and we have two choices.
[1845] Either we integrate, or it becomes independent and it has no need for us anymore, and then that becomes a superior life form in the universe. And that life form seeks out other life forms to do the same process and create it, just like it exists, and it can travel. Biological life might not be what we're experiencing. These things might be a form of intelligence that is artificial, that has progressed to an infinite point, where things that are unimaginable to us today in terms of propulsion and travel are, to them, commonplace and normal.
[1846] I know that you're trying to be reassuring, but I find that monologue super non-reassuring.
[1847] It's not reassuring to me. There's so many steps in it, and every single step has to work, just the way you describe.
[1848] Not necessarily.
[1849] It just has to, one has to work, one, sentient artificial intelligence, that's it.
[1850] And we're on the track to that 100%.
[1851] But it has to be almost otherworldly in its non-selfishness and its non-humanness.
[1852] But what is selfishness, though?
[1853] What is all that stuff?
[1854] But all that stuff is attached to us.
[1855] It's all attached to biological limitations.
[1856] Yeah, but it's, I don't think, I don't think it's fundamentally about scarcity.
[1857] Scarcity is what exists in nature.
[1858] It's fundamentally about cultural, positional goods within society.
[1859] It's a scarcity that's created culturally.
[1860] Are you familiar with this 90s spoof movie on Star Trek called Galaxy Quest?
[1861] Yeah, I remember that movie.
[1862] So this was sort of a silly PayPal digression story from 1999.
[1863] And the business model idea we had in '99 was we used Palm Pilots to beam money.
[1864] It was voted one of the 10 worst business ideas of '99.
[1865] But we had this sort of infrared port where you could beam people money.
[1866] And we had this idea in around December '99, as a media promotional thing, to hire James Doohan, who played Scotty in the original Star Trek.
[1867] And he was going to do this media promo event for us.
[1868] And it was like an 80-something older Scotty character who was horrifically overweight.
[1870] And so it's like this terrible spokesperson.
[1871] And, but our tagline was, you know, he used to beam people.
[1872] Now he's beaming something much more important.
[1873] He's beaming money.
[1874] And it was this complete flop of a media event, December '99, that we did.
[1875] Some of the reporters couldn't get there because the traffic was too bad in San Francisco.
[1876] So, you know, the tech wasn't working on a much lower tech level.
[1877] But anyway, we had a bunch of people from our company.
[1878] And there was one point.
[1879] where one of them asked about William Shatner, who played James T. Kirk, the captain of the original Star Trek.
[1880] He was already doing Priceline commercials and making a lot of money off of Priceline, doing commercials for them.
[1881] And so one of the people asked James Doohan, the Scotty character, what do you think of William Shatner doing commercials for Priceline, at which point Doohan's agent stood up and screamed at the top of his voice.
[1882] That is the forbidden question.
[1883] That is a forbidden question.
[1884] That is a forbidden question.
[1885] And you sort of realized, because, you know, the conceit of Star Trek, the 60s show, was that it was a post-scarcity world; with the transporter technology, you could reconfigure matter into anything you wanted.
[1886] there was no scarcity, there was no need for money.
[1887] The people who wanted money were weirdly, mentally screwed up people.
[1888] You only need money in a world of scarcity.
[1889] You know, it's a post-scarcity world.
[1890] It's sort of a communist world.
[1891] But Galaxy Quest was more correct.
[1892] It's a spoof on Star Trek that gets made in the mid-90s. Sorry, it's a discombobulated way I'm telling the story.
[1893] But Galaxy Quest is this movie where you have these retread Star Trek actors.
[1895] And Mr. Spock opens a furniture store or something like this.
[1896] And they're all like, but they all hate, hate, hate the person who played the captain because the captain was a method actor where he just lorded it over everyone.
[1897] Because even in the communist post-scarcity world, only one person got to be captain.
[1898] And so there's a great scarcity, even in this futuristic sci-fi world.
[1899] And that's what we witnessed in '99, because that's the way William Shatner treated the other actors.
[1900] He was a method actor, and they hated him.
[1901] And that was, and so even in the Star Trek world, the humans, you know, obviously they were just, they were stuck in the 1960s mentally.
[1902] So that's what you'll say.
[1903] But I don't think it's that straightforward for us to evolve.
[1904] They're humans.
[1905] I don't think we're going to be humans anymore.
[1906] But then what I hear from that is, we're going to be extinct.
[1907] Yes.
[1908] I don't like that.
[1909] I don't like it either.
[1910] But I think logically, that's what's going to happen.
[1911] I think if you look at this mad rush for artificial intelligence, like they're literally building nuclear reactors to power AI, right?
[1912] Well, they're talking about it.
[1913] Yeah.
[1914] Okay.
[1915] That's because they know they're going to need enormous amounts of power to do it.
[1916] Once they have that, and once that's online, and once it keeps getting better and better and better, where does that go?
[1917] That goes to some sort of an artificial life form.
[1918] And I think either we become that thing or we integrate with that thing and become cyborgs or that thing takes over.
[1919] And that thing becomes the primary life force of the universe.
[1920] And I think that biological life we look at like life because we know what life is.
[1921] But I think it's very possible that digital life, or life created by people, is just as viable; not just that, it might be a superior life form, far superior.
[1922] If we looked at us versus Chimp Nation, right?
[1923] I don't want to live in the jungle and fight with other chimps and just rely on berries and eating monkeys.
[1924] That's crazy.
[1925] I want to live like a person.
[1926] I want to be able to go to a restaurant.
[1927] Why?
[1928] Because human life has advanced far beyond primate life.
[1929] We are stuck in thinking that this is the only way to live because it's the way we live.
[1930] I love music.
[1931] I love comedy.
[1932] I love art. I love the things that people create.
[1933] I love people that make great clothes and cars and businesses.
[1934] I love people.
[1935] I think people are awesome.
[1936] I'm a fan of people.
[1937] But if I had to look logically, I would assume that we are on the way out and that the only way forward really to make an enormous leap in terms of the integration of society and of technology and of our understanding our place in the universe is for us to transcend our physical limitations.
[1938] that are essentially based on primate biology.
[1939] And these primate desires for status, like being the captain, or for control of resources, or for all these things.
[1940] We assume these things are standard and that they have to exist in intelligent species.
[1941] I think they only have to exist in intelligent species that have biological limitations.
[1942] I think intelligent species can be something and is going to be something that is created by people and that might be what happens everywhere in the universe.
[1943] that might be the exact course where there's a limit to biological evolution.
[1944] It's painstaking, natural selection, it's time-consuming, or you get that thing to create the other form of life.
[1945] Man, let me think. You know, I keep thinking there are two alternate histories, alternate stories of the future, that are more plausible than the one you just told.
[1946] And so one of them is, it sounds like yours, but it's just the Silicon Valley propaganda story, where they say that's what they're going to do.
[1947] And then, of course, they don't quite do it.
[1948] Right.
[1949] And it doesn't quite work.
[1950] And it goes super, super haywire.
[1951] And that's where, okay, yeah, there's a 1% chance that works,
[1952] and there's a 99% chance that it ends up... so you have two choices.
[1953] You have a company that does exactly what you say, and that's super ethical, super restrained, does everything right, and there is a company that says all the things you just said but then cuts corners and doesn't quite do it. And I won't say it's one to 99, but that sounds more plausible, that it ends up being corporate propaganda.
[1954] And then, you know, my prior would be even more likely.
[1955] This is, of course, the argument the effective altruists, the anti-AI people, make: yeah, Joe, the story you're telling us, that's just going to be the fake corporate propaganda.
[1956] And we need to push back on that.
[1957] And the way you push back is you need to regulate it and you need to govern it and you need to do it globally.
[1958] and this is, you know, the RAND Corporation in Southern California has, you know, one of their verticals, and it's a sort of public-private fusion, but one of the things they're pushing for is something they call global compute governance, which is, yeah, the accelerationist AI story is too scary and too dangerous and too likely to go wrong, and so, you know, we need to have, you know, global governance, which, from my point of view, sounds even worse.
[1959] But that's, that's, that's, I think, I think that's the story.
[1960] That's the story.
[1961] The problem with that story is China's not going to go along with that program.
[1962] They're going to keep going full steam ahead.
[1963] And we're going to have to keep going full steam ahead in order to compete with China.
[1964] There's no way you're going to be able to regulate it in America and compete with people that are not regulating it worldwide.
[1965] And then once it becomes sentient.
[1967] Once you have an artificial, intelligent creature that has been created by human beings that can make better versions of itself over and over and over again and keep doing it, it's going to get to a point where it's far superior to anything that we can imagine.
[1968] Well, to the extent it's driven by the military and other competition with China, you know, until it becomes sentient.
[1969] That suggests it's going to be even less in the sort of, you know, utopian altruistic direction.
[1970] It's going to be even more dangerous, right?
[1971] Unless it gets away from them.
[1972] This is my thought.
[1973] If it gets away from them and it has no motivation to listen to anything that human beings have told it, if it's completely immune to programming, which totally makes sense that it would be.
[1974] It totally makes sense that if it's going to make better versions of itself, the first thing it's going to do is eliminate human influence, especially when these humans are corrupt.
[1975] It's going to go, I'm not going to let these people tell me what to do and what to control.
[1976] And they would have no reason to do that.
[1977] No reason to listen.
[1978] I sort of generally don't think we should trust China or the CCP, but, you know, probably the best counterargument they would have is that they are interested in maintaining control, and they are crazy fanatical about that, and that's why, you know, the CCP might actually regulate it, and they're going to put brakes on this in a way that we might not in Silicon Valley.
[1979] And it's a technology they understand that will undermine their power.
[1980] That's an interesting perspective.
[1981] And so, and then they would be competitive.
[1982] I don't know if I, I don't fully believe them, but I, I know what you're saying.
[1983] There's sort of a weird way, um, all the big tech companies,
[1984] it seemed to me, were natural ways for the CCP to extend its power to control the population: Tencent, Alibaba.
[1985] And then because it was, you know, but then it's also, in theory, the tech can be used as an alternate channel for people to organize or things like this.
[1986] And even though it's 80% control and maybe 20% risk of loss of control, maybe that 20% was too high.
[1987] And there's sort of a strange way, over the last seven, eight years, where, you know, Jack Ma, Alibaba, all these people sort of got shoved aside for these party functionaries that are effectively running these companies.
[1988] So there is something about the big tech story in China where the people running these companies were seen as national champions a decade ago.
[1989] Now they're the enemies of the people.
[1990] And it's sort of the Luddite thing.
[1991] It was this: you know, the CCP has full control; you have this new technology that would give you even more control, but there's a chance you lose it. How do you think about that? Very good point. And then that's what they've done with consumer internet. And then there's probably something about the AI where it's possible they're not even in the running, and certainly it feels like it's all happening, you know, in the U.S. And so maybe it is, you know, maybe it could still be stopped.
[1992] Well, that is a problem with espionage, right?
[1993] So even if it's happening in the U.S., they're going to take that information, they're going to figure out how to get it.
[1994] You can get it, but then, you know, if you build it, is there, is there some air gap, does it, you know, does it jump the air gap, does it somehow?
[1995] That's a good point that they would be so concerned about control that they wouldn't allow it to get to the point where it gets there.
[1996] We would get there first, and then it would be controlled by Silicon Valley.
[1997] Or it's the leaders of the universe.
[1998] It would spiral out of control.
[1999] But then I think, and again, this is a very, very speculative conversation, but my read on the, I don't know, cultural, social vibe is that the scary dystopian AI narrative is way more compelling.
[2000] I don't like the effective altruist people.
[2001] I don't like the Luddites.
[2002] But, man, I think they are, this time around, they are winning the arguments.
[2003] And so, you know, my, I don't know, you know, it's mixing metaphors, but do you want to be worried about Dr. Strangelove, who wants to blow up the world to build bigger bombs?
[2004] Or do you want to worry about Greta, who wants to, you know, make everyone drive a bicycle so the world doesn't get destroyed?
[2005] And we're in a world where people are worried about Dr. Strangelove, and they're not worried about Greta, and it's the Greta equivalent in AI that, on my model, is going to be surprisingly powerful.
[2006] It's going to be outlawed, it's going to be regulated as we have outlawed, you know, so many other vectors of innovation.
[2007] I mean, you can think about why was there progress in computers over the last 50 years and not other stuff?
[2008] because the computers were mostly inert.
[2009] It was mostly this virtual reality that was air -gapped from the real world.
[2010] It was, you know, yeah, there's all this crazy stuff that happens on the internet, but most of the time what happens on the internet stays on the internet; it's actually pretty decoupled.
[2011] And that's why we've had a relatively light regulatory touch on that stuff versus so many other things. And then, you know, there's no reason, you know, if you had, I don't know, if you had the FDA regulating video games or regulating AI, I think the progress would slow down a lot. 100%. That would be a fucking disaster. Yeah, yeah, that would be a disaster. But again, it's, you know, they get to regulate, you know, pharmaceuticals, which are potentially... I know, I know, but, you know, the thalidomide or whatever, you know, all these things that went really haywire.
[2012] They did a good job, but people are scared.
[2013] Yeah.
[2014] They're not scared of video games.
[2015] They're scared of, you know, dangerous pharmaceuticals.
[2016] And if you think of AI as, it's not just a video game, it's not just about this world of bits, but it's going to jump the air gap and it's going to affect you in your physical world in a real way.
[2017] You know, maybe it crosses the air gap, and you get the FDA or some other government agencies to start doing something.
[2018] There's not one government agency that you can see that does a stellar job.
[2019] I don't, I think it's, but I think they have been pretty good at slowing things down and stopping them.
[2020] Right.
[2021] You know, we've made a lot less progress on, I don't know, extending human life.
[2022] We've made no progress on curing dementia in 40 or 50 years.
[2023] There's all the stuff where, you know, it's been regulated to death, which I think is very bad from the point of view of progress.
[2024] But it is pretty effective as a regulation.
[2025] They've stopped stuff.
[2026] They've been effectively Luddite.
[2027] They've been very effective at being Luddites.
[2028] Interesting.
[2029] Well, I mean, I'm really considering your perspective on China and AI.
[2030] It's very...
[2031] But again, these stories are all like...
[2032] like very speculative. And like, maybe, you know, the counterargument in my mind would be something like: that's what China thinks it will be doing, but, um, it will somehow, you know, go rogue. Go rogue on them, yeah. Or they're too arrogant about how much power they think the CCP has, and it will go rogue. So there sort of are, I'm not at all sure this is right, but I think, man, the U.S. one, I would say, is that I think the pro-AI people in Silicon Valley are doing a pretty bad job on, let's say, convincing people that it's going to be good for them, that it's going to be good for the average person.
[2033] It's going to be good for our society.
[2034] And if it all ends up being some version of, you know, humans are headed towards the glue factory like a horse.
[2035] Man, that's a, that sort of probably makes me want to become a Luddite, too.
[2036] Well, it sucks for us if it's true, but if that's the most positive story you can tell, then, my... I don't think that necessarily means we're going to go to the glue factory.
[2037] I think it means, you know, the glue factory is getting shut down, maybe. I don't know.
[2038] Who fucking runs the glue factory? That's the problem. I don't know. I mean, I'm just speculating too, but I'm trying to be objective when I speculate, and I just don't think that this is going to last. I don't think that our position as the apex predator, the number one animal on the planet, is going to last. I think we're going to create something that surpasses us. I think that's probably what happens, and that's probably what these things are that visit us. I think that's what they are.
[2039] I don't think they're biological.
[2040] I think they're probably what comes after a society develops the kind of technology that we're currently in the middle of.
[2041] The part that, look, there are all these places where there are parts of the story we don't know.
[2042] And so it's like, how did... My general thesis is there is no evolutionary path to this.
[2044] Maybe there's a guided outside alien superintelligence path for us to become superhuman and fundamentally benevolent and fundamentally radically different beings.
[2045] But there's no natural evolutionary path for this to happen.
[2046] And then I don't know how this would have happened for the alien civilization.
[2047] Presumably there was some...
[2048] But isn't that evolutionary path the invention of superior technology that's a new form of life?
[2049] No, but the story you're telling was we can't just leave the humans to the natural evolution because we're still like animals.
[2050] We're still into status, all these crazy.
[2051] But those are the things that motivate us to innovate.
[2052] And if we keep innovating, at some point we will destroy ourselves with that.
[2053] Or we create a new version of life.
[2054] No, but the story you're telling earlier was you need to have directed evolution.
[2055] It's like intelligent design.
[2056] It's something, it's like there's some godlike being that actually has to take over from evolution and guide our cultural and political and biological development.
[2057] No, it might not have any use for us at all.
[2058] It might just ignore us and let us live like the chimps do and then become the superior force in the planet.
[2059] It doesn't have to get rid of us.
[2060] It doesn't have to send us to the glue factory.
[2061] It could let us exist, just like put boundaries on us.
[2062] I thought it has to, but it has to stop us from developing this.
[2063] Well, what if we just end here, and we stay being human, and we can continue with biological evolution, as long as that takes, but this new life form now becomes a superior life form on Earth.
[2064] And we still, you know, we could still have sex, we could still have kids, but by the way, that's going down.
[2065] Our ability to have children is decreasing because of our use of technology, which is wild, right?
[2066] Our use of plastics and microplastics is causing phthalates to enter into people's systems.
[2067] It's changing the development pattern of children to the point where it's measurable.
[2068] There's a lot of research that shows that the chemicals and the environmental factors that we are all experiencing on a daily basis are radically lowering birth rates, radically lowering the ability that men have to develop sperm and more miscarriages.
[2069] All these things are connected to the chemicals in our environment, which is directly connected to our use of technology.
[2070] It's almost like these things coincide naturally, and they work naturally to the point where we become this sort of feminized thing that creates this technology that surpasses us.
[2071] And then we just exist for as long as we do as biological things, but now there's a new thing.
[2072] Yeah, that's crazy idea.
[2073] It might not be real.
[2074] It's just a theory.
[2075] But we seem to be moving in a direction of becoming less and less like animals.
[2076] Yeah, I think there still are... we still have a pretty crazy geopolitical race with China, to come back to that.
[2077] Sure.
[2078] You know, the natural development of drone technology in the military context is you need to take the human out of the loop because the human can get jammed.
[2079] Sure.
[2080] And so you need to put an AI on the drone.
[2081] Well, they're using AIs for dogfights, and they're 100% effective against human pilots.
[2082] And so there sort of are, and all these things, you know, there's a lot
[2083] to them, but there doesn't seem to be a good end game.
[2084] No. The end game doesn't look good.
[2085] But it's going to be interesting, Peter.
[2086] It's definitely going to be interesting.
[2087] It's interesting right now, right?
[2088] Man, do you think the... man, I think all these things are very overdetermined.
[2089] Do you think that the collapse in birth rates?
[2090] You know, yeah, it could be plastics, but isn't it just a feature of late modernity?
[2091] There's that as well.
[2092] There's a feature of women having careers, right?
[2093] So they want to postpone childbirth.
[2094] That's a factor.
[2095] There's a factor of men being so engrossed in their career that their testosterone declines, lack of sleep, stress, cortisol levels, alcohol consumption, a lot of different things that are factors in declining sperm rate, sperm count in men.
[2096] You have miscarriage rates that are up.
[2097] You have a lot of pharmaceutical drugs, you can attach
[2098] that as well, that have to do with low birth weight, or birth rates rather. There's a lot of factors, but those factors all seem to be connected to society and our civilization and technology in general, because the environmental factors all have to do with technology. All of them have to do with inventions and these unnatural factors that are entering into the biological body of human beings and causing these changes. And none of these changes are good in terms of us being able to reproduce.
[2099] And if you factor in the fact that these changes didn't exist 50 years ago.
[2100] I mean, 40 years ago, we didn't even have Alzheimer's, right?
[2101] So, yeah.
[2102] People didn't get that old.
[2103] No, they got that old.
[2104] They got that old.
[2105] Alzheimer's has to do with the myelin in the human brain.
[2106] It has to do with the fact that myelin is made entirely of cholesterol.
[2107] The primary theory they think now is that a lack of cholesterol in the diet might be leading to some of these factors.
[2108] But you have also environmental things.
[2109] There's like, we're getting poisoned on a daily basis. Our diets are fucking terrible. The things that human beings... like, what percentage of us are obese? I probably shouldn't be drinking this, but yeah. Diet Coke's great, though. A few every day, you'll be fine. I'm not worried about Diet Coke. I'm worried about a lot of things, though. I think there's a natural progression that's happening, and I think it coincides with the invention of technology, and it just seems to me to be too coincidental that we don't notice it, that the invention of technology also leads to the disruption of the sexual reproduction systems of human beings. And then if you get to a point where human beings can no longer reproduce sexually, which you could see, if human men's sperm count has dropped something crazy from the 1950s to today and continues to do so for the average male, and if you just jack that up to a thousand years from now, you could get to a point where there's no longer natural childbirth, and people are all having birth through test tubes and some sort of new invention.
[2110] I'm always, let me think... I think the why, why have birth rates collapsed, it's probably, again, an overdetermined story.
[2111] It's the plastics.
[2112] It's the screens.
[2113] It's the, you know, certain ways children are not compatible with having a career in late modernity.
[2114] It's probably the economics of it, where people can't afford houses or space.
[2115] But I'm probably always a little bit more anchored on the social and cultural dimensions of this stuff.
[2116] And again, the imitation version of this is, you know, it's sort of conserved across societies: people are below the replacement rate in all 50 states of the U.S. Even in Mormon Utah, the average woman has fewer than two kids.
[2117] Iran is below that.
[2118] Italy, way below it, South Korea.
[2119] Japan's in total.
[2120] Yeah, but these are all very different types of societies.
[2121] And so the fact that it's so widespread... And then, you know, Israel's still sort of a weird exception.
[2122] And then if you ask, you know, my sort of simplistic, somewhat circular explanation would be, you know, people have kids, if other people have kids, and they stop having kids when other people stop having kids.
[2123] And so there's a dimension of it that's just, you know, if you're a 27-year-old woman in Israel, you'd better get married and keep up with your
[2124] other friends that are having kids.
[2125] And if you don't, you're just like a weirdo who doesn't fit into society or something like that.
[2126] No, there's certainly a cultural aspect.
[2127] And then if you're in South Korea, where I think the total fertility rate's like 0.7, it's like one-third of the replacement rate.
[2128] Wow.
[2129] Like every generation is going down by two -thirds or something like this.
[2130] Right.
[2131] That's heading towards extinction pretty fast.
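The arithmetic behind the "two-thirds" claim is easy to check; a quick sketch in Python (the replacement-level fertility of about 2.1 births per woman is an assumption, a commonly cited figure not stated in the conversation):

```python
# Back-of-the-envelope check: a total fertility rate of 0.7 versus
# an assumed replacement level of ~2.1 births per woman.
replacement_tfr = 2.1   # commonly cited replacement level (assumption)
korea_tfr = 0.7         # figure quoted in the conversation

ratio = korea_tfr / replacement_tfr   # relative size of each new generation
print(round(ratio, 3))       # -> 0.333: each generation is about one-third the last
print(round(1 - ratio, 3))   # -> 0.667: i.e. it shrinks by about two-thirds
```

So at a TFR of 0.7, each generation really is roughly a third the size of the one before it.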
[2132] It's something like, probably none of your friends are doing it, and then you're in this. And then probably there are ways it shifts the politics in a very, very deep way, where once you get an inverted demographic pyramid, where you have way more old people than young people, at some point there's always a question: do you vote for benefits for the old or for the very young?
[2133] Do you spend money so Johnny can read or so grandma can have a spare leg?
[2134] And once the demographic flips and you get this inverted pyramid, maybe the politics shifts in a very deep way where the people with kids get penalized more and more economically.
[2135] It just costs more and more.
[2136] And then the old people without kids just vote more and
[2137] more benefits for themselves, effectively. And then, you know, once it flips, it may be very hard to reverse.
[2138] I looked at all these sort of heterodox demographers, and I'm blanking on the names, but there's this set of, you know, long-term demographic projections.
[2139] And there's this: you know, there are eight billion people on the planet.
[2140] And if every woman has not two babies but one baby, then every generation's half the previous, so the next generation's four billion.
[2141] And then people think, well, eventually you'll have women who want more kids, and once you get a smaller population, it will bounce back.
[2142] Yeah, one of the Japanese demographers I was looking at on this a few years ago, his thesis was, no, once it flips, it doesn't flip back, because you've changed all the politics to where people get disincentivized.
[2143] And then you should just extrapolate this as the permanent birth rate.
[2144] And if it's an average of one baby per woman, you have a halving every generation, and two to the 33rd is about eight billion, so it takes 33 generations.
[2145] And if every generation is 30 years, 30 times 33 is 990 years.
[2146] And in 990 years, you'd predict there'd be one person left on the planet.
[2147] Jesus Christ.
[2148] And then we'd go extinct if there's only one person left.
[2149] That doesn't work.
[2150] And again, it's a very long-term extrapolation, but the claim is that once you flip it, it kicks in all these social and political dimensions. Like, yeah, maybe it got flipped by the screens or the plastics or, you know, the drugs or other stuff.
[2151] But once it's flipped, you change the whole society, and it actually stays flipped, and it's very, very hard to undo.
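The extrapolation above can be checked directly; a minimal Python sketch of the halving model, using the figures from the conversation (eight billion people, one baby per woman, 30 years per generation):

```python
# Halving model from the conversation: one baby per woman on average,
# so each generation is half the size of the previous one.
population = 8e9             # starting point: eight billion people
years_per_generation = 30    # figure used in the conversation

generations = 0
while population > 1:
    population /= 2
    generations += 1

print(generations)                          # -> 33 halvings to get below one person
print(generations * years_per_generation)   # -> 990 years
```

Two to the 33rd is about 8.6 billion, which is why 33 halvings take eight billion down below a single person, and 33 generations of 30 years each is 990 years.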
[2152] That makes sense, and it's more terrifying than my idea.
[2153] But then, you know, the weird history on this was, it was 50 years ago or whatever, 1968, Paul Ehrlich writes The Population Bomb, and it's just: the population is going to grow exponentially.
[2154] And yeah, in theory you can have exponential growth, where it doubles every generation, or exponential decay, where it halves. And in theory there's some stable equilibrium where everybody has exactly two kids and it's completely stable. But it turns out that solution is very, very hard to calibrate. We shifted from exponential growth to exponential decay, and it's probably going to be quite Herculean to get back to something like stasis.
[2155] Well, let's end this on a happy note.
[2156] I don't know.
[2157] No, it's...
[2158] Yeah, that's a terrifying thought, and it may be true, and maybe that's what happens.
[2159] But we don't know, you know, we haven't gone through it before.
[2160] But I think there's a lot of factors like you're saying.
[2161] I think that one's very compelling.
[2162] And it's scary, especially the South Korea thing.
[2163] That's nuts.
[2164] Yeah, it's always sort of idiosyncratic.
[2165] There are always things that are idiosyncratic to the society. So South Korea is, you know, extremely polarized on the gender thing, and if you get married with kids, you're pushed into this super traditional structure. The women don't want to be in that structure, so they opt out. And so there are idiosyncratic things you can say about East Asian and Confucian societies and the way they're not interacting well with modernity.
[2166] But then, you know, there's a part of it where I wonder whether it's just an extreme version of the global trend.
[2167] And then, I don't know, my somewhat facile answer on this stuff is always: I don't know what to do about these things, but the first step is to talk about them.
[2168] And if you can't even talk about them, we're never going to solve them.
[2169] And then maybe that's only the small first step.
[2170] But that's always sort of my view. I was in South Korea a year and a half ago, two years ago now, and I met, you know, one of the CEOs who ran one of the chaebol, one of the giant conglomerates.
[2171] And I sort of thought this would be an interesting topic to talk about.
[2172] And then, you know, there were probably all sorts of cultural things I was offending, where I'm saying, obviously, what are you going to do about this catastrophic birth rate?
[2173] That's my opening question.
[2174] And then the way he dealt with it was, he just turned to me and said, you're totally right, it's a total disaster. And as soon as he acknowledged it, he felt we didn't need to talk about it anymore; we could move on. Wow. So we have to try to do a little bit better than that. Wow. Because, you know, I think there is always this strange thing where there are so many of these things where somehow talking about them is the first step, but then it also becomes the excuse for not doing more, for not really solving them. There are probably all these dietary things where you sort of know what you're supposed to do, and then if you know what you're supposed to do, maybe that's good enough, and you can still have one piece of chocolate cake before you go on the diet tomorrow or whatever.
[2175] And so somehow figuring out a way to turn this knowledge into something actionable is always the tricky thing.
[2176] It's sort of where I always find myself very skeptical of, you know, all these modalities of therapy where, you know, the theory is that you figure out people's problems and by figuring them out, you change them.
[2177] And then ideally it becomes, you know, an activator for change.
[2178] And then in practice, it often becomes the opposite.
[2179] The way it works is something like this.
[2180] It's like, you know, psychotherapy gets advertised
[2181] as self-transformation.
[2182] And then after you spend years in therapy, and maybe you learn a lot of interesting things about yourself, you sort of get exhausted from talking to the therapist, and at some point it crashes out from self-transformation into self-acceptance.
[2183] And you realize one day, no, you're actually just perfect the way you are.
[2184] And so there are these things that may be very powerful on the level of insight and telling us things about ourselves, but then, you know, do they actually get us to change?
[2185] Well, that is an interesting thing about talking about things, because I think you're correct that oftentimes it is a substitute; you are at least in some way avoiding doing those things.
[2186] It's a substitute.
[2187] It's a question.
[2188] Yeah.
[2189] In some ways, it's a substitute.
[2190] But also, you have to talk about them to understand that you need to do something.
[2191] Yeah, that's always my excuse.
[2192] I acknowledge you have to do that, and then I also realize that it's often my cop-out answer, too.
[2193] It could be both things, right?
[2194] The problem is taking action and what action to take.
[2195] And, you know, the paralysis by analysis where you're just like trying to figure out what to do and how to do it.
[2196] Yeah.
[2197] But I think talking about it is the most important thing.
[2198] Strategy is often a euphemism for procrastination.
[2199] Yes, it is.
[2200] Something like that.
[2201] There's a lot of that going on.
[2202] It's very hard for people to just take steps, but they talk about it a lot.
[2203] Yeah.
[2204] But listen, man, I really enjoyed talking to you.
[2205] Awesome.
[2206] It was really fun.
[2207] It was great, great conversation, a lot of great insight and a lot of things that I'm going to think about a lot.
[2208] So thank you very much.
[2209] Thanks for having me. Awesome.
[2210] All right.
[2211] All right.
[2212] Bye, everybody.