The Joe Rogan Experience XX
[0] Ah.
[1] Ha, ha, ha.
[2] Four, three, two, one.
[3] Boom.
[4] Thank you.
[5] Thanks for doing this, man. Really appreciate it.
[6] You're welcome.
[7] Very good to meet you.
[8] Nice to meet you, too.
[9] Thanks for not lighting this place on fire.
[10] You're welcome.
[11] That's coming later.
[12] How does one, just in the middle of doing all the things you do, create cars, rockets, all this stuff you're doing, constantly innovating, decide to just make a flamethrower?
[13] Where do you have the time for that?
[14] Well, we didn't put a lot of time into the flamethrower. This was an off-the-cuff thing. I have sort of a hobby company called The Boring Company, which started out as a joke, and we decided to make it real and dig a tunnel under L.A. And then other people asked us to dig tunnels, and so we said yes in a few cases.
[15] And then we have a merchandise section that only has one piece of merchandise at a time.
[16] And we started off with a cap.
[17] And there was only one thing.
[18] It was just boringcompany.com/cap, or hat.
[19] That's it.
[20] And then we sold the hats, limited edition.
[21] It just said, the boring company.
[22] And then, I'm a big fan of Spaceballs, the movie.
[23] And in Spaceballs, Yogurt goes through the merchandising section, and they have a flamethrower in the merchandising section of Spaceballs. And, like, the kids love that one. That's the line, when he pulls out the flamethrower. So it's like, we should do a flamethrower.
Does anybody tell you no? Does anybody go, Elon, maybe for yourself, but selling a flamethrower? The liabilities, all the people you're selling this device to, what kind of unhinged people are going to be buying a flamethrower in the first place?
[24] Do we really want to connect ourselves to all these potential arsonists?
[25] Yeah, it's a terrible idea.
[26] Terrible.
[27] I shouldn't buy one.
[28] I said don't buy this flamethrower.
[29] Don't buy it.
[30] Don't buy it.
[31] That's what I said.
[32] But still people bought it.
[33] There's nothing I can do to stop them.
[34] It's, if you build it, they will come.
[35] I said, don't buy it.
[36] It's a bad idea.
[37] How many did you make?
[38] It's dangerous.
[39] It's got, it's wrong.
[40] Don't buy it.
[41] Still people bought it.
[42] I just couldn't stop them.
[43] How many did you make?
[44] 20,000.
[45] And they're all gone.
[46] In three, I think four days?
[47] It sold out in four days.
[48] Are you going to do another run?
[49] No. No, that's it.
[50] Yes.
[51] Oh.
[52] We did 50,000 hats, and that was a million dollars. And I said we'll sell something for $10 million, and that was 20,000 flamethrowers at $500 each. They went fast.
[53] How do you have the time to do that though?
[54] I mean, I understand that it's not a big deal in terms of all the other things you do, but how do you have time to do anything?
[55] I just, I don't understand your time management skills.
[56] I mean, I didn't spend much time on this flamethrower.
[57] I mean, to be totally frank, it's actually just a roofing torch with an air rifle cover.
[58] It's not a real flamethrower.
[59] Which is why it says "Not a Flamethrower."
[60] That's why we were very clear.
[61] This is not actually a flamethrower.
[62] And also we were told that various countries would ban shipping of it.
[63] They would not, they would ban flamethrowers.
[64] So, to solve this problem for all the customs agencies, we labeled it "not a flamethrower."
[65] Did it work?
[66] Is it effective?
[67] I don't know.
[68] I think so.
[69] So far.
[70] Yes.
[71] Because they said you cannot ship a flamethrower.
[72] But you do so many different things.
[73] Forget about the flamethrower.
[74] Like, how do you do all that other shit?
[75] Like, how does one decide to fix L.A. traffic by drilling holes in the ground?
[76] And who do you even approach with that?
[77] Like, when you have this idea, who do you talk to about that?
[79] I mean, I'm not saying it's going to be successful. It's not like I'm asserting that it's going to be successful.
[80] But so far, I've lived in L.A. for 16 years, and the traffic has always been terrible.
[81] And so I don't see any other, like, ideas for improving the traffic.
[82] So in desperation, we're going to dig a tunnel.
[83] And maybe that tunnel will be successful, and maybe it won't.
[84] I'm listening.
[85] Yeah.
[86] I'm not trying to convince you it's going to work.
[87] And are the people that...
[88] Or anyone.
[89] But you're starting this, though.
[90] This is actually a project you're starting to implement, right?
[91] Yeah, yeah.
[92] Now we've dug about a mile.
[93] It's quite long.
[94] It takes a long time to walk it.
[95] Yeah.
[96] Now, when you're doing this, what is the ultimate plan?
[97] The ultimate plan is to have these in major cities, anywhere there's mass congestion, and just try it out in L.A. first?
[98] Yeah, it's in L.A. because I mostly live in L.A. That's the reason.
[99] It's a terrible place to dig tunnels.
[100] This is one of the worst places to dig tunnels, mostly because of the paperwork.
[101] People think it's like, what about seismic?
[102] It's like, actually, tunnels are very safe in earthquakes.
[103] Why is that?
[104] Earthquakes are essentially a surface phenomenon.
[105] It's like waves on the ocean.
[106] So if there's a storm, you want to be in a submarine.
[107] So being in a tunnel is like being in a submarine.
[108] Now, the way the tunnel's constructed: it's constructed out of these interlocking segments, kind of like a snake.
[109] It's sort of like a snake exoskeleton with double seals.
[110] And so even when the ground moves, the tunnel actually is able to shift along with the ground, like an underground snake.
[111] And it doesn't crack or break, and it's extremely unlikely that both seals would be broken. It's capable of taking five atmospheres of pressure, it's waterproof, methane-proof or gas-proof of any kind, and it meets all California seismic requirements.
So when you have this idea, who do you bring this to?
I'm not sure what you mean by that.
Well, when you're implementing it, you're digging holes in the ground. Like, you have to bring it to someone that lets you do it.
Yeah. So there were some engineers from SpaceX who thought it would be cool to do this.
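For scale, that five-atmospheres rating converts to roughly 50 meters of water head. A quick back-of-envelope check in Python, using standard physical constants (the only figure taken from the conversation is the 5 atm rating itself):

```python
# Convert the quoted 5-atmosphere rating into an equivalent depth of water.
ATM_PA = 101_325      # pascals per standard atmosphere
RHO_WATER = 1_000     # density of water, kg/m^3
G = 9.81              # gravitational acceleration, m/s^2

pressure_pa = 5 * ATM_PA
depth_m = pressure_pa / (RHO_WATER * G)
print(f"5 atm = {pressure_pa:,} Pa, about {depth_m:.0f} m of water head")
```

In other words, a lining rated that way could in principle sit under about 50 meters of groundwater (or an equivalent load), which is why the waterproofing claim and the pressure figure go together.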
[112] And the guy who runs it, like, the day to day, Steve Davis, he's a longtime SpaceX engineer.
[113] He's great.
[114] So Steve was like, I'd like to help make this happen.
[115] I was like, cool.
[116] So we started off with digging a hole in the ground.
[117] We got like a permit for a pit, a big pit.
[118] and just dug a big pit.
[119] And do you have to tell them what the pit's for, or you just say, hey, we just want to dig a hole?
[120] No, you just fill out this form.
[121] That's it.
[122] Yeah, it was a pit in our parking lot.
[123] But do you have to give them some sort of a blueprint for your ultimate idea, and do they have to approve it?
[124] Like, how does that work?
[125] No, we just started off with a pit.
[126] Okay.
[127] Big pit.
[128] And, you know, they don't really care about the existential nature of a pit.
[129] You just say, like, I want a pit, right?
Yeah. And it's a hole in the ground. So then we got the permit for the pit, and we dug the pit in like two, three days, actually like 48 hours, something like that. Because Eric Garcetti was coming by; he was going to attend the Hyperloop competition, which is like a student competition we have for who can make the fastest pod in the Hyperloop.
[130] And he was coming, the finals were going to be on Sunday afternoon.
[131] And so Eric was coming by on Sunday afternoon.
[132] He was like, you know, we should take this pit and then, like, show Eric.
[133] So we, this was like Friday morning.
[134] And then, yeah, so it was about a little over 40 hours later.
[135] We dug the pit.
[136] We were in there, like, 24/7, 48 straight hours, something like that.
[137] And dug this big pit.
[138] And we showed Eric the pit.
[139] Like, obviously it's just a pit.
[140] But hey, a hole in the ground is better than no hole in the ground.
[141] And what do you tell them about this pit?
[142] I mean, you just say this is the beginning of this idea.
[143] Yes.
[144] We're going to build tunnels under L.A. to help funnel traffic better.
[145] And they just go, okay.
[146] We've joked around about this in the podcast before.
[147] Like, what other person can go to the people that run the city and go, hey, we're going to dig some holes in the ground and put some tunnels in there?
[148] And they go, oh, yeah, okay.
[149] Nothing wrong with a hole in the ground.
[150] But it's...
[151] People dig holes in the ground all the time.
[152] But my question is, like, I know how much time you must be spending on your Tesla factory.
[153] I know how much time you must be spending on SpaceX, and yet you still have time to dig holes under the ground in L.A. and come up with these ideas and then implement them.
[154] I've got a million ideas.
[155] I'm sure you do.
[156] There's no shortage of that.
[157] Yeah.
[158] I just don't know how you manage your time.
[159] I don't understand it.
[161] It doesn't seem, it doesn't even seem humanly possible.
[162] You know, I do basically, I think people don't totally understand what I do with my time.
[163] They think, like, I'm a business guy or something like that.
[164] Like my Wikipedia page says business magnate.
[165] What would you call yourself?
[166] I'm a business magnet.
[167] Can someone please change my Wikipedia page to magnet?
[168] They'll change it right now.
[169] It's probably already changed.
[170] It's locked.
[171] So somebody has to be able to unlock it and change it to a magnet.
[173] Someone will get that.
[174] I want to be a magnet.
[175] I do engineering and manufacturing and that kind of thing.
[176] That's like 80% or more of my time.
[177] Ideas and then the implementation of those ideas.
[178] That's like hardcore engineering.
[179] Yeah.
[180] Designing things, you know.
[181] Right.
[182] Structural, mechanical, electrical, software, user interface engineering, aerospace engineering.
[183] But you must understand there's not a whole lot of human beings like you.
[184] You know that, right?
[185] You're an oddity.
[186] Yes.
[187] To chimps like me. We're all chimps.
[188] Yeah, we are.
[189] We're one notch above a chimp.
[190] Some of us are a little more confused.
[191] When I watch you doing all these things, I'm like, how does this motherfucker have all this time and all this energy and all these ideas, and then people just let them do these things?
[192] Because I'm an alien.
[193] That's what I've speculated.
[194] Yes.
[195] I'm on record saying this in the past.
[196] I wonder.
[197] It's true. I was like, if there was maybe an intelligent being that we created, you know, like some AI creature that's superior to people, maybe it would just hang around with us for a little while, like you've been doing, and then fix a bunch of shit. Maybe that's the way.
I might have some mutation or something like that.
You might. Do you think you do?
Probably.
Do you wonder?
[198] Like, you're around normal people, and you're like, hmm? You're like, what's up with these boring, dumb motherfuckers? Ever?
[199] Not bad, for a human. But I think we will not be able to hold a candle to AI.
[200] You scared the shit out of me when you talk about AI between you and Sam Harris.
[201] I didn't even consider it until I had a podcast with Sam once.
[202] That's great.
[203] He made me shit my pants.
[204] Talking about AI, I realized like, oh, well, this is a genie that once it's out of the bottle, you're never getting it back in.
[205] That's true.
[206] There was a video that you tweeted about one of those Boston Dynamic robots.
[207] And you're like, in the future it'll be moving so fast you can't see it without a strobe light.
Yeah, you could probably do that right now.
And no one's really paying attention too much, other than people like you or people that are really obsessed with technology. All these things are happening, and these robots are... Did you see the one where PETA put out a statement that you shouldn't kick robots?
It's probably not wise, for retribution. Their memory is very good.
I bet it's really good.
It's really good.
I bet it is.
Yes. And getting better every day.
It's really good. Are you honestly, legitimately concerned about this? Is, like, AI one of your main worries in regards to the future?
Yes. It's less of a worry than it used to be, mostly due to taking more of a fatalistic attitude.
Hmm. So you used to have more hope, and you gave up some of it, and now you don't worry as much about AI? You're like, this is just what it is?
Pretty much, yes. It's not necessarily bad. It's just definitely going to be outside of human control.
Not necessarily bad?
Yes, it's not necessarily bad; it's just outside of human control. Now, the thing that's going to be tricky here is that it's going to be very tempting to use AI as a weapon.
[208] It could be very tempting.
[209] In fact, it will be used as a weapon.
[210] So on the on-ramp to serious AI, the danger is going to be more humans using it against each other, I think, most likely.
[211] That'll be the danger.
[212] Yeah.
[213] How far do you think we are from something that can make its own mind up?
[214] Whether or not something's ethically or morally correct or whether or not it wants to do something or whether or not it wants to improve itself or whether or not it wants to protect itself from people or from other AI.
[215] How far away are we from something that's really, truly sentient?
[216] Well, I mean, you could argue that any group of people, like a company is essentially a cybernetic collective of people and machines.
[217] That's what a company is.
[218] And then there are different levels of complexity in the way these companies are formed. And then there's sort of a collective AI in Google search, where we're all sort of plugged in as nodes on the network, like leaves on a big tree, and we're all feeding this network with our questions and answers. We're all collectively programming the AI. And Google, plus all the humans that connect to it, are one giant cybernetic collective. This is also true of Facebook and Twitter and Instagram and all these social networks. They're giant cybernetic collectives.
Humans and electronics, all interfacing. And constantly, now, constantly connected.
Yes.
[219] Constantly.
[220] One of the things that I've been thinking about a lot over the last few years is that one of the things that drives a lot of people crazy is how many people are obsessed with materialism and getting the latest, greatest thing.
[221] And I wonder how much of that is, well, a lot of it is most certainly fueling technology and innovation.
[222] And it almost seems like it's built in to us.
[223] It's like what we like and what we want, that we're fueling this thing that's constantly around us all the time.
[225] And it doesn't seem possible that people are going to pump the brakes.
[226] It doesn't seem possible at this stage where we're constantly expecting the newest cell phone, the latest Tesla update, the newest MacBook Pro.
[227] Everything has to be newer and better.
[228] And that's going to lead to some incredible point.
[229] And it seems like it's built into us.
[230] It almost seems like it's an instinct that we're working towards this, that we like it.
[231] That our job, just like the ants build the anthill, our job is to somehow or another fuel this.
Yes. I mean, I made this comment some years ago, but it feels like we are the biological bootloader for AI, effectively. We are building it. We're building progressively greater intelligence, and the percentage of intelligence that is not human is increasing.
[232] And eventually, we will represent a very small percentage of intelligence.
[233] But the AI is informed, strangely, by the human limbic system.
[234] It is, in large part, our id writ large.
[235] How so?
[236] Well, you mentioned all those things, the sort of primal drives.
[237] There's all the things that we like and hate.
[238] and fear.
[239] They're all there on the internet.
[240] They're a projection of our limbic system.
[241] That's true.
[242] No, it makes sense.
[243] And thinking of just human beings communicating online through these social media networks as some sort of an organism, it's a cyborg.
[244] It's a combination.
[245] It's a combination of electronics and biology.
[246] Yeah.
[247] This is...
[248] In some measure, like, the success of these online systems is sort of a function of how much limbic resonance they're able to achieve with people.
[249] The more limbic resonance, the more engagement.
[250] That's, like, one of the reasons why Instagram is probably more enticing than Twitter.
[251] Limbic resonance.
[252] Yeah.
[253] You get more images, more video, tweaking your system more.
[254] Yes.
[255] Do you worry about or wonder, in fact, about what the next step is?
[256] I mean, a lot of people didn't see Twitter coming, that, you know, communicating with 140 characters, or 280 now, would be a thing that people would be interested in.
[257] Like, it's going to excel.
[258] It's going to become more connected to us, right?
[259] Yes.
[260] Things are getting more and more connected.
[261] They're at this point constrained by bandwidth.
[262] Our input/output is slow, particularly output.
[263] Output got worse with thumbs.
[264] You know, we used to have input with ten fingers; now we have thumbs.
[266] But images are also a way of communicating at high bandwidth.
[267] You take pictures and you send pictures to people.
[268] That communicates far more information than you can communicate with your thumbs.
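To put rough numbers on that bandwidth gap, here is a back-of-envelope sketch in Python. Every figure in it (typing speed, photo size, transfer time) is an illustrative assumption, not anything stated in the conversation:

```python
# Crude comparison of human output bandwidth: thumbs vs. sending a photo.
# All numbers below are assumed, ballpark values for illustration only.

BITS_PER_CHAR = 8           # one byte per typed character
thumb_wpm = 40              # assumed phone typing speed, words per minute
chars_per_word = 5

thumb_bps = thumb_wpm * chars_per_word * BITS_PER_CHAR / 60
print(f"Typing with thumbs: ~{thumb_bps:.0f} bits/s")

photo_bits = 2_000_000 * 8  # assumed ~2 MB JPEG
seconds_to_send = 5         # assumed time to snap and send the picture
photo_bps = photo_bits / seconds_to_send
print(f"Sending a photo:    ~{photo_bps:,.0f} bits/s")

print(f"Raw ratio: roughly {photo_bps / thumb_bps:,.0f}x")
```

Most of a photo's bits are not semantic information, of course, but even heavily discounted the gap is several orders of magnitude, which is the "tiny straw" problem raised later in the conversation.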
[269] So what happened with you where you decided or you took on a more fatalistic attitude?
[270] Like was there any specific thing or was it just the inevitability of our future?
[271] I tried to convince people to slow down, slow down AI, to regulate AI.
[273] This was futile.
[274] I tried for years.
[275] This seems like a scene in a movie where the robots are going to fucking take over and you're freaking me out.
[276] Nobody listened.
[277] Nobody listened.
[278] No one.
[279] Are people more inclined to listen today?
[280] It seems like an issue that's brought up more often over the last few years than it was; maybe five, ten years ago it seemed like science fiction.
[281] Maybe they will.
[282] So far they haven't.
[283] I think people don't... Like, normally the way that regulations work is very slow.
[284] It's very slow.
[285] It's very slow indeed.
[286] So usually it'll be something, some new technology.
[287] It will cause damage or death.
[288] There will be an outcry.
[289] There will be investigation.
[290] Years will pass.
[291] There will be some sort of insight committee.
[292] There will be rulemaking.
[293] Then there will be oversight, eventually regulations.
[294] This all takes many years.
[296] This is the normal course of things.
[297] If you look at, say, automotive regulations, how long did it take for seatbelts to be implemented, to be required?
[298] You know, the auto industry fought seatbelts, I think, for more than a decade, successfully fought any regulations on seatbelts, even though the numbers were extremely obvious.
[299] If you had a seatbelt on, you would be far less likely to die.
[300] Or be seriously injured.
[301] It was unequivocal.
[302] And the industry fought this for years successfully.
[303] Eventually, after many, many people died, regulators insisted on seatbelts.
[304] This time frame is not relevant to AI.
[305] You can't take 10 years from the point at which it's dangerous.
[306] It's too late.
[307] And you feel like this is decades away or years away from being too late?
[309] If you have this fatalistic attitude and you feel like it's going, we're in almost like a doomsday countdown.
[310] It's not necessarily a doomsday countdown.
[311] It's a...
[312] Out of control countdown?
[313] Out of control, yeah.
[314] People call it the singularity.
[315] And that's probably a good way to think about it.
[316] It's a singularity.
[317] It's hard to predict, like a black hole, what happens past the event horizon.
[318] Right.
[319] So once it's implemented, it's very different because it will be able to...
[320] model what's going to happen, and it will be able to improve itself.
Yes. That's where it gets spooky, right? The idea that it can do thousands of years of innovation very, very quickly.
Yeah. And then we'll be just ridiculous. We will be like this ridiculous biological shitting, pissing thing trying to stop the gods. No, stop! We like living with a finite lifespan and watching, you know, Norman Rockwell paintings.
It could be terrible, and it could be great.
[321] It's not clear.
[322] Right.
[323] But one thing is for sure we will not control it.
[324] Do you think that it's likely that we will merge somehow or another with this sort of technology and it'll augment what we are now?
[325] Or do you think it will replace us?
[326] Well, that's the scenario.
[327] The merge scenario with AI is the one that seems like probably the best.
[328] for us.
Yes. Like, if you can't beat it, join it.
Yeah. You know, from a long-term existential standpoint, that's the purpose of Neuralink: to create a high-bandwidth interface to the brain such that we can be symbiotic with AI. Because we have a bandwidth problem.
[329] You just can't communicate through your fingers; it's too slow.
[330] And where's Neuralink at right now?
[331] I think we'll have something interesting to announce in a few months.
[332] That's at least an order of magnitude better than anything else.
[333] I think better than probably anyone thinks is possible.
[334] How much can you talk about that right now?
[335] I don't want to jump the gun on that.
[336] But what's like the ultimate, what's the idea behind it?
[337] Like what are you trying to accomplish with it?
[338] What would you like, best-case scenario?
[340] I think best-case scenario, we effectively merge with AI, where AI serves as a tertiary cognition layer. We've got the limbic system, kind of the primitive brain, essentially.
[341] You've got the cortex.
[342] So you're currently in a symbiotic relationship, your cortex and limbic system are in a symbiotic relationship.
[343] And generally people like their cortex, and they like their limbic system.
[344] I haven't met anyone who wants to delete their limbic system or delete their cortex.
[345] Everybody seems to like both.
[346] And the cortex is mostly in service to the limbic system.
[347] People may think that their thinking part of themselves is in charge, but it's mostly their limbic system that's in charge.
[348] And the cortex is trying to make the limbic system happy.
[349] That's what most of that computing power is oriented towards.
[350] How can I make the limbic system happy?
[352] That's what it's trying to do.
[353] Now, if we do have a third layer, which is the AI extension of yourself, that is also symbiotic, and there's enough bandwidth between the cortex and the AI extension of yourself such that the AI doesn't de facto separate, then that could be a good outcome.
[354] That could be quite a positive outcome for the future.
[355] So instead of replacing us, it will radically change our capabilities.
[356] Yes.
[357] It will enable anyone who wants to have superhuman cognition.
[358] Anyone who wants.
[359] This is not a matter of earning power because your earning power would be vastly greater after you do it.
[360] So it's just like anyone who wants can just do it in theory.
[361] That's the theory.
[362] And if that's the case, and let's say billions of people do it, then the outcome for humanity will be the sum of human will, the sum of billions of people's desire for the future.
But billions of people with enhanced cognitive ability, radically enhanced, would be how much different than people today? Like, if you had to explain it to a person who didn't really understand what you were saying, how much different are you talking about?
[363] When you say radically improved, like, what do you mean?
[364] You mean mind reading?
[365] It would be difficult to really appreciate the difference.
[366] It's kind of like how much smarter are you with a phone or computer than without?
[367] You're vastly smarter, actually.
[368] You know, you can answer any question.
[369] If you connect to the internet, you can answer any question pretty much instantly, do any calculation. Your phone's memory is essentially perfect.
[370] You can remember flawlessly.
[371] Your phone can remember videos, pictures, everything perfectly.
[372] Your phone is already an extension of you.
[373] You're already a cyborg.
[374] Well, most people don't realize they are already a cyborg.
[375] That phone is an extension of yourself.
[377] It's just that the data rate, the communication rate between you and the cybernetic extension of yourself that is your phone and computer, is slow.
[378] It's very slow.
[379] It's like a tiny straw of information flow between your biological self and your digital self.
[380] And we need to make that tiny straw like a giant river, a huge high-bandwidth interface.
[381] It's an interface problem, data rate problem.
[382] If you solve the data rate problem, then I think we can hang on to human-machine symbiosis through the long term.
[383] And then people may decide that they want to retain their biological self or not.
[384] I think they'll probably choose to retain their biological self.
[385] Versus, some sort of Ray Kurzweil scenario where they download themselves into a computer?
[386] You will be essentially snapshotted into a computer at any time.
[387] If your biological self dies, you could probably just upload into a new unit.
[388] Literally.
[389] Pass that whiskey.
[390] We're getting crazy over here.
[391] This is getting ridiculous.
[392] Down the rabbit hole.
[393] Grab that sucker.
[394] Give me some of that.
[395] This is too freaky.
[396] You've been talking about this for a long time, by the way.
[397] I believe I have.
[398] Cheers, by the way.
[399] Cheers.
[400] Yeah, this is a great whiskey.
[401] Thank you.
[402] I don't know where this came from.
[403] Who brought this to us?
[404] Trying to remember.
[405] Somebody gave it to us.
[406] Old camp.
[407] Whoever it was.
[408] Thanks.
[409] Yeah, it is good.
[410] This is just inevitable.
[411] Again, going back to your, when you decided to have this fatalistic viewpoint.
[412] So you tried to warn people.
[413] You talked about this pretty extensively.
[414] I've read several interviews where you talked about this.
[415] And then you just sort of just said, okay, it just is.
[416] Let's just... And in a way, by communicating the potential fear, I mean, for sure you're getting the warning out to some people.
Yeah. I mean, I was really going on about the warning quite a lot. I was warning everyone I could.
You've met with Obama, and for just one reason: just to talk about AI?
Yes.
And what did he say? So what about Hillary?
[417] Worry about her first?
[418] Shh.
[419] Everybody be quiet.
[420] No, he listened.
[421] He certainly listened.
[422] I met with Congress.
[423] I met with, I was at a meeting of all 50 governors and talked about just AI danger.
[424] And I talked to everyone I could.
[425] No one seemed to realize where this was going.
[426] Is it that, or do they just assume that someone smarter than them is already taking care of it?
[428] Because when people hear about something like AI, it's almost abstract.
[429] It's almost like it's so hard to wrap your head around it.
[430] By the time it already happens, it'll be too late.
[431] Yeah.
[432] I think they didn't quite understand it, or didn't think it was near term, or weren't sure what to do about it.
[433] When I said, like, you know, an obvious thing to do is to just establish a government committee to gain insight, you know, before you do oversight, before you make regulations, just try to understand what's going on.
[434] And then if you have an insight committee, then once they learn what's going on, get up to speed, then they can make maybe some rules or propose some rules, and that would be probably a safer way to go about things.
[435] It seems... I mean, I know that it's probably something that the government's supposed to handle, but I don't want the government to handle this.
[436] Who do you want to handle this?
[437] Oh, geez.
[438] Yeah.
[439] I feel like you're the one who could ring the bell better.
[440] Because if Mike Pence starts talking about AI, I'm like, shut up, bitch, you don't know anything about AI.
[441] Come on, man. He doesn't know what he's talking about.
[442] But I don't have the power to regulate other companies.
[443] What am I supposed to do?
[444] Right.
[445] But maybe companies could agree.
[446] Maybe there could be some sort of a...
[447] We have agreements where you're not supposed to dump toxic waste into the ocean.
[448] You're not supposed to do certain things that could be terribly damaging, even though they'd be profitable.
[449] Maybe this is one of those things.
[450] Maybe we should realize that you can't hit the switch on something that's going to be able to think for itself and make up its own mind as to whether or not it wants to survive or not, and whether or not it thinks you're a threat.
[451] And whether or not it thinks you're useless.
[452] Like, why do I keep this dumb, finite life form alive? Why keep this thing around?
[454] It's just stupid.
[455] It just keeps polluting everything, it's shitting everywhere it goes, lighting everything on fire and shooting each other.
[456] Why would I keep this stupid thing alive?
[457] Because sometimes it makes good music, you know, sometimes it makes great movies, sometimes it makes beautiful art and sometimes, you know, sometimes it's cool to hang out with.
[458] Yeah, all those reasons.
[459] Yeah, for us, those are great reasons.
[460] Yes.
[461] But for anything objective, standing outside, I go, oh, this is definitely a flawed system.
[462] This is like if you went to the jungle and you watch these chimps engage in warfare and beat each other with sticks.
[463] They're fucking real mean.
[464] They're fucking mean.
[465] I saw that movie Chimpanzee.
[466] I thought it was going to be like some Disney thing.
[467] I was like, oh... wow.
[468] What movie was that?
[469] It's literally called chimpanzee.
[470] Is it a documentary?
[471] Yeah.
[472] Yeah, it's kind of like a documentary.
[473] I was like, damn, these chimps are mean.
[474] They're mean.
[475] Yeah.
[476] They're cruel.
[477] Yeah, they're calculated.
[478] Yeah.
[479] They're calculated.
[480] Yeah.
[481] They sneak up on each other. And, like, I didn't realize chimps did calculated cruelty.
[482] Mm -hmm.
[483] It's pretty...
[484] I left that movie kind of thinking, whoa, this is dark.
[485] Right.
[486] Well, we'd know better because we've advanced.
[487] But if we hadn't, we'd be like, man, I don't want to fucking live in a house.
[488] I like the chimp ways, bro.
[489] Chimp way is the way to go.
[490] This is it, man. Chimp life.
[491] It's the only life I know.
[492] But we, in a way, to the AI, might be like those chimps.
[493] They're like, these stupid fucks, launching missiles out of drones, shooting each other underwater. Like, we're crazy.
[494] We've got torpedoes and submarines and fucking airplanes that drop nuclear bombs indiscriminately on cities.
[495] We're assholes.
[496] Yeah.
[497] They might go, why are they doing this?
[498] It might like look at our politics, look at what we do in terms of our food system, what kind of food we force down each other's throats, and they might go, these people are crazy.
[499] They don't even look out for themselves.
[500] I don't know.
[501] I mean, how much do we think about chimps?
[502] Not much.
Very little.
It's like, these chimps are at war. Like, groups of chimps just attack each other, and they kill each other, and they torture each other.
That's pretty bad.
They hunt monkeys.
But, like, this is probably the most... You know, I mean, when's the last time you talked about chimps?
[503] Me?
[504] Yeah All the time You do?
[505] Talking to the wrong guy This fucking podcast?
[506] Dude, I talk about chimps every other episode.
[507] People are laughing right now.
[508] Yeah, constantly, I'm obsessed.
[509] I saw that David Attenborough documentary on chimps, when they were eating those colobus monkeys and ripping them apart.
[510] I saw that many, many years ago.
[511] It just changed me. I go, oh, this is why people are so crazy.
[512] We came from that thing.
[513] Yeah, exactly.
[514] Yeah.
[515] And there's the Bonobos.
[516] Yeah.
[517] They got a better philosophy.
[518] Yeah, they're like swingers.
[519] Yeah, they really are.
[520] They seem to be way more.
[521] even than us, way more civilized. They just seem to resolve everything with sex.
Yeah. The only rule they have is the mom won't bang the son. That's it.
Okay.
That's it. Moms won't bang their sons.
They're good women.
Yeah, good women in the bonobo community. Everybody else is banging it out.
Yeah. I haven't seen the bonobo movie.
Well, they're disturbing just at a zoo, you know. You have bonobos at the zoo, and they're just constantly going, constantly fucking. That's all they do. It won't stop.
[522] Yeah, and they don't care.
[523] Gay, straight, whatever.
[524] Let's just fuck.
[525] What's with these labels?
[526] I haven't seen Bonobos at a zoo.
[527] It's probably not on the PG section.
[528] Yeah, I don't think they have them at many zoos.
[529] We've looked that up before, too, didn't we?
[530] It's probably pretty awkward.
[531] I think that's the thing.
[532] They like to keep regular chimps at zoos because bonobos are just always jacking off.
[533] Yeah.
[534] What's that?
[535] They have them in San Diego?
[536] San Diego has got some.
[537] Really?
[538] They probably separate them.
[539] I mean, how many do they have in a cage?
[540] You know, it's going to be pretty intense.
[541] Yeah, yeah.
[542] Yeah, we're a weird thing, you know, and I've often wondered whether or not, you know, our ultimate goal is to give birth to some new thing.
[543] And that's why we're so obsessed with technology.
[544] Because it's not like this technology is really... I mean, it's certainly enhancing our lives in a certain way, but ultimately, is it making people happier right now?
[545] Most technology, I would say no. In fact, you and I were talking about social media before this, about just not having Instagram on your phone and not dealing with it, and you feel better.
[546] Yes.
[547] I think one of the issues with social media, it's been pointed out by many people, is that I think maybe particularly Instagram, people look like they have a much better life than they really do.
[548] Right.
[549] By design.
[550] Yeah, people are posting pictures of when they're really happy.
[551] They're modifying those pictures.
[552] to be better looking.
[553] Even if they're not modifying the pictures, they're at least selecting the pictures for the best lighting, the best angle.
[554] So people basically seem way better looking than they really are.
[555] And they're way happier seeming than they really are.
[556] So if you look at everyone on Instagram, you might think, man, they're all these happy, beautiful people, and I'm not that good looking and I'm not happy.
[557] So I must suck, you know, and that's going to make people sad.
[558] When, in fact, those people you think are super happy are actually not that happy.
[559] Some of them are really depressed.
[560] They're very sad.
[561] Some of the happiest-seeming people are actually some of the saddest people in reality.
[562] And nobody looks good all the time.
[563] It doesn't matter who you are.
[564] No. It's not even something you should want.
[565] Why do you want to look great all the time?
[566] Yeah, exactly.
[567] So I think things like that can make people quite sad.
[568] Just by comparison, because you just sort of, people generally think of themselves relative to others.
[569] We are constantly re-baselining our expectations.
[570] And you can see this, say, if you watch some show like Naked and Afraid, or, you know, if you just go and try living in the woods by yourself for a while. And you're like, the land of civilization is quite great. People want to come back to civilization pretty fast on Naked and Afraid.
[571] Wasn't that a Thoreau quote, "comparison is the thief of joy"?
Yeah. Well, "happiness is reality minus expectations," that's great too. But "comparison is the thief of joy" really holds true to people. Is it Theodore Roosevelt?
[572] Roosevelt. Fascinating. And when you're thinking about Instagram... Because what Instagram essentially is, with a lot of people, is you're giving them the opportunity to be their own PR agent. And they always go towards the glamorous, you know. And when anybody does show, you know, hashtag no filter, if they really do do that, it's like, oh, you're so brave, look at you, no makeup.
Yeah, they look good anyway.
You look great. You know what you're doing. Oh my god, you don't have makeup on, you still look hot as fuck. You know what you're doing. I know what you're doing too. They're letting you know, and then they're feeding off that comment section, just sitting there like it's a fresh stream of love, like you're getting right up to the source as it comes out of the earth and you're sucking on it.
A lot of emojis.
Water emojis.
Yeah, a lot of emojis. My concern is not so much what Instagram is. It's that I didn't think people had the need for this, or the expectation, for some sort of technology that allows them to constantly get love and adulation from strangers and comments, and this ability to project this sort of distorted version of who you really are.
[573] But I worry about where it goes.
[574] Like what's the next one?
[575] What's the next one?
[576] Like where is it?
[577] Is it going to be augmented?
[578] Some sort of a weird augmented or virtual Instagram-type situation where you're not going to want to live in this real world?
[579] You're going to want to interface with this sort of world that you've created through your social media page, some next level thing.
[580] Yeah, go live in the simulation.
Yeah, maybe. Some Ready Player One-type shit, that's real. We have that HTC Vive here. I've only done it a couple of times, quite honestly, because it kind of freaks me out. My kids fucking love it, man. They love it. They love playing these weirdo games and walking around with that headset on. But part of me, watching them do it, goes, wow, I wonder if this is, like, the precursor. Just sort of like if you look at that phone that Gordon Gekko had on the beach.
[581] And then you compare that.
[582] A big cell phone.
[583] Yeah, you compare that to like a Galaxy Note 9.
[584] Like how the fuck did that become that, right?
[585] And I wonder, when I see this HTC Vive, I'm like, what is that thing going to be 10 years from now, when we're making fun of what it is now?
[586] I mean, how ingrained and how connected and interconnected is this technology going to be in our life?
[587] It will be at some point indistinguishable from reality.
[588] where we'll lose this, we'll lose this.
[589] Like you and I are just looking at each other through our eyes.
[590] I see you and you see me, I think, I hope.
[591] You think so.
[592] I think you probably have regular eyes.
[593] This could be some simulation.
[594] It could.
[595] Do you entertain that?
[596] Well, the argument for the simulation, I think, is quite strong, because if you assume any improvement at all over time, any improvement, 1%, 0.1%, just extend the time frame. Make it a thousand years, a million years.
[597] The universe is 13.8 billion years old.
[598] Civilization, if you're very generous, is maybe seven or eight thousand years old, if you count it from the first writing.
[599] This is nothing.
[600] This is nothing.
[601] So if you assume any rate of improvement at all, then games will be indistinguishable from reality, or civilization will end. One of those two things will occur. Therefore, we are most likely in a simulation.
[602] Or we're on our way to one, right? Just because we exist, we could most certainly be on the road to that, right? It doesn't mean we can't be in base reality. We could be in base reality, we could be here now, on our way to the destination where this can never happen again, where we are completely ingrained in some sort of artificial technology, or some sort of symbiotic relationship with the internet or the next level of sharing information. But right now, we're not there yet.
[603] That's possible too, right?
[604] It's possible that a simulation is one day going to be inevitable, that we're going to have something that's indistinguishable from regular reality, but maybe we're not there yet.
[605] That's also possible.
[606] We're not quite there yet.
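The compounding step in that argument is easy to make concrete. Here is a minimal sketch, using the 0.1%-per-year figure mentioned above and arbitrary round-number horizons (illustrative only, not a forecast):

```python
# Compound a tiny annual improvement rate over long horizons.
# 0.1%/year is the illustrative rate from the conversation; the
# horizons are arbitrary round numbers.
rate = 0.001  # 0.1% improvement per year

for years in (1_000, 10_000, 100_000):
    growth = (1 + rate) ** years
    print(f"{years:>7} years at 0.1%/yr -> ~{growth:.3g}x improvement")
```

At 0.1% per year, a thousand years gives roughly 2.7x, ten thousand years roughly 22,000x, and a hundred thousand years about 10^43x, which is the sense in which any nonzero rate of improvement, extended far enough, makes the conclusion hard to escape.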
[607] This is real. When I touch that...
[608] It feels very real.
[609] Maybe that's why everybody's, like, into, like, mason jars and shit.
[610] Mason jars.
[611] Suede shoes.
[612] People are into, like, craft restaurants, and they want raw wood.
[613] Everyone has to see metal.
[614] It seems like people are, like, longing towards some weird log cabin-type nostalgia.
[615] Yeah, like, holding on, like, clinging.
[616] Sure.
[617] Dragging their nails through the mud, like, don't take me yet.
[618] Yes.
[619] I want to.
[620] But then people will go get a mason jar with a wine stem or a handle.
[621] That's dark.
[622] Makes me lose faith in humanity.
[623] Wine stem and a handle.
[624] They have those?
[625] Yes.
[626] Oh, those dirty people.
[627] That's just assholes.
[628] That's like people make pet rocks.
[629] Rough.
[630] Right?
[631] Some people are just assholes.
[632] They take advantage of our generous nature.
[633] It was made with the wine stem.
[634] Made with a handle.
[635] They made it that way.
[636] So it wasn't like they welded it onto the mason jar.
[637] Oh, you fuck.
[638] That would be fine if they glued it on or something.
[639] Right.
[640] It was made that way.
[641] Like trash chic.
[642] Oh, this is the skin.
[643] It's disgusting.
[644] Look at this.
[645] It is right there.
[646] Yep.
[647] This is terrible.
[648] Yeah.
[649] That's like fake breasts that are designed to be hard, like fake breasts from the 60s.
[650] It's like if you're really longing for the ones with ripples, here we go.
[651] Yeah.
[652] That's almost what that is.
[653] Yeah.
[654] What are you going to do, man?
[655] There's nothing, you know... You can't stop certain terrible ideas from propagating.
[656] Yeah.
[657] Anyway, I don't want to sound like things are too dark, because I think you kind of have to be optimistic about the future. There's no point in being pessimistic. It's just too negative. It doesn't help, you know. I mean, my theory is I'd rather be optimistic and wrong than pessimistic and right.
Right. At least err on that side, right? Because if you're pessimistic, it's going to be miserable.
Yeah. Nobody wants to be around you anyway. If it's the end of the world, you're like, ah fuck, told you, bro. Yeah, the world's ending. It is what it is. I mean, enjoy the journey, right? If you really want to get morose, it is what it is for all of us anyway. We're all going to go, unless something changes.
Yeah. I mean, ultimately, you know, even if we just sort of existed as humans forever, eventually there'd be, like, the heat death of the universe, right?
A million years from now?
Right. Even if we get past the sun, if we figure out a way past the sun running out of juice, eventually it's going to end. It's just a question of when. So it really is all about the journey.
Hmm. Or transcendence, from whatever we are now into something that doesn't worry about death.
The universe as we know it will dissipate into a fine mist of cold nothingness eventually.
And then someone's going to bottle it and put a fragrance to it, and sell it to French people in another dimension.
[658] It's just a very long time.
[659] So I think it's really just about how can we make it last longer?
[660] Are you a proponent of the multi -universes theory?
[661] Do you believe that there are many, many universes?
[662] And that even if this one fades out, that there's other ones that are starting fresh right now and there's an infinite number of them, they're just constantly in this never -ending cycle of birth and death?
[663] I think most likely this is just about probability.
[664] There are many, many simulations.
[665] These simulations are, we might as well call them reality, or you could call them the multiverse.
[666] These simulations, you believe, are created?
[667] Like someone has manufactured.
[668] They're running on a substrate.
[669] So?
[670] That substrate is probably boring.
[671] Boring?
[672] Mm -hmm.
[673] How so?
[674] Well, when we create a simulation, like a game or a movie, it's a distillation of what's interesting about life.
[675] You know, it can take a year to shoot an action movie.
[676] And then that's all distilled down into two or three hours.
[677] So let me tell you, if you see an action movie being filmed, it's frigging, it's boring, super boring.
[678] It takes, there's like lots of takes, everything's in a green screen, looks pretty goofy, doesn't look cool.
[679] But once you add the CGI and have great editing, it's amazing.
[680] So I think, most likely, if we're a simulation, it's really boring outside the simulation. Because why would you make a simulation that's boring? You'd make a simulation way more interesting than the base reality.
That is, if this right now is a simulation.
Yes.
And ultimately, inevitably, as long as we don't die or get hit by a meteor, we're going to create some sort of simulation, if we continue on the same technological path we're on right now.
[682] Yes.
[683] But we might not be there yet.
[684] So it might not be a simulation here.
[685] But it most likely is, you feel, other places.
[686] This notion of place or where is a...
[687] Flawed.
[688] Yes.
[689] Flawed perception.
[690] Like that Vive, you know, which... that's made by Valve. It's really Valve that made it.
[691] HTC did the hardware, but it's really a Valve thing.
[692] The makers of Half-Life.
Yes. Well, great company. Great company.
When you're in that virtual reality, which is only going to get better, where are you? Where are you, really?
Right. You aren't anywhere.
Well, you're in the computer. What defines where you are, exactly? Right?
It's your perception.
Is it your perception, or is it, you know, a scale that we have under your butt? You're right here, I've measured you, you're the same weight as you were when you left. But meanwhile, your experience... Why do you think you're where you are right now? You might not be.
I'll spark up a joint if you keep talking. Your manager's going to come in here. We might have to lock the door.
Right now you think you're in a studio in L.A.
That's what I heard.
You might be in a computer.
Oh, listen, man, I think about this all the time. I mean, it's unquestionable that one day that'll be the case, as long as we keep going, as long as nothing interrupts us.
[693] And if we start from scratch and, you know, we're single-celled organisms all over again, and then millions and millions of years later we become the next thing that is us, with creativity and the ability to change its environment, it's going to keep monkeying with things until it figures out a way to change reality. To almost, like, punch a hole through what this thing is into what it wants it to be, and create new things. And then those new things will intersect with other people's new things, and then it will be this ultimate pathway of infinite ideas and expression, all through technology.
Yeah. And then we're going to wonder, like, why are we here? What are we doing?
Let's find out. Well, I mean, I think we should take the set of actions that are most likely to make the future better.
[694] Yes.
[695] Right.
[696] Yeah.
[697] And then re-evaluate those actions to make sure that it's true.
[698] Well, I think there's a movement to that.
[699] I mean, in terms of like a social movement, I think some of it's misguided and some of it's exaggerated and there's a lot of people that are fighting for their side out there.
[700] But it seems like the general trend of, like, social awareness seems to be much more heightened now than it has ever been in any other time in history, because of our ability to express ourselves instantaneously to each other through Facebook or Twitter or what have you.
[702] And that the trend is to abandon preconceived notions, abandon prejudice, abandon discrimination, and promote kindness and happiness as much as possible.
[703] You're looking at this knife?
[704] Somebody gave it to me. Sorry.
[705] Yeah, what is this?
[706] What the fuck did you do?
[707] My friend Donnie, he brought this with him and it just stayed here.
[708] I have a real samurai sword.
[709] If you want to play with that, I know you're into weapons.
[710] That's from the 1500s.
[711] The samurai sword right at the end of the table?
[712] Yeah, that's cool.
[713] I'll grab it.
[714] Hold on.
[715] Yeah, that's a legit samurai sword from an actual samurai from the 1500s.
[716] If you pull out that blade, that blade was made the old way where a master craftsman folded that metal and hammered it down over and over again over a long period of time and honed that blade into what it is now.
[717] What's crazy is that more than 500 years later, that thing is still pristine. I mean, whoever took care of that passed it down to the next person, who took care of it, until it got to the podcast room. It's pretty fucking crazy. One day someone's going to be looking at a Tesla like that. These fucking back doors, they pop up sideways, like a Lamborghini.
You should see what the Tesla can do.
I didn't...
You should. I'll show you afterwards.
Well, I've driven one. I love them.
Yeah, but most people don't know what it can do.
In terms of, like, Ludicrous Mode? In terms of, like, driving super fast and irresponsibly on public roads? Is that what you're saying?
Well, any car can do that.
Yeah. What can it do that I need to know about?
I mean, the Model X can do this, like, ballet thing to the Trans-Siberian Orchestra. It's pretty cool.
Where it dances?
Yes.
Legitimately, like, moves around?
Yeah. Da-da-da-da-da-da-da-da.
[718] Why would you program that into a car?
[719] It seemed like fun.
[720] That's what I get about you.
[721] That's what's weird.
[722] Like when you showed up here, you were all smiles and you pull out a fucking blow torch and not a blow torch.
[723] But I'm like, look at this dude.
[724] Not a flamethrower.
[725] Not a flamethrower.
[726] I'm like, you're having fun.
[727] You're having fun.
[728] Like this is a, like this thing, when you know, you program a car to do a ballet dance, you're having fun.
[729] How do you have the time to do that?
[731] I don't understand. You're digging holes under the earth and sending rockets into space and powering people in Australia.
[732] Like how the fuck do you have time to make the car dance ballet?
[733] Well, I mean, in that case, there were some engineers at Tesla that said, you know, what if we make this car dance and play music?
[734] I was like, that sounds great.
[735] Please do it.
[736] Let's try to get it done in time for Christmas.
[737] We did.
[738] Is there a concern about someone just losing their mind and making it do that in the highway?
[739] No, it won't do that.
[740] What if it's in bumper -to -bumper traffic?
[741] Nope.
[742] No, won't do it.
[743] Nope.
[744] Actually, you have to, it's an Easter egg.
[745] Oh, it's an Easter egg.
[746] Yeah, that's why people don't know about it.
[747] Including people who have the car.
[748] Well, it's like, it can do lots of things, lots of things.
[749] Once Reddit gets a hold of it.
[750] Everyone's going to know that.
[751] Oh, it's... If you search for it on the internet, you will find out. But people don't know that they should even search for it.
[752] Oh, well, they do now.
Yes, yes. There's so many things about the Model X and the Model S and the Model 3 that people don't know about. We should probably do a video or something and explain it, because I have close friends of mine, and I say, do you know the car can do this? And they're like, nope.
Do you want to do a video like that, or do you like the fact that some people don't know?
No, I think we should probably tell people. Yeah, probably.
That would help your product.
[753] I mean, it's not like you don't sell enough of them.
[754] You sell almost too many of them, right?
[755] I mean, I think a Tesla is the most fun thing you could possibly buy ever.
[756] That's what it's meant to be.
[757] Well, our goal is to make... It's not exactly a car. It's actually a thing to maximize enjoyment, to make it maximum fun.
Okay.
Electronics, like a big-screen laptop, ridiculous speed, handling, all that stuff.
Yeah.
And we're going to put video games in it.
You are?
Yeah.
Is that wise?
Well, you won't be able to drive while you're playing the video game. But, for example, we're just putting the Atari ROM emulator in it, so it'll play Missile Command and Lunar Lander and a bunch of other things.
[760] That sounds cool.
[761] It's pretty fun.
[762] I like that.
[763] Yeah.
[764] I mean, we improved the interface of Missile Command, because it was too hard with the old trackball.
[765] So this is a touchscreen version of Missile Command.
[766] So you have a chance.
[767] You have an old car, don't you?
[768] Don't you have like an old Jaguar?
[769] Yeah.
[770] That's most people don't know.
[771] A '61 Series 1 E-Type Jaguar.
[772] I love cars.
[773] It's great.
[774] Yeah, I love old cars.
[775] That's one, I think. Yeah, I've only got two gasoline cars: that, and an old Ford Model T that a friend of mine gave me. Those are my only two gasoline cars.
Is the Ford Model T all stock?
Well... oh, there's your car.
Oh, look at that. I have the convertible.
That is a gorgeous car. That's a good-looking car. Is that yours?
That is... it's not mine. It's extremely close to mine. But I don't have a French license plate on mine.
[776] Oh, that's a beautiful car.
[777] They nailed that.
[778] That's mine looks like that.
[779] God, they nailed that.
[780] That's what mine looks like.
[781] Maybe that is mine.
[782] There's certain iconic shapes.
[783] Yes.
[784] And there's something about those cars, too.
[785] They're not as capable, not nearly as capable as like a Tesla.
[786] But there's something really satisfying about the mechanical aspect of like feeling the steering and the grinding of the gears and the shifting.
[787] There's something about those that's extremely satisfying, even though they're not that competent.
[788] Like, I have a 1993 Porsche 964.
[789] It's like a lightweight.
[790] The RS America, it's not very fast.
[791] Not, like, in comparison to a Tesla or anything like that.
[792] But the thing about it is, like, it's mechanical.
[793] You feel everything. It gives you this weird thrill, like you're on this clinky ride with all this feedback.
[794] There's something to that.
[795] Yeah, yeah, absolutely.
[796] I mean, yeah.
[797] My e -type is like basically no electronics.
[798] Yeah.
[799] So you like that, but you also like electronics?
[800] Yes.
[801] Like your Tesla.
[802] It's like the far end of electronics.
[803] Yes.
[804] Drives itself.
[805] It's driving itself better every day.
[806] Yeah.
[807] It's like we're about
[808] to release the software that will enable you to just turn it on, and it'll drive from highway on-ramp to highway exit, do lane changes to overtake other cars, and go from one interchange to the next.
[809] If you get on, say, the 405, get off 300 miles later and go through several highway interchanges and just overtake other cars and hook into the nav system.
[810] And then...
[811] And you're just meditating.
[812] Home?
[813] Yeah.
[814] Your car's just traveling.
[815] It's kind of eerie.
[816] What did you think when you saw that video, that dude falling asleep behind the wheel?
[817] I'm sure you've seen it.
[818] The one in San Francisco is like right outside of San Jose.
[819] The dude's out cold, like this, and the car's in bumper-to-bumper traffic, moving along.
[820] Yeah.
[821] Have you seen it, right?
[822] Yeah, yeah.
[823] Did you say what have I done?
[824] We changed the software.
[825] That's, I think, an old video.
[826] We changed software where if you don't touch the wheel, it will gradually slow down and put the emergency lights on and wake you up.
[827] Oh, that's hilarious.
[828] That's hilarious.
[829] Can you choose what voice wakes you up?
[830] Well, it's sort of more of a...
[831] It sort of honks.
[832] Oh, it honks?
[833] Yeah.
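(A rough sketch of the hands-off escalation logic described above. The thresholds, names, and actions here are illustrative guesses, not Tesla's actual firmware.)

```python
# Illustrative sketch of a hands-off-wheel escalation policy like the one
# described above: warn, then hazards on and a gradual slowdown.
# All thresholds and names are hypothetical, not Tesla's real firmware.

def escalation_step(hands_off_seconds: float, speed_mph: float) -> dict:
    """Return the actions to take for the current hands-off duration."""
    actions = {"visual_warning": False, "audible_warning": False,
               "hazard_lights": False, "target_speed_mph": speed_mph}
    if hands_off_seconds > 15:
        actions["visual_warning"] = True          # nag on the screen first
    if hands_off_seconds > 30:
        actions["audible_warning"] = True         # then beep / "honk"
    if hands_off_seconds > 60:
        actions["hazard_lights"] = True           # then hazards on, and
        actions["target_speed_mph"] = max(0.0, speed_mph - 5.0)  # slow down
    return actions

# e.g. 90 seconds hands-off at 65 mph -> all warnings, hazards, slowing to 60
print(escalation_step(90.0, 65.0))
```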
[834] You should be like, wake up, fuck face.
[835] You're endangering your fellow humans.
[836] We could gently wake you up with a sultry voice.
[837] Oh, that would be good.
[838] Like something with a southern accent.
[839] Hey, wake up.
[840] Wake up.
[841] Wake up, sunshine.
[842] Hey, sweetie.
[843] Exactly.
[844] Won't you wake up.
[845] You could pick your...
[846] Right.
[847] Like, see what you want.
[848] Yeah, I choose.
[849] The Australian girl for Siri.
[850] Yeah.
[851] I like her voice.
[852] Do you want it seductive?
[853] My favorite.
[854] I like Australian.
[855] What flavor of, do you want it to be angry?
[856] It could be anything.
[857] Do you want those Australian prison lady genes?
[858] Now, when you program something like that in, is this in response to a concern or is it your own?
[859] Yeah.
[860] Do you look at it and go, hey, they shouldn't just be able to fall asleep.
[861] Let's wake them up.
[862] Yeah.
[863] Yeah.
[864] It's like, you know, people have been
[865] falling asleep. We've got to do something about that. Right. But when you first released it, you didn't consider it, right? You're just like, well, no one's going to just sleep. People fall asleep in cars all the time and crash. Yeah. At least our car doesn't crash. That's better. It's better not to crash. Imagine if that guy had fallen asleep in a gasoline car. They do it all the time. For sure. Yeah. Then they'd crash into somebody. Yeah. And in fact, the thing that really, you know, got me to, like, man, we've got to get Autopilot going and get it out there, was a guy in an early Tesla driving down the highway who fell asleep, and he ran over a cyclist and killed him. And I was like, man, if we had Autopilot, he might still have fallen asleep, but at least he wouldn't have run over that cyclist. So how did you implement it? Did you just use cameras, programmed with the system so that if it sees images it slows down? And how much time do you give?
[866] Is the person who's in control of it allowed to program how fast it goes?
[867] Yes.
[868] Yeah, you can program it to be more conservative or, like, a more aggressive driver, and you can say what speed is okay.
[869] I know you have ludicrous mode.
[870] Do you have douchebag mode?
[871] Ha -ha.
[872] It just cuts people off.
[873] Well, for lane changes, it's tricky, because if you're in, like, L.A., unless you're pretty aggressive, it's hard to change lanes sometimes.
[874] You can't.
[875] It's hard to be Sat Nam.
[876] It's hard to be Namaste all out here in L.A. Yeah.
[877] If you want to hit that Santa Monica Boulevard off -ramp.
[878] I mean, you've got to be a little pushy.
[879] You've got to be a little pushy, yeah, especially when people are angry.
[880] They're a little angry.
[881] They don't want you in.
[882] They speed up.
[883] Sometimes, you know, I think people, like, overall are pretty nice on the highway, even in L.A., but sometimes they're not.
[884] Do you think the Neuralink will help that?
[885] Probably.
[886] Everybody'd be locked in together, this hive mind.
[887] Tunnels will help it.
[888] We won't have traffic.
[889] That'll help a lot.
[890] Yeah.
[891] How many of those can you put in there?
[892] Nice thing about tunnels.
[893] The nice thing about tunnels is you can go 3D.
[894] So you can go many levels.
[895] Right.
[896] Until you hit hell.
[897] Yeah.
[898] But you could go, you can have 100 levels of tunnel.
[899] No problems.
[900] Jesus Christ.
[901] I don't want to be on 99.
[902] That would be in the 99th, negative 99 floors.
[903] This is one of the fundamental things people don't appreciate about tunnels is that it's not like roads.
[904] The fundamental issue with roads is that you have a 2D transport system and a 3D living and workspace environment.
[905] So you've got all these tall buildings or concentrated work environments, and then you want to go into this like 2D transport system, which is pretty low density.
[906] because cars are spaced out pretty far.
[907] And so that obviously is not going to work.
[908] You're going to have traffic guaranteed.
[909] But if you can go 3D on your transport system, then you can solve all traffic.
[910] You can either go 3D up with a flying car or go 3D down with tunnels.
[911] You can have as many tunnel levels as you want.
[912] And you can arbitrarily relieve any amount of traffic.
[913] You can go further down with tunnels than you can go up with buildings.
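(A back-of-envelope sketch of the "go 3D" argument: throughput scales linearly with stacked tunnel levels. The per-lane figures are illustrative assumptions, not Boring Company numbers.)

```python
# Rough illustration of why stacking tunnel levels "arbitrarily" relieves
# traffic: total throughput grows linearly with levels. The per-lane
# vehicles/hour figure is an illustrative assumption.

def network_capacity(levels: int, lanes_per_level: int = 2,
                     vehicles_per_hour_per_lane: int = 2000) -> int:
    """Total vehicles per hour for a stacked tunnel network."""
    return levels * lanes_per_level * vehicles_per_hour_per_lane

print(network_capacity(1))    # one road-like 2D layer:     4,000 veh/h
print(network_capacity(100))  # 100 stacked levels:       400,000 veh/h
```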
[914] You can have 10,000 feet down if you want. I wouldn't recommend it. What was that movie with what's-his-face? Bradley Cooper? Christian... no, what the fuck's his name? Batman?
[915] Who is Batman?
[916] Christian Bale. Where they fought dragons, him and Matthew McConaughey. They went down deep into the earth. How deep can you go?
[917] I don't think that was Batman. Fought dragons? No, it wasn't Batman, but it was Christian Bale. Reign of Fire. You ever see that?
[918] No. Terrible movie, but good. I wouldn't recommend drilling super far down. Yeah, but when you get real deep, it gets hot, right? Molten. Earth is a giant ball of lava with a thin crust on the top, which we think of as, like, the surface. It's mostly just a big ball of lava, that's Earth. But 10,000 feet is not a big deal. Have you given any consideration whatsoever to the flat earth movement?
[919] Ha ha.
[920] I think that's a troll situation.
[921] Oh, it's not.
[922] No, it's not.
[923] You would like to think that because you're super genius.
[924] But I, as a normal person, I know there's people way dumber than me. And they really, really believe.
[925] They watch YouTube videos, which go on uninterrupted and spew out a bunch of fucking fake facts very eloquently and articulately.
[926] And they really believe. These people really believe. I mean, if it works for them, sure, fine. It's weird, though, right? That in this age where, you know, there's Ludicrous Mode in your car, it goes 1.9 seconds zero to 60. 2.2. 2.2? Which one's 1.9? The Roadster. The next-generation Roadster, standard edition. Yeah, I'm on top of this shit. That's with the standard edition. Yeah, without the performance package. What performance package?
[927] What the fuck do you need?
[928] We're going to put rocket thrusters on it.
[929] For real?
[930] Yes.
[931] What are they going to burn?
[932] Nothing.
[933] Ultra high pressure, compressed air.
[934] Whoa.
[935] Just air?
[936] Cold gas thrusters.
[937] Do you have to have air tanks?
[938] Or are they just sucking the air out of it?
[939] Yeah, it's an electric pump.
[940] Whoa.
[941] Pump it up to, like, 10,000 PSI.
[942] And how fast are we talking?
[943] Zero to 60.
[944] How fast do you want to go?
[945] We could make it just to fly.
[946] I want to go back in
[947] time.
[948] Make it fly.
[949] You make it fly.
[950] Sure.
[951] Do you anticipate that as being, I mean, you were talking about the tunnels and then flying cars.
[952] Do you really think that's going to be real?
[953] It's too noisy.
[954] And there's too much airflow.
[955] So the fundamental issue with flying cars... I mean, if you get, like, one of those toy drones, think of how loud those are and how much air they blow.
[956] Now, imagine if that's, like, a thousand times heavier.
[957] This is not going to make your neighbors happy.
[958] Your neighbors are not going to be happy if you land a flying car in your backyard. It'll be very helicopter-like. Or on your roof. They're just really going to be like, what the hell, that's annoying. If you want a flying car, just put some wheels on a helicopter. Is there a way around that? Like, what if they figure out some sort of magnetic technology, like all those Bob Lazar-type characters were thinking was part of the UFO technology they were doing at Area 51? Remember, didn't they have some thoughts about magnetics?
[959] Nope.
[960] No, bullshit?
[961] Yes.
[962] Really?
[963] Yeah, there's a fundamental momentum exchange with the air.
[964] So you must accelerate, there's a certain, you have a mass and you have gravitational acceleration.
[965] And your mass times gravity must equal the mass of airflow times the acceleration of that airflow to have a
[966] neutral force.
[967] So it's impossible to get around.
[968] And then you won't move.
[969] But if MG is greater than MA, you will go down.
[970] And if MA is greater than MG, you will go up.
[971] That's how it works.
[972] There's just no way around that.
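(The momentum-exchange argument in numbers: to hover, your weight must equal the mass flow of air times the velocity it is accelerated to, m·g = ṁ·v. A minimal sketch with illustrative figures.)

```python
# The force balance described above, in numbers: to hover, the downward
# momentum given to the air each second must equal the vehicle's weight,
#   m * g = mdot * v   (air mass flow rate times downwash velocity).
# The vehicle mass and downwash speed below are illustrative.

G = 9.81  # gravitational acceleration, m/s^2

def required_airflow(mass_kg: float, downwash_velocity_ms: float) -> float:
    """Mass of air (kg/s) that must be accelerated downward to hover."""
    return mass_kg * G / downwash_velocity_ms

# A 1,500 kg "flying car" blowing air down at 30 m/s must move about
# 490 kg of air every second, hence the noise and wind being described.
print(round(required_airflow(1500, 30), 1))
```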
[973] There is definitely no way around it.
[974] There's no way to create some sort of a magnetic something or another that allows you to...
[975] Technically, yes.
[976] You could have a strong enough magnet.
[977] But that magnet would be so strong that you would create a lot of trouble.
[978] Would you just suck cars up into your car?
[979] Just pick up axles and shit.
[980] You'd have to repel off of either material on the ground or, in a really nutty situation, off of Earth's magnetic field, and somehow make that incredibly light.
[981] But that magnet would cause so much destruction, you'd be better off with a helicopter. So if there was some sort of magnet road... like, you have two magnets and they repel each other. If you had some sort of a magnet road that was below you, and you could travel on that magnet road, that would work? Ha ha. Yes, yes, you could have a magnet road. A magnet road. Is that too ridiculous? No, it's ridiculous too, right? I would not recommend it. There's a lot of things I don't recommend. I would super not recommend that. Not good, not wise. No, no, no. Definitely not. Definitely not. It would cause a lot of trouble. So you've put some time and consideration into this, unlike my foolishly rendered thoughts. So you think that tunnels are the way to do it? Oh, it'll work for sure. That'll work. Yes. And these tunnels that you're building right now, these are basically just, like, test versions of this ultimate idea that you have?
[982] You know, it's just a hole in the ground.
[983] Right.
[984] We played videos of it, where your idea is that you're going to drop the car into that hole in the ground.
[985] There's a sled on it, and the sled goes very fast, like 100 miles an hour plus.
[986] Yeah, you can go real fast.
[987] You can go as fast as you want.
[988] And then if you want to go long distances, you can just draw the air out of the tunnel, make sure it's real straight.
[989] Draw the air out of the tunnel.
[990] Yeah, yeah, it's a vacuum tunnel.
[991] And then, depending on how fast you want to go, you can use wheels, or you could use air bearings, depending upon the ambient pressure in the tunnel, or you could maglev it if you want to go super fast.
[992] So magnet road?
[993] Yes, but underground magnet roads.
[994] Underground magnet roads.
[995] Yeah, otherwise you're going to really create a lot of trouble, because there's metal things.
[996] Ah, so magnet road is the way to go, just underground.
[997] If you want to go really fast underground, you would be Maglev in a vacuum tunnel.
[998] Mag in a vacuum tunnel.
[999] Magnetic levitation in a vacuum tunnel.
[1000] With rocket launchers.
[1001] No, I would not recommend putting any exhaust gas in the tunnel.
[1002] Oh, okay.
[1003] I see what you're saying.
[1004] Because then you're going to pump it out.
[1005] Right, you'll have to pump it out.
[1006] And you probably have a limited amount of air in the first place.
[1007] Like, how much can you breathe?
[1008] Do you have to pump oxygen into these cubicles?
[1009] No, you'd have, like, a pressurized pod.
[1010] It'd be like a little tiny underground spaceship, basically.
[1011] Like an airplane.
[1012] Because you have air in an airplane, it's not getting new air in.
[1013] It is.
[1014] It is?
[1015] Yes.
[1016] They have like a little hole?
[1017] Yeah, they have a pump.
[1018] Really?
[1019] Yeah.
[1020] So it gets it from the outside?
[1021] Yes.
[1022] Wow, I didn't know that.
[1023] And it's like... airplanes have it easy, because essentially they're pretty leaky.
[1024] Jesus.
[1025] Yeah, but so long as the air pump is
[1026] working at a decent speed.
[1027] They have backup pumps.
[1028] Oh.
[1029] So they have like, you know, three pumps or four pumps or something.
[1030] And then it exhausts through the outflow valve and through whatever seals are not sealing quite right.
[1031] Usually the door doesn't seal quite right on a plane.
[1032] So there's a bit of leakage around the door.
[1033] And the pumps exceed the outflow rate.
[1034] And then that sets the pressure in the cabin.
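(A toy model of the cabin-pressure balance just described: pumps push air in at a fixed rate, leakage grows with the pressure differential, and the cabin settles where the two match. The coefficients are made up for illustration.)

```python
# Toy version of the equilibrium described above: cabin pressure settles
# where pump inflow equals outflow through the valve and leaky seals.
# The leak coefficient and flow numbers are invented for illustration.

def equilibrium_pressure(pump_inflow_kg_s: float, leak_coeff: float,
                         outside_pressure_kpa: float) -> float:
    """Cabin pressure where inflow == leak_coeff * (P_cabin - P_outside)."""
    return outside_pressure_kpa + pump_inflow_kg_s / leak_coeff

# e.g. 1.0 kg/s of pumped air, leakage of 0.02 kg/s per kPa of differential,
# ~20 kPa outside at cruise altitude -> cabin holds at about 70 kPa
print(equilibrium_pressure(1.0, 0.02, 20.0))
```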
[1035] Now, have you ever looked at planes and gone, I could fix this, I just don't have the time? I have a design for a plane. You do? Yes. A better design? I mean, probably. I think it is, yes. Who have you talked to about this? I've talked to friends. Friends? Friends. And I'm your friend. Girlfriends. You can tell me. What do you got? What's going on? Well, I mean, the exciting thing to do would be some sort of electric vertical-takeoff-and-landing supersonic jet of some kind. Vertical takeoff and landing being no need for a runway, just shoot straight up in the air, and then... Ooh. How would you do that? They do that on some military aircraft, correct? Yes. The trick is that you have to transition to level flight, and the thing that you would use for vertical takeoff and landing is not suitable for high-speed flight.
[1036] So you have two different systems?
[1037] I've thought about this quite a lot.
[1038] I've thought about this quite a lot.
[1039] The interesting thing about an electric plane is that you want to go as high as possible, but you need a certain energy density in the battery pack because you have to overcome gravitational potential energy.
[1040] Once you've overcome gravitational potential energy and you're at a high altitude, the energy you use in cruise is very low.
[1041] and then you can recapture a large part of the gravitational potential energy on the way down.
[1042] So you really don't need any kind of reserve fuel, if you will, because you have the energy of height, gravitational potential energy.
[1043] This is a lot of energy.
[1044] So once you can get high, the way to think about a plane is it's a force balance.
[1045] So a plane that is not accelerating is in a neutral force balance.
[1046] The force of gravity, you have the lift force of the wings, then you've got the force of the thrusting device, the propeller or turbine or whatever it is, and you've got the resistance force of the air.
[1047] Now, the higher you go, the lower the air resistance
[1048] is.
[1049] Air density drops exponentially, but drag increases with the square.
[1050] And exponential beats a square.
[1051] The higher you go, the faster you will go for the same amount of energy.
[1052] And at a certain altitude, you can go supersonic with less energy per mile, quite a lot less energy per mile, than an aircraft at 35,000 feet.
[1053] Because it's just a force balance.
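(A crude sketch of the altitude argument: drag force per meter of travel is 0.5·ρ·v²·CdA, and density ρ falls roughly exponentially with altitude, so the same drag energy per mile buys much more speed up high. The aircraft numbers are illustrative, and induced and wave drag are ignored.)

```python
# Crude illustration of "higher means faster for the same energy per mile":
# drag energy per meter scales with air density times speed squared, and
# density falls roughly exponentially with altitude (scale height ~8,500 m).
# The Cd*A value and altitudes below are illustrative, not a real aircraft.

import math

RHO_SEA_LEVEL = 1.225   # air density at sea level, kg/m^3
SCALE_HEIGHT_M = 8500.0

def air_density(altitude_m: float) -> float:
    return RHO_SEA_LEVEL * math.exp(-altitude_m / SCALE_HEIGHT_M)

def drag_energy_per_meter(altitude_m: float, speed_ms: float,
                          cd_area_m2: float = 2.0) -> float:
    """Drag force in newtons, i.e. joules spent per meter of travel."""
    return 0.5 * air_density(altitude_m) * speed_ms**2 * cd_area_m2

# Roughly the same drag energy per meter, but much faster up high:
print(drag_energy_per_meter(10700, 250))   # ~35,000 ft, high subsonic
print(drag_energy_per_meter(20000, 430))   # ~65,000 ft, supersonic
```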
[1054] It makes sense, though.
[1055] No, I'm sure it does.
[1056] Now, when you think about this new idea of designing, when you have this idea about improving planes, are you going to bring this to somebody?
[1057] Or you just chuck this around?
[1058] Well, I have a lot on my plate.
[1059] Right.
[1060] That's what I'm saying.
[1061] I don't know how you do what you do now, but if you keep coming up with these... it's got to be hard to pawn these off on someone else, too.
[1062] Hey, go do a good job with this vertical takeoff and landing system that I want to implement in regular planes?
[1063] The airplane, electric airplane isn't necessary right now.
[1064] Electric cars are important.
[1065] Solar energy is important.
[1066] Stationary storage of energy is important.
[1067] These things are much more important than creating an electric supersonic VTOL jet.
[1068] Also, with planes, you really want that energy density for an aircraft, and this is improving over time.
[1069] So, you know, it's important that we accelerate the transition to sustainable energy.
[1070] That's why electric cars, it matters whether electric cars happen sooner or later.
[1071] You know, we're really playing a crazy game here with the atmosphere and the oceans.
[1072] We're taking vast amounts of carbon from deep underground and putting this in the atmosphere.
[1073] This is crazy.
[1074] We should not do this.
[1075] It's very dangerous.
[1076] So we should accelerate the transition to sustainable energy.
[1077] I mean, the bizarre thing is that, obviously, we're going to run out of oil in the long term.
[1078] You know, there's only so much oil we can mine and burn.
[1079] It's tautological.
[1080] We must have a sustainable energy transport and energy infrastructure in the long term.
[1081] So we know that's the end point.
[1082] We know that.
[1083] So why run this crazy experiment where we take trillions of tons of carbon from underground and put it in the atmosphere and oceans? This is an insane experiment. It's the dumbest experiment in human history. Why are we doing this? It's crazy. Do you think this is a product of momentum? That we started off doing this when it was just a few engines, a few hundred million gallons of fuel over the whole world, not that big of a deal, and then slowly but surely, over a century, it got out of control?
[1084] And now it's not just our fuel, but it's also, I mean, fossil fuels are involved in so many different electronics, so many different items that people buy.
[1085] It's just this constant desire for fossil fuels, constant need for oil, without consideration of the sustainability.
[1086] The thing is, like, oil, coal, gas, it's the easy money.
[1087] It's easy money. Have you heard about clean coal? The president's been tweeting about it. It's got to be real. Clean coal, all caps. Did you say he used all caps? CLEAN COAL. Um, well, you know, it's very difficult to put that CO2 back in the ground. It doesn't like being in solid form. Have you thought about something like that? Like some sort of a filter, a giant building-sized filter that sucks carbon out of the atmosphere? Is that possible? No, it doesn't... it's not possible.
[1088] No, no, no, definitely not.
[1089] So we're fucked.
[1090] No, we're not fucked.
[1091] I mean, this is quite a complex question.
[1092] Right.
[1093] You know, we're really just, the more carbon we take out of the ground and add to the atmosphere and a lot of it gets permeated into the oceans, the more dangerous it is.
[1094] Like, I don't think right now, I think we're okay right now.
[1095] We can probably even add some more.
[1096] but the momentum towards sustainable energy is too slow.
[1097] There's a vast base of industry, vast transportation system.
[1098] Like there's two and a half billion cars and trucks in the world.
[1099] And the new car and truck production, if it was 100 % electric, that's only about 100 million per year.
[1100] So it would take, if you could snap your fingers, and instantly turn all cars and trucks electric, it would still take 25 years to change the transport base to electric.
[1101] It makes sense, because how long does a car or truck last before it goes into the junkyard and gets crushed? About 20 to 25 years?
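(The fleet-turnover arithmetic, using the figures cited just above.)

```python
# Even with 100% electric production, replacing the existing fleet takes
# decades. Both figures below are the ones cited in the conversation.

fleet_size = 2_500_000_000          # cars and trucks in the world
annual_production = 100_000_000     # new vehicles per year

years_to_replace = fleet_size / annual_production
print(years_to_replace)  # 25.0, matching the ~20-25 year vehicle lifetime
```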
[1102] Is there a way to accelerate that process, like some sort of subsidies or some encouragement from the government financially?
[1103] Well, the thing that is going on right now is that there is an inherent subsidy in any oil-burning device. Any power plant or car is fundamentally consuming the carbon capacity of the oceans and atmosphere, or just say atmosphere for short.
[1104] So you can say, okay, there's a certain probability of something bad happening past a certain carbon concentration in the atmosphere.
[1105] And so there's some uncertain number where if we put too much carbon in the atmosphere, things overheat, oceans warm up, ice caps melt, ocean real estate becomes a lot less valuable.
[1106] Something's underwater.
[1107] And, but it's not clear what that number is.
[1108] But it's definitely...
[1109] The scientific consensus is really quite overwhelming.
[1110] Overwhelming.
[1111] I mean, I don't know any serious scientists.
[1112] Actually zero, literally zero, who don't think that we have quite a serious climate risk that we're facing.
[1113] And so there's fundamentally a subsidy occurring with every fossil fuel burning thing.
[1114] Power plants, aircraft, cars, frankly even rockets.
[1115] I mean, rockets use up, you know, they burn fuel.
[1116] But there's just, you know, with rockets, there's just no other way to get to orbit, unfortunately.
[1117] So it's the only way.
[1118] But with cars, there's definitely a better way with electric cars.
[1119] And to generate the energy, do so with photovoltaics, because we've got a giant thermonuclear reactor in the sky called the sun.
[1120] It's great.
[1121] It sort of shows up every day.
[1122] Very reliable.
[1123] So if you can generate energy from solar panels, stored with batteries, you can have energy 24 hours a day.
[1124] And then you can send it to the poles or near to the north with high voltage lines.
[1125] Also, the northern parts of the world tend to have a lot of hydropower as well.
[1126] But anyway, all fossil fuel -powered things have an inherent subsidy, which is their consumption of the carbon capacity of the atmosphere and oceans.
[1127] So people tend to think of like, why should electric vehicles have a subsidy?
[1128] But they're not taking into account that all fossil fuel burning vehicles fundamentally are subsidized by the cost, the environmental cost to Earth.
[1129] but nobody's paying for it.
[1130] We are going to pay for it, obviously, in the future.
[1131] We will pay for it.
[1132] It's just not paid for now.
[1133] Now, what is the bottleneck in regards to electric cars and trucks and things like that?
[1134] Is it battery capacity?
[1135] Yeah, I've got to scale up production.
[1136] Got to make the car compelling.
[1137] Make it better than gasoline or diesel cars.
[1138] Make it more efficient in terms of, like, the distance it can travel.
[1139] Yeah, you've got to
[1140] be able to go far enough and recharge fast. And your Roadster, you're anticipating 600 miles, is that correct? Yeah. What is that right now? Like, have you driven one 600 miles? No. We could totally make one right now that would do 600 miles, but it'd be too expensive. So, like, how much more? Well, you know, just have a 200 kilowatt-hour battery pack and you can go 600 miles. Right. Versus what do you have now? A 330-mile range.
[1141] So that's plenty for most people.
[1142] What is that in terms of kilowatts?
[1143] Well, that would be, for a Model S, a 100 kilowatt-hour pack.
[1144] It'll do about 330 miles, maybe 335.
[1145] Some people have hyper-miled it to 500 miles.
[1146] Hyper -miled it?
[1147] What does that mean?
[1148] Yeah, I just like go on...
[1149] 45 miles an hour or something?
[1150] Yeah, they're like 30 miles an hour.
[1151] It's like on level ground with...
[1152] You pump the tires up really well and go on a smooth
[1153] surface, and you can go for a long time.
[1154] But you can like definitely comfortably do 300 miles.
[1155] Is there any?
[1156] This is fine for most people.
[1157] Usually 200 or 250 miles is fine.
[1158] 300 miles is... you don't even think about it, really.
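(The pack-size math implied by these figures: roughly 300 watt-hours per mile for a Model S, which is why ~600 miles calls for a pack on the order of 200 kWh.)

```python
# The energy-per-mile arithmetic behind the pack sizes mentioned above.

pack_kwh = 100      # Model S pack, as cited
range_miles = 330   # its typical range, as cited

wh_per_mile = pack_kwh * 1000 / range_miles
print(round(wh_per_mile))                 # ~303 Wh per mile
print(round(200 * 1000 / wh_per_mile))    # a 200 kWh pack -> ~660 miles
```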
[1159] Is there any possibility that you could have it solar-powered one day, especially in Los Angeles?
[1160] I mean, as you said about that giant nuclear reactor a million times bigger than Earth just floating in the sky, is it possible that one day you'll be able to just power all these cars just on solar power.
[1161] I mean, we don't ever have cloudy days; if we do, there's three of them.
[1162] Well, the surface area of a car is without making the car like really blocky or having some...
[1163] Like a G -wagon.
[1164] Yeah, and just, like, having a lot of surface area where, like, maybe solar panels fold out or something.
[1165] Like your E-Class.
[1166] That's what we need.
[1167] The E -type?
[1168] Yeah, the E-Type.
[1169] Jaguar E-Type with a giant long hood.
[1170] That could be a giant solar panel.
[1171] Well, at the beginning of Tesla, I did want to have this, like, unfolding solar panel thing.
[1172] They'd press a button, and it would just, like, unfold these solar panels and, like, charge, recharge your car in the parking lot.
[1173] Ah.
[1174] Yeah, we could do that, but I think it's probably better to just put that on your roof.
[1175] Right.
[1176] And then it's going to just be facing the sun all the time.
[1177] Because, like, otherwise your car could be in the shade, you know, it could be in the shade, it could be in a garage or something like that.
[1178] Yeah.
[1179] Didn't a Fisker have that on the roof?
[1180] On the roof, the Fisker Karma, new generation. I believe it was only for the radio.
[1181] Is that correct?
[1182] Yeah, I mean, I think it could, like, recharge like two miles a day or something.
[1183] Did you laugh when they started blowing up, when they got hit with water?
[1184] Do you remember what happened?
[1185] They got, what?
[1186] Yeah, when they had a dealership or, the Fisker Carmas were parked.
[1187] Was that the one with the flood in Jersey?
[1188] Yes, yes.
[1189] When the hurricane came in, they got overwhelmed with water.
[1190] and they all started exploding.
[1191] There's this fucking great video of it.
[1192] Did you watch the video?
[1193] I didn't watch the video, but I did see it.
[1194] It was a picture of the, the aftermath.
[1195] I'd be naked, lubed up.
[1196] I watched that video, laughed my ass off.
[1197] They all blow up.
[1198] They got wet and they blew up.
[1199] That's not good.
[1200] Yeah, we made our battery waterproof so that doesn't happen.
[1201] Smart move.
[1202] Yeah, there was a guy in Kazakhstan, I think it was Kazakhstan, that just boated through an underwater
[1203] tunnel, like a flooded tunnel. He just turned the wheels to steer and pressed the accelerator, and it just floated through the tunnel, and he steered around the other cars. You're like, that's amazing. It's on the internet. What happens if your car gets a little sideways? Like if you're driving in snow. Like, what if your Autopilot is on and you're in, like, Denver, and it snows out and your car gets a little sideways, does it correct itself? Oh yeah, it's got great traction control. But does it know how to, like, correct? You know, like, if your ass end kicks out, does it know how to counter-steer? Oh yeah, no, it's really good. It knows how to do it. Whoa. It's pretty crazy. That's pretty crazy, yeah. So, like, if you're going sideways, it knows how to correct itself? It generally won't go sideways. It won't? No. Why not? It will correct itself before it goes sideways. Even on black ice? Yeah. There's this video... You could see the car, the traction control system is very good.
[1204] It makes you feel like Superman.
[1205] It's great.
[1206] You feel like you can, like, it will make you feel like this incredible driver.
[1207] I believe it.
[1208] Yeah.
[1209] Now, how do you program that?
[1210] We do our testing on like an ice lake in Sweden.
[1211] Oh, really?
[1212] Yeah, and like Norway and Canada and a few other places.
[1213] Porsche does a lot of that too.
[1214] New Zealand as well.
[1215] They do a lot of their... they do some of
[1216] their driver training school on these frozen surfaces.
[1217] So you're just, the car is going sideways whether you like it or not, and you have to learn how to slide into corners and how to adjust.
[1218] Well, electric cars have really great traction control because the reaction time is so fast.
[1219] Right.
[1220] So with the gasoline car, you've got a lot of latency.
[1221] It takes a while for the engine to react, whereas the electric motor is incredibly precise.
[1222] That's why, like, you can imagine, if you had a printer or something, you wouldn't have a gasoline-engine printer.
[1223] That would be pretty weird.
[1224] Or, like, a surgical device.
[1225] It's going to be an electric motor on the surgical device, on the printer.
[1226] Gasoline engine's going to be just chugging away.
[1227] It's not going to have the reaction time.
[1228] But to an electric motor, it's operating at the millisecond level.
[1229] So it can turn traction on and off within, like, inches of getting on, let's say, a patch of ice.
[1230] It'll turn traction off and then turn it on a couple inches right after the ice, like a little patch of ice.
[1231] Because in the frame of the electric motor, you're moving incredibly slowly.
[1232] You're like a snail.
[1233] You're just moving so slowly.
[1234] because it can see at 1,000 frames a second.
[1235] And so, say, one Mississippi, it just thought about things a thousand times.
[1236] So it's realized that your wheels are not getting traction.
[1237] It understands there's some slippery surface that you're driving on.
[1238] Yes.
[1239] And it makes adjustments in real time.
[1240] Yes.
[1241] And milliseconds.
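(A toy version of the millisecond-scale traction-control loop just described: sample wheel slip at roughly 1 kHz and cut motor torque the instant slip exceeds a threshold. Illustrative only, not Tesla's actual controller.)

```python
# Toy sketch of a ~1 kHz electric-motor traction-control tick, in the
# spirit of the description above. Thresholds and the torque-cut factor
# are illustrative assumptions.

def traction_step(wheel_speed_ms: float, vehicle_speed_ms: float,
                  requested_torque_nm: float, slip_limit: float = 0.1) -> float:
    """One ~1 ms control tick: return the torque actually commanded."""
    if vehicle_speed_ms < 0.1:
        return requested_torque_nm            # avoid divide-by-zero at rest
    slip = (wheel_speed_ms - vehicle_speed_ms) / vehicle_speed_ms
    if slip > slip_limit:
        return requested_torque_nm * 0.2      # wheel spinning: cut torque hard
    return requested_torque_nm

# On a patch of ice the wheel over-speeds; torque drops within one tick:
print(traction_step(34.0, 30.0, 400.0))   # slipping -> 80.0 Nm
print(traction_step(30.3, 30.0, 400.0))   # gripping -> 400.0 Nm
```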
[1242] That would be so much safer than a regular car.
[1243] Yes.
[1244] It is.
[1245] Just that alone, for loved ones, you'd want them to be driving in your car?
[1246] I'm on board.
[1247] Fuck motors.
[1248] Dude, fuck regular motors.
[1249] The S, X, and 3 have the lowest probability of injury of any cars ever tested by the U.S. government.
[1250] Wow.
[1251] It is, yeah.
[1252] But it's pretty funny.
[1253] It's pretty crazy.
[1254] Like we, you know, people still sue us.
[1255] Like, they'll have some accident at 60 miles an hour where they've, like, twisted an ankle, and they sue. Like, they would be dead in another car, and they still sue us.
[1256] But that's to be expected.
[1257] It is to be expected.
[1258] Do you take that into account with like the same sort of fatalistic, you know, undertones, just sort of just go, uh, you got to let it go.
[1259] This is what people do.
[1260] That's what it is. I'll tell you, I've got
[1261] Quite a lot of respect for the justice system.
[1262] Judges are very smart.
[1263] And, like, so far, I've found judges to
[1264] be very good at justice. And juries are good too, like, they're actually quite good. You know, you read about occasional errors in the justice system, but let me tell you, most of the time they're very good. And, like, the guy I mentioned, who fell asleep in the car and rode over a cyclist, you know, that was what encouraged me to get Autopilot out as soon as possible.
[1265] That guy sued us.
[1266] He sued you for falling asleep.
[1267] Yes, he, I'm not kidding, he blamed it on the new car smell.
[1268] What?
[1269] Yes.
[1270] He blamed his falling asleep on your new car smell.
[1271] There's someone who's a lawyer, this is a real thing that happens.
[1272] Somewhere there's a lawyer that thought that through in front of his laptop before he wrote that up.
[1273] Yes, he got a lawyer and he sued us, and the judge was like, this is crazy.
[1274] Stop bothering me. No. Thank God.
[1275] Yes.
[1276] Thank God.
[1277] Thank God.
[1278] There's a judge out there with a brain.
[1279] I'll tell you, judges are, judges are very good.
[1280] Some of them.
[1281] What about that judge?
[1282] Who put a lot of boys up the river in Pennsylvania, who was selling those kids out?
[1283] You know about that story?
[1284] No. Judge was selling young boys to prisons.
[1285] He was, like, literally, yeah, literally taking bribes.
[1286] Was this an elected judge? Or, like, sometimes you have a judge that's, like, actually a politician.
[1287] No, he's an elected judge.
[1288] This is a very famous story. He's in jail right now, I think, for the rest of his life.
[1289] And he put away, he would take, like, a young boy would do something like steal something from a store.
[1290] And he would put him in detention for, you know, five years, something ridiculously egregious.
[1291] And they investigated his history and they found out that he was literally being paid off.
[1292] Was it by private prisons?
[1293] Is that what the deal was?
[1294] There was some sort of...
[1295] Anyway, this judge is...
[1296] Two judges?
[1297] Two judges.
[1298] Kids for cash scandals, what it's called.
[1299] 2008, yeah.
[1300] Common pleas judges.
[1301] So I think they are elected.
[1302] And who was paying them?
[1303] Oh.
[1304] Someone...
[1305] It was proven to the point where they're in jail now that someone was paying them to put more asses in the seats in these private prisons.
[1306] A million-dollar payment to put them in the youth centers.
[1307] The builder made a million-dollar payment. I do think this private prisons thing is creating a bad incentive. Right. Yes. But, I mean, that judge is in prison. Thank God. Yes. But for people who think perhaps the justice system consists entirely of judges like that, I want to assure you this is not the case. The vast majority of judges are very good. I agree. And they care about justice. They could have made a lot more money if they wanted to be a trial lawyer, and instead they made less money, because they care about justice.
[1308] And that's why they're judges.
[1309] I feel that same way about police officers.
[1310] I feel like there's so many interactions with so many different people with police officers that the very few that stand out that are horrific, we tend to look at that.
[1311] Like, this is evidence that police are all corrupt.
[1312] And I think that's crazy.
[1313] No, most police are very honest.
[1314] Yes.
[1315] And, like, the military personnel that I know are very honorable, ethical people, and much more honorable and ethical than the average person.
[1316] That's my impression.
[1317] That is my impression as well.
[1318] And that's not to suggest that we be complacent and assume everyone's honest and ethical.
[1319] And obviously, if somebody is given a trusted place in society, such as being a police officer or a judge, and they are corrupted, then we must be extra vigilant against such situations and take action.
[1320] But we should not think that this is somehow broadly descriptive of people in that profession.
[1321] I couldn't agree more.
[1322] I think there's also an issue with one of the things that happens with police officers, prosecutors, and anyone that's trying to convict someone or arrest someone is that it becomes a game.
[1323] And in games, people want to win.
[1324] And sometimes people cheat.
[1325] Yes.
[1326] Yes.
[1327] I mean, you know, if you're a prosecutor, you should not always want to win.
[1328] There are times when you should like, okay, I just should not want to win this case.
[1329] And then just pass on that case. Sometimes people want to win too much. That is true. I think also it becomes tough if you're, like, a district attorney. You know, you tend to sort of see a lot of criminals, and then your view of the world can get negative, because you're just interacting with a lot of criminals. But actually, most of society does not consist of criminals.
[1330] Right.
[1331] And I actually had this conversation at dinner several years ago with the district attorney.
[1332] I was like, man, it must sometimes seem pretty dark because, you know, man, there's some terrible human beings out there.
[1333] And he was like, yep.
[1334] And he was like dealing with some case which consisted of a couple of old ladies that would run people over somehow for insurance money.
[1335] It was rough.
[1336] I was like, wow, that's pretty rough.
[1337] It's like hard to maintain faith in humanity if you're a district attorney, but, you know, it's only a few percent of society that are actually bad.
[1338] And then if you go to the worst, say, 0.1 percent of society, or the worst one in a thousand, one in a million, you know, like, how bad is the one-in-a-million worst person in the United States?
[1339] Pretty damn bad.
[1340] Like, damn evil. Like the... the millionth... Well, one-in-a-million evil is so evil, people cannot even conceive of it. But there's 330 million people in the United States, so that's 330 people out there somewhere. But by the same token, there's also 330 people who are incredible angels and unbelievably good human beings. Yeah, on the other side. But because of our fear of danger, our thoughts tend to gravitate towards the worst-case scenario.
[1341] Yes.
[1342] And we want to frame that.
[1343] And it's one of the real problems with prejudice.
[1344] Whether it's prejudice towards different minorities or prejudice towards police officers or anything.
[1345] It's like we want to look at the worst-case scenario and say, this is an example of what this is all about.
[1346] And you see that even with people, how they frame genders.
[1347] Some men frame women like that.
[1348] They get ripped off by a few women, and they decide all women are evil.
[1349] Some women get fucked over by a few men, all men are shit.
[1350] And this is very toxic.
[1351] It is.
[1352] It's also a very unbalanced way of viewing the world, and it's very emotionally based, and it's based on your own experience, your own anecdotal experience.
[1353] And it can be very influential to the people around you, and it's just, it's a dangerous way, it's a dangerous thought process and pattern to promote.
[1354] It is.
[1355] It is a very dangerous thought pattern.
[1356] I really think, you know, people should give other people the benefit of the doubt and assume that they're good until proven otherwise.
[1357] And I think really most people are actually pretty good people.
[1358] Nobody's perfect.
[1359] They have to be.
[1360] If you think of the vast numbers of us that are just interacting with each other constantly, we have to be better than we think we are.
[1361] There's no other way.
[1362] I mean, here are these weapons.
[1363] But how many times... nobody's presumably tried to murder you? Nobody yet.
[1364] Yes, nobody's, like... but there's swords right there.
[1365] Fake flamethrower here.
[1366] Exactly.
[1367] Not a flamethrower.
[1368] Now we've got a real problem.
[1369] I'm going to put it on that side too.
[1370] I'm going to leave it for the guests.
[1371] Yeah.
[1372] I'm like, look, man, if I say something that's fucked up, it's right there.
[1373] It'd liven things up for sure.
[1374] It's guaranteed to make any party better.
[1375] Yeah.
[1376] Well, that's, I mean, that's the armed civilization theory, right?
[1377] That an armed community is a safe and polite community.
[1378] You ever been to Texas?
[1379] It's kind of true.
[1380] Yeah, I mean.
[1381] People in Texas are super polite and everybody's got a gun.
[1382] Yes.
[1383] Don't make somebody angry.
[1384] I don't know what's going to happen.
[1385] Yeah, it's not a good move.
[1386] Piss people off when everybody can have a gun.
[1387] Yeah.
[1388] Better off to just let that guy get in your lane.
[1389] Yeah.
[1390] Yeah.
[1391] You know, we've got a big test site in central Texas near Waco.
[1392] Oh, beautiful.
[1393] Yeah, SpaceX in McGregor.
[1394] It's about 15 minutes away from Waco.
[1395] That's close to where Ted Nugent lives.
[1396] It is?
[1397] Shout out to Ted Nugent.
[1398] Okay, cool.
[1399] Yeah.
[1400] Yeah, there, you know, we have lots of fire and loud explosions and things, and people there are cool with it.
[1401] They don't give a fuck out there.
[1402] They're very supportive.
[1403] Yeah, you can buy fireworks where, you know, your kids go to school.
[1404] Yeah, it's, you know, it's dangerous.
[1405] Yeah, but it's free.
[1406] It's free.
[1407] There's something about Texas that's very enticing because of that.
[1408] It is dangerous.
[1409] But it's also free. Right. Yeah. I kind of like Texas, actually. Well, I prefer it over places that are more restrictive but more liberal, because you could always be liberal. Like, just because things are free, and just because you have a certain amount of right-wing-type characters, it doesn't mean you have to be that way, you know? And honestly, there's a lot of those people that are pretty fucking open-minded and let you do whatever you want to do. Right. As long as you don't bother them. Yeah. Yeah, exactly.
[1410] That's my hope right now, with the way we're able to communicate with each other today and how radically different it is than generations past: that we all just... the dust settles.
[1411] And we all realize, like what you were saying, that most people are good.
[1412] Most people are good.
[1413] The vast majority.
[1414] Yes.
[1415] I think you should give people the benefit of the doubt, for sure.
[1416] I think you're right.
[1417] Yeah.
[1418] You know what could help that?
[1419] Mushrooms.
[1420] Mushrooms.
[1421] Don't you think?
[1422] They're delicious.
[1423] Yeah.
[1424] Right. Yeah, they're good for you too. Yeah. All of them, all kinds of them. Um, what do you see, in terms of... when you think about the future of your companies, what do you see as, like, bottlenecks? You want some more of this? Uh, sure, thank you. What do you see in terms of, like, bottlenecks, of things that are holding back innovation? Is it regulatory commissions and people that don't understand the technology that are influencing policy? Like, what could potentially be holding you guys back right now?
[1425] Is there anything that you would change?
[1426] Yeah, that's a good question.
[1427] You know, I wish politicians were better at science.
[1428] That would help a lot.
[1429] That's a problem.
[1430] Yes.
[1431] There's no incentive for them to be good at science.
[1432] There isn't.
[1433] Actually, you know, they're pretty good at science in China, I have to say.
[1434] Yeah, the mayor of Beijing has, I believe, an environmental engineering degree, and the deputy mayor has a physics degree.
[1435] I met them.
[1436] And the mayor of Shanghai is really smart.
[1437] You're up on technology.
[1438] What do you think about this government policy of stopping use of Huawei phones?
[1439] And there's something about the worry about spying.
[1440] I mean, from what I understand from real tech people, they think it's horseshit.
[1441] Oh, Huawei phones? I don't know. I don't know. Um, like, the government saying don't buy Huawei phones, are you up on that at all? No. Should we just abandon this idea? Well, I think, like, I guess if you have, like, top-secret stuff, then you want to be pretty careful about what hardware you use. But, you know, most people do not have top-secret stuff. Right. And, like, nobody really cares what porn you watch. You know, like, nobody actually cares.
[1442] And if they do, that's kind of on them.
[1443] Like, like, national spy agencies do not care.
[1444] Do not give a rat's ass what porn you watch.
[1445] They do not care.
[1446] So, like, what secrets does a national spy agency have to learn from the average citizen?
[1447] Nothing.
[1448] Well, that's the argument against the narrative.
[1449] And the argument by a lot of these tech people is that the real concern is that these companies, like Huawei, are innovating at a radical pace.
[1450] And they're trying to stop them from integrating into our culture and letting this...
[1451] Right now, they're the number two cell phone manufacturer in the world.
[1452] Okay.
[1453] Samsung's number one.
[1454] Huawei is number two.
[1455] Apple is now number three.
[1456] They surpassed Apple as number two.
[1457] And the idea is that this is all taking place without them having any foothold whatsoever in America.
[1458] There are no carriers that have their phones.
[1459] You have to buy their phones unlocked through some sort of a third party and then put...
[1460] So, and the worry is, you know, that these are somehow, they're controlled by the Chinese government.
[1461] The communist Chinese government is going to distribute these phones.
[1462] And I don't know if the worry is economic influence, that they'll have too much power.
[1463] I don't know what it is.
[1464] Are you paying attention to any of this?
[1465] Not really.
[1466] No. I don't think we should worry too much about Huawei phones.
[1467] Maybe, you know, our national security agencies shouldn't have Huawei phones.
[1468] Maybe that's a question mark.
[1469] But I think for the average citizen, it doesn't matter.
[1470] I'm pretty sure the Chinese government does not care about the goings-on of the average American citizen.
[1471] Is there a time where you think that there will be no security, where it will be impossible to hold back information, that whatever bottleneck will let go, and we're going to give in?
[1472] That whatever bottleneck between privacy and ultimate innovation will have to be bridged in order for us to achieve the next level of technological proficiency, that we're just going to abandon it.
[1473] And there'll be no security, no privacy.
[1474] Do people want privacy?
[1475] Because they seem to put everything on the Internet.
[1476] Well, right now they're confused.
[1477] But when you're talking about your neuralink and this idea that one day, We are going to be able to share information and we're going to be some sort of a thing that symbiotically create, symbiotically connected.
[1478] I think we really need to worry about security in that situation for sure.
[1479] That's, like... security is very paramount there.
[1480] Sure.
[1481] But also what we will be is we'll be so much different.
[1482] Our concerns about money, about status, about wealth, all these things will seemingly go by the wayside if we really become enlightened.
[1483] If we really become artificially enlightened by some sort
[1484] of an AI interface where we have this symbiotic relationship with some new internet-type connection to information.
[1485] What, you know, what happens then?
[1486] What is important and what is not important?
[1487] Is privacy important when we're all gods?
[1488] I mean, I think the things that we think are important to keep private right now, we probably will not think are important.
[1489] Shame, right?
[1490] Information, right?
[1491] What do we hide?
[1492] Emotions, what do we hide?
[1493] I mean, I think, like, I don't know, maybe it's like embarrassing stuff.
[1494] Right, embarrassing stuff.
[1495] But there's actually, like, not that much that's kept private that's actually relevant, that other people actually care about.
[1496] I mean, people think other people care about it, but they don't really care
[1497] about it, and certainly governments don't. Well, some people care about it, but then it gets weird when it gets exposed. Like Jennifer Lawrence, when all those naked pictures of her got exposed. Like, I think in some ways people liked her more, that they realized, like, she's just a person. That's just a girl who likes sex and is just alive and has a boyfriend and sends them messages, and now you get to look into it. And you probably shouldn't have, but somebody let it go and they put it online. And, all right, she seems to be doing okay. She's a person. She's just like you and me. And it's the same thing.
[1498] She's just in some weird place where she's on a 35-foot-tall screen with music playing every time she talks.
[1499] Yeah, I mean, I'm sure she's not happy about it, but she's clearly doing fine.
[1500] But once this interface is fully realized, where we really do become something far more powerful in terms of our cognitive ability, our ability to understand irrational thoughts and mitigate them, and we're all connected in some sort of an insane way, what are our thoughts on wealth, our thoughts on social status? Like, how many of those just evaporate? And our need for privacy? Maybe our need for privacy will be the ultimate bottleneck that we'll have to surpass. I think the things that we think are important now will probably not be important in the future, but there will be things that are important.
[1501] What will be more important?
[1502] They just are different things.
[1503] I don't know, there might be some war of ideas, potentially.
[1504] I don't think Darwin's going away.
[1505] Right.
[1506] That one's going to be there.
[1507] No, it's not.
[1508] Darwin will be there forever.
[1509] Forever.
[1510] It would just be a different arena, a different arena.
[1511] A digital arena.
[1512] Different arena.
[1513] Darwin's not going away.
[1514] What keeps you up at night?
[1515] Well, it's quite hard to run companies. Yeah. Especially car companies, I have to say. It's quite challenging. Is the car business the hardest of all the things you do? Yes, because it's a consumer business, as opposed to, like, SpaceX. And not that SpaceX is a walk in the park, but a car company... it's very difficult to keep a car company alive. It's very difficult. You know, there's only two companies in the history of American car companies that haven't gone bankrupt, and that's Ford and Tesla. That's it. Yeah, Ford rode out that crazy storm, huh?
[1516] They're the only ones. By the skin of their teeth. Shout out to the Mustang. Yeah, by the skin of their teeth. That is interesting, right?
[1517] Same with Tesla, we barely survived. How close did you get to folding?
[1518] Very close. 2008 is not a good time to be a car company, especially a startup car company, and especially an electric car company.
[1519] That was like stupidity squared.
[1520] And this is when you had those cool roadsters with the T-top?
[1521] Yeah.
[1522] With a target top?
[1523] Yeah.
[1524] We had like, use a highly modified lease chassis.
[1525] The body was all completely different.
[1526] By the way, that was a super dumb strategy that we actually did.
[1527] Why was it dumb?
[1528] It was based on two false premises.
[1529] One false premise was that we'd be able to cheaply convert the Lotus Elise and use that as a car platform, and that we would be able to use technology from this little company called AC Propulsion for the electric drivetrain and the battery.
[1530] Problem is the AC propulsion technology did not work in production, and we ended up using none of it in the long term, none of it.
[1531] We had to redesign everything. And then once you add a battery pack and an electric motor to the car, it got heavier, it got 30% heavier, invalidated the entire structure, all the crash structure. Everything had to be redone. I think less than 7% of the parts were common with any other car. Less than 7%? Everything, including tires and wheels, bolts, brakes, steering wheel... The steering wheel was, I think the steering wheel was almost the same, yes. The windscreen? No, I think the windscreen is the same, yes. I think we were able to keep the windscreen. Less than 7%, so that's right. Basically every body panel was different. The entire structure was different. We couldn't use, like, the HVAC system. The air conditioner was a belt-driven air conditioner, so now we needed something that was electrically driven.
[1532] We needed a new AC compressor.
[1533] And all that takes away from the battery life as well, right?
[1534] Yeah, we needed a small, highly efficient air conditioning system that fit in a tiny car and was electrically powered, not belt driven.
[1535] It was very difficult.
[1536] How much did those weigh?
[1537] Those cars?
[1538] The Roadster.
[1539] I think it was about 2,700 pounds.
[1540] It's still very light.
[1541] Depends on which version, 2,650 to 2,750
[1542] pounds, something like that. And what was the weight distribution? Um, it was about... well, there were different versions of the car, so it's about 55 on the rear, because it was rear-biased. Right. But not bad, like, considering a 911, which is, like, one of the most popular sports cars of all time, heavy rear-end bias. Well, I mean, yeah, the 911... the joke is that they managed to do it despite Newton not being on their side.
[1543] If you're fighting Newton, it's very difficult.
[1544] Well, it's like you've got this... the moments of inertia on a 911 don't make any sense.
[1545] They do, once you understand them.
[1546] Once you understand that.
[1547] You don't want to hang the engine off the ass.
[1548] This is not a wise move.
[1549] You don't want to let up on the gas when you're in a corner.
[1550] The problem with something where the engine is mounted over the rear axle, or off the rear axle towards the rear,
[1551] is that your polar moment of inertia is fundamentally screwed.
[1552] You cannot solve this.
[1553] It's unsolvable.
[1554] You're screwed.
[1555] You're screwed.
[1556] Like, essentially, if you spin the car like a top, that's your polar moment of inertia, you're just...
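(Why the rear-hung engine "fights Newton": the yaw, or polar, moment of inertia is the sum of m·r² about the center of mass, so heavy mass far from the center makes a spin harder to stop. The numbers below are toy values, not real 911 data.)

```python
# Toy illustration of the polar-moment point above: I = sum(m * r^2)
# about the center of mass. Hanging a heavy engine far out back raises
# the inertia that resists correcting a spin. Masses/positions are made up.

def yaw_inertia(masses_and_positions):
    """Polar moment of inertia (kg*m^2) about the center of mass."""
    total_mass = sum(m for m, _ in masses_and_positions)
    com = sum(m * x for m, x in masses_and_positions) / total_mass
    return sum(m * (x - com) ** 2 for m, x in masses_and_positions)

chassis = [(1000, 0.0)]                   # 1,000 kg body at the middle
mid_engine  = chassis + [(300, 0.3)]      # engine near the center
rear_engine = chassis + [(300, 2.0)]      # engine hung out back

print(round(yaw_inertia(mid_engine)))     # small: spins are easy to stop
print(round(yaw_inertia(rear_engine)))    # much larger: spins persist
```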
[1557] I promised I wouldn't swear on this show, by the way.
[1558] Really?
[1559] To who?
[1560] So, it was my friend.
[1561] Tell that friend to go fuck himself.
[1562] Who told you not to swear?
[1563] No, friend.
[1564] That's not a good friend.
[1565] Yes.
[1566] Only I would swear.
[1567] Realize you're fucking Elon Musk.
[1568] You can do whatever you want, man. If you ever get confused, call me. I'll swear in private.
[1569] Swear up a storm.
[1570] Just say frickin.
[1571] It's a fun way.
[1572] There's, like, old housewives,
[1573] moms and shit that have children.
[1574] Oh, this freaking thing.
[1575] Yeah.
[1576] But anyway, like the Porsche, it's kind of incredible how well Porsches handle given the physics.
[1577] Yes.
[1578] The moments of inertia are so messed up.
[1579] To actually still make it work well is incredible.
[1580] Well, if you know how to turn into the corner, once you get used to the feeling of it, there's actual benefits to it.
[1581] You know, there are some benefits.
[1582] I enjoyed it. The car I had before Tesla was a 911.
[1583] Oh, okay.
[1584] That was...
[1585] 997 or 6?
[1586] 997?
[1587] Yeah.
[1588] Great car, man. Yeah, I mean, particularly on the Porsche, once they had the variable-vane turbo, you didn't have the turbo lag. That was great. Yeah, that was really great. The turbo lag is like, you know, if you floor it, like, phone home, call your mom. The older ones, right. About an hour later, the car accelerates. And super dangerous too, because then the rear wheels start spinning. Yeah, yeah. There's something fun about it, though, like feeling that rear weight kicking around, you know? And again, there's a great feel to it. Yeah, yeah, I agree. But that's what I was talking about earlier, about that little car that I have, the '93 911.
[1589] It's just, it's not fast.
[1590] It's not the best handling car, but it's more satisfying than any other car I have because it's so mechanical.
[1591] It's like everything about it, like crackles and bumps, and it gives you all this feedback.
[1592] And I take it to the comedy store because when I get there, I feel like my brain is just popping and it's on fire.
[1593] It's like a strategy for me now; I've really stopped driving other cars there.
[1594] I drive that car there just for the brain juice, just for the interaction.
[1595] I mean, you should try a Model S P100D.
[1596] It'll blow your mind out of your skull.
[1597] Okay.
[1598] Tell me what to order.
[1599] I'll order it.
[1600] Model S P100D.
[1601] Okay.
[1602] That's the car that I drive.
[1603] Okay.
[1604] Okay, I'll go with the car you drive.
[1605] Okay.
[1606] It will blow your mind out of your skull.
[1607] How far can I drive?
[1608] I believe you.
[1609] How far can I go?
[1610] About 300 miles?
[1611] That's good.
[1612] For L.A., regular days.
[1613] You'll never notice the battery.
[1614] Never.
[1615] Never.
[1616] How hard is it to get, like, one of them crazy plugs installed in your house?
[1617] Is that difficult?
[1618] No, it's super easy.
[1619] It's like, yeah.
[1620] It's like a dryer plug.
[1621] It's like a dryer outlet.
[1622] Didn't you come up with some crazy tiles for your roof that are solar paneled?
[1623] Yeah, yeah.
[1624] I have it on my roof right now, actually.
[1625] I'm just trying it out.
[1626] It's like, the thing, it takes a while to, like, test roof stuff because roofs have to last a long time.
[1627] Right.
[1628] So, like, you want your roof to last, like... Could you put it over a regular roof?
[1629] Now, so there's two versions.
[1630] There's, like, the solar panels you put on a roof.
[1631] So, like, depends on whether your roof's new or old.
[1632] So if your roof's new, you don't want to replace the roof.
[1633] You want to put, like, solar panels on the roof.
[1634] Right.
[1635] So that's, like, retrofit, you know.
[1636] And then we're trying to make the retrofit panels look real nice.
[1637] But then the new product we're coming out with is, if you're building a house, or you're going to replace your roof anyway, then you make the tiles have solar cells
[1638] embedded in the tiles. And then it's quite a tricky thing, because you want to not see the solar cell behind the glass tile. So you have to really work with the glass and the various coatings and layers so that you don't see the solar cell behind the glass, otherwise it doesn't look right. Right. So it's really tricky. There it is, Jamie, put it up there. Yeah, and that looks good. See, if you look closely... if you're zooming right in, you can see the cell.
[1639] But if you zoom out, you don't see the cell.
[1640] Right.
[1641] Well, it looks cool, though.
[1642] That's hard.
[1643] That's really hard.
[1644] Because you have to have sunlight go through.
[1645] Right.
[1646] But when it gets reflected back out, it hides the fact that there's a cell there.
[1647] Now, are those available to the consumer right now?
[1648] Well, we have... I think... Are those on that roof right there?
[1649] Yes.
[1650] That's amazing.
[1651] Oh, that looks good.
[1652] Yeah.
[1653] Ooh, I like that.
[1654] That one is hard.
[1655] Oh, so you get that kind of fake Spanish-looking thing.
[1656] I like that.
[1657] That's French slate.
[1658] That's white people in Connecticut smoking pipes.
[1659] Look at that one.
[1660] Yeah.
[1661] That's badass, dude.
[1662] Those will actually work.
[1663] I believe you.
[1664] So the solar panels that are on that house that we just looked at, is that sufficient to power the entire home?
[1665] It depends on your energy.
[1666] On how efficient.
[1667] Yeah, yeah.
[1668] So generally, yes.
[1669] I would say it's probably for most.
[1670] It's going to vary, but anywhere from more than you need to maybe half.
[1671] Like, call it half to 1.5 of the energy that you need, depending on how much roof you have relative to living space.
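As a back-of-the-envelope version of that "half to 1.5" range, here is a minimal sketch; every figure in it (usable roof area, peak sun hours, cell efficiency, household usage) is an illustrative assumption, not a Tesla spec.

```python
# Rough solar-roof coverage estimate; all inputs are illustrative assumptions.

def coverage_ratio(roof_area_m2: float, sun_hours_per_day: float,
                   panel_efficiency: float, daily_usage_kwh: float,
                   irradiance_kw_per_m2: float = 1.0) -> float:
    """Fraction of a household's daily energy the roof might supply."""
    daily_generation_kwh = (roof_area_m2 * irradiance_kw_per_m2
                            * panel_efficiency * sun_hours_per_day)
    return daily_generation_kwh / daily_usage_kwh

# 40 m^2 of usable roof, ~5 peak sun hours/day, 18%-efficient cells, and a
# 30 kWh/day household give a ratio of about 1.2 -- inside the "half to 1.5"
# range; a smaller roof or a bigger house pushes it toward the low end.
print(round(coverage_ratio(40, 5, 0.18, 30), 2))  # 1.2
```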
[1672] And how ridiculous you are with TV.
[1673] TV is no problem.
[1674] Air conditioning.
[1675] Air conditioning.
[1676] Air conditioning is the problem.
[1678] It depends on whether you have an efficient air conditioner, and whether you're air conditioning rooms when they don't need to be air conditioned, which is very common, because it's a pain in the neck.
[1679] It's like programming a VCR.
[1680] It's like, you know, it's just a blinking 12.
[1681] So people just like, I'll have that.
[1682] I'm just going to make it this temperature all day long.
[1683] Right.
[1684] They don't have a smart home where if you're in the room, then it stays cool.
[1685] Right.
[1686] Yeah.
[1687] It should predict when you're going to be home and then cool the rooms that you're likely to use with a little bit of intelligence.
[1688] We're not talking about, like, genius home here.
[1689] We're just talking, like, elementary, basic stuff.
[1690] Right.
[1691] You know, like, if you could hook that into the car, like, it knows you're coming home.
[1692] Like, there's no point cooling the home, keeping the home really cool when you're not there.
[1693] Right.
[1694] But if it can tell you're coming home, it's going to cool it to the right temperature.
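A minimal sketch of the "elementary, basic" logic being described, assuming a hypothetical smart-home setup; the Room fields, the cool-down rate, and the car-ETA input are invented placeholders, not any real Tesla or home-automation API.

```python
# Presence-aware pre-cooling sketch; every interface here is hypothetical.
from dataclasses import dataclass

@dataclass
class Room:
    name: str
    current_temp_f: float
    likely_occupied: bool  # e.g., from a learned occupancy schedule

def precool_now(rooms: list, car_eta_minutes: float, target_f: float = 72.0,
                cooldown_rate_f_per_min: float = 0.3) -> list:
    """Rooms whose cooling should start immediately so they reach the
    target temperature right as the occupant arrives."""
    start = []
    for room in rooms:
        if not room.likely_occupied:
            continue  # don't cool rooms nobody is likely to use
        minutes_needed = max(0.0, (room.current_temp_f - target_f)
                             / cooldown_rate_f_per_min)
        if minutes_needed >= car_eta_minutes:
            start.append(room.name)  # start cooling now or it won't be ready
    return start

rooms = [Room("living room", 82.0, True), Room("guest room", 85.0, False)]
print(precool_now(rooms, car_eta_minutes=25))  # ['living room']
```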
[1695] Do you have an app that works with your solar panels?
[1696] or anything like that?
[1697] Yeah.
[1698] Yeah, we do.
[1699] But we need to hook it into the air conditioning to really make the air conditioning work.
[1700] Have you thought about creating an air conditioning system?
[1701] I know you have.
[1702] Trick question.
[1703] Cannot answer questions about future of potential products.
[1704] Okay, let's just let it go.
[1705] We'll move on to the next thing.
[1706] That would be an interesting idea.
[1707] Yeah, I would say.
[1708] Radiant heating, all that.
[1709] Good ideas.
[1710] Now, when you think about the efficiency of these homes and you think about implementing solar power and battery power and is there anything else that people are missing?
[1711] Is there any other?
[1712] Like, I just saw a smart watch that is powered by the heat of the human body and some new technology.
[1713] It's able to fully power that way?
[1714] I don't know if it's fully or if it's like this watch right here.
[1715] This is a Casio.
[1716] It's called a ProTech.
[1717] And it's, like, an outdoors watch, and it's solar-powered.
[1718] Okay.
[1719] And so it has the ability to operate for a certain amount of time on solar.
[1720] Yeah.
[1721] So if you have it exposed, it could function for a certain amount of time on solar.
[1722] Yeah, well, you know, like there's the self-winding watches where, you know, it's just got a weight in the watch.
[1723] And as you move your wrist, the weight moves from one side to the other, and it winds the watch up.
[1724] That's a pretty cool thing.
[1725] Yeah, yeah.
[1726] Well, the amazing thing about, like, a Rolex is that it's all done mechanically.
[1727] There's no batteries in there.
[1728] There's no nothing.
[1729] Yeah, you could do the same.
[1730] Same thing, create a little charger that's based on wrist movement.
[1731] It really depends on how much energy your watch uses.
[1732] You know what's fucked up about that, though?
[1733] We accept a certain amount of, like, fuckery with those watches.
[1734] Like, I brought my watch.
[1735] I have a Rolex that my friend Lorenzo gave me, and I brought it to the watch store, and I said, this thing's always fast.
[1736] I said it's always, like, after a couple months, it's like five minutes fast.
[1737] And they go, yep.
[1738] They go, yeah.
[1739] Really?
[1740] It's just what it does.
[1741] And I go, hold on.
[1742] I go, so you're telling me that it just is always going to be fast.
[1743] They're like, yeah, it's just, like, every few months you've got to reset it.
[1744] It seems like they should recalibrate that thing.
[1745] They can't.
[1746] They tried.
[1747] They say every few months, whether it's four months or five months or six months, it's going to be a couple of minutes fast.
[1748] Okay, it seems like they should really recalibrate that because if it's always fast, you can just, you know, delete those minutes.
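For scale, taking "five minutes fast after a couple of months" at face value, the implied drift is only a few seconds a day:

```latex
\[
\frac{5\ \text{min}\times 60\ \mathrm{s/min}}{\approx 60\ \text{days}}
\approx 5\ \mathrm{s/day}
\]
```

That's the same order as the few-seconds-per-day tolerance typically quoted for mechanical movements; a quartz movement drifts far less.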
[1749] You need to fucking kick down the door at Rolex and go, you bitches are lazy.
[1750] It's kind of amazing that you can keep time mechanically on a wristwatch with these tiny little gears.
[1751] It's amazing.
[1752] Yeah.
[1753] I mean, the whole luxury watch market is fascinating.
[1754] I'm not that involved in terms of, like, I don't buy them.
[1755] I bought them as gifts, so I don't buy them for myself.
[1756] But when I look at them online, there's a million -dollar watches out there now that are like, they have like little rotating moons and stars.
[1757] And they're like... look at this thing.
[1758] How much is that one, Jamie?
[1759] These are fucking preposterous. I like gears. I love them. I think they're beautiful. But there's some of these people that are just taking it right in the ass. They're buying these watches for like $750,000. I'm like, yo, that's a Timex, son. Nobody knows it's not any better than some Casio that you could just buy online. Like, look at that, though. Well, here's the thing. If you're a person that doesn't just want to know the time, you want craftsmanship. You want some artisan's touch.
[1760] You want innovation in terms of, like, a person figuring out how gears and cogs all line up perfectly to every time it turns over, it's basically a second.
[1761] I mean, that's just, there's art to that.
[1762] Yeah, I agree.
[1763] Yeah, it's not just telling time.
[1764] I like this watch a lot, but if it got hit by a rock, I wouldn't be sad.
[1765] Yeah.
[1766] It's just a watch.
[1767] It's a mass-produced thing that runs on some quartz
[1768] battery. But those things, there's art. Yeah, no, I agree, it's beautiful. Yeah, love it. There's something amazing about it, because it represents human creativity. It's not just electronic innovation; there's this person's work in that. Yes. But you don't have a watch on. No. Ever? I used to have a watch. What happened?
[1769] My phone tells the time.
[1770] Good point.
[1771] What if you lose your phone?
[1772] You wait, hold on.
[1773] Let me guess.
[1774] You are a no -case guy.
[1775] That's correct.
[1776] Living on the edge.
[1777] Living on the edge without a case.
[1778] Neil deGrasse Tyson.
[1779] Neil deGrasse Tyson was in here last week.
[1780] I marveled at his ability to get through life without a case.
[1781] That's right.
[1782] You know, he takes his phone and he flips it in between his fingers.
[1783] Like a soldier would do with his rifle.
[1784] Really?
[1785] He just rolls that shit in between his fingers.
[1786] Okay.
[1787] Marvelous.
[1788] Wow.
[1789] He says that's the reason why they do it.
[1790] He said, when you look at someone who has a rifle, why would they do that?
[1791] Why would they flip it around like that?
[1792] Right.
[1793] So that when it goes to drop, they have it in their hand.
[1794] They catch it quickly.
[1795] So that's what he does with his phone.
[1796] He's just flipping his phone around all the time.
[1797] I got that in Mexico.
[1798] I was hoping it holds joints.
[1799] Does it do anything?
[1800] It seems to open.
[1801] It's just a hole.
[1802] You can store things in there.
[1803] But like try to put a joint in there.
[1804] Close it.
[1805] You put like one, one blunt.
[1806] One.
[1807] But it seems pretentious.
[1808] You know?
[1809] That's the idea behind it.
[1810] I bought it when I was in Mexico because I figured it would be a good size to hold joints.
[1811] Or not.
[1812] So is that a joint?
[1813] Or is it a cigar?
[1814] No. Okay.
[1815] It's marijuana inside of tobacco.
[1816] Okay.
[1817] So it's like pot, tobacco, posh.
[1818] You never had that?
[1819] Yeah, I think I tried one once.
[1820] Come on, man. You probably can't because of stockholders, right?
[1821] I mean, it's legal, right?
[1822] It's totally legal.
[1823] Okay.
[1824] How does that work?
[1825] Did people get upset at you if you do certain things?
[1826] There's tobacco and marijuana in there.
[1827] That's all it is.
[1828] The combination of tobacco and marijuana is wonderful.
[1829] I was first turned on to it by Charlie Murphy, and then reignited by Dave Chappelle.
[1830] There you go.
[1831] Plus whiskey.
[1832] Exactly.
[1833] Balances it out.
[1834] Alcohol is a drug.
[1835] It's been grandfathered in.
[1836] Well, it's not just a drug.
[1837] It's a drug that gets a bad rap.
[1838] Because you just have a little.
[1839] It's great.
[1840] Fine.
[1841] Yeah, a little sip here and there and your inhibitions are relaxed and it shows your true self and hopefully you're more joyous and friendly and happy and everything's good.
[1842] The real worry is the people that can't handle it. Like, the real worry is about people who can't handle cars that can go zero to 60 in 1.9 seconds, or anything.
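As a quick check on that figure: 0 to 60 mph (26.8 m/s) in 1.9 seconds is an average acceleration of roughly 1.4 g.

```latex
\[
a=\frac{\Delta v}{\Delta t}=\frac{26.8\ \mathrm{m/s}}{1.9\ \mathrm{s}}
\approx 14.1\ \mathrm{m/s^2}\approx 1.44\,g
\]
```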
[1843] Have you ever considered something that, like, imagine if one day everyone has a car that's on the same, at least technological standard as one of your cars.
[1844] And everyone agrees that the smart thing to do is not just to have bumpers, but to perhaps have some sort of magnetic repellent device, some electromagnetic field around the cars, so that as cars come close to each other, they automatically, radically decelerate because of magnets or something.
[1845] Well, I mean, our cars brake automatically.
[1846] Brake?
[1847] Yeah.
[1848] Yeah, when they see things.
[1849] Yes.
[1850] But like a physical barrier.
[1851] Like, well, the wheels work pretty well.
[1852] The wheels do.
[1853] Yeah, yeah.
[1854] They work pretty well.
[1855] They decelerate at, you know, 1.1 to 1.2 g, that kind of thing.
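Taking the quoted 1.1 to 1.2 g at face value, the implied stopping distance from 60 mph (26.8 m/s) works out to roughly:

```latex
\[
d=\frac{v^{2}}{2a}
=\frac{(26.8\ \mathrm{m/s})^{2}}{2\times 1.1\times 9.81\ \mathrm{m/s^{2}}}
\approx 33\ \mathrm{m}\ \ (\approx 109\ \mathrm{ft})
\]
```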
[1856] Is there a concern that one day all your cars will be on the road, and then there'll still be regular people with regular cars 20, 30 years from now, that'll get in the mix and be the main problem?
[1857] Yeah, I think it'd be sort of like, you know, it was a time of transition where there were horses and gasoline cars on the road at the same time.
[1858] That must have been pretty weird.
[1859] Oh, that would be the weirdest.
[1860] Yeah.
[1861] Horses were tricky.
[1862] You know, back in the day, Manhattan had like 300,000 horses.
[1863] You figure, like, a horse lives 15 years, so you've got 20,000 horses dropping dead every day... or every year, I should say.
[1864] Every year, I should say.
[1865] Every year it's 20,000 horses, if there's 300,000 horses and a 15-year lifespan.
[1866] And you've got to use a horse to move the horse.
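The steady-state arithmetic here checks out, and it also reconciles the "every day" slip:

```latex
\[
\frac{300{,}000\ \text{horses}}{15\ \text{yr}} = 20{,}000\ \text{horses/yr}
\approx \frac{20{,}000}{365} \approx 55\ \text{per day}
\]
```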
[1867] Right.
[1868] They'll probably get pretty freaked out if they have to move a dead horse.
[1869] Do you think they know what's going on?
[1870] Yeah, I mean, it's got to be like pretty weird.
[1871] No, I would imagine.
[1872] Why am I dragging this dead, you know, horse around, and I'm a horse.
[1873] Do you ever stop and think about your role in civilization?
[1874] Do you ever stop and think about your role in the culture?
[1875] Because me, as a person who never met you until today, when I think of you, you know, I've always thought of you as being this weirdo super-inventor dude who just somehow or another keeps coming up with new shit. But there's not a lot of you out there. Like, everybody else seems to... I mean, obviously you make a lot of money, and there's a lot of people that make a lot of money... You like that clock? Yeah. Pretty dope, right? This is a great clock. Yeah. You want one? I'll get you one. Sure. Okay, done. I like weird things like this. Oh, this is the coolest. It's TGT promotion... what is it, TGT Studios?
[1876] TGT Studios.
[1877] Yeah.
[1878] It's a gentleman who makes all this by hand.
[1879] Yeah, it's really cool.
[1880] My study is filled with weird devices.
[1881] Well, get ready for another one.
[1882] All right.
[1883] I'm sending it your way.
[1884] Cool.
[1885] You want a werewolf, too?
[1886] I hook you up.
[1887] All right.
[1888] I'll take one.
[1889] One werewolf and one clock coming up.
[1890] Do you think about your role in the culture?
[1891] Because me as a person who never met you until today.
[1892] I've always looked at you and like, wow, like, how does this guy just keep inventing shit?
[1893] Like, how do you keep coming up with all these new devices?
[1894] And do you ever consider... like, I had a dream once that there was a million Teslas.
[1895] Instead of, like, one Tesla, there was a million Teslas.
[1896] Okay.
[1897] Not Tesla the car, but Nikola.
[1898] Oh, yeah, sure.
[1899] And that in his day, there was a million people like him who were radically innovative.
[1900] It was a weird dream, man. It was so strange.
[1901] And I've had it more than once.
[1902] That would result in very rapid technological innovation.
[1903] That's for sure.
[1904] It's one of the only dreams in my life I've had more than one time.
[1905] Okay.
[1906] Like where I've woken up and it's in the same dream.
[1907] I'm in the same dream.
[1908] And in this dream, it's 1940s, 1950s.
[1909] But everyone is severely advanced.
[1910] There's flying blimps with LCD screens on the side of them and everything is bizarre and strange.
[1911] And it stuck with me. For whatever... obviously this is just a stupid dream, but for whatever reason, all these years, it stuck with me. Like, it took one man like Nikola Tesla to have more than a hundred inventions that were patents, right? I mean, he had some pretty great, pretty fucking amazing ideas. But in his day there were very few people like him. That's true. What if there was a million? Like, then things would advance very quickly, right? But there's not a million Elon Musks. One motherfucker. Do you think about that, or do you just try not to? Hmm. I don't think... I don't think you'd necessarily want to be me. Well, what's the worst part about you? I don't think people would like it that much. Well, most people wouldn't, but they can't be you. So that's, like, some superhero-type shit. You know, you wouldn't want to be Spider-Man.
[1912] You'd rather just sleep tight in Gotham City.
[1913] I hope he's out there doing his job.
[1914] It's very hard to turn it off.
[1915] Yeah.
[1916] What's the hardest part?
[1917] It might sound great if it's turned on, but what if it doesn't turn off?
[1918] Now, I showed you the isolation tank, and you've never experienced that before.
[1919] No. I think that could help you turn it off a little bit just for the night.
[1920] Yeah, just give you a little bit of sleep.
[1921] A little bit of perspective.
[1922] It's the magnesium that you get from the water as well that makes you sleep easier, because the water has Epsom salts in it.
[1923] But maybe some sort of strategy for sacrificing your biolog... not sacrificing, but enhancing your biological recovery time, by figuring out a way, whether it's through meditation or some other way, to shut off that thing at night.
[1924] Like you must have like a constant stream of ideas that's running through your head all the time.
[1925] You're getting text messages from chicks? No, I'm getting text messages from friends saying, what the hell are you doing smoking weed? Is that bad for you? It's legal. It's government-approved. It's not... you know, I'm not a regular smoker of weed. How often do you smoke it? Almost never. I mean, I don't actually notice any effect. Well, there you go. There was a time where, I think it was Ram Dass or someone, gave some Buddhist monk a bunch of acid.
[1926] Okay.
[1927] And he ate it and it had no effect on him.
[1928] I doubt that.
[1929] I would say that too, but I've never meditated to the level that some of these people have where they're constantly meditating all day.
[1930] They don't have any material possessions and all of their energy is spent trying to achieve a certain mindset.
[1931] I would like to cynically deny that.
[1932] I'd like to cynically say, they just fucking think the same way I do.
[1933] They just hang out with flip -flops on and make weird noises.
[1934] But maybe no. You know, I know a lot of people like weed, and that's fine.
[1935] But I don't find that it is very good for productivity.
[1936] For you?
[1937] Not for me. Yeah, I mean, I would imagine that for someone like you, it's not.
[1938] Someone like you, it would be more like a cup of
[1939] coffee, right? Yeah, it's more like the opposite of a cup of coffee. What, weed? Like a cup of coffee? Oh, weed is... yeah. No, I'm saying what would be beneficial to you would be, like, coffee. I like to get things done. I like to be useful. That is one of the hardest things to do, to be useful. When you say you'd like to get things done, like, in terms of what gives you satisfaction... when you complete a project, when something that you invent comes to fruition and you see people enjoying it, that feeling?
[1940] Yes, doing something useful for other people that I like doing.
[1941] That's interesting for other people.
[1942] Yes.
[1943] So do you think that that is maybe the way you... Do you recognize that you have this unusual position in the culture, where you can uniquely influence certain things because of this? I mean, you essentially have a gift, right? I mean, you would think it was a curse, but... I'm sure it's been fueled by many, many years of discipline and learning, but you essentially have a gift, in that you have this radical sort of creativity engine when it comes to innovation and technology. It's like you're going at very high RPMs all the time. What is that like? It doesn't stop.
[1944] I don't know what would happen if I got into a sensory deprivation tank.
[1945] Let's try it.
[1946] Sounds concerning.
[1947] It's like running the engine with no resistance.
[1948] Is that what it is, though?
[1949] Maybe it's not.
[1950] Maybe it's fine.
[1951] I don't know.
[1952] I'll try it.
[1953] Have you ever experimented with meditation or anything?
[1954] Yes.
[1955] What do you do?
[1956] Or what have you done, rather?
[1957] I mean, you just sort of sit there and be quiet and then repeat some mantra, which acts as a focal point.
[1958] It does still the mind, but I don't find myself drawn to it frequently.
[1959] Do you think that perhaps productivity is maybe more attractive to you than enlightenment, or even the concept of whatever enlightenment means?
[1960] It's like, what are you trying to achieve when you're meditating all the time?
[1961] With you, it seems like almost like there's a franticness to your creativity that comes out of this burning furnace.
[1962] And in order for you to calm that thing down, you might have to throw too much water on it.
[1963] It's like a never -ending explosion.
[1964] Like, what is it like?
[1965] Like, try to explain it to a dumb person like me. What's going on in the head?
[1966] Never -ending explosion.
[1967] It's just constant ideas.
[1968] Just bouncing around.
[1969] Yes. Whew. Damn. Yeah. So when everybody leaves, it's just Elon sitting at home, brushing his teeth, just a bunch of ideas bouncing around your head? Yeah, all the time. When did you realize that that's not the case with most people? I think when I was, I don't know, five or six or something, I thought I was insane. Why did you think you were insane? Because it was clear that other people's minds weren't exploding with ideas all the time.
[1970] They weren't expressing it.
[1971] They weren't talking about it all the time.
[1972] And you realized by the time you were five or six, like, oh, they're probably not even getting this thing that I'm getting.
[1973] No. It was just strange.
[1974] It was like, hmm, I'm strange.
[1975] That was my conclusion.
[1976] I'm strange.
[1977] But did you feel diminished by it in any way?
[1978] Like, knowing that this is a weird thing that you really probably couldn't commiserate with, with other people?
[1979] They wouldn't understand you?
[1980] I hoped they wouldn't find out, because they might, like, put me away or something.
[1981] You thought that?
[1982] For a second, yes.
[1983] When you were little?
[1984] Yeah, and they put people away.
[1985] What if they put me away?
[1986] Like, when you were little, you thought this?
[1987] Yes.
[1988] Wow.
[1989] Like, you thought this is so radically different than the people that are around me. If they find out I got this stream coming in.
[1990] Yeah.
[1991] Wow.
[1992] But, you know, I was only like five or six.
[1993] Do you think this is like, I mean, there's outliers biologically.
[1994] I mean, there's people that are seven foot nine.
[1995] There's people that have giant hands.
[1996] There's people that have eyes that are 20/15 vision.
[1997] There's always outliers.
[1998] Do you feel like you caught this?
[1999] Like you've got some, you're like on some weird innovation, creativity sort of wave that's very unusual like you tapped into I mean just think of the various things you've been able to accomplish in a very short amount of time and you're constantly doing this that's a weird you're a weird person right I agree yeah like what if there's a million Elon Musk's well that would be very very weird who yeah it'd be pretty weird I agree real weird definitely yeah what if there are a million Joe Rogids oh there probably is it's probably two million I mean I think that's the case with a lot of folks yeah I mean but like you know my goal is like try to do useful things try to maximize it probably the future is good um make the future exciting something you look forward to you know you know with you know with Tesla we're like trying to make things that people love you know it's like not how many how many things can you buy that you really love that really give you joy so rare so rare I wish there were more things that's what we're trying to just make things that somebody loves when you when you think about making things that someone loves like do you specifically think about like what things would improve people's experience?
[2000] Like, what would change the way people interface with life that would make them more relaxed or more happy?
[2001] You really think, like, when you're thinking about things like that, is that, like, one of your considerations?
[2002] Like, what could I do that would help people?
[2003] Yeah.
[2004] That maybe they wouldn't be able to figure out.
[2005] Yeah, like, what are the set of things that can be done to make the future better?
[2006] Like, you know... so, a future where we are a space-faring civilization and out there among the stars, this is very exciting. This makes me look forward to the future. This makes me want that future. You know, there need to be things that make you look forward to waking up in the morning. You wake up in the morning and you look forward to the day, look forward to the future. A future where we are a space-faring civilization and out there among the stars, I think that's very exciting.
[2007] That is a thing we want.
[2008] Whereas if you knew we would not be a space -faring civilization but forever confined to Earth, this would not be a good future.
[2009] That would be very sad, I think.
[2010] We don't want the sad future.
[2011] Of the... just the finite lifespan of the Earth itself and the solar system itself... even though it's possibly, you know... I mean, how long do they feel like the sun and the solar system are going to exist?
[2012] How many hundreds of millions of years? Well, if you're asking when the sun boils the oceans, probably about 500 million years. So is it sad that we never leave, because in 500 million years that happens? Is that what you're saying? No, I just think, like, if there are two futures, and in one future we're out there among the stars, and the things we read about and see in science fiction movies, the good ones, are true... we have these starships, and we're going to see what other planets are like, and we're a multi-planet species, and the scope and scale of consciousness has expanded across many civilizations and many planets and many star systems... this is a great future. This is a wonderful thing to me, and that's what we should strive for. But that's... That's biological travel.
[2013] That's cells traveling physically to another location.
[2014] Yes.
[2015] Do you think that's definitely where we're going?
[2016] No. Yeah.
[2017] I don't think so either.
[2018] I used to think so.
[2019] And now I'm thinking it's less likely than ever... like, almost every day, less likely.
[2020] We can definitely go to the moon and Mars.
[2021] Yeah.
[2022] Do you think we'll colonize?
[2023] We can go to the asteroid belt and we can go to the moons of Jupiter, Saturn, you can even get to Pluto.
[2024] That'd be the craziest place ever if we colonized Mars and terraformed it and turned it into, like, a big Jamaica.
[2025] Just oceans and, I think that would be, I mean, imagine.
[2026] Great.
[2027] That'd be great.
[2028] It's possible, right?
[2029] We could turn the whole thing into Cancun.
[2030] Well, I mean, over time.
[2031] It wouldn't be easy, but yes.
[2032] Right.
[2033] You could just warm it up.
[2034] Yeah, you can warm it up.
[2035] You could add air.
[2036] You get some water there.
[2037] I mean, over time, hundreds of millions of years or whatever it takes.
[2038] We're going to be a multi -planet species.
[2039] That would be amazing.
[2040] We're a multi -planet species.
[2041] That's what we want to be.
[2042] Legitimately, like, air-conditioned Saturn.
[2043] I'm pro -human.
[2044] Me too.
[2045] Yeah.
[2046] Me too.
[2047] I love humanity.
[2048] I think it's great.
[2049] We're glad as a robot that you love humans because we love you too and we don't want you to kill us and eat us.
[2050] I mean, you know, strangely, I think a lot of people don't like humanity and see it as a blight, but I do not.
[2051] Well, I think part of that is just... you know, they've been struggling.
[2052] When people struggle, they associate their struggle with other people.
[2053] They never internalize their problems.
[2054] They look to other people as holding them back and people suck and fuck people.
[2055] And it's just, you know, it's a never -ending cycle.
[2056] But not always.
[2057] Again, most people are really good.
[2058] Most people, the vast majority.
[2059] This may sound corny.
[2060] It does sound corny.
[2061] But love is the answer.
[2062] It is the answer.
[2063] Yep.
[2064] Yeah, it is.
[2065] It sounds corny because we're all scared.
[2066] You know, we're all scared of trying to love people being rejected or someone taking advantage of you because you're trying to be loving.
[2067] Sure.
[2068] But if we all could just relax and love each other.
[2069] Wouldn't hurt to have more love in the world.
[2070] It definitely wouldn't hurt.
[2071] Yeah.
[2072] It'd be great.
[2073] Yeah, we should do that.
[2074] Yeah.
[2075] I agree, man. Really?
[2076] How do you get to fix that?
[2077] You have a love machine that you're working on?
[2078] No, but probably...
[2079] Spend more time with your friends and less time on social media. Now, deleting social media applications from your phone, that'll give you a 10% boost to happiness. What do you think the percentage is? I think probably something like that, yeah. That's about right. A good 10%. Yeah. I mean, the only thing I've kept is Twitter, because I kind of, like, need some means of getting a message out, you know. Right. Well, that's about it.
[2080] So far, so good.
[2081] Well, what's interesting with you, you actually occasionally engage with people on Twitter.
[2082] Yeah.
[2083] What percentage of that is a good idea?
[2084] Good question.
[2085] Probably 10%.
[2086] It's hard.
[2087] It's mostly, I think it's on balance more good than bad, but there's definitely some bad.
[2088] So hopefully the good outweighs the bad.
[2089] Do you ever think about how odd it is, the weird feeling that you get when someone says something shitty to you on Twitter and you read it? The weird feeling, this weird little negative jolt. It's like a subjective negative jolt of energy you don't really need to absorb, but you do anyway. Like, well, fuck this guy. Fuck him. I mean, there's a lot of negativity on Twitter. It is, but it's weird in its form. Like, if you ingest it as if you're trying to be a little scientist as you're ingesting it, you're like, wow, how weird is this? I'm getting upset at some strange person saying something mean to me, and it's not even accurate. I mean, the vast number of negative comments... for the vast majority, I just ignore them. The vast majority, yeah, but every now and again you're drawn in. It's not good. It's not good. You make mistakes? Yes, you can make mistakes. We're all human. We can make mistakes. Yeah, it's hard. And people love it when you say something and you take it back. They're like, fuck you, we saved it forever. We fucking screenshot that shit, bitch. You had that thought. You had that thought. You're like, well, I deleted it. Not good enough. You had the thought. I'm better than you. I never had that thought. You had that thought, you piece of shit. Look, I saved it. I put it on my blog. Yeah, I'm not sure why people think that deleting a tweet makes it go away. It's like, hello, the internet's been around for a while. Yeah, well, the thing is, they don't want you to be able to delete it, because the problem is, if you don't delete it and you don't believe it anymore, it's really hard to say, hey, that thing above, I don't really believe that anymore, I changed the way I view things. Yes. Because people go, well, fuck you, I have that over there. I'm going to just take that. I'm not going to pay attention to that shit you wrote underneath. It's on your permanent record. It's forever, bro. Like high school: we'll put this on your permanent record.
[2090] It's like a tattoo.
[2091] You keep it.
[2092] Oh, yeah.
[2093] Yeah.
[2094] Well, it's this thing where there's a, there's a lack of compassion.
[2095] It's a lack of compassion issue.
[2096] People are just, like, intentionally shitty to each other all the time online, trying to catch you. They're trying to catch people doing something that's arrestable, like a cop trying to, you know, get arrests on his record.
[2097] It's like they're trying to catch you with something, more than they're logically looking at it and thinking it's a bad thing that you've done, or that it's an idea they disagree with so much they need to insult you. Trying to catch you, yeah. I mean, it's way easier to be mean on social media than it is to be mean in person. Yes, way easier. Yeah. It's weird. It's not a normal way of humans interacting. It's cheating. We're not supposed to be able to interact so easily with people we're not looking at. Yes. You would never do that. You'd never be so mean to somebody looking in their eyes, and if you did, you'd feel like shit. Most people, yeah. Unless you're a sociopath, you'd feel terrible. Yes. Elon Musk, this has been a pleasure. Yeah, likewise. It really has been. It's been an honor. Thank you for having me. Thanks for doing this, because I know you don't do a lot of long-form stuff like this. I hope I didn't weird you out, and I hope you don't get in trouble for smoking weed. I mean, it's legal. We're in California. This is just as legal as this whiskey we've been drinking. Exactly. This is all good, right? Cheers. Thank you. Is there any message you would like to put out, other than love's the answer? Because I think you really nailed it with that. No, I think, you know, I think people should be nicer to each other, and give more credit to others, and don't assume that they're mean until you know they're actually mean. You know, it's easy to demonize people. You're usually wrong about it.
[2098] People are nicer than you think.
[2099] Give people more credit.
[2100] I couldn't agree more.
[2101] And I want to thank you not just for all the crazy innovations you've come up with and your constant flow of ideas, but that you choose to spread that idea, which is it's very vulnerable, but it's very honest.
[2102] And it resonates with me, and I believe it.
[2103] I believe it's true, too.
[2104] So thank you.
[2105] All you assholes out there.
[2106] Be nice.
[2107] Be nice, bitch.
[2108] All right.
[2109] Thank you, everybody.
[2110] Thank you, Elon.
[2111] Thank you.
[2112] Good night, everybody.