The Daily
[0] Hey, it's Michael.
[1] This weekend, we're bringing you something a little different from our colleagues here at the New York Times.
[2] Today, an interview with Elon Musk, one of the most consequential, complicated, and controversial people of our time.
[3] Just a few days ago, Musk sat down with business columnist Andrew Ross Sorkin for an interview before a live audience.
[4] It's a remarkable conversation.
[5] Sorkin presses Musk on a recent public controversy, but he also explores Musk's ideas about a variety of topics: freedom of speech, technology, optimism, aliens, and screen time.
[6] It was all part of a series of live interviews put together by our colleagues at DealBook with significant leaders, including Vice President Kamala Harris and former House Speaker Kevin McCarthy.
[7] If you want to hear them all, you can listen on our NYT Audio app, or search DealBook Summit wherever you get your podcasts.
[8] Now, here's Andrew Ross Sorkin in conversation with Elon Musk.
[9] My mind often feels like a very wild storm.
[10] Is your storm a happy storm?
[11] No. This is Andrew Ross Sorkin with The New York Times, and you're listening to the best interviews from our annual Deal Book Summit event recorded live yesterday in New York City.
[12] Everybody, thank you so much for being with us throughout the day, and I couldn't be more pleased to sit with Elon Musk as our final interview of this remarkable time we've all had together.
[13] He doesn't need much of an introduction, but I want to say a couple things.
[14] He's the richest person in the world.
[15] He may very well be the most consequential individual in the world right now.
[16] He runs the most innovative companies in the world: Tesla, SpaceX, Starlink, which is part of that, Neuralink, The Boring Company, X, and his xAI. And he's disrupted each of these lanes.
[17] He's moved at breakneck speed, but he's faced a storm of controversy in the process.
[18] He joins us today following a visit on Monday to Israel, as you all know so well and as we discussed earlier, where he met with the prime minister and the president of Israel.
[19] And we're going to talk about everything.
[20] And my hope is that we can talk about how he thinks about his influence, about his power, about all of it.
[22] And we're going to talk about innovation and everything else.
[23] I want to say just two other things real quick.
[24] We met each other for the first time 16 years ago.
[25] Yeah, it's a long time.
[26] It's been a long time.
[27] And all of his kids were three.
[28] When we first met, I think you were just about to deliver your first Roadster.
[29] I don't think you had yet.
[30] Larry Page was still waiting.
[31] Yeah. That's got to be, like, 2007?
[32] 2007, 2008.
[33] I remember going back to the newsroom and saying, I think I just met the next Steve Jobs.
[34] And I'm going to hold to that.
[35] Okay.
[36] I'm going to hold to that.
[37] But a lot has happened between when I first met you and now.
[38] You came to deal book.
[39] It's not been boring, that's for sure.
[40] Wait, actually, I take that back. I do have a Boring Company.
[41] In 2012, you came to DealBook and sat on this stage, and we're thrilled to have you back. But there's been so much that's happened between now and then, and there's been so much that's happened in the past week, week and a half. And a lot of folks called me up and said, you're really going to host Elon Musk here? Can you believe what he just said on Twitter, on X? On X. I have no idea what this Twitter thing is you keep talking about. Should you platform him? That's what they said, should you platform him. And I said, I think that it's our role, and I know you have issues with journalists.
[42] I have a platform.
[43] I know you have an issue with journalists oftentimes, but I said it's our role to have conversations and to inquire and to sometimes even interrogate ideas.
[44] And that's, I'm hoping we can do that.
[45] So I want to start just so we can begin this conversation and just level set.
[46] Take us through everything that happened, if you could.
[47] Everything.
[48] No, over the past week and a half.
[49] How long have you got?
[50] We've got the time.
[51] Okay.
[52] You send out a post or X or a tweet.
[53] I don't know what you want to describe it as.
[54] I'm trying to change.
[55] Like, when things were just 140 characters, it made sense to call them a tweet because it was like a bunch of little birds chirping.
[56] But, you know, at the point at which you can put, like, three-hour videos on, it's like a very long tweet.
[57] So here we are.
[58] So post is more descriptive, I think.
[59] And at some point, I don't know where you were, but you write, in response to another tweet.
[60] Yes.
[61] This is the actual truth.
[62] And it's set off a firestorm of criticism all the way to the White House.
[63] Right.
[64] And then you make this trip to Israel.
[65] You have advertisers who've left the platform.
[66] People calling it.
[67] Well, the trip to Israel is independent of, it wasn't some like apology tour, I want to be clear.
[68] That was.
[69] Okay, well, let's talk about that.
[70] But just take us back to the moment at which you write that.
[71] The trip to Israel is independent of, it wasn't like in response to that at all.
[72] We'll do Israel in just a moment. And I have no problem being hated, by the way. I hear you, hate away. Well, but you know what, let's go straight to that then for a second. Sure. Because there is an idea, and you could say that, I think it's a real weakness to want to be liked, a real weakness, and I do not have that. Let me ask you this then, there's a difference. You're saying, I don't care if anyone likes me or they hate me. But given your power and given what you have amassed and the importance you have, I would think you want to be trusted.
[73] I would think maybe you don't need to be liked or hated, but trusted matters.
[74] If X is going to become a financial platform where people are going to put their money, where the government's going to give you money for rockets, where people are going to get into the cars, they need to ultimately decide that you are, they don't have to say that they love you, but that you are ultimately a decent and good human being.
[75] Yes, I mean, I think I am, but I'm certainly not going to do some sort of tap dance to prove to people that I am.
[76] As for trust, I mean, I think we can break that down in a few ways.
[77] If you want satellites sent to orbit reliably, SpaceX will do 80% of all mass to orbit this year.
[78] China will do 12%.
[79] The rest of the world will do eight.
[80] That includes Boeing, Lockheed, and everyone else.
[81] So the track record of the rocket is the best by far of anything. You could hate my guts, you could not trust me; it is irrelevant. The rocket track record speaks for itself. With respect to Tesla, we make the best cars. Whether you hate me, like me, or are indifferent, do you want the best car, or do you not want the best car? So I will certainly not pander. And, Jonathan, like, the only reason I'm here is because you were a friend.
[82] Like, what was my speaking fee?
[83] First of all, I'm Andrew.
[84] But, yeah, sorry.
[85] It's okay.
[86] Second of all, we've known each other a very long time.
[87] I'm talking.
[88] Yes.
[89] And, um...
[90] Listen.
[91] You know...
[92] What I'm trying to illustrate is that sometimes I say the wrong thing.
[93] I think there are a lot of people who are tired, but let me, let me go back.
[94] You should hear the sketches that SNL wouldn't post, by the way.
[95] Those are really good.
[96] And I would say, fortunately or unfortunately, whatever friendship we have, not great, we don't talk that much.
[97] But let me ask you this.
[98] That's true.
[99] That's true.
[100] Where am I?
[101] Doesn't return the phone calls.
[102] The point is, I'm here because you're a friend, not because I'm being paid or because I need any validation or anything. And I promised you I'd be here, and that's why I'm here.
[103] Well, I appreciate you being here.
[104] Let me ask you this, then.
[105] You write this tweet that says that this is the actual truth.
[106] People read that tweet.
[107] Yes.
[108] And they say, Elon Musk is an anti-Semite, that he is riling up this base.
[109] You're hearing it from, as I said, the White House, you're hearing it from Jewish groups all over.
[110] I think Jonathan Greenblatt from the ADL is here.
[111] There's lots of people who say this.
[112] And by the way, it's not just that. Did you read the whole thing?
[113] I did.
[114] And that's why I want to ask you about it.
[115] Excuse me?
[116] I said more than what you just read.
[117] No, there was absolutely more.
[118] Yes, but I'll tell you the thing that struck me. It wasn't, and I'm an American Jew, it wasn't just the people who had that view.
[119] It was actually people who really are anti-Semites, who said, oh my goodness, go Elon.
[120] This is fabulous.
[121] And that actually was the thing that really set me back.
[122] I said to myself, what's going on here?
[123] And I want to know how you felt about that in that moment when you saw all of this happening.
[124] Yeah.
[125] First of all, I did clarify almost immediately what I meant.
[126] I would say, you know, if I could go back, I should, in retrospect, not have replied to that particular person, and I should have written at greater length as to what I meant.
[127] I did subsequently clarify in replies, but those clarifications were ignored by the media. Essentially, I handed a loaded gun to those who hate me, and arguably to those who are anti-Semitic, and for that I'm quite sorry. That was not my intention. So I did, you know, post on my primary timeline to be absolutely clear that I'm not anti-Semitic, and that I, in fact, if anything, am philo-Semitic. And the trip to Israel was planned before any of that happened.
[128] It was nearly a handle there.
[129] Do you see this thing?
[130] Do you know what it is?
[131] I do because I actually followed your entire trip to Israel.
[132] Why don't you tell everybody?
[133] This says, it says, bring them home, the hostages.
[134] It was given to me by the parents of one of the hostages.
[135] And I said I would wear it as long as there was a hostage still remaining.
[136] And I have.
[137] What was that trip like?
[138] And obviously, you know that there's a public perception that that was part of an apology tour, if you will.
[139] That this had been said online.
[140] There was all of the criticism.
[141] There was advertisers leaving.
[142] We talked to Bob Iger today.
[143] I hope they stop.
[144] You hope?
[145] Don't advertise.
[146] You don't want them to advertise?
[147] No. What do you mean?
[148] If somebody's going to try to blackmail me with advertising, blackmail me with money, go fuck yourself.
[149] Go fuck yourself.
[150] Is that clear?
[151] I hope it is.
[152] Hey, Bob, if you're in the audience.
[153] Well, let me ask you then.
[154] That's how I feel.
[155] Don't advertise.
[156] How do you think then about the economics of X?
[157] If part of the underlying model, at least today, and maybe it needs to shift, maybe the answer is it needs to shift away from advertising, if you believe that this is the one part of your business where you will be beholden to those who have this view, what do you do?
[158] GFY.
[159] I understand that, but there's a reality, too.
[160] Right?
[161] Yes.
[162] No, no. I mean, Linda Yaccarino's right here, and she's got to sell advertising.
[163] Absolutely.
[164] So, no, no, no, totally.
[165] So, no, actually, what this advertising boycott is going to do is it's going to kill the company.
[166] And you think that the... And the whole world will know that those advertisers killed the company, and we will document it in great detail.
[167] But those advertisers, I imagine, are going to say, they're going to say, we didn't kill the company.
[168] Oh, yeah?
[169] They're going to say.
[170] Tell it to Earth.
[171] But they're going to say, Elon, that you killed the company because you said these things, and that they were inappropriate things, and they didn't feel comfortable on the platform, right?
[172] That's what they're going to say.
[173] And let's see how Earth responds to that.
[174] So, me, okay, then this goes back to the...
[175] We'll both make our cases.
[176] Right?
[177] And we'll see what the outcome is.
[178] What are the economics of that for you?
[179] I mean, you have enormous resources, so you can actually keep this company going for a very long time.
[180] Would you keep it going for a long time if there was no advertising?
[181] I mean, if the company fails because of an advertiser boycott, it will fail because of an advertiser boycott, and that will be what bankrupted the company, and that's what everybody on Earth will know.
[182] What do you think then of the, against, this goes back to the idea of trust, though.
[183] Then it'll be gone.
[184] And it'll be gone because of an advertiser boycott.
[186] But you recognize that some of those people are going to say that they didn't feel comfortable on the platform.
[187] And I just wonder and ask you, and think about that for a second.
[188] Tell it to the judge.
[189] But the judge is going to be...
[190] The judge is the public.
[191] And you think that the public is going to say that Disney is making a mistake?
[192] Yes.
[193] And they're going to boycott Disney?
[194] They already are.
[195] Well, there are some that are for lots of different reasons.
[196] But you think that this is going to, that you have the, this goes actually to the interesting question of power and leverage.
[197] Let the chips fall where they may. Let the chips fall where they may. Can I ask why that is the approach?
[198] I ask it because you've been very, well, you've been very particular about, I mean, the approach to Tesla.
[199] When you think about the engineering involved in that, the approach to SpaceX, the approach to some of the stuff you're doing with AI has been very specific, right?
[200] There's not a, let the chips fall where they may approach to those businesses, I don't think.
[201] No, we're focused on making the best products.
[202] And Tesla's gotten to where it's gotten with no advertising at all.
[203] I understand that.
[204] Tesla currently sells twice as many electric vehicles as the rest of the electric car makers in the United States combined.
[205] Tesla has done more to help the environment than all other companies combined.
[206] I'd prefer to say that, therefore, as the leader of the company, I've done more for the environment than any single human on Earth. How do you feel about that?
[207] How do I feel about that?
[208] Yeah, no, I'm asking you personally how you feel about that because we're talking about power and influence and...
[209] I'm saying what I care about is the reality of goodness, not the perception of it.
[210] And what I see all over the place is people who care about looking good while doing evil.
[211] Fuck them.
[212] Okay?
[213] Let me ask you this, because I think, part of this, by the way: there are some people who said, look, owning X to begin with has just created problems. You've created so many amazing things that are changing our world, and I know you want to make X this fabulous town square, free speech platform, but that unto itself has created such a distraction from all of these things. This is the conversation we're having. We're not focused on, or we're not talking about, at least yet, and we will, Tesla. You have your Cybertruck deliveries tomorrow and everything else that you're doing.
[214] But is there any...
[215] It would be the biggest product launch of anything by far on earth this year.
[216] Is there any part of you, though, that just says, you know what?
[217] I just shouldn't have done this.
[218] Or maybe I should sell it or give it away or do something else.
[219] With the X piece of it.
[220] Yeah.
[221] Given the propensity for some of the things that you do and say on that platform to create these issues.
[222] Yeah.
[223] Of all the posts I've done on the platform, I think there might be 30,000 or something like that.
[224] Right.
[225] Once in a while, I will say something foolish.
[226] And I have.
[227] And I would certainly put that comment, as you said, the actual truth, among perhaps one of the most foolish, if not the most foolish thing I've ever done on the platform.
[228] And I did do my best to clarify afterwards that I certainly did not mean anything anti-Semitic in that. The nature of the criticism was simply that the Jewish people have been persecuted for thousands of years, and there is a natural affinity, therefore, for persecuted groups. This has led to the funding of organizations that essentially promote any persecuted group, or any group with the perception of persecution.
[229] This includes radical Islamic groups.
[230] Everyone here has seen the massive demonstrations for Hamas in every major city in the West.
[231] That should be jarring.
[232] Well, a number of those organizations received funding from prominent people in the Jewish community.
[233] They didn't expect that to happen.
[234] But if you generically, without condition, if you fund persecuted groups, in general, some of those persecuted groups, unfortunately, want your annihilation.
[235] And what I meant by that, and I subsequently clarified, is that it's unwise to fund organizations that support groups that want your annihilation.
[236] Is this coming across clearly?
[237] Yeah, no, it is.
[238] My question to you, though, is...
[239] I think logically this makes a lot of sense.
[240] Is there any part of you...
[241] Just tell me what happens, though, once all of this happens...
[242] Let's say you fund a group, and that group supports Hamas, who wants you to die.
[243] Perhaps you should not fund them.
[244] But you do...
[245] Thank you.
[246] You do appreciate that when you waded into these very...
[247] very delicate waters, at these very delicate times, that it can create a real, I mean, as it created, headlines for the past two weeks and economic impact.
[248] I'm just so curious what, in your brain, when you see all this happening, are you sitting there going, oh my God, I stepped in it, I wish I didn't do that?
[249] Are you saying, screw them, I hate these people, why are they after me?
[250] But all of that.
[251] Yeah, all of that.
[252] I mean, look, I'm sorry for that tweet or post.
[253] It was foolish of me. Of the 30,000, it might be literally the worst and dumbest post that I've ever done.
[254] And I tried my best to clarify it six ways to Sunday.
[255] But, you know, at least I think over time it will be obvious that, in fact, far from being anti-Semitic, I'm, in fact, philo-Semitic.
[256] And all the evidence in my track record would support that.
[257] There are people who say crazy things on X, as you know.
[258] Maybe you think they're crazy, maybe they're not.
[259] The aspiration for X is to be the Global Town Square.
[260] Now, if you were to walk down to, let's say, Times Square, do you occasionally hear people saying crazy things?
[261] Yes, but they don't have the megaphone, right?
[262] And that's the conundrum.
[263] They can only say it to the 50 or 100 people that are standing there in Times Square.
[264] Look, the joke I used to make about old Twitter was it was like giving everyone in the psych ward a megaphone.
[265] So, you know, I'm aware that things can get promoted that are negative beyond the sort of circle of somebody simply screaming crazy things in Times Square, which happens all the time.
[266] It's pretty rare for something, frankly, that is hateful to be promoted.
[267] It's not that it never happens, but it's fairly rare.
[268] I mean, I would encourage people to look at, for those that use the system, when you look at the feed that you receive, how often is it hateful?
[269] And over time, has it gotten more or less hateful?
[270] And I would say that if you look at the X platform today versus a year ago, I think it is actually much better.
[271] I mean, what is your personal experience?
[272] Are you surprised?
[273] I'm just curious.
[274] I use the platform religiously.
[275] So you'll admit to being an addict.
[276] And I use the For You feed.
[277] And I will say, now, the problem is, because I'm a journalist, I go looking for stuff.
[278] Well, that's, I'm not awful.
[279] I'm just saying.
[280] And because I, and I also think the algorithm for me personally, because I'm looking for stuff also is feeding me other things.
[281] This is actually a challenge in that.
[282] Like, sometimes people will say, like, why is it showing me, you know, posts from this person that I hate?
[283] And we were like, well, did you interact a lot with this person that you hate?
[284] Well, yes.
[285] Well, therefore, it thinks that you want to interact more with this person that you hate.
[286] That's like a reasonable, you know, you kind of want to have an argument.
[287] When you tweet, do you ever, post, let's say post.
[288] When you post.
[289] Listen, I'm open, if anyone can have come up with a better word, that would be great.
[290] When you post, though.
[291] But the least bad word I can think of is post.
[292] When you post, though, are you trying to rile up either a base or an audience? Do you recognize the power you have in that?
[293] And also, by the way, not just rile up, but also rile down, which is to say, as I said, there are people who are demonstrably anti-Semitic on the site, and I get "Jew boy" things and all sorts of things that come my way.
[294] For a while they thought I was Jewish, so, you know, I'd get it too.
[295] But no, but the question is, my name is super dark.
[296] Do you ever think to yourself, you know what, I'm going to go online and I'm going to say, these people, I condemn these people that are on my site, saying these things.
[297] I have, you say I've condemned anti -Semitism, but do you ever go?
[298] Yeah, I literally posted, I condemn anti-Semitism in all its forms.
[299] Like, that is a literal, I believe, literal post that I made.
[300] I mean, I'm like, listen, I can get out the thesaurus if you want, you know, and we could, you know... Let me ask you a different question.
[301] You, you compose it, I'll post it.
[302] Okay, let me ask you this.
[303] You were on a podcast about a month ago, and you said something that struck me, and it struck me as accurate, came out of your mouth, so hopefully it is.
[304] But I'm hoping to go deep on this.
[305] Just because it came out of my mouth does not mean it's true.
[306] No, no. You said my mind is a storm.
[307] I don't think most people would want to be me. They may think they want to be me, but they don't know.
[308] They don't understand.
[309] What did you mean by that, your mind being a storm? And I think, I mean, I have known you for quite some time; I think it is a bit of a storm. Yes. Yeah, I mean, as much as a weather metaphor makes sense, my mind often feels like a very wild storm. I mean, I have a fountain of ideas. I have more ideas than I could possibly execute.
[310] So I have no shortage of ideas.
[311] Innovation is not the problem.
[312] Execution is the problem.
[313] I've got a million ideas.
[314] I mean, I've got an entire design for an electric supersonic vertical takeoff jet, but I mean, I just, I just can't do that as well.
[315] I've had that for 10 years.
[316] I mean, there's a million things.
[317] Is your storm a happy storm?
[318] No. It's not a happy storm.
[319] Tell us about that.
[320] Because I think that that actually, when people try to really understand you, I think that there's a lot of this comes from some other place.
[321] And I want to talk about that.
[322] What do you think that is?
[323] We really need, like, a psychiatrist's couch here or something.
[324] You know, I think to some degree I was born this way, and then it was amplified by a difficult childhood, frankly.
[325] So, but I can remember, even in happy moments when I was a kid, that it just feels like there's just a rage of forces in my mind.
[326] constantly.
[327] Now this, you know, productively manifests itself in technology and building things for the most part.
[328] So I think on balance, the output has been very productive.
[329] I think the results as we discussed earlier with SpaceX, Tesla, PayPal, which is, you know, still growing today.
[330] The first internet company that I started, in fact, the first internet company I started, Zip2, was funded by the New York Times Company, Hearst, Knight Ridder, and we wrote some of the software for the New York Times website, and we helped bring online several hundred newspapers that previously were only in print.
[331] Now, this is in the '90s, which at this point, I'm like their grandpa. But basically, you know, the '90s and the internet feels like a pre-Cambrian era when there were only sponges.
[332] So, anyway, so, you know, I feel like a lot of productive things have been done, and you can also look at Tesla as being, sort of, many companies in one.
[333] Like, our supercharging network, if the Tesla supercharging network were its own company, it would be a Fortune 500 company by itself.
[334] Just the supercharging system. We also make the cells, we build the power electronics and the powertrain from scratch, we have the most innovative structural design, the largest castings ever used. We have the best manufacturing technology. Tesla has better manufacturing technology than companies that have been doing it for a hundred years. So these demons of the mind are, you know, for the most part, harnessed to productive ends. But that doesn't mean that once in a while they, you know, go wrong. But, and this is a question I think a lot of people, you know, are always trying to figure out about not just you but sometimes themselves, meaning, what is driving all this? You're doing all of these things. Do you think that you would be as successful, whatever success is, if it wasn't being driven by some... I think that there's something you're trying to prove, either to yourself or to somebody. I don't know, we're all trying to prove something. To prove to my mother? I don't know. No, if I would describe my philosophy, it is a philosophy of curiosity. I mean, I did have this existential crisis when I was around 12 about, what's the meaning of life? Isn't it all pointless? Why not just commit suicide? Why exist? I read the religious texts. I read the philosophy books. Especially the German philosophy books made me quite depressed, frankly. One should not read Schopenhauer and Nietzsche as a teenager. But then I read Douglas Adams, The Hitchhiker's Guide to the Galaxy, which is a book on philosophy in the form of humor.
[335] And the point that Adams was making there was that we don't actually know what questions to ask.
[336] That's why he said that the answer is 42.
[337] Like basically it was a giant computer and it came up with the answer 42.
[338] But then to actually figure out what the question is, that's the actual hard part.
[339] I think this is generally true also in physics.
[340] At the point at which you can properly frame the question, the answer is actually the easy part.
[341] So my motivation then was that, well, my life is finite, really a flash in the pan on a galactic time scale.
[342] But if we can expand the scope and scale of consciousness, then we are better able to figure out what questions to ask about the answer that is the universe.
[343] And maybe we can find out the meaning of life or even what question to, what the right question to ask is.
[344] You know, where do we come from?
[345] Where are we going?
[346] Where are the aliens?
[347] Are there aliens?
[348] You know, these questions, you know, is there new physics to discover?
[349] Or is this, because there seem to be some real questions around dark matter and dark energy.
[350] So the purpose of SpaceX is to extend life beyond Earth on a sustained basis so that we can at least pass one of the Fermi great filters, which is that of being a single-planet civilization.
[351] If we are a single-planet civilization, then we are simply waiting around for some extinction event, whether that is man-made or natural.
[352] But if you're a single-planet civilization, eventually something will happen to that planet and you will die.
[353] If you're a multi -planet civilization, you will live much longer.
[354] Also, a multi-planet civilization is the natural stepping stone to being a multi-stellar civilization and being out there among the stars.
[355] So now this, I think, has two, there's not simply a defensive motivation, but it is also one that, you know, gives meaning, man's search for meaning. But let me finish this philosophy point; even though it may seem rather esoteric, it may resonate with a few people. We must get past this Fermi filter, this great filter of being a single-planet civilization. And if we do that, we're more likely to understand the nature of the universe and what questions to ask. If you're a believer in the philosophy of curiosity, then I think you should support this ambition. But it's more, being a multi-planet species is more than simply, you know, life insurance for life collectively. That's a defensive reason. But I think also that life has to be more than simply solving one sad problem after another.
[356] You know, there have to be reasons where you wake up in the morning and you're happy to be alive.
[357] There have to be reasons that you have to say, why are you excited about the future?
[358] Like, what gives you hope?
[359] And if you're unsure, ask your kids.
[360] And I think the idea of us being a spacefaring civilization and being out there among the stars is incredibly inspiring and exciting and something to look forward to.
[361] And there need to be such things in the world.
[362] Let me ask you a different question about confidence.
[363] We were having a conversation here earlier about people and where people get their confidence from; some people have great insecurity, other people have great confidence. And I was thinking about you, because you have a very interesting history where people have told you over and over again that you're wrong.
[364] Well, sometimes they're right.
[365] Well, sometimes they are.
[366] But I would say that when it comes to Tesla, when it came to SpaceX, people told you that you were crazy.
[367] You were out of your mind.
[368] This was never going to happen.
[369] This is never going to work.
[370] And so what I ask you, though, is now.
[371] Now, when people say you're wrong, this isn't right, do you look at that and say, you know what, that's like a red flag for me because, you know, I've been told so often that I'm wrong that I know, and I know I'm right, because I've had that experience, or are there people in your life when they say, you know what, Elon, this is not, this is not right.
[372] Do you know what I'm saying?
[373] I mean, I think what you're trying to say is, do I at this point think, because I've been right so many times when others said I was wrong, that now I falsely believe I'm right when in fact I'm wrong.
[374] You do very well.
[375] What do you think?
[376] No, I'm right.
[377] So, yeah, no, look, here's the thing.
[378] Physics is unforgiving.
[379] Physics is unforgiving.
[380] So, I mean, I have, you know, these little sayings I've come up with, that physics is the law and everything else is a recommendation.
[381] Right.
[382] In the sense that you can break any law made by humans, but try breaking a law made by physics.
[383] That's much more difficult.
[384] So if you are wrong and persist in being wrong, the rockets will blow up and the cars will fail.
[385] So we're not trying to figure out what flavor of ice cream is the best flavor of ice cream.
[386] There's a thousand things that can happen on a rocket flight, and only one of them gets the rocket to orbit.
[387] And so being wrong results in failure when dealing with physical objects.
[388] But that's the interesting part.
[389] You've built these great companies that physically, the physics of them, are enormously successful, so successful, arguably, that you have leverage over everybody else, right?
[390] Nobody else can do Starlink.
[391] Nobody else can get the rockets into space yet. Amazon and Jeff Bezos are trying, but they haven't yet.
[392] I hope he does.
[393] You hope he does?
[394] Yeah, yeah.
[395] I mean, I think, you know, I actually agree with Jeff's motivations.
[396] I mean, I think, you know, let me put it this way.
[397] If there was a button I could press that would delete Blue Origin, I wouldn't press it.
[398] So I think it's good that he's spending money on, on making rockets too.
[399] You know, it's just, perhaps he should spend more time on it, but, you know, it's up to him.
[400] But I should make a point here.
[401] So nothing any of my companies have done has been to stifle competition.
[402] In fact, we've done the opposite.
[403] So at Tesla, we have open-sourced our patents.
[404] Anyone can use our patents for free.
[405] How many companies do you know who've done that?
[406] Can you name one?
[407] I can't.
[408] At SpaceX, we don't use patents.
[409] So I've said once in a while we'll file a patent just so some patent troll doesn't cause trouble, but we're not stopping anyone. We've done nothing anti-competitive.
[410] We've done nothing to stop our competitors.
[411] I'm not judging you at all.
[412] I also want to clarify for the audience, because some companies have done anti -competitive things.
[413] I think the strange thing, or the unusual thing about SpaceX and Tesla is that we've done things that have helped our competition.
[414] So at Tesla, we have made our supercharger system open access; we've made our charger technology available for free to the other manufacturers.
[416] The reason I...
[417] No walled garden.
[418] We could have put a wall up.
[419] But instead, we invited them in.
[420] The reason I mention this, though, is because you've had the success in the physical physics world, you now have these very difficult decisions that have huge impacts on the world that are not physical decisions at all.
[421] They're decisions of the mind, the decisions that you and others have to make.
[423] And there's a question whether you should be making these decisions at all.
[424] And I think about it in the context of Starlink.
[425] Obviously, there was the report about how it's being used in Ukraine and the Russia War.
[426] There's questions about, you know, Taiwan, whether Taiwan should use it or will use it.
[427] I believe they're not right now because they're worried that at some point maybe the Chinese will tell you that you have to, they have leverage over you and you're going to have to turn that off.
[428] Right?
[429] I mean, these are very difficult decisions.
[430] And I'm so curious how you think about that.
[431] And not just the decisions, the fact that you have that power.
[432] I think it's important for the audience to understand that the reason I have these powers is not because of some anti-competitive actions.
[433] It's simply because we've executed very well.
[434] Oh, I'm not dismissing that.
[435] I think there are so many people, by the way, who are huge supporters of what you've created.
[436] There are other satellites out there, you know.
[437] But they're not as good as yours.
[438] And we can say maybe make the same argument out of the cars and everything else.
[439] But as a result, that gives you.
[440] enormous leverage.
[441] With the exception of the, by the way, these advertisers who aren't on X, in every other instance, everybody needs you.
[442] I mean, nobody needs us. Use our product if it's better; use somebody else's product if the other product is better.
[443] And I accept that.
[444] It may be that one day somebody else will create better products.
[445] I mean, you know, how is it a bad thing to make better products than other companies?
[446] Well, and I want to go back to this, to the Starlink piece of it, though, because that has sort of a geopolitical ramification in terms of your power and how you think about that specific power and then the power that the U .S. government might have either over you or not over you, the power the Chinese government might have over you or not over you, and how those things get used.
[447] I mean, what are you suggesting?
[448] I'm asking the question around this very idea of how these satellites are going to be used, whether you think that you should have control of them, whether the government should have control of them.
[449] Do you trust the government?
[450] Well, there's a lot of people who don't trust the government.
[451] Exactly.
[452] But then this goes back to the trust of you, right?
[453] I mean, like I said, we're not the only company who has communication satellites.
[454] Our satellites are just much better than theirs.
[455] So it's not like we have a monopoly.
[456] Do you feel like anybody has...
[457] Do you feel anybody has leverage over you?
[458] I mean, I think at the end of the day, if we make bad products that people don't want to use, then the users will vote with their resources and use something else.
[459] Let me pivot the conversation for a second.
[460] I mean, my company is overseen by regulators.
[461] And, you know, SpaceX, Starlink, Tesla are overseen by, you know, cumulatively over 100 regulators.
[462] And actually more than that, a few hundred regulators, because we're in 55 countries.
[463] If you sum up all the times that I had an argument with regulators, of hundreds of regulators over decades, it can sound really terrible, except they forgot to mention that there were 10 million regulations we complied with, and only five that I disagreed with.
[464] But they list all the five.
[465] And it sounds like, wow, this guy's a real maverick.
[466] I'm like, yeah, but what about the 10 million we complied with?
[467] Do you, let me, one related thing on this, the leverage of countries and things over you, regulators.
[468] X is this free speech platform.
[469] You do business in China, lots of business.
[470] China, that's an important part of your business, I imagine.
[471] Well, not SpaceX.
[472] How do you think about the leverage?
[473] that the Chinese have over you, and do they have leverage over you?
[474] And how do you feel about, some people would say, is it hypocritical for you to be doing business in China or, frankly, in other countries as it relates to X and other things, that don't follow this free speech path that you have espoused?
[475] The best that the X platform can do is adhere to the laws of any given country.
[476] Do you think there's something more we could do than that?
[477] I think it would be very hard, but I just wonder, given the sort of strong philosophical approach that you've been vocal about, whether you say to yourself, you know, maybe I shouldn't be doing business in that country.
[478] Well, first of all, Starlink and SpaceX do no business in China whatsoever.
[479] Tesla has one of its four vehicle factories in China, and China is, you know, I don't know, a quarter of our market or something like that.
[480] And so it's a quarter of the market of one company.
[481] The same is true, by the way, of all the other car companies.
[482] They also have something on the order of a quarter of their sales in China.
[483] So if that's a problem for Tesla, it's a problem for every car company.
[484] I mean, I think one has to be careful about not conflating the various companies because I can only do things that are within the bounds of the law.
[485] I cannot do beyond that.
[486] My aspiration is to do as much good as possible and to be as productive as possible within the bounds of what is legal.
[488] More than that, I cannot do.
[489] We'll be right back.
[490] I want to pivot and talk about AI for a moment.
[491] We had Jensen Huang here, who's a big fan of yours, as you know.
[492] Yeah, Jensen's awesome.
[493] Talk about bringing you the first box, by the way, with Ilya.
[494] Interestingly enough, back in 2016, I think.
[495] There's a video of Jensen and me unpacking the first AI computer at OpenAI. So I'm so curious what you think of what's just happened over the past two weeks.
[496] While you were dealing with this other headline, series of headlines, there was a whole other series of headlines at OpenAI.
[497] What did you think?
[498] You founded it.
[499] Co -founded it, yeah.
[500] Well, the whole arc of OpenAI, frankly, is a little troubling, because the reason for starting OpenAI was to create a counterweight to Google and DeepMind, which at the time had two-thirds of all AI talent and basically infinite money and compute.
[501] And there was no counterweight.
[502] It was a unipolar world.
[503] And Larry Page and I used to be very close friends, and I would stay at his house, and I would talk to Larry into the late hours of the night about AI safety.
[504] And it became apparent to me that Larry did not care about AI safety. I think perhaps the thing that gave it away was when he called me a speciesist for being pro-humanity, as in, you know, like a racist, but for species. So I'm like, wait a second, what side are you on, Larry? And then I'm like, okay, listen, this guy's calling me a speciesist. He doesn't care about AI safety.
[505] We've got to have some counterpoint here because this seems like we could be...
[506] This is no good.
[507] So Open AI was actually started, and it was meant to be open source.
[508] I named it, Open AI, after open source.
[509] It is, in fact, closed source.
[510] It should be renamed Superclosed Source for Maximum Profit AI.
[511] So, because this is what it actually is.
[512] I mean, fate loves irony.
[513] I mean, in fact, a friend of mine has this saying: the way to predict outcomes is that the most ironic outcome is the most likely. It's like Occam's razor, the simplest explanation is most likely.
[514] And my friend Jonah's view is that the most ironic outcome is the most likely.
[515] And that's what's happened with Open AI.
[516] It's gone from an open-source foundation, a 501(c)(3), to suddenly it's like a $90 billion for-profit corporation with closed source. So I don't know how you go from here to there. I don't know, is this legal? I'm like, is that legal? So as you saw Sam Altman get ousted, yeah, by somebody you know, Ilya. And Ilya was somebody who was a friend of yours. Yes. You brought him there. Your relationship with Larry Page effectively broke down over you recruiting him away, I think.
[518] That's correct.
[519] That was it. Larry refused to be friends with me after I recruited Ilya.
[520] And so here's Ilya, apparently, saying something is very wrong.
[521] I think we should be concerned about this, because I think Ilya actually has a strong moral compass.
[522] He thinks about, you know, he really sweats it over questions of what is right.
[523] And if Ilya felt strongly enough to want to fire Sam, well, I think the world should know what that reason was.
[524] Have you talked to him?
[525] I've reached out, but he doesn't want to talk to anyone.
[526] Have you talked to other people behind the scenes?
[527] Is this all happening?
[528] I've talked to a lot of people.
[529] Nobody, I've not found anyone who knows why.
[530] Have you?
[531] I think we are all still trying to find out.
[532] I mean, look, one of two things: either it was a serious thing and we should know what it is, or it was not a serious thing, and then the board should resign. What do you think of Sam Altman? I have mixed feelings about Sam. You know, the ring of power can corrupt, and this is the ring of power. So, you know, I don't know. I think I want to know why Ilya felt so strongly as to fire Sam. This sounds like a serious thing. I don't think it was trivial.
[534] And I'm quite concerned that there's some, you know, dangerous element of AI that they've created.
[535] Discovered?
[536] Yes.
[537] You think they've discovered something?
[538] That would be my guess.
[539] Where are you with your own AI efforts relative to where you think open AI is, where you think Google is, where you think the others are?
[540] I mean, on the AI front, I've been somewhat in a quandary here, because I've thought AI could be something that would change the world in a significant way since I was in college, I mean, like 30 years ago.
[541] So the reason I didn't go build AI right from the get-go was because I was uncertain about which edge of the double-edged sword would be sharper, the good edge or the bad edge.
[542] So I held off on doing anything on AI.
[543] I could have created, I think, a leading AI company, and OpenAI actually kind of is that, because I was just uncertain: if you make this magic genie, what will happen?
[544] Whereas I think building sustainable energy technology is much more of a single-edged sword that is single-edged good; making life multi-planetary, I think, single-edged good.
[545] You know, Starlink, mostly single-edged good.
[546] I mean, giving better connectivity to people that, you know, don't have connectivity, or for whom it's too expensive, I think is very much a good thing.
[547] Starlink was instrumental, by the way, in halting the Russian advance, and the Ukrainians said so.
[548] So, you know, I think that is, but with AI, you've got the magic genie problem.
[550] You may think you want a magic genie, but once that genie's out of the bottle, it's hard to say what happens.
[551] How far are we away from that genie being out of the bottle, you think?
[552] I think it's already out.
[553] And the genie is certainly poking his head out.
[554] The AGI, the idea of artificial general intelligence, given what you now are working on yourself and you know how easy or hard it is to train, to create the inferences, to create the weights.
[555] I hope I'm not getting too far in the weeds of just how this works, but those are the basics behind the software end of this.
[556] It's funny, you know, all these weights, they're just basically numbers in a comma-separated value file, and that's our digital god, a CSV file.
[557] Not that funny.
[558] But that's kind of literally what it is.
[559] So I think it's coming pretty fast. Is that, I mean, you famously have admitted to overstating how quickly things will happen, but how quickly do you think this will happen?
[560] If you say smarter than the smartest human at anything, it may not be then quite smarter than all humans, well, machine-augmented humans, you know, because people have got computers and stuff, so there's a higher bar. But if you say it can write as good a novel as, say, J.K. Rowling, or discover new physics, or invent new technology, I would say that we are less than three years from that point.
[561] Let me ask you a question about XAI and what you're doing.
[562] And because there's an interesting thing that's different, I think, about what you have relative to some of the others, which is you have data, you have information, you have all of the stuff that everybody in here has put on the platform to sort through.
[563] And I don't know if everybody realized that initially.
[564] What is the value of that?
[565] Yeah, I mean, data is very important.
[566] You could say data is probably more valuable than gold.
[567] But then maybe you have actually, maybe you have more, maybe you have the gold in X in a different way.
[568] In a way, again, that I don't know if the public appreciates what that means.
[569] Yes.
[570] X is, might be, the single best source of data.
[572] I mean, people click on more links to X than anything else on Earth.
[573] Sometimes people think Facebook or Instagram is a bigger thing, but actually there are more links to X than anything.
[574] You can, there's public information, you can Google it.
[575] Okay, let me ask you a... So it is where you would find what is happening right now on Earth at any given point in time.
[576] The whole OpenAI drama played out, in fact, on the X platform.
[577] So it is one of the, it's not, you know, Google certainly has a massive amount of data, as does Microsoft.
[578] So it's not like, but it is one of the best sources of data.
[579] Can I ask you an interesting IP issue, which I think is actually something I can ask as somebody who's in the creator business and journalistic business and whatnot, who cares about copyright?
[580] So one of the things about training on data has been this idea that you're not going to train or these things are not being trained on people's copyrighted information historically.
[581] That's been the concept.
[582] Yeah, that's a huge lie.
[583] Say that again?
[584] Yes.
[585] These AIs are all trained on copyrighted data, obviously.
[586] So you think it's a lie when OpenAI says that, this is not, none of these guys say they're training on copyrighted data?
[587] That's a lie.
[588] It's a lie, straight up.
[589] A straight -up lie.
[590] Okay.
[591] 100%.
[592] Obviously, it's been trained on proprietary data.
[593] Okay, so let me ask the second question, which is, all of the people who have been uploading articles, the best quotes from different articles, videos to X. All of that can be trained on.
[594] And it's interesting, because people put all of that there, and those quotes have historically been considered fair use, right?
[595] People are putting those quotes up there.
[596] And individually, on a fair use basis, you'd say, okay, that makes sense.
[597] But now there are people who do threads, and by the way, there may be multiple people who've done, you know, an article that has a thousand words.
[598] Technically, all thousand words could have made it onto X somehow.
[599] And effectively, now you have this remarkable repository.
[600] And I wonder what you, how you think about that, again, and how you think the creative community and those who were the original IP owners should think about that.
[601] I don't know, except to say that by the time these lawsuits are decided, we'll have digital God.
[602] So you can ask digital God at that point.
[603] These lawsuits won't be decided in a time frame that is relevant.
[604] Is that a good thing or a bad thing?
[605] I think we live, you know, there's that, I don't know if it's actually a real Chinese saying or not, but may you live in interesting times, it's apparently not a good thing.
[606] But I would prefer to, personally, I would prefer to live in interesting times.
[607] And we live in the most interesting of times.
[608] I think, for a while there, I was really getting demotivated and losing sleep over the threat of AI danger.
[609] And then I finally sort of became fatalistic about it and said, well, even if I knew annihilation was certain, would I choose to be alive at that time or not?
[610] And I said, I probably would choose to be alive at that time because it's the most interesting thing, even if there's nothing I could do about it.
[611] So then, you know, basically a sort of fatalistic resignation helped me sleep at night, because I was having trouble sleeping at night because of AI danger.
[612] Now, what to do about it?
[613] I mean, I've been the one banging the drum the hardest, by far the longest, or at least one of the longest, on AI danger.
[614] And these regulatory things that are happening, the single biggest reason they are happening is because of me. Do you think they're ever going to get their arms around it?
[615] We talked to the vice president this afternoon.
[616] She said she wants to regulate it.
[617] People have been trying to regulate social media.
[618] for years and have done nothing effectively?
[619] Well, there's regulation around anything which is a, like a physical danger, a danger to the public.
[620] So cars are heavily regulated.
[621] Communications are heavily regulated.
[622] Rockets and aircraft are heavily regulated.
[623] The general philosophy about regulation is that when something is a danger to the public, that there needs to be some government oversight.
[624] So I think, in my view, AI is more dangerous than nuclear bombs, and we regulate nuclear bombs.
[625] You can't just go make a nuclear bomb in your backyard.
[626] I think we should have some kind of regulation with AI.
[627] Now, this tends to cause the AI accelerationists to get up in arms, because they think AI is sort of heaven, basically.
[628] But you typically don't like regulation.
[629] You've pushed back on regulators for the most part, in the world of Tesla.
[630] So many instances where we read articles about you pushing back on the regulators.
[631] I'm so curious why, in this instance, now that you own one of these businesses.
[632] As I said a moment ago, one should not take what is viewed in the media as being the whole picture.
[633] There are literally hundreds, this is not an exaggeration; there are probably 100 million regulations that my companies comply with.
[635] And there are probably five that we don't.
[636] And if we disagree with some of those regulations, it's because we think the regulation that is meant to do good doesn't actually do good.
[637] But that's an interesting thing.
[638] But the question, if there are laws and rules, is whether the idea is that you're making the decision that the law and the rule shouldn't be the law and the rule, right?
[639] No, I'm saying you're fundamentally mistaken.
[640] And it should be obvious that you're mistaken.
[641] My automotive company is heavily regulated.
[642] We would not be allowed to put cars on the road if we did not comply with this vast body of regulation.
[643] Now, you could fill up this stage, literally, you know, six feet high, with the regulations that you have to comply with to make a car.
[644] You could have a room full of phone books.
[645] That's how big the regulations are.
[646] If you don't comply with all of those, you can't sell the car.
[647] And if we don't comply with all the regulations for Rockets or for Starlink, they shut us down.
[648] So in fact, I am incredibly compliant with regulations.
[649] Now, once in a while, there'll be something that I disagree with.
[650] The reason I would disagree with it is because I think the regulation, in that particular case, in that rare case, does not serve the public good.
[651] And therefore, I think it is my obligation to object to a regulation that is meant to serve the public good, but doesn't.
[652] That's the time I object, not because I seek to object.
[653] In fact, I'm incredibly rule-following.
[654] Let me ask you a separate question, a social media related question.
[655] We've been talking about TikTok today ahead of the election.
[656] Sure.
[657] TikTok is...
[658] What do you think of TikTok?
[659] Do you think it's a national security threat?
[660] I don't use TikTok.
[661] Say it again?
[662] You don't?
[663] I don't personally use it.
[664] But for teenagers and people in their 20s, they seem almost religiously addicted to TikTok.
[665] Some people will watch TikTok for like two hours a day.
[666] I stopped using TikTok when I felt the AI probing my mind, and it made me uncomfortable.
[667] So I stopped using it.
[668] And in terms of anti-Semitic content, I mean, TikTok is rife with that.
[669] It has the most viral anti-Semitic content by far.
[670] But do you think the Chinese government is using it to manipulate the minds of Americans?
[671] Is that something that you think we should worry about?
[672] I mean, you have different states that are trying to ban it.
[673] I don't think this is some Chinese government plot.
[674] But it is, the TikTok algorithm is entirely AI powered.
[675] So it is really just trying to find the most viral thing possible.
[676] It's what is going to keep you glued to the screen?
[677] That's it.
[678] Now, on sheer numbers, there are on the order of 2 billion Muslims in the world.
[679] And, you know, a much smaller number of Jewish people, what, 20 million, something?
[680] Many orders of magnitude fewer.
[681] So if you just look at content production, just on a sheer numbers basis, it's going to be overwhelmingly anti-Semitic.
[682] Let me ask you another question, a political question.
[684] And I've been trying to square this one in my head for a long time.
[685] Yeah.
[686] In the last two or three years, you have moved decidedly to the right, I think.
[687] Have I?
[688] Well, we can discuss this.
[689] I think that you have been espousing and promoting a number of Republican candidates and others.
[690] You've been very frustrated with the Biden administration over, I think, unions, and feeling like they did not respect what you've created.
[691] Well, I mean, without any provocation, doing nothing to provoke the Biden administration, they held an electric vehicle summit at the White House and specifically refused to let Tesla attend.
[692] This was in the first six months of the administration.
[693] And we inquired. We were like, we literally make more electric cars than everyone else combined.
[694] Why are we not allowed?
[695] Why are you only letting in Ford, GM, Chrysler, and the UAW,
[696] and specifically disallowing us from the EV summit at the White House?
[697] We had done nothing to provoke them.
[698] Then Biden went on to add insult to injury and publicly said that GM was leading the electric car revolution.
[699] This was in the same quarter that Tesla made 300,000 electric cars and GM made 26.
[700] Does that seem fair to you?
[701] So, but tell me this then.
[702] It doesn't seem fair.
[703] And I've asked repeatedly, you've probably seen me. Oh, I had a great relationship with Obama.
[704] So there was not a...
[705] So, but then there's this.
[706] I stood in line for six hours to shake Obama's hand.
[707] Okay.
[708] So, but, okay, so let me say that's on a personal level.
[709] I can see it in your face.
[710] This, this hurt you personally.
[711] And it hurt the company, too.
[712] And it was an insult to, you know, Tesla has 140,000 employees.
[713] Okay.
[714] Half of them are in the United States.
[715] Tesla has created more manufacturing jobs than everyone else combined.
[716] So let me ask this then.
[717] You've devoted at least the last, close to 20 years of your life, if not more, to climate change, trying to get Tesla off the ground, in part to improve the climate.
[718] You've talked about that.
[719] Yeah, a real right-wing motive is...
[720] Repeatedly.
[721] Far right, if anything.
[722] No, I understand that.
[723] And then it's...
[724] It's next-level reverse psychology.
[725] Well, no, but so here's then the question, which is: how do you square the support that you have given?
[727] I believe you were at a fundraiser for Vivek Ramaswamy, for example, who says that the climate issue is a hoax.
[728] Right?
[729] I disagree with him on that.
[730] But I would think that that would be such a singular issue for you.
[731] I would think that the climate issue would be such a singular issue for you that actually it would disqualify almost anybody who didn't take that issue seriously.
[733] Well, I haven't endorsed anyone for president.
[734] I mean, I wanted to hear what Vivek had to say, because some of the things he says, I think, are pretty solid.
[735] You know, he's concerned about government overreach, about government control of information.
[736] I mean, the degree to which old Twitter was basically a sock puppet of the government was ridiculous.
[737] So, you know, it seems to me that there was a very severe violation of the First Amendment in terms of how much control the government had over old Twitter.
[738] And it no longer does.
[739] So, you know, there's a reason for the First Amendment.
[740] The reason for the First Amendment, for freedom of speech, is because the people that immigrated to this country came from places where there was not freedom of speech.
[741] And they were like, you know what, we've got to make sure that that's constitutional,
[742] because where they came from, if they said something, they'd be put in prison, or, you know, something bad would happen to them.
[743] And with freedom of speech, you have to ask, when is it relevant?
[744] It's only relevant when someone you don't like can say something you don't like, or it has no meaning.
[745] And as soon as you sort of, you know, throw in the towel and concede to censorship, it is only a matter of time before someone censors you.
[746] And that is why we have the First Amendment.
[747] We'll be right back.
[748] Could you see yourself voting for President Biden?
[749] If it's a Biden -Trump election, for example?
[750] I think I would not vote for Biden.
[751] You'd vote for Trump.
[752] I'm not saying I'd vote for Trump, but I mean, this is definitely a difficult choice here.
[753] Would you vote for Nikki Haley?
[754] Nikki Haley, by the way, wants all social media names to be exposed, as you know.
[755] No, I think that's outrageous.
[756] Yeah, no, I'm not going to vote for some pro-censorship candidate.
[757] Like I said, I mean, you have to, you know, consider that there is a lot of wisdom in these amendments, you know, in the Constitution.
[758] And, you know, a lot of things that we take for granted here in the United States don't even exist in Canada.
[759] There's no constitutional right to freedom of speech in Canada.
[760] And, you know, there are no Miranda rights in Canada.
[761] People think, like, you know, you have the right to remain silent.
[762] You don't, actually, in Canada.
[763] So, you know, being half Canadian, I can say these things.
[764] But, you know, freedom of speech is incredibly important.
[765] Even when people you don't like say things you don't like. And like I said, it's actually especially important.
[766] In fact, it is only relevant when people you don't like can say things you don't like.
[767] And do you think right now that's meaningless?
[768] You think right now the Republican candidates or the Democrats are more inclined?
[769] I mean, this is where you go, I assume, to woke and anti-woke and the mind virus issue that you've talked about. Which party do you think is more pro-freedom of speech, given all the things you've seen?
[770] Because we also see, you know, DeSantis, you know, preventing people from reading certain things.
[771] Maybe you think that's correct.
[772] No. Look, we actually are in an odd situation here where, on balance, the Democrats appear to be more pro-censorship than the Republicans.
[773] I mean, that used to be the opposite.
[774] It used to be, you know, the left position was freedom of speech.
[775] You know, I believe at one point the ACLU even defended the right of someone to claim that they were a Nazi, or something like that, you know.
[776] So, like, there really was, like, the left saw freedom of speech as fundamental.
[778] And I mean, my perception, perhaps it isn't accurate, is that the pro-censorship side is more on the left than the right.
[779] We certainly get more complaints from the left than the right.
[780] Let me put it that way.
[781] But my aspiration for the X platform is that it is the best source of truth, or the least inaccurate source of truth.
[782] And well, you know, I don't know if people will believe me or not, but I think honesty is the best policy, and I think that the truth will win over time.
[783] And, you know, we've got this great system, and it's getting better, called Community Notes, which is fantastic, I think, at correcting falsehoods, or adding context.
[784] In fact, we make a point of not removing anything, but only adding context.
[785] Now, that context could include that this is completely false, and here's why.
[786] And no one is immune to this.
[787] I'm not immune to it.
[788] Advertisers are not immune to it.
[789] In fact, we've had community notes that have caused us some losses in advertising, speaking of loss of advertising revenue.
[790] If there's false advertising, the community note will say, this is false, and here is why.
[791] And there's one specific example that is public knowledge, so I'll mention it: at one point, Uber had this ad which said, earn like a boss.
[792] And it was community noted: if by a boss you mean $12.47 an hour.
[793] This did cause at least a temporary suspension of advertising from Uber.
[794] I got to ask you a question that might make everybody in the room uncomfortable or not uncomfortable.
[795] It goes to the free speech issue.
[796] The New York Times Company and the New York Times newspaper appeared, over the summer, to be throttled.
[798] What did?
[799] The New York Times.
[800] Well, we do require that everyone has to buy a subscription and we don't make exceptions for anyone.
[801] And I think if I want the New York Times, I have to pay for a subscription.
[802] And they don't give me a free subscription.
[803] So I'm not going to give them a free subscription.
[804] But were you throttling the New York Times relative to other news organizations, relative to everybody else?
[805] Was it specific to The Times? They didn't buy a subscription. And by the way, it costs like a thousand dollars a month, so if they just do that, then they're back in the saddle. But you are saying that it was throttled. I'm saying, I mean, was there a conversation that you had with somebody where you said, look, you know, I'm unhappy with The Times, they should either be buying the subscription, or I don't like their content, or whatever? Any organization that refuses to buy a subscription is not going to be recommended.
[806] But then what does that say about free speech?
[807] Well, it says, it's like, that's amplifying free speech.
[808] It costs a little bit.
[809] Right, but that's, it's an interesting, you know, it's like in South Park when they say, you know, freedom isn't free, it costs a buck o' five or whatever.
[810] So, but it's pretty cheap.
[811] Okay.
[812] It's low -cost, low -cost freedom.
[813] I got a couple more questions for you.
[814] You're headed back to Texas after this to launch the Cybertruck.
[815] Yeah.
[816] It's going to be a big launch.
[817] But I wanted to ask you right now more broadly just about the car business and what you see actually happening.
[818] And specifically, the government put in place lots of policies, as you know, to try to encourage more EVs.
[819] And one of the things that's happened, uniquely, is you now have a lot of car companies saying, actually, this is too ambitious for us. These plans are too ambitious.
[822] 4,000 dealers, I don't know if you saw just yesterday, sent a letter to the White House, saying, this has gone too far.
[823] You're going too far.
[824] You had this.
[825] Anti -EV?
[826] It was: this is going too fast, too far, and there's not enough demand.
[827] Underneath all this is this idea that maybe there's not enough demand for EVs, that the American public has not bought into it. I mean, they've bought into it with your company, but they haven't bought into it broadly enough.
[828] Well, I think if you make a compelling electric car, people will buy it.
[829] No question about it.
[830] I mean, electric car sales in China are gigantic.
[831] That's by far the biggest category.
[832] And, I mean, it's worth noting, okay, probably the best refutation of that is that the Tesla Model Y will be the best-selling car of any kind on Earth this year.
[834] Of any kind, gasoline or otherwise.
[835] Is there another car company that you think is doing a good job with this?
[836] I mean, I think the Chinese car companies are extremely competitive.
[837] By far, our toughest competition is in China.
[838] So, I mean, there's a lot of people out there who think that the top ten car companies are going to be Tesla, followed by nine Chinese car companies.
[839] I think they might not be wrong.
[840] So, China is super good at manufacturing, and the work ethic is incredible.
[841] So, you know, like if we consider different leagues of competitiveness at Tesla, we consider the Chinese league to be the most competitive.
[842] And by the way, we do very well in China because our China team is the best in China.
[843] How worried are you that the unionization effort that just took place at, well, I shouldn't say effort, but the new wages and the like at GM and Ford, that they're coming for you?
[844] And they are coming for you.
[845] What is that going to mean to you in your business?
[846] Well, I mean, I think it's generally not good to have an adversarial relationship between people, you know, one group at the company and another group.
[847] In fact, I mean, I disagree with the idea of unions, but perhaps for a reason that is different than people may expect, which is I just don't like anything which creates kind of a lords and peasants sort of thing.
[849] And I think the unions naturally try to create negativity in a company and create a sort of lords and peasants situation.
[850] There are many people at Tesla who have gone from working on the line to being in senior management.
[851] There is no lords and peasants.
[852] Everyone eats at the same table.
[853] Everyone parks in the same parking lot.
[854] You know, at GM there's a special elevator only for senior executives.
[855] We have no such thing at Tesla.
[856] You know, and the thing is, I actually know the people on the line, because I worked on the line, and I walked the line, and I slept in the factory, and I worked beside them.
[857] So I'm no stranger to them.
[858] And there are actually many times where I've said, well, can't we just hold a union vote?
[859] But apparently a company is not allowed to hold a union vote.
[860] So it has to be somehow called for by the union, but the company can't do it.
[861] So I said, well, let's just hold a vote and see what happens.
[862] The actual problem is the opposite.
[863] It's not that people are trapped at Tesla building cars.
[864] The challenge is how do we retain great people to do the hard work of building cars when they have like six other opportunities that they can do that are easier?
[865] That's the actual difficulty, is that building cars is hard work, and there are much easier jobs.
[866] And I just want to say that I'm incredibly appreciative of those who build cars, and they know it.
[867] You know, so there's, I don't know, maybe we will be unionized.
[868] I say, like, if Tesla gets unionized, it will be because we deserve it and we failed in some way.
[869] But we certainly try hard to, you know, ensure the prosperity of everyone.
[870] We give everyone stock options.
[871] We've made millionaires of many people who were just working the line, who didn't even know what stocks were.
[873] We're going to run out of time.
[874] Final couple quick questions.
[875] When do you have the time to tweet or to post?
[876] I actually think about it all the time.
[877] Well, I use the bathroom sometimes.
[878] You use it all the time, meaning, if we were to open up our phones and look at the screen time, what does yours look like?
[879] Well, about every three hours, I make a trip to the lavatory.
[880] And that's the only time you do this?
[881] Seems like you're on there a lot.
[882] No, I mean, there'll be like brief moments between meetings.
[883] I mean, it's not, obviously, I've got, like, 17 jobs, so, you know, um, no, I guess technically it's work at this point.
[884] It is, but I'm thinking just in terms of your mind share.
[885] I mean, by the way, there's a lot of people who should be working who are on this.
[886] Technically, posting on Twitter, or X, is work.
[887] It does count as work.
[888] So that's, you know, there's that.
[889] But no, I mean, I think I'm on, well, I guess, usually, you know, it's work.
[890] Probably I'm on for longer than I think I am.
[891] I know, but do you think that's five hours a day, four hours?
[892] If you look at the screen time of like a number of hours per week, sometimes that's a scary number.
[893] It's probably, I don't know, it's a little over an hour a day or something like that.
[894] Just an hour a day.
[895] If we really looked at this together, do you have your phone with you?
[896] Yeah.
[897] You want to look?
[898] Okay.
[899] Okay, here we go.
[900] You ready?
[901] Screen time.
[902] Hit general.
[903] Yeah, screen time.
[904] Sometimes that's a scary number.
[905] I know.
[906] That's why I thought...
[907] I just got a new phone, so I think this is not accurate because it's one minute.
[908] Pretty sure it's more than that.
[909] Wait, over the week, there we go.
[910] Yeah, go to the week.
[911] Okay, so it's still wrong.
[912] It's more than four minutes.
[913] I just got a new phone, so this is not accurate.
[914] It literally says four minutes.
[915] New phone.
[916] Did Tim Cook send you that phone?
[917] New phone, who dis?
[918] Yeah.
[919] I should ask, by the way, because I just mentioned Tim Cook, do you feel like you're going to have to have a battle with him eventually?
[920] Is that the next fight over the app store?
[921] The idea of making a phone? What do you mean, like...
[922] No, no, no. Over the app store.
[923] Would you ever make a phone?
[924] Sam Altman's apparently thinking about making a phone with Jony Ive.
[925] I mean, I don't think there's a real need to make a phone.
[926] I mean, if there's an essential need to make a phone, I'd make a phone.
[927] But I got a lot of fish to fry.
[928] So, I mean, I do think there's a fundamental challenge that phone makers have at this point because you've got basically a black rectangle.
[929] You know, how do you make that better?
[930] So do you want to do that?
[931] What does that look like in Elon's head?
[932] That's literally, yeah, good phrase, in the head: a neural link.
[933] Well, there we go.
[934] We need to touch on that before it's over.
[935] You know, the best interface would be a neural interface directly to your brain.
[936] So that would be a neural link.
[937] How far are we, do you think, from that?
[938] And how excited or scary does that seem to be?
[939] And we read these headlines, obviously, about monkeys who died, as you know.
[940] What should we think about that?
[941] Yeah, actually, the USDA inspector who came by Neuralink's facilities literally said, in her entire career, she has never seen a better animal care facility.
[942] We are the nicest to animals that you could possibly be, even to the rats and mice, even though they did the plague and everything.
[943] So it is like monkey paradise.
[944] So the thing that gets conflated is that there were some terminal monkeys where, you know, this was actually several years ago, where the monkeys were about to die, and we were like, okay, we've got an experimental device; it's the kind of thing we should only put on a monkey that's about to die.
[945] And then, you know, when the monkey died, it didn't die because of the Neuralink; it died because it, you know, had a terminal case of cancer or something like that.
[946] So, Neuralink has never caused the death of a monkey.
[947] Unless they're hiding something from me, it has never caused the death of a monkey.
[948] And in fact, we've now had monkeys with Neuralink implants for like two, three years, and they're doing great.
[950] And we've even replaced the Neuralink twice.
[951] And we're getting ready to do the first implants, hopefully in a few months.
[952] The early implementations of Neuralink, I think, are unequivocally good.
[953] Speaking of the double-edged sword, I think these early implementations are a single-edged sword, because the first implementations will be to enable people who have lost the brain-body connection to operate a computer or a phone faster than someone who has hands that work.
[954] So you can imagine if Stephen Hawking could communicate faster than someone who had full-body functionality.
[955] How incredible that would be.
[956] Well, that's what this device will do.
[957] And we should have proof of that in a human, hopefully in a few months.
[958] It already works in monkeys, and it works quite well; monkeys can play video games just by thinking.
[959] And the next application, after those dealing with tetraplegics or quadriplegics, is going to be vision.
[960] Vision is the next thing.
[961] So if somebody has lost both eyes, or the optic nerve has failed, basically where they have no possibility of some ocular correction, that will be the next thing for Neuralink: a direct vision interface. And in fact, then you could be like Geordi La Forge from Star Trek. You could see in any frequency, actually. You could see in radar if you want. Two final questions, and then we're going to end this conversation, which I think has taken everybody inside the mind of Elon Musk today. Not as well as Neuralink will. It actually goes to self-driving cars and vision and everything else.
[962] And I asked this question of Pete Buttigieg, the transportation secretary.
[963] It's actually something you retweeted.
[964] So I wanted to ask you the same question.
[965] There's a big question about autonomous vehicles and the safety of them.
[966] But there's also a question about when it will be politically palatable in this country for people to die in cars that are controlled by computers, which is to say, we have 35,000 to 40,000 deaths every year in this country.
[967] Yeah.
[968] If you could bring that number down to 10,000, or 5,000, that might be a great thing.
[969] But do we think that the country will accept the idea that 5,000 people, that your family, might have perished in a vehicle as a result not of a human making a mistake, but of a computer? Yes. Well, first of all, humans are terrible drivers. People text and drive, they drink and drive, they get into arguments, they do all sorts of things in cars that they should not do. So it's actually remarkable that there are not more deaths than there are.
[970] What we'll find with computer driving is, I think, probably an order of magnitude reduction in deaths.
[971] Now, the U.S. actually has far fewer deaths per capita than the rest of the world.
[972] If you go worldwide, I think there's something close to a million deaths per year due to automotive accidents.
[973] So I think computer driving will probably drop that by 90 % or more.
[974] It won't be perfect, but it'll be 10 times better.
[975] And do you think that the public will accept that?
[976] Do you think the government will accept that?
[977] Well, in large numbers, it will simply be so obviously true that it really cannot be denied.
[978] And what do you think?
[979] I know we've talked about the timeline before, and I know people have criticized you for putting out timelines that may not have come true just yet.
[980] But what do you think? Was it honest? And by the way, do you ever say to yourself, I shouldn't have said that? Sure, of course. Or, wait, I should have said that. So, yeah, I mean, I'm naturally optimistic about timescales. If I was not naturally optimistic, I wouldn't be doing the things that I'm doing. I mean, I certainly wouldn't have started a rocket company or a car company if I didn't have some sort of pathological optimism, frankly. So, as you pointed out, many people said they would fail.
[981] And in fact, I said, actually, I agreed with them.
[982] I said, yes, it probably will fail.
[983] And they're like, hmm, okay.
[984] But I thought it's basic since I had less than 10 % chance of success when we started them.
[985] So, yeah, anyway, but the self -driving thing is, I've been optimistic about it.
[986] We certainly made a lot of progress.
[987] If anybody has tried, it has been using the sort of full self -driving beta, the progress is, you know, every year has been substantial.
[988] It's really now at the point where in most places, it'll take you from one place to another with no interventions.
[989] And the data is unequivocal that supervised full self-driving is somewhere around four times safer, or maybe more, than human driving by itself. So, you know, I can certainly see it coming. You actually really think it's another five or ten years? I mean, people... No, no, definitely not. Definitely not. Do you feel like investors have invested in something that hasn't happened yet? Is that fair to them? That's the other question that people have about that. Well, I mean, I think that they've all, with rare exception, thought it wasn't happening.
[990] So they were investing despite thinking it isn't real; they're very clear that they don't think it's real.
[991] So they're not saying, oh, we just believe everything Elon says, hook, line, and sinker.
[992] But the thing is, I mean, it would be a fair criticism of me to say that I'm late, but I always deliver in the end.
[993] Let me ask you the final question.
[994] I took note of this.
[995] It was November 11th, and you took to Twitter and you wrote only two words.
[996] You said, amplify empathy. Right. I was taken aback by that, given all the things that have been going on in the world. Do you remember what you were thinking? Well, I think it's quite literally... I understand it, but what was going on? Why did you write that? Well, I was encouraging people to amplify empathy. Literally. I tend to be quite literal. But was there something that had happened, that you had seen, that made you say to yourself, I want to say that?
[997] I think I was talking to some friends, and we all agreed that we should try to amplify empathy, and so I wrote, amplify empathy.
[998] If you wanted an unvarnished look inside the mind of Elon Musk, I think you just saw it.
[999] Look, sometimes it's pretty simple, you know.
[1000] Elon Musk, thank you very, very much for the conversation.
[1001] All right, thank you.
[1002] Thank you.
[1003] Thank you so much.
[1004] That was a conversation from the Deal Book Summit.
[1005] You can check this feed for other interviews from the Deal Book stage, where we speak to leaders in business, politics, and culture who are shaping the world.
[1006] This episode was produced by Evan Roberts.
[1007] It was edited by Elaine Chen.
[1008] Mixing by Kelly Peeklo.
[1009] Original music by Daniel Powell.
[1010] The rest of the Deal Book events team includes Julie Zahn, Caroline Brunel, Haley Duffy, Angela Austin, Haley Hess, Dana Priskowski, Matt Kaiser, and Yen Wei Liu.
[1011] Special thanks to Sam Dolnik, Nina Lassum, Ravi Mattoe, Beth Weinstein, and Kate Carrington.
[1012] This is a production of The New York Times.