The Bulwark Podcast
[0] Hello and welcome to the Bulwark Podcast.
[1] I'm delighted to be here with Kara Swisher, host of the On with Kara Swisher podcast, co-host of the Pivot podcast, which is always right around the Bulwark podcast on the Apple charts.
[2] Not that I obsess over those.
[3] I do.
[4] I haven't loved it.
[5] And she's the author of three books, including the brand new memoir, Burn Book.
[6] Thanks for doing this, Kara.
[7] Thank you for having me. So if you don't mind, I want to start, like, as I was reading the book, I became obsessed with one question that's kind of not really a topic of the book, but it's adjacent.
[8] So if you don't mind, can we do one big picture question?
[9] Nope, not at all.
[10] It's your podcast.
[11] Let's do it.
[12] Sure.
[13] So, you know, because you started this, what, in the 90s?
[14] In the 90s, you started reporting on this?
[15] Early 90s.
[16] So I was thinking about this as I was reading through.
[17] And I was like, going back through the 90s into the mid-aughts, maybe even the late aughts.
[18] If you asked people if they thought the technological advances to date were, like, good or bad for society.
[19] Basically, everybody would have said good.
[20] There are always some Luddites, but basically everybody would have said positive.
[21] Today, if I raise that question with my peers, there's a lot of uncertainty.
[22] And so I wonder where you fall on that spectrum, sitting here now in 2024.
[23] Has it been a net positive, this transformation you've covered, or net negative?
[24] It's interesting.
[25] I don't think you could even net it out, right?
[26] I think one of the key quotes in the book is the Paul Virilio quote, which is: when you invent the ship, you invent the shipwreck; when you invent electricity, you invent electrocution.
[27] And I think that is, you know, is electricity a net positive or cars a net positive?
[28] Yeah, sounds like, yeah.
[29] It is, but maybe not.
[30] If we, if the planet burns up, right?
[31] Sure.
[32] We don't know what we don't know in the future and how, where things are going to come out.
[33] And we never will, because things change over time.
[34] I would say it's a net positive and has the potential to be a real net positive.
[35] But I would say the negatives have far outweighed the positives in some critical areas like democracy, right?
[36] And the deleterious effects of wealth, the deleterious effects of partisanship have been boosted and amplified by social media and these technologies.
[37] And it's given some very bad people an ability to be very bad at scale.
[38] And so that's been super problematic for any kind of comity.
[39] And it doesn't have to pull us apart, but that's what these tools have been used for for the most part.
[40] Yeah.
[41] I mean, I wonder if that's true, though, that it doesn't have to.
[42] And I guess my challenge to that is, I think about the phones, right?
[43] Because in the book, I admitted to you in the green room, I hopped around.
[44] It's good.
[45] It's long.
[46] You know, I'm trying to get through everything.
[47] But I hopped around and I hopped to the Apple chapter, or the various parts where you talk about Apple.
[48] And, you know, Cook and Jobs are, on balance —
[49] I mean, you paint three dimensional pictures, but on balance, you know, more towards the white hat side of things.
[50] I would agree.
[51] You know, I really think about this question and it kind of comes down to a lot of the negatives that have happened have been downstream of the hardware phone question, right?
[52] And if you think about the loneliness, the teen suicide, the democracy, the polarization, right?
[53] The fact that we're getting all of this right now on our handheld device.
[54] And I just, I do wonder if you look back on that with a little bit of, man, I don't know.
[55] Is this a cigarette situation?
[56] I'm not doing a, you know, guns don't kill people.
[57] People kill people thing here.
[58] But in this case, you know, it's just a phone.
[59] It could be used for a lot of things.
[60] It's a tool.
[61] You know, Brad Smith at Microsoft had a book, which I thought was very smart.
[63] I think it was Tools and Weapons.
[64] It's either a weapon or a tool.
[65] Every single one. Nuclear power: tool, yes, for sure, and a very promising one.
[66] Weapons, absolutely.
[67] You know what I mean?
[68] That kind of thing.
[69] And I think it's just how you use these things, in this case, a screen, a TV screen.
[70] Is that a weapon?
[71] Because it's used to broadcast propaganda by Donald Trump?
[72] Yes, but it really is the propaganda, right?
[73] Really is what he's saying.
[74] And he would find any media.
[75] I say, you know, Hitler didn't need Instagram, did he?
[76] He had lots of other tools.
[77] But if he had had it, it would have been a very powerful piece of technology for him.
[78] And so I tend not to blame the items themselves, even the software itself.
[79] What I do blame is when they — for example, Facebook — allowed lots of people to come into some of these chat groups the way they did; they didn't limit it.
[80] So rage could move very fast.
[81] I do blame certain social networks for pushing more virality over context and accuracy.
[82] I blame them for doing it at speed.
[83] I blame them for not putting safety measures in place.
[84] Once it's deployed, how they manage it is more what I worry about.
[85] And so I think it's very hard to blame the device itself, except if it's built in a way that is addictive, which I think some of these things are, some of the software is, or it's built in a way so you can't put it down.
[86] It's like cigarettes.
[87] It's addiction.
[88] Yeah, I basically agree with that.
[89] It's just so I'm addicted.
[90] So I'm on the addict list.
[91] So I'm just trying to work through it.
[92] I can see that.
[93] I can see it from your activity.
[94] I don't hide it.
[95] I don't hide my addiction at all.
[96] And so, you know, but when I'm on X, you see somebody will put up a viral chart.
[97] And they'll be like, oh, my goodness, loneliness, like look at how much it's dropped or happiness.
[98] Look how much it's dropped.
[99] Or polarization, look how much it's up.
[100] It always feels like it's like 2011, 2013.
[101] Like that chart starts to go up or down.
[102] And it's like, well, what was that?
[103] And it's like, well, it was about the time when everybody had phones and social media in their hands.
[104] Yeah, it wasn't phones as much.
[105] It was social media on top of it and the addictive nature, you know.
[106] Yeah, the smartphone element, right?
[107] Not like the flip phone.
[108] Yeah, yeah.
[109] Yeah, the social media on your phone, I guess, the combination.
[110] Yeah, exactly.
[111] But it's a combination of all of them.
[112] Tristan Harris has become an advocate against this stuff, you know, against the way tech is used.
[113] He worked for Google.
[114] And one of the things, you know, he is absolutely right about is this stuff crawls down your brainstem, right?
[115] It appeals to human nature.
[116] There's lots of parts of human nature. There's the part that wants to coalesce and be with people.
[118] And then there's a part of human nature that wants to be by themselves, you know, sniff in the whatever, right?
[119] And with your addictive whatever-it-happens-to-be — in many cases, it's a phone.
[120] These things are built for addiction.
[121] And they don't have to be, right?
[122] You can turn, for example, if you hit the side of an iPhone three times, it becomes black and white.
[123] And it becomes less interesting to people on a visceral level.
[124] And you don't touch it as much if it's black and white.
[125] I know it sounds dumb, but it really does work.
[126] And they can do a lot of things that don't make you descend into addiction.
[127] They could put, like, an Uber app — are you spending a lot of time on your Uber app?
[128] No, you call it, you use it, it goes.
[129] That should be on the front page.
[130] Facebook should be deep in a folder, so it takes a minute to get to it.
[131] But they don't naturally do that.
[132] The other thing is when they design these things, it's very much like a casino where if you push this button, you want to push it, don't you?
[133] Push this button.
[134] And it goes way back to AOL days when I was covering it.
[135] It was here in Washington, D.C. Britney Spears was a big clicker.
[136] I mean, people would click on anything about Britney Spears back then.
[137] And one of the pictures of her was fuzzy on the front page of AOL.
[138] And I said, why is that fuzzy?
[139] Why is that picture fuzzy?
[140] And this guy who ran the front page said, well, we make it fuzzy, so they click in.
[141] They lean in.
[142] They click in.
[143] You know, it's the same thing as a casino or whatever else is used.
[144] Well, now we literally have the casinos on our phone, too.
[145] That's right.
[146] You know, the sports gambling, so we do both.
[147] You don't have to design it like that.
[148] It doesn't have to be designed so it appeals to addictive qualities.
[149] And that is on tech companies.
[150] The way they design the software is designed to make you not be able to put it down.
[151] You described yourself in the book kind of as a Cassandra about some of these threats.
[152] There have been some critics that have said, oh, well, even you were too chummy early on.
[153] Can you just kind of talk about that process for you?
[154] You're living amidst this.
[155] I'm sorry to tell you, it's largely from men I competed with and beat.
[156] So fine.
[157] All right, we'll put it there.
[159] When I started my career at the Washington Post and Wall Street Journal, I was a beat reporter.
[160] You know what that is.
[161] You can't go off on these assholes, right?
[162] Like, you cannot do that.
[163] This happened today at Google.
[164] This happened today.
[165] You know, it was you're a news reporter.
[166] That's what you do.
[167] And, you know, it's like when they accuse people who cover Trump of that.
[168] I'm like, they're beat reporters.
[169] What do you want?
[170] You want them to go in with like a hammer at him?
[171] Like, I'm not sure what.
[172] We need to learn.
[173] We need to know.
[174] I'm always a Maggie Haberman defender on this.
[175] There needs to be Maggie Haberman's and people that shout about how terrible what she reports is.
[176] She's just not.
[177] It's not true.
[178] It's just, but I see why they do it because they hate them.
[179] They want her to do something about it because she's near him.
[180] Fine.
[181] She's a beat reporter.
[182] I'm sorry, kids.
[183] That's what she does.
[184] She tells you what they're doing.
[185] Every now and then, the Times makes a mistake, but usually they do a pretty good job covering it in general.
[186] I know they're mad at the age thing, but whatever.
[187] That's because they want to win.
[188] That's different from anything else.
[189] So I was a beat reporter.
[190] And then when I left to do All Things D, a lot of these things are written by people who were born years after we were covering this stuff.
[191] We did very heavy -duty coverage and very critical coverage of Google trying to take over the search market.
[192] We did extensive coverage about sexual harassment in Silicon Valley around the Ellen Pao trial.
[193] We were one of the leading groups of people pushing on the disaster that was Uber, including its terrible CEO, Travis Kalanick.
[194] And there was this thing in the Times that I wasn't tough on them until 2020.
[195] Hey, why don't you look at the archives of the Times: in 2018, in my very first column for them, I called them digital arms dealers.
[196] That's nice.
[197] Like, give me a break.
[198] I'm sorry.
[199] It's just not true.
[200] And so, you know, chummy?
[201] I don't, I have to know them.
[202] I have to speak to them.
[203] You don't seem very chummy to me, by the way, just if I'm being honest.
[204] I would pick any 10 Fortune covers back then over Kara Swisher.
[205] Like, we were known as mean. And what was really interesting in this phenomenon is all these PR people that I had to deal with back then were like, I don't know who the fuck you were talking to, but she terrified us and was really not very nice to us — like, she was tough on us.
[206] And yeah, but PR people defending you, I guess, I don't know.
[207] But one of the things that drives me crazy about it is that we were among the first to call into question, through these interviews, Mark Zuckerberg and the anti-Semitic stuff.
[208] 2010, I did an interview with Mark in which Walt and I really drilled him on privacy.
[209] And we did — so much so that he started sweating and had to take off his hoodie.
[210] Yeah, exactly.
[211] So in a lot of ways, I'm like, what do you want me to do?
[212] I have to speak to them.
[213] It centers around Elon.
[214] I literally say in the book, I really loved what he was doing when he was doing space and cars.
[215] And he was a little bit of a narcissist.
[216] He was a little bit juvenile.
[217] I didn't see this coming.
[218] And for some reason, people are like, we knew it was coming.
[219] I'm like, where?
[220] Where did you write it was coming?
[221] Nobody did.
[222] Nobody saw this dramatic shift.
[223] Very few people, maybe one or two, in the industry around him.
[224] Everybody loved this guy.
[225] And he was more interesting than everyone else because he was doing significant things.
[226] Starlink was amazing.
[227] I'm sorry, it just is.
[228] And the fact if you say Starlink's amazing, everyone's like, you love Elon.
[229] I'm like, I really don't.
[230] You can see I don't.
[231] But Starlink was amazing.
[232] What he did with Tesla —
[233] It pushed forward electric vehicles.
[234] I'm sorry to tell you, but it was dead until he pushed it forward.
[235] It was.
[236] Same thing with space.
[237] He's innovated space.
[238] And this is a guy who attacks me regularly.
[239] I still say, I'm sorry, but you have to be honest about his accomplishments, even though he's become one of the more dangerous figures in technology.
[240] And now he is, so what are we going to do about that now?
[241] So that's what drives me nuts.
[242] I'm sort of like, okay.
[243] Who do you think — when I was asking around, what do I ask Kara?
[244] You said he was one of the most dangerous.
[245] Who is the most dangerous right now of our overlords?
[246] He is.
[247] Yeah.
[248] He's got money and means to sue.
[249] He's been suing all kinds of people.
[250] He just sued OpenAI today because he's, you know, he's hurt that they kicked him out, I think.
[251] But he has some cockamamie reason for it.
[252] You know, he sued — another Roberta Kaplan case — this group that was pointing out the hate on the platform.
[253] He's trying to quash their free speech is what he's doing.
[254] You know, he's got his mitts all over space, and he can decide things that our government should be deciding.
[255] He's got his mitts in Ukraine.
[256] He really is ill -equipped to do so.
[257] Our space program depends on Elon Musk right now.
[258] So that's not good.
[259] I interviewed Walter about this.
[260] I love Walter.
[261] Isaacson.
[262] And his point is right — it's like, it's a government problem.
[263] I heard your interview and we talked about your interview with him and there are some very good critiques of his book.
[264] But he is right on this point about the Starlink thing.
[265] This is our government.
[266] How do we get in the situation where this crazy person is responsible for this?
[267] I would agree.
[268] I think that's correct.
[269] It is our government's fault.
[270] But the fact of the matter is, that's a privatization that's been going on forever, right?
[271] The privatization of space.
[272] Our government, which built the internet, by the way — paid for the internet, created it, and then everybody else made money off of it except our government — really has abrogated its responsibility in basic research in AI.
[273] AI now is being run by private companies.
[274] That's why Elon's suing.
[275] He wants to get in, right?
[276] He wants to get in on it.
[277] And he's doing his own thing.
[278] But right now, AI is dominated by Microsoft.
[279] Open AI is a smaller company, but is dominated by all the bigs again.
[280] And so this is a critical national security issue and everything else.
[282] And our government is sitting on its hands.
[283] So, you know, I want to, I want to get to the AI thing, but just a couple of things on Elon.
[284] Just really quick, there's an NBC story from yesterday that's really good that lists all of the various oversight things Elon's dealing with right now, from the SEC to, you know, all the various agencies. And I think he's also lost his mind.
[285] But he's financially motivated, incentivized to try to help Trump this time because of all the threats facing him.
[286] He is.
[287] I'm curious your take on the psychology of this.
[288] He did a tweet yesterday.
[289] "I never went to therapy" — on my gravestone.
[290] We highly recommend therapy on this podcast.
[291] I don't know if you have any mutuals anymore with Elon and if you can help him get there.
[292] You know what I think that was?
[293] I said something publicly and they said, what can Elon do?
[294] I said, seek therapy.
[295] It might have been a joke, but it's good advice.
[296] Actually, he should seek therapy.
[297] You know, and I think he probably read that.
[298] I think the big thing about Elon is, is it related to Twitter, right?
[299] Is there something about the Twitter platform that breaks people's brains?
[300] because he's not alone on this — or was this underlying? And you have this hilarious story in the book where, like, you introduce Sulzberger to him, and you knew him personally.
[301] It's like, was this craziness always underneath and something triggered him, or was it something about the app?
[302] Like, how do you assess?
[303] Well, you know, he's always been a troubled person, and he doesn't hide it.
[304] Like, if you go back and look at some New York Times stories, he's sort of very emotional around when Tesla was in big trouble, and he's talked about it compared to a lot of people.
[305] He talks about his mental health struggles.
[306] He has several times.
[307] He said he's manic -depressive, I think, at one point.
[308] He doesn't hide his unhappiness, and he never has.
[309] And that made him unlike other people, because a lot of them feel robotic.
[310] And Elon always felt emotional all the time.
[311] You know, you could see, I ran into him at a party once, and I go, how's it going?
[312] He goes, I'm really lonely.
[313] And I was like, oh, okay, TMI.
[314] You have nine children.
[315] Yeah, I know.
[316] I was like, I didn't know what to say.
[317] I was like, oh, well, okay.
[318] All right, then I'll get a drink over here.
[319] Maybe if you had hugged him.
[320] Maybe if you'd hugged him in that moment, we wouldn't be here right now, Kara.
[321] No, thank you.
[322] You know, he wasn't dating someone, I think.
[323] It was weird.
[324] I remember being, I felt bad for him.
[325] And I think he has long mental health struggles.
[326] I think, as you saw in the Wall Street Journal, he enjoys medicating himself with a variety of drugs, self -medicating.
[327] And I think that story was very important to write because it links to some of the behavior.
[328] I think COVID was a real moment for him.
[329] You know, a lot of people got radicalized during COVID, the vaccine stuff.
[330] For some reason, he got pulled into that whole anti -vax kind of thing or questioning the vacs.
[331] And then he got into ivermectin.
[332] And we had an interview during that period where he just went off the rails.
[333] And he had never done that.
[334] I have to say in an interview, for sure, where he threatened to leave the interview because I doubted his intelligence on COVID.
[335] And I was like, I just don't think you know what you're talking about.
[336] And that offended him greatly.
[337] And he didn't leave the interview, of course, because he's such a paper tiger in that regard.
[339] And so, you know, I think it built.
[340] And there was always an element of these dank memes, boob and penis jokes, ha ha, boobs, you know.
[341] And I remember thinking when he did it a couple of times, God, this guy is in his 40s.
[342] What is he doing?
[343] This is kind of sad.
[344] Like, how sad, that kind of thing.
[345] But it was a minor part of his personality.
[346] Maybe his prefrontal cortex development.
[347] Yeah, I was like, oh, whatever, it's so juvenile, but okay.
[348] But I think Twitter did help do that.
[349] I think it was a combination of COVID.
[350] I think, as he got richer, you know, he's got all these people — it happens in politics, too, and they're not even rich.
[351] They have people around them licking them up and down all day.
[352] They think they own the world.
[353] They're so hypocritical.
[354] Like, you saw that Hunter Biden thing with Matt Gaetz, where he goes, you know, did you take drugs?
[355] He's like, you're not the person to be talking to me about that.
[356] But that's how Matt Gaetz is.
[357] He's, you know — come on, Matt Gaetz.
[358] We know you party.
[359] It's ridiculous.
[360] And to be so high -handed about drug use.
[361] By the way, I don't find anything, whatever, take whatever drugs you want.
[362] But I think he changed.
[363] He got radicalized.
[364] Stay away from needles, kids.
[365] Needles, needles, kids, yes.
[366] Stay away from needles, kids.
[367] I'm talking about, you know what I'm talking about.
[368] So he changed.
[369] He became radicalized.
[370] I know it sounds crazy, but the one thing that I remember him getting so upset about in one of the interviews was Biden did not invite him to a car confab, electric car confab.
[371] He had all the big ones.
[372] And I got to say, he is the pioneer of that, right?
[373] He was the pioneer of that.
[375] And he wasn't invited and he was so mad not to be invited.
[376] He was like a little much.
[377] I deserve to be there.
[378] I think that's where he turned on Biden.
[379] And I remember calling someone from the Biden administration.
[380] I was like, you know, they didn't because of the union issues.
[381] That was the problem there.
[382] Because it's not a unionized shop, Tesla isn't.
[383] And he was furious about that.
[384] It was fascinating to me. I'm like, what do you care?
[385] And he was like, I deserve to be there.
[386] So to sum it up, I think he's become more radicalized.
[387] I think he's changed.
[388] And he thinks because he's so rich, he thinks he's untouchable.
[389] And who does that remind you of?
[390] Who has changed also, by the way?
[391] Trump was not this way.
[392] Are you sure?
[393] He was a little bit.
[394] But it was harmless.
[395] It was harmless and silly and performative when he was on that show.
[396] A lot of it was tongue-in-cheek.
[397] You know what I mean?
[398] And then he became the character he was playing on TV.
[399] And it fed into the way he was.
[400] And by the way, now that we see all the sexual assault stuff over the years, it's like, oh, yes, no, he was always like this.
[401] But he hid it well, I guess.
[402] He hid it well.
[403] I see a little bit of a different parallel that it's kind of similar, though.
[404] When you talk about this rich guy resentment, that's hard for me to get.
[405] And one thing I was just, I was dying to ask you about is the Andreessen manifesto.
[406] Marc Andreessen is one of these guys people don't know, big venture capitalist.
[407] Also a brilliant guy, started Netscape.
[408] And, you know, a manifesto about tech optimism, I'm interested in your take on, and I just want to read one bit for it.
[409] Our enemy is the ivory tower, the know-it-all credentialed expert worldview, indulging in abstract theories, luxury beliefs, social engineering, disconnected from the real world, delusional, unelected, and unaccountable, playing God with everyone else's lives, with total insulation from the consequences.
[410] I have two questions about this.
[411] One, why are the richest people in the world so resentful of people in the supposed ivory tower?
[412] And why do they not realize he's talking about himself here?
[413] He's talking about himself.
[414] Very confused.
[415] He's a very, he's always been a very troubled person.
[416] I don't know what else to say.
[417] He's, he's a very difficult, complex person.
[418] And when I knew him, I used to talk to him almost nightly, which was interesting.
[419] Really?
[420] Yeah, we used to text about politics or text about, you know, different things.
[421] He's very gossipy.
[422] He was a very gossipy personality.
[423] He was.
[424] I'm sure he still is.
[425] That's about him.
[426] That is about him.
[427] These people in Silicon Valley, it's a miracle that they can see themselves in mirrors there.
[428] You know what I mean?
[429] It's a miracle.
[430] They're like vampires.
[431] They can't see themselves.
[432] And so...
[433] Why?
[434] What is it about?
[435] What is the resentment about?
[436] Well, it's, again, a combination of mental challenges, of extreme wealth, godlike tendencies.
[437] They all think they're in a video game in which they're ready player one.
[438] They think they know better because they know about one thing.
[439] They know about, oh, I'm going to tell you about Ukraine or whatever.
[440] By the way, one of the good things about tech is a natural questioning of the status quo.
[441] That's a good thing.
[442] Like, why are we doing it like this?
[443] But instead of why are we doing it like this, now the thing is what they're doing is bad and we must kill it.
[444] Like it's changed from let's try a new way to let's kill them because they're hurting us.
[445] And so they're contrary for a contrary sake, which is ridiculous.
[446] And it's infected people in the media too.
[447] Yeah.
[448] You know, really badly some people.
[449] Everybody.
[450] Like there's a whole bunch.
[451] There's a whole strain of, you know, Matt Taibbi, those people who were like lap dogs to Elon Musk.
[452] And then he kicked them, which was a surprise.
[453] You know, he kicked all of them.
[454] He's kicked everybody in that Twitter files thing.
[455] He's kicked them all.
[456] It's fantastic.
[457] I knew it would happen.
[458] That was kind of satisfying.
[459] It was sad.
[460] It was sad, actually, for them.
[461] But you knew where that was going.
[462] You know, they really feel like they're victims.
[463] One of the things that I used to get, because I was considered, although many men think I'm not tough enough, too bad.
[464] Mama's not mean enough, too bad.
[465] They used to call me mean.
[466] Like, these tech moguls would always call me when I'd write something, and they're like, you're mean to me. And I'm like, what are you talking about?
[467] Your company collapsed.
[468] I said it collapsed.
[469] They're like, yeah, that's real mean.
[470] And I was like — again, I would always say, I'm not your fucking mama.
[471] I don't know what your problem is.
[472] We're not friends.
[473] I'm not trying to get you.
[474] It's just facts.
[475] Your company collapsed.
[476] I would get that a lot.
[477] You're not nice to me. Or, you're my — there was that scene in the book with the Google guys, where I called them. I was writing a story about them trying to take over search.
[478] And this was early 2000s at some point, 2008 maybe.
[479] And I wrote this thing in Dr. Seuss form, saying, would not, could not have a monopoly, or something like that.
[480] I made it rhyme.
[481] I had covered the Microsoft trial — the antitrust trial — many years before, in the 90s.
[482] And I said, at least Microsoft knew they were thugs.
[483] These people pretend they're not.
[484] You know, they have their giant colorful balls and their pogo sticks and their soft food, but they're the same.
[485] It's the same killer.
[486] So they called me all hurt.
[487] They're like, that really hurt us calling us thugs.
[488] And I was like, well, I think you're thugs.
[489] I don't know what to tell you.
[490] And they said, we're not bad people.
[491] And, you know, they referenced their don't be evil, you know, thing.
[492] And I said, you know what, guys, I don't think you're evil.
[493] I really don't, actually.
[494] I said, I'm worried about what you're building.
[495] The next person is going to be evil.
[496] And they're coming.
[497] You know, they're coming.
[498] Evil is coming for this.
[499] These tools are so powerful.
[500] They're so pervasive.
[501] They can amplify really bad things.
[502] What you're building is dangerous.
[503] Even if you aren't bad, the next guy is sure to be bad.
[504] Or he's coming.
[505] The bad guy's coming.
[506] And they never got that.
[507] They never understood that. They never understood history or anything else.
[508] And that was very troubling to me about these people.
[509] And they would always say, you're mean.
[510] And I'm like, I'm not mean.
[511] I'm just, I'll tell you, one example is I wrote a column in the New York Times in 2019 in which I said: if Trump loses the election — this is my hypothetical — if Trump loses the election, he's going to start saying it was stolen. He's going to say it was a lie. He's going to perpetrate it up and down the online ecosystem. It's going to have resonance because it's going to go up and down, up and down. It's going to radicalize people, and then he's going to ask people to do something about it in the real world. It's going to jump from online into offline, and we are fucked if he does that. Like, this is going to get violent, because he had already started with violent phrases on Twitter before that.
[512] And I said, I think it was...
[513] He was doing it in 2016.
[514] Right, exactly.
[515] Right.
[516] I put this scenario out, right?
[517] Which is what happened, right?
[518] And I said, this is the most likely scenario based on what I've seen this guy do.
[519] I got calls from every one of those social media sites saying, how dare you say this?
[520] This is, this will never happen.
[521] I'm like, this will happen.
[522] This is exactly where this is headed.
[523] And they were mad at me for saying so.
[524] And, you know, and I said, I think at this moment, you are quickly becoming handmaidens to sedition.
[525] That's what you're doing.
[526] Yeah, let's do the Trump thing.
[527] Because JVL, in our newsletter yesterday, the Triad, created JVL's law, which I really liked, which is relevant to this.
[528] It says any person or institution not explicitly anti-Trump will become a tool for Trump's authoritarianism eventually.
[529] And this was true of all of them.
[530] And he was talking about the courts and Mitch McConnell, but this is true of the tech companies too.
[531] And I just, you know, all of these guys — you write about this in the book, about how, you know, Trump wins and then they all go to try to work him over, to try to meet with him, to try to be on the inside.
[532] That even includes the white hat guys.
[533] Tim Cook is out there, you know, trying to work Trump over, and they're putting out press releases together about manufacturing or whatever.
[534] Talk about that, how that was happening in real time and what you write about in the book, about these guys accommodating Trump and the dangers of that.
[535] Well, I hadn't been a beat reporter for a while, but I got the tip that they were all going, which was a shock to me because nobody said anything.
[536] And you know, these people are so performative.
[537] Everything they do requires a press release or a tweet or whatever.
[538] But suddenly it was silent because they were embarrassed.
[539] They had trashed Trump to me off the record a million times, right?
[540] Like, oh, what a clown, what a buffoon.
[541] A buffoon was the common word.
[542] And he can't win, and he's an idiot.
[543] We can work with Hillary.
[544] You know, that's what they thought was going to happen.
[545] And some of them were more explicit.
[546] Sheryl Sandberg, who was at Facebook, was a big supporter of Hillary Clinton.
[547] Meg Whitman had famously shifted.
[548] Now, by the way, she didn't go to the meeting.
[549] She said he's a despot, is what she said.
[550] She was a Republican.
[551] She was like the only Republican.
[552] The Never Trumpers were the ones to do the right thing.
[553] We were the ones.
[554] We see it clearly.
[555] Meg is a fellow traveler.
[556] Yeah, she was.
[557] And she was very, for her to shift like that was really quite something to watch.
[558] And because she was, she's conservative, but she's a typical Republican.
[559] And being a Republican in Silicon Valley in California right there, she was a unicorn.
[560] There's a couple of them.
[561] Chambers, I think was one.
[562] There's a couple, but not many.
[563] And there certainly were no Trumpers.
[564] There were no Trumpers.
[565] And so when I heard about this, I was literally with my son at a farmer's market.
[566] And I'm like, they're going where?
[567] All of them?
[568] And then I started to see who was going.
[569] And I was like, it's all of them going.
[570] And so I said, surely they're going to say something publicly about his comments on immigration because immigration built Silicon Valley.
[571] Surely they can't go to this meeting without making a statement about immigration.
[572] And I got on the phone with all of them, including Elon.
[573] He was the one who actually was like, listen, I don't think he's going to do this Muslim ban.
[574] I'm going to stop him.
[575] Blah, blah, blah, blah.
[576] Like, I'm Jesus kind of thing.
[577] And I said, you're not going to stop him.
[578] He said he's going to do it.
[579] This guy, for all his ridiculous clownishness, I think he's going to do it.
[580] He said so.
[581] He promised his people.
[582] This is not a hard thing to do, like the wall or whatever, but he said it.
[583] And I had counted it up.
[584] And I was like, he said it seven hundred and twelve times on the campaign trail.
[585] He's going to do it — he's a racist.
[586] He's a long -time racist.
[587] This guy has persistently been attacking people of color, so I feel like he's going to do it.
[588] And different people.
[589] And it's just, I don't know, anyway, I talked to all of them.
[590] They thought he wasn't going to do it.
[591] And they're like, we'll talk to him off the record.
[592] And I'm like, no, you're the powerful people.
[593] You're the ones who stand up for immigration because it's helped build your industry.
[594] And none of them did.
[595] And it was really something to see.
[596] And then they skulked out.
[597] They never made a statement.
[598] And Trump used the entire thing as a press release.
[599] Trump was smart enough to use it, and he did a little...
[600] Multiple times he used all them for press releases.
[601] Love me. I put them on my council.
[602] They're on my side.
[603] The smart guys are on my side.
[604] Tim Apple's bringing the jobs back to America from China, the whole thing.
[605] Which he wasn't, but okay, you know, and he got it wrong in lots of ways.
[606] But, you know, when he got it wrong, they didn't correct him either, by the way, which was fine.
[607] I got that.
[608] Someone from Apple was like, what are we going to do, say the president's an idiot?
[609] I said, we could start there.
[610] Maybe, yeah.
[611] But they can't.
[612] I got that one.
[613] I got that he's a polite man. He's not going to call him out right there.
[615] But all of them were happy to call me and insult him.
[616] But none of them were happy to do it on the record, which I thought was really nefarious.
[617] I just was like, you're kidding me. Welcome to my life, Kara.
[618] Yeah, I know.
[619] They wanted their money back.
[620] There was all this income and they wanted the money repatriated.
[621] It hadn't been repatriated, this cash that they wanted.
[622] They wanted tax breaks and they wanted no regulation.
[623] And so that's what they got.
[624] I want to do another area where you were warning and how it ties to now, which is media stuff.
[625] You warned all these companies, the Murdochs.
[626] You told Don Graham in the book that this would wipe out his classified business.
[627] He laughed and said, Ouch, ooh, guess he was wrong on that one.
[628] You can talk about that if you want, but I'm also, I'm more curious about where your warnings would be now to these media companies with particularly with regards to the AI and how things are going to get even worse, frankly, or more complicated at least, maybe not worse.
[629] When we have these technological upheavals — there was one in farming a long time ago.
[630] You know, a third of people used to be farmers.
[631] Now it's so tiny, the population of people who do farming.
[632] Same thing with manufacturing, mechanization and robots and things like that.
[633] That's changed that completely.
[634] Now it's coming for the white collar.
[635] This AI stuff is for white collar, really.
[636] And it's going to decimate certain industries and it's going to really change the way we work.
[637] And media is one of those places.
[638] I don't think decimation, but I do think we've already had the shit kicked out of us in terms of online advertising, which is now dominated by two tech companies — Google and Facebook, or Meta and Alphabet — which have sucked up all the digital advertising for the most part.
[640] And then some companies do okay, like the New York Times and some others.
[641] So the economic stuffing is knocked out of it to start with.
[642] And now these tools will make it so every single company that has information will be able to be much more efficient and cut costs.
[643] And where do you think the costs are? People.
[644] That's where most costs are.
[645] And so anything, you know, one of the lines I have in the book is anything that can be digitized, will be digitized.
[646] Now, it's not just digitized, but it's smart digitization.
[647] Like, it'll take — it'll do headlines; like, in media, it'll do all kinds of things.
[648] Now, it's not going to write stories or report them.
[649] That is, that is not true.
[650] But it can collate and collect information in a way that people used to do, so we don't need people to do that anymore.
[651] I worry a little less about the jobs than I do about the consumers.
[653] You know, your co-host, Scott Galloway — he's kind of AI-optimist-ish, with caveats, you know, smart about it.
[654] And so when I was pushing back on him on this, the one area where we kind of both are like, yeah, this one's tough, is I said to him: I sometimes get confused — not that often, but every once in a while, I'll get tricked by something online.
[656] And I am — we just talked about it — I'm an addict.
[657] I'm a, I consume more information than anyone.
[658] So if I'm getting tricked, what is my aunt going to do?
[659] What's my — you know, what are people that didn't go to college going to do with AI?
[660] I don't think anybody's even trying to come up with an answer to this.
[661] Well, I think it's going to not affect, you know, blue collar workers as much at all.
[662] I mean, some of it is.
[663] I mean, people are worried about, say, autonomous vehicles.
[664] I think there's not enough truck drivers, and I think a lot of truck driving should be automated.
[665] It's a dangerous job.
[666] And so it could change that industry in a good way, actually, you could see it.
[667] But it's very hard.
[668] I think one of the problems with tech is that it's addictive, but it's also necessary.
[669] You can't do your job in a white collar situation without digitization.
[670] You just can't.
[671] And so it's unavoidable.
[672] It's unavoidable and addictive, and it knocks the stuffing out of the economics of most businesses.
[673] That's really scary.
[674] What about the misinformation side of it, though?
[675] What about people getting confused, people not knowing what's real and what's fake?
[676] Well, it started with cable, like with Fox News, which is very effective, but now it's at scale, right?
[677] Now it's at scale.
[678] So people are getting all their news from Facebook; what Facebook picks to put in front of them is important.
[679] The problem is the people of Facebook don't care what they put in front of people.
[680] Nazis or cat videos.
[681] It's all the same to them.
[682] Right, kind of thing. And then it also gives you what you want, so if you start down one road, you get to the other road, right? And so it's a path of radicalization that happens. It used to be called propaganda, but now it's propaganda at scale — and it's do-it-yourself propaganda. They don't have to put up a poster, you know, in Berlin in the 30s depicting Jewish people as vermin, for example. They don't have to do that. You know what they can do? They can send an individual message to one person.
[683] They know their fears.
[684] They know what they like.
[685] They know their fears, what they like.
[686] They know their habits.
[687] They can send messaging that is so designed at them that it's dangerous.
[688] It really, it's designer propaganda is what it is and very much aimed at individual people.
[689] You know, I've talked about my mom just being totally, you know, just complete, and that's just Fox News during COVID.
[690] She's like, it's just the flu.
[691] That went on for a while.
[692] One time, which was incredible, I did an interview with Hillary Clinton, and my mom called me, and she goes, oh, that Hillary Clinton, she's saying this and this about people like me. People like me is their favorite phrase, right?
[693] They're trying to get us, people like me. And I said, that sounds vaguely familiar.
[694] And I said, can you just tell me more about it?
[695] And she started to tell me more about it.
[696] And it was my interview.
[697] She was quoting, except it was through the lens of right wing media, right?
[698] Which wasn't accurate at all.
[699] They had twisted it.
[700] And I said, mom, that's not what she said.
[701] Oh, she's like, no, that's what she said.
[702] And I go, it's your daughter, and it was my interview.
[703] It's not what she said.
[704] I made her go listen to it.
[705] And she did.
[706] And she came back.
[707] She's like, okay, that's not what she said.
[708] But she's still, you know, plotting against our country and really secretly running it.
[709] And I was like, like, it didn't matter.
[710] So that's propaganda.
[711] And it's very good.
[712] It's propaganda on speed is what it is.
[713] My hope level for our politicians' ability to actually regulate this in any way is just basically nil.
[714] I know that Marc Andreessen's worried that he's being overregulated.
[715] Our mutual friend Luther Lowe — he texted me when I was like, what should I ask Kara?
[716] He said, these guys can't even, you know, end the self -preferencing that he's obsessed with, right?
[717] Which is that like Google is, you know, is putting its own products at the top of Google search.
[718] So if government, if these guys can't regulate just the basic stuff about privacy, about, you know, self-preferencing, how in God's name are they going to handle the AI side of this?
[719] And there's — like, what is the optimistic angle on that?
[720] There isn't, because this is so private, right?
[721] That's the issue, is that not just AI, but space is private.
[722] Everything that's important that government used to have a hand in is private.
[723] AI is run by big companies.
[724] It's not run by our government.
[725] It's not.
[726] Our government doesn't have a handle on it.
[727] This is something our government, because of national security issues, because of all kinds of things, should be deep into, and they just aren't in the way they used to be, at least.
[728] And so now decisions are being made by big companies.
[729] I'm not sure what could happen.
[730] And also, there's a ton of money at stake.
[731] Like, Sam Altman is raising $7 trillion for a chip factory.
[732] Microsoft is a multi -trillion dollar company, Nvidia, which makes chips, a multi -trillion dollar company, Apple, a multi -trillion dollar company.
[733] You know, they're all — we've got guys making 180 grand a year in D.C. charged with trying to, you know, put some bumpers on this.
[734] And there's just no hope.
[735] But they haven't.
[736] They've had the chance for three decades now, and they haven't.
[737] And, you know, one of the things is, um, I was at an event last night, and Amy Klobuchar has tried her hardest to get even a basic antitrust bill through, you know, and she got kneecapped by the tech companies, who are spending in districts — including by Democrats, FYI, who just pulled away from her bills because they got kneecapped in their own districts or whatever.
[738] These companies have enormous lobbying organizations now that, that really can move the needle here.
[739] And it's like Standard Oil got it together before we could break it up, right?
[740] They got it together.
[741] And so, and there's so many of them.
[742] There's so many of them.
[743] This is my free market, my old free market coming through.
[744] I think that the regulatory side of this is more important than the competition side, right?
[745] I don't know.
[746] To me, I always think that the obsession with the antitrust is a little overstated.
[747] I'm just using it as one example.
[748] With the exception of Google, with the exception of Google, right?
[749] Because, yeah, right, because some of these, we see now.
[750] I mean, and even Google now — ChatGPT, like, there are other companies that are disrupting it.
[751] You know, people are like, Facebook's a monopoly.
[752] I'm like, really?
[753] There are 19 social media companies.
[754] I'm going to push back on that because I am also a capitalist myself.
[755] I've built lots of businesses.
[756] But in that time, they got to dominate digital advertising.
[757] In that time that Amazon didn't have to pay sales taxes when other retailers did, they got to dominate.
[758] So they get to dominate and build a great business off the backs and they don't pay the cost.
[759] Facebook got to dominate and didn't pay the price for the anti-Semitism — guess who's paying? It's like they're opiate makers.
[761] And we're saying, thank you, and you don't have to pay the price of your damage.
[762] It's, it's, you know, pharmaceutical companies don't get to do it.
[763] They definitely break laws, but there's laws to break.
[764] There are zero laws in place.
[765] I mean, there should be more than one, right?
[766] There just should be.
[767] And there aren't any.
[768] There's, and trust is just one of them.
[769] I just think we need to update our antitrust laws.
[770] It's 100 years now.
[771] Companies shifted, and it should be done smartly.
[772] I think we've got to look at algorithms.
[773] What is in those algorithms?
[774] How do they make decisions?
[775] Let's talk about safety.
[776] These are just basic things that don't hinder capitalism.
[777] Privacy.
[778] Why are they scraping your content and mine?
[779] At least we have copyright laws that are good.
[780] Should they be able to just shoplift your shit, Tim?
[781] No. I mean, why?
[782] And you have to sue them to get it back.
[783] I want a penny.
[784] Give me that Spotify cash.
[785] I want a .001 pennies for every time they use my tweets.
[786] You know, AOL was doing it.
[787] At one point, they're like, we make $50 for every user.
[788] I'm like, where's my VIG?
[789] Because it's my information.
[790] "The path is made by walking."
[791] It's a very famous quote.
[792] That's my walking.
[793] I want to be paid for my, you know, they scrape everybody's information and then they count themselves.
[794] We're down here in the content mines.
[795] Right.
[796] And then they say, you're welcome and also say, oh, you know, because it's capitalism.
[797] I'm like, is this capitalism?
[798] No, it's that the really top people get to use their positions to help them in other ways.
[800] I don't think that's competition.
[801] The way America wins is through innovation at the lower level.
[802] And if all of AI is dominated by big companies, do you think there's going to be a lot of little companies, which are the lifeblood of this country in terms of innovation?
[803] You know, I like these big companies, good for them, but we need little companies growing almost constantly to, one, be innovative and two to keep up in terms of things.
[804] And that's what's going to get killed here: real capitalism, which is what I am for.
[805] Okay, we're about out of time.
[806] A couple rapid fires.
[807] I've got my happy person, my happy character for the book.
[808] I did not know this about you, but you're kind of responsible for America's best governor, Jared Polis.
[809] I am.
[810] You know, because you told his mother to cash out on their digital greeting card company.
[811] Give us one sentence on that.
[812] Oh, Jared.
[813] He was such, he was like, you know that Michael J. Fox show where he was the conservative and his parents were hippies?
[814] Alex P. Keaton.
[815] He was Alex P. Keaton, a gay one.
[816] But he was Alex P. Keaton.
[817] And so...
[818] Yeah, same.
[819] Maybe this is why I'm so long.
[820] Yeah, that was interesting.
[821] It was called Blue Mountain Arts.
[822] At the time, they were buying traffic.
[823] Everybody was trying to buy traffic.
[824] And so they sold their company.
[825] It was a greeting card company.
[826] And he was doing candies and flowers.
[827] He was so funny.
[828] He was a funny little...
[829] And you got no vig on that either.
[830] But you suggested to his mom, maybe it was time to sell.
[831] They used to send me cards.
[832] I was like, I don't want your cards, real cards and stuff like that.
[833] I'd like cash on the barrel.
[834] On the other side of the equation, are the worst people in the world the friends of the actual genius entrepreneurs, who get rich off RSUs, like David Ballsacks?
[835] Are those the worst characters in the book?
[836] He's not in the book.
[837] He's just on the back of the book.
[838] I ignore him completely.
[839] I'm not interested in talking about him.
[840] That's probably the best way to do it.
[841] I'm not interested in talking about enablers and minions.
[842] I'm not.
[843] They don't interest me. They only interest me because they're real.
[844] I do think in some ways I kind of — I dislike Elon and Thiel and these people, but at least they were innovators. It's kind of the people that are riding on their backs, the ones that bug me. Okay, this is a request from a listener.
[845] You can, you can reject it if you want, and we're going to PG it up a little bit.
[846] Kiss, marry, disappear: Elon Musk, Mark Zuckerberg, Sam Altman.
[847] It's tough.
[848] Disappear, Elon.
[849] Go to Mars.
[850] Enjoy yourself.
[851] It would be great.
[852] We've had enough of you and take Bill Ackman with you.
[853] I probably would marry Sam Altman because it would be a beautiful gay marriage together with us.
[854] I guess I'd have to kiss Mark. He's actually — he's a nice fella. I'm not going to kiss Mark. He's got his muscles now, though. He's been working out. He was so skinny. So was Jeff Bezos. He was skinny skinny. They were both skinny little things. Speaking of the gays, I think the most lesbian clause ever written is in this book: "The hardware store is my safe space." Okay, my final question I ask you: you can tell one story that's in the book. I want to hear the ice sculpture story. Okay. So during this period of craziness — it's when everyone was adorable.
[855] My wife worked for Google many years after I had started covering them, but she went there.
[856] I stopped covering them when she went there.
[857] We went to this baby shower for Anne Wojcicki and Sergey Brin.
[858] They've since divorced, but they were having their baby.
[859] And you walked in and there were all these baby photos.
[860] And when you walked in, they always have these assistants.
[861] They're full of assistants, these people.
[862] And they all have swingy blonde hair, all of them, women.
[863] They're all women.
[864] And they said, would you like a onesie?
[865] Do you have a swinging blonde hair assistant?
[866] No, as you can see, I do not.
[867] It's just Kara and me and my Eeyore back there.
[869] They said, would you like a onesie or a diaper?
[870] And I was like, what?
[871] And so they made, they love dressing up.
[872] These people like forced fun.
[873] I used to call it forced joy.
[874] So they made you put on a diaper or a onesie and then gave you a sucker and a baby hat and a fake baby bottle to put liquor in.
[875] And I said, I'm not putting any of this on.
[876] There's no fucking way.
[877] You know, I'm not putting any of this stuff on.
[878] And a rattle.
[879] There was a rattle involved.
[880] And so I ran inside before they could make me do it.
[881] And they like chased me. And I was like, I'm not putting on this shit.
[882] And I walked inside and it was like a dystopian version of people pretending they were babies.
[883] And there was a bounce house.
[884] There was baby food, everything in little baby food jars, all the food was in baby jars.
[885] People were — Sergey was in a onesie on roller skates.
[886] There was all kinds of bouncy balls.
[887] It was just like, it was a nightmare.
[888] And I had toddlers.
[889] And so I was like, this is bullshit.
[890] Like, this is really weird.
[891] You know, I was like, I don't need this.
[892] I have it at home.
[893] I don't need this stuff.
[894] And I didn't like it.
[895] They'd always tried to get you to act like a child, which I hated.
[896] They had slides in their offices.
[897] Just metaphor there, I think.
[898] Yeah, like, we're childish, childlike, and I was like, you're childish, that's for sure.
[899] And so there was an ice sculpture there, too, which I was riveted to — it's a full torso of a woman.
[900] And out of the breasts came White Russians — you put your cup up to the boob, like it was breastfeeding.
[901] It was so ridiculous.
[902] And I look over, right near the breastfeeding is Gavin Newsom, who was mayor of San Francisco at the time.
[903] He's in one of his fantastic suits.
[904] That guy can dress, right?
[905] And he didn't have a diaper on.
[906] And I was like, he's like, how did you get out of it?
[907] I said, I ran.
[908] I wasn't going to do it.
[909] Dignity.
[910] And I said, how did you get out of it?
[911] He said, I knew you'd be here.
[912] And you would take my picture and that would be the end of my political career in the diaper at the behest of billionaires.
[913] And we were laughing hysterically, because calling to fact-check this with him was so funny.
[914] We were laughing so hard.
[915] He was like, oh, my God.
[916] This was not a hallucination, right?
[917] This really happened.
[918] I just need you to confirm.
[919] And then we had some of the White Russian and it was delicious.
[920] So I love that.
[921] It was just so — it was everything wrong and right with that time period.
[922] It was so fucking ridiculous.
[923] But at the same time, it was kind of sweet.
[924] It was weird and sweet and strange.
[925] And also what is wrong with these people.
[926] Also therapy inducing.
[927] Yeah, take this full circle.
[928] Kara Swisher, host of the podcast On with Kara Swisher and Pivot.
[929] She's got a new book, Burn Book.
[930] Go get it.
[931] Thank you so much for taking the time with us.
[932] Remember, I never wore a diaper and neither did Gavin Newsom.
[933] So vote for him for president.
[934] We didn't.
[935] We resisted the diaper.
[936] We resisted the diaper.
[937] It's a low president bar these days, but you know, we're going to take it.
[938] That's where we are right now.
[939] Thank you so much, Tim.
[940] I love your podcast.
[941] I love your work.
[942] But I would get off the internet a little bit for you.
[943] I have to say, you're very present.
[944] Thank you.
[945] That's good advice.
[946] I appreciate that.
[947] My husband agrees.
[948] We'll talk to you later.
[949] Okay.
[950] Thanks, Tim.
[951] Our podcast is produced by Katie Cooper with audio engineering and editing by Jason Brown.