The Bulwark Podcast XX
[0] Welcome to the Bulwark Podcast.
[1] I am Charlie Sykes.
[2] By the way, by the time you listen to this, you'll probably be a lot smarter than we are.
[3] Because you'll know what the House GOP did, whether or not Jim Jordan is, in fact, the Speaker of the House of Representatives.
[4] And our guest today is Kara Swisher, Kara, I have to tell you something.
[5] Can I make a confession?
[6] Sure, go ahead.
[7] The stupidity really does burn.
[8] Does it?
[9] Why?
[10] Are you surprised?
[11] Yeah, well, no. No, I shouldn't be, right?
[12] Because, I mean, we've been coming to this for so long.
[14] It just, the fact that we have to talk about the possibility, as absurd, as ludicrous as it is, of Jim Jordan becoming speaker, probably one of the least effective members in a pretty mediocre body, you know, a coup co-conspirator, complete idiot.
[15] What fascinates me is this logic that Jim Jordan is a legislative terrorist.
[16] Now, that's John Boehner's line.
[17] He's a legislative terrorist and he's an extremist.
[18] And so the argument is, okay, so we need a terrorist in there to keep the other terrorists in line.
[19] We have an extremism problem.
[20] So let's select an extremist, right?
[21] Well, apparently he's nice now.
[22] I don't know.
[23] That's what I'm hearing.
[24] You know, there's a really good line.
[25] Sorry, I sound so bad.
[26] I have a cold.
[27] I have toddlers.
[28] But there's a really good line.
[29] There's a woman named Louise Glück, who just died this week, very famous poet.
[30] And she has a poem called Circe's Power.
[31] And the line I love from this poem is, I never turned anyone into a pig.
[32] Some people are pigs.
[33] I make them look like pigs.
[34] That's what I think about when I think about a lot of these people.
[36] That's what they are.
[37] You know, not pigs per se, but this is what they are.
[38] And they try to pretend they're not.
[39] And they either use social media or a new, this is what we are now.
[40] And they try to manipulate people and use people's good intentions of maybe this time.
[41] And there is no maybe this time.
[42] They are who they are the whole time.
[43] I was not expecting poetry this morning in a discussion about Jim Jordan.
[44] By the way, I should introduce our guest today who really does need no introduction.
[45] Kara Swisher, host of the podcast On with Kara Swisher, co-host of the podcast Pivot, and author of the forthcoming book Burn Book: A Tech Love Story.
[46] Yeah.
[47] We'll be out in March.
[48] February now.
[49] February, actually.
[50] February, actually.
[51] They moved it up.
[52] They moved it up.
[53] Yeah.
[54] So the fact that Jim Jordan spent the weekend bullying the moderates.
[55] And, you know, I suppose it's cynical, but in my newsletter I quoted somebody as saying, you know, moderates always cave.
[56] You know, it's a story as old as time.
[57] They do.
[58] The normies are just like... and apparently they are still afraid of a mean tweet.
[60] They are still terrified of Sean Hannity burning them.
[61] Right.
[62] Well, this is the point.
[63] This is the point.
[64] If people are shameless and they're willing to do it, people will do it.
[65] And they have this tool now, which is social media, to do it.
[66] I think most people don't pay attention to this stuff, but it has an outsized power in Washington, or they think it does: that they will be primaried or that they'll be made fun of on Fox News.
[67] and it's certainly effective in getting people to cave.
[68] I just think if more people stood up, like say John McCain did many years ago, if you remember when he did his thumbs-down thing, that more people would realize they have power in their hands, and the people without power are the ones who, you know, use bullying as governing, which is not governing.
[69] So later it will be a problem because he bullied them into it, right?
[70] But today it works and it doesn't really work ever.
[71] The house is still going to be dysfunctional.
[72] It's still going to be a complete clown car, because whoever the speaker is still has that narrow majority, and you have the, you know, crazed, slavering jackal caucus that still has power.
[73] But you make an interesting point.
[74] Right.
[75] You know, one of the things that I think would have surprised the founders is the willingness of so many people in power to step back and give up their power.
[76] They thought people would be jealous of their power.
[77] The number of senators or centrist congressmen, committee chairmen, who've decided, you know what?
[78] I could be relevant.
[79] I could have clout, but I'm going to turn myself into a potted plant, is pretty amazing.
[81] I'm not surprised.
[82] I think they did anticipate it.
[83] They did anticipate this idea that people would give up power, and they were scared of it, actually.
[84] They were scared of that man who had no shame, right?
[85] They anticipated it many times in much of their writing.
[86] And I think that's why they designed it the way they did, which was because they understood that someone who felt like a king would show up at some point.
[87] It took this long for it to happen, and that such a person would have no governor.
[88] And what's interesting, is that it does work.
[89] And ultimately, if you have no shame, you can do very well in politics these days.
[90] I don't think you actually are effective in the end.
[91] No, not necessarily effective.
[92] I mean, but it depends on how you define effectiveness.
[93] So, you know, Jim Jordan, if he becomes the speaker, he'll be able to push through, what, defunding the prosecution of Donald Trump.
[94] Yeah, but I mean, he won't actually defund it.
[95] But, I mean, it's effective in the sense that, you know, he gets to, you know, pose for holy pictures.
[96] He gets the clicks.
[97] He gets the buzz, he gets the love.
[98] Right, right.
[99] It's all about attention seeking.
[100] I mean, that's what's happening among the tech people.
[101] It's all about, look at me, it's narcissism, you know, writ large.
[102] And I think what's interesting is, I was looking at the election in Poland.
[103] I think most people don't like this.
[104] And everyone's like surprised at the nationalist loss.
[105] I'm not.
[106] People are tired of this hypocrisy.
[107] And I think regular voters are for sure.
[108] Not everybody.
[109] There's going to be a group of people that no matter what you do believes in this because they've been fully propagandized through social media, through constant repetition, and they will be that way, and that's the way they are, and there's nothing you can do about it.
[110] Now, most people are not like this.
[111] So talk to me about how this played out on social media, because, you know, on one level, it appears that what used to be known as Twitter, which has been completely fucked over by Elon Musk, is less influential.
[112] But I just get kind of the sense that the siloing has become more intense.
[113] But so give me your sense of how this speaker's race played out on social media.
[114] Yeah, I think it's just being destroyed in real time.
[115] You're seeing smaller and smaller echo chambers between and among people.
[116] And then it's fed by a media, I'm sorry to say, that just covers it breathlessly.
[117] Like, now this person's out.
[118] Now that person.
[119] That's a lot of information and no meaning.
[120] What does it actually mean?
[121] And so it's designed to be like that.
[122] It's designed for enragement.
[123] It's designed for engagement.
[124] but it's not designed for wisdom.
[125] And it's also addictive.
[126] You know it.
[127] You can't stop looking.
[128] Yes.
[129] And so what actually happens?
[130] What's the actual result?
[131] Which is nothing, which is gridlock, which is nothing ever passes.
[132] And slowly but surely, they lose elections and they're not comfortable.
[133] They're much more comfortable being, and I hate to use this term, but a bomb thrower than an actual governor, right?
[134] It's much easier to do that.
[135] And social media makes that easier: from a word point of view, you throw, you know, verbal bombs all around the place and call attention to yourself.
[136] That's, you know, that's the phenomenon of George Santos.
[137] Why are we looking at this incompetent person?
[138] Why?
[139] Like, every minute you look at him is a waste of a minute you have in your life.
[140] But it's, it's addictive.
[141] Yeah, it is addictive and I've wasted so many minutes.
[142] So let's just stay on Jim Jordan for a moment.
[143] And we've talked about the role that he played on January 6th, his long track record of legislative incompetence, inefficiency, extremism.
[144] He's the guy responsible for that tweet.
[145] You know, was it Elon, Kanye, Trump?
[146] Yeah.
[147] But as you pointed out on your podcast, there's more stuff coming on him, which makes this all the more interesting, because he's got all of this baggage, all of these questions, all of this history.
[148] And yet, there are more shoes to drop with Jim Jordan, aren't there?
[149] Yes, 100%.
[150] There are these stories around what he did when he was a wrestling coach.
[151] Now, look, everyone's innocent till proven guilty.
[152] But there's a lot of material here about how he behaved.
[153] That is problematic, at the very least.
[154] We'll see how the reporting comes out.
[155] Apparently, the Washington Post is working on a piece about this.
[156] It sort of reminds me, and I don't remember the guy's name, because I've luckily forgotten him, who was running in Alabama, who had all kinds of hair on him.
[157] Roy Moore?
[158] Yeah, picking up girls in malls, that kind of thing.
[159] In this internet age, people are narcissists, and they think they can't be touched.
[160] They think there aren't repercussions for their behaviors long ago.
[161] Now, again, I don't know if he did what they said.
[162] I certainly have listened to a lot of testimony from wrestlers who said he did it.
[163] We'll see.
[164] We'll see if that happens in these reported pieces, if there's lawsuits.
[165] It shows the height of arrogance to think that this is not going to be a problem.
[166] I think it is going to be a problem.
[167] And to put yourself out there says to me, wow, you really don't think you can be touched.
[168] And of course, in the internet age, you know, there's that line.
[169] It's not erasable.
[170] The internet age is not erasable.
[171] And you know, everything is written in indelible ink, and that's an issue. And of course, then there's the propaganda that's on top of it. And Jordan in particular, I've dealt with him mostly around tech stuff. He's completely ignorant around tech stuff, and whenever he gets in trouble, he starts yelling First Amendment. And I'm sort of like, that's not the point here. But he does that; he's cosplaying a defender of the First Amendment, which of course he's not. That's when I've had that experience with him, where I'm like, you're actually not very smart, is what my problem with him is.
[172] Again, and that's one of the reasons why he is not ready for prime time.
[173] I mean, so the spotlight is going to get even brighter on him.
[174] The Washington Post is working on this investigative piece, which you talked about.
[175] George Clooney has this documentary with HBO on the allegations that he covered up.
[176] So we'll see that.
[177] Now, you use the word arrogance, but it also goes back to this shameless political culture.
[178] And politicians are looking around in the age of Trump and going, you know what?
[179] this stuff would have killed me, you know, five, six, seven years ago, but nothing matters anymore.
[180] My base will not care.
[181] They don't care.
[182] They wouldn't care if Donald Trump was aborting babies in the White House.
[183] They won't care any of this stuff.
[184] So, I mean, part of it is that you're living in a post-shame world, because Jim Jordan wouldn't be possible except in a post-shame world, right?
[185] Well, you know, there were people like him, like Huey Long.
[186] We've had a history of these kinds of, you know, cosplayers, is what I call them.
[187] Yeah, demagogues.
[188] It's not a new, fresh thing for the United States of America or anywhere, really.
[189] The problem is the only one who's actually good at it is Donald Trump.
[190] Like, let me just say, he is the greatest troll in history, and he's quite good at it.
[191] I don't think these other people have as much protection when they start down this road, right?
[192] I think there's two people who actually do very well, and they're quite opposite, which is AOC, who's very deft at social media, and Donald Trump.
[193] And I wrote a column comparing them.
[194] They're very different people, very different messaging.
[195] But I think everyone else really is, you know, I sort of am like, step away from the keyboard.
[196] Step away right now.
[197] And you see it everywhere, whether it's this Israel-Hamas thing or Ukraine.
[198] Everybody's an expert when they have no expertise.
[199] And, you know, they don't have any governor because it's addictive.
[200] But they all think they're experts.
[201] That's also part of the addiction of social media.
[202] Yeah.
[203] Is that you spend enough time and you think you are smart.
[204] What you really have is a couple of phrases, bumper stickers that are tacked together.
[205] It is a misinformation universe, which brings me back to the thing I really wanted to talk to you about.
[206] The last time you and I spoke, I think was just about the time that Elon Musk was taking over Twitter.
[207] And nobody could really have anticipated what a complete shit show it was going to be.
[208] I kind of did.
[209] I kind of did, but okay.
[210] Okay, but I mean, wow, Kara.
[211] So, I mean, can we just step back here?
[212] Because Elon Musk is going through some things.
[213] You know the guy.
[214] You have dealt with the guy in the past.
[215] So is he intentionally trying to destroy Twitter, or does he just have no fucking idea what he's doing?
[216] You know, just because people are good at one thing...
[217] And let me just say he's very good at rockets.
[218] He's very good at cars, although he's got a lot more competition in the latter.
[219] Be nice if the self-driving cars stopped killing people.
[220] Well, they're not that bad.
[221] No. They're not that bad.
[222] People kill a lot of people in cars, just FYI.
[223] They do.
[224] The problem of people driving is really the problem.
[225] Cars are pretty good, actually.
[226] I've been driving a lot of them in San Francisco, and they're getting pretty good.
[227] Again, the problem is people bashing into them.
[228] They have issues, but they'll fix them.
[229] They will, 100%.
[230] Digression.
[231] Okay, digression.
[232] I apologize for the digression.
[233] That's okay.
[234] That's okay.
[235] I just think they don't, actually.
[236] And this is where I agree with Elon Musk in that regard.
[237] The things he knows about, he does know about 100%.
[238] He's a little bit of a showman.
[239] Sometimes he doesn't know things in tech and talks about them anyway.
[240] But he's pretty smart in all the things around tech.
[241] He's very smart.
[242] I think when you move into media, it's a very different story.
[243] And that's what this is.
[244] A lot of tech bros have tried to get into media, whether it's Andreessen Horowitz or, you know, starting their own media, whatever the hell they were doing, and they closed it down pretty quick.
[245] And they think they can bypass the press.
[246] First of all, it's not a very good business, Charlie.
[247] You understand that, right?
[248] It's not like the greatest business on earth anymore.
[249] It used to be.
[250] But they think they can do better and they feel misunderstood.
[251] They have grievance if people don't lick them up and down all day.
[252] So they want to speak for themselves, right?
[253] And that's what he's doing here is he's playing out a lot of personal trauma.
[254] I think there is personal trauma, although everybody has personal trauma. Because he can afford to, instead of buying, you know, a sports team or a yacht or marrying 20 times, which he's done...
[255] This is how he's playing out his personal traumas.
[256] And so I do think it's a lot mixed in with the ability to be able to do these things with the money he has.
[257] And what happens when you have that much money is nobody questions you that maybe you're an idiot, right?
[258] You're not smart just because you have money.
[259] I think a lot of people are questioning him.
[260] Right.
[261] The thing is, he way overpaid for Twitter.
[262] He appears to be in the process of destroying much of its market value.
[263] It is worth a fraction.
[264] It was never worth that much.
[265] It's going back down to where it was worth.
[266] But, I mean, he has set tens of billions of dollars on fire to deal with his personal traumas.
[267] I mean, that just seems not an obvious life choice.
[268] There's a way out.
[269] He can buy the debt up at some point if the banks have to sell it.
[270] He can buy up the debt, the $13 billion in debt.
[271] He could take it public and be a meme stock.
[272] There's ways out for him that actually, you know, there's one born every minute, essentially, and there's a lot of Elon stans.
[273] So there's a way out for him here.
[274] It's just kind of problematic, right?
[275] But you're right.
[276] He's turning it into his own little playground of neuroses, I think.
[277] You're right.
[278] He obviously has a lot of trauma, a lot of neuroses.
[279] These are preexisting conditions.
[280] Yeah.
[281] It does feel like he's going through something.
[282] Yeah.
[283] I mean, it feels as if he sort of put his toe into politics, sort of right-wing politics.
[284] Sure.
[285] And then he hit this toboggan slide into right-wing conspiracy theory, some of the darker places on the Internet.
[286] So this seems to have accelerated.
[287] What is going on with him?
[288] Well, it's propaganda at the end, like I say.
[289] It's not, you know, everyone's like, this is new and fresh.
[290] This is propaganda, and this has happened over the course of our history very many times.
[291] And so it's very attractive.
[292] But why does he want to do this to himself?
[293] I don't know.
[294] This guy was Time's Person of the Year.
[295] He's got brand.
[296] He's got other companies.
[297] I mean, why is he playing around with the Pepe the Frog memes and stuff like that?
[298] I mean, the real dark, ugly side.
[299] Why would he embrace that?
[300] He always had these qualities of jokester, 12-year-old boyness.
[301] And Silicon Valley encourages men to become boys, right?
[302] They encourage juvenilia.
[303] They encourage people to just be brats.
[304] They do.
[305] If you can do whatever you want, you start doing whatever you want.
[306] And so it's very attractive to have explanations of a very confusing and difficult world.
[307] And he was like this to an extent, but it was more, it was more silly memes, right?
[308] It was more silliness.
[309] And now it's a little bit like Pinocchio.
[310] You remember when he went to the island of the boys, they started off fun and games, and then it became donkeys.
[311] You know, that's how it happened.
[312] And when you combine, in this country, gerrymandering with social media and, say, Fox News, you've got a prescription for real trouble.
[313] It's propaganda.
[314] I don't know.
[315] It's not that fresh an idea.
[316] And I think he's missing something that he can't pull himself back.
[317] What is your thumbnail review of the Walter Isaacson book that came out in the midst of all of this?
[318] Now, by the way, I'm a deep believer in the wait until somebody has actually finished their career or their lives before you do the definitive biography.
[319] And so basically, you do this big biography in the midst of this massive shitstorm.
[320] You've talked to Walter Isaacson, you're a friend of his, but...
[321] Yeah, I did an interview with him.
[322] I gave him a pretty hard time.
[323] What did you think about the book?
[324] I wrote a review of it in like 10 words, where I basically said:
[325] You know, troubled young man gets a lot of money and decides to take his trauma out on everybody else.
[326] Sometimes he's right.
[327] Sometimes he's wrong.
[328] Mostly he just seems crazier and crazier, you know.
[329] I think the problem with the book, yeah, 600 pages, it's a doorstop of a book.
[330] I think he was a little bit of an amanuensis to this guy.
[331] He just typed what he said.
[332] There's a lot of sourcing problems in the book.
[333] You know, he sort of is like, Elon says this, his dad says this.
[334] I don't know.
[335] You know, that's not what a reporter does.
[336] Like in many stories, including around his childhood trauma, I don't know who's telling the truth.
[337] And I would like a biographer to tell me who's telling the truth and not just use it as he has demons.
[338] What can we do?
[339] You know what I mean?
[340] I'd like to know who's lying a lot more.
[341] I think there's one thing that inadvertently happens: you see him playing out the same trauma over and over again with different people, whether it's partners in his early companies he feels aggrieved by, or later the people who actually founded Tesla.
[342] He's still angry at that person.
[343] I'm the one who did it.
[344] It can be only me. And I don't fault him.
[345] He's very talented.
[346] But I think it's a constant series of him getting in beefs with people.
[347] And I don't know what's served by that and letting someone off the hook.
[348] He lets him off the hook.
[349] And one of the things I said to Walter in the interview was, you're harder on Amber Heard than you are on Elon Musk.
[350] Seriously?
[351] Okay, she seems a little unstable as a person, but does she deserve the enormous amount of reporting that goes into her problems versus this guy?
[352] And that was my issue.
[353] It was a lot of excusing bad behavior as genius.
[354] And I don't agree with Walter on that.
[355] This is not new.
[356] And you know this a lot better than me. So help me with all of this.
[357] You mentioned that Silicon Valley encourages the tech bros to all be little boys.
[359] So the performative assholery of the tech bros is neither new nor isolated.
[360] It feels like it's intensifying.
[361] And it's not just Elon Musk.
[362] No. Mark Andreessen just put out this techno-optimist manifesto.
[363] It's ridiculous.
[364] It's such, you know, he likes to write essays now and again.
[365] He was famous for writing "software is eating the world," which is like, well, thank you for that obvious revelation.
[366] But okay, great, he wrote that.
[367] You know, he likes to do that with phrases.
[368] And he wrote this techno-optimist guide.
[370] And it was such a straw man, like those who hate AI, those who do this.
[371] He sets up a false dichotomy.
[372] I think AI is amazing.
[373] And I also am worried.
[374] Adults are like that.
[375] You know what I mean?
[376] Adults take a moment.
[377] And so one of the things that was really problematic with that piece is that you're either with us or against us.
[378] And I'm sort of like, it's okay to be worried about implications of inventions and at the same time understand they're important.
[379] And so he was setting up this idea as anyone who doesn't think it's all great is an idiot and kind of evil for pushing back and stupid.
[380] And that's how they have to do it because honestly, they didn't go to college long enough and they don't know how to argue and they don't know how to make a case.
[381] But instead they'd rather do this scorched-earth policy that is just a straw man. You know, Trump does it all the time and everyone recognizes it.
[382] Again, nothing new about narcissistic, adolescent tech bros.
[384] But a lot of them seem to have decided that they want to be more than tech bros.
[385] They want to be oligarchs.
[386] They want to be important.
[387] They want to be deep thinkers.
[388] And so we're entering a phase in which, you know, we live in their world.
[389] Sure.
[390] You know, there are so many billionaires with outsized egos and power and clout.
[391] And they affect this environment far more than, say, whoever, you know, is chairman of the House Ways and Means Committee, necessarily.
[392] So talk to me a little bit about the ambitions of the tech bros, what they want to grow up to be.
[393] I don't know.
[394] I honestly don't.
[395] Essentially, he wrote a 5,000-word rant.
[396] I don't know what else to say.
[397] That's what it was.
[398] And one of the problems is, you know, this idea that this is the way to go.
[399] We are being lied to.
[400] It's the same stuff, whether it's vaccines.
[401] Well, there's an audience for that.
[402] It's the same language around RFK Jr., vaccines.
[403] We were told technology takes our jobs, reduces our wages, increases inequality.
[405] We are told to be bitter, angry, and resentful about tech.
[406] No, we are not.
[407] You know what I mean?
[408] It's kind of like setting up kind of a ridiculous argument.
[409] I don't know why they do it.
[410] Did they not get loved enough as a child?
[411] I don't know.
[412] I don't know what happened to them, but they're unable to have an actual argument.
[413] And anyone who argues with you or provides feedback is the enemy.
[414] And it's weird.
[415] It's just, it's just weird.
[416] Well, when you're a billionaire, you can create your own bubble, right?
[417] You're surrounded by people who are telling you that you are funny, you are beautiful.
[418] Have you done something great with your hair?
[419] Love the outfit, right?
[420] That last bromide that you said is like, this is Socrates.
[421] You are brilliant.
[422] You are a philosopher king as well.
[423] So it wasn't that long ago that places like Twitter, I'm going to keep referring to it as Twitter.
[424] I mean, how do you pronounce this?
[425] X?
[426] I mean, it's like whatever shit or whatever.
[427] Twitter was the place you went to be able to follow news.
[428] And so in the before times, with something like the Israel war, we would be looking for information about what's happening in real time.
[430] And now, you've talked about this rather extensively because Twitter has just become a cesspool of fake videos, misinformation.
[431] There was a propaganda network of 67 accounts found on Twitter that were coordinating a campaign of posting false, inflammatory content related to Israel and Hamas.
[432] CBS reported Elon Musk laid off much of the team responsible for monitoring posts.
[435] The account profiles were like sleeper accounts posting innocuous information until they were activated after the attacks by Hamas.
[436] I mean, what's going on here?
[437] I mean, clearly, Elon Musk took down all of the controls right at the moment when perhaps they were most critical.
[438] Well, he did it right away.
[439] You know, he did it a while ago.
[440] And so this is one of the first big crises that has happened.
[441] Twitter was never good at this.
[442] They were always sort of, there was always a bot problem.
[443] There was always a misinformation problem with them.
[445] Same thing with all of them.
[446] It's a very difficult issue, but not even trying and just letting it go.
[447] And his philosophy is, now you know what they think, and therefore it's a better world.
[448] Actually, what it is is more confusing, and it's a problem to figure out what's real and what's not.
[449] And when people are in a high emotional state, that's not good, right?
[450] It's not that you want to keep things from people.
[451] It's that they're in a high emotional state, and something that's inaccurate can send them into the next version of that.
[452] And so, what is the purpose served by serving up fake videos?
[453] There's none.
[454] There's none except to get people more upset.
[455] It doesn't bring us to a better place.
[456] And, you know, it's absolutely true that you are very upset in the moment.
[457] Like, and as we were after 9/11, if you recall, everybody had an emotional reaction.
[458] And then everybody, even as horrific as it was, everybody calmed down and said, okay, what should we do instead of just lashing out?
[459] And that's how adults behave, right?
[460] Yeah.
[461] And so I think what happens is, when he creates an information network where information is bad, you're going to have bad consequences no matter how you slice it.
[462] Now, the good thing is a lot of people understand this, but not everybody for sure.
[463] And therefore, everyone's going to try to game the system and try to create, you know, chaos.
[464] And a lot of people, they don't want to create illumination.
[465] They want to create chaos.
[466] And this is the perfect tool to do so.
[467] That's their goal is chaos.
[468] And again, you know, it's propaganda is what it is.
[469] The Europeans have a different approach than we do.
[470] You know, we, you know, kind of throw up our hands about all of this, but the EU has instructed Twitter and Meta to tackle all of this disinformation and misinformation.
[471] Yeah.
[472] And there are fines and fees involved for not complying.
[473] So was that going to have any impact?
[474] Does that make a difference?
[475] Well, you know, Elon's sort of like sue me, you know, make me. You know, I think he's done that with rent.
[476] Like, I'm not paying the rent.
[477] What are you going to do?
[478] If you're willing to, again, getting back to shamelessness, being like, I'm not paying the rent.
[479] I'm firing these people.
[480] I'm not paying them their severance.
[481] They have to come at you, right?
[482] And therefore, you're in a more powerful position.
[483] If you're willing, Donald Trump perfected this.
[484] Put me into bankruptcy.
[485] Whatever.
[486] You know, what he's doing around the courts right now is the same thing: come at me, arrest me, I'm fine.
[487] Very few people are willing to do that.
[488] It's the same thing with Jim Jordan.
[489] You know, go against me, see what happens.
[490] So I want to come back to social media in just a moment, because, you know, I'm constantly getting people saying, okay, how long do you stay on Twitter?
[491] Do you go onto Threads, you know, what about this, and all of this?
[492] We'll come back to that.
[493] But you had a very interesting interview with Christiane Amanpour a couple of weeks ago about Rupert Murdoch's legacy.
[494] Things move so quickly.
[495] We kind of forget that, hey, you know, we actually had Succession in real life.
[496] Fox News has gone through a few things this year: lost that $787 million lawsuit to Dominion, fired Tucker Carlson, and Rupert Murdoch kind of surprised people, shouldn't have surprised people, but he left.
[498] "Left" in parentheses; that old crocodile hasn't gone anywhere.
[499] He's still wandering around the swamp, you know, ready to kill.
[500] You know, Lachlan's going to put the mirror under his nose when they're burying him.
[501] I mean, they're going to check, going to poke him with pins.
[502] I always say I would never turn my back on that old man. You said that you think that he was the single most destructive force in America, England, and Australia.
[503] Now that's saying quite a lot.
[504] Yeah, one of them.
[505] I mean, it has to be combined.
[506] A lot of competition?
[507] Well, there is, but I think he was willing.
[508] It's the same set of willingness to just not care about the truth, right?
[509] It's the same set of, you know, audience, audience, audience, what will make them excited and upset.
[510] You know, at first it was tabloid, which everybody gets tabloid.
[511] It's fine.
[512] You know, the New York Post, whatever, they can scream about whatever they want.
[513] But he took it to an extreme,
[514] that, you know, just shamelessness to an extreme.
[515] The cynicism, too.
[516] His cynicism.
[517] And, you know, I essentially, you know, you'd like to say he created a monster and it got out of his control, but I don't think it got out of his control.
[518] I think he purposely created a monster.
[519] And then, you know, when people weren't doing what he said, say Tucker Carlson, who decided he'd had enough of the old man, you know, he fired him before he could leave, right?
[520] That's what it looked like there.
[521] Between social media, Fox News, and gerrymandering,
[522] it explains a lot of what's happened in this country.
[523] And I do think they were the first to go there and do this, even though he wasn't particularly good at the Internet.
[524] He was very good at broadcast and making an entire group of people believe things that weren't necessarily true.
[525] Why do you think he fired his biggest star, Tucker Carlson?
[526] Why do you think it happened?
[527] He was a liability.
[528] I think the lawsuits, the Smartmatic one, and that's still pending.
[529] I think he cost him a lot of money,
[530] and for sloppiness.
[531] I think he, you know, indulged him forever.
[532] And then the guy started thinking he was bigger than Rupert Murdoch, which maybe he is.
[533] Maybe he's not.
[534] I don't know.
[535] Now, that was my next question.
[536] So there was a lot of speculation that Tucker had become so big that, in fact, he was bigger than Fox and that even after he left Fox, well, that's what I want to know.
[537] Because I keep waiting for Tucker's next big thing.
[538] He seems smaller to me. Well, that was my question.
[539] Because as we started this interview, and I was thinking about this, So when's the last time I heard Tucker Carlson mentioned?
[540] When's the last time I saw anything?
[541] He and Elon were supposed to create this huge...
[542] Well, they are.
[543] I'm sure he'll get some money.
[544] Yeah?
[545] Well, I think he's probably going to get some money and try to create a media company like Ben Shapiro, who's actually very...
[546] I can't stand Ben Shapiro, but he's good at business.
[547] I'll tell you that.
[548] You know, maybe he'll do something like that, and then we'll see how good he is at it.
[549] But is that bigger than Fox News?
[550] I mean, everyone...
[551] It seems like everybody's slicing the salami narrower and narrower and narrower.
[552] I'd have to see the product, but he certainly seems smaller on Twitter.
[553] You know, he reminds me of Gloria Swanson in Sunset Boulevard.
[554] It's the pictures that got smaller, not me. I'm still a star.
[555] And, you know, at the end where she's like going like, I'm ready for my close -up, that's what he reminds me of is the pictures got smaller, not me. I'm still big.
[556] We'll see what he creates.
[557] I'm certainly open to seeing.
[558] I'm interested in the transition because there are a lot of people who've left, you know, the news, they're trying to stay relevant.
[559] They still have their platforms.
[560] And they may be lucrative for them.
[561] I mean, I don't know how much, you know, money.
[562] But, I mean, you know, think how big Megyn Kelly was at Fox.
[563] And then, you know, NBC paid her just a crap load of money.
[564] They did.
[565] And now she's got a podcast.
[566] Yeah, she does.
[567] Okay, she's out there.
[568] She's part of the conversation.
[569] She's active.
[570] But clearly at a completely different level than she used to be.
[571] Well, she's got our audience, right?
[572] This is where it's coming to.
[573] Everyone's got their audience.
[574] And that's powerful.
[575] It is powerful.
[576] Before it was spray and pray, right, where you could get anybody.
[577] Now they're targeting.
[578] And that's what's happening.
[579] And so Tucker will have his people.
[580] He will.
[581] Megyn will have her people.
[582] And she'll continue to do her song and dance at whatever it happens to be that pleases her audience.
[583] It's not something I particularly like.
[584] But she has her audience, right?
[585] And so you're going to see a lot of that.
[586] The question is, people like that, I'm not impressed with them as I am with some TikTok stars.
[587] They're a little too old.
[588] I hate to say that.
[589] But Joe Rogan is much better at it, right?
[590] He came out of nowhere and created a product that has a bigger audience.
[591] And so the question is, can they leverage YouTube and Reddit?
[592] That's where young people are, by the way, YouTube and Reddit.
[593] And can they leverage that?
[594] Maybe, I don't know.
[595] I would look at it from younger people, you know, and I hate to be ageist because I'm old myself.
[596] I'm right here, Kara.
[597] I'm sitting right here.
[598] I can hear you.
[599] Okay, but you have a smaller audience, that's all, and you're preaching to the choir.
[600] I mean, it used to be that in order to be influential, you had to, you know, be on a network show with, you know, 40 million people.
[601] And now, if you can get a million subscribers, you've landed in clover.
[602] You're doing very, very well.
[603] Yes, you have.
[604] You can be influential.
[605] You can have your audience.
[606] You can monetize it.
[607] All right, so speaking of this, you watch all of this stuff.
[608] I am very confused about what is the successor to Twitter, where are we going to go?
[609] I feel like a sucker to think that somehow, you know, Mark Zuckerberg was going to, you know, rescue the world with Threads and Instagram.
[610] I like Threads, too.
[611] I mean, there was Post out there.
[612] There was Bluesky.
[613] I'm going to leave somebody out.
[614] Mastodon, I just gave up on.
[615] I apologize.
[616] I just, I couldn't do it.
[617] What is going to be out there?
[618] I mean, Trump went off and created his own true social.
[619] He's got his audience.
[620] I think it's all collapsed.
[621] I don't think there's going to be a central place.
[622] I think people are going to find, like I said, find there's smaller audiences.
[623] I do think Threads is pretty good as a product.
[624] They're sort of pushing away from news, and that makes sense to me for them because they got into so much trouble before.
[625] Right.
[626] And they had Holocaust deniers running all over Facebook.
[627] So I think, you know, you'll be in your little neighborhood versus a mass thing.
[628] I think Twitter is, was a mass thing.
[629] And now it's just a bad product experience, really.
[630] You know, you have porn.
[631] I got, first time I got porn.
[632] I got people calling me names.
[633] You've got Cheech and Chong ads all the time, which is like, okay, and I'm not in the mood for weed right now, 40 times a day.
[634] And then the product is bad.
[635] It doesn't work as well.
[636] It doesn't have as much impact.
[637] And it's more narcissistic, right?
[638] You just are kind of out there, like we talked about, talking to yourself and cosplaying kind of thing.
[639] I have talked about the implosion of all these social media platforms.
[640] Just the way everything went to cable, Charlie, if you remember, it was the networks and then cable.
[641] That's where we're going with this.
[642] And everyone will be in their own little worlds without a center.
[643] And I think that's probably good.
[644] I guess, you know, part of it is that, and I've experienced this thing, you do forums with individuals, and you realize that it's very difficult to have a conversation in politics because there's no shared reality.
[645] But there is.
[646] But there is.
[647] No, no, there is a shared reality.
[648] But, I mean, you can't talk to somebody about it.
[649] So I was on a panel with a young Republican, and I asked him about, we're talking about the jobless rate.
[650] He thought the jobless rate was the highest in American history.
[651] No. When I asked him about Donald Trump's comments about Hezbollah, he'd never heard of them.
[652] What's your source for that?
[653] Things happen that just don't even register.
[654] Charlie, he's just fucking with you.
[655] He knows.
[656] He knows.
[657] Oh, no, he, this, this, mm, you are kinder and gentler than I am because it was like, really?
[658] I just think they're cynical.
[659] There was a moment when I asked him about America's role in NATO.
[660] And, you know, that look that people get in their eyes.
[661] Yeah.
[662] Where you can tell he didn't know what NATO was.
[663] Probably not, yeah.
[664] Well, then he's just ignorant.
[665] Well, speaking of the shared reality, of course, what's going on.
[666] I don't want to get into the weeds on, on Israel and Hamas, but kind of an extraordinary moment where Joe Biden's getting on an airplane.
[667] He's going there tomorrow.
[668] He is.
[669] He's doing a great job, actually, honestly.
[670] What do you think of that?
[671] Because on one level, I think this is a bold move that shows how vigorous he is.
[672] I think he's handled it well.
[673] I think he has drawn the moral lines very, very clearly.
[674] But generally, you don't throw the president into a situation like this without knowing whether you have all the diplomatic ducks in a row.
[675] This is a big risk.
[676] Yeah, it's a risky move.
[677] But I kind of like it.
[678] It's sort of, you know, he understands that everything
[679] now is this kind of thing, where symbolic acts matter.
[680] My only issue is the danger, right?
[681] There's obviously the security issues around it in this region right now.
[682] That said, I think he understands the power of pictures and photos and support that people, look, he's in a presidential race against Trump right now.
[683] And you have this guy, I don't know what's going on with him.
[684] Like lately, it's kind of like there's something going on with his tan, and then he's slurring his words, and then he's talking about Hamas being smart.
[685] And then there was something about Christmas.
[686] That was weird.
[687] It was, but that's all right.
[688] Every day of the week, and twice on Sunday, Trump does something weird.
[689] And we're used to that.
[690] We've sort of become inured to his behaviors.
[691] And does any of it make any difference?
[692] I mean, I'm sorry to even ask the question.
[693] No, not to his fans.
[694] Well, no, my mother was not a Trump supporter, then became a Trump supporter through Fox News.
[695] And now it's like, what is wrong with him?
[696] And I think there's a lot more people who are saying, what is wrong with him?
[697] She really thinks he's in it for himself.
[698] Like, you know what I mean?
[699] Like she's, and I don't know if that will stick, but it's certainly, I've never heard her say that until recently.
[700] The other thing, big defense I get is when you point out there's something wrong with this guy, of course, is the people on the right are completely pre -programmed right now to say, well, but Joe Biden, Joe Biden is senile, Joe Biden can't do anything, which again makes a bold move like going to Israel.
[701] He gave a great speech and he's going to Israel, and this guy's talking about Christmas, or whatever that was.
[702] We'll see about that.
[703] I don't know if you saw, Chris Christie has a new ad where he actually highlights Donald Trump's comments on Hezbollah being so smart.
[704] And he uses the word fool.
[705] Only a fool would say this.
[706] And I thought it was interesting how many times he said, we cannot reelect this fool.
[707] Chris Christie's not going to be the next president of the United States, but boy, he's throwing some roundhouses.
[708] He is.
[709] I think he's doing well. Talk about someone who knows how to use social media.
[710] He's really good about that.
[711] I interviewed him.
[712] I think one of the problems that he has is he was with him until he was against him, right?
[713] It's sort of John Kerry-esque.
[714] I think it's fine to have changed your mind.
[715] And I think that's what we don't allow people to do.
[716] Like, I was wrong.
[717] You know, whether it's Anthony Scaramucci or Chris Christie, we have to allow people to say, wow, I was, you know, and that's hard to do because, you know, I just encourage her when my mom is saying stuff like that.
[718] I'm like, yeah, look at this, like that.
[719] Like, you know, you don't want to say you were stupid to have bought this song and dance.
[720] And, you know, at a certain point, it's all hands on deck.
[721] We don't have the luxury of not letting the enemy of our enemy be our friend.
[722] Right.
[723] I find that distasteful.
[724] One of the things I think about is, you remember the silent majority with Nixon?
[725] Oh, yeah.
[726] I think there's a silent majority of people who've just about friggin had it.
[727] And they just want to deal with their kids and the economy and crime.
[728] Crime is something you should be concerned about and drug use.
[729] And making a good living, they're concerned about AI and what it'll do to their jobs, justifiably.
[730] And I think there's a lot of people like that who are like, this is a circus.
[731] And Jim Jordan is a clown.
[732] And Donald Trump is a clown.
[733] and there are serious things happening.
[734] And I do believe there's a quiet group of people.
[735] You see it in elections across the country.
[736] And in Poland, where they're like, that's enough of this nonsense.
[737] I believe that.
[738] I do believe that.
[739] Well, it's going to be interesting because you can see that Democrats are very, very anxious to run against Jim Jordan.
[740] I guess here's this, my five -cent theory on this, because, you know, before this all came down, I'm thinking, well, there's what?
[741] There's 12, 20 Republicans who were elected in Biden districts.
[742] Why would they want to go along with Jim Jordan?
[743] Why would they want to be linked to this crazy?
[744] Maybe what they've decided is the greater crazy is Trump, the lesser crazy, which is, hey, we're on the ballot with Donald Trump.
[745] If we're going to nominate Donald Trump, incrementally it's not that much worse to have Jim Jordan.
[746] But it is going to be a mess.
[747] Well, that's kind of good, though, right?
[748] He can't do anything, correct, politically?
[749] Your point is interesting, is that he is counting on that post -shame politics.
[750] I don't think he's ready for prime time.
[751] I don't think that he's going to survive in the spotlight as well as somebody else.
[752] And if he thinks that he's going to get the same moral pass as Donald Trump, I think he's somewhat naive.
[753] No. Kara Swisher, host of the podcast On with Kara Swisher, co-host of the podcast Pivot.
[754] Cannot wait for your book in February, Burn Book: A Tech Love Story.
[755] Thank you so much for coming back on the podcast.
[756] All right.
[757] I'll come on and talk about it.
[758] Thanks, Charlie.
[759] It's always a pleasure to talk to you.
[760] Thank you.
[761] Thank you all for listening to today's Bulwark Podcast.
[762] I'm Charlie Sykes.
[763] We will be back tomorrow and we will do this all over again.
[764] The Bulwark Podcast is produced by Katie Cooper and engineered and edited by Jason Brown.