The Joe Rogan Experience XX
[0] Joe Rogan podcast, check it out.
[1] The Joe Rogan Experience.
[2] Train by day, Joe Rogan podcast by night, all day.
[3] Hello, Matt Taibbi.
[4] Hey, Joe, how's it going?
[5] Good to see you.
[6] It's always so hard to get rolling after you've already been talking.
[7] I mean, so I'm always excited to see you, so we're just blabbing, and now we're rolling.
[8] So what's cracking?
[9] A lot.
[10] A lot.
[11] It's been a crazy couple of months.
[12] I have enjoyed your work with the Twitter Files.
[13] I enjoy all your work, but I really have enjoyed the Twitter Files.
[14] That has been a really fascinating view behind the curtain.
[15] It's been one of the weirder, more surreal experiences of my life because, you know, as a reporter, you're always kind of banging away to try to get one little piece of reality, right?
[16] Like, you might make 30 or 40 phone calls to get one sentence.
[17] The Twitter Files is, oh, by the way, here, you know, take a laptop and look at 50,000 emails, you know, full of all kinds of stuff.
[18] And so it's, you know, for somebody like me, it's like a dream come true.
[19] We get to see all kinds of things, get the answers to questions that we've had for years.
[20] And it's been really incredible.
[21] Has anything been surprising to you?
[22] A little bit.
[23] I think going into it, I thought that the relationship between the security agencies like the FBI and the DHS and companies like Twitter and Facebook, I thought it was a little bit less formal.
[24] Like I thought maybe they had kind of an advisory role.
[25] And what we find is that it's not that.
[26] It's very formalized.
[27] They have a really intense structure that they've worked out over a period of years where they have regular meetings.
[28] They have a system where the DHS handles, you know, censorship requests that come up from the states and the FBI handles the international ones, and they all flow to all these companies.
[29] And it's a big bureaucracy.
[30] And I don't think we expected to see that.
[31] It's very bizarre to me that they would just openly call for censorship in emails and these private transmissions, but ones that are easily duplicated, that you could send to other people.
[33] It can easily get out, like that they're so comfortable with the idea that the government should be involved in this censorship of what turns out to be true information, especially in regards to the Hunter Biden laptop, that they would be so comfortable that they would just send it in emails.
[34] Yeah, yeah.
[35] Well, I think that shows you the mentality, right?
[36] Like, that they really genuinely felt that they were impregnable, that they don't have anybody to answer to.
[37] I mean, a normal person doesn't put incriminating things in emails because we all have the expectation that someday it might come out, you know.
[38] But these folks didn't act that way.
[39] I mean, you see, I was especially shocked by an email from a staffer for Adam Schiff, the California congressman.
[41] And they're just outright saying, we would like you to suspend the accounts of this journalist.
[42] And anybody who retweets information about this committee, you know, I mean, this is a member of Congress.
[43] Right?
[44] Most of these people have legal backgrounds.
[45] They've got lawyers in the office for sure.
[46] And this is the House Intelligence Committee.
[47] You would think they would have better operational security.
[48] Another moment that was shocking to me, there was an email from an FBI agent named Elvis Chan in San Francisco to Twitter.
[49] And they're setting up this Signal group, which is going to include all the top sort of censorship executives at all the big companies.
[50] And it's a Word document that has all the phone numbers of all these important executives.
[51] And the email just, the subject line reads, phone numbers, right?
[52] And the Word doc is just called secret phone numbers.
[53] Right?
[54] And I'm thinking, this is how they taught you to do it at Quantico?
[55] Really?
[56] You know?
[57] I mean, like, I mean, even a journalist can't miss that.
[58] You know what I'm saying?
[59] Call it something else, you know?
[60] I don't know.
[61] That part of it was amazing.
[62] It's so strange.
[63] It's so strange to get such a peek, because I don't think anybody ever anticipated that something like this would happen, where Twitter would get sold to an eccentric billionaire who's intent on letting all the information get released.
[64] Yeah, I mean, I think Elon Musk essentially spent $44 billion to become a whistleblower of his own company.
[65] Yeah.
[66] And I mean, I don't really fully know his motives in doing that.
[67] I think he's got a pretty developed sense of humor, though.
[68] And that comes through.
[69] I think he gets a kick out of seeing all this stuff come out on Twitter, which used to be the kind of the private stomping ground of all these whiny journalists.
[70] And now here is all this information that is just horrifying to all of them.
[71] I mean, $44 billion is a lot to spend on that thrill, but I'm glad he did.
[72] Well, he truly believes that censored social media is a threat to democracy.
[73] He really believes that.
[74] Absolutely.
[75] And I believe it too.
[76] Yeah.
[77] I just don't have $44 billion.
[78] Right.
[79] Even if I did, I'd be like, I don't want that heat.
[80] Right.
[81] Right.
[82] Yeah.
[83] I don't think that's what I would spend it on.
[84] But no, he believes that.
[85] I think he also believes that the credibility of these companies can only be restored by telling people what they talk about in private, or what they have been talking about, you know, with the government and that sort of thing.
[86] Yeah.
[87] So he might be right about that, you know.
[88] We'll see, I guess.
[89] I think he is.
[90] I mean, it's going to be interesting.
[91] It's going to be interesting to see how this plays out.
[92] There's an amazing amount of resistance against him.
[93] And, you know, just the publicity campaign against him has been fascinating to watch.
[94] People go from thinking that Elon Musk is the savior that's bringing us these amazing electric cars and engineering new reusable rockets, to: he's an alt-right piece of shit who wants Donald Trump back in office.
[97] And it's like, it's very wild.
[98] The speed with which they can sort of shuffle somebody into the Hitler-of-the-month club routine, right?
[99] Like, you know, we've always done this with foreigners, you know, whether it's Noriega or Saddam Hussein or Milosevic or Assad or whatever it is.
[100] Like we have a playbook for cranking out negative information about, you know, foreigners who get in our way for whatever reason.
[101] But now we've kind of refined that technique for domestic people who are inconvenient, you know?
[102] I think they did it with Trump, obviously.
[103] You know, they try to do it with Tucker Carlson with you, you know.
[104] I mean, you got a taste of that for a few times.
[105] Yeah, it's interesting.
[106] Right.
[107] And then, you know, with Elon, yeah, he went from being the guy who made electric cars sexy to, like, something to the right of Viktor Orbán in like 10 seconds.
[108] It's amazing.
[109] It is amazing.
[110] And the narrative has spread through progressive people.
[111] Well, they'll just say it now.
[112] It's like the memo's reached them.
[113] The memo's got to them.
[114] And then they just, I hear people in L.A., I hear people that I know, like, oh, Elon's just so crazy.
[115] It's like, something happened to him.
[116] He went nuts, and he's a right winger now.
[117] Like, how?
[118] What are you saying?
[119] Like, what examples do you have?
[120] Like, they don't have an example.
[121] They just have this narrative that reached them, the signal.
[122] Like, Elon bad now.
[123] Oh, Elon bad now.
[124] Elon bad now.
[125] Elon bad now.
[126] And they just start saying it.
[127] And you go, like, what examples are you using of, like, his behavior?
[128] Well, he let Trump back on the platform.
[129] Okay, well, the Taliban's there.
[130] Right.
[131] Yeah, exactly.
[132] You don't have a problem with the Taliban?
[133] The Taliban just bought blue check marks.
[134] Do you know that?
[135] Did they really?
[136] Yes.
[137] I didn't know that.
[138] Yeah, they're buying blue check marks so they could be verified.
[139] I'm the real terrorist.
[140] The fucking Taliban is on and nobody has a problem with it.
[141] The CCP's on Twitter.
[142] No one has a problem with it.
[143] Right.
[144] But they're like, Trump.
[145] They'll let Trump back on.
[146] Look, Trump is hilarious.
[147] He's a ridiculous person.
[148] But don't you think it's better that his tweets get out there and then a bunch of people get to attack him in the tweets?
[149] And if those tweets that people attack him with are good, if people are saying good things, then those things get retweeted and liked and then they rise up to the top of the algorithm.
[150] It's all good.
[151] Like, you need a voice against someone like that.
[152] You can't have that guy howling into the wind on some QAnon forum with all those wackos, just so they're only talking to each other with no pushback at all.
[153] If you really don't like Trump, you want him on Twitter.
[154] You want that guy to have some pushback.
[156] You want people to be talking against what he's saying.
[157] You want Twitter, the real Twitter now, which will actually fact-check everybody.
[158] They fact-check Biden, right?
[159] They'll fact-check him.
[160] So if he says something stupid, they'll go, no, no, that's not what's true.
[161] Here's what's true.
[162] Right.
[163] That would be good.
[164] And that was actually for a while Twitter's official policy.
[165] They had something called the public interest policy, which specifically laid out exactly what you said.
[166] Like when a world leader, no matter who it is, says anything, we want it to be out there because we want it to be debated.
[167] We want people to see it.
[168] We wanted people to talk about it.
[169] We want people to reach conclusions about it.
[170] And one of the things that we found in the Twitter files was, after January 6th, there was this intense debate within the company where they were basically saying, oh, thank God, we're going to repeal the public interest policy, or we're going to poke a hole in it, right, and no longer have that belief system that just because somebody is a world leader, we need to hear what they have to say.
[172] So they invented a new policy called glorification of violence, or that's what they called it.
[173] And essentially what they said was you had to look at Trump not in terms of each individual tweet, but in terms of what they called the context surrounding him: his whole career, all the people who followed him, whether or not they were violent, whether or not they said things that were offensive.
[174] It's like the speech version of stochastic terrorism.
[175] I don't know if you ever heard that term.
[176] Stochastic terrorism is this idea that you can incite people to violence by saying things that aren't specifically inciting.
[177] but are statistically likely to create, you know, somebody who will do something violent, even if it's not individually predictable.
[178] And that's what they did with Trump.
[179] They basically invented this concept that, yes, he may not have actually incited violence, but the whole totality of his persona is inciting.
[180] So we're going to strike him.
[181] And so they sort of massively expanded the purview of things they can censor, just in that one moment.
[182] And it's, you know, you can see it in these dialogues, how they came to that decision, which is just fascinating.
[183] It's just such an extraordinary amount of power to give people the ability to censor people on the biggest public forum in the world.
[184] It's so extraordinary.
[185] And the fact that they can come up with these justifications for why this is a good idea without anyone pushing back, without anyone saying, do you understand where this goes?
[186] This eventually leads to government control of all thought and speech.
[187] This is where you're going.
[188] You're allowing the government to influence you based on one specific problematic individual, and that could spread out into every one of us, all of us, easily, quickly.
[189] Right, right.
[190] I mean, we heard at the World Economic Forum, right?
[191] We heard the...
[192] Brian Stelter was there.
[193] Brian Stelter.
[194] Brian Stelter is now at the World Economic Forum.
[195] What can we do about these problems?
[196] He looked very comfortable there, didn't he?
[197] Of course he does.
[198] He's with evil lizard people that are trying to control the world.
[199] That's his bosses.
[200] He knows how to handle that kind of situation.
[201] He's been around evil, evil people.
[202] He looked as happy as maybe he's ever been.
[203] Well, he's probably very excited just to be working again in any way, shape, or form.
[204] That's true.
[205] He's not a guy that's really supposed to be in front of a camera, right?
[206] He's supposed to be a journalist, but he's not even good at that.
[207] So what he's doing now is carrying water for the evil leaders of the world.
[208] who want to institute hate speech policies nationwide and, you know, centralized digital currency, and they want everybody to eat bugs, and you will own nothing and be happy.
[209] This is the fucking people he's working for now because he's basically a prostitute.
[210] And, you know, they hired him to go over there and do that.
[211] And he's like, what can we do?
[212] What can we do better?
[213] What can we do different to get everybody to stand in line?
[214] Yeah.
[215] What can we do?
[216] And for a journalist to sit there, and there was that one moment where that woman, Věra Jourová, she's an EU official, and she's talking about hate speech laws.
[217] And then she touches the knee of somebody sitting next to her and says, you're going to have that in America soon.
[218] And Brian Stelter is sitting there grinning, you know, like that's not offensive to him.
[219] Like a European basically saying, oh, yeah, you're going to have this too soon, like even though it's completely antithetical to everything that we believe in in this country.
[220] Well, I think when you're working in a corporate news structure, and you could speak to this better than I could, obviously, but I think when you're working in an environment where you have editors and people in your ear and you have producers and you have narratives that the company is pushing and then you have sponsorships that you're beholden to, it's very difficult to form any sort of problematic or controversial, independent thought and then try to express it publicly.
[221] You're not going to do it.
[222] It's just too scary and sketchy.
[223] So when you're trying to keep that job, and here's a guy like Brian Stelter, who already lost one of the biggest gigs in all of broadcast news.
[224] He was on fucking CNN.
[225] And then, you know, here he's standing there and they're saying, you're going to have hate speech laws in America too.
[226] He's like, okay, everything's running smooth.
[227] Everyone's smiling.
[228] Like, he's not suitable for that role.
[229] Right.
[230] He doesn't belong there.
[231] He doesn't have the stones to carry that conversation in a way that's going to benefit all these people that are listening to it.
[232] What you want is someone who's in that position that goes, hold on, what do you think is hate speech?
[233] What's hate speech to you and what's hate speech to me?
[234] And who gets to decide?
[235] Yeah.
[236] How is that going to be adjudicated?
[237] Like, what's your definition?
[238] What's your definition of hate?
[239] What's your definition of speech?
[240] Exactly.
[241] You know what I mean?
[242] There are a lot of questions.
[243] Does context matter?
[244] Yeah.
[245] And how do you decide?
[246] And obviously when you're looking at things over text, context gets very blurry.
[247] You don't know if someone's joking around.
[248] Like, there's so many pages that I get sent that are satire. I've got a hilarious one, there's a new one Rick Rubin sent me. He's like, this has got to be satire, right? It's brilliant, brilliant satire. But there's this person who has, like, the best version of super liberal, like, my children will never eat food from a gas stove, like, that kind of shit. And there's so many of these that it's hard to tell who's what and what's real. It's just one of those things where it's hard when you're looking at things through text, because people are sneaky, they're really good at it, and people are so ridiculous in real life that a really subtle parody is very hard to discern. So is that hate speech? If someone's doing it as a parody, is that hate speech? Like, when do you decide that something is hateful? And that's exactly why, traditionally in this country, judges have always said, well, they haven't always said it.
[249] But they eventually came around to the idea that we can't involve ourselves in these questions.
[250] They're too difficult, and it's not our job.
[251] We're going to step in only the most extreme cases, right?
[252] So the current standard is, you know, the Supreme Court case, Brandenburg v. Ohio, which outlaws incitement to imminent lawless action, right?
[253] So you have to be basically saying, you know.
[254] Let's go get them.
[255] Go get them.
[256] Break into the White House.
[257] Shoot, shoot that person.
[258] That's illegal speech, right?
[259] Anything short of that, we're going to stay out of it because it's just too confusing.
[260] It's too complicated, right?
[261] Like, if you start getting into what's satire, what isn't, what's incitement, what isn't, like, as we see at companies like Twitter, you know, you can spend endless amounts of time building sandcastles trying to figure out what is what, and it will always end in a place where the government interprets it to its greatest advantage.
[262] And that's why we don't want it, you know, ultimately it's not a good thing for most people.
[263] It's just very hard for people to realize, even though this thing that you're talking about wielding, this weapon, will work against your enemies.
[264] It can ultimately also be used against you.
[265] That was the thing with the Patriot Act.
[266] When the indefinite detention, when they were talking about just being able to detain people, and Obama was like, don't worry, I would never do that.
[267] But you're not going to be president forever.
[268] Like someone else is going to come along.
[269] And perfect example, that next person was Trump.
[270] Right.
[271] Well, what if someone's crazier than him?
[272] Like, what if something happens?
[273] What if some sort of a nuke goes off somewhere?
[274] And then everybody gets way more radicalized.
[275] And then you can get a really fucking insane, like a Stephen King character.
[276] What was that movie where the one evil guy becomes president and...
[277] Oh, God, Greg Stillson.
[278] The Dead Zone.
[279] The Dead Zone.
[280] Thank you.
[281] I mean, we're not far removed from that in terms of plausible plots that this wacky country can fall into.
[282] And that's the same thing with censorship.
[283] Like, they can use it against you.
[284] So, like, if you think you're using this to push back against right-wing extremism, they can use that to push back against progressive ideas that would genuinely benefit good people.
[285] Right.
[286] Genuinely benefit families.
[287] Genuinely benefit people in need.
[288] Genuinely benefit people in terms of health care and education.
[289] They can stop that.
[290] Absolutely.
[291] They can stop that if it's unprofitable with the same sort of tools.
[292] Absolutely.
[293] You've got to have free speech.
[294] It's the most important thing we have.
[295] And it's the one thing that separates us from everybody else.
[296] So when you have liberals and progressives that are screaming against removing people from platforms and stopping this and stopping that, understand what the fuck you're saying.
[297] Yeah.
[298] And they don't.
[299] Right?
[300] Yeah.
[301] I mean, like.
[302] It's just convenience.
[303] It'll work against my enemies.
[304] You talk about how they can use it to shut down things on the other side.
[305] Yeah.
[306] We see reports in these files of government agencies sending lists of accounts that are accusing the United States of vaccine corruption.
[308] Now, what they're really talking about is pressuring foreign countries to not use generic vaccines, right?
[309] And, you know, that's a liberal issue.
[310] That's a progressive issue.
[311] Like, the progressives want generic vaccines to be available to poor countries, okay?
[312] But, you know, you can use this tool to eliminate speech about that if you want, too, right?
[313] Like, I think that's what they don't get is that the significance is not who.
[314] The significance is the tool.
[315] Like, what is it capable of doing?
[316] Right.
[317] How easily is it employed?
[318] And, you know, how often is it used?
[319] And they don't focus on that.
[320] They focus on, oh, it's Donald Trump, so therefore we want it.
[321] I mean, that's where their mistake is.
[322] It's a very interesting and very nuanced conversation as to what should be allowed and what should not be allowed and why.
[323] And I think it's complex and it's ever -changing and it depends upon the tools that are involved and it depends upon what are you talking about.
[324] And then it also depends upon, like, here's a big one that drives me nuts about this: January 6th.
[325] Why is it okay for the FBI to have agents that incite people to go into the Capitol?
[326] Why is that okay?
[327] What benefit is that for society?
[328] How much influence did they have?
[329] How much rabble -rousing influence did they have?
[330] How much coercion?
[331] I mean, why is that okay?
[332] So this is another topic that is fascinating because it hasn't gotten a ton of press.
[333] But if you go back all the way to the early 70s, the CIA and the FBI got in a lot of trouble for various things.
[334] The CIA for assassination schemes involving people like Castro, the FBI for, you know, COINTELPRO and other programs, domestic surveillance.
[335] And they made changes after congressional hearings, the Church Committee, that basically said: FBI, from now on, you have to have some kind of reason to be following somebody or investigating somebody.
[336] You have to have some kind of criminal predicate.
[337] And we want you mainly to be investigating cases.
[338] But after 9/11, they peeled all this back.
[339] There was a series of attorney general memos that essentially refashioned what the FBI does.
[340] And now they don't have to be doing crime fighting all the time.
[341] Now they can be doing basically 100% intelligence gathering all the time.
[342] They can be infiltrating groups for no reason at all, not to build cases, but just to get information.
[343] And so that's why they're there.
[344] They're in these groups.
[345] They're posted up outside of the homes of people they find suspicious, but they're not building cases.
[346] and they're not investigating crimes. It's sort of like Minority Report, right? It's pre-crime. Yeah. And the public has accepted this, you know, without much trouble. Yeah, there's a little bit of pushback from people online, then it goes away, and there's no real repercussions. Like the Governor Whitmer case, right, where there's 14 people involved in the kidnapping and 12 of them are FBI informants, right? Which is fucking bonkers. And then the two guys were doing hard time.
[347] They're like, we thought it was fantasy.
[348] Like, we're idiots.
[349] We didn't know.
[350] Like, one of the guys literally said, I never planned on doing anything.
[351] To me, it was just fantasy.
[352] Yeah, well, I mean...
[353] They're morons who get talked into this.
[354] And imagine you're getting talked into this by 12 people who turn out to be informants.
[355] Right.
[356] That's wild.
[357] Yeah.
[358] Yeah.
[359] And why are there so many informants, you know, like, hanging around with these groups?
[360] The fucking head of the Proud Boys.
[361] Right.
[362] The head of the Proud Boys was an informant.
[363] Is that true?
[364] I didn't know that.
[365] What's his name?
[366] Enrique Tarrio?
[367] Yeah, pull that up.
[368] He was a fucking informant.
[369] So this guy who is at the head of the Proud Boys, the guy who's organizing things, he was an FBI informant.
[370] Wow.
[371] The whole, do you know the story of the Proud Boys?
[372] The real origin story?
[373] No, I don't, actually.
[374] Oh, my God.
[375] The origin story is amazing.
[376] The origin story, here we go.
[377] Proud Boys leader was prolific informer for law enforcement.
[378] Enrique Tarrio, leader of the Proud Boys extremist group, has a past as an informer for federal and local law enforcement, repeatedly working undercover for investigators after he was arrested in 2012, according to a former prosecutor and a transcript of a 2014 federal court proceeding obtained by Reuters.
[379] So the Proud Boys started off as a joke on Anthony Cumia's radio show, where Gavin McInnes, who was a regular guest, made a joke about one of the guys who was an intern being in a musical, and the musical had this song, like, Proud of Your Boy, and they were singing the song, like, we're the proud boys, proud of your boy, and they're like, we're going to put together a group called the Proud Boys.
[380] And so they decided to have like this fake group of people.
[381] And to get into this group, they had to punch you in the arm while you named off different breakfast cereals.
[382] Like it's all like really hilarious stupid shit.
[383] But then Gavin, you know, Gavin's one of those guys that's just, like, a legitimate maniac, which was great when he was running Vice and not so great when he gets involved in extremist groups.
[384] Right.
[385] And so his idea was like, fuck these crazy Antifa people, we're going to develop our own group, but we're going to go after them.
[387] We're going to fight them when we go to these events, because they would show up anytime he had to do speeches and they would protest him.
[388] He's like, we're going to be the Proud Boys, we're going to show them.
[389] We're going to fight them.
[390] And I'm like, hey, hey, hey, like, that doesn't end well.
[391] Right.
[392] And obviously it ended terribly.
[393] And that's where the Proud Boys' origin story comes from.
[394] That's amazing.
[395] It came from the Anthony Cumia radio show.
[396] From Opie and Anthony.
[397] Anthony from Opie and Anthony.
[398] Dude, it's a joke. Like, Anthony came on the podcast and told the origin story of it. There's a famous clip that's on YouTube, and it's hilarious. And then it ends up being, like, you know, a central issue in, like, the presidential campaign. Yes, yes. Yes. It's nuts. America couldn't be more ridiculous. Dude, but if you know Anthony and you know Gavin, like, I've known Gavin forever. Gavin's just nuts. Like, he's not an extremist.
[399] He's not an evil person.
[400] He's just nuts.
[401] But he fucked up when he created that group.
[402] I'm like, dude, you can't just punch people in the face.
[403] Because if you punch people in the face, they come back with a bat, and now you have a war.
[404] Like, you just can't say, we're going to go hit people.
[405] Like, that always ends badly.
[406] Right, right, right.
[407] But he just was having fun and kept pushing it.
[408] He's a push the limits, push the envelope guy.
[409] And then now it's this hate group that people bring up in political speech.
[410] The Proud Boys and the white supremacists, like, the Proud Boys have a life of their own now. Right, now it's beyond him. They don't even like him anymore, right? They kicked him out of a meeting. He went to a meeting in Vegas, and he's persona non grata in the Proud Boys now. He can't even go. A thing that he started as a joke has now completely run away from him. Because, like with any group, right, if you have a group that people can join, anyone can join. Well, if anyone can join, you're going to get assholes joining it, of course, and you're also going to get law enforcement, and that's how you got this Enrique Tarrio guy.
[411] Right, right.
[412] Wild.
[413] Yeah, yeah.
[414] That's unbelievable.
[415] It's wild.
[416] So I guess they found his cooperator agreement.
[417] Yeah.
[418] And what?
[419] Did he continue to be the head of the Proud Boys after that?
[420] I think they arrested him recently.
[421] That's the interesting thing.
[422] Also, the thing about these guys is they don't get off scot-free.
[423] Like they make you work as an informant.
[424] They make you be a snitch.
[425] And then they put you in jail.
[426] Right.
[427] They'll still put you in jail.
[428] But they won't put you in jail for life.
[429] Right.
[430] Okay, we're going to give you ten years instead of 50.
[432] And so these guys just do it anyway.
[433] I mean, they treat you like a real bitch.
[434] Like they don't treat you like a half a bitch or like, you know, hey, we're going to work out a deal.
[435] No, they treat you like a bitch.
[436] Two days before a far-right mob stormed the U.S. Capitol, police arrested the leader of the Proud Boys militia group for burning a Black Lives Matter flag at a different protest.
[437] Wow.
[438] That's convenient.
[439] Meanwhile, he's black, which is hilarious.
[440] Right, right.
[441] The fact that like you can't burn that flag.
[442] If you burn an American flag, no problem at all.
[443] Right.
[444] That's fine.
[445] Yeah.
[446] That's fine.
[447] I mean, I actually, I'm for that.
[448] You're for them arresting people for burning flags?
[449] No, no, no. I'm fine with people burning flags if they feel like it.
[450] Yeah.
[451] I don't think you should arrest people for burning a flag.
[452] Like, I don't think you should arrest people for burning a piece of paper.
[453] Right.
[454] Like you should be able to, if you want to burn something in protest, it's not like we can't make more flags.
[455] Right.
[456] Like, what are you doing?
[457] Yeah, we got plenty.
[458] If you want to buy a flag and burn it, okay.
[459] That's your big thing.
[460] Go ahead.
[461] I think we should make flags out of asbestos, solve all the problems.
[462] It would make for a lot of unphotogenic protests, but that would be funny.
[463] But what the fuck?
[464] I just, you know, I'm just astounded by the lack of ability to see the future.
[465] You know, the lack of ability to see where all this goes.
[466] Like, in letting this happen and not being outraged, in letting the government creep into all aspects of your life in this way, there's no net positive. In the end, it's just government control of speech and thought and content and everything you do. Right, and complete capture of the media, and all of that, you know. And that's, you know, that's kind of what we're trying to fight against. And, you know, the one heartening thing is that the quote-unquote mainstream press is now, really, it's in free fall now, right?
[467] Its influence is more and more limited every day.
[468] You know, the problem is that something needs to step up in its place.
[469] And, you know, but they're just, they don't have any authority at all outside their own little bubble.
[470] Yeah.
[471] Propaganda is a scary thing.
[472] And when you have mainstream news organizations going along with what, appears to be propaganda with no pushback at all like where where is journalism like journalism is such an important part of any sort of functioning culture where people need to find out what what is the real information and there's people that have a responsibility to try to find that information and then give it to people so they can make informed decisions and they can know what what is the workings behind the machine what's the wiring what's happening how are these decisions getting made when the corporate media doesn't do that anymore we're fucked and you in your time have seen that you've seen this transition into like the media becoming an arm of propaganda as opposed to what it was in the 70s or what it was in the 60s where it was the news this is what's happening this is what we've uncovered this is our undercover investigation These are our facts.
[473] Our informant has told us this.
[474] Now we know that.
[475] Nixon did this.
[476] Kennedy was aware of that.
[477] We know these things now because of real journalism.
[478] Right.
[479] And it seems like for whatever reason, there's two branches going on with journalism.
[480] There's people like you and Bari Weiss and Glenn Greenwald and the Substack people that are like, hey, hey, hey, hey, this is not what I fucking signed up for.
[481] Right.
[482] I'm here to do actual real journalism, and you people in these gigantic mainstream organizations are losing your fucking minds.
[483] You're crazy, and you're doing it for so many reasons, because Trump sucks, because you're pushing a woke agenda, because, you know, whatever the reason is.
[484] Like, you've decided to become a part of a propaganda machine, and it's not journalism anymore.
[485] Right.
[486] You're ignoring really important stories that are inconvenient to the narrative that you feel like it's your obligation to push.
[488] Yeah.
[489] Remember when Trump became president and he was making noise about not letting certain people have credentials to get into the White House?
[490] And there was this big hue and cry like, oh, my God, he's not going to let us into the White House.
[491] And my first reaction to that was, who fucking cares if, you know, if you're not let into the White House?
[492] You have an adversarial relationship already.
[493] You're supposed to with government.
[494] Right?
[495] If they don't let you in, just report on it anyway, you know?
[496] Like, it's not a big deal.
[497] But for the new generation of journalists who've come in, they identify with the people they're reporting on, because they're socially the same people.
[498] They hang out in the same circles.
[499] They go to the same parties.
[500] The idea of not being let behind the rope line is an atrocity to them.
[501] They don't understand it.
[502] And they see their role as helping explain the point of view of power.
[504] I mean, it's just like what we were talking about with Brian Stelter before.
[505] Like, my job here is to kind of sell this to the population.
[506] Whereas the old school journalists were not like that.
[507] Like, they saw their role as, yeah, you know, we're patriotic.
[508] You know, like, we love our country and if it does the right thing, we'll, you know, we'll report that.
[509] But if it fucks up, we got to report that too.
[510] Like, you know, our job is to ask difficult questions.
[511] And if we have difficult truths, we got to report those things.
[512] And so we're not really on your side.
[513] Like, we're not your friends, you know.
[514] You can hang out with us.
[515] We can hang out with you at a campaign stop.
[516] But there's supposed to be tension there.
[517] There's always supposed to be tension there.
[518] And what you see now is that there's no tension at all, right?
[520] Like, there's just this sort of seamless community of people who all think the same way, you know, whether it's, you know, Rachel Maddow or, you know, Don Lemon or whoever it is, and, you know, some Biden administration official, like, they're all kind of on the same page.
[521] They see themselves as part of the same group.
[522] They see themselves as having the same mission.
[523] But the press has to have its own mission, or else it's not legitimate.
[524] Well, I don't think they're journalists.
[525] I mean, they're really just television propagandists.
[526] That's what they are.
[527] And they're working for these enormous corporations and it benefits those corporations to have a narrative.
[528] And so you have a spokesperson for the narrative.
[529] They're not, there's no way Don Lemon is a journalist.
[530] Right.
[531] Like, there's no way Brian Stelter is a journalist.
[532] They're just not.
[533] Right, right.
[534] But they are the way like that insurance lady on those insurance commercials.
[535] is a stand -up comic.
[536] You know what I mean?
[537] The progressive lady.
[538] Yeah, is she really a comedian?
[539] I guess she might make a few people laugh, like maybe.
[540] You know, but if she comes to the green room with the comedy club, we're all going to look at her and go, hey, what are you doing here?
[541] Like, this is, you know, it's not like Sarah Silverman showing up.
[542] She's not really a comedian.
[543] That lady is a paid spokesman for the insurance company that just happens to make you laugh.
[544] Right.
[545] And what those people are is paid spokesmen for the narrative, who just happen to be saying what's happening in the world, but they're not journalists.
[547] Right.
[548] Yeah, they're reading off, you know, a version, yeah, of something.
[549] And then their idea of journalism is sort of digging up facts to defend whatever that is.
[550] Yeah, right?
[551] And not looking at things objectively.
[552] Not looking.
[553] There's a very clear narrative.
[554] You have to push that narrative.
[555] Right.
[556] Yeah.
[557] And when it changes, you don't say a goddamn thing.
[558] When the science changes and when new information comes out that refutes everything you've said in the past, you just shut the fuck up and keep moving.
[559] Right.
[560] They do that, and they shouldn't do that, because, again, it's another thing that loses you the faith of audiences, right?
[561] And this is another thing that drives me crazy about this propagandistic model of media: in addition to being wrong, it doesn't work, right?
[562] Yeah.
[563] Like, propaganda only goes so far.
[564] You actually need people to trust you and trust is a complicated thing.
[565] You know, you have to have a relationship with your audiences, and audiences will not believe you if they see you making a mistake and then they see you not owning up to it.
[566] Yes.
[567] Right?
[568] Yeah.
[569] Which is why, you know, you and I both have had this experience of saying something wrong and then coming out and saying, you know what?
[570] I screwed that up.
[571] Like I did that.
[572] I made a really bad call about the beginning of the Ukraine War.
[573] I didn't believe the reports that were coming out.
[574] They sounded wrong to me. And I kind of wrote a sarcastic column about how this is ridiculous.
[575] Everyone's being so credulous about this.
[576] And then, of course, the invasion happened shortly after that.
[577] And I look like an idiot.
[578] But you got to come out and say after that, you know what?
[580] I fucked up.
[581] Yeah, I fucked up.
[582] I was over my skis on that thing.
[583] And, and, you know, if you don't do that, they never trust you.
[584] They never trust you.
[585] And they shouldn't trust you.
[586] Right.
[587] And you should.
[588] When you fuck up, you should own it.
[589] It's like, you're just a human being, you know, and you're talking about things in real time.
[590] Especially when you're doing a podcast, you're literally thinking out loud publicly.
[591] Right.
[592] And you're, and it becomes permanent record.
[593] Right.
[594] Right.
[595] Good luck with that.
[596] Yeah.
[597] No, I don't know how... That's got to be incredibly difficult. You're not even sitting there doing what I do, which is choosing my words carefully and typing them. Right. Well, a lot of times I'm intoxicated. A fucking good solid amount of the time. Yeah. No, but that's also why people like it, because they feel like you're really hanging out with them, having a conversation, because that's what it sounds like, because it is what it's like. I mean, it's the same way I talk to you. You and I could be at dinner right now.
[598] And we would probably be having the same conversation.
[599] Yeah, exactly.
[600] And people can pick up on that.
[601] Yeah.
[602] You know what I mean?
[603] Like that, I always thought that was, you know, one of the reasons that your show is so successful, is that people don't detect that there's something staged about it, you know.
[604] And, but you can see that so clearly in every other kind of media now that, you know, it's just, I don't know.
[605] It's a dirty business.
[606] It's a dirty business.
[608] And it's dirty for them, too, because if they could be free, they would like to be.
[609] I think almost everybody would like to just be able to be themselves and have their own opinions and be able to express themselves and be able to think about things openly.
[610] But when you are working for a major news organization and you have an enormous paycheck every week that comes to you if you keep this thing going, you have to keep this charade going, keep this con going.
[611] You're going to keep going.
[612] You know, you're going to be Rachel Maddow.
[613] And, like, I love the way you compared Rachel Maddow to Bill O'Reilly.
[614] You're like, it's the same person.
[615] Right.
[616] It's the same thing.
[617] It's just one's doing it with progressive values and one's doing it with right-wing values.
[618] Right.
[619] And you were really the first person who's a liberal guy to openly say that.
[620] I'm like, yes.
[621] Yeah, that's what it is.
[622] Yeah.
[623] No, it's, you identify an audience.
[624] You give them stuff they want to hear.
[625] you do it over and over again, rinse, repeat, blah, blah, blah.
[626] And, you know, look, that's a dangerous pattern that you can fall into when, you know, the business works the way it does.
[627] But people do it, you know.
[628] And the converse of that is that if you work in these organizations, and this is something that, believe it or not, Noam Chomsky wrote about a million years ago in a book called Manufacturing Consent.
[629] You see coming up in the business that when somebody tries to buck the system and tries to force through an unpopular story or refuses to write a story that's not true or does anything that the editors don't like, they see that those people are moved out of the business sooner rather than later, right?
[630] They just sort of end up being washed out with reputations for being difficult people.
[631] You know, Chris Hedges is somebody who comes to mind, right?
[632] Like he's, they kind of just squeeze you out.
[633] There's no particular thing that happens.
[634] And that's a sense signals down the ranks of people in journalism that if you want to get ahead, just keep doing the shit that we want you to do.
[635] You know, you don't have to be a genius to figure out what that is.
[636] Just keep doing it, and, you know, you'll eventually rise up through the ranks, and before you know it, you'll have your own show, or you'll be running a desk. But you won't have anything to say, because early on you'll have made the decision to abandon your individuality. Like, that's the key to the whole thing, is that it's not people making these big decisions to sell out when they're 50. They make the decision
[637] to sell out when they're 22 or 23.
[638] Right at the very start.
[639] When they first see it and they understand how the business works and they start climbing, that's when they sell out.
[640] So by the time they get to be that, you know, older, like, it's who they are.
[641] Well, you see it in politics too, right?
[642] Like, you see people like AOC who starts out as this, like, you know, really kind of inspiring story.
[643] Girl wore her shoes out campaigning, just walking around, going door to door, and now she's cozying up to the likes of Nancy Pelosi and they're, you know, they're all in this weird sort of group together making decisions and people don't like it.
[644] They don't know.
[645] They see where it's going like, oh, if you want to be president, you have to go down this road.
[646] And you're going down that road.
[647] Exactly.
[648] And you're going to be that lady that presses the button that drops the bombs.
[649] You're going to be that lady that, like, sends the fucking drones out and finds some way to justify the strike that kills 90% civilians.
[650] You're going to be that person because that's where that person starts.
[651] You start out idealistic.
[652] You start out this person who's very progressive and really wants to help lower income families and really wants to help inner city schools.
[653] And really wants to help all of it.
[654] And then along the way, you get indoctrinated into the system and you figure out how everything works.
[655] This is how you have to do it.
[656] This is how you play ball.
[657] This is the bill you have to sign.
[658] This is what you have to get in on.
[659] In order to get this, we have to do that.
[660] In order to get that, we have to do this.
[661] And next thing you know, you're a fucking politician.
[662] Right.
[663] And you're the Speaker of the House and you're doing insider trading and making hundreds of millions of dollars because everybody else is doing it.
[664] Right.
[665] Remember when she came in and everybody was, there was a whole clan of sort of senior Democratic Party officials in Congress who were giving her a hard time because she was on Twitter a lot.
[666] Right.
[667] And she was being successful with Twitter.
[668] And they were like, you know, you have to make the choice between whether you want to be on social media or whether you want to be a politician.
[670] And I actually admired her at the time.
[671] I didn't agree with her about everything, but I, like you, I thought her story was interesting.
[672] I thought that she had taken an alternative path to getting elected.
[673] She was very clever in her use of social media.
[674] You know, it's better to be good on Twitter than it is to, you know, whore out your beliefs to some donor.
[675] You know what I'm saying?
[676] But once she got into Congress, they let you know, right?
[677] Look, if you ever want to be a committee chair, if you want to get in line for these powerful positions, if you want to get appropriations money sent to whatever district, you got to play ball, right?
[678] And if you do, then, you know, you very quickly start climbing the ladder.
[679] If you don't, you end up just somebody who tends to be on the outside and is portrayed as a nut, you know.
[681] Bernie Sanders.
[682] Bernie Sanders or Ron Paul, right?
[683] You know, I mean, that's how it works.
[684] There are people who are kind of like that in Congress, you know, you might disagree with what they believe, but they're honest, you know, and you can tell they're honest because the leadership hates them and doesn't give them the opportunity.
[685] But you're absolutely right.
[686] It's the same thing, you know.
[687] You know, it's this organizational sort of belief system.
[688] You either have to go with it or not.
[689] I just think it's especially offensive in journalism, because the whole idea of being a journalist, the whole point, is you're not doing it for money or power.
[690] You know, if you're going into this business, what are you doing it for if you're not going to be trying to break cool stories? Like, go into some other thing, go into finance or...
Right, right.
I just, I never understood that. That part of it doesn't make any sense.
Do you think that this understanding of this now, because people are talking about it, and then the birth of Substack, and the fact that it's become very successful, and that people are flocking towards genuine independent journalists, whether they're on social media like YouTube or Twitter or Substack, do you think that this is in many ways, like, changing the way young people see the possibilities? Because I think young people are looking at the two options, like, one, you can kind of be a hero, or two, you can kind of be a whore, right? You know, there's a lot of whores. They don't seem happy. These whores seem very upset at everything, and they're always pulling their fucking hair out, and they're probably on antidepressants. And then you have these people that are, like, breaking stories, and it's like, oh my God, journalism is alive.
[692] It's just alive like, you know, when people travel with a fire and they have, like, embers, and they're blowing on them, and they get them to the next camp, and then they can start a fire with it.
[693] That's journalism right now.
[694] It's not this raging bonfire that everybody can go get warmed by.
[695] You know, the information will warm everyone.
[696] No, it's like these people have small wooden vessels filled with embers, and they're blowing on them as they run through the woods, and people are fanning them to try to keep them alive.
[698] But I think for young people that are considering paths, like what to do with their future, they don't want to be contained.
[699] They want to be free.
[700] And because of social media and because of the fact that any kid can just start a YouTube page, just start talking about things.
[701] Exactly.
[702] And because of that, the ability to now do your own show, becoming independent is becoming not just more plausible, but more attractive.
[703] Absolutely.
[704] And I think, you know, the younger people have less tolerance for phoniness, or at least historically they did. Yeah, it's been a little weird lately. I haven't always been sure of that lately. But, you know, the people who were going to go into journalism when they're 18 or 19, once upon a time, they all wanted to be Woodward and Bernstein, or Sy Hersh, or Hunter Thompson, or whoever it was, because they just wanted to be a rule breaker, somebody who told the truth, and consequences be damned, and because that's what it's about.
[705] It's about being free and speaking your mind, right?
[706] Like, what is it William Blake said?
[707] You can always be ready to speak to your mind, and a baseman will avoid you, right?
[708] Like, that's what journalism is.
[709] Like, you derive power from your willingness to say the unpopular true thing, right?
[710] And that's an attractive, idealistic thing for a young person.
[711] But if they see that that path is closed, they're not going to go into it. I mean, why would you go into journalism and try to work at the New Yorker or MSNBC if you know you're never going to get to do that, basically, you know?
[712] Why would you?
[713] So, but you can create your own show with almost no overhead and do the same thing, and have a much bigger impact.
[714] And you'll have a bigger audience even.
[715] Much bigger.
[716] Right?
[717] That's what it's done.
[718] Yeah.
[719] I mean, as you know, as you very well know, right?
[720] I mean, that's, and these corporate media companies have been living for a long time on their name and on the memory of the prestige that their names inspired.
[722] But if people, if they actually had to sell how much reach they had now, they wouldn't have much to talk about, right?
[723] Like their audiences are shrinking.
[724] Their influence is very, very small.
[725] And, you know, the jobs that they're offering are just less exciting for young people.
[727] We were talking about CNN, like, how does CNN even keep the lights on?
[728] They have an enormous building in Atlanta, giant CNN sign on the front of it, and they get terrible ratings.
[729] Oh, yeah, absolutely.
[730] And they have so many people working there.
[731] You have a whole giant building filled with people, and then your product, no one wants it.
[732] Like, no, people are just accidentally watching it, flipping through the channels. There's nothing compelling that they have to offer, yet they are in the business of selling compelling information. You're literally supposed to be the most compelling thing, because the news is supposed to be one of the most compelling things. Everybody traditionally would come home and watch the evening news, because you need to know what the fuck is going on in the world. But now, because of social media, and because of just websites and phones, and just news off your apps, the different apps that people use, Google apps, no one cares anymore.
[733] So it's, you're just howling into the wind.
[734] It's something, it's background you see at an airport, maybe.
[735] Yeah.
[736] Right.
[737] But yeah, you're right.
[738] They have, they have a huge building in New York, too.
[739] You know, the Time War Center and.
[740] Bizarre.
[741] And they're not even in the top 20, I think, of cable news shows anymore, right?
[742] So, and then look what happened with CNN Plus.
[743] I mean, you know, they went and they hired Chris Wallace, and they were going to launch this big subscription service, CNN Plus, where I guess the idea was they were going to get people to pay to watch the same stuff they were already refusing to watch for free.
[745] And they had to cancel the service after three weeks.
[746] Yeah, they spent hundreds of millions of dollars on it.
[747] I mean, how could they have not seen that?
[748] How could they think that people wanted to pay for something that they don't like for free?
[749] I don't know. I don't know. But what's clear is that they don't have any conception of where the way out is for them.
Right. Well, they're like Blockbuster Video. There's no way out.
Well, I think there would be a way out if they started actually doing their jobs, but they just don't know what that is anymore. Well, hasn't the new guy said they want to switch away from opinion, editorial-type news stories and public people, to people that just disseminate objective views of information? Just, like, this is what's happening in the world, this is what's going on, this is the crash that happened. They have said that, but is it too late? The problem is, like, people now have associated CNN with bullshit propaganda, right? They treat you like you're dumb. They think you're stupid. They think, you know, that you could just tell people that someone's taking horse dewormer, and you could just repeat it over and over again, and people believe it. I mean, the amount of damage that they did to their own reputation saying things like that, it's so, because most people would look at that and go, is he really doing that?
[750] And then some people would go, well, that's not even true.
[751] Like, wait a minute, CNN says this.
[752] What else are they lying about?
[753] Right.
[754] What about international stories?
[755] What about financial stories?
[756] What about things that have to do with crypto or what narratives are they spitting out that are just bullshit?
[757] Right.
[758] Right.
[759] And there's a lot of them, you know?
[760] You mean, remember when the Bountygate story came out and then...
[761] Bountygate was Bountygate.
[762] Bountygate was this weird story that came out.
[763] I think it was in 2020 when basically they were reporting that Russians were paying bounties in Afghanistan to kill American soldiers.
[764] Yeah.
[765] And it turned out to be, like, you know, basically one theory that somebody within the intelligence community was positing. The Army itself came out a couple months later and said, yeah, we don't really have evidence for this.
[767] And then, you know, a year later, they came on even more strongly and said, you know, we don't, we can't back that up.
[768] Well, here's a crazier one, the Russia Gate.
[769] Oh, well, yeah.
[770] That's the craziest one.
[771] Absolutely.
[772] The fact that they pushed that for three years.
[773] And they've never come out and said, we were misinformed.
[775] That is not the case.
[776] There really wasn't this crazy collusion between Russia and Donald Trump.
[777] And in fact, there was some information that seems to point to Hillary Clinton having had involvement with Russia, too, and that they've kind of all had involvement with Russia.
[778] And this wasn't some grand conspiracy to elect a Russian puppet as the president of the United States.
[779] Sorry.
[780] Yeah, it was a three-and-a-half-year sort of mass hysteria experiment, right?
[782] And I mean, this is one of the things, it's one of the reasons I got kind of quietly moved out of mainstream journalism, right?
[783] I didn't have a particular problem at Rolling Stone, but, you know, early on in the Trump years, I said, something's wrong with this story.
[784] I think there are elements of it that aren't provable.
[785] I don't think we should be running this stuff, you know?
[786] And then And before I knew it, I was working independently.
[787] But anyway, at the Twitter files, we're finding stuff that now tells you absolutely what actually the truth was during that time.
[788] Like, for instance, one of the big Russiagate stories was from early 2018, when Devin Nunes, remember, he was the Republican congressman, he was the head of the House Intelligence Committee at the time.
[789] He wrote a memo basically saying, we think they faked FISA applications.
[790] We think the FBI used the Steele dossier to try to get surveillance authority against some Trump people like Carter Page.
[791] And we think they lied and cheated to do that.
[792] And so he submitted this classified memo.
[793] And not only was he denounced everywhere as a liar and wrong and all that, but there was this big story that was all over the place that a hashtag, #ReleaseTheMemo, had been amplified by Russian bots.
[794] You probably don't remember this, but this story was everywhere in January and February of 2018.
[795] This idea that #ReleaseTheMemo was basically a Russian operation and that Nunes was benefiting from it.
[796] Well, I'm reading the Twitter files.
[797] I was looking for something else entirely.
[798] And then suddenly we come across a string of emails internally at Twitter where the Twitter officials are saying, you know, we're not finding any Russians at all behind this hashtag.
[799] And we told the members of Congress who asked about this that there are no Russians involved in this, because Dianne Feinstein, Richard Blumenthal of Connecticut, they all came out with this accusation about it being linked to Russia.
[800] We told them that there's nothing there and they went and they did it anyway, you know.
[801] And so there are lots of stories like that now that are kind of falling apart, right?
[802] And most people, I think, don't even know that the Russia collusion thing was bullshit.
[803] I think the general public that heard that Russiagate narrative, the people that haven't looked into it past what they've seen on television, probably still believe there's some sort of collusion.
[804] Yeah, because there's never been a reckoning for it.
[805] Right.
[806] You know, I mean, after the WMD thing, which, you know, went on for a surprisingly long time considering how little evidence.
[807] there ever was for that.
[808] Yeah.
[809] And considering that there were lots of journalists at the time who would have liked to have proved Bush wrong about that, it still took years and years and years for the business to admit that they screwed that up.
[810] They blamed it almost entirely on one person, Judy Miller from the New York Times.
[811] Really?
[812] Yeah.
[813] Other people who got that story just as wrong, like Jeffrey Goldberg. He's now the editor of The Atlantic magazine.
[815] Like, there are all kinds of people who totally screwed that story up and got promoted.
[816] And so there, but there was, there was at least a little bit of reflection about getting a big story wrong.
[817] Like, you, that's such a big story.
[818] Right?
[819] That's such a big story.
[820] Yeah.
[821] The fact that there really were no weapons of mass destruction and they, we really did start a war for nothing that really did kill somewhere in the neighborhood of a million innocent people.
[822] Right.
[823] It's over a fake news story.
[824] Yeah, over a fake news story.
[825] I mean, there should be sorrow, you know what I mean, within news organizations, about a mistake of that magnitude.
[826] And the fact that there's no repercussions, and the people were promoted, the people that promoted that very same story, the one that led to the public support of the war.
[827] Yeah, and not only did we promote the people who got that story wrong, except in that one case with Judy Miller, who was sort of villainized, you know, but people were fired who were questioning it.
[828] Like Phil Donahue, who had a show, a very highly rated show on MSNBC at the time.
[829] He lost his job.
[830] What did he say?
[831] He was just very critical of the war effort.
[832] He didn't believe the whole thing.
[833] And he thought.
[834] And that's why he lost the show?
[835] Yeah, they took him off the air.
[836] Jesse Ventura will tell you the story.
[837] He's currently, he lives in a compound in Mexico, Casa MSNBC, he calls it, because they hired him thinking that because he was a former Navy SEAL, he was going to be pro-war.
[838] When they found out on the phone that he didn't feel that way, that he was very skeptical of the whole thing, they basically bought out his contract.
[839] They just paid him the balance and said, thanks, but no thanks.
[840] and, you know, we're not going to want that show of yours.
[841] So he was right, but instead of going on the air, he got a mansion in Mexico. You know, so the business has a history of doing stuff like this. But at least in the WMD episode, they had the decency to admit, you know, like, a decade later, we screwed that up.
[842] Like, it has the reputation of being a media mistake.
[843] They haven't done that yet with the Russia stuff.
[844] Well, the WMD thing, though, there's no repercussions because over time, everybody had kind of either forgotten about it or been overwhelmed by news stories.
[845] And when the WMD came out, when that sort of thing came out at the beginning of the war, you're also dealing with a very different internet.
[846] And the news cycle wasn't as extreme and dynamic.
[847] Like, nowadays, things that happen, like, no one gives a shit that Epstein was murdered, and that the cameras were shut off, and that there's no list of all the people that went to the island. That's just gone. There's too much new stuff that comes in front of your eyes that you have to pay attention to, right? That's the news cycle. We do need a few more answers on that one, I think. Yeah, yeah. I mean, it's wild. So that's the news cycle of today. It's just overwhelming, and then it just keeps coming forward, and you can't stop it.
[848] You know, when you were working at Rolling Stone, did you interact with Jann Wenner a lot?
[849] Yeah.
[850] Did you see my conversation that I had with him?
[851] Yeah, yeah, absolutely.
[852] Did you think that was kind of sad in a way?
[853] Very, very.
[854] Yeah.
[855] I mean, look, Jann and I... Jann was always good to me, you know.
[856] He didn't agree with my politics.
[857] He didn't agree with my approach to the job a lot.
[858] And I know that my stories got him in a lot of trouble socially.
[859] So that came out from time to time.
[860] But he, you know, he never went to the step of firing me, you know, and he let me do the stories that I wanted to do, for the most part.
[861] But, you know, I think, I think as you found out, like, he, somewhere along the line, he, you know, he lost interest in being, you know, part of a real actual journalistic venture, right?
[862] I think he has a hard time concentrating on the nuances of all these different things and balancing out.
[863] Like when he was talking about the government regulating the internet.
[864] That was the most shocking.
[865] Look, I was a fan of the guy because I was a fan.
[866] I've always been a fan of Rolling Stone.
[867] And I'm a giant fan of Hunter S. Thompson.
[868] And I knew he and Hunter had this very close relationship.
[869] And I wanted to bring him up.
[870] It was complicated.
[871] But yeah.
[872] I'm sure.
[873] But I wanted to talk to him just about that.
[874] I mean, and I enjoyed that.
[875] But then, when we got into it... it's like, you were a fan of that guy.
[876] And now you're saying, you're talking to me about the government censoring the internet.
[877] And when I was like, you're talking about the same people that gave us bad information about weapons of mass destruction that led to a war.
[878] And then he's sort of balancing that out.
[879] No, no, those were politicians.
[880] Like, that's the government.
[881] Right.
[882] You're talking about the government.
[883] Yeah.
[884] Like, to see him try to wrestle with that in his head and to realize he had never wrestled with it before.
[885] Yeah.
[886] That's what was fucked.
[887] It's like, I mean, is he just tired?
[888] Is he just older?
[889] Or is he just like completely insulated in those liberal cocktail parties where, you know, you have to wear a mask and you have to say this.
[890] And if you don't say that, you'll be ostracized from the social group.
[891] Like, what kind of narrow bandwidth are you operating on, that you could just say that?
[892] Yeah, I mean, again, I feel bad, because, you know, I'm living in a home probably that was paid for by Jann Wenner and all that stuff. And he was, like I said, he was always good to me personally, even though we had some pretty intense disagreements and arguments, and there was a lot of yelling that went on. But I think, you know, what happens is that, yeah, you do end up in a bubble. Even people who spent their whole lives in the journalism business, and not just in the journalism business, but rock and roll journalism, right? Like, you're supposed to be kind of on the edge, right? And Hunter S. Thompson was completely out of control, right? Like, his writing was wild and free.
[893] That was what was beautiful about it, right?
[894] And it took a lot of guts to publish that, right?
[895] And to send somebody like that on the campaign trail was a revolutionary idea at the time, you know?
[896] But, you know, Jann, and this came out in 2016, because he endorsed Hillary in 2016.
[897] And I asked permission to write a counter to that and endorse Bernie.
[899] But his whole reasoning was, when we were young and we were supporting McGovern, we were wrong, because McGovern was a bad candidate and Nixon got elected.
[900] We needed to support somebody else who had a better chance of winning.
[901] And so his whole idea was, youthful idealism is nice and all, but it's not pragmatic.
[902] Right. And this was the place that he had ended up in, and that leads you to other things, like, you know, internet censorship. Yeah. Um, you know, one of the first signs that I knew that I wasn't going to have a future at the magazine is when he told me I just flat out shouldn't touch the Russia story anymore. This was in the first year of that scandal. And I had written a bunch of columns saying, you know what?
[903] I don't think this is true.
[904] And because he perceived that, I think, as helping Donald Trump, you know, he didn't want me writing about him.
[905] I ignored him.
[906] But, you know, you do get in the bubble and, you know, it's good.
[907] I think there's also being the boss.
[908] Well, yeah, being the guy was, yeah, and you're, you're insulated in the fact that no one wants to challenge you and, yeah.
[909] Yeah, and look, most of the people who worked at Rolling Stone had kind of a love-hate relationship with Jann. Like, he could be tempestuous, and he would go into fits of anger, and things would happen.
[910] But on the other hand, like, there was a lot of great journalism that came out of Rolling Stone.
[911] And he had, you know, he had a really, a sort of brilliant sense of how magazines worked and what audiences would like in magazines.
[912] And that mechanism in his head functioned extremely well for like 40 years or so.
[913] And so, you know, people respected him as a leader on that front.
[914] But, you know, over time, the magazine, you know, it started to become, I don't know.
[915] It lost its sense of purpose.
[916] And, yeah, it became something other than what it had been, right?
[917] It was a symbol of rebelliousness once.
[918] And now it's the opposite.
[919] It is the opposite.
[920] Yeah, it was sad to me to have that conversation with them.
[921] Like, part of it I really enjoyed talking about Hunter and talking about the early days of the magazine and what it was like to take a chance on a magazine like that in this counterculture.
[922] environment that they found themselves in. But then, you know, seeing that... you're just like, sometimes people just get tired, man. They just get tired, and they get old, and they kind of, like, give in to narratives. Yeah. And they don't want to explore the subtle nuance of each individual topic, because sometimes those are uncomfortable, and you have to wrestle with those thoughts. And, you know, sometimes people would rather just medicate themselves and go to sleep. Yeah. And also, you can get used to not answering difficult questions, which also came through in that interview, I think.
[923] Right.
[924] Because you're the boss.
[925] You're the boss.
[926] Yeah.
[927] Right.
[928] Like, you're used to being thrown, you know, a whole bunch of softballs.
[929] And, you know, if you're, if you're not in that place, you know, yeah, that can be difficult too.
[930] I have a good friend who used to be an executive at Google.
[931] And we were at a party.
[932] And it was me and my wife and this lady who was one of the big wigs at YouTube.
[933] And so she sits down with us, and we're talking, and she's, you know, asking me about, you know, podcasting and this and that.
[934] And we're having this conversation.
[935] And I say, what, how does YouTube decide, like, what gets marked as bad or, you know, I go, because there's a conversation between Sam Harris and Douglas Murray, two public intellectuals that someone put on their YouTube playlist.
[936] Like, they didn't even make this conversation.
[937] They just, like... this is something that I have on my channel, on my playlist, and they got flagged for that. And she... yeah, they got a strike against their channel for that. And I said, I go, why would that be? And she goes, because it was hate speech. And so I go, how did you just say that? I go, how did you just say it was hate speech? I go, do you know the content of the conversation? You're talking about two public intellectuals, right? And you just said it was hate speech. And my wife is, like, squeezing my knee. She sees I'm getting red, you know. I'm like, you just said it was hate speech.
[939] Do you understand what they talked about?
[940] Do you know what they talked about?
[941] Why did you just say that?
[942] But she was just like, it's hate, what they said was hate speech.
[943] She just had this arrogance, because she was in this position of power where she could say, that's hate speech, that's not hate speech.
[944] Right.
[945] And this was quite a while ago.
[946] It was like 2015, 16.
[947] Oh, wow.
[948] So it's early.
[949] Yeah, early.
[950] Wow.
[951] But her saying, because it was hate speech.
[952] Like this look at my, that's what it is.
[953] Do you remember that commercial where...
[954] It's, like, during the drug war, the height of the drug war propaganda, and during the brain-on-drugs thing. No, no, no. The one where there's a man and a younger man and they're eating dinner, and he's saying, if you buy drugs, you support terrorism. And he's, like, eating his food at a steakhouse. He goes, what do you mean it supports terrorism? It just does. It just does. You're supporting terrorism. It's like this no-nonsense guy, and then this young goofy guy. Like, what are you saying?
[955] I just like to smoke pot.
[956] If you buy drugs, you support terrorism.
[957] You ever seen that commercial?
[958] No, I don't remember it, but... play it.
[959] Play it from the beginning.
[960] Play it from the beginning, because this is...
[961] It's a ploy.
[962] What?
[963] This drug money funds terror, it's a ploy.
[964] A manipulation.
[965] Ploy.
[966] Drug money funds terror.
[967] I mean, why should I believe that?
[968] Because it's a fact.
[969] A fact.
[970] F-A-C-T. Fact.
[971] So you're saying that I should believe it because it's true.
[973] That's your argument.
[974] It is true.
[975] Says who?
[976] Says who, motherfucker?
[977] And that's, that's me and that lady from YouTube.
[978] Because it's hate speech.
[979] Because it's hate speech.
[980] Because it's true.
[981] While you're eating salad.
[982] What the fuck?
[983] He was probably eating too, right?
[984] Yeah, she was eating.
[985] Yeah, we were eating.
[986] I was so red.
[987] I was so hot.
[988] I was like, what?
[989] And this is before I was even getting censored on YouTube.
[990] This was in the early days.
[991] Like, back, you know, no one gave a fuck about what I was doing.
[992] Right.
[993] And sitting there with this woman... they were telling this interesting conversation about the, the change. This was, like, when he was doing The Strange Death of Europe, that book, Douglas Murray's book. Uh-huh. Uh-huh. And, you know, they were having this conversation about what happens when religious ideology starts to change the environment of these European cities. But this, that fucking attitude that people have, like, because it's true. Because it is true. Like, fucking says who? Like, you can't just say that. Right.
[994] Like, you tell me how.
[995] You tell me how you know.
[996] We're going to talk forever, motherfucker.
[997] We're going to sit here forever.
[998] I'm not going to be done with this dinner.
[999] I'm going to put my fork down.
[1000] You tell me. How's it funding drugs?
[1001] Is it funding terrorism?
[1002] How is that that fucking conversation between Douglas Murray and Sam Harris?
[1003] How's that hate speech?
[1004] Right.
[1005] But they're used to not having to go past that first part of the conversation.
[1006] Exactly.
[1007] Exactly.
[1008] You know, like that's...
[1009] If my wife wasn't squeezing my... she was, like, squeezing my knee under the table.
[1010] She's like, she's like, oh, Jesus, I know what's happening here.
[1011] Right, right.
[1012] No, but the people, and that was another strange thing about my experience at Rolling Stone.
[1013] Like, early, I guess it was 2017, 2018, when they first started to really aggressively police the Internet, I did a story about how they wiped out a bunch of accounts.
[1014] under the guise of...
[1015] This is after the Alex Jones thing.
[1016] But Facebook just sort of zapped a whole bunch of accounts.
[1017] And some of them were just sort of ordinary, hardworking people who had built up these independent media channels.
[1018] And the company just sent them notices.
[1019] You are engaged in coordinated inauthentic activity, and your page is down.
[1020] This is after they'd spent tens of thousands of dollars on Facebook ads to build up their, you know, their pages and everything.
[1021] They weren't bots.
[1022] They were real people.
[1023] And not only could I not convince other people in the business that it was a significant story that these companies were now doing this, but within Rolling Stone, you know, the story, the headline had to be refashioned.
[1024] If you look at it now, the story is called Who Will Fix Facebook?, because they wanted to imply that the problem was that Facebook was out of control and needed to be policed more, right?
[1026] And sort of my headline that I submitted was very different.
[1027] It was something like, you know, censorship on Facebook is out of control or whatever it is, right?
[1028] But this belief that the censorship is a good thing, that we need more of it, I just think it became an upper-class kind of New York, Washington, just cocktail-party belief, right?
[1029] I mean, it was something you started to hear from people right around the time that Trump got elected.
[1030] Oh, we just need more of that, you know.
[1031] We have to do something to reckon with all those people or whatever it is.
[1032] And they haven't let go of it, I don't think, have they?
[1033] I don't know, have they?
[1034] Yeah, I think they're still in that place for the most part.
[1035] You know, this move toward that World Economic Forum version of a more regulated Internet, I think we're only in the beginning stages of that.
[1036] I think they're going to make many steps, you know, that are going to be much more significant in the future to try to prevent, you know, things like your show from breaking out, right?
[1037] Like, they're not going to want that in the future.
[1038] Well, how does the World Economic Forum actually wield influence?
[1039] Like, what works there?
[1040] How does it work?
[1041] You know, one thing we know about this year's event is, we know two things.
[1042] One, that hookers are going there in droves.
[1043] There was an article about the increase in the population of prostitutes, which makes sense.
[1044] Yeah, it makes a lot of sense.
[1045] Yeah, when the boys are away.
[1046] And then the other thing was that Klaus Schwab and George Soros decided to opt out this year, which is interesting.
[1047] That is interesting.
[1048] Maybe it's getting too hot because it used to be a thing that really didn't get much public attention.
[1049] They could go there and they could all have these meetings and decide the fate of the world and try to sort of move the world in general directions.
[1050] And then also, Michael Shellenberger released a bunch of stuff this week
[1051] showing that they lie about things that they've said that have become very problematic.
[1052] One of them was, you will own nothing, you'll be happy.
[1053] And they said, no, that was all just internet conspiracy theories.
[1054] It's not true.
[1055] But it is true.
[1056] And then there were websites.
[1057] They had a whole advertising campaign based on this.
[1058] They did that.
[1059] They said that.
[1060] They absolutely put it up.
[1061] They really did.
[1062] Or I think another one was eating bugs, right?
[1063] Yes.
[1064] Yeah.
[1065] Yeah.
[1066] Yeah, exactly.
[1067] Well, I don't have a problem with eating bugs.
[1068] I do have a problem with people trying to say what is good and is not good for the world when I know that if you say it is good, it's going to benefit enormous groups economically and it's going to lock other people out.
[1069] And I think that's what they're doing with things like plant-based meat.
[1070] When all those people were saying plant-based meat is the future, like, the fuck it is.
[1071] It's really bad for you.
[1073] It's really bad for you.
[1074] Not only that, it's monocrop agriculture, which is terrible for the land.
[1075] It's terrible for living creatures.
[1076] This idea, if one life equals one life, you're way better off buying cows and eating cows than you are buying corn.
[1077] Because in order to grow a stock of corn, a lot of shit has to die.
[1078] And if you're using monocrop agriculture and using industrialized farming methods, and you're controlling enormous swaths of land with only one crop, that is totally unnatural. It doesn't exist anywhere in nature. And in order to do that, you have to poison everything else. You have to kill all the animals, you have to poison the land, you have to strip the topsoil, you have to use industrialized fertilizer. You can't grow things that way normally. That's why, using these industrialized methods, there's only, like, 60 more crop cycles they can do. There's only a certain amount of topsoil that's even left that's viable to grow food on, because they don't use regenerative agriculture anymore.
[1079] Huh.
[1080] The people like White Oak Pastures and Polyface Farms, like Joel Salatin and Will Harris, these, like, grizzled old farmers who use these regenerative methods that are very, like, almost boutique, they're very rare now.
[1081] But they're more popular than ever because people are aware of them.
[1082] But most of the stuff that you buy is using industrialized fertilizer.
[1084] What these people are doing is they're letting cows graze.
[1085] They take the manure.
[1086] They use the manure as fertilizer.
[1087] Chickens roam the land.
[1088] Chickens peck the bugs and eat the stuff.
[1089] Pigs roam.
[1090] And then they cycle where these animals are.
[1091] So what they're essentially doing is they're recreating nature in a contained environment.
[1092] And that is actually carbon neutral.
[1093] It actually sequesters carbon in the soil in a lot of cases.
[1094] But if you want to buy plant-based food and plant-based meat, you're not getting that.
[1095] Right.
[1096] You're supporting monocrop agriculture, industrialized farming, and you're supporting very unhealthy food.
[1097] And the idea of a small group of people who meet in a ski resort town in Switzerland making the decisions about this for people all around the world.
[1098] And they're doing it on private jets.
[1099] He was there, and I think Klaus...
[1100] Or he was never supposed to be there.
[1101] Bill Gates and Klaus Schwab were there.
[1102] It says, post falsely claimed Bill Gates withdrew from Davos forum.
[1103] Yeah, but Klaus Schwab did withdraw.
[1104] I think he was there, though.
[1105] Well, he said he couldn't make it.
[1106] There was a public release, and George Soros also said he couldn't make it.
[1107] Okay.
[1108] I mean, you could Google that.
[1109] I was looking.
[1110] I thought I saw a video of him talking.
[1111] Well, he's definitely talked at it before.
[1112] It could be from another time.
[1113] Klaus Schwab.
[1114] I think there was a thing that said that due to a scheduling conflict, he couldn't make it.
[1115] I thought so, too.
[1116] I remember reading that too, but now I'm Googling it, and it's like it doesn't come up.
[1117] Like, I'm looking for, did Klaus Schwab leave Davos?
[1118] Is he not there?
[1119] And, you know, I don't get a report.
[1120] Did you Google, did Klaus Schwab opt out of the World Economic Forum this year?
[1121] What, like, who organized this thing?
[1122] That's what's super bizarre.
[1123] Like, these billionaires and then Justin Trudeau and then they all fly there.
[1124] This is what happens when I Google that.
[1125] Did Klaus Schwab, he spelled Schwab wrong, but it's okay.
[1126] Opt out. Is that how I spell his name?
[1127] It doesn't, it gives you the right answer.
[1128] It knows what you're looking for.
[1129] Did Klaus Schwab, what does it say?
[1130] It doesn't say that he's not there.
[1131] It just sort of seems like he's there.
[1132] I don't know, it was confusing.
[1133] That's why I was just trying to clarify.
[1134] Okay.
[1135] I mean, this is part of this whole infrastructure, with the Aspen Institute, the World Economic Forum. Like, what kind of influence do they actually have?
[1137] Like, how do they, you know, when we have young global leaders, when he talks about, like, Trudeau being one of our young global leaders, and this is what they do.
[1138] They get their young global leaders that are indoctrinated into the World Economic Forum's ideas, and they implement them in politics.
[1139] Yeah, it's like, it's the same thing as Justin Timberlake being a Mouseketeer, and then later on he gets to have a real career in entertainment.
[1140] I mean, it's, right?
[1141] It's the same, it's the same exact concept.
[1142] You know, they're, they bring people along.
[1143] There's, there's a feeder system for how people become powerful politicians.
[1144] We've seen how it works.
[1145] You can, you know, if you want to be a financial regulatory official, you run a desk at Goldman Sachs for a few years.
[1146] And next thing you know, you'll be running the World Bank, or, um, you know, you'll be the chief economist of the World Bank, or you'll be the chief economist of the ECB or the Bank of Canada, or, you know, whatever it is.
[1147] There's just all these places where politicians come from.
[1148] You do a tour in the military, maybe in the CIA, maybe you work for a consulting firm like McKinsey, you do a little time, you know, working for, you know, this.
[1149] or that politician as an aide, and then, you know, they raise some money for you to become a candidate in Congress, and next thing you know, you're running for president.
[1150] Like, they, you know, they know how they do these things, right?
[1151] Yeah.
[1152] The politicians don't come from too many places, right?
[1153] Right.
[1154] Especially in America, we've got a pretty, you know, stable system of how that works.
[1155] But the global thing freaks me out a little bit.
[1156] Like this idea that leaders from all over these countries are getting together and setting an agenda that may be completely contrary to what people in the individual nations might want.
[1157] Yeah, that's upsetting.
[1158] Like, that seems totally anti-democratic and disturbing.
[1159] And yeah, I don't know.
[1160] I have a lot of fears about that.
[1161] Yeah, as do I. And this is a fairly new thing in terms of, like, the public zeitgeist.
[1162] Like people didn't know about the World Economic Forum six, seven years ago.
[1163] At least most people didn't.
[1164] They didn't hear about it.
[1165] They didn't hear about Klaus Schwab and, you know, you would own nothing and you would be happy.
[1166] When you hear him say stuff like that and hear them talk about young global leaders, you're like, what is happening here?
[1167] Like, you seem like a bad guy in a science fiction movie.
[1168] Like, he wears that crazy outfit that I showed you, the picture we have in the bathroom.
[1169] Yeah, yeah, yeah.
[1170] Like, could you be any more obvious that you're fucking insane?
[1171] That you're an insane, megalomaniacal, dictator-type character who wants to run the world, and you're literally dressing like a Star Wars character.
[1172] Yeah, there's got to be some weird sexual fetishism thing going on there, too.
[1173] Pull that picture up, Jamie.
[1174] You know that nutty picture.
[1175] That picture is so, every time I go to take a leak, I look at that picture.
[1176] That's why it's there.
[1177] Because I'm like, what kind of freak shit is that guy into when no one's around?
[1178] Because if you're dressing like that publicly and you're telling people they're going to eat bugs and that you're going to own nothing and everyone's...
[1179] And then when people catch you on it, you go, those are conspiracy theories.
[1180] We have nothing to say to those things.
[1181] Like, look at that fucking outfit.
[1182] He looks like a space druid.
[1183] That's awesome.
[1184] It's amazing.
[1185] Yeah.
[1186] The fact that he chose to leave his house dressed like that, like, yes, I will address the peons.
[1187] All the public needs to know that we're in control. Look at this.
[1188] He's got, like, that weird star on the right one, and whatever the fuck it says on the left one. Right. Yeah. Yeah. But on his vest, like, what is that thing on his vest? Is that the same thing?
[1189] What is that symbol?
[1190] It looks like a sun. Right, but what does it stand for?
[1191] I could look that up Fucking goat with a cross Oh, a fucking goat with a cross in his head It's a bull I think Whatever it is, a bull Jesus Christ, you fucking psychos.
[1192] Science, ingenuity, truth, I think, is what that says.
[1193] I almost wish that I didn't have this podcast and I could just go and hang out with those people.
[1194] But doesn't that look like the outfit?
[1195] Oh, wow, look at that one.
[1196] I don't think that's him.
[1197] No. Well, how about The Great Reset?
[1198] He wrote a book called The Great Reset, and then they denied that The Great Reset is the thing they're working towards.
[1199] Like, bro, you wrote a book.
[1200] Right, yeah.
[1201] You wrote a book.
[1202] It's called The Great Reset.
[1203] This is not like the fucking, like, the Steele dossier.
[1204] Like, Trump didn't write a book called The Steele Dossier and say, I never peed on anybody.
[1205] Like, this is what you're doing.
[1206] Right, right.
[1207] Or isn't New World Order another term that they...
[1208] Yes.
[1209] Right.
[1210] You see it on the background of, you know, some of their events, but...
[1211] How do they have influence, though?
[1212] Like, other than, like, are they just financing politicians?
[1213] And so they have this meeting where they get together and they say, oh, you know, this is what we want you to do.
[1214] And it's just understood that if you follow them...
[1215] those people and you do that, you'll have some sort of a career. Like, almost like a workshopping thing, like a conference for people to get together that are fucking aluminum-siding salesmen, and, like, they find out what's the new tech and what's the latest stuff and sales techniques. I assume it just works the same way that think tanks work in the United States, right? Like, if you, uh... there it goes. It was founded on the 24th of January in 1971 by German engineer and economist Klaus Schwab.
[1216] Jesus, he founded that in 71.
[1217] The foundation, which is mostly funded by its 1,000 member companies, typically global enterprises with more than 5 billion U.S. dollars in turnover, as well as public subsidies.
[1218] I'd like to find out what those subsidies are.
[1219] It views its own mission as improving the state of the world by engaging business, political, academic, and other leaders of society to shape global, regional, and industry agendas.
[1220] Boy, does that sound gross.
[1221] Yeah.
[1222] The WEF is mostly known for its annual meeting at the end of January in Davos, a mountain resort in the eastern Alps of Switzerland.
[1223] The meeting brings together some 3,000 paying members and selected participants, among whom are investors, business leaders, political leaders, economists, celebrities, and journalists, for up to five days to discuss global issues across 500 sessions.
[1224] There was some guy who was trying to interview someone from MSNBC. And, you know, he was, like, some independent journalist guy, and he was trying to talk to this guy on the street. Yeah, yeah. And the guy from MSNBC said something along the lines of, someone should knock that fucking guy out. Like, threatened this guy for asking him questions about speaking truth to power. Okay, thank you.
[1225] Where are you from?
[1226] Rebel News. Yes, but what is your interest?
[1228] What is your...
[1229] What do you mean?
[1230] I'm covering the news.
[1231] I'm doing what your bosses are supposed to be doing.
[1232] Why did you get so upset?
[1233] What's he so scared about?
[1234] No, I'm not scared.
[1235] No, you, not you, not you, your boss.
[1236] He seemed really scared.
[1237] He ran in there and called you out.
[1238] No, it's just we have to know who is out here.
[1239] My name's Avi Yemini.
[1240] I work for Rebel News.
[1241] We're reporters.
[1242] We do what CNBC is supposed to be doing and he seemed a bit upset that we were asking some questions in the public area outside.
[1243] Yeah, if you're here, it's public area.
[1244] It's no worries.
[1245] That's private area from here on.
[1246] So you're doing fine.
[1247] We wish you a very nice day.
[1248] You too.
[1249] So, like, I think he keeps going, because this is where they threaten to punch him out.
[1250] What is CNBC doing here?
[1251] You can't, I can't ask you.
[1252] No, you can't.
[1253] I'd rather you didn't put a camera in my face, thank you.
[1254] Really?
[1255] But you're here as an invited guest and you're an editor for CNBC.
[1256] Don't you think that's a bit of a conflict of interest?
[1257] I'd like you to go away.
[1258] I haven't agreed to an interview.
[1259] If you're doorstepping me, like, go away.
[1260] Don't touch the mic.
[1261] Don't touch the mic.
[1262] You're meant to be speaking truth to power.
[1263] Are you here just to take your marching orders?
[1264] Is that what you're here for?
[1265] Do you want to go away?
[1266] Not really.
[1267] I'm here to do what you should be doing.
[1268] Yeah?
[1269] Please take this out of my face.
[1270] I'm going to have you escorted off by security.
[1271] All right.
[1272] I like how we just get the...
[1273] The cigarette.
[1274] There you go.
[1275] So he goes inside, keep it going, because he goes inside, and that's when he says, someone should knock this fucking guy out.
[1276] Escort me. I want to hear what he actually says.
[1277] I'm paraphrasing.
[1278] I know he's calling security to escort me off the premises as well.
[1279] What's the problem?
[1280] You're my problem.
[1281] You've been very rude to me this morning.
[1282] You haven't asked me anything, so I'd like you to take the camera off me. I've literally asked you questions politely.
[1283] which should be your job.
[1284] The guy just littered, you fucking piece of shit.
[1285] That's your job, sir.
[1286] I'm doing your job.
[1287] I'm just not getting paid for it by Klaus Schwab.
[1288] You were inside as he walked in a bit upset.
[1289] What did you hear him say?
[1290] I heard him say, and I'm going to paraphrase here because I don't have the exact thing, but he came in sounding quite angry, saying, I'm going to punch him out.
[1291] Paraphrasing there, it was knock him out or punch him out, but, you know, he wanted to hit you.
[1292] Yeah, and there's an actual recording of it.
[1293] Okay.
[1294] But whatever.
[1295] We get it.
[1296] We get it.
[1297] Is that the same guy, the guy that got set up by the Jim Jefferies show back in the day?
[1298] The guy that he was talking to, the CIA.
[1299] Avi Yemini.
[1300] I'll check.
[1301] Please check.
[1302] I think he is.
[1303] I think he's a guy that got set up and they took a bunch of his words out of context and tried to pretend that he was saying something horrible and he wasn't.
[1304] And he had a recording of the entire event because he recorded on his cell phone knowing that they were going to set him up.
[1305] If it's not that guy, we have to edit this part out. Activist exposes Jim Jefferies' deceptive tactics. Yeah, that's him. Yeah, yeah. I actually have a story that's very similar to that. Really? I want to... Sorry, go ahead. No, that was just autoplay. So, the first time I got sent to cover the presidential campaign for Rolling Stone, it was in 2004, and I was on the plane with Kerry. You know, it's teeming with journalists, obviously. And there was a story that came out, probably everybody's forgotten it, but there was a story that turned out to be fake that Matt Drudge put out about, well, maybe it wasn't fake, but it was at least not proved, that Kerry had a secret mistress in Africa, right? And if you look this up, you'll find stories about it that were out there. And Kerry came out in the morning, and all the journalists were sort of peppering him with questions about the mistress. And, you know, I don't care about John Kerry, but I thought it was odd that they went straight from reading something where there's no evidence to posing this question and having it on camera, right? So I asked some of the journalists, and I was kind of the new kid. I said, why were you doing that?
[1306] Like on the basis of what were you asking that question?
[1307] And the minute they perceived that I was actually trying to ask another journalist a question, like for a story, this one guy, he sort of steps in front of all the other ones.
[1308] And he says, dude, this is a fucking no -fly zone.
[1309] Right?
[1310] Like, in other words, we don't cover each other in here, right?
[1311] Like, that's, that was the message.
[1312] And, like, from that point forward, I was always in the back of the plane, like, with the tech people, whenever I covered presidential politics.
[1313] Because the press does not like it, even though it is a crucial part of the story, it denies that it has that role.
[1314] And it insists on not being covered.
[1315] and you can see how nervous these guys get when a camera's on them.
[1316] Like, oh, my, you're putting a camera on me?
[1317] When I Google John Kerry's Secret Mistress Africa, it brings me down a John Edwards hole, which is fucking weird.
[1318] That is bizarre.
[1319] But when you put it into Bing, I get a story.
[1320] So Microsoft won't censor it, but Google will?
[1321] There was more, there's a couple stories about that.
[1322] Like John Kerry stuff comes up when I look at that.
[1323] Different John Kerry stuff?
[1324] Yeah.
[1325] It doesn't redirect to John Edwards, you know.
[1326] So that's even weird.
[1327] That's even weird, right?
[1328] Like, you're getting different versions of reality based on what search engine?
[1329] Well, you most certainly do.
[1330] If you use DuckDuckGo, you just get what's out there.
[1331] Right.
[1332] When you use Google, you get really, like, I noticed that during the pandemic.
[1333] There was a doctor that had a heart attack immediately after taking his second shot of, I think it was... And so I was like, what is that about?
[1334] And this was like very early on.
[1335] And I googled it.
[1336] I could not find it.
[1337] I could not find the story.
[1338] And then I went to DuckDuckGo immediately.
[1339] And I was like, whoa, this is wild.
[1340] Like, they're hiding this story.
[1341] Right.
[1342] So they would hide certain stories because they thought that they would increase vaccine hesitancy.
[1343] Yeah.
[1344] Yeah.
[1345] See, that's terrifying.
[1346] It's terrifying.
[1347] You're carrying water for the pharmaceutical companies, which is really spooky.
[1348] Because, I mean, you want to talk about the people that have had, like, the biggest criminal fines in U.S. history.
[1349] And they've been lying about the adverse effects of their medications forever.
[1350] Right.
[1351] Withholding information in order to gain a profit.
[1352] And then you're hiding stories.
[1353] Right.
[1354] That may implicate them in someone's death.
[1355] It might be somehow or another involved.
[1356] Like, we don't know.
[1357] And you're just hiding it?
[1358] Why are you hiding that story?
[1359] I googled Florida doctor, adverse reaction, vaccine, heart attack. Instantly, on DuckDuckGo,
[1363] I get all these articles about this guy that died.
[1364] Couldn't find shit on Google.
[1365] Wow.
[1366] Well, I mean, look, they've gotten very sophisticated in their ability to suppress certain things, you know.
[1367] And, and, you know, this is where you see the influence of, you know, how money works with the content suppression thing.
[1368] I mean, like, you take something like the Digital Forensic Research Lab at the Atlantic Council.
[1369] It's one of the things that these platforms use to decide whether or not a news story is true.
[1370] But if you look at where they get their money, you know, it's the German Marshall Fund, which is a mishmash of sort of sovereign wealth funds.
[1371] and, you know, Fortune 500 companies.
[1372] So it's, it's, you're paying for the fact check essentially, right?
[1373] Like, that's how, all these sites that are allegedly deciding what's true and what's not, they're all influenced, you know.
[1374] And that's, that's another thing that drives me crazy, is this persistent belief that people have that you can objectively decide what is true and what is not somehow.
[1375] It doesn't work like that.
[1376] The only way it works is, over time, you come to trust some stations over others, because they have a record of being more right about something.
[1377] That's the only way it works.
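To make that track-record idea concrete, here is a minimal sketch in Python. Everything in it is hypothetical, the outlet names, the story counts, and the neutral 0.5 default for an outlet with no record; it illustrates trust-as-earned-accuracy, not anyone's actual rating system.

```python
# A toy model of trust as earned track record: an outlet's trust score
# is just the fraction of its checkable stories that later held up.
# Outlet names and story counts below are invented for illustration.
from collections import defaultdict

# outlet -> [stories_confirmed, stories_checked]
record = defaultdict(lambda: [0, 0])

def log_story(outlet: str, held_up: bool) -> None:
    """Record one story whose accuracy was eventually settled."""
    confirmed, checked = record[outlet]
    record[outlet] = [confirmed + int(held_up), checked + 1]

def trust(outlet: str) -> float:
    """Share of an outlet's settled stories that turned out to be right."""
    confirmed, checked = record[outlet]
    return confirmed / checked if checked else 0.5  # no record: no opinion yet

log_story("outlet_a", True)
log_story("outlet_a", True)
log_story("outlet_a", False)
log_story("outlet_b", False)

print(round(trust("outlet_a"), 2))  # 0.67, earned over time
print(trust("outlet_b"))            # 0.0, also earned
```

The point of the sketch is that the score falls out of the history; nobody decrees it up front, which is the opposite of the fact-check model being criticized here.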
[1378] Yeah, independent fact checkers, when the independent fact checkers review certain things and you find them on social media where they have like a little warning or a little notification afterwards.
[1379] And you actually go down the rabbit hole and say, well, what have you done?
[1380] Like, what is the, a lot of it is subjective.
[1381] They've just decided that this is not true or decided this is partially true.
[1382] Or it's missing context.
[1383] That's our favorite thing.
[1384] Yeah.
[1385] Missing context is great.
[1386] Missing context, right?
[1387] Like, oh, it's true, but here are eight reasons why you should think otherwise.
[1388] Like, you know, like, that's not our job.
[1389] You know, and by the way, reporting by itself is fact checking.
[1390] That's the whole point of it.
[1391] Like, we don't need a separate thing called fact-checking, right, to go with reporting. I don't... that whole phenomenon drives me nuts.
[1392] It is. It's weird, though, that we don't have... I mean, it used to be Snopes, and a lot of people used to go to Snopes. But then I read about Snopes, and you find out all the wacky shit about the people that are involved in Snopes, and that the guy who was the head of it is, like, very heavily left-leaning, and then he married a prostitute, and, like, all kinds of wild shit.
[1393] It's like, Snopes is not, like, some rock-solid, independent, purely objective organization that's dedicated to the dissemination of truthful information.
[1395] Like, no, they're, like, fucking heavily left-leaning.
[1396] Right, right.
[1397] And in subjective circumstances when it's subjective whether or not this is real or not real or lacking context or whatever, like they can give, they can paint a narrative that at least is biased.
[1398] I mean, every outlet is subjective, but that's why you have to allow them all.
[1399] Yes.
[1400] You know what I mean?
[1401] Yeah.
[1402] Like, we're growing up, let's read all the stuff, and then we'll decide.
[1403] Yeah.
[1404] But they don't want to do it that way.
[1405] They want to have a hierarchical system that decides what's more...
[1406] Like, you talk about Google's search engine.
[1407] Like, they changed, they had a thing called Project Owl that they implemented in, I think it was 2017, where they changed their, um, their way of measuring what stories come up first, and they shifted to a model that emphasized what they call authority.
[1408] And when I asked them what that meant, the analogy they gave was: think about if you search for baseball. Previously, you might have gotten your local Little League.
[1409] Now you're going to get MLB .com, right?
[1410] So whatever we consider the more authoritative source, and that's based on surveys of people, what people think is authoritative, that's what's going to come up first.
[1412] So instead of, if you search for, let's just say, Trotskyism, instead of getting the world's leading Trotskyist website, right, which is the World Socialist Web Site, you will get, like, a New York Times story about Trotskyism instead, right?
[1413] Because they want to push you towards the authoritative source.
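The mechanics being described amount to a re-ranking step: blend how well a page matches the query with a site-level authority score before sorting. Here is a toy sketch of that idea; the URLs, the scores, and the 0.7 weight are all invented, since the actual Project Owl ranking was never published.

```python
# Toy "authority"-weighted ranking: pure query relevance gets blended
# with a site-level authority score, so a more "authoritative" site can
# outrank a more relevant one. All numbers here are made up.

AUTHORITY_WEIGHT = 0.7  # hypothetical knob: 0 = relevance only, 1 = authority only

results = [
    # (url, relevance_to_query, site_authority)
    ("local-little-league.example.org", 0.95, 0.20),
    ("mlb.com",                         0.80, 0.90),
]

def blended_score(result):
    _, relevance, authority = result
    return (1 - AUTHORITY_WEIGHT) * relevance + AUTHORITY_WEIGHT * authority

ranked = sorted(results, key=blended_score, reverse=True)
for url, _, _ in ranked:
    print(url)  # mlb.com first, despite the local site matching the query better
```

Turn the weight down and the Little League site comes back; what most people see lives entirely in that one subjective knob.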
[1414] Right.
[1415] But that's subjective, right?
[1416] And again, it's hierarchical.
[1417] And it's away from the spirit of how we would like to ingest information, which is just let's see all of it and make our own decision.
[1418] And if you did come up with your own search engine or your own fact-checking organization that decided what's true or is not true, the real fear would be that that would eventually get compromised, and that someone would come
[1419] along and they pay for your advertising and do this and do that and then slowly but surely get their hooks into you.
[1420] Which is what they've done with Wikipedia.
[1421] Wikipedia was originally like this open source, you know, kind of free thing.
[1422] Now, like just, I mean, I'm discovering this now with the Twitter files.
[1423] You can't get Twitter files information into Wikipedia because they will not recognize what they call like a, I forget the term they use.
[1424] It's not an authoritative source.
[1425] It's like a recognized source or something like that.
[1426] So as long as the big newspapers don't cover it, they don't have a cite that allows them to put it into Wikipedia, that allows the algorithm to put it in.
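Mechanically, a gate like that behaves as a domain allowlist: a claim gets in only if its citation resolves to an approved outlet. A minimal sketch, with a hypothetical three-domain list standing in for Wikipedia's actual editor-maintained reliable-sources policy:

```python
# Minimal sketch of a "recognized sources" gate: the content of the
# claim never gets evaluated, only the domain of the citation does.
# The allowlist below is hypothetical. Requires Python 3.9+ (removeprefix).

RECOGNIZED_SOURCES = {"nytimes.com", "washingtonpost.com", "reuters.com"}

def can_cite(citation_url: str) -> bool:
    # Reduce the URL to its bare domain for the allowlist lookup.
    domain = citation_url.split("//")[-1].split("/")[0]
    domain = domain.removeprefix("www.")
    return domain in RECOGNIZED_SOURCES

print(can_cite("https://www.reuters.com/world/some-story"))   # True
print(can_cite("https://example-substack.com/twitter-files")) # False: not on
# the list, so whatever the underlying documents show, the claim stays out.
```

Notice that nothing in the function looks at whether the claim is true; coverage by a listed outlet is the whole test, which is exactly the catch-22 described here.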
[1427] Has no mainstream media source covered the Twitter files?
[1428] Not really.
[1429] No. They've done hit pieces on me and on Elon and on Bari, but they haven't covered the stuff in the stories.
[1431] Which is wild.
[1432] Yeah, I mean, like, you know, that is wild.
[1433] No matter what you think about me or Elon Musk or whatever, like the stuff in the files is clearly newsworthy.
[1434] I mean, if you didn't know... the idea that the FBI and Homeland Security having a system of sending moderation requests to every, you know, internet platform in the country, the idea that that's not a news story is insane to me. I can't even process that. You have to make a conscious decision to not do that story, which is what they've done, you know, so.
[1435] Which is really indicting.
[1436] Yeah, and of course, you know, they've done a gazillion stories about, you know, how I've become this evil sellout right-wing character.
[1437] The Washington Post actually humorously described me as a conservative journalist, and they scrubbed it within a day because there was so much blowback on Twitter.
[1438] But I don't care about that sometimes because I'm used to it by now, but it's a message that's sent to other journalists, which is if you step outside the club, we're just going to dump buckets of shit on you all day long.
[1439] and that's going to be your life forever, right?
[1440] Like, you're just, and you have to get used to that.
[1441] And so, you know, that's a new part of the business.
[1442] Like, once when you broke a big story, you got like plaudits from your peers.
[1443] And now, you know, it's a very different thing.
[1444] But you still got to do it, definitely.
[1445] You still got to do it.
[1446] But more importantly, when they continue, to do that and call someone like you a right -wing journalist or call someone a far -right this or an alt -write that.
[1447] And then people objectively know that that's not true.
[1448] Then it undermines all of their credibility and slowly but surely dissolves all confidence that people have in every other story they come out with.
[1449] And that's what we're seeing.
[1450] That's why, like, what the New York Times is today, to the generation that's coming up today... I used to deliver the New York Times just because it was the New York Times.
[1451] It wasn't even profitable for me. I thought it was cool that I was delivering the New York Times because I delivered the Boston Globe and I delivered the Boston Herald and I got a route for the Times.
[1452] That's awesome.
[1453] And the Times is a pain in the ass because I had to drive. Like, if I was doing the Boston Globe, which was the most popular paper...
[1454] Wait, what town were you in?
[1455] Boston.
[1456] Oh, okay.
[1457] Newton.
[1458] Newton.
[1459] So I would deliver, I would get up every morning.
[1460] I'd work 365 days a year.
[1461] You have to deliver every day if you have a paper route.
[1462] I used to have a paper route in Massachusetts, too.
[1463] So I would get up and I would go to the depot.
[1464] I'd pick up my Heralds, I'd pick up my Globes at a different place, then I'd go and get my New York Times at a different place.
[1465] And the New York Times is a nightmare because, like, if I was going to deliver the Boston Globe, if I'm on one street, I might have ten houses on the street.
[1466] But the fucking Times, I might have one, and then might have to go a mile to my next house.
[1467] And you've got to carry the bag all the way, right?
[1468] And I would try to coordinate my routes, right?
[1469] So I would have a route that was all the Globes, and then I'd have a route that was the Heralds, and then I would have the Times. And the Times was a nightmare, but I delivered the Times just because it was the New York Times. Because you thought it was cool. It was cool, yeah. They have a blue plastic bag; everyone else had a white one. And, you know, it's like, if that guy got the Times delivered, that's a smart dude. That's a guy who was reading the New York Times. And I read the New York Times. So I was like, you know, I'm going to deliver the Times. Like, I felt like I was a cooler person for delivering the New York Times. That does not exist with 21-year-old people of today.
[1470] When I was 21, that was, like... I mean, I had aspired to be a more intelligent, more well-read person, and that to me was a symbol of that.
[1471] And I would get a free copy of it every day, and I would read it after I was done working.
[1472] That doesn't exist anymore.
[1473] Nobody gives a fuck about the New York Times, and they think about them as, like, some left-wing Tumblr blog.
[1474] Right.
[1475] That's what it is.
[1476] It's like...
[1477] Yeah, it's not even, it's not even like the establishment paper.
[1478] No. It's just, yeah, like you say, it's like a Facebook group for a small group of kind of, you know, wealthy people who, you know, all went to the same schools.
[1479] I don't know.
[1480] It's, it's dull and uninformative at the same time.
[1481] Ideologically driven beyond belief.
[1482] Like, 100% for sure you're not getting an objective analysis of whatever story it is. You're getting it from a very particular slant, one way or another.
[1484] And that's not sustainable.
[1485] It's not sustainable with this new group of people that are coming up that have the internet.
[1486] These kids have access to all these independent people talking about things, whether they're doing it on YouTube or podcasts or Substack or whatever it is.
[1487] They get access to independent people that are talking about real information.
[1488] And every time the New York Times prints bullshit, every time the Washington Post prints bullshit...
[1489] It further undermines their credibility and further slides them down this inevitable road that they're on.
[1490] Right.
[1491] Yeah.
[1492] It's a road to obscurity.
[1493] Nobody's going to care anymore.
[1494] It's like you're seeing it take place at CNN.
[1495] You're seeing it take place with all these cable news networks.
[1496] You're seeing it take place with late night television.
[1497] No one gives a fuck about it anymore.
[1498] I know.
[1499] And it's slowly, you have a bad business model and you're not adapting.
[1500] Yeah.
[1501] Yeah.
[1502] The creativity is gone for late-night comedy, right?
[1504] Like, that's not really a thing anymore.
[1505] You got Jimmy Fallon doing a song about different variants of COVID.
[1506] Yeah.
[1507] Did you see that?
[1508] Yeah.
[1509] Could we see it, though?
[1510] Have you seen it?
[1511] I saw the Anti-Vax Barbie one, which was amazing.
[1512] I didn't see that one.
[1513] But this one is so strange.
[1514] It's like, I mean, what did they drug him up with to get him to agree to do this?
[1515] Like, I want to be in the meeting. I want to be a fly on the wall in that meeting where they go, okay, Jimmy, this is what we're going to do. You're going to be singing about all the different variants, and it's like, here's a song, and you're going to be dancing. And he's like, okay, okay, okay. What is happening there? So that's supposed to be... is that Devo that he's... No, Love Shack. Yeah. Oh, that's right, it's B-52's, right? No. Whose is Love Shack? Was it B-52's?
[1516] That sounds right.
[1517] Yeah, it is the B-52's, right?
[1518] Yeah.
[1519] That's it, right?
[1520] Yeah, you're right.
[1521] Yeah, you're right, of course, of course, yes.
[1522] Oh, God.
[1523] But that's just, like, straight up...
[1524] What is that?
[1525] That's insanity.
[1526] Like, look at his little dance.
[1527] Like, there's no enthusiasm in his face.
[1528] It's like he's drugged.
[1529] I mean, it kind of reminds you of, like, like a USO thing, right?
[1530] I don't know what it reminds me of.
[1531] It reminds me of madness.
[1532] It's just pure madness.
Like, that's not what he got into show business for.
Like, someone talked him into doing that.
[1535] Like, that's not his idea.
[1536] Somebody had a meeting.
[1537] I don't think it's his idea.
30 million subscribers, and that video did not do very well.
104,000 views, and 4,000 of them are us.
[1540] I know we played it twice or three times now.
[1541] That's really.
[1542] Nobody gives a fuck about that.
[1543] That's nonsense.
[1544] How do they have that many subscribers, too?
[1545] They probably buy them.
[1546] That's the problem, too.
[1547] You're finding out how many people buy social media followers.
[1548] Right, right.
[1549] Yeah.
[1550] And that people were selling blue check marks at Twitter.
[1551] That's the other weird thing, where they were selling verified accounts.
[1552] So, like, you could pay someone, and they would get you verified.
People were spending, like, $5,000, $10,000.
How little of a life do you have to have for that to matter that much to you?
[1555] I guess if you're like an independent journalist or you're some sort of a YouTuber that's trying, like if you have that check next to your name, that gives you more credibility.
I love the fact they were removing check marks, like, we're going to take away your verification.
Oh yeah, they did that to Thomas Chatterton Williams.
[1558] Remember that after the Harper's letter?
[1559] No. What was that about?
[1560] We still don't know.
They just took his checkmark. Boy. Um, he was the guy who organized the Harper's letter, the, you know, the pro-free speech declaration. No, I'm not aware of that, or I forgot about it. It was, like, a petition where he organized a bunch of high-profile people basically to say that, like, canceling is bad and we should all respect each other's opinions and, you know, support academic freedom and that sort of thing.
[1562] So we got people like Salman Rushdie and Noam Chomsky to sign onto it.
But then Bari Weiss and J.K. Rowling were also on the list.
[1564] And it soon became a thing in the media that to be on the Harper's letter was like, it was like membership in a hate society.
Uh, and he was just absolutely dumped upon. Um, you know, he was denounced as a racist, even though he's black. Uh, and, you know, he got his check taken away. I don't know if he had it; he was trying to get verified, it seems like. I'm reading this article about it right now, and he's saying, like, he tried to get verified here for the second time. Maybe he got it taken away, and I think he got it taken away, because he had it at one point.
[1566] It says denied verification.
[1567] Did he definitely have it at one point in time?
I don't, I'm just saying I don't know; that's just what this story sounds like it was.
[1569] It sounds like he was trying to get it again.
Isn't that... that is so amazing.
I mean, this brings me back to that conversation I had with that YouTube lady who said it's hate speech.
[1572] It is so amazing that you would say that freedom of speech and to be able to talk about things openly is somehow hate speech.
[1573] Yeah, no, the Harper's Letter thing was nuts because if you actually read the Harper's letter, it's so anodyne.
[1574] It barely says anything at all.
[1575] It's very kumbaya.
[1576] It's just like, hey, let's all get along.
[1577] Like, you know, let's not get people fired for saying harmless things, that kind of thing.
[1578] Crazy talk.
[1579] Right, yeah.
[1580] Outright crazy talk.
[1581] Yeah.
[1582] And it became a huge thing.
[1584] I mean, it changed media.
Like, it ended up being one of the reasons that Matt Yglesias left Vox, which he co-founded, because he signed the letter and there was somebody on staff who felt threatened by that.
It was a whole kerfuffle within the media industry, which is, you know, endlessly navel-gazing anyway.
But, yeah, no, you can become the target of one of those ridiculous villain-for-the-day, you know, campaigns really, really easily now.
[1588] Do you think that that is changing because now people aren't scared to speak their mind on Twitter?
[1589] Like, you're seeing so much pushback.
When someone types something on Twitter now and it's ridiculous, now people aren't scared to go in after it.
They're not worried about losing their account, which they were before, which is, I think, one of the more interesting things about Elon Musk buying Twitter: that you are seeing a much more vigorous debate.
[1592] You're seeing a lot more trolling.
You're seeing a lot more people that are posting, like, ridiculous GIFs, like to make fun of people after they say something stupid.
[1594] Yeah.
[1595] Yeah.
[1596] I mean, I hope that's one of the results, right?
Because the old Twitter was, you know, was just a grindstone of official messaging where if you said something, like, a micrometer outside whatever the narrative was, you could expect to just be descended upon by all these people. And you just ended up not wanting to bother, right? So you wouldn't say anything. But, you know, I hope people are feeling encouraged to say more now. But, as you know, as my experience shows, you can still end up getting lots and lots of shit in the media for doing the wrong thing.
[1599] And that can last quite a long time.
[1600] But I mean, how much influence does that media really have anymore?
[1601] I mean, and just because something's written down, how much different is it than people just having a conversation and putting that conversation on YouTube?
[1602] Like the actual idea that someone writing an article about someone, like a hit piece on you, for instance, that that actually has an impact anymore.
It's really no different than two people on some sort of a progressive podcast talking about, oh, Matt Taibbi's a right-winger now.
[1604] It's like crazy.
[1605] Like, what happened to him?
[1606] He used to be the guy for the Rolling Stone.
[1607] He was so progressive.
And now he's on the right.
[1609] He used to go after, you know, Wall Street.
[1610] Now, it's not, those articles don't work.
[1611] No, they don't work.
[1612] And people aren't reading them.
[1613] Yeah.
[1614] And look, I mean, I mean, you know this too, right?
because when there was that whole movement to try to get Bernie Sanders to denounce you and everything like that, like after you endorsed him, if you're not afraid of whatever the ultimate consequence is, like, you know, you learn that these cancellation episodes are survivable.
[1616] Once that happens, you lose your fear of it pretty quickly.
And I think... Well, they emboldened people after they've survived it. That's the other thing. Yeah. That now you realize, oh, this is okay. Like, I could do this. Yeah. Not only that. Like, when all that thing was happening with Spotify with me, I gained two million subscribers in a month. Right. So it worked the opposite way. Like, Patrick Bet-David did a whole thing, and he said, he goes, in my estimation, the amount of publicity they gave him was worth about 300 million dollars. That's great. Yeah. But that's the new world. Whereas using those methods before we had independent journalism, before we had the internet, before we had YouTube and all these different ways that you could just get a message out, it was a death sentence. Yeah. If they all came at you in targeted fashion like they did, you're fucked. Like, they were going to change the narrative of you. And it changed the narrative already with some people. Some people still believe certain things about me because they read it on CNN or they heard it on CNN. Sure.
[1618] Sure.
[1619] There's no way you can get around that, but for most people that are actually paying attention, all it does is undermine the credibility of those sources.
Anybody who's calling you a right-wing journalist.
[1621] Like, anybody who knows you knows that's straight horseshit.
[1622] Right.
So, like, the amount of damage they're doing to their own reputation by printing that, the individual author and the publication itself. The publication should be terrified of anybody that would be so willing to undermine their credibility by calling you a right-wing journalist over just one point about one thing that they disagree with you on.
[1624] So they're going to make this blanket statement that's so patently untrue and so obviously researchable.
[1625] Right.
If that was my news organization, I'd be like, what the fuck are you doing?
Do you know what damage you're going to do by calling him a right-wing journalist?
Now, what, 100,000 people are going to read this?
You've got 50,000 people that now think you're capable of being full of shit.
[1630] Right.
[1631] And they're probably going to go over to the substack or they're going to Spotify or whatever.
[1632] Yes.
[1633] I mean, like, that's, yeah, they haven't figured that out yet.
[1634] But, you know, there's still the collateral damage of, you know, they're able to say nasty things about you that people hear, which is not fun.
[1635] Yeah.
[1636] But, you know, you're right.
[1637] There was a moment before independent media where if, if they all decided to do it, you were done, you know?
I mean, I remember the first time that, you know, that I knew there was a story coming out about me and my past with the eXile, and I knew I was in serious trouble.
[1639] And, like, at the time, there was no alternative.
Like, you know, if the club kicked you out, there was nowhere else to go in journalism, or in any kind of media job.
And, uh, but that's different. That's no longer the case. They don't have that absolute power anymore. They have less power than the independents, which is nuts. And it happened so quickly. Right, right. I mean, if you look at, like, what Krystal and Saagar have done with Breaking Points. Breaking Points is fucking gigantic now. And I remember when they weren't independent, they were thinking about going independent, and I was like, I'll help you. I'm like, we can do this. You guys are fantastic. You're honest.
[1642] You talk about things.
[1643] I mean, I might not agree with you, but you're talking about things based on your actual interpretation of what's going on and your opinions on these things.
[1644] That's what people want.
[1645] Yeah.
[1646] And they had a concept that at the time was forbidden, which was people on the opposite end of the political spectrum trying to have a civilized conversation.
Remember when they used to have a show like that on Fox, Hannity & Colmes?
[1648] Right.
[1649] Remember that?
[1650] But that was sort of like pro wrestling.
Like, Colmes' job was to get pinned.
[1652] Right.
[1653] You know?
[1654] I mean, it wasn't a real fight.
[1655] Oh, he tapped out quick, too.
[1656] Yeah.
[1657] Yeah, he always got pinned.
[1658] But that's what the show was.
[1659] It was like, we got a guy from the left.
[1660] We got a guy from the right.
[1661] And, you know, Hannity went on to become a big star.
And where did Colmes go?
He quit.
[1664] Probably got brain damage from being pinned so many times.
[1665] That would be an interesting one.
Whatever happened to Alan Colmes, that would be an interesting question to...
[1667] Does he have a podcast or anything?
[1668] Where is that guy?
[1669] But he was just so wishy -washy.
He was like the perfect caricature of a left-wing guy confronted by a strong, right-wing, pro-America guy.
[1671] Right.
[1672] You know, Sean Hannity.
[1673] Right.
[1674] I'm friends with Trump.
[1675] You know, like...
[1676] Yeah, no, I remember I knew a guy named Jeff Cohen who was briefly on Crossfire.
He played the "from the left" person.
[1678] And he told me, like, afterwards that the role of the liberal in that show was to be somebody who couldn't punch back.
[1679] Right?
Like, they were trying not so much to talk about the politics, but to highlight the kind of weeniness of that character, right?
[1681] Yeah.
And so, because he didn't go along with it, he didn't last terribly long in that role.
[1683] But, but that's the kind of person they wanted.
They wanted somebody who was kind of snivelly, retreating.
[1685] Yeah.
[1686] And the conservative was always like this attacking aggressive, you know, sort of no nonsense.
[1687] No nonsense.
[1688] Like sort of cackling, confident character.
[1689] It made for good TV, but it, you know, as politics, it was totally nuts.
[1690] So.
[1691] Yeah.
[1692] So what Crystal and Saga are doing is like the better version of that.
[1693] Right, yeah.
[1694] But an honest version of that.
[1695] An honest version of it.
And they're real friends.
They like each other.
[1698] He is on the right.
[1699] She is on the left.
[1700] And they disagree on things.
[1701] But it works.
[1702] It really works.
[1703] Right.
[1704] And why is that forbidden?
[1705] That's a really interesting question, right?
[1706] Like, what?
[1707] Why do people not want us to know that it's possible for people on the right and the left to talk in a civilized way and disagree on some things but still get along?
[1708] Like, why doesn't anybody want us to know that?
[1709] Like, I think that's a question that's worth exploring.
[1710] Like, why does CBS not want us to know that?
[1711] Why does Fox, for that matter, not want us to know that?
Well, I think the fear is that if you do allow that, like, say if you're NBC or CNBC, and you allow this right-left thing to happen on your show.
[1713] What if the person on the right makes a really good point?
[1714] Mm -hmm.
[1715] And what if they swing people more towards the right?
[1716] Like, what if this person's on the air multiple times, and they're really compelling?
[1717] And maybe they're better at arguing, or maybe they're more reasonable, or maybe they're more objective, or maybe they're more calm.
[1718] Maybe whatever about them is more attractive than the person who's on the left.
Now, all of a sudden, you've got a problem, because now you have people that are tuning in specifically for this one woman or one man who is right-leaning on a network that has a progressive agenda.
You have a left-wing agenda.
[1721] Right.
And you're funded by these left-wing super PACs and left-wing special interest groups and left-wing advertising, and you're like, hey, hey, hey, what is this fucking abortion-is-murder argument that this guy just made reasonably?
What is this, you know, like, term-limits argument that this person made?
What is this argument this person made about getting money out of politics?
[1725] Are you fucking crazy?
[1726] Right.
[1727] Get that off of there.
[1728] Yeah.
And that's too bad, because what ends up happening is we end up in this sort of system, a bifurcated media where everybody's in armed camps. Like, they don't talk to one another, because there's no model for that in American society.
We don't have a place where we can see people of differing political opinions getting along with one another and acting like civilized human beings. Like, it doesn't really exist in establishment culture, establishment media. Um, but that's why people are rejecting it, because they know, like, they're picking up their kids at school and talking to their neighbors, who they know have totally different politics, and they're getting along fine with them. Yeah. Right. And so they know it's a lie, you know. And I think it's exhausting.
[1731] And it's starting to run its course, which is great because I've been waiting for it to run its course for a long time.
[1732] So it's cool to see.
[1733] Well, that's why the World Economic Forum and things along those lines are so fascinating.
[1734] Because you can see that they're the ones that are holding the strings that dangle the narratives in front of the people that make them attractive.
[1735] And then you realize, like, well, this is not our real problem.
[1736] Our problem is not really these narratives.
[1737] Our problem is who's promoting these narratives, and what are they doing while they're promoting that and we're distracted?
[1738] They're trying to institute a centralized digital currency.
[1739] They're trying to give people vaccine passports and some sort of a social credit score system.
They're trying to do all sorts of weird methods of control that you're not going to be able to get out of, whether you're on the left or on the right.
[1741] It's going to fuck up everybody's life.
And in the meantime, we're arguing about who's right, Greta Thunberg or Andrew Tate.
[1743] Right.
[1744] Like, it's like these distractions that they put in front of us in the media that get us so hyped up.
While real shit is going down, there's real decisions that could be made that might affect you forever: the amount of freedom you have, your ability to travel. Like in China, you say the wrong thing.
[1746] You can't buy a plane ticket.
[1747] You can't go anywhere.
[1748] Oh, sorry, you're not allowed to buy a home.
We saw you tweet about something we found disagreeable.
Well, we have to worry about that now, too, in the States, though.
And, right, you have to worry, if you cross a certain line, are they going to make it difficult for you to process your credit card transactions? Will you be unemployable? Will you not be able to use PayPal anymore? You know, right, like, it's stuff like that, that kind of creeping dystopian sort of system of control.
[1753] It's a big news story.
[1754] And I think people recognize that it's a serious thing.
[1755] But we don't, we don't see it talked about very much in the corporate press because, again, they're in favor of it.
[1756] Right.
[1757] So that, but I'm terrified by all that stuff.
When PayPal was saying they were going to fine people for misinformation.
[1759] And I'm like, hey, hey, hey, you guys are just supposed to be a way I can buy things on.
[1760] Yeah, that's it.
When you're saying misinformation, misinformation on what?
[1762] On what platform?
[1763] On any platform?
[1764] What if I'm at home?
[1765] Are you listening?
[1766] Like, how the fuck do you, what are you saying when you're saying misinformation?
You're going to fine me?
[1768] Where's that money going?
[1769] You're stealing money from me because you don't agree.
[1770] And what happens if it turns out that misinformation turns out to be true?
Yeah, no, I did a story about, I think it was MintPress, they had funds frozen
[1772] by PayPal, if I remember correctly.
[1773] And, yeah, but the idea that this company, they should be doing one thing.
[1774] They're trying to make a transaction happen.
[1775] Yeah.
[1776] Why are they in the truth business?
[1777] Exactly.
[1778] Like, that can only happen if something has gone wildly wrong in society and somebody feels the need to start using all these different pressure points to control people.
Like whether or not you can process credit card transactions, or, you know, PayPal, or if you leave a record of a certain kind of web surfing, maybe that's going to be a negative that will appear somewhere.
[1780] Like, yeah, that stuff is all scary.
[1781] Like we, it's a, it's a serious problem, I think.
When it comes to money, you can freeze funds, you can move money around, you can withdraw money from someone's account because you think... what was it, like $2,500 or something along those lines?
[1783] Something like that, yeah.
[1784] Some fine that you would get for misinformation.
So if your grandma posts some crazy shit about Trump really winning the election, you know, you have a crazy QAnon grandma, like, they're going to steal her money?
[1786] Yeah.
[1787] Like, what are you saying?
[1788] Right.
Or the Canadian trucker thing, where it's, you know, the GoFundMe thing, right?
[1790] Like, you raise a whole bunch of money.
But not just the GoFundMe thing; they froze their fucking bank accounts.
[1792] And all these people did was protest.
[1793] Right.
[1794] And you don't have to agree with the protest, but you certainly have to be freaked out by their response to it.
[1795] Well, also freaked out that Trudeau is one of the young global leaders at the World Economic Forum.
[1796] And that Trudeau labeled the truckers as misogynists and racists.
[1797] Right.
[1798] Like, says who?
[1799] Right.
[1800] Where are you getting this from?
You're not even, like, pointing to a thing they've said.
You're just saying that in this blanket statement to try to defuse everything they've said and everything they stand for.
It's so transparent, and such a checkers move in a world of 4D chess.
[1804] Yeah, I mean, I always think back to this moment in the 2016 presidential campaign when Bernie was drawing some blood against Hillary.
by, you know, talking about the gigantic speaking fees she was taking from banks.
[1806] And they tried to throw a bunch of stuff back at him.
[1807] None of it worked until one day she came out and she said this thing.
[1808] If we break up the banks tomorrow, will that end racism?
And suddenly this idea sort of popped out into the ether that talking about Hillary Clinton's ties to banks was somehow racist or somehow not progressive, right?
And for Sanders, who had grown up his whole life in that ecosystem, there was nothing more terrifying than being accused of racism or misogyny or whatever it was, because they came out with the Bernie Bro thing right after that.
[1811] Yeah, exactly.
[1812] And it was a disciplinary method, basically, right?
[1813] Like if you, if you go to a certain place, we're going to start dropping these words on you.
[1814] And those words are not survivable in certain areas.
[1815] Right.
[1816] You know, so it's very effective.
[1817] But I think it can't be effective forever, I don't think.
[1818] No, I think that effectiveness is waning like a bucket with a hole in the bottom of it.
[1819] And I don't think there's any escape.
[1820] I think that the path they've put themselves on, you can't return from it.
[1821] And I think they're doomed.
[1822] I really do.
[1823] I think we're looking at a future where almost all credible media is independent.
[1824] I really do.
[1825] I just don't think they're going to make it.
I think the only thing that Hollywood, and these entertainment corporations, will be good for is creating things that are exorbitantly expensive, like films with special effects.
Well, they are good at that stuff.
[1828] Yeah, but that's the only thing they'll be able to do.
They make good action movies with superheroes in them, I guess.
[1830] I mean, you can make films with iPhones now.
[1831] Right.
[1832] I mean, a real film.
Like, I have a guy that was on the podcast, his name's Sonny, from Best Ever Food Review Show.
[1834] He goes to these countries and samples their exotic foods and travels and sees their cultures and hangs out with these tribal people.
[1835] Really interesting show on YouTube.
[1836] He went to Egypt and they confiscated all of their equipment, everything.
They took all of their cameras, even though they had visas, they had working visas.
[1838] He said it's the worst place to film.
[1839] They're very restrictive.
[1840] And since then, because he made a video about it, they've actually changed the laws.
[1841] Point being, he decided to film the entire episode on iPhones, and it looks great.
[1842] Right.
[1843] It looks amazing because these new iPhones are so fucking good.
[1844] And these new Samsung phones are so good.
[1845] You don't need really complicated equipment anymore.
[1846] You can make a really great 4K video with your phone easily.
[1847] Right.
You've got plenty of storage.
[1849] It's not hard to do.
[1850] Yeah.
[1851] So that's what they did.
[1852] And now he's looking at it like, man, why are we traveling with all this shit?
[1853] I could just have my cameramen use iPhones.
[1854] Yeah.
[1855] And you don't need a big institutional backer anymore.
No. And if the only way they can fight back against independent content creators is by calling every single one of them a racist, misogynist, right-winger, or whatever, pretty soon you're going to get to the situation, we're near it now, where all those people, like, are running into one another, and, like, every single one of them in a room has already been through episodes like that, right?
Like, you know, it loses its power at that point, you know, once you've done it to a million people or two million people. Like, people stop being shocked. Uh, yeah. The term is cry wolf. Yeah, exactly. So, yeah, it's, um, it's a weird time. Really interesting. Yeah, it's fun. Yeah, it's fun to watch everything get fucking thrown up into the wind and scatter all over the place. It's fun. It is, it is fun. And I'm actually having fun with the job for the first time in a long time, so, um, I hope you are too. I'm having a great time.
[1858] Do you enjoy doing substack and being independent and doing things the way you do and doing your podcast?
[1859] Are you enjoying that?
[1860] Yeah.
I mean, it's a little different, because I used to have the luxury of spending 10 weeks investigating something.
[1862] And I don't anymore.
[1863] I got to crank stuff out, right?
[1864] Yeah.
But I would never have been able to do this Twitter files thing, you know, just on a lark, without asking lots of people for permission.
[1867] You know what I'm saying?
[1868] Can I ask you this?
[1869] How did, how does Elon set that up?
[1870] How did he approach you guys?
[1871] Individually.
[1872] You know, I mean, I woke up and I got a text one day, basically.
[1873] And, you know, would you like to do this?
[1874] And the answer, of course, is yes.
[1875] And by the way, people talk about this all the time.
They want to make a big deal about it, because there was an army of people after the first Twitter files who all said the same thing.
[1877] Like, imagine doing PR for the richest man on earth, right?
Like, that was the universal response of all the many Hasans of the world.
[1879] Look, this is the story here is about organizations that are vastly more powerful even than the richest man on earth.
[1880] It's about the FBI, the NSA, the CIA, the DOD, like DHS, and it's about, it's an opportunity to see how these figures operate in the wild.
[1881] When you get a source like that, like, it's not important what their motives are.
What's important is what your motives are, you know?
[1883] And my motives are I want to know what was going on.
and how these organizations operate.
And you would never turn that down; no real journalist would turn down that opportunity.
[1886] You know, and incidentally, I kind of like Elon Musk.
[1887] I mean, he's, you know, he's got a, he's got a sense of humor about this.
And I think, and I think it's his ideology,
in terms of, you know, the desire for putting this out there.
[1890] I mean, who would do this?
[1891] Who would spend that much money to do this?
[1892] His sense of humor is an internet sense of humor.
[1893] Oh, absolutely.
[1894] Like, he posted that meme of the pregnant man next to the photo of Bill Gates and his pot belly.
[1895] And he said, in case you want to lose a boner real fast.
[1896] And he put that on Twitter.
I mean, imagine getting dunked on by the richest man in the world on Twitter.
[1898] But that's hilarious.
[1899] It's hilarious.
[1900] Yeah.
[1901] Again, could you imagine John Jacob Astor or, you know, one of the Guggenheims doing like a teenage joke in public?
[1902] Of course not.
[1903] It's funny.
[1904] Yeah.
[1905] It's the same phenomenon with Trump.
It's like, you know, I couldn't stand his politics, but if you denied that he was funny...
[1907] You were lying.
[1908] Right.
[1909] Like, the campaign was funny.
[1910] When he called Kim Jong -un, little rocket man. I mean, come on.
[1911] That shit's hilarious.
[1912] The guy's got great timing.
[1913] I mean, he could have been a stand -up.
[1914] His timing is excellent.
[1915] Yeah.
If he had, you think? Yeah, he probably would have been good.
[1917] Yeah.
[1918] Sure.
[1919] Yeah.
[1920] Probably said some hilarious shit.
Look, the guy's good off the cuff.
[1922] It's not saying he's the best leader we've ever had.
[1923] Not saying he's a great statesman or a great president or even a good representation of what America is supposed to stand for because he's not.
[1924] He's not.
[1925] You know, and it's a real problem.
[1926] He's a real problem because there's a lot of people that are conservative -minded, fiscally conservative, hardworking people.
[1927] They don't like any of his antics.
[1928] And they're forced to choose between someone they deeply disrespect versus someone who they also deeply disrespect.
It's like, what am I supposed to do here?
[1930] How do I go?
[1931] I'm trying to figure out if I'm conservative anymore.
Am I a liberal now?
And you don't know, because you don't want to align yourself with problematic personalities that also embody some of the economic ideas that you agree with.
[1934] Right, right.
[1935] But his send -up of the whole process was accurate.
[1936] Yeah.
[1937] You know?
[1938] The swamp.
[1939] I mean, the, the self -seriousness of it, right?
Like, he was constantly kind of making fun of how seriously people like Hillary Clinton took themselves.
[1942] Yeah.
[1943] And there was no way it was not going to land.
Like, you know, Jeb Bush, you know, saying my mother is the strongest person in the world, and him, you know, saying she should be running. Like, that stuff was just, it was never not going to work.
You know, and the fact that none of us, or none of the reporters, could see it at the time was kind of amazing.
[1946] Yeah, well, I think people were just so terrified that an asshole like that could actually win and become president.
[1947] And I remember we were watching, we did an end -of -the -world podcast from the comedy store.
[1948] We were a bunch of comics.
[1949] We, you know, we were like watching the election take place and talking shit while it was happening.
[1950] And then afterwards, you know, it was all over.
We were, like, all stunned, and we went back to the comedians' bar, and I was watching Jake Tapper on TV with just, like, a somber look on his face.
[1952] Well, it really does look like Trump is the president.
[1953] The whole thing was so surreal and wild that, I mean, they just did everything they could to stop that from happening and it didn't work.
[1954] Yeah.
[1955] Yeah.
No, I mean, and even until the end, I didn't think it was going to happen.
[1957] But then Florida came in.
[1958] Do you remember that moment?
[1959] Yeah.
[1960] And then I was like, holy shit, this is going to happen.
One of the best things is watching the compilation of The Young Turks watching the election go down, and in the beginning being super confident.
[1962] And then towards the end, they're like, fuck!
[1963] And then there's the same people that have to pretend that Biden's okay.
[1964] It's amazing how well this country is running while Biden is literally not there.
[1965] Oh, yeah.
[1966] Completely absent.
[1967] Out of it.
[1968] It's gone.
I mean, it's Weekend at Bernie's, the whole presidency.
And I love the idea that they're going to do Weekend at Bernie's 2, which is great.
[1971] Now, I wanted to ask you about this, because this is, I had a conspiracy theory.
I think when he announced that he was going to run again, when he talked about running again, then they started finding, like, classified files.
[1973] Absolutely.
[1974] Absolutely.
[1975] Absolutely.
I mean, you saw last week, Andrew Weissmann, who was one of the lead prosecutors in the Mueller investigation, was tweeting all these things about Biden, right?
[1977] You know, so there's no question that the party and maybe some folks in certain agencies were sending him a message, I think.
[1978] Yeah.
[1979] Yeah, you can't prove that.
[1980] It's not easily provable.
But it certainly feels that way.
[1982] It seems very transparent, like a lot of news stories.
[1983] You know, it seems pretty obvious what's going on.
[1984] But again, that's what we need a real press for to get to the bottom of it so that we can actually talk about it.
[1985] Yeah.
[1986] Yeah.
[1987] But clearly, yeah, right?
[1988] I mean, it's been six years.
They didn't go looking for that stuff in the Corvette until there was suddenly a decision that, no, we don't really want him running again.
[1990] But it's just so amazing to watch the hypocrisy play out.
[1991] Like, do you not remember what you said about the documents at Mara Lago?
[1992] That wasn't that long ago.
[1993] And at least in Mara Lago, they were in a safe.
Oh, mine was in a locked garage.
[1995] Oh, your garage was locked?
[1996] Well, those are impossible to get into.
They're way harder to get into than a safe.
Was it in the backseat of a classic car? He had it next to his fucking Corvette, right, didn't he? Yeah, exactly. Yeah, and he made a big deal about that too. And then they weren't just there, they're in multiple locations. So they keep finding these classified documents after he had given Trump so much shit. Like, they wanted to take Trump down for having these classified documents. They're making it like this huge sticking point. I mean, they're throwing the Espionage Act out, which is, like, you know, five years a count for that, right?
[1999] I mean.
[2000] And meanwhile, Biden has even more.
[2001] Right.
[2002] I know.
[2003] I know.
And his own aides turned him in.
[2005] Yeah.
It's like an Inspector Clouseau act.
[2007] The whole thing is, it's incredible.
[2008] What do you think happens in 2024?
[2009] Like, who do the Democrats pick?
[2010] Do you think they go with Kamala again or does she develop a disease?
Maybe she's got, like, anxiety, or maybe she's got restless leg syndrome,
[2012] and she can't do it anymore.
[2013] Yeah, something unfortunate is going to happen to her.
[2014] I mean, look, they have to know that she's not viable as a candidate because they tried twice already to make her the candidate in the last election cycle.
[2015] Well, maybe if they wait enough time.
But she, but she... passage of time.
It wasn't that she wasn't, you know, reaching a contender threshold.
[2018] She was basically flatlining despite massive media attention, you know?
Do you think that if they have good speechwriters and they get a hold of her and go, listen, this is your last fucking chance at this dance.
[2020] Okay, we've got to do this right.
[2021] And this is what you got to do.
[2022] You've got to listen to the speechwriters.
[2023] No more ad -libbing.
[2024] No more doing this.
[2025] She did this a lot.
[2026] We're going to write some stuff for you and no more going off script.
[2027] When she goes off script, she rambles, you know?
Yeah, I don't... I think they want Gavin Newsom to be the candidate. Do you think that's sustainable, though? Like, he's... Oh, he's totally unlikable. Yeah. Like, in the worst way. But I get the sense that's who they want to be the candidate. I mean, look, they had the guy in the White House when Biden went overseas. Was he there?
Yeah, he went to the freaking White House. He's filmed going into the White House when Biden takes a trip overseas. What was he doing there?
I forget. But this was at the time when he was running campaign ads against DeSantis.
[2031] Don't you think people go ballistic if they try to do Kamala Harris and him?
[2032] Because if Kamala Harris decides to stay, then, like, what, is she the president?
[2033] Or is he the president?
And if Kamala Harris leaves, people are going to freak out, like, where is she going?
Like, something has to happen, almost, like, if I was writing a script, and I definitely don't want anything to happen to her.
But if I was going to write a script, it would be like, some shit has to go down.
No, I mean, they can just... It could be a scandal. Yes, that's what I'm saying, some shit has to go down. And there's stuff there, isn't there?
I mean, there's stuff with her husband and finances and, you know... Is there?
Yeah, I think so. At least the appearance of it. So who would it be?
[2040] It would be Gavin Newsom and who else?
I don't know. I mean, look, they've clearly tried to make Buttigieg a candidate. But he's a fool. But again, he's another one of these people who was market-tested extensively in the 2020 campaign.
[2042] I mean, I covered that campaign.
They don't... These candidates, it's somebody's idea of who would be a popular candidate, but in reality, these candidates do not register with ordinary voters.
Like, Beto O'Rourke was another one.
[2045] Yeah.
[2046] You know, I watched them on the trail in Iowa, and voters just, you know, he would sort of tearfully talk about, problems at the border and um and they just weren't interested meanwhile like he's a soap opera actor yeah that's what he's like right he's like a shitty soap opera actor trying to go through these lines and you're like i don't resonate with anything you're saying he does look like a like the handsome young doctor who has an affair out of wedlock right that actually is is kind of he might be better at that.
[2047] But the big story of 2020 to me was always how stubbornly high Biden's numbers were, right?
[2048] Like people did respond to him, even though he was clearly crazy.
[2049] Like, you know, he would go out there and his emotional register would be all off.
[2050] He would stick his finger in people's chests.
[2051] He would go off on people in crowds, but somehow people responded to that.
And he kept, you know, he wouldn't sink far enough in the polls so that he would be pushed out of the race.
[2053] How accurate are polls and how manipulated are polls?
[2054] I think polls are, they can be useful over periods of time, right?
Like, because the polls were clearly wrong about Trump, you know. The polling analysis, you know, like, for instance, they'll do things like favorability, unfavorability ratings, but that sometimes doesn't take into account other issues, like people will still vote for somebody they feel unfavorably toward if they hate the other candidate more, you know?
[2058] Yeah.
[2059] And I think they do tell you something.
I mean, as reporters, you should never get in the habit of being too reliant on them as indicators, but if a candidate can't get above two or three percent over a year, right, then you might want to, you know, take that seriously, especially if it follows through and is matched by results.
[2061] Yeah.
But to me, it's almost like the heavyweight division when Tyson was champion; there were no challengers.
[2063] That's true.
[2064] It's like, what do I get to be excited about?
[2065] Right.
[2066] There's no one that seems like they could step up.
[2067] That's true, yeah.
[2068] Who was the – Bone Crusher Smith might have been the best guy he fought, right?
[2069] I'm trying to remember who – Well, there was a run.
Like, Bruce Seldon looked good.
He looked the part.
There were a few guys.
Frank Bruno looked the part.
[2074] Yeah.
[2075] But it's like Tyson was so dominant that like there was no one to get excited about.
And it's not like I'm saying that there's anyone dominant like that.
[2077] I mean, I guess Trump is pretty dominant, but he's got a lot of resistance on the right too.
But the point is that, like, on the left, there's no contender that is compelling that you could see.
[2079] I mean, they've tried to push Buttigieg.
They gave up on Elizabeth Warren.
[2081] They've tried to push some of these people, but no one stands out.
[2082] Yeah.
[2083] I mean, they made a mistake.
[2084] I think they had a window where Bernie would have been a viable candidate.
[2085] They're scared of Bernie.
[2086] Oh, of course.
[2087] Same way they're scared of Tulsi Gabbard.
[2088] Right.
[2089] You can't control her.
[2090] Yeah.
[2091] No. No, they're not organizational people, right?
[2092] even though Bernie tries very hard to be a good Democrat.
I mean, that's, I think, a part of his personality that ended up being a fatal flaw for him.
[2094] I mean, he loves the Democratic Party.
[2095] He grew up in it, and if you talk to him about it, he'll talk about how his fond memories of the party and how he doesn't want to see it fractured.
[2096] But that ended up being his undoing.
[2097] He needed to be in burn it all down mode the way Trump was.
[2098] Yeah.
[2099] Imagine if he did go that way?
[2100] Right.
[2101] Oh, my God.
[2102] He might have won.
[2103] Yeah, he might have won if he if he started.
Ranting and raving, I'm mad as hell and I can't take it anymore.
[2105] Exactly.
[2106] Yeah.
[2107] But he didn't want to permanently damage either Biden or Hillary, even though, especially Biden, because he liked Biden as a person.
[2108] And this I know because I've talked to people who work for Bernie and, you know, Biden was nice to him when Bernie came to the Senate.
[2109] He couldn't stand Hillary, right?
[2110] So he was very aggressive toward her in the beginning, and that was when he was doing really, really well.
[2111] I think if he had pushed it a little further in that first year, if he had been a little bit more balls out, he might have won that one, you know.
[2112] Yeah.
[2113] It would have been tough.
[2114] I mean, he might have won the nomination, at least.
Well, when you saw the collusion among the Democrats during the primaries with Bernie, that Donna Brazile talked about in her book, it's like there was an effort to try to get rid of him.
[2116] And there's like calculated maneuvers to try to get him out of there, which is really wild.
[2117] It's really wild to see the way these intricate little chess pieces move around behind closed doors.
And then someone like Donna Brazile writes a book and comes out with that,
and you get to see, like, what they were up to. Yeah, I mean, they had a whole system they had worked out, the invisible primary, and, you know, the endorsements are all lined up ahead of time and the money's all lined up ahead of time. And, um, you know, but Bernie did well to fight back against that. I mean, I think his big accomplishment, looking back, is going to be the proof of concept that you can be the top fundraiser in a race without taking corporate money, which he did do in 2020. You know, that's an important thing that he figured out. But you're right, you know, they have lots and lots of ways to put the thumb on the scale. The difference is Trump overcame all of those, you know, just with sheer bullshit and assholedom.
[2120] And you know what?
[2121] But that's, on one level, that's impressive, right?
[2122] Yeah.
[2123] Like, but that's what it takes, you know?
And I don't think, I don't see a character like that on the Democratic side who's going to be able to pull that off, too.
[2125] Do you think, no, I don't, I don't see anybody like that.
[2126] I don't see anybody coming up and I don't, I don't see anybody in like the distant future.
[2127] either.
[2128] No. No. But do you think that Ron DeSantis can overcome Trump?
[2129] Like, Ron DeSantis, I think, can get a lot of people that are on the fence.
[2130] Yeah, I think so.
[2131] Whereas I don't think Trump can.
[2132] I think so.
[2133] I mean, Trump is going to have die -hard supporters, and I've learned never to write him off.
[2134] Like, I did that after the Access Hollywood thing happened.
[2135] I made the mistake of putting that down in print.
[2136] I'm never going to do that again.
[2137] The guy's like Jason, he never fucking dies.
[2138] He always comes back.
But, you know, DeSantis, he's survived one of the things that's usually fatal for a Republican politician, which is the approval of pundits in the Washington Post and New York Times.
Like, they all kind of portrayed him as the more civilized Republican alternative, and usually that's a death knell for a Republican candidate in the Trump era.
[2141] But he's still hanging around.
[2142] He's a powerful enough figure.
And, you know, he's a veteran.
[2144] He's got a lot of things going for him.
[2145] And what he did with Florida actually turned out to have worked, you know.
[2146] Right.
[2147] Right.
[2148] Yeah.
I mean, I don't, I think it's going to be difficult. I mean, it's so unpredictable on the Democratic side right now. It's weird to see the incumbent party be in such disarray at this stage of a presidential race.
[2150] Yeah.
[2151] So I'm looking forward to covering it.
[2152] It's going to be fun.
I'm looking forward to seeing how it plays out without someone censoring Twitter.
[2154] Right.
[2155] Yes, there's that too.
[2156] That's a big one.
[2157] It's a big factor.
[2158] It's a giant factor.
I mean, if Twitter did not censor the Hunter Biden laptop story, if that went viral and everyone knew about it and they were forced to cover it on the news, and they showed the images and all the talk about 10% to the big guy, and the fact that he was getting these contracts with Burisma where he's making millions of dollars, totally unqualified to get that money, should not have been in that position of power, saying that he could use his influence, that he could connect people and get people to the dance.
[2160] That was wild shit.
[2161] And the fact that they came along and censored that on Twitter.
[2162] And Facebook, yeah.
[2163] And then, I mean, Zuckerberg on this podcast talked about it.
I remember when you did that interview, because, you know, you have moments in this period where media has been so controlled where you think, man, am I crazy?
[2165] or did that just happen?
[2166] You know, like, I thought that was a really big deal when that happened.
[2167] It was a big deal.
[2168] And then, you know, when that interview, when Zuckerberg said it out loud, even though he had testified about it before, when he said it out loud in that setting and he kind of described it, you know, you have this sort of almost feeling of psychological relief, like, okay, all right, I wasn't crazy.
[2169] I'm not crazy.
[2170] You know what I'm saying?
[2171] Yeah, they really are doing that.
[2172] Yeah.
And the thing is, when he testified about it, like, you have to seek out that testimony. You have to read a review of that testimony.
[2174] You could just be consuming a podcast just because you're, you know, you're jogging or whatever.
[2175] And he says, well, the FBI contacted us and you're like, what?
[2176] The FBI contacted Facebook and told you that this smacks of Russian disinformation, but it doesn't.
[2177] So they lied.
[2178] Right.
[2179] You did something that helped get one guy elected over the other guy based on lies.
And lies that the FBI helped with.
What else did the FBI lie about? And then you get into this long history of manipulation, and you're like, holy shit. Yeah, yeah. And they're having these regular industry meetings now, that we know about, right? Yeah. Um, you know, we're looking at the agendas of those, and, you know, it'll say other government organization, I'm sorry, other government agency briefing, right? Which is, like, a CIA briefing, you know. What are they talking about in those briefings?
Like, what else are they locking down on, and what else are they amplifying, deamplifying?
[2183] Like, who knows, right?
[2184] Like, that's terrifying.
[2185] So it would be nice to go back to a presidential campaign where maybe we have something more like an organic landscape to judge all this stuff.
[2186] Well, we certainly will on Twitter.
[2187] Unless something radical happens over the next two years, which is a possible.
[2188] Possibility.
[2189] I mean, you know, Elon said that when he bought Twitter, it was on the fast track to bankruptcy.
And, you know, that was interesting to find out, too, that the reason why they took that deal was it really wasn't profitable, you know, which is crazy, because he bought it for $44 billion and it's worth almost nothing.
[2191] Right.
[2192] But it's very valuable.
[2193] Although it's not profitable.
[2194] It's very valuable in what it can do.
[2195] And now he's trying to figure out a way to make it profitable.
[2196] Yeah, I don't envy him there.
[2197] Oh, fuck.
[2198] Yeah, right.
[2199] I guess, like, the deal is to attract content creators and give them a better portion of the revenue than YouTube does.
[2200] Yeah.
[2201] And you could do that.
[2202] It does have the infrastructure to do something like that.
[2203] And it could become a hub of podcasts.
[2204] Yeah.
[2205] You could easily have, I mean, see, there's no incentive for people to keep their podcasts only on iTunes.
[2206] One of the things that iTunes has done, it's like a tremendous blunder, in my opinion, is they never figured out a way to monetize podcasts.
[2207] So they act as an aggregator for podcasts, but they never make any money off of it, which is nuts.
[2208] That's interesting.
[2209] I didn't know that.
[2210] Yeah, it's nuts.
[2211] It's fucking nuts.
[2212] If you think about the fact that Spotify makes fucking untold billions off of it and Apple makes zero, like when you put your podcast on iTunes on Apple Podcasts, all you're doing is like linking an RSS feed to Apple Podcasts.
[2213] But Apple doesn't profit off of it.
[2214] They just distribute it.
[2215] Like, we had meetings with them years ago before I ever went to Spotify.
[2216] And I was like, you guys want to make any money off of this at all?
[2217] Like, this is crazy.
[2218] Like, how could you have something that's so widely distributed through your company and you make zero money off of it?
[2219] That seems like a pretty big oversight.
[2220] It's a giant fuck up.
[2221] Because I think in the beginning they thought of podcasts as being just a joke, like no big deal.
[2222] Right.
[2223] Right.
[2224] And then it became this enormous, like, media thing.
[2225] Yeah.
[2226] Yeah.
[2227] Well, no, it's just a lack of foresight, I guess.
[2228] But it's ironic, right?
The podcast is the one thing that you can't control, that you can't algorithmically clamp down on, and that becomes the most enormously popular format.
[2231] I don't think that's a coincidence.
[2232] It's very bizarre.
Matt Taibbi, I appreciate you very much.
[2234] And thank you for coming in here.
[2235] Joe, thanks so much for having me. Thanks for everything you do.
[2236] No, thank you.
[2237] You're one of the last of the real ones out there doing it.
[2238] I appreciate it.
[2239] And thanks for having me on.
I'm so glad, your show's doing great.
[2241] And are these the skulls of your vanquished enemies?
[2242] Yes.
This is Brian Stelter.
[2244] This is Chris Cuomo.
[2245] This is, what's the other guy's name?
The one, was it Jim DeCosta?
[2247] Jim Acosta.
[2248] Yeah, they're all.
[2249] I got them all.
[2250] That's great.
[2251] Excellent.
[2252] Well, may there be more next time I come on.
[2253] Yes, for sure.
[2254] Thank you, brother.
[2255] Appreciate you.
[2256] Thanks, all right.
[2257] Bye, everybody.