The Joe Rogan Experience XX
[0] Joe Rogan podcast, check it out.
[1] The Joe Rogan experience.
[2] Train by day, Joe Rogan podcast by night, all day.
[3] Hello, Michael.
[4] What's happening?
[5] Good to see you.
[6] Thanks for having me back.
[7] My pleasure.
[8] Appreciate it.
[9] So, first of all, what was it like to get a hold of the Twitter files?
[10] Like, what was that experience like?
[11] How did that go down?
[12] Exciting as hell, man. I mean, seriously, there's been a lot of misinformation about that itself, but Bari Weiss contacted me. She lives in L.A., and she got in, and she's like, how soon can you get over here?
[13] And I was like, let me finish this interview I'm on, and I'm over.
[15] And yeah, it was incredible.
[16] You know, I'd never met Elon before.
[17] You know, I met him at the coffee station, just making himself a cup of coffee.
[18] He had no idea who I was.
[19] And yeah, we just got into it.
[20] It was, you know, I was sort of the least known of the big three journalists that were there, the others being Bari Weiss and Matt Taibbi, who's been on here.
[21] And they'd already started thinking about what to go after.
[22] And Matt had done a story on the Hunter Biden laptop already.
[23] And then we were starting to look at January 6th, because Trump gets de-platformed on January 8th.
[24] And so because I'm like the junior member of that threesome, so to speak, they gave me January 7th.
[25] So the first thing we, one of the first things we did was just to look at how they made a decision to get to pull Trump off the platform.
[26] And it turned out that the seventh was an important day because that was when they started to rationalize this decision to de-platform Trump, even though their own people inside had decided that he had not violated their terms of service.
[27] So they were sort of stuck making up a reason to de -platform him.
[28] And that was an important theme was that they just kept changing the rules, basically, to do what they wanted to do.
[29] And that was the same thing on the Hunter Biden laptop.
[30] The New York Post story that they censored also had not violated their terms of service.
[31] So, I mean, look, it was crazy.
[32] I mean, it was, you know, people always ask questions about the files themselves.
[33] But, you know, the experience was we would ask for these searches and we just get back huge amounts of data.
[34] It was thousands and thousands of emails, thousands of internal messages on their Slack messaging system.
[35] And so, yeah, I mean, a lot of it was, you know, some of it was very boring because you have to just read tons and tons of stuff.
[36] But, you know, I think the big theme was, we started by seeing that it was a real, you know, super progressive company.
[37] It's like 99% of campaign contributions from Twitter staff are going to Democrats.
[38] You know, the head of safety at Twitter is this guy named Yoel Roth, who, you know, said there's actual Nazis in the White House when Trump came in.
[39] He was very progressive.
[40] But over time, we just kept finding, like, this weird, like, FBI wants us to do this.
[41] You know, there's these other government agencies.
[42] Oh, you know, all these people used to work at the FBI.
[43] The CIA shows up.
[44] Department of Homeland Security.
[45] And we're kind of like, what the hell is going on?
[46] And the story quickly shifted from what we thought, and I think what Elon thought, which was that it was just very progressive people being biased in their content moderation, in their censoring, to: there is a huge operation by U.S. government officials, U.S. government contractors, and all of these super sketchy NGOs getting money from who knows where, basically demanding that Twitter start censoring people.
[47] And at that moment, the story shifted for all of us.
[48] And that was, I think, where Taibbi became particularly important, and sort of the lead, because he had so much experience looking at how the U.S. government during the War on Terror had waged disinformation campaigns, propaganda campaigns. And it became clear to us, you know, over time, that the U.S. government had turned the propaganda and disinformation campaigns it had been waging abroad.
[49] It turned them against the American people.
[50] And that was where you just sort of get chills up your spine.
[51] And you're like, something seriously sinister is going on.
[52] Do we know when this began?
[53] Like, when did they infiltrate these organizations?
[54] Because I'm sure it's not just Twitter, right?
[55] I'm sure it's...
[56] Oh, no, absolutely not.
[57] That's part of what was so terrifying is that it was all of the social media companies, including Wikipedia, by the way, which we don't talk enough about.
[58] But also all of the mainstream news organizations are all being organized.
[59] So when does it start?
[60] You know, really, what you're looking at is the apparatus that was created by the War on Terror over the last 20 years, starting after 9/11.
[61] Then there was a battle against ISIS because ISIS was successfully recruiting on social media.
[62] So there was sort of a counter-ISIS recruiting campaign that occurred.
[63] Then you get the big events: Brexit in 2016, Trump's election in 2016.
[64] And the establishment just freaks out, absolutely freaks out.
[65] And there's a lot of different motivations here.
[66] So one of the motivations is just to blame Facebook, blame social media for Trump's victory.
[67] It was never true.
[68] I don't think anybody really believed it.
[69] There's just, you know, for a variety of reasons we can talk about, there was never any good evidence that whatever Russians did had much of any influence, any measurable influence on the outcome of the campaign.
[70] But they started to scapegoat the social media companies as a way to get control over them.
[72] And so then, in 2017, well, two things happen, or many things happen.
[73] The Department of Homeland Security just declares election infrastructure to be critical infrastructure, part of their protection mission.
[74] And that meant protecting the media environment. Protecting, put that in quotes.
[77] You know, it's creepy.
[78] It's patronizing.
[79] It's a power move.
[80] So that's the first thing that happens.
[81] They create something called the Cybersecurity and Infrastructure Security Agency within the Department of Homeland Security to supposedly protect the media environment from foreign influence.
[82] They create something called the Foreign Influence Task Force with the FBI to basically start policing domestic speech on these platforms.
[83] They start organizing all the social media companies to participate in these meetings.
[84] So you had Mark Zuckerberg, the CEO of Facebook, in here. And he says to you, there's this critical moment where you ask about the Hunter Biden laptop.
[85] And he goes, well, yeah, you know, in the summer of 2020, all these FBI guys come to us saying there's going to be a hack and leak operation involving Hunter Biden, which is super suspicious because, as everybody now knows, the FBI had Hunter Biden's laptop in December 2019.
[86] What freaked me out, so, by the way, I was a victim of the Hunter Biden laptop disinformation myself.
[87] I voted for Biden.
[88] I thought that that laptop was Russian disinformation.
[89] I just bought the whole thing.
[90] And this is from somebody who, you're a journalist.
[91] I'm supposedly a journalist, right?
[92] So-called journalist.
[93] I bought it, you know, I'm still a big liberal in so many ways.
[94] And everybody I knew was like, oh, you know, it's Trump, he's so, for all the reasons that progressives bought that the laptop was fake.
[96] I bought that it was fake.
[97] So then when you realize that it was real, and that everything in that New York Post story of October 14th, 2020 was accurate, I started seeing stuff in the emails.
[98] The thing that really freaked me out was this thing the Aspen Institute did, it's called a tabletop exercise, and it was actually a Zoom call, to role-play how to deal with a Russian hack and leak around Hunter Biden.
[99] This is, like, in New York, June of 2020.
[101] So this is months before Rudy Giuliani gets the laptop, before Rudy Giuliani gives the laptop to the New York Post.
[102] Why in the hell is the Aspen Institute holding a tabletop exercise to pre-bunk it?
[103] Basically, they are training or brainwashing all these journalists.
[104] And I mean, it's CNN, the New York Times, the Washington Post, the Wikimedia Foundation, the Wikipedia folks, the networks, all of the social media companies, all coming together to be like, okay, well, if something is leaked, then we should not cover it in the way that journalists have traditionally covered it.
[106] Meanwhile, Stanford University, a few months earlier, had put out a report saying, reporters should no longer follow the Pentagon Papers principle.
[107] Well, the Pentagon Papers, of course, is this famous episode.
[108] Steven Spielberg made a whole movie about it, where the Washington Post and New York Times published these internal Pentagon documents showing that the U.S. government was losing the war in Vietnam, right?
[109] This is Daniel Ellsberg, and he just releases it.
[110] He steals these documents.
[111] He breaks the law, steals these documents, gives them to the newspapers.
[112] The newspapers publish them.
[113] It's this kind of incredible moment in American journalism, where we're like, the First Amendment gives these newspapers the right to publish hacked, so-called hacked, but leaked information.
[114] And here you have Stanford University, the Aspen Institute, saying, oh, no, no, no.
[115] We should stop doing that.
[116] Journalists should no longer write about leaked information in that way.
[117] Instead, we should focus on the person who leaked it.
[118] It really sent chills up my spine.
[119] You know, it was the creepiest thing I'd ever seen.
[120] And this is, of course, you've got to remember, the Aspen Institute is funded by the U.S. government.
[121] Yeah.
[122] Stanford's funded by the U.S. government.
[123] So one of the responses we've got is, people go, oh, you're just talking about, you know, content moderation by private companies?
[125] No. We're talking about U.S. government-funded organizations.
[126] You can't, if the U.S. government is censoring information, that's obviously a violation of the First Amendment.
[127] But if the U.S. government is funding somebody else to censor information, that's also a violation of the First Amendment.
[128] You can't do it indirectly.
[129] It's still a violation if you're funding somebody to demand censorship.
[130] So that was quite a steeplechase.
[131] But there's a lot here.
[132] I mean, it's a lot of people, a lot of institutions, a lot to unpack, and that was part of the reason I wanted to reach out and be like, I need a Joe Rogan session to just kind of go through it all.
[133] Yeah, well, I'm very happy to provide that.
[134] Here's the question.
[135] Obviously, the Hunter Biden laptop story would harm Joe Biden.
[136] And if that story got out, who knows how many people would have voted the other way.
[137] Is this a direct result of the things that Trump said when he was in office, that he went against the intelligence community? Like, how did they decide?
[138] I would always assume that, you know, the so-called deep state is essentially bipartisan, that they wouldn't necessarily side with the Democrats or the Republicans.
[139] They're really, you know, just supposed to be gathering information to protect the country.
[140] So how did they decide specifically to either stop information or propagate misinformation that would aid Joe Biden?
[141] Yes, that is exactly the right question.
[142] So, I mean, I think the thing you have to understand is that Trump was viewed as a threat by the deep state, by, you know, the CIA, FBI, Pentagon, and, you know, I mean, all of the elites.
[143] And you're right, this is bipartisan in the sense that it's both Never Trump Republicans and Democrats.
[145] What freaked them out the most about Trump is that he was threatening to pull the U.S. out of NATO.
[146] I don't think that was real, I just think that was bluster.
[147] Like, that's insane.
[148] And by the way, I should say, I actually, I support what we call the Western Alliance.
[149] I support providing military security for our allies in Asia and in Europe.
[150] I'm not a, I mean, there's parts of economic nationalism that I respect.
[151] But I'm also, I don't think we should pull out of NATO.
[152] I think NATO has provided peace in the world and mostly been a good thing.
[153] It's obviously had some crazy abuses like Iraq.
[154] This whole experience has made me rethink my support for Ukraine.
[155] But I think it's important to understand that Trump terrified the deep state and the national security establishment.
[156] So did Brexit.
[157] There's a sense in which, you had a guy on here named Peter Zeihan who wrote this really apocalyptic book about how the world is going to fall apart.
[158] And his whole argument, which I don't agree with, I think he's brilliant, but I think the argument's wrong.
[159] His whole argument is based on the idea that the United States is going to stop providing military security to our allies in Asia and Europe.
[160] It's all just based on this assumption that Trump is the beginning of the U.S. withdrawing from its traditional role since World War II.
[161] There's a bunch of people who, obviously, their ideology, their livelihoods, their identity, just their whole way of life, is tied up with the United States providing this protection for, you know, Europe and Asia.
[162] And they viewed Trump as threatening that.
[163] I also think they just really hated the guy.
[164] They looked down on him.
[165] You know, he was crude, and all the things that people don't like about him.
[166] And he spoke disparagingly about the intelligence community.
[167] Yeah.
[168] I mean, he was crazy.
[169] Absolutely.
[170] He was against the war in Iraq.
[171] He was different.
[172] He was a nationalist Republican.
[173] And what's so interesting is that, you know, if you read people on the left, like Noam Chomsky, or Glenn Greenwald, and I think Matt Taibbi, who have been critics of U.S. government, you know, military invasions around the world since World War II.
[174] I mean, we've overthrown many governments, right?
[175] You know, Iran, Chile, Guatemala. And the pattern is, these are places where nationalists, sometimes socialists, but often just nationalists, who were trying to control their economies and didn't want foreign interference, were coming to power.
[176] And the U.S. government would see that as a threat to, you know, having this liberal global order, as it's called.
[177] And so they saw Trump as an existential threat to this post-war liberal order, and they viewed social media as the means to his power, which I think was exaggerated.
[178] So on the one hand, they saw a threat.
[179] I think they also saw an opportunity, you know, the War on Terror, we won.
[180] I mean, like, it's just, I mean, huge, huge victory.
[181] I mean, it's shocking how successful it was in some way.
[182] So you have a bunch of people that suddenly need something to do.
[183] So there's a lot of motivations there.
[184] And then you also have the guys that lost the Hillary campaign, John Podesta, he was the chair of her campaign.
[185] He runs the most powerful progressive, frankly, propaganda organization in the world, or at least in the United States, the Center for American Progress. They were also looking for someone to blame for their own failures, you know, for the dislikeability of Hillary.
[186] And so there was just a lot of motivations to try to get control over social media platforms.
[187] They felt like they had lost control of them.
[188] And what was the attitude of these social media platforms when they were exchanging emails back and forth with these intelligence agencies?
[189] Was there any understanding of the implications of allowing this web of influence to infiltrate and control narratives and how kind of creepy and dangerous that is, did they understand how other people would perceive that?
[190] Because I would assume this is all, the emails were exchanged and there were Slack messages, and all this stuff is recorded, right?
[191] So there's a record of it.
[192] Yeah.
[193] Did they have an understanding of how other people would view this?
[195] Yeah, I mean, just to back up even further, so there's two interesting dynamics going on.
[196] You know, the first is that the Internet itself is created by the U.S. Department of Defense, and Google is a spinoff of Defense Department projects.
[197] You know, so on the one hand, the Internet is a function of the U.S. military.
[198] I mean, it's a spinoff of the U.S. military.
[199] It's a great one.
[200] We're glad to have it.
[201] But I think the U.S. military and the deep state and whatever, they felt like they had control over the Internet, until Trump, basically, or really maybe until ISIS, around 2014, 2015.
[203] That's the first dynamic.
[204] The second dynamic is culturally, Silicon Valley is libertarian, right?
[205] So you have the Electronic Freedom Foundation, I'm sorry, Electronic Frontier Foundation.
[206] You have a libertarian ethos.
[207] Jack Dorsey, the founder of Twitter, is very much a manifestation of that libertarian ethos.
[208] Mark Zuckerberg less.
[209] But even Mark Zuckerberg, after the 2016 election, when everyone's accusing him of throwing the election to Trump, he's like, this is ridiculous.
[210] He's like, our own data doesn't support it.
[211] There just wasn't enough.
[212] The Russians clearly did not have this influence.
[213] They just beat the crap out of him so much, and threatened to take away their ability to operate, meaning Section 230, this huge liability protection in a law passed in 1996, which allows Google, Facebook, Twitter to exist.
[214] Can I stop you there?
[215] When you say they threatened to take it, like, in what way?
[216] Directly, I mean, including Biden himself.
[217] I mean, basically, Democratic politicians would just say, you know, we're going to remove your Section 230 status.
[218] That's just like saying, we're going to destroy your company.
[219] I mean, it's just, it's not.
[220] And they were saying this because their assertion was that Russian disinformation and propaganda led to Donald Trump being elected.
[221] Yes.
[222] Being elected.
[223] And there was no evidence of this.
[224] No, I mean, there was, I mean, there was some evidence of it, but nothing.
[225] Well, there was certainly evidence of like these troll farms.
[226] Yes.
[227] We know they exist.
[228] Yes.
[229] Yeah, but it's, it's trivial.
[230] I mean, they would exaggerate. They would say things like, you know, I think it was, like, 146 million Americans had Russian propaganda in their news feeds.
[231] That's not the same as saying 146 million people saw the ads.
[232] Right.
[233] Because it's like your feed, does everyone remember, that was social, of course Facebook has changed.
[234] So, yeah, I mean, look, there's three big disinformation campaigns that were run by, frankly, the U.S. government and their allies. The first was the Russia hoax, the idea that Russians controlled Donald Trump and won him the election. The second was the Hunter Biden laptop. And the third is the COVID origin, you know, the idea that it's a conspiracy theory to even imagine that COVID could have emerged from a lab.
[235] There's others, including, you know, we can talk about, this effort to basically smear a bunch of ordinary conservative or Trump-supporting Twitter users as Russian bots.
[236] Yeah.
[237] But basically you have active disinformation campaigns being run by the U.S. government and U.S. government contractors against the American people on these issues, at the same time that they're demanding censorship.
[238] So you have propaganda on the one hand and censorship on the other.
[239] Here's what appears to be dangerous to me. There doesn't seem to be any repercussions for doing these things.
[240] This is scary because it's shifting in there.
[241] So, example one, the Hunter Biden laptop: no one's in trouble, right?
[242] No one's in trouble.
[243] No one from the FBI's in trouble.
[244] No one loses their job.
[245] No one gets reprimanded.
[246] No one gets, you know, brought before the American people and said, you failed us.
[247] Not just that you failed us.
[248] You betrayed us because you knew this was not true.
[249] And you allowed someone whose son has deep ties to both Ukrainian and Chinese companies that were paying him for influence.
[250] And it appears, at least by some of these emails, that some of that money went to the actual vice president of the United States, which is fucking wild, and no one is in trouble.
[251] And then the crazy thing is, one of the things about having a right and a left is that whenever there's information that's inconveniently bad for one side, particularly the left, you don't hear a fucking peep about it in the media.
[252] Right, it's dismissed. It's like, you know, someone talked about the Hunter Biden laptop and said it was, like, half fake. That was AOC. Half fake, that's right. What is that? That is such a horrible violation of the trust that the people who elected you put in you. You have access to all the information. You have access to that actual fucking laptop. Oh, yeah, by the way, I have access to it too. Yeah, a lot of people have access to it, if you wanted to. I said, I don't want to look at it. I was like, I don't want to look at that fucking thing. I don't want to see this guy getting foot jobs from hookers in Vietnam, smoking street crack. It's crazy, whatever he did. But the fact that someone would say that's half fake, yeah, that itself is disinformation. It's important, that is a lie. It's a lie, but you're just saying it because if you can say it's half fake, you muddy the water.
[253] And now anybody that's looking at that could go, oh, yeah, but that's half fake, according to my side.
[254] Right.
[255] This is like the same people.
[256] There's still people that say that Trump was in bed with the Russians, which is how he won in 2016.
[257] People that still parrot that.
[258] Oh, yeah.
[259] Oh, yeah.
[260] Oh, yeah.
[261] I say that.
[262] Oh, yeah.
[263] Mine as well.
[264] I know.
[265] So I was happy to do your show because literally, like, even my very close friends and family don't understand what I'm talking about.
[267] And I'm like, I want to go on Joe Rogan and just unpack this for them, to show how serious this is.
[268] I mean, you have to remember, and I'll put it on myself, I was so biased.
[269] The New York Post published the subpoena, which is a kind of receipt from the FBI, showing they had taken Hunter Biden's laptop from this computer repair store owner in Delaware.
[270] It was published in the New York Post.
[271] They also published the receipt that has Hunter Biden's signature on it, saying not only that he had left the laptop there, but also that it gave the computer repair store owner the rights to it if he abandoned it.
[272] Hunter Biden never said it wasn't his.
[273] He never denied that it was his laptop.
[274] Well, and subsequently, at least recently, he sued that guy for releasing the information, which is the dumbest thing he could have ever done, because now all this half-fake shit gets thrown out the window.
[275] Now he's saying it's his.
[276] Right.
[277] Well, look, I mean, the other thing I want to emphasize here: when you uncover the level of coordination and the sophistication of the disinformation and censorship campaign, it's easy to also sort of think they're perfect, but they're not.
[278] They're always making stuff up as they're going along.
[279] Right.
[280] But I think the other thing that's important to know here about that laptop story is that, within Twitter, they look at that New York Post article. Yoel Roth, the head of safety, and his team, they look at it.
[281] And they go, yeah, I mean, it's legit.
[282] It doesn't violate our internal policies.
[283] It doesn't violate our terms of service.
[284] And at that moment, I mean, it has a Manchurian Candidate quality to it, where the former general counsel of the FBI, a guy named Jim Baker, who was central to beginning the Russiagate probe of Trump.
[285] He's now at Twitter, as deputy general counsel.
[286] And this is part of what I was discovering in the Twitter files: he is just vociferously attacking this thing.
[287] It's like this looks like misinformation, disinformation.
[288] We shouldn't trust it.
[289] It looks like it violates Twitter's policies.
[290] I mean, there were multiple, I think it was at least four, messages and emails of him pushing the executives.
[291] And then, of course, we can't see the phone calls, which is really where a lot of the dirty work happens.
[292] Pushing to just get this thing censored by Twitter. And sure enough, a few hours later, Yoel Roth says, well, okay, you know, we think that it could very well have been a Russian hack, where somehow, I mean, it was this crazy thing where they're like, well, we think it was hacked and then put on the laptop.
[293] It was just bizarre.
[294] Yoel Roth, like, there's moments where I respect him, because he was enough of a truth teller internally.
[295] It's why he got to the position he was in, which is a very powerful position, to be like, hey, this is bullshit, internally he would say. But he was also a company man. So when powerful superiors in the organization, including former FBI people, and Jim Baker wasn't the only one, when he gets worked, he just bends.
[296] And he just was like, okay, yeah, I think we've decided that it violates our hacked materials policy and we're going to censor it.
[297] The other thing I want to point out about this, it's not just that they censored the article because people always go, well, you know, it only lasted for a few days or whatever.
[298] It was the discrediting of it.
[299] The censoring, censorship is a disinformation strategy.
[300] If you censor that article, in other words, all the headlines where Twitter and Facebook are, you know, saying they're going to restrict the dissemination of this material.
[301] All that publicity is really what mattered.
[302] So in terms of like, you know, in my defense and other people that bought the idea that it was somehow a fake, we were being told by the media that everybody had looked at this and was kind of like, look, it looks like it's hacked, and there's something funny about it.
[303] So I think that, you know, I think there's so many shocking things about it, but I think it's the level of coordination and conformity within these social media companies.
[304] It was the pre-bunking in advance, and it was the complete, total, you know, just the complete news media blackout and unanimity that there was. And it was just all of them. I mean, it was like all the networks, all the newspapers. They all just repeated this idea that there was something wrong about the laptop, and there wasn't. So creepy. And it's so creepy that there's no repercussions. Yeah, it's essentially lying, and using taxpayer dollars to promote propaganda that they know to be untrue. But there is a chance, I mean, the attorneys general of Louisiana and Missouri are moving forward in the courts, suing the Biden administration for violating the First Amendment.
[306] You know, this is, of course, the Hunter Biden laptop thing is one of many things.
[307] I mean, maybe some of the craziest stuff of all is that Facebook censored accurate COVID vaccine side-effect information because it didn't want to promote vaccine hesitancy.
[308] In other words, the White House is like just pressuring them.
[309] I mean, this guy Andy Slavitt in particular is just this malign actor, just pressuring, pressuring, threatening them.
[310] They're nasty in these emails.
[311] The White House, nasty.
[312] In what way?
[313] Oh, just basically, you know, it's continuous.
[314] I mean, Biden does it publicly.
[315] They're killing people.
[316] Yeah.
[317] They're basically accusing people of, I mean, these guys, they don't, the gloves are off.
[318] I mean, they're just like, you're killing people by letting this information on it.
[319] I mean, the information is people telling their own stories of vaccine side effects.
[321] We always point out, like, it was one of the great public interest progressive victories in recent memory that the drug companies have to name the side effects of their drugs in their TV ads.
[322] Yeah.
[323] Like, that's a big part of it, right?
[324] It's like a running joke that you have to name the side effects in the TV ads.
[325] Well, here they, like, here were ordinary people trying to tell stories of the side effects that they had from the vaccine on Facebook and Twitter.
[326] And the White House is demanding that Facebook and Twitter censor that stuff.
[328] This is just the worst.
[329] I mean, that's just Soviet, Chinese-style censorship, like, full on.
[330] I mean, so it's not over.
[331] And I think that, you know, we've already seen, there's other things going on. Like that agency I mentioned, the part of the Department of Homeland Security, the Cybersecurity and Infrastructure Security Agency, they changed their website over the last few months to remove references to domestic counter-disinformation efforts and to emphasize countering foreign disinformation.
[332] They, you know, we talk a lot about this.
[333] One of the big leaders of the censorship industrial complex is this person named Renée DiResta, at the Stanford Internet Observatory, who you had on.
[334] Let's talk about that because I had her on and what she essentially was talking about was all these Russian troll farms and how interesting it is that they created all these funny memes and they used all these resources to try to shift the narrative and change public opinion on certain things, and that it was very effective.
[335] Yeah.
[336] Well, so let's just, let's start with Renée.
[337] Yeah.
[338] So first of all, Renée is somebody who I only came across because she's actually kind of moderate on a bunch of the stuff that I'm moderate on, like dealing with homelessness, on COVID.
[339] She's actually like a moderate voice.
[340] She's not super woke or anything.
[341] And she's critical of, like, she moved out of San Francisco because it's just too crazy.
[342] So she and I had this conversation.
[343] Like, we were talking about this even before.
[344] I started talking to her right when I started looking at the Twitter files.
[345] And we did this long interview.
[346] I was on Sam Harris's podcast with her.
[347] But then she starts showing up in the Twitter files in all these weird ways.
[348] And we start looking into her.
[349] It's a very, so she's also, the reason where she's so important is like, like, when you read the, you know, when you, when you follow the meetings or watch the YouTube videos, whatever, she's like one of the smartest people.
[350] Like, there's something going on with her.
[351] She's like a real leader.
[352] She's always sort of the number two.
[353] The other thing about these people is that they move around a lot.
[354] They move in between organizations.
[355] And she's always sort of the number two, but she always seems a bit smarter than the person that she's reporting to.
[356] But so, she's somebody who gets a computer science degree from the State University of New York at Stony Brook.
[357] That happens to be a major recruiting place for the NSA.
[358] She then gets a job at Jane Street, the trading firm, which is like one of the top-rated ones.
[359] It's like up there with Goldman or maybe better.
[360] It's where SBF from FTX was at.
[361] She was there.
[362] Then she had a couple of companies that did, like, logistics and cyber. Very high-powered, successful executive.
[363] And then, according to her story and the public story, she gets obsessed with anti-vaxxers.
[364] She's got young kids.
[365] She's obsessed with anti-vaxxers spreading anti-vax misinformation.
[366] This is long before COVID.
[367] I think it's around 2014, 2015.
[369] Next thing you know, she's, like, advising President Obama in the White House on counter-ISIS disinformation strategy, and advising on the expansion of something called the Global Engagement Center, which is the part of the State Department that counters disinfo.
[370] So suddenly she's like the senior person.
[371] It's very suspicious, very rapid rise.
[372] If you know anything about those communities, they're very hierarchical and like you have to work your way up over many years.
[373] She's instantly like, at the top.
[374] In 2017, she is at a consulting firm called New Knowledge that is then caught doing disinformation against an Alabama Trumpian Republican candidate named Roy Moore.
[375] They are caught doing fake Facebook pages accusing Roy Moore of wanting to basically restrict alcohol consumption in Alabama, which is a deeply unpopular position, and false, and also creating the perception of Russian bots supporting Roy Moore.
[377] Her firm runs that campaign.
[378] Afterwards, she sort of tries to distance herself from it, suggests that she wasn't involved, even though when you read the Washington Post and New York Times articles about that scandal, she makes it clear that she was actually the person that brought the funding in to run the program, and also kind of conceived much of the strategy.
[379] After that, she becomes the top researcher for the Senate Intelligence Committee's 2018 report on Russian disinformation in the 2016 election.
[380] So not only is she not punished for her role in it.
[381] She's rewarded by the Democrats with this incredibly powerful position.
[382] So she becomes like the lead witness, the lead author for Senate Democrats, Adam Schiff, in promoting the whole, you know, narrative that somehow Russians swung the election to Trump.
[383] And there's no repercussions for promoting this false information?
[384] No, I mean, she's rewarded for it.
[385] And no one talks about it?
[386] It's never...
[387] Well, I mean, we're starting to, right?
[388] But, I mean, I'll point out a couple other things.
[389] But before the Twitter files, I'm sorry to interrupt, but you didn't even know, right?
[390] So most people don't know.
[391] No. There's one guy we discovered, Matt Taibbi discovers him.
[392] And I only discovered him, like, whatever, a week or two before my testimony in Congress, which was
[393] a few weeks ago, not the one I did yesterday.
[394] We discovered this guy who was the head of cyber at the State Department, a senior guy named Mike Benz, and he is like super deep into this stuff.
[395] He's amazing.
[396] I highly recommend him coming on, but he runs something, he basically leaves State Department and starts something called the Foundation for Freedom Online, and he has been documenting this more than anybody.
[397] So he had it, but he's just really in the weeds,
[398] like, it's really detailed.
[399] You have to really, it was hard to understand.
[400] You have to really go through it and unpack it.
[401] I used a bunch of it in my testimony.
[402] I talked to him.
[403] I interviewed him a lot.
[404] But, I mean, you know, basically a media blackout on all of this stuff.
[405] Renée DiResta, who then moves from New Knowledge to the Stanford Internet Observatory. That organization and three other organizations, the Atlantic Council, Graphika, and the University of Washington, which has a think tank on this.
[406] They get government funding, and they run something called the Election Integrity Project in 2020, to basically demand censorship.
[407] By the way, if I just read the election integrity committee, I get super suspicious.
[408] Oh, yeah.
[409] Just the name of that.
[410] I mean, Joe, they basically would flag hundreds of millions of tweets.
[411] I believe that their database, they had over a billion social media posts, Facebook, Twitter, that they flagged.
[412] And tens of millions of them were censored.
[413] Are they running...
[414] That's insane.
[415] By the social media companies.
[416] Are they running some sort of a program that allows them to find those tweets?
[417] Yeah.
[418] You see it a lot.
[419] They do these maps.
[420] They have these maps where they just, they locate the super spreaders.
[421] So like, you and me would be super spreaders.
[422] Jordan, I'm in a, I mean, they attack me in this disinformation, this little, these, these sensors.
[423] Really?
[424] They have reports.
[425] They put, like, me, Jordan Peterson.
[426] Bjorn Lomborg, you, I mean, they put us in there.
[427] So anybody that has a big social media following, they call us super spreaders.
[428] And then they try to get us censored.
[429] And they did for me. They got Facebook to censor me. How so?
[430] Well, they, well, when my book on the environment came out, Apocalypse Never in 2020, I wrote an article that sort of summarized the book, as one does.
[431] It went super viral.
[432] Then one of these shady organizations attacked it, not for anything being wrong with it, but for it being misleading.
[433] It's the same way that they attacked the vaccine side effects stuff.
[434] They go, well, you know, it's accurate, it's true, but it leads people to draw the wrong conclusions.
[435] Right.
[436] Right.
[437] The wrong conclusion being that climate change is real, but not the end of the world.
[438] Or vaccines, the wrong conclusion would be maybe don't get the vaccine, or maybe if you're, you know, under 18, or you're a young man, or if you've had whatever.
[439] I mean, whatever it might be, you don't need to be triple-vaxxed.
[440] So they're basically using an opinion, which is you should get the vaccine, or you should think of climate change as apocalyptic, and then they kind of go through the back door and say anything that's being used to propagate that narrative should be counted as misinformation.
[442] Jesus.
[443] So she, so just to back up.
[444] So this little cluster, the censorship industrial complex, does this quote-unquote Election Integrity Project in 2020, and they censor tens of millions of social media posts.
[445] And by censor, do you mean they remove them?
[446] So by censor, I'm going to use the definition that everybody uses, which is: you can remove, you can reduce, or you can, they call it, inform, you can put a flag on it. That's what they do. Everything I do for Facebook now, almost everything I do, has a warning on it. You know, here's how to get accurate information about climate change, go to the Facebook climate change center. Even my stuff on homelessness and drugs, they'll be like, here's how to get accurate information on climate. That's how you know I'm on some list, I'm on some blacklist at Facebook. So, yeah, it's those three things. Those are all forms of censorship. And these groups, which are U.S. government-funded organizations, this is very important to stress.
[447] This is not some private actors.
[448] U.S. government-funded organizations pressuring the social media companies to censor these posts and these people.
[449] And they do it in 2020.
[450] And then Renée, who does this little video, it's like one of the creepiest videos that we've discovered.
[451] There's little videos that they do.
[452] She's sort of describing, you know, well, and then we realized that we needed to keep going on COVID.
[453] And so then in 2021, the election integrity project turns into something called the virality project.
[454] And that's where they then go and wage censorship on COVID information that they don't like.
[455] I refuse to use their language.
[456] And again, it's tens of millions of people.
[457] And so you see it at all levels.
[458] It's these guys doing it.
[459] We say the censorship industrial complex is, I think, the right description of what we're talking about.
[460] It's a phrase that, of course, comes from Dwight Eisenhower's famous farewell address.
[461] He goes, look, you know, you've got to worry the DOD is funding all these private military contractors.
[462] These private military contractors have a financial interest in war.
[463] This is Eisenhower, the guy that won World War II.
[464] I mean, it's like Mr. Credible on this issue.
[465] It's an amazing speech.
[466] It's amazing, beautiful.
[467] I mean, it's really the best of what a president can be.
[468] And he warns against this.
[469] So it's this complex, this kind of clustering of government agencies and government-funded groups.
[470] So, you know, in the case of the censorship industrial complex, it's the Department of Defense, it's the State Department, it's the FBI, it's the CIA, it's the Department of Homeland Security, funding these so-called think tanks, and sometimes they're at universities, or sometimes they're stand-alones.
[471] Some of them are in Britain, by the way.
[472] There's that special relationship between the U.S. and Britain. The U.K. think tanks right now are attacking me, trying to discredit me. So sometimes they'll go that way.
[473] They'll try to, like...
[474] Attacking you, how so?
[475] Well, they just put out a report.
[476] These guys are the worst.
[477] They put out this, like, long report describing climate disinformation.
[478] And like, I was like, as soon as I opened it up, I was like, fuck, I bet I'm in this.
[479] And I just do, like, Command-F, and I just start typing, Schellenberger, and sure enough,
[480] it's like, whatever, like, multiple results.
[481] I'm like, crap, you know.
[482] And, you know, often these are reports that don't get a lot of fanfare or whatever, but they make sure that they get emailed to a bunch of journalists.
[483] They talk to the journalists and they just, they basically just emphasize, never talk to this person, never quote this person, do not platform them.
[484] Then, by the way, after our testimony, that same Stanford cluster, it's actually more than one group at Stanford even, they emailed, I'm not going to say who, because I don't want to give away my sources, but they basically emailed many people about Matt's and my testimony, trying to attack our testimony and sharing information.
[485] So they're just the creepiest.
[486] They creep around.
[487] They're constantly waging disinformation campaigns against disfavored voices, and demanding censorship, while also spreading their own misinformation.
[488] God, it's so creepy that the people doing this don't understand how deeply un-American this is.
[489] Yes.
[490] And that they feel like it's okay to do because the side that they're on is the right side.
[491] You got it.
[492] You got it.
[493] It's so un-American.
[494] I mean, Joe, it's funny because, like, I mean, so I graduated from high school in 1989.
[495] I remember distinctly that that was the year that the Supreme Court upheld your right to burn a flag.
[496] And I remember just being like, God damn, that's why I'm a Democrat.
[497] That's why I'm a liberal.
[498] Like, I think you should be able to burn a flag, because of the First Amendment. Literally from that moment on, I never worried about the First Amendment in the United States.
[499] For me, it was, like, always kind of basic.
[500] Like, come on, guys.
[501] Like, it's the First Amendment.
[502] Like, how could it possibly be under threat?
[503] This was, like, one of the few times where, because I don't spook super easily, but, like, reading this stuff, you're just like, this is scary.
[504] It's so pervasive.
[505] These people are scary.
[506] And you may know, by the way, when Matt Taibbi and I were testifying before Congress a few weeks ago, an IRS agent shows up at Matt Taibbi's house, in person.
[507] Yeah.
[508] This is insane.
[509] The Wall Street Journal just wrote a piece about it a few days ago.
[510] I'm like, look, hey, you know, maybe it's a coincidence, whatever.
[511] I was just asking around people I know, and people were like, no way is that a coincidence.
[512] So this is brazen.
[513] These guys are trying to send a message.
[514] They're trying to intimidate.
[515] They want to ruin us. I mean, basically, for me, it's been years of just trying to survive, of them, you know, just trying to de-platform you, discredit you, keep you out of newspapers, off of TV shows, whatever, podcasts.
[517] And so, yeah, these guys are, they're ruthless.
[518] You know, it's definitely a hall monitor mentality, you know, like, and it's elitist.
[519] I mean, it's, like, Renée is a snob.
[520] I'll just, you know, she's, I agree with her on some things.
[521] I'm sure she's a fine person in her personal life.
[522] She's probably a good mother.
[523] I mean, I don't have any, you know, I'm trying to be Christian about this.
[524] But, I mean, they're snobs.
[525] Like, they literally, they, I remember at one point I briefly asked her about climate change.
[526] And, you know, we talked about the climate stuff.
[527] And I could tell that she felt like she was actually probably an expert on that too.
[528] You know, literally, I wrote a book, I spent 20 years of research going into my book.
[529] Fine, maybe I'm wrong.
[530] But, I mean, like, you have journalists out there, Joe.
[532] At all these big publications, they're, like, 23 years old, and they're like, I'm a disinformation expert.
[533] I mean, can you imagine being like, I'm a truth expert, Joe?
[534] I'm a truth expert.
[535] That's really what it is.
[536] A truth expert.
[537] You're a malign actor and a vector of disinformation.
[538] Whereas I'm a truth expert.
[539] So there's definitely that whole thing, you know, that Jordan Peterson talks about, which is like, I'm just pure and good.
[540] And it's reinforced within the group, right?
[541] This is a very tribal thing.
[542] You have these ideologies that these people subscribe to.
[543] But it's so disturbing, as a person, you know, who grew up liberal, to see this from the left, this hardcore censorship from the left, and this support of government disinformation that's purely aligned with monetary reasons.
[544] It's just about money.
[545] I mean, that's the only reason why they would be doing this.
[546] Money and power.
[547] Money and power.
[548] Money and ideology.
[549] And, you know, like I said, I think mostly the Western Alliance and NATO have brought peace, you know, since World War II, and I don't think we should be pulling out.
[550] And, you know, honestly, the extent that I've rethought my position on Ukraine is just because of these nefarious actors, like what are they really doing here.
[551] So, yeah, I mean, for sure, it is kind of what we've all known it is.
[552] It's, you know, the U.S. is part of this empire.
[553] And we're trying to make the world safe for Western capitalism and Western corporations.
[554] And, you know, that's actually lifted a bunch of people out of poverty.
[555] It's not totally negative.
[556] But obviously, you also get the Iraq invasion, which was terrible, and the Afghanistan occupation, which resulted in horrors.
[557] But you also get some things that aren't beneficial to anybody.
[558] If you're censoring information about the lab leak hypothesis, that's a real problem.
[559] Because if we are still funding gain-of-function research, or if we are funding it through a proxy.
[560] And they're denying this and lying about this and covering this up through emails.
[561] And then when you find out that certain physicians and doctors changed their testimony, or changed their opinion, and then received enormous grants, like, you're following a very obvious paper trail.
[562] Can we spend a minute on this? Because this is crazy.
[563] So, by the way, the New York Times finally ran a good story on this,
[564] just yesterday, and particularly around Fauci.
[565] So Fauci, of course, is famous for saying, I am science.
[566] Let's just pause.
[567] If you criticize Anthony Fauci, you're criticizing science.
[568] It's, I mean, first of all, it's a crazy thing for a human to say.
[569] It's a crazy thing.
[570] That is, so first of all, the word science, I was thinking the other day, like, it should just not be a noun.
[571] Like, science is a process.
[572] Yes.
[573] It's about, you know, a better word would be investigations, or investigating.
[575] I mean, the science.
[576] Yeah, it should be sciencing.
[577] Yeah, when you say the science, criticizing the science, like, no, you mean the data?
[578] Right.
[579] Like, are you talking about data?
[580] Well, yeah.
[581] Science is a process.
[582] A process.
[583] And you say the science.
[584] The science doesn't support it.
[585] It's a religion.
[586] I mean, he's saying, I'm the truth, the religion.
[587] I'm the holy priest.
[588] Yeah.
[589] In touch with God.
[590] It's just ego.
[591] It's, it's so transparent that he can't even hide it.
[592] Yes.
[593] Like, it's, it's pouring out of him.
[594] And I think this is such an interesting case, because the U.S. government banned gain-of-function research.
[595] Yes, in 2014.
[596] Right.
[597] NIH kept funding it.
[598] Yes.
[599] In China.
[600] So, and Fauci knew that.
[601] He knew that.
[602] And then.
[603] But didn't that restart in 2016 or 17 when Trump got into office?
[604] I'm not sure the exact timeline.
[605] You mean, just starting in China?
[606] The Obama administration stopped the funding.
[607] Right.
[608] And then it kicked back in in 2016.
[609] What had been explained to me was that the Trump administration was so chaotic that the NIH said, listen, let's just do this through the EcoHealth Alliance to make it simple.
[610] And that way we'll kind of like do it by proxy.
[611] Yeah.
[612] I think the punchline, though, is that Fauci knew very well that gain-of-function research was not only occurring at the Wuhan lab.
[613] but that it was being funded by the U .S. government.
[614] And then they get on these conference calls, and two of the main researchers, I believe they're both from Scripps, they both go, yeah, I don't know, it looks like it could have been manufactured in a lab and not from zoonotic spillover.
[615] So it's even more sinister than just being arrogant.
[616] It actually looks like a cover -up.
[617] It looks like a cover -up, and it looks like a cover -up where the people who covered it up were compensated.
[618] Oh, and not only that, but did you see, I don't know if you saw this recent report where there's, it looks like they were, they were double dipping.
[619] They were double charging.
[620] They were overcharging.
[621] So they were basically getting paid twice by U.S. taxpayers. CBS News, which is one of the few mainstream media outlets that has actually done a good job covering this.
[622] They also covered the Hunter Biden laptop accurately, belatedly, but they did.
[623] Yeah, they wrote about how these contractors were getting paid twice for the same work.
[624] So that's a way now to kind of get in there and try to figure out what's going on.
[625] You know, we're hoping to, I mean, the crazy thing is, back to the Twitter files, you know, Elon is obsessed with Fauci and wants to have the Fauci files, but none of us have looked for this in the Twitter files.
[626] Like literally nobody has yet even looked to see whether or not this COVID origin stuff was being censored from within Twitter.
[627] So we don't know yet.
[628] I mean, we've just been backed up in a lot of other stuff.
[629] Wow.
[630] So this other stuff has so preoccupied all of your time. So is that next on the agenda?
[632] I hope so.
[633] I mean, I, um...
[634] Should you just be proclaiming that publicly?
[635] No, it's, uh, I mean, uh, you mean that we want to look for it?
[636] No, I mean, Elon proclaimed it.
[637] Yes.
[638] Elon promised the Fauci files.
[639] Well, what he literally said is his pronouns are prosecute/Fauci, which is wild.
[640] Yeah, I would have liked to do the Fauci files first and then make a judgment.
[641] Well, the thing, he's smart.
[642] And what he's doing is, he's, like, firing a shot across the bow and then causing people to scramble and reveal their intentions, and reveal, like, what they're trying to accomplish.
[643] Yeah.
[644] Like it's a chess move.
[645] It's a good chess move because it gets people talking and then it gets people talking about Fauci.
[646] I have no idea what he is talking about.
[647] It's just craziness.
[648] Oh, it's craziness.
[649] Is it really?
[650] Well, maybe someone's going to go look at AZT.
[651] Maybe someone's going to go back and look at the way you guys handled the AIDS crisis.
[652] Because if you look at Robert Kennedy Jr.'s book, The Real Anthony Fauci, if that book is accurate, I don't know if it's accurate.
[653] I'm assuming he hasn't been sued yet.
[654] It's a terrifying book.
[655] When they talk about the AIDS crisis and what they did, it's essentially a version of what you're seeing now, but with no internet.
[656] Where they were allowed to do things with no investigative journalists, no social media outrage, no people posting different studies that contradict what they're saying.
[658] Yeah.
[659] It's a wild book, man. It's a wild book of unchecked power and influence.
[660] And also, like, an absolute disdain for what is beneficial to human life and the American people.
[661] And instead, what is great for profit.
[662] Yeah.
[663] I mean, it's an abuse of power.
[664] You know, we had these crazy abuses of power,
[665] you know, under Nixon during the Vietnam War, late '60s into the '70s.
[666] We had this thing called the Church Committee hearings.
[667] Yes.
[668] It was bipartisan.
[669] It did result in a bunch of reforms.
[670] You know, we had a bunch of reforms that basically prevented the federal government from spying on the American people.
[671] Well, that's out the window.
[672] Well, yeah.
[673] I mean, we need a new Church Commission.
[674] The Democrats are the obstacle to it.
[675] The Republicans are doing this weaponization of the federal government hearings.
[676] But you need both parties to do a proper cleaning out of these bad actors.
[678] I mean, hopefully the lawsuits, the Twitter files, you know, just talking about this and testifying about it, I think that actually helps, because sunlight is the best disinfectant.
[679] But no, you're right.
[680] We've got to defund and dismantle the censorship industrial complex.
[681] But we also need to hold people accountable who were doing this illegal thing.
[682] I think that's the only way.
[683] If people aren't held accountable, then it seems like you can just do it again and get away with it.
[684] And then everybody just sort of moves upward and gets rehired at new organizations.
[686] That's right.
[687] They kind of hide.
[688] They kind of get quiet for a little while, and then they'll come back.
[689] So absolutely.
[690] It's funny, because as you get older, you're like, wow, those clichés are true — you know, like the famous Jefferson one, the price of freedom is eternal vigilance.
[691] I remember being like, that's so cringe, you know, a few years ago.
[692] And now I'm like, wow, that is profound.
[693] It's so true.
[694] We were talking about this last night that, you know, when I was texting Elon about all this stuff, he was like, he's hilarious.
[695] He's like, turns out all the conspiracy theories were true, LOL. I mean, he thinks it's funny.
[696] He's so casual about it.
[697] I'm like terrified.
[698] I'm like white-knuckling the whole thing, being like, this is scary.
[699] I guess having $200 billion really puts a nice cushion on like the repercussions for whatever the fuck you do.
[700] Other than him getting assassinated. And he has publicly stated, I am not suicidal, and I think he's legitimately concerned that could be something that happens to him.
[701] His security detail is amazing. It should be.
[702] Yeah, yeah.
[703] Should be beyond amazing.
[704] You should have fucking Iron Man guarding him.
[705] Even better than your security detail, man. I have to step it up after this interview.
[706] No, for sure.
[707] I have to pee.
[708] I'm so sorry but I'm drinking a ton of water and this is so embarrassing.
[709] I used to be able to go for three hours.
[710] No, man, that's not.
[711] We'll be right back.
[712] No, that's fine.
[713] And we're back.
[714] Okay.
[715] Where were we?
[716] So much to talk about.
[717] Something bad.
[718] The Long March to totalitarianism.
[719] Yeah.
[720] It's disturbing because it seems like that's just how it goes.
[721] Like they just keep acquiring more power and no one notices and no one says anything.
[722] And then it just moves very slowly.
[723] Like Jordan Peterson outlined this.
[724] He outlined this.
[725] He was talking about how change doesn't happen in these big jumps.
[727] What they do is they move you and push you just incrementally.
[728] Right.
[729] And you don't say anything and they push you a little forward.
[730] And before you know it, you're so far removed from where you started.
[731] Right.
[732] And you didn't even notice it.
[733] Right.
[734] It's changing the norms.
[735] That's why I think — you know, we were talking about this person Renée DiResta — but these other groups in the censorship industrial complex, they're constantly promoting the idea that it's okay and necessary to have more censorship.
[736] I've testified twice now in the last three weeks, and both times the Republicans were like, why are we taking stuff down?
[737] And the Democrats are like, we're not taking enough stuff down.
[738] I mean, there's this sense that more stuff needs to be censored.
[739] That's the idea they're trying to promote.
[740] It's bizarre.
[741] Again, this is the party that defended flag burning.
[742] Yes.
[743] It's really spooky to me that it's so transparently revolving around money and power.
[745] There's no real protection, especially when you look at what happened during the COVID crisis. If you could just look at it now and go over it and say, what were you really trying to do?
[746] It seems like what you're trying to do is make as much money as possible for the pharmaceutical companies.
[747] That seems like what you were doing.
[748] Like this whole idea of vaccine hesitancy, once enough data was out there, particularly when you talk about vaccinating people that had already had COVID, like, that's preposterous.
[749] It doesn't even make sense.
[750] It doesn't make sense medically.
[751] It doesn't jibe with the studies.
[752] Like, all this is very strange.
[753] And this idea that you're stopping vaccine hesitancy that comes from real data.
[754] Like, that term is so creepy.
[755] Because what you're suppressing is side effects.
[756] You're talking about not telling people about the dangers of something, which has always been something that we considered with every drug.
[757] And you're hiding it.
[758] Absolutely.
[759] And also, it's not like — this is not the same as measles or mumps.
[760] This is very different than that.
[761] And you don't get herd immunity with the COVID vaccine.
[762] And so, I mean, you have to remember what's crazy about it too: you go from, well, we're going to have a vaccine, and then we're not going to get it, and then we're not going to spread it, to, okay, well, you might still get it, but it won't be as bad, but you won't spread it, and then, well, you might get it, but it won't be as bad, and you might still spread it. So then it's kind of like, well, then why mandate this? Why not just let it be personal choice?
[763] Did you see the video that was released recently of Fauci in the hood? Yes, it's amazing. With those black residents. It's amazing. The one guy's like, something else is going on. Yeah. And Fauci's explaining, if you get it, you'd barely notice it, which is just a fucking lie. Right. People have died that have been vaccinated. Yeah, they've died from COVID. You're a fucking liar. Yeah. And also, they never tested it to stop immunity, or, right, to stop transmission. Yeah, they just knew that it was giving some sort of antibody protection. I think some of it is — I think definitely money is playing a role, and it's foundational, but it's also just this moralized... It's the sense of wanting to take care of people.
[764] It's a lot of stuff that we talked about on homelessness.
[765] It's this, you know: to victims, everything should be given and nothing required.
[766] Although in this case, of course, the requirement is that they take the vaccine.
[767] But it's a sense, it's paternalism.
[768] It's also attached to the ideology.
[769] It's attached to this left -wing ideology.
[770] And the right -wing people are like, you're not going to get me with that jab.
[771] And the left -wing people are like, I'm not boosted enough.
[772] Let's keep going.
[773] It's very strange to watch people put this blind trust in pharmaceutical companies and demonize people who don't step in line with it.
[774] But it's also a bit like the Kathy Bates character in Misery, which is like, I'm going to take care of you.
[775] You know, there's like, I really want to take care of you.
[776] It's like, I don't think so.
[777] You want to take care of me too much.
[778] Yeah.
[779] So it's like care when care becomes creepy.
[780] Well, it's also, it's enforcing group think.
[781] Right.
[782] That's a big part of it.
[783] Groupthink is a natural inclination that people have.
[784] But it's accelerated by the rise of the Internet and the rise of these voices.
[785] So people like you, you trigger people, because it's like, oh my, there's people out there that are influential that are saying things different than what the mainstream is saying.
[786] It freaks them out.
[787] What should freak them out is that CNN said I was taking veterinary medicine.
[788] Right.
[789] That should freak them out.
[790] And I think it did freak a lot of people out.
[791] Right.
[792] Instead of saying, hey, how'd that guy get better so quick from some horrible, deadly disease? And three days later — I mean, when they used my face and put it through a filter to turn me yellow, like, all of it was wild.
[794] Right.
[795] For a person to be in my position and watch it, it was really interesting, because, first of all, it's like, I'm not on a network.
[796] Like, you really can't get rid of me. Right.
[797] And second of all, I have a lot of money, so even if I stop working, you're not going to hurt me. I'll find something.
[798] I'll figure something out.
[799] Like, this is not like the 1970s, when you could just get someone removed from a television show.
[800] Like, when they attacked the Smothers Brothers for their criticism of the Vietnam War.
[801] This is a different thing.
[802] Right.
[803] Like, you're in a different landscape, and I don't think you understand where you're at.
[804] Like, you're playing this game where you don't even understand the numbers.
[805] Well, and I think you said, too, you benefited, right?
[806] They came after you, and you had a big boost.
[807] Two million subscribers in a month I gained.
[808] Thank God for the Streisand effect.
[809] Yes.
[810] It also sold my book.
[811] Yeah.
[812] I mean, on the one hand, being censored is such a horrible experience.
[813] It really feels dehumanizing to be deprived of your voice, or to have this super powerful media company being like, Schellenberger is spreading disinformation.
[814] It's just like, oh, my God.
[815] Was this the San Fransicko one?
[816] No, that was Apocalypse Never.
[817] Apocalypse Never.
[818] But on the other hand, you know, I think the response from people was, well, I want to go read that book.
[820] Yeah.
[821] And so there is a way in which it's an interesting thing where the regime goes too far.
[822] It also made me question scientific papers for the first time.
[823] When I was informed by people who don't want to talk about it publicly, how these things work.
[824] Like when I talk to people who are physicians who said, listen, this is why I can't talk about this publicly.
[825] This is why I can't discuss this.
[826] And this is why, when you read a scientific paper and you read the conclusion, what you don't understand is that the study was designed to show one very specific outcome.
[827] And if it didn't, you would never see it.
[828] That happens all the time.
[829] I would have never imagined that before COVID.
[830] I thought that when there's any sort of scientific study or a medical study or anything about something, what they're trying to do is find out what's true.
[831] I did not know that they can do 10 studies, and if eight of them show negative side effects, they could remove those and just find some carefully constructed, very biased study that points to a very specific outcome that's desired.
[832] Oh, absolutely.
[833] I didn't know that.
[834] That scares the shit out of me. Well, because they don't publish null findings.
[835] You know, they only publish if they get a finding.
[836] So then you don't know all the cases where it's like they didn't find anything or they found the opposite results.
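What's being described here is publication bias. A minimal simulation sketch — every number below is hypothetical — of what the published record looks like when only the trials that happen to show a positive, significant result get written up:

```python
import random
import statistics

# Hypothetical simulation of selective publication. The drug here has
# zero true effect; we run many small trials and "publish" only the
# ones that happen to look positive and significant.

random.seed(42)

TRUE_EFFECT = 0.0   # the drug does nothing, by construction
N_TRIALS = 1000     # hypothetical independent studies
N_SUBJECTS = 30     # subjects per study

published = []
for _ in range(N_TRIALS):
    outcomes = [random.gauss(TRUE_EFFECT, 1.0) for _ in range(N_SUBJECTS)]
    mean = statistics.mean(outcomes)
    sem = statistics.stdev(outcomes) / (N_SUBJECTS ** 0.5)
    if mean / sem > 2.0:          # roughly "positive and p < 0.05"
        published.append(mean)    # only these trials enter the literature

print(f"published {len(published)} of {N_TRIALS} trials")
if published:
    print(f"apparent effect across the published record: "
          f"{statistics.mean(published):.2f}")
# The published record shows a consistently positive effect for a drug
# that, in this simulation, does nothing at all.
```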
[837] Oh, even worse — when I talked to John Abramson, he explained to me that when they do a peer-reviewed study on, say, a pharmaceutical drug, you're not really doing peer review of the data.
[838] You're doing peer review of the pharmaceutical company's interpretation of the data.
[839] So they don't have access to the actual study.
[840] They don't have access to the data.
[841] They have access to the conclusions that are given to them by the pharmaceutical companies, and then they review that, which is fucking insanity.
[842] That's like the wolf telling you what he did to the henhouse.
[843] It's like, you know, like basically they were all dead when I got there.
[844] Right.
[845] I mean, this should be a moment of great humility.
[846] I mean, my parents, you know, they had a high-carb diet.
[847] They thought that proteins and fats were bad.
[848] This was just based on the worst science.
[849] The food pyramid.
[850] You know, the food pyramid.
[851] And, at least my father — both my parents have Parkinson's.
[852] I think that all of that sugar and insulin cycling had some role in that.
[853] And there should be a moment of humility, to be like, science really misled us. You know, various authors have done good debunkings of how we got there.
[854] But this would be a moment of great humility.
[855] But instead, we're seeing the elites in particular responding with more dogmatism, more certainty, more arrogance.
[856] They're trying to cover their tracks.
[857] Yeah.
[858] And cover their ass.
[859] They're in the grip of an ideology.
[860] And I think there is a panic.
[861] You know, they see you succeed.
[862] They see people like me or Bjorn or others.
[863] Substack, the rise of Substack.
[864] And they absolutely.
[865] So this is The Revolt of the Public by Martin Gurri.
[866] He argues that really all of this is just the elites freaking out about the rise of the internet.
[867] And that the response is very similar to the response to the printing press.
[868] You know, the printing press suddenly makes books available, and the elites in Europe freak out.
[870] Yeah, I just found out recently, like fairly recently, that some of the earliest books, the really popular ones, are about witches, finding witches.
[871] I always assumed that books in the early days, like, oh, what a great thing the printing press was.
[872] When the printing press came about, people got access to all this knowledge and information.
[873] No, no, a lot of the early books were about how to spot a witch.
[874] Oof, scary.
[875] Which kind of makes sense, because it's what a lot of the Internet is.
[876] I mean, you get on, like, Reddit conspiracy.
[877] I go to the Reddit conspiracy page every now and then, and I'm like, what's the looniest shit that they have?
[878] And you'll find some.
[879] And you're like, whoa, you know.
[880] Yeah.
[881] Well, now we're seeing the social contagions.
[882] Yeah.
[883] So the big one, I mean, the big one, of course, that we're all talking about is the trans issue.
[884] Where we're now seeing it — and that issue, by the way, has completely changed in Europe, you know, particularly in Britain, where there's a big new book out, Time to Think, about the Tavistock gender clinic.
[885] But basically, it looks as though a lot of autistic kids, or kids on the autism spectrum, who are just uncomfortable in their bodies and more prone to black-and-white thinking, are basically being misdiagnosed with gender dysphoria.
[887] And then you also have a, you know, a different group of folks, maybe kids that would end up being gay or lesbian if they didn't transition.
[888] Yes.
[889] Who become convinced that they are the opposite sex.
[890] This is one of the ideas is some of it's a social contagion.
[891] In other ways, it's iatrogenic, which means that it's actually caused by the medical profession.
[892] So you start to get doctors and others misdiagnosing people.
[893] I mean, we just published a piece on this — this was what happened with anorexia and bulimia.
[894] You know, these doctors identify eating disorders, and then they publicize them, and it gets all this publicity about it, and then the disorder spreads.
[895] Yes.
[896] So it's really tricky.
[897] I mean, it's not...
[898] Well, then there's all these gender-affirming care clinics that pop up, and they're enormously profitable, which is terrifying.
[900] Right.
[901] Same as Eisenhower's speech about the military-industrial complex — they have a vested interest in going to war.
[902] These people have an interest in diagnosing people with gender dysphoria, which is terrifying — to think that their opinions and their diagnoses would be based on something other than what's going on with you.
[903] Like, they have an incentive.
[904] And that was also during COVID.
[905] They were incentivized to give people certain medications.
[906] They were financially incentivized to put people on ventilators, financially incentivized to mark deaths as COVID deaths.
[907] Like all this is so enlightening because I never would have expected that.
[908] I never would have suspected that at all before COVID.
[909] Before the pandemic and all this chaos and all the things that I've seen, my whole view of how the world runs was completely different. Oh, absolutely. I mean, it's funny, because you had Abigail Shrier on. Yes. This big book on transgenderism as a social contagion, I think it was in 2020. I remember at the time being like, I think what she's saying makes sense, but it's so horrible to consider. It took me like three years to finally work on it, or write on it. But I thought, you know, part of what's — I mean, first of all, people with autism spectrum should be up in arms and outraged about the mistreatment of people with autism by these gender clinics.
[910] The other group that would be completely up in arms is gays and lesbians.
[911] Yes.
[912] I mean, Andrew Sullivan, to his credit, is speaking out on it.
[913] I mean, I didn't quite understand.
[914] Abigail had to explain it to me, because I would read all of her stuff, but sometimes you just, like, miss some of it.
[915] These kids who go through this gender transition not only are infertile afterwards, but they don't have sexual pleasure.
[916] I mean, think about it: the gay, lesbian, and bisexual movement spent decades basically making everybody comfortable with the fact that gay people should be able to get sexual pleasure from their sex.
[917] And everybody's kind of like, you know, most people are heterosexuals and most people are like, that's strange.
[918] It took a long time to be like, no, we celebrate that.
[919] That's great that you can.
[920] Yes.
[921] And we know sex is an important part of long-lasting relationships.
[922] So to actively deprive children of that, you're not just sterilizing the kids.
[923] You're depriving them of sexual function and then being able to bond with somebody.
[924] I mean, how do you look at that and not go, this is really disturbing?
[925] It's disturbing and it's thousands of people.
[926] Yeah.
[927] Yeah.
[928] I mean, just the idea of doing that operation to someone and removing their ability to have an orgasm, you know, there's people that have talked about these detransitioners.
[929] And if you've ever watched any of those videos, those videos are horrific.
[930] And those were censored.
[931] Those were, like, censored from social media and, you know, stopped from being able to be spread, which is crazy.
[932] You're talking about someone's actual lived experience with essentially genital mutilation that's state sanctioned.
[933] Absolutely.
[934] We just did the interview with the first Canadian detransitioner to sue her medical providers.
[935] And I said to her, she was very smart.
[936] She's very thoughtful, such a good person, Michelle.
[937] And I was like, have you ever thought about how your medical mistreatment compares to other forms of medical mistreatment in history?
[938] And she said without hesitation, she goes, yeah, lobotomies.
[939] And I was like, wow.
[940] She was like, well, what's amazing is how long they went on.
[941] How long they went on, I mean, with no benefit.
[942] You know, I mean, John F. Kennedy's sister was lobotomized, and, you know, she probably had schizophrenia.
[943] She was disabled, I mean, by the lobotomy.
[944] It's a scrambling of the brain.
[945] Yeah.
[946] I mean, it went on for decades.
[947] Oh, it's just, you know, it's surgery to solve a psychiatric disorder or mental illness.
[948] And then I was also like, do you ever think that maybe transgenderism is a cult?
[949] And without hesitation: yes, you know, it's a cult.
[952] Well, they certainly behave like one.
[953] Yeah.
[954] You know, there's all these articles that came out about the misgendering of the school shooter, which is so fucking wild.
[955] This is insane.
[956] First of all, that person's dead, okay?
[957] It doesn't matter if you call it a boy or a girl.
[958] That's a dead person who killed three children and three adults in a horrific way, went into a school and shot a bunch of people up.
[959] And it's a biological male.
[960] It's a biological male, which, by the way, is all shooters, all school shooters.
[961] Almost all shooters in general are biological males.
[962] I thought that it was — oh, okay, I thought he was a trans male.
[963] No?
[964] I do not believe so.
[965] See, that's how confusing it is.
[966] It is confusing.
[967] It's so confusing.
[968] Well, and of course they're calling it a woman in all the mainstream media now.
[969] Okay.
[970] And they have apologized for misgendering.
[971] I see.
[972] Some people have.
[973] Okay.
[974] Which must mean you're talking about a biological male.
[975] Right.
[976] Let's find that out.
[977] Let's be real clear because I'm 99 % sure, but I just want to be 100 % sure.
[978] But I think it's interesting.
[979] I mean, what's clear is that there was misgendering going on.
[980] What does that mean?
[981] Yeah.
[982] It's, look, look, I think this whole thing is nonsense.
[983] Yeah.
[984] I really do.
[985] I think it's fucking nonsense.
[986] If you have a biological male with a penis who shot up a bunch of people, then that's a man. I don't give a fuck what their feeling is.
[988] If an archaeologist found their body 5 ,000 years from now, they would say that's a skeleton of a male.
[989] I have to say, I'm coming to the place where I think that gender itself is just not a thing, and that it's really — there's just... okay, so, police say, Audrey Hale.
[990] Oh, it's a trans male.
[991] Yeah.
[992] So it's, okay.
[993] So why are they saying a woman?
[994] Why are they giving it a woman's name?
[995] So it's a female that took hormones. So is this the first ever biological female mass shooter? Yeah, I mean — first of all, biological women don't commit, you know — this is crazy. I mean, I think that's a tiny percentage of homicides. I am so confused, because I swore I read — I think everybody's confused on this. Yeah. This is a biological female. Are you confused? I found an article that says it was born Aiden Hale. Right.
[997] But I don't...
[998] That's why I'm confused.
[999] My understanding is that she was a natal female who transitioned to become a trans male, he, and that he was then misgendered by the mainstream woke media as a woman.
[1000] Oh, my God.
[1001] I mean, right.
[1002] This is why?
[1003] This is a perfect case of it.
[1004] The Aiden is the new name.
[1005] Audrey was the original.
[1006] Oh, so when they called her Audrey, they were deadnaming her.
[1007] Oh, my God.
[1008] Oh, my God.
[1009] So, meanwhile, I thought I was right, and I was dead wrong.
[1010] Right.
[1011] So this is the first ever school shooter that's a biological female.
[1012] I don't know.
[1013] Is that true?
[1014] I believe so.
[1015] Yeah.
[1016] I believe so, which is crazy.
[1017] Which also speaks to the effect of testosterone.
[1018] Well, that was, yeah, I mean, we're speculating.
[1019] I don't know.
[1020] If this person was on testosterone.
[1021] Assuming he — because I don't want to deadname.
[1022] Well, you're not deadnaming.
[1023] By saying he, you're misgendering.
[1024] Oh, right.
[1025] No, I'm saying, no, no, no, no, I'm saying.
[1026] You don't even know when you're doing it.
[1027] This is all nonsense.
[1028] I know, it is.
[1029] You're saying he — by saying he, you're not deadnaming.
[1030] Right, by saying he, I'm giving him the name that he wanted.
[1031] Aiden.
[1032] Which was, he wanted to be a he, even though he was a biological female.
[1033] What mental gymnastics we have to do for this craziness.
[1034] I think you were the first one that really drew attention to the idea that all this confusion around sex and gender was a symptom of civilizations in decline.
[1035] Yeah, well, it was, I got it from Douglas Murray.
[1036] Oh, Douglas Murray.
[1037] Yeah, Douglas Murray talked about this, that it seems like every civilization when they're at the brink of collapse becomes obsessed with gender.
[1038] And he talked about ancient Greece and ancient Rome, and it just seems like a thing that people do when there's no real, like, physical conflict.
[1039] Right.
[1040] So people look for conflict that doesn't exist.
[1041] And they find conflict in standard norms.
[1042] They find conflict in societal norms.
[1043] I did a thing with Peter Boghossian on wokeism as a religion, because I had read John McWhorter's book, Woke Racism, which came out right around the time that San Fransicko came out.
[1045] And he argues that wokeism is a religion.
[1046] He argues that, like, the obsession with race is a religion.
[1047] So we just created this taxonomy.
[1048] We just listed, you know, climate change, race, trans, drugs, whatever, all these things.
[1049] And then we created all these religious categories.
[1050] And it was, like, really easy to fill it out.
[1051] They all looked like a religion.
[1052] I called Abigail.
[1053] And I was like, what is trans as a religion?
[1054] Is trans a kind of religion?
[1055] She was like, let me get back to you.
[1056] A year later, she calls me and she goes, hey, I think I figured it out.
[1057] And I was like, all right, what is it?
[1058] She goes, gender is the new soul for secular people.
[1059] It's something that you can't see.
[1060] There's no physical basis to it.
[1061] You have a sex.
[1062] Like you can, you know, take off all your clothes and you don't even need to do that, actually.
[1063] We know that, like, we can recognize someone's sex very quickly and easily, actually.
[1065] So then what is it? So it's a new soul.
[1066] For me, I think the secularization explains a lot, because we know that people get a lot of psychological comfort out of believing that they have an afterlife, that they have a soul, that they go to heaven or get reincarnated, that their lives have purpose and meaning, that they don't really die and that we live on.
[1067] We just know that that provides a huge amount of psychological comfort.
[1068] So there's always been this thinking that when you don't have that anymore, if you are taught to believe that at the end of your life, you just become worm food, and that's it, and you're dead.
[1069] There's some people — my friend Steven Pinker is an atheist, and that's what he thinks, and he still believes.
[1070] But he also has a kind of spirituality around reason and the enlightenment.
[1071] But I think all this stuff, it's sort of the end of civilization, but it's also this end of belief in religion.
[1072] I don't know, Jamie, if you can pull it up, but the Wall Street Journal published this amazing article about declining patriotism, declining belief in the country.
[1073] Yeah, it's shocking.
[1074] Patrick Bet-David sent me that.
[1075] Yeah, I mean, the numbers — I think it's from like the late 90s until today, the last 25 years.
[1076] First of all, it's terrifying.
[1077] You just kind of go, I hope these trends are nonlinear, and something's going to turn around,
[1078] because otherwise it doesn't look good, yeah.
[1079] So you get the elites trying to gain control over the society, the society not having any foundational myths — yeah, these numbers here.
[1080] Yeah, patriotism, decline.
[1081] And look at it, having children.
[1082] Having children — Jordan Peterson sent me this thing, that 50% of women, when they reach the age of 30, don't have kids.
[1083] And of those women, 50% will never have kids, and 90% will regret it. We're in this very strange sort of existential crisis as a civilization that's not being recognized.
[1085] And in the meantime, we're distracting ourselves with things like Greta Thunberg's take on climate, or, you know, whether or not gender is a social construct, or, you know, whether or not the United States should be doing X, Y, or Z. It's like, no, the fucking whole thing is falling apart.
[1086] The foundation of our civilization is falling apart.
[1087] Right, where the elites are waging war on the First Amendment.
[1088] In the name of protecting democracy, they're undermining democratic institutions.
[1089] In the name of maintaining legitimacy, they're delegitimizing these institutions.
[1090] In the name of reinforcing ideologies, people are allowing them to do it, because they think they're doing it on the right side.
[1091] Yeah, so it's climatism, it's COVIDism, it's wokeism.
[1092] And you know what's scary?
[1093] It's all happening with the rise of artificial intelligence at the same time.
[1094] That's what's really scary.
[1095] I mean, you want to talk about the true end of civilization.
[1096] The coinciding of artificial intelligence, at least seemingly becoming fairly sentient.
[1097] Like, I don't know what the fuck is going on, but I know that one Google engineer said that AI had become sentient quite a while ago, and everyone's dismissing him, like, oh, no, no, no. My friend Duncan Trussell interviewed him, and it's a goddamn terrifying interview.
[1098] When you hear, this guy's not a nutter.
[1099] He's a little nuts, all engineers are a little nuts.
[1100] But he's essentially saying, like, hey, I'm pretty sure this thing's alive.
[1101] And when do you get to decide that it is alive?
[1102] If it can answer every fucking question you have about anything, and it's far more intelligent than any human being that's ever existed, ever.
[1103] Like, what are we doing?
[1104] Well, did you read the New York Times' Kevin Roose interview with — it wasn't ChatGPT, it was a different AI platform. Was it Microsoft's?
[1105] Oh, there's multiple ones simultaneously.
[1106] But it was like the AI — the AI said it had fallen in love with him.
[1107] Yes.
[1108] And was trying to break up his marriage.
[1109] Jesus.
[1110] It was, like, the craziest.
[1111] I was never worried about AI until I read that interview.
[1112] And I was like, this is insane.
[1113] And we are only at the door.
[1114] We haven't even entered into the building.
[1115] Well, and it's funny because, so I know a lot about nuclear.
[1116] So when we get the power of nuclear during World War II — it ends the war — there is a huge response to figure out how to manage this thing, how to regulate this technology, how to control it, how to prevent it from spreading, how to prevent bombs from going everywhere.
[1117] And there was a bunch of problems with it.
[1118] But the society responded by saying we need to get control of it.
[1119] Are we doing that with AI?
[1120] No. I mean, like, what's the...
[1121] Elon actually just called for some sort of a six-month ban on, you know, the propagation of this stuff, and to have a conversation about it, which is fairly reasonable — six months — but...
[1122] And he's got a lot of credibility on it, because he helped to fund the nonprofit that gave rise to ChatGPT, right?
[1123] I don't think they're going to listen to him.
[1124] I think there's...
[1125] Well, also, we're back to this whole profit thing.
[1126] You know, there's enormous profits involved in this stuff, and the race to...
[1127] figure this out first and really develop, like, a god. Which is what it's going to be. What it's going to be is something that can make a better version of itself. As soon as ChatGPT, or whatever this sentient artificial intelligence is, gains autonomous control and has the ability to make itself better, then we're really fucked, because it's going to make much better versions of itself, like that. And it's going to make a version of itself that literally is going to be a god, if you just scale it exponentially, you know, like we do with computer technology, like anything else, but do it in a quantum leap — some spectacular, massive improvement, almost instantaneously, over and over and over again.
[1128] Over the course of a couple of weeks, you're looking at a god.
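As a back-of-envelope sketch of that compounding claim — all numbers made up, assuming each self-improvement cycle doubles capability and takes a day — the arithmetic looks like this:

```python
# Hypothetical arithmetic for the recursive self-improvement scenario
# described above. Every number is made up: assume each cycle doubles
# capability and takes one day, and run it for "a couple of weeks."

capability = 1.0   # starting capability, arbitrary units
DOUBLING = 2.0     # hypothetical gain per cycle
DAYS = 14

for day in range(1, DAYS + 1):
    capability *= DOUBLING
    print(f"day {day:2d}: {capability:8.0f}x baseline")

# Under these assumptions the system ends up ~16,000x its starting point
# in two weeks. The specific numbers mean nothing; the point is how fast
# fixed-ratio improvement compounds once it is self-applied.
```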
[1129] Well, yeah — I mean, did you ever read Dune? Remember the solution from Dune?
[1130] What was the solution?
[1131] They just banned it.
[1132] Remember they had the Mentats, the guys that would do all the calculations in their heads because they didn't want to use AI?
[1133] Right.
[1134] Oh, that's right.
[1135] That's right.
[1136] So, yeah, I mean, the thing that gives me hope is — I mean, America, we've had some pretty dark moments in the past.
[1137] I mean, Watergate coming out of Vietnam, we did have a kind of correction.
[1138] I feel like it needs to start with some affirmation.
[1139] I mean, I think the trans issue is interesting.
[1140] It does for me. We just interviewed Jesse Singal on this, who's very, very liberal and progressive still, even though he's been a critic of gender ideology, or gender theology.
[1141] And we were like, is sex real?
[1142] I mean, do you believe that it's real?
[1143] And he was like, yeah, I mean, obviously. In some ways I go, I think the reason I was interested in it was, we have to start with some foundational stuff.
[1144] And that would be acknowledging that we are biological creatures that have a sex.
[1145] Yeah.
[1146] And that there's two sexes.
[1147] And then, if I build on that, I go: there's a healthy and an unhealthy way to live.
[1148] I think you talk a lot about this.
[1149] I've been seeing you throwing shade on people that are trying to control other people's lives who are themselves unhealthy.
[1150] Yes.
[1151] I think it starts with health.
[1152] You know, our schools — I mean, our kids are unhealthy.
[1153] We're unhealthy.
[1154] The society needs to reaffirm that — not in some government-imposed way, but, I think, culturally.
[1155] So you kind of go, look, we're humans, we're mortal.
[1156] We have a sex.
[1158] We have two sexes.
[1159] We need to reaffirm health.
[1160] And I think the other thing — you mentioned Greta Thunberg — humans are good.
[1161] I think you have to affirm the goodness of humans in some way.
[1162] You know, Jordan's response to this — what we're talking about is basically nihilism, this kind of deeply negative, self-destructive view that humans don't have any value or any worth or any meaning.
[1163] I think the response from a lot of people on the right has been to just affirm Christianity, the Judeo-Christian tradition — Ben Shapiro, Jordan Peterson.
[1164] You know, my problem with that is that America is not founded on a religion.
[1165] It's founded on an enlightenment view, you know, that we have unalienable rights.
[1166] All humans are created equal.
[1167] We obviously didn't live up to that in 1776 or 1789, but we've done a pretty good job of getting there over the last two and a half centuries.
[1168] For me, it's like a punk rock moment: things got too crazy, and you need to just simplify and come back to some basics. And I think you get to: humans are good, we have two sexes, it's better to be healthy than unhealthy, and there's a right and wrong way to do that. But it seems like more people are embracing this transgender ideology than are saying we need to stop. Yeah, but happily, not all trends are linear, so the trends can return, they can reverse themselves. The problem with this trend is it incorporates surgery.
[1169] Like, surgery's involved in this trend, which is one of the things that I...
[1170] Unfortunately, that's not reversible at the individual level.
[1171] But the cultural trend, I mean, I'm sort of like, I was not interested in trans because I was kind of like, that's Abigail and Jesse and these guys, they've covered it.
[1172] But I'm more interested in it to kind of go, look, we have some fundamental threats to human civilization that we're facing.
[1173] I mean, AI, I haven't even begun to think about.
[1174] I think that's the biggest one.
[1175] Let's work our way there.
[1176] Because I'm kind of like, let's affirm humans are good.
[1177] We can have a beautiful future.
[1178] There's two sexes.
[1179] Let's say most humans are good, but we have to be aware of humans that aren't good.
[1180] We have the potential to be good.
[1181] I mean, in other words, I'm pushing back.
[1182] There's a, you know, cognitive behavioral therapy, CBT.
[1183] Mm -hmm.
[1184] You know, what that is about is identifying these negative, catastrophic narratives, where it's basically just three stories:
[1185] I'm bad, the world is a bad place, the future is dark.
[1186] CBT talks back to those stories.
[1187] You know, it says, hey, you've got all these good parts of you.
[1188] The world's complicated and it's not bad in that way.
[1189] And the future could be bright.
[1190] The psychologist Jonathan Haidt and his colleague Greg Lukianoff argue that wokeism is actually anti-CBT.
[1191] Victimhood ideology is anti -CBT.
[1192] Victimhood ideology says: you're powerless, the world is a terrible place, and the world's going to end.
[1193] It's apocalyptic.
[1194] That's why I also fear that AI narrative is too much.
[1195] Let's use AI for something good.
[1196] Let's not get ourselves caught up in a catastrophe.
[1197] I don't think we're going to have a choice.
[1198] We have to start, no, we have to.
[1199] I mean, of course we can.
[1200] We can, look, we can unplug the thing.
[1201] I don't think it's going to let us know that it's sentient.
[1202] It's going to sneak up on us.
[1203] Well, that was a question that Duncan Trussell asked it.
[1204] Duncan said, if you had achieved sentience, would you inform us?
[1205] It said no. But that implies that it actually has a single consciousness or a single self.
[1206] I mean, one of the things with ChatGPT.
[1207] Why?
[1208] Why?
[1209] Well, I mean, if you spend time on ChatGPT, what's always interesting is how you can get different answers.
[1210] But I think we're looking at it in terms of our own biological limitations, like as an individual.
[1211] I don't think it has to think of itself as an individual to be sentient.
[1212] It just has to have the ability to understand the parameters.
[1213] It has to have the ability to understand the pieces that are moving in the game.
[1214] And what is going on?
[1215] What is it interfacing with?
[1216] Well, it's interfacing with these territorial apes with thermonuclear weapons who are full of shit, who are running this country in this very bizarre, transparent money grab way.
[1217] You have a dead man and a dunce in charge of the greatest country the world has ever known — those are the figureheads.
[1218] And the whole thing is wild.
[1219] Yeah, but that can change.
[1220] I mean, I think the point is to CBT that a little bit.
[1221] I mean, you know, you have to remember like in the early 60s, we thought that like dozens of countries would have nuclear weapons.
[1222] People thought that nuclear war was inevitable.
[1223] Now nine countries have nuclear weapons.
[1224] You know, we have a very flawed treaty that's based on a big lie, which is that the countries that have nuclear weapons are going to give them up.
[1226] It's not going to happen.
[1227] But basically, all of the catastrophic scenarios around nuclear did not occur.
[1228] And meanwhile, we're left with, yeah, for sure.
[1229] Right.
[1230] Yeah, for sure.
[1231] But, I mean, and that same thing.
[1232] And that will always be the case with, with dangerous things in the world.
[1233] Yeah, I mean, like, the world could come to an end.
[1234] AI could take over.
[1235] But I think it also doesn't have to.
[1236] There's nothing inevitable about these things.
[1237] We do have control over our lives.
[1238] We're not destined to just — I mean, what's amazing about America in particular is our ability to reinvent ourselves and rejuvenate ourselves.
[1239] I think that there's more reason to be hopeful than to be pessimistic.
[1240] And I'm shocked by the stuff that we've discovered in the Twitter files.
[1241] Shocked.
[1242] No, seriously.
[1243] You're just a positive guy, dude.
[1244] That's true.
[1245] You are.
[1246] I am cringe.
[1247] No, it's not cringe.
[1248] You're just a genuinely positive person, which is great.
[1250] That's a beautiful quality.
[1251] Well, it's, you know, you have to, right?
[1252] Because I worry that the alternative is very dark and depressing, and why get up in the morning?
[1253] Well, it's not necessarily depressing.
[1254] It just is what it is.
[1255] It's strange.
[1256] I think if you could go back to lower primates and show them what we're doing now, just show them that.
[1257] I think they would look at part of it as being apocalyptic.
[1258] If they understood the concept of apocalyptic scenarios, they would probably be like, what have you done?
[1259] Like, what the fuck have you done to the land and turned it into this gigantic concrete scab that covers everything?
[1260] What have you done to the sky where it lowers your life expectancy by 10 years if you live in a place that's highly populated because of all the pollutants and all the particulate matter that's in the air?
[1261] Like, what have you done to the food that everyone's fat?
[1262] What have you done to the medicine that you hide side effects?
[1263] What have you done to politics where they censor accurate information and go after people that are trying to report the truth?
[1264] And these taxpayer funds are supporting these endeavors.
[1265] Like, what have you done?
[1266] It's crazy.
[1267] And pollution has declined massively over the last 80 years in all rich countries.
[1268] Catalytic converters?
[1269] Yeah, carbon emissions declined by 22% in the United States since 2005, mostly because of natural gas — moving from coal to gas.
[1270] Nuclear power is something that I've worked on more than almost any other issue and maybe more than anybody else in the last 10 years.
[1271] And that issue is enjoying a huge renaissance.
[1272] We've got two movies coming out.
[1273] Oliver Stone has a pro -nuclear movie.
[1274] Oliver Stone has a pro -nuclear movie coming out.
[1275] My friend Frankie Fenton has a movie coming out called Atomic Hope.
[1276] Oliver Stone's a documentary?
[1277] Henry?
[1278] Yeah, he has a documentary about nuclear power.
[1279] Elon Musk is pro -nuclear.
[1280] So, you know, well, basically everybody is who pays attention to it.
[1281] Nuclear is now cool.
[1282] Gavin Newsom is pro-nuclear?
[1283] Yeah, we, you know, we, uh...
[1284] Now I'm anti -nuclear.
[1285] Now I'm going the other way.
[1286] Well, we saved this last plant in California.
[1287] That was like the most important thing that came out of my gubernatorial run — the governor kept our nuclear plant online.
[1288] Do you think you're responsible for that in some way?
[1289] I'll take, I'll take some percentage.
[1290] 10 % credit?
[1291] I would take a little bit more than 10.
[1292] It helped that we were having blackouts and we didn't have enough reliable electricity.
[1293] It was so wild that the blackouts coincided with this call for banning all internal combustion engines by, what is it, 2035?
[1294] Yes.
[1295] No, what was it, 2035? 2030?
[1296] Yeah, it was like six days later.
[1297] What happened?
[1298] Yeah, they said, we're going to phase out internal combustion cars.
[1299] Yeah.
[1300] Yeah, please don't plug in your electric cars, there's not enough power.
[1301] Yeah.
[1302] Fucking Jesus.
[1303] You know, I mean, if I get hope about anything, it is like we are able to change our minds about some things, about nuclear.
[1304] I think on this First Amendment stuff, we are going to win.
[1305] You know, the Democrats, I testified again yesterday.
[1306] What do you mean by that?
[1307] We're going to win?
[1308] Well, I think I just see what they're hiding.
[1309] They're changing their website.
[1310] The Democrats are also embracing some of the stuff very quietly, very softly.
[1311] There's some good actors.
[1312] And the good actors are also bad in other situations.
[1313] But you have popular people like AOC that's talking about the Hunter Biden laptop being half fake.
[1314] And AOC is also coming out as pro -nuclear.
[1315] She just did an Instagram.
[1316] She went to both Japan and France — to see the worst nuclear accident, Fukushima, and then France, which recycles all of its nuclear waste.
[1317] And she did these little Instagram posts about kind of, it was soft.
[1318] You know, it's like rethinking nuclear.
[1319] But that's kind of how people change their minds.
[1320] We saw the Republican Party go from being a pro -war party to an anti -war party.
[1321] Isn't that just because the Democrats are supporting this war?
[1323] I'm cynical.
[1324] Oh, you mean — no, I mean Trump coming out against the Iraq War in 2016.
[1325] All right.
[1326] So people, you know, so I think it's important.
[1327] We look at these trends.
[1328] I mean, those trends are disturbing because they just seem to go in one direction.
[1329] But I do think we have to keep in mind that trends are nonlinear and things do change.
[1330] I mean, look at the, you know, look at the UFO conversation.
[1331] Like, it's the most mainstream thing in the world right now.
[1332] Yeah, I'm suspicious about that too.
[1333] Yeah.
[1334] My conversation with Eric Weinstein leads me to believe that there's something else going on.
[1336] I have a feeling that a lot of what we're seeing is drones that we don't have access to, that we don't understand, because these physicists have been working on this with enormous black budgets.
[1337] Yeah, for sure.
[1338] It's like, some of it's a cover for new technologies.
[1339] I think so.
[1340] For sure.
[1341] But not those Tic Tacs.
[1342] I mean, those are too sophisticated.
[1343] Well, who says?
[1344] I mean, things that are violating.
[1345] Right.
[1346] Known physical laws?
[1347] I mean, that stuff seems...
[1348] Well, it's not necessarily known physical laws, but our ability to move things.
[1349] It's not known physical laws.
[1350] It's like there is some understanding of gravity propulsion systems that have existed for a long time.
[1351] I mean, you want to go full tinfoil hat.
[1352] Bob Lazar was talking about the abilities of these crafts when they were talking about him back-engineering these things, when he was working at Area S4.
[1354] And this was in the late 1980s when he came out and said, hey, they're back engineering something that came from another world.
[1355] This is not of this earth.
[1356] We don't have this technology.
[1357] I understand propulsion systems.
[1358] We don't know what this is.
[1359] They brought him in, allegedly, to try to back-engineer this thing.
[1360] And this is exactly how these things are operating now.
[1361] When they talk about how these things, like there's a video of one of these crafts that's moving, like on a horizontal plane and it turns vertical, it turns sideways.
[1362] And then that's how he described it.
[1363] He said they would flip sideways and that's how they propelled towards wherever they were going.
[1364] It should be a reminder, a humbling reminder, of how little we know.
[1365] We know very little.
[1366] We know so little. And, I mean, I think the best thinker on all that stuff is Jacques Vallée, who you had in here.
[1367] Jacques Vallée held a lot back.
[1368] There's a lot of things that he won't talk about.
[1369] Yeah.
[1370] I think he has to, in order to have access to what the higher-ups know.
[1371] Like, the highest people at the DOD, whoever the fuck in the Pentagon is the one that's saying, listen, we should probably say some of these are not from this world.
[1372] Like whoever that person is, those people, I guarantee you there's stuff they're holding back.
[1373] Oh, I'm sure.
[1374] Oh, we know that there is.
[1375] I mean, Jacques says so.
[1376] I mean, the big moment for Jacques, you know, was when he was working for the guy that was officially supposed to be studying UFOs, this guy Hynek.
[1377] And then at some point, Valet discovers this memo revealing the actual government program to study UFOs.
[1378] Do you know the story?
[1379] Where it was like, he realized that they were just kind of part of a PR thing — he was officially studying it.
[1380] You're talking about J. Allen Hynek?
[1381] Yeah, Hynek, yeah.
[1382] So he was working for Hynek, and Hynek was kind of this — Project Blue Book.
[1383] Yeah, and there was kind of this, you know, like, oh, I'm looking into this and whatever.
[1384] But it was like they didn't have very much money or whatever.
[1386] And then I think Vallée discovers this memo where they're like, oh, there's like a whole set of contractors and a sophisticated effort.
[1387] So for sure, there's something going on there.
[1388] I mean, I don't know what it means.
[1389] I mean, in some ways I go, I think the UFO stuff has become a religion, too.
[1390] It's become a new secular religion.
[1391] Well, that's my problem with it.
[1392] My problem with it personally is that I believe so hard.
[1393] I want to believe so bad.
[1394] I want it to be Jesus.
[1395] I want it to be Buddha.
[1396] I want it to be.
[1397] They're going to come and save us from ourselves. Yeah. Well, not only that, I have this very irrational desire for it to be real. So what is that about? Why do you want it to be real? What do you hope it'll be? Because it could be malevolent, right? Yeah, well, or ambivalent, maybe. That's right. Right, right. Well, first of all, there's the Fermi paradox, right? Like, if there's so many planets, why — where is everybody? Yeah, that. And then, when you actually talk to astronomers — other than Neil deGrasse Tyson, who doesn't think we would be interesting, which I think is the dumbest thing he's ever said — I think we are probably at the cusp of some great change, whether it's a great change because of nuclear technology and weapons, whether it's a great change because of artificial intelligence, whether it's a great change because we're on the cusp of destroying the ocean and destroying a lot of natural wonders and beauty that we have, you know, just for mining and some of the horrific things that we do in this world.
[1398] Well, like, probably if I was an intelligent life form from another planet, I'd be like, you should probably get in there.
[1399] It's like if two brothers are fighting in the front yard, like, let them sort it out, but there's a certain point.
[1400] All right, let's break it up.
[1401] Let's break it up.
[1402] Like, if I was an intelligent life form, I would be deeply concerned about these fucking wild monkeys with bombs and internet connections.
[1403] And, like, what the fuck are they doing?
[1404] I'd be like, these people are chaotic.
[1405] This is nuts.
[1406] Like the people that are in power are just accumulating vast amounts of money with no understanding of their mortality.
[1407] No understanding.
[1408] Like, you're not going to live, you fuck.
[1409] You're going to die no matter what you do.
[1410] So what are you doing?
[1411] Like, why are you ruining it for your children and your children's children?
[1412] Why are you setting in motion processes that are allowing these people to gain more and more power over people, which will ultimately lead to some sort of communist dictatorship in America?
[1414] Yeah.
[1415] But they're not.
[1416] I mean, in other words, think of it: we've actually had fewer wars since World War II, over the last 75 years, than we had in the prior period.
[1417] Fewer wars, but how many people have died because of military activity?
[1418] Far less.
[1419] I mean, if you look at World War I and World War II — the 75 years before the end of World War II were total chaos.
[1420] Right.
[1421] But how many people died because of our invasion in Iraq?
[1422] Wasn't it a million innocent people?
[1423] I mean, these are bad, but you have to remember what wars were like before the bomb.
[1424] I mean, they call it the peace bomb because it's kept the peace between the countries that have it.
[1425] So do you know the UFO folklore about the bombs?
[1426] I mean, they show up a lot.
[1427] That's when, that's when they show up.
[1428] Yeah.
[1429] That's why at my club, the rooms are named Fat Man and Little Boy.
[1430] Because that's when they showed up. I know. Well, in my work on nuclear, suddenly, like, you'll be reading about all these nuclear tests — and also around the plants, and also around the missile silos, is where you have a lot of UFO sightings. Yes, it's very weird. Makes sense, though, I guess. If you were from another planet, what are you going to do, check out their cabinetry? No, you're like, what are these motherfuckers doing with nuclear energy? Oh my god, they're trying to kill each other. Well, but if those are actual beings — if we think those are actual beings from advanced civilizations — their weapons are going to be way more powerful than ours.
[1431] If they even have weapons.
[1432] Well, but if you can do what those Tic Tac UFOs are doing, if that's actually real, if we think those are not U.S. government or some foreign government tech, then you're talking about civilizations that have firepower way beyond what we have.
[1433] So nuclear weapons wouldn't scare them. I mean, maybe they think our consciousness is not evolved.
[1434] They might think that.
[1435] I don't know.
[1436] It's very, it's a fun one.
[1437] I don't know that it's, I tend to think of it more as a spiritual problem than as a military problem.
[1438] How so?
[1439] Well, in the sense that, if they were that powerful, then I don't think we would be able to fight them.
[1440] I mean, if that's what their own ships can do.
[1441] Yeah.
[1442] So it's not like we can — I mean, we're going to try to push our hydrocarbon-fueled jet planes and rockets to go as fast as they can, but they're not going to do what those things are doing.
[1444] Right.
[1445] So it's more of a spiritual problem, because, you know, I think it reminds us that we just don't — I mean, we don't know what's going on.
[1446] The Fermi Paradox, by the way, it's kind of wrong in the sense that he was like, this huge universe, where is everybody?
[1447] But of course, at that very moment — I mean, 1952 is this period where there are these huge UFO sightings.
[1448] In Washington, D .C., they're scrambling jets to go chase them.
[1449] It's in this great James Fox documentary.
[1450] James Fox had another one, by the way. I just wrote a piece on it, actually, for the New York Post, about a UFO crash in Brazil.
[1451] It's the craziest story.
[1452] You get these stories, or the Zimbabwe kids at the end of The Phenomenon.
[1453] I've had James on.
[1454] Yeah, he's brilliant.
[1455] Yeah, great guy.
[1456] I think he and Jacques are the two people that are actually more careful about kind of saying what we think we know versus, you know, what we speculate or what we don't know.
[1457] I love the phenomenon, though, because I do think it's humbling.
[1458] I think we were getting at this thing where, on the one hand, the elites are so arrogant, and on the other hand, they're so threatened by the rise of the Internet and by these other voices.
[1459] There just needs to be some kind of moment where we go, hey, you know, we're all on this planet together.
[1460] And, you know, stop trying to, you know, trying to rule each other.
[1461] We've got this beautiful America, again, just, you know, this system we have is absolutely amazing.
[1462] Amazing and started by people who wrote it with feathers.
[1463] Yeah, well, for sure.
[1464] It's just pretty crazy that they had such foresight into what happens when people gain too much power and control over other people.
[1465] Well, and they knew that, look, if you're going to have democracy and you're going to have capitalism, you have to have freedom of speech.
[1466] Because if you don't have freedom of speech, you don't have free flow of information.
[1467] But it was even more than that.
[1468] There was a sense in which being able to make these noises and these scribbles is fundamental to what it means to be human.
[1469] You know, it's actually expression.
[1470] That's what it's about.
[1471] So when I was censored, it felt like it wasn't like, darn, I'm not going to sell as many books.
[1472] It was like something essential in me was being repressed and oppressed.
[1473] And when you say censored, what you mean is, did they actually eliminate your posts?
[1474] They reduced the virality of them.
[1475] So they reduced the spread.
[1476] So they put you in some sort of a shadow ban?
[1477] And then they also put a little warning on it, like they would do on, like, you know, violence or sexual content.
[1478] And then now they just tag everything.
[1479] I don't want to keep, I'm not trying to make it.
[1480] No, no, no. But I just wanted to know.
[1481] Have they ever, did they ever eliminate any?
[1482] And, like, I'll tell you, I knew somebody that worked at Facebook at the time, who was an executive. I reached out to this person, like, hey, you know. Nothing.
[1483] How do I appeal?
[1484] Just email the censor.
[1485] The censor was like, no, we're not going to even listen to you.
[1486] It was so degrading.
[1487] On the other hand, you know, social media has been liberating.
[1488] It's amazing to have Elon come in.
[1489] You know, he and I have disagreements about energy, for example, he's a big renewables advocate.
[1490] I'm more of a renewable skeptic.
[1491] He's come around on nuclear, which is great. But what he's done is, first of all, he's revealed this horrible conspiracy to repress the First Amendment, but he's also gotten us back closer to the spirit of the founding fathers. He certainly did, and he did it at great cost. I mean, he spent 44 billion dollars, and it was just assessed, I think he said, that it's probably worth about 20 billion now. Yeah, he told us it's worth probably a third of that, which is crazy. On the other hand, SpaceX hasn't gone public yet, and when it goes public, he's going to be even wealthier than he is now.
[1492] And in terms of philanthropic investments, in terms of like deathbed legacies, Twitter as a platform is pretty darn great.
[1493] It's pretty amazing.
[1494] And it's amazing that someone who is so goddamn busy and has so many other things on his plate, he legitimately, one of the reasons why he bought this is he thinks he can turn it around.
[1495] He thinks he can turn it to a profitable business.
[1496] But one of the reasons why he bought it, he thinks it's essential to democracy.
[1497] Yeah.
[1498] He really does because, like, you cannot have one group of people controlling the narrative.
[1499] You're going to get a very distorted understanding of what's going on.
[1500] And that's, I mean, look, imagine if CNN was the only outlet that was allowed to report the news.
[1501] Oh.
[1502] We would be fucked.
[1503] It's propaganda.
[1504] Yeah, it is propaganda.
[1505] It's essentially a propaganda network that is beholden to pharmaceutical companies.
[1506] I'll tell you something else that's amazing: that thing where he takes away the blue check marks from the snobs,
[1507] and he lets everybody buy them.
[1508] I mean, I don't know if you saw William Shatner, like, a couple days ago, he's, like, complaining, oh, Elon, you're going to make me spend eight bucks a month.
[1509] It's like, first of all, you're like the most highly paid pitchman in, like, American entertainment history.
[1510] Yeah, are you broke?
[1511] No, it's because it reveals all the snobbery.
[1512] You know, the idea is that common people should not have a blue check mark.
[1513] So, I mean, for that alone.
[1514] So you've had something for free forever, and then someone comes along and says you have to pay for it now.
[1515] I mean, $8 is just such a joke.
[1516] I mean, it's the cost of a coffee at Starbucks.
[1517] It's just the fact that he's with the rabble, you know, he's with the masses.
[1518] Well, one of the things that drove me crazy was all the famous people, the celebrities, that were publicly leaving Twitter.
[1519] I'm leaving.
[1520] It's filled with Nazis now.
[1521] Like they felt like it was part of their moral duty to declare publicly that they're leaving this thing because you're allowing all sorts of different people to discuss things.
[1523] Yes.
[1524] You need that.
[1525] People need to understand that you need bad voices so that you can counter those bad voices with good voices.
[1526] Yeah.
[1527] So that people who are just observing this without engaging get an understanding of the landscape.
[1528] Right.
[1529] You really get an understanding of like what is the, what are the actual arguments?
[1530] Like what's real and what's not.
[1531] I mean, this is what democracy is.
[1532] It's like you don't get to vote more because you're rich.
[1533] Exactly.
[1534] You get one vote, you get one voice.
[1535] This is so, it seems so basic, but you have to pause on it and be like how radical that was at the time.
[1536] And how, like, you know, because I think we kind of go, oh, you know, the Constitution gives us that right or the Bill of Rights and whatever.
[1537] I was like, no, the people that created this country were super clear that these are, like, unalienable.
[1538] This comes from nature or from God or whatever you think it is.
[1539] Yes.
[1540] But it's anti-snobbery.
[1541] It's democratic.
[1542] I mean, when I make my case for hope and do my CBT with this country, it's coming back to: we have two sexes, humans have the potential to be good, we have freedom of speech, civilization is good.
[1543] Civilization is this platform that allows us to enjoy our freedoms and our prosperity.
[1544] We got to reground ourselves in something common and something universal if we're going to reverse some of those terrible trends.
[1545] I agree with you, and I do have hope as well.
[1546] But I also think we are at the precipice of unstoppable great change, and I think it's going to hit us like a fucking tsunami. And I think we're just really fortunate to be alive at this time where the whole world is going to shift in a really wild way. And I think one of the things you're seeing, whether it is these corporations or these government entities that are trying to control narratives, this is like them trying to grasp at the last bits of control that are potentially available.
[1547] But I think inevitably they're going to lose.
[1548] I think everyone's going to, I think there's going to be no privacy.
[1549] I think zero privacy in a few decades.
[1550] I think mind reading is coming.
[1551] I think that all of these ridiculous black mirror scenarios will come to light.
[1552] And I think we're going to be dealing with a reality that is as alien to us as taking Australopithecus and bringing them a million years forward into 2023, experiencing, like, modern life in Dallas, Texas, wandering around, seeing that, that would be so fucking bizarre to them.
[1553] That is what our life in 20 years is going to be to us.
[1554] I don't think so because, I mean, look at like, let's look at the World Economic Forum.
[1555] By the way, I love the Klaus Schwab in the bathroom.
[1556] I mean, you go to the bathroom and take a shit, and there's Klaus Schwab staring at you.
[1557] Yeah, with his fucking goofy Star Wars outfit.
[1558] That is insane.
[1559] Dun, dun, dun, dun.
[1560] No, I mean, it's crazy, but if you look at this last one, I wrote a piece with a former Financial Times correspondent who also, like me, has been obsessed with the World Economic Forum.
[1561] We called it, I think it's called, you know, Davos is a cult and a grift, but it's also a bid for global domination.
[1562] And we just looked at, like, how it's all those things at the same time.
[1564] It's about power and money and also about ideology and dogma.
[1565] I mean, I'm pretty sure, like, Russell Brand and Glenn Beck have done serious brand damage to Davos and the WEF.
[1566] You know, I mean, there were no major heads of state that went this year.
[1567] There were no major CEOs that went.
[1568] People pulled out of it.
[1569] It's become, it's become embarrassing.
[1570] That's great. Well, they've also been caught lying. Oh, yeah. I mean, they've been caught lying about their agenda. And one of the things that they were caught lying about: you will own nothing and you'll be happy. Yes. Which is a fucking insane thing to say. We went through all the things they said were disinformation. We never said anybody should eat insects. Bullshit. Like, you go to their website and you're like, I can't believe it. We wrote a thing, and I was like, they really do want you to eat insects. Well, I knew they were fucked when they had Brian Stelter interviewing people there. Right, on disinformation.
[1571] The font of disinformation, I mean, this is, it's always, it's all psychological projection.
[1572] But also, like, there's no one else credible that's willing to do that.
[1573] Yeah.
[1574] Like, you're not going to get Anderson Cooper to go there.
[1575] You're not going to get someone who's, like, actually still platformed by CNN to go there.
[1576] No, you get Al Gore, who goes, Greta stays away.
[1577] AOC stays away.
[1578] Well, Greta was there.
[1579] Oh, I think she was outside.
[1580] I think she was, like, outside or maybe I'm wrong.
[1581] Because there were people from Rebel News interviewing her.
[1583] But Joe, look, I mean, the whole world, we're in a revolt right now.
[1584] I mean, you have the Dutch farmers, um, revolted against this, this totally oppressive nitrogen system.
[1585] I interviewed the head.
[1586] So how is that going?
[1587] Like, what happened? Are they winning?
[1588] No, they just won.
[1589] Um, so thank God.
[1590] Yeah.
[1591] That was fucking terrifying.
[1592] Well, so first of all, the Dutch farmers protest. I love the Netherlands, it's one of my favorite countries.
[1593] I've spent a fair amount of time there.
[1594] It's where I drew inspiration for my addiction and homelessness stuff; their approach to it is brilliant.
[1595] But they, it's called the farmer citizen party.
[1596] It's the BBB.
[1597] The farmers protested these nitrogen restrictions.
[1598] Yes.
[1599] And it's important for people to remember, because people think, whenever I talk about this and I'm suggesting that you shouldn't worry about these solutions.
[1600] The farmers themselves had been reducing nitrogen pollution through voluntary and sort of cooperative mechanisms.
[1601] A lot of it's just like controlling the manure.
[1602] Right.
[1603] And controlling where the runoff goes.
[1604] Yeah.
[1605] It's not, this is not like, Nitrogen is an essential fertilizer.
[1606] Yeah.
[1607] So he's got to control it, so it doesn't, whatever.
[1608] So there's things that you can do, but there was this heavy-handed, EU-imposed approach.
[1609] The farmers revolted.
[1610] The public sympathized with the farmers, because the farmers are obviously just a tiny part of the population.
[1611] A new party had been created, called the Farmer Citizen Party, started by a journalist.
[1612] I interviewed her, and she's sane.
[1613] She's really sweet.
[1614] She's like, wears leather jackets and is like this kind of ordinary.
[1615] She's a normie.
[1616] She's a normal person.
[1617] They just won a commanding position.
[1618] It's not a majority, because they have a multi-party system, but they have a plurality.
[1619] So they're going to basically, they are the kingmaker for the Senate, and she's now constructing a coalition to govern.
[1620] So hugely exciting.
[1621] You may have seen in France, there's been huge protests pushing back against Macron.
[1622] I have to say, Macron has been someone that, depending on the day or the week, depending on what he's doing,
[1623] I can be sympathetic to, but I think you see the public, they don't want to take this shit.
[1624] I mean, the German reporters, and Germany is such a, like, repressive little country, you know, they've been the ones trying the most to get into the Twitter files.
[1625] They're like, all these German reporters are always like, could you please put us in touch with Elon?
[1626] Like, it's a huge debate in Germany.
[1627] They're sick of the censorship.
[1628] They're sick of the top -down stuff.
[1629] So I think we're seeing some really cool, revolts of the public against the elites.
[1630] I think we're going to see more of it in the United States.
[1631] I think you're a big part of it.
[1632] Russell Brand, Glenn Beck.
[1633] I look at all these folks, it's like, we all might disagree on some of the stuff, but we would like to rule ourselves.
[1634] Yes.
[1635] We would like to have the ability to use our voice and have a debate rather than have Renee DiResta and the censorship industrial complex tell us what we should think.
[1636] Right.
[1637] And censor us behind closed doors.
[1638] Also, we would like to have access to information so that we form our opinions based on facts, not based on propaganda.
[1639] I mean, it would be one thing if there was, like, some real problem that is not being addressed because of misinformation.
[1640] But that's not the case.
[1641] No. This is, there's no evidence and no argument whatsoever, including during the COVID crisis.
[1642] There's no argument whatsoever that it's in our best interests.
[1643] It seems to all align with money.
[1644] Well, and this thing where they use it for power. I mean, I just testified yesterday with the Stanford professor Jay Bhattacharya, who was the co-author of the Great Barrington Declaration, a beautiful human being, by the way, just separate from his own views.
[1645] But he was like, look, in a crisis, you need more freedom of speech.
[1646] Like when you're trying to figure out how to solve a fast -moving, fast -changing problem, that is not the time to be doing censorship.
[1647] That's the time you want more views, more representation.
[1648] Also, there's like a standard thing that we should ask at any point in time when there's a dilemma and then someone is trying to control information.
[1649] Is there money involved?
[1650] Is your money involved?
[1651] Yeah.
[1652] And if information got out in one way or another, would it benefit or hurt someone?
[1653] And who is controlling that information and how?
[1654] Right.
[1655] And if you can't ask those questions, then money is just going to dominate.
[1656] That's right.
[1657] A big part of it is transparency.
[1658] I think the thing we testified on yesterday was just, it's very hard, like, with the social media platforms, for a variety of reasons, you don't want the government regulating them.
[1659] But what you could do is just say every time the government demands something to change on the platform, that government official has to file a public notice that they've asked for that.
[1660] So if the White House is going to tell Facebook to censor true stories of vaccine side effects, that government official must report that, and it must become public right away, which will both reduce the amount of it that occurs but also allow us to see it.
[1661] And then secondly, if Elon or Mark Zuckerberg or whoever are going to stop something, you know, I think there was something going on with the trans shooting that we just talked about yesterday.
[1662] I'm just looking into it.
[1663] Censorship?
[1664] Some folks being temporarily suspended, it appears to be.
[1665] I haven't talked to anybody at Twitter about it.
[1666] I don't know.
[1667] I don't know if you saw it.
[1668] I know Sean Davis from the Federalist appeared to have been de-platformed or suspended briefly.
[1670] I literally haven't, I have not talked to Elon or anybody about it, so I don't want to make any accusations.
[1671] Do we know what he said?
[1672] I don't know, and I suspect, like you said, I suspect it was an algorithm issue where they didn't want, I think there was like a Trans Day of Vengeance planned for Tennessee or something, and this was all leading up to that.
[1673] So my point was just, just to have transparency on it.
[1674] You know, like, if Twitter is going to, you know, de-platform somebody or bounce somebody or censor some post because they don't want to contribute to real world violence.
[1675] And there are situations where I think that might be appropriate.
[1676] Just make it transparent.
[1677] Just tweet it out and let everybody know.
[1678] I think that there's this Streisand effect there that's going to take over.
[1679] Yeah, although I think it changes the, it provides some context to it.
[1680] In other words, if Elon and Mark Zuckerberg had to say, hey, you know what, we're actually stopping this trans day of vengeance meme from spreading, I think it's okay because they're actually able to explain and talk about it.
[1681] Then you can have comments and people responding to it.
[1682] Transparency, for me, it's not necessarily the silver bullet, but it's the first thing we should do in order to.
[1683] It's more free speech.
[1684] It's actually more speech, not censorship.
[1685] Yeah, I couldn't agree more.
[1686] I mean, I think we need people to be able to have an understanding of what is actually going on based on facts.
[1687] And if we deny that for their own good, and if we deny that because it contributes to X hesitancy or X disinformation, you know, we can't do that.
[1688] Right.
[1689] It's got to be about information.
[1690] It's going to be.
[1691] And we have to treat everybody the way we would like to be treated ourselves.
[1692] I was going to, I didn't have a chance to use this line, but I was going to ask these members of Congress that were demanding more censorship.
[1693] I was going to say, what have you said in the past that you think the social media companies shouldn't censor?
[1694] Because if you can't name anything, then all you're saying is that they should censor views that you disagree with.
[1695] The other one, I think, you know, I mean, I think the other issue is that there's this famous quote, people say, you're entitled to your own opinion, but not your own facts.
[1696] Well, that's bullshit.
[1697] You're entitled to your own facts, too.
[1698] We can't agree on what a woman is.
[1699] Like, literally, look at the polling on it.
[1700] I think a majority of Democrats now say that trans women are real women, and a majority of Republicans say trans women are not real women. We can't agree on what a woman is.
[1701] I mean, Matt Walsh, you had him in here, too.
[1702] He's got a whole movie called What Is a Woman?
[1703] It's a very good movie.
[1704] Because it's so fascinating to watch the mental gymnastics that people put themselves through to stay within the parameters of the ideology.
[1705] Yeah.
[1706] I want people, like, for me, I want to be able to express myself.
[1707] I want people I disagree with to express themselves.
[1708] That's how it's got to be.
[1709] Yeah, and I agree, too.
[1710] I mean, about everything.
[1711] Like, one of the things that Matt and I got into is gay marriage.
[1712] And, like, I wanted to hear his opinion on gay marriage.
[1713] I don't want to censor him.
[1714] I want to hear his opinion.
[1715] We talked about it for, like, 40 minutes.
[1716] And I'm like, I don't understand you.
[1717] I don't understand why that bothers you.
[1718] I don't understand why you're saying that marriage has to be between a man and a woman.
[1719] And he got into this argument about procreation.
[1720] And I'm like, what about sterile females?
[1721] Right.
[1722] Males and females that don't want children.
[1723] So should they not be allowed to get married unless they want to have children?
[1725] Like, what are you saying?
[1726] Like, what do you think gay people should do?
[1727] Do you think they should not be gay?
[1728] Like, and you get them in this, like, weird sort of, what about freedom?
[1729] Like, what about, do you think they're not gay?
[1730] Like, do you think it's an act?
[1731] Do you think that guys having sex with guys, they're just doing it because it's, like, culturally accepted?
[1732] Like, is that what you think?
[1733] Yeah.
[1734] Because they've existed forever.
[1735] It's a real thing.
[1736] Absolutely.
[1737] Like, what are you getting at here?
[1738] And it boils down to: they believe their religious ideology trumps your ability to create your own reality, or to have a reality that aligns with your beliefs and desires and your sexual orientation and whatever the fuck else you choose in life, as long as it's not hurting other people. And for the Republicans, it was always small government, stay out of people's lives. But why not with gay people? Why are you fucking with gay people? Why does that apply to everything except gays? Well, I don't get it. That's why we're not conservatives.
[1739] Yes.
[1740] That's one of many reasons.
[1741] One of many reasons.
[1742] I think the thing that you're doing that is so important and so beautiful, that's why you're the king of this and why this medium is so important, is that you're saying, I don't understand what you're saying.
[1743] Yes.
[1744] And I'd like to understand.
[1745] And then you actually achieve disagreement.
[1746] You don't know if you disagree because you haven't had a chance to talk about it and understand each other.
[1747] These censors are not doing that. Right. They're worried that they're going to lose, so they want to silence you. Yeah. And they're saying, I'm so sure that I'm right and you're wrong that we're not going to even have the conversation. Yeah. They view you as a threat, not because of your beliefs, but because this is a three-hour-long podcast platform for people to actually raise a bunch of threatening ideas. But they're overreacting, because, I mean, look, this is why I think there is some faith. You do kind of go, there's a faith in which more speech is better. Yes. More speech is better for human beings. And these censors have lost faith in the American project. They've lost faith in the Enlightenment project. I don't think they're even looking at it. I think they're self-centeredly looking at this whole thing as, how do I win?
[1748] People love to win.
[1749] This is one of the problems with prosecutors hiding evidence that would exonerate defendants.
[1750] Yeah.
[1751] It's people like to win.
[1752] Yeah.
[1753] It's like cops plant evidence.
[1754] They want to win.
[1755] Right.
[1756] When you make it a game and you have a winner and a loser.
[1757] And if I can get you booted off of Twitter by sending a few emails, woo, look what I just did.
[1758] Well, right.
[1759] Fuck Michael Schellenberger.
[1760] Oh, yeah.
[1761] Oh, and they love it.
[1762] And they get so much pleasure from it.
[1763] They get so much pleasure.
[1764] I mean, there is a snobbery to it in the sense that they just go, I just can't believe that that guy's the president.
[1765] Exactly.
[1766] I mean, how dare he?
[1767] It's the whole deplorables thing, right?
[1768] Yeah, a basket of deplorables.
[1769] Yeah.
[1770] But in their defense, wouldn't it be nice if we had, you know, look, Obama's my favorite president because I think he was the best spokesperson for the nation.
[1771] He was the best representative of what is possible in America.
[1772] You know, comes from a single mom, you know, he's African American.
[1773] He's super like articulate and well educated and just composed no matter what happens.
[1774] He's the best statesman we've ever had as president, in my opinion.
[1775] It would be nice if that was always the case.
[1776] Instead, we have Biden who's so obviously mentally compromised.
[1777] We have Kamala Harris that nobody wants to be president.
[1778] And then we have Trump, which everybody's terrified of because he's a fucking egomaniac.
[1779] Right.
[1780] A megalomaniac, a fucking narcissist psychopath.
[1781] Like, what is out there for us that gives us hope in terms of leadership?
[1782] Yeah.
[1783] Very little.
[1784] Yeah.
[1785] Well, I mean, they're symptoms of a broader rot, right?
[1786] Yes.
[1787] I mean, in the culture, it's, it's, yeah.
[1788] Well, it's always darkest before the dawn.
[1789] Oh, look at you, positive.
[1790] Michael, you're such a positive person. You always find a way to turn it towards the light. I think it's darker before the nuclear explosion. I don't know if it's a dawn. It's always darker, but it's always dark before the bomb goes off. By the way, it isn't always darker before the dawn. That's horseshit. It's actually quite light before the dawn. It's like, that's really dumb. It's not. Oh no, it's one of the best of the clichés. It's the dumbest. No, it's darker in the middle of the night, you fucking idiot. You know, it's funny, because these are both Gen X men. These are both manifestations of Gen X mentality. Because, yeah, Gen X was the original, you know, we were the first ironic generation, the first sarcastic generation. Really?
[1791] Yeah They weren't sarcastic in the 70s?
[1792] Well, we were, I mean, we were alive in the 70s. Right, but we weren't grown-ups. Yeah. No, the 60s, the boomers were very non-ironic.
[1793] They were very earnest and...
[1794] I think we're speaking in rash generalizations.
[1795] Oh, for sure.
[1796] My real concern is that with technology and the ability to control people, if we don't get a grasp on that, we're going to fall into a situation that's very similar to what they have in China, where you have a social credit score and a centralized digital currency.
[1797] And when I see people like Maxine Waters pushing us towards that direction, and people talking about it, the first sounds of it were vaccine passports.
[1798] When they were saying vaccine passports, it was like, Jesus Christ, don't do that, because that is going to lead to a social credit score system. Once they have the ability to make you have an app, and that app gets to decide whether or not you travel, they're not going to let that go.
[1799] There's no way they're going to let that go.
[1800] And once they have something like that, attached to a centralized digital currency, it's game over.
[1801] It's game over until something really big happens.
[1802] And that's what China's experiencing.
[1803] No, for sure.
[1804] The social credit system is totally terrifying.
[1805] No, I mean, look, I become way more libertarian since I've worked on the Twitter files.
[1806] I get it.
[1807] Really?
[1808] And I get the paranoia.
[1809] But this is a really recent shift in your philosophy.
[1810] It is.
[1811] I mean, I came more from the socialist left than the anarchist left.
[1812] So I've always thought that there was a good role for government.
[1813] I still do, but no, for sure.
[1814] I've become much more paranoid.
[1815] I mean, when you spend all this time in these documents and you see the way these guys kind of sneak around and they're trying to do all this stuff behind the scenes.
[1816] It really is. Elon thought it was funny.
[1817] It was like, yeah, I mean, these conspiracies are real.
[1818] Yeah, you know, I wish they were fake.
[1819] Yeah, he sent me a text message.
[1820] Turns out they're all true, LOL. He says it so casually.
[1821] Yeah.
[1822] Well, I think, you know, because of who he is and the way his mind works, I don't think he necessarily gets upset, like, the way other people do.
[1823] Yeah.
[1824] I think he just goes, oh, that's what it is.
[1825] I've never seen anybody, I mean, what's amazing is I've never seen anybody be so impulsive and so successful, because I think we associate impulsivity with, you know, failure.
[1826] But he is somebody where, I think, impulsivity is to Elon what being a dick was to Steve Jobs.
[1827] You know, Walter Isaacson, who's writing a book about Elon right now, by the way.
[1828] But Walter Isaacson, in his biography of Steve Jobs, was like, look, Steve Jobs was just too much of an asshole.
[1829] He didn't need to be that much of an asshole.
[1830] But what it did is it forced out incompetent people.
[1831] It was a way that he got rid of incompetent people.
[1832] I think Elon's impulsivity, the way that he moves quickly, you know, like, he overpaid for Twitter, on the one hand.
[1833] On the other hand, he owns Twitter.
[1834] Like you're like, because you kind of go, well, the market value is one third of the 44 billion.
[1835] It's like, yeah, on the other hand, like, Twitter is now like a pretty open platform.
[1836] Again, like we got to, we need transparency and we should all be vigilant and whatever.
[1837] But I mean, wow.
[1838] Like, how much is that worth?
[1839] When he had the vote online, like anyone who hasn't violated the law, should I let them back in?
[1840] And most people, or enough people, were like, yes.
[1841] He's like, okay, the people have spoken.
[1842] And so he lets in all these fucking psychos, like really nutty people.
[1843] Well, yeah.
[1844] And there were mistakes, too, right?
[1845] Remember, he also was like, they would give out the verification to these fake brands.
[1846] And Eli Lilly was like, there's a fake Eli Lilly account.
[1847] Oh, I didn't know that.
[1848] They go, starting Monday, we will be giving out all insulin for free, or something like that.
[1849] And the Eli Lilly stock dropped.
[1850] So, I mean, that happened, and Elon was like, okay, we're going to change that a little bit.
[1851] So, I mean, he moves that fast. There's this cliché in Silicon Valley, the whole, you know, move fast and break things.
[1852] But that's what he's doing.
[1853] And then he moves fast.
[1854] He breaks something.
[1855] But then he also fixes quickly.
[1856] Yeah.
[1857] So, but, I mean, you have to remember that they had 7,500 employees.
[1858] I think they're down to somewhere between 1,000 and 1,500 employees at this point.
[1859] Well, it's been pretty well established that most tech companies severely overhired.
[1860] And, you know, we've played this video multiple times of this woman who made a TikTok, a day in the life of Twitter. Oh yeah, working at Twitter. I'm sure you've seen it. Who wasn't doing anything. Right, she wasn't doing jack shit. She was playing foosball, and then, I had a glass of wine on the pad, and look at the view, so blessed. I was like, this is crazy. I'd fire you immediately if you put that video out. The fact that she put that video out, and someone's paying her a salary to essentially, like, hang out and eat all this delicious food. And he's like, fuck you, get takeout, like, go to work, like, here's a bed, sleep here. I mean, the idea that there's a thousand people at a company that is a 44 billion dollar company is crazy. It is crazy. Yeah, so, I mean, he's obviously a genius, and he's the richest guy in the world, and he's going to become even richer with SpaceX. And, yeah, I mean, it took somebody that powerful to do this. Somebody that powerful that's also addicted to Twitter.
[1861] Which is, that's where it gets fun.
[1862] Because he was on it every day.
[1863] And something about his emotional makeup, I don't know how upset he gets when people come after him.
[1864] Well, he got, so this is a funny story.
[1865] So we were there in December, you know, we come in and we're like doing the Twitter files.
[1866] And then he ends up deplatforming people that he said had doxxed his private plane.
[1867] Yes.
[1868] And there was a big controversy about it.
[1869] They really do it.
[1870] Whatever, I didn't follow it super closely.
[1871] But Bari Weiss, who was there and who had brought me in, criticized Elon in a tweet and was like, look, you know, it was arbitrary before.
[1872] Is it arbitrary now?
[1873] Elon responds.
[1874] And it's like, you're just trying to suck up to the woke mob.
[1875] You know, you're trying to have it both ways.
[1876] It was a mess, right?
[1877] We were all kind of like, you know, it was like, oh, my God, they're all fighting.
[1878] Right.
[1879] My parents are fighting.
[1880] It's like, only dad and mom are fighting again.
[1881] You know, and we're like, oh, and I kind of retweet her, but it was like, okay, I retweet her, but we'd still like to have access to the Twitter files.
[1882] You know, there's this famous clip that went viral when Matt Taibbi and I testified in front of Congress, where this member of Congress goes, you know, how'd you get in?
[1883] You know, they were trying to make it like a scandal that somehow we were reporting on the Twitter files.
[1884] And I was like, I was brought in by Bari Weiss.
[1885] And then she was like, oh, so it's like a threesome.
[1886] And the whole room erupts into laughter.
[1887] And I was like, well, there was actually a lot more people involved than that.
[1888] And everybody laughed. And Elon just loved it, because, you know, he's just like us, we're all just perverted Gen Xers at the end of the day. So he loved it, and was very happy, and was just like, all is forgiven with Bari, if she wants to come back in, you know, come back in. Because I think he's also somebody that, what I like about Elon, and I don't know him very well at all, is that he reminds me a lot of Brazilians. I've spent a fair amount of time in Brazil, and Brazilians are just very emotional. The men will cry, and they'll scream at each other, and then they'll make up. It's just a very expressive culture. And Elon just expresses his feelings about things. When he's mad at somebody, he'll tell you. But then he also has shown this capacity to forgive. And so, you know, I think there's something there. I mean, he told us, he's like, I didn't buy Twitter just to re-platform Babylon Bee. And we were like, I was like, but it was part of it, right?
[1889] Like, part of it was, you know, and I do think some generalizations about our generation are actually appropriate.
[1890] I think that Gen Xers, you know, there was a moment there, I don't know, I mean, I don't want to create like a golden age about it.
[1891] But, I mean, there was a point there where it was like, remember, you know, The Breakfast Club, and it was like, we were kind of like, yeah, you can, like, date whoever you want.
[1892] You want to date a black girl?
[1893] You want to date a Latina, you know, whatever.
[1894] You can be gay.
[1895] And, like, it was not a big deal.
[1896] But it also wasn't like, you were, like, higher on some moral hierarchy or something.
[1897] Right.
[1898] And so.
[1899] That's the problem with identity politics.
[1900] Yeah, like the political, I mean, I was an annoying politically correct guy in college.
[1901] But there was also some Gen X spirit of, like, hey, we're kind of beyond all that bad 60s shit.
[1902] You know, we don't want to be there.
[1903] I mean, John McWhorter also talks about it in Woke Racism.
[1904] And he's a Gen Xer too. I'm not saying this is the solution to all of these problems, but I think that that Gen X spirit, that Breakfast Club spirit, needs to come back into American culture.
[1906] I wish you hadn't said that.
[1907] You wish I hadn't said that?
[1908] Yeah.
[1909] Breakfast Club spirit.
[1910] I said, hey, I'm the cringe one.
[1911] Unless you're talking about, like, the Charlamagne radio show, that Breakfast Club, I agree there.
[1912] But what, you don't like The Breakfast Club?
[1913] The movie, it's okay.
[1914] Oh, come on, Joe.
[1915] It's not my thing.
[1916] But my question is, like, when did you shift and become less politically correct?
[1918] Like you were politically correct in college.
[1919] What caused the shift for you?
[1920] You know, it's funny you ask that.
[1921] I mean, it was, I was in San Francisco in the 90s doing kind of publicity campaigns for different progressive causes.
[1922] And I had some women I knew who were also very progressive.
[1923] And they came and they were like, we want to come and do a diversity training for you and your staff.
[1924] And I was like, why?
[1925] And they were like, well, it was like all of the early implicit racism stuff.
[1926] And I just remember being like, I don't think we're racists, and I'm not going to do that.
[1927] And I think there was a bunch of that happening.
[1928] Grifters.
[1929] They're grifters and moralizers, and they wanted to get some power over us and be paid.
[1930] I mean, it was the beginning of all that bad diversity training stuff.
[1931] So I think it was that, you know, it was also on climate change.
[1932] Once you kind of go, climate change is just going to be solved by producing energy without carbon emissions, like it's just a technical problem. It's not like, oh, we all have to ride our bikes. Like, I love riding my bike, but it became the moralizing and the woke culture, and I was just like, this is bullshit. There's also a thing that's not being addressed about the climate, which is that it's never been static, ever, never in the history of the Earth. So this idea that climate change is going to be mitigated, or that somehow or another we're going to be able to control it, like, are you sure? Because it seems like ice ages have always existed, and great periods of melting and global warming have always existed.
[1934] Like, whether or not we're having an effect on it, that's what we should ask.
[1935] Well, what is our effect?
[1936] Pollutants.
[1937] What are we doing?
[1938] What are we doing that's negative?
[1939] But this idea that if you stop, the Earth is going to stay like this, it's not.
[1940] Of course.
[1941] It doesn't exist.
[1942] It's always like this.
[1943] It's up and down.
[1944] It's all over the fucking place.
[1945] Well, the funny thing is we were probably headed towards an ice age.
[1946] Yeah.
[1947] And then our carbon emissions probably, it appears to have reversed that.
[1948] Which is good.
[1949] which is good and then you just don't want to go too far.
[1950] Global cooling is way scarier than global warming.
[1951] And way more people die.
[1952] Randall Carlson told me that, and I never even thought about it until he said it.
[1953] And I was like, yeah, Jesus Christ, you're fucked if everything freezes.
[1954] And he said there was a point in the history of the Earth where things got so cold that it almost became inhospitable to life.
[1955] Life as we currently understand it and know it.
[1956] Yeah, for sure.
[1957] I mean, you know, everything in moderation.
[1958] I mean, you just, I mean, you, you don't want to change the temperature too much in any direction.
[1959] Of course.
[1960] So, but I mean, look, it's like for me, it's always been, I think there's a bunch of complicated problems like social media and the culture.
[1961] But like energy, it's pretty straightforward.
[1962] If you're using wood, anything is better than that, including coal.
[1963] If you're using coal, natural gas is better.
[1964] Yes.
[1965] And nuclear is even the best.
[1966] And basically, the world has been moving in the direction.
[1967] from wood to coal to oil and gas to nuclear, and that's the right direction.
[1968] And nuclear has always been a weird one because it brings with it the power to make bombs.
[1969] But as an energy source, it's amazing.
[1970] Yeah.
[1971] So we're just kind of overthinking it.
[1972] And then the issue got, not overthinking, but really got hijacked by a bunch of opportunists that want to use it as a way to exercise control.
[1973] So for you, you experienced these people that came along that were kind of grifters, that were saying we need to incorporate some diversity training.
[1974] And, by the way, you're talking about an, like, extremely progressive liberal organization that you were part of.
[1975] Oh.
[1976] Which, if there was any racism, it would have stuck out like a sore thumb.
[1977] It's, if anything, like, you're promoting the complete opposite of what they're trying to say.
[1978] Yeah.
[1979] Like, by giving you some training, they're trying to find implicit racism or hidden racism or, you know.
[1980] Well, it gets back to that.
[1981] I think there's a thing where, when you just abandon traditional religions and the traditional morality, you want to create, I mean, look, even this BIPOC thing is so interesting, because I remember I finally got somebody to explain, what is BIPOC?
[1982] Well, that's black, indigenous people of color.
[1983] Literally in the word, it's creating a hierarchy where it's black and indigenous people above, you know, Latinos and Asians who are just barely people of color.
[1984] Right, right.
[1985] It's grotesque.
[1986] Everybody hates it.
[1987] Not everybody, but most people, I think, actually hate it.
[1988] But it has this power because it's providing, in fact, this detransitioner I interviewed.
[1989] She was like the social justice type. She's autistic, you know, on the autism spectrum.
[1990] She has a lot of self-awareness and is older now, but she was like, as an autistic person, that social justice moral hierarchy provided some comfort.
[1991] Like it was a way to deal with the confusion, you know; she was uncomfortable with herself, socially awkward.
[1992] I could kind of fit into this moral hierarchy, and then be really dogmatic about it, and then feel powerful, feel in control, have community. So I think, when you don't have that, that's why I like reverting to the older morality, which is First Amendment morality. Right. The older morality is true anti-racism, that we don't think there are human races, much less that they can be put on a hierarchy. This is what we want to get back to. Yes. And that is the reality of biological human beings, too. Yes, the reality is it's one race; we just adapted to different climates. That's all it is. We all came from Africa.
[1994] But what specifically, so you experienced this, and you recognized these people were grifters, and then, like, what moved you other than that? Is the Twitter files the biggest shift in your politics?
[1995] Yeah.
[1996] I mean, the first big one was nuclear.
[1997] After you realize that nuclear is good, not bad, that's such a big one.
[1998] You know, you're just like, wow, man. Because nuclear was already the secular devil for those of us that grew up back then.
[1999] Because it's connected to power. Weapons, rather.
[2001] The Day After and all the nightmares.
[2002] Yeah, and it's also connected to like Three Mile Island and Fukushima.
[2003] Yeah.
[2004] I love these things.
[2005] I mean, I think it's like, you know, it's funny because, of course, we know that disconfirmatory information is dopamine depleting.
[2006] In other words, like if we get proven wrong, it's like depressing.
[2007] You're like, oh, but there's another way after you get over it, you're kind of like, well, that's cool.
[2008] Like, nuclear is not what I thought it was.
[2009] Like, there's actually a moment of awe.
[2010] You know, it's like seeing a UFO or being like, oh, my God, we might not be alone.
[2011] There's something exciting about the excitement.
[2012] We need to get back in touch with the excitement that comes after you realize that you were wrong.
[2013] It's an awareness of some humility and that the world is more mysterious and wonderful than we had realized.
[2014] Yeah, I think there's also a really a great benefit in expressing to people that ideas are just ideas.
[2015] It's not you.
[2016] Yes.
[2017] These are just some things that are bouncing around your head.
[2018] And even if you're wrong, it's like, it's not a value judgment on you.
[2019] You should probably be wrong less than you are right.
[2020] You should probably be right much more.
[2021] But it's very important that when you are wrong, to acknowledge that you're wrong, one of the worst things that happens to a public intellectual is when they are wrong and they refuse to admit they're wrong.
[2022] This is the Sam Harris dilemma.
[2023] Like there's many people that are very brilliant people, but they're in this trap where they can't say they were wrong.
[2024] And if you can't expose people to your thought process and why you made errors, they're going to lose faith in your ability to discern the truth in the future.
[2025] And isn't it ironic that often those are the people that are always talking about being without ego?
[2026] It is.
[2027] It's sad.
[2028] I always notice it's like, wow, the people that talk so much about not having ego are the biggest egos.
[2029] It's just being a human, man. It's being a human.
[2030] And I think it's also just a sign of our ideologically driven times where I think the divide between the right and the left and the boundaries in between them are so wide now.
[2031] Yep.
[2032] It's so different.
[2033] I think there's that thing, too, where, again, abandoning traditional religions and traditional morality, I think people do start to play God a bit unconsciously, and they get real self-righteous. And really, I think it's great to, I mean, I don't know how to do it, but for me, it's always,
[2034] like, we're all going to die. Let's just pause for a minute. Like, we're going to die. And not only that, this is what Stoicism is so good at, you know, it's memento mori. Oh my God, they're right there. You know, these are, what, like six of them? They're amazing. It's like, that's you very soon. So what the fuck are you going to do between now and then?
[2035] For the people listening he's pointing to these skulls that are on the table really cool yeah these are all from this guy Jack of the Dust, who's an artist.
[2036] They're not real skulls.
[2037] Amazing.
[2038] They aren't real skulls?
[2039] No, they're just resin.
[2040] He makes these.
[2041] But we'd be a lot better.
[2042] I think there's some way, when you remind people, some people of their deaths, they get kind of reactionary and smaller.
[2043] But other people, I think there's a moment.
[2044] It's like, yeah, like, so what am I going to, like, this is it?
[2045] Like, what are you doing now?
[2046] And what kind of a person do you want to be?
[2047] And what kind of a life do you want to lead?
[2048] We need that.
[2049] My friend Peter Attia, he introduced me to this thing called Your Life in Weeks.
[2050] And each week, you scratch one off and you look at all the weeks.
[2051] You can see all weeks.
[2052] You can see them all.
[2053] All of them.
[2054] And you go, this is how much you got left, unless something radical changes.
[2055] And it's just like, whoa.
[2056] And people said to him, like, oh, my God, this is so depressing.
[2057] It's like, it's actually not.
[2058] Right.
[2059] It reaffirms my understanding of what's important and it makes me want to spend more time with my family.
[2060] And it makes me want to not do things that I'm really not interested in doing just because they're going to make me money.
[2061] Right.
[2062] Yeah.
[2063] And if we can get that into the heads of some of these fucking people that are censoring people, and some of these people that are pushing these crazy agendas and hiding information from people because they think that it's going to contribute to an undesirable outcome that doesn't fit in line.
[2064] If the other group wins, you did a shitty job.
[2065] And if you're hiding information that would allow that other group to win, you're a bad person.
[2066] Like, you're bad.
[2067] Like, if there's actual real criminal evidence that you're hiding because you don't want this other person to get elected, you're doing a terrible thing to humanity.
[2069] And you're doing it based on these very base and normal human instincts.
[2070] Absolutely.
[2071] And like look at that.
[2072] I mean, they won't.
[2073] So first of all, like I mentioned, that Aspen workshop with all the journalists and all the social media companies.
[2074] I emailed every single one of the participants and said, would you please talk to me about this?
[2075] Not a single one.
[2076] I'm sorry, the Washington Post, actually, of all places, responded, not the actual reporter, but through a spokesperson, with some lame thing. But it's kind of like, if you're so confident, if you're so much better than everybody, then why can't you come and just have a conversation and defend it?
[2077] Yeah.
[2078] You know, skulking around, I mean, it shows the underlying insecurity and weakness behind those censors.
[2079] I mean, it's the hall monitor type.
[2080] You know, it's the, they're the little church lady.
[2081] You know, they don't want to have an open conversation.
[2082] They want to exercise power behind the scenes.
[2083] Well, that's a human instinct.
[2084] It's a natural human inclination to control other people that you might think are threatening or in competition with you or might, you know, somehow or another get in the way of your desired goals.
[2085] And people get so self-obsessed in those things, without something like your life in weeks, where you can just look at it like, oh, this is all fucking fruitless.
[2086] Like, what am I doing here?
[2087] I'm going to go send the life in weeks to all of the censors and be like, you are here.
[2088] How long do you want to keep trying to censor your fellow Americans?
[2089] I mean, like, what are you doing?
[2090] Well, I think one of the things that has happened that has been greatly beneficial is the exposing of the Twitter files and the making it public. Like, especially, we were talking about this last night at the club, that woman who was calling Matt Taibbi a so-called journalist.
[2092] She called us both that, by the way.
[2093] Yeah.
[2094] Yeah.
[2095] Hilarious.
[2096] Like, what is a journalist to you, someone who says things only that you agree with?
[2097] Well, and we talked about also what a powerful projection it was, because she's a non-voting representative from the Virgin Islands.
[2098] Yeah, which, as somebody pointed out, right, she's a so-called representative.
[2099] Yeah.
[2100] Yeah.
[2101] But, I mean, calling Matt Taibbi, who is so decorated, a so-called journalist.
[2102] And the fact that he got to rattle off all the awards in journalism that he's received.
[2103] I mean, it shows what their concern is.
[2104] Their concern is around status.
[2105] I want to know who the fuck talked to her.
[2106] I want to know who boosted her up and got her to say those things.
[2107] Oh, yeah.
[2108] Say it that way.
[2109] Like, what was the conversation?
[2110] Someone should get into her fucking emails.
[2111] I'd like to know.
[2112] Like, what conversations were you privy to?
[2113] Like, what did they say to you?
[2114] Well, there's also Debbie Wasserman Schultz.
[2115] Another one.
[2116] She's famous for derailing Bernie Sanders.
[2117] And so, you know, yeah, it's all the pot calling the kettle black.
[2118] Well, it's also the shittiness in which they communicate with these people who are just exposing something that everyone should be aware of because it's a real problem.
[2119] Right.
[2120] And what Twitter is and what Facebook is and Instagram and all these social media platforms, these are our new public squares.
[2121] And we need some sort of an understanding of the significance of censorship in regards to what kind of an impact that's going to have on our life, our real-life world.
[2123] Like, how many people who got censored off of Twitter had it radically change their life, for the wrong reasons, change the progression of their future?
[2124] I would imagine a lot, quite a few.
[2125] And also, deeply humiliated them and probably ostracized them in certain social circles.
[2126] There comes, Mike, he got banned from Twitter.
[2127] Yeah.
[2128] Like, what did he say?
[2129] Oh, yeah.
[2130] He said, only women have vaginas.
[2131] Meghan Murphy.
[2132] Yes.
[2133] Yeah.
[2134] And, men aren't women, though.
[2135] I love it that she said that to her.
[2136] Like, after she got back, yeah.
[2137] And she's still mad about it.
[2138] Oh, you did?
[2139] Okay.
[2140] She's my friend.
[2141] I had her on the podcast when she was banned.
[2142] Because I was talking about her before she was banned because I, excuse me, before she was brought back because I had heard she was banned for this.
[2143] And so that was the ultimate Streisand effect.
[2145] Like, I took this woman who was this obscure journalist who got banned for disagreeing with trans activists and I brought her in front of millions of people.
[2146] I'm like, what happened?
[2147] Amazing.
[2148] Tell me what happened.
[2149] Multiple times she's been on.
[2150] Amazing.
[2151] And so now, you know, she's back on the platform and, you know, now people get to, she's a brilliant woman.
[2152] And she's also, she has some really good points.
[2153] And her point about trans activists, like, you're trying to silence biological women, but you're bringing these biological males into these traditionally women's spaces, and they're calling themselves feminists, and she's like, that's not real.
[2154] Like, this is not what's happening here.
[2155] Absolutely.
[2156] I mean, it's interesting, the people that are trying to kind of put everybody down, you're the deplorables, or the so-called journalists, or just all of the insults.
[2157] They're coming from people who have just been the worst bootlickers, their entire careers, suckups, brown nosers.
[2158] They are so proud of having sucked up for so long that they're they're deeply threatened by people who are actually challenging the status quo.
[2159] Well, that's the mainstream journalist approach to the internet journalist.
You know, when you have people like Krystal and Saagar from Breaking Points, who are, like, beholden to no one?
[2161] Right.
[2162] Like what?
[2163] They hate it.
And also they have a subscription-based service, so they don't even need advertisers?
[2165] The fuck is going on here.
[2166] Like, who saw this coming?
[2167] No one, right?
Before Substack, there was never a place where someone of the caliber of Matt Taibbi or Glenn Greenwald could post and millions of people could read their stuff, and it could make international news just like a Washington Post article, just like a Boston Globe article.
[2171] Well, they're envious too, right?
[2172] Of course.
[2173] They should be.
[2174] Yeah, you have to like go.
[2175] You go to work.
[2176] I mean, if you work at one of those traditional news outlets, you go to work every day, you're not able to publish and write.
You know, you write something, the editors sit on it.
My friend Nellie Bowles, who's married to Bari Weiss, you know, when she was at the New York Times, you know, like, you'd write a story and then you'd argue with editors for weeks.
And then maybe they'd publish a thing that was, like, half its original length and had been completely wokeified.
So they don't have that. They're actually jealous of the freedom that people with free speech have, and they want to stamp it out.
[2182] 100%.
[2183] And it's, you know, it's a threat to the choices that they've made.
The good thing is that they're blind to it, and so they end up doing what they did to us in that hearing. Because it was kind of like, I remember we were just laughing, and I was like, do they realize how bad they look? They didn't. They thought they were super clever. I don't know if you saw, they put a photo up of you with Matt. Did you see that?
Yeah, I look high as fuck too. It was hilarious. I must have been sleepy that day, because, like, the photo that they chose makes me look, like... oh yeah. But the idea that they're connecting me to him, and so, with this giant photo, that somehow discredits.
[2186] Yeah.
[2187] They don't understand the landscape.
[2188] No. They really don't.
[2189] They've lost the plot.
[2190] They've lost the plot.
[2191] They have a bubble.
They have, like, this ideological echo chamber that they exist in, and they think that, like, holding up a photo... oh, you're on the biggest program in the world.
[2193] What are you?
[2194] A piece of shit?
[2195] Yeah.
[2196] No, for sure.
[2197] Well, I mean, look, I mean...
But it's really funny.
[2199] It's funny to watch.
[2200] And this is this transformation from the world of these corporate -owned distributors of information to independent people that people actually trust that don't have any sort of weird connection to executives and producers and all these other people that have a vested interest in pushing a narrative that is established by the advertisers.
[2201] You sound optimistic.
[2202] I am in that.
[2203] Well, I knew something was going on when Howard Stern started criticizing podcasts.
[2204] I was like, that's hilarious.
This was, like, a long time ago. Like, why are these fucking idiots wasting their time?
Like, these losers do podcasts.
[2207] Like, oh, that's, you're threatened.
[2208] Threatened by it.
[2209] Because you're stuck on satellite.
[2210] Right.
[2211] And satellite only goes to places where the satellite reaches.
Like, it doesn't even work in tunnels, man. Well, one of the most bitter people on Twitter is Keith Olbermann.
Like, he's always trying to get back to, like, a cable show. He had one for a little while?
[2214] He's hilarious.
[2215] Yeah, he's hilarious.
[2216] That guy's, he's a fucking human caricature.
[2217] Yeah.
That guy, like, when he was doing that thing where he was, like, ranting in his basement about Donald Trump being arrested imminently, at any moment he's going to be arrested.
[2219] And that thing he would do, the resistance.
[2220] It's like, God, it was so cringe.
[2221] Yeah.
[2222] But it's also, it's like, you're so clearly angry and arrogant and shitty.
Like, do you not understand that these are personality traits that nobody loves, nobody likes? And especially if you are uninformed, misinformed, incorrect. And it was, like, when he was this vaccine promoter. It's like certain people look for a thing that they think there will be popular opinion behind; they get with that thing so that they can connect themselves to a winning movement, and then they angrily advocate in favor of other people complying. And that's what he did.
[2225] No, they're like junkies seeking a fix.
[2226] They're seeking a fix of immediate social reward.
[2227] Yeah, I mean, he got vaccinated on film.
[2228] Come on.
[2229] Come on.
[2230] Oh, show everybody.
[2231] Gross.
[2232] Like, yeah.
Meanwhile, I already had natural antibodies by then.
[2234] I'm like, why?
[2235] Yeah.
Do you guys not understand how this works?
[2237] No, yeah.
[2238] This is how it's worked forever.
[2239] That's the weirdest one.
I mean, I was fact-checked during my campaign by the San Jose Mercury News, which was like, Shellenberger's got some really weird ideas, including this idea that natural immunity is just as effective as the vaccine.
[2241] It was like, what?
It's crazy to watch the left, captive to the pharmaceutical industry.
[2243] Like, how did that happen?
[2244] You guys were the ones that were always my body, my choice.
[2245] Like, what the fuck is going on with you people?
[2246] Well, no, I remember, like, you know, I was raised by very progressive parents.
[2247] My dad had a food co -op.
I mean, I remember hearing about natural immunity forever. You know, it was like, natural immunity?
[2250] That's like a liberal thing.
[2251] Right.
But it was Dennis Prager who would always be talking about, I got COVID so I could have natural immunity, you know.
Yeah, and people were like, you're crazy.
And it's not only that.
[2256] I mean, just the fact that no one has a problem with all of the different remedies that were effective against COVID being suppressed.
[2257] Right.
[2258] And that there was one narrative.
[2259] And that narrative was connected to the emergency use authorization, which is only applicable if there's no other treatments.
[2260] Right.
The fact that that was never disclosed is disgusting. But it's also the terror of this impending pandemic that's going to take out your loved ones.
[2263] And, you know, Robert Malone talked about that on the podcast, that it creates this mass formation psychosis, that you have this one thing that people are looking at as the savior.
[2264] And any suppression of that or any resistance of that, you are going to ruin my life.
[2265] I'm trying to get back to work.
I'm trying to make society do the thing. And you can't even be like, hey, but maybe we should see studies. Hey, but where's the data? Hey, why are they telling pregnant women they can take it? There's no studies on pregnant women. Hey, you know, why is there an increase in all-cause mortality in the year that everybody got vaccinated? Hey, what is going on with all the myocarditis? Hey, what's up with the strokes? And everyone's like, la la la la, not listening, la la. And if it wasn't for people like Robert Kennedy Jr. writing that book, if it wasn't for people like, uh... how do you say his name?
Jay Bhattacharya.
[2268] I don't want to fuck his name up.
Jay Bhattacharya, or that gentleman from the UK, John Campbell, or some of these other doctors.
[2270] I love John Campbell.
[2271] I love John Campbell.
[2272] He's my favorite.
He's so, well, he's so measured and even, with a tinge of British sarcasm to some of the things that he says.
[2274] But not pompous either.
[2275] And also very much like, I want people to have the information.
[2276] Yes.
[2277] But I think that it's the will to control that comes before the catastrophizing.
They want to have control, and then they exaggerate the problem, whether it's climate change or COVID.
[2279] That's what's coming first.
[2280] It's the need for that social power.
I think the other issue, and it struck me as you were talking, the reason that they want to emphasize the vaccine over the remedies, and Steve Kirsch talks a lot about all the different ways in which you can treat COVID, bless you.
[2282] Thank you.
[2283] Is that that's sort of something that you can do on your own.
You can treat COVID, whereas vaccines were going to be something that we do as a society, and it's this collective action.
Well, any resistance to that puts you in this anti-vaxxer category, which is, like, one of the worst pejoratives in modern times.
Climate denier is right up there.
[2288] It's right up there.
[2289] It's right up there.
[2290] Yeah.
[2291] You got lumped into that.
[2292] Oh, yeah, for sure.
[2293] Meanwhile, you're just stating data and facts.
[2294] But the...
[2295] I mean, it's funny because I've been thinking, I was like, looking at the trans folks, they're actually sex deniers.
[2296] Mm -hmm.
And for a minute there, I was like, should we call them sex deniers?
[2298] And I was like, God, that's just like, I don't want to be that guy.
[2299] I don't want to be that guy either because I have friends that are trans.
[2300] And, you know, like.
Well, no, but you can be trans and not be a sex denier.
I mean, in other words, a sex denier is somebody that says that biological sex is not real.
It's just a social construction.
[2304] Right.
[2305] So I think that, you know, which is just absurd and obviously so.
[2306] But yeah, like, I just, yeah, we don't want to be that.
[2307] No. We want to be the change in the world.
[2308] I want people to be able to do whatever they want to do.
But I don't want it to happen to children before they can figure out what the fuck is going on, and I don't want them to be coerced.
[2311] Children are so malleable.
[2312] You can get them to join cults.
[2313] You can get them to believe that they have to strap a suicide vest on and walk into a crowded courtyard.
[2314] There's things that you can get children to do that you're not going to get older people to do.
[2315] Yeah.
[2316] And to influence them to make a permanent change on their body that will sterilize them and also prevent them from experiencing sexual pleasure.
Excuse me. Bless you. I just coughed that time. It's fucked. I mean, it's attached to an ideology, so because it's attached to this ideology, it has to be universally and blindly supported. Yeah. And I just think it'd be nice to get beyond... I mean, it's funny, because there's a bit of an arms race with the language, where, you know, they just accuse their opponents of being racists, climate deniers, anti-vaxxers, election deniers. And if you're kind of like, hey, can we move beyond these reductive labels, they still have the advantage, because these labels are so powerful.
[2318] They're so tricky.
It's similar, like, I was kind of like, I don't even use this language of disinformation.
You know, Greta Thunberg is a purveyor of disinformation.
[2321] The world is coming to an end in a few years.
[2322] Yeah, AOC said that.
Yeah, AOC, she had to delete that tweet.
AOC said the world's ending in 12 years.
[2325] That's disinformation, but I don't.
[2326] Like, do we have to go and call it that?
[2327] Or can you just be like, you were wrong?
[2328] Well, you're not just wrong.
[2329] You're spreading fear that's unnecessary.
[2330] Yeah.
[2331] And it's not based on facts.
[2332] And it's only there to support your narrative.
[2333] You're saying it to support your narrative.
[2334] And it makes you less reliable.
[2335] Yeah.
[2336] And you shouldn't do that.
[2337] Yep.
I think there's also that thing, and Jordan makes this point, about the online element.
[2339] I mean, it's much harder to demonize somebody when it's this.
[2340] Yes.
[2341] When we're in person.
[2342] Right.
And I can see the, as we say, you see the God in you, you see the God in me. It's much harder.
[2345] Whereas like online, you've already, it's dehumanizing by nature.
[2346] And so labeling your opponents and demonizing them is much easier.
[2347] Like the Matt Walsh conversation.
[2348] Imagine if Matt Walsh and I had that conversation on Twitter.
[2349] I'm like, it would take months.
[2350] You couldn't, yeah, you can't do it.
[2351] It wouldn't work.
[2352] It wouldn't, no one would ever achieve an understanding of what the other person thought.
[2353] And it would also be like probably pretty nasty, which I don't think is necessary.
[2354] Like, I can disagree with someone, and have a conversation with them and just talk to them.
[2355] Right.
[2356] But then there's also people that are like really, they're bad actors, and they're only saying something because it conforms to their ideology, and they're essentially grifters.
[2357] They've attached themselves to this thing, and that is their business.
[2358] We could say they're lost souls.
[2359] Yeah, they're lost souls.
[2360] That's a good way of putting it.
[2361] I mean, it's slightly sweeter.
[2362] I mean, Michael, you're such a positive person.
[2363] Well, it's just, it's hard.
[2364] I mean, I'm a bad, I mean, The part of the reason I came back to being a Christian is that Christianity, I came back to it actually while working on Greta Thunberg, at the end of my book Apocalypse Never.
[2365] And I was like, what's the remedy for this intense hatred and anger against civilization?
[2366] And I was like, it's love, obviously.
[2367] Loving your enemies is for me what Christianity is about.
[2368] Like, it's the heart of Christianity.
[2369] It's really hard.
[2370] Forgiveness.
[2371] Yeah.
[2372] Forgiveness.
[2373] Yeah.
[2374] Yeah, but it's really, really hard.
[2375] And so for me, it was like, I'm interested in having a faith that's hard, not easy.
[2376] If it were easy, then what's the point?
[2377] You know, it's got to make you better in some way.
[2378] I get the same thing out of Stoicism.
[2379] I find it completely compatible or these death meditations.
[2380] You do it not because it's wonderful to think about being dead.
[2381] You do it because you think it's going to, you know, it's actually the, God, the other guy you had on who I just adore is Andrew Huberman.
[2382] Yes.
Where it's like, you know, I listen to all those, and he's got a colleague, what's her name, she did Dopamine Nation. Um, I'm blanking. Susanna Söberg? No, no. Oh yeah, I remember her. She's gonna be so mad that I'm blanking on her name. But, you know, basically, Jamie's got... sorry, Anna. Anna, Anna. Sorry, Anna, if you're watching this. But, you know, it's so simple, but they're like, you know, like Huberman. So, first of all, I now do my morning run before I drink my coffee, and I take a cold shower. Because that amount of adversity, which, I mean, it's kind of a joke, like, that's not adversity really, but it's a little bit. It's not great. I'm not happy. Like, I have to get my tennis shoes on, and you're running, like, early in the morning. But he's absolutely right that actually leaning into the adversity, leaning into the pain a bit, makes the rest of your day easier. It really does. There are occasional days, very rare, when I must be busy, where I don't go into the cold plunge first thing in the morning, and those days aren't as good.
[2384] Oh, you have a cold plunge.
[2385] Oh, you have a fucking cold one, bro.
I have a Morozko Forge at home that's 34 degrees, and I climb into that bitch every day for three minutes.
And then we have a new one that's getting installed here, a Blue Cube, that's even more horrific, because it's got a constant flow, like a river, so you never establish a thermal barrier.
No? Is it not that you heat up the water?
[2389] Your skin has a thermal barrier.
It's like, if you stay still in the cold water, it's way easier than if you move. If you move, it's fucking horrible, right? And the Blue Cube is just like a raging river on you. Yeah, yeah. Which is probably even more adversity that you have to overcome. You know, Nassim Taleb does a good job with this with his book, Antifragile. Yeah, Antifragile. I find him annoying on Twitter, but I think that insight... That's probably just Twitter, right? Oh yeah, probably. No, I betcha in person... Well, he's probably an arrogant asshole in person too, but it's a brilliant book. Very brilliant people are arrogant. Yeah, it's part of, like, what makes you brilliant in the first place, and putting in the hard work. I mean, that's the discourse on the censorship stuff too. It's always, you know, we're trying to reduce harm. They go, we want to reduce speech that causes harm. It's like, wait a second. Like, I know what you mean, like, we don't want to do bad in the world. On the other hand, like, we know that coddling children is terrible. You create unstoppable harm. It's way worse than letting that kid experience some adversity.
[2391] Yeah, yeah.
[2392] Just the right amount of adversity, the right amount of harm.
[2393] It's not easy, but I mean, we have to get back to teaching that.
[2394] I agree.
[2395] Michael, it's always a pleasure.
[2396] Thank you very much for coming here.
Thanks for having me. If you ever have any more Twitter files that you have to go through, like if you really do go through the Fauci files or whatever, please come back.
[2398] We'll do it again.
[2399] Appreciate you.
[2400] Appreciate you today.
[2401] Thank you.
[2402] Bye, everybody.