#2119 - James Lindsay

The Joe Rogan Experience

Full Transcription:

[0] Joe Rogan podcast, check it out.

[1] The Joe Rogan Experience.

[2] Train by day, Joe Rogan podcast by night, all day.

[3] Don't worry, sir.

[4] I'm good, Joe.

[5] Because you're American masculine.

[6] We both, we didn't even coordinate, but both wearing American flags.

[7] Yeah, well, I mean, it's that kind of, it's that time, right?

[8] It's time to start saying, you know what, I'm an American, and that's cool.

[9] Before you say that, I mean, if you don't, we're on the way to saying I'm Chinese.

[10] Yeah.

[11] Well, how's your Mandarin?

[12] Yeah, it might be a good time to learn it as they're all sneaking in across the border.

[13] That's one of the more disturbing things.

[14] When I talked to Bret Weinstein, he was talking about how many Chinese military-aged men are sneaking across the border.

[15] And you want to look at it the best way possible.

[16] You say, well, it's probably a bunch of people that are looking for work.

[17] And it's probably a bunch of people that are, you know, there's not as many Chinese women and they're trying to look for a girlfriend or something.

[18] And why do they have military haircuts?

[19] Well, they're probably, you know, it's just like a young man thing.

[20] Yeah.

[21] I mean, I've heard more specifically, I can't vet it so I can't prove it.

[22] So, like, there's the grain of salt up front.

[23] But I have heard that even Chinese special forces, if I were special forces of a hostile country, I'd try to sneak across and do infiltration.

[24] So I've heard that there might be even, you know, hundreds or thousands of those, not hundreds of thousands, hundreds or thousands.

[25] But I don't know if that's true.

[26] Well, I wouldn't, it's not even really sneaking in anymore.

[27] No, you just kind of walk across and, I mean, there's even memes that are like, I'm going to go to Honduras and give up my American citizenship and come back across so everything will be paid for for me. You know, it's like, no, it's not sneaking across.

[28] It's like, as they are saying, full scale invasion.

[29] Well, it's just weird.

[30] It's weird that we're just kind of going, whoa. Well, we've always, I mean, there was always customs.

[31] There's always land.

[32] They check out your stuff.

[33] They look at your paperwork.

[34] They go through your passport, they ask you questions, why are you in this country, you know, and it's always been that way.

[35] Like I was watching this video with the, you know who Deadmau5 is?

[36] Yeah.

[37] Deadmau5, the musician, the DJ.

[38] He was, he was trying to come into the country to visit his friend.

[39] And they said, no, you're coming into work.

[40] He's like, no, no, I'm coming because he's famous.

[41] He works.

[42] He's like, no, I get paperwork.

[43] And they kicked him out of the country for like seven years.

[44] Whoa.

[45] He should have just walked through.

[46] Yeah, right.

[47] And then it would have been fine.

[48] But what a bizarre thing.

[49] If you're undocumented, if you are poor, and if you're going to do cheap labor, walk right in.

[50] But if you're a highly skilled world famous DJ and you want to go visit a friend, we're concerned that you might actually be working there.

[51] Yeah, or like a super pro tennis player.

[52] who's going to go play in the U.S. Open, but no, maybe not, right?

[53] Well, that was the Vax.

[54] That was the Vax.

[55] By way, Neil Young came back to Spotify.

[56] Congratulations, Neil.

[57] Yeah, well, that's good news.

[58] And his excuse was he said that because all of the platforms are now allowing my disinformation.

[59] So he let his music go back on Spotify, too.

[60] Oh, yeah.

[61] Great to know you've got some ethics.

[62] Yeah, well, everybody's doing it these days, disinformation.

[63] But yeah, I mean, there's a strategy.

[64] The reason the border is the way it is, well, there's a strategy.

[65] I don't know who's playing the strategy for sure, but the Cloward-Piven strategy, I'm sure you've heard of that.

[66] Somebody's got to have talked to you about Cloward-Piven.

[67] Can you explain it?

[68] Yeah, it's pretty simple.

[69] The idea is that you take advantage of a system in the way that it's set up so that you overwhelm it, in particular.

[70] In this case, you're going to overwhelm social services.

[71] You're going to overwhelm, you know, border enforcement.

[72] you're going to overwhelm whatever they're doing in the cities, you know, it's like tens of thousands of dollars per taxpayer or whatever per year going to dealing with what they're calling the migrant crisis.

[73] So you try to overwhelm the system in order to basically collapse it so that you can create a crisis and the crisis creates the excuse to bring in new policies.

[74] Oh, well, maybe what we need is, what do they call it, E-Verify or something?

[75] So we need a digital system where we can track who everybody is, but then they get their digital system and then you're off to the races.

[76] Yeah.

[77] But yeah, this is an old strategy, well-documented.

[78] Well-documented.

[79] Who do you think is implementing this strategy?

[80] And what are the conversations do you think?

[81] Well, it's not possible to deny that the Biden administration is implementing it because, look, they went to, they tried to fight Texas on securing its own border to protect its own citizens.

[82] That blew up.

[83] What was it end of January?

[84] Is that when they, it all blew up?

[85] Very recently, yeah.

[86] Yeah, it was pretty, pretty recent.

[87] And so certainly they are.

[88] We know historically that the Open Society Foundation or the Soros Foundation, which is Open Society, has been funding that and has been helping out.

[89] We know that the UN is involved now.

[90] Like, these aren't, these aren't mysteries.

[91] The UN is coming and doing, you know, aid and coaching them.

[92] And somebody's organizing not just, it's like it's not just a bunch of people from South America and China or wherever else or Mexicans wandering up to the border and like just, hey, I'm here.

[93] There's, like, routes.

[94] There are, it's caravans.

[95] There's help.

[96] It's coordinated with a lot of money behind it.

[97] And we know that those organizations, the United Nations, particularly, is helping this.

[98] So big players.

[99] So what do you think the strategy is?

[100] The strategy is to implement some sort of a worldwide verification system.

[101] And the way to get these freedom-loving shitheads in America on board is to turn America into a crime-ridden place of immigrants coming from very hostile places where their life has been very hard.

[102] And they've been in prison or whatever, and they're escaping that, and they're coming to America, and then they're off to the races.

[103] Yeah, well, I mean, that's a plausible motive, right?

[104] Is let's overwhelm this system because these freedom-loving shitheads here in America, which I think I am for sure.

[105] Look at your shirts.

[106] Look at me, yeah.

[107] But no, I really am.

[108] Like, I'm still at the end of the day.

[109] I just kind of want to be left alone to live my life.

[110] Like, you do your thing.

[111] I'll do my thing.

[112] Like I really, if you understand the two, there are two lines.

[113] Let's be real clear.

[114] Because when people say, do whatever you want as long as you don't hurt anybody, it's not clear enough.

[115] There are two lines.

[116] Do you understand the difference between public and private?

[117] And do you understand the difference between adult and child?

[118] If you understand those two lines and you're on the right side of those, I don't care what you do in private, as long as just with adults, I don't care what you do.

[119] Leave me alone.

[120] I'll leave you alone.

[121] You're cool.

[122] I'm cool.

[123] Like, let's not interfere with one another.

[124] Which is how we all should be.

[125] That's what real freedom is.

[126] But we don't want a system, you know, tying us, like the Chinese social credit system's real, right?

[127] We're not, this isn't some conspiracy out in the world. Whether or not it's coming to the United States is a question; whether Americans would want it is a question.

[128] But it's in China.

[129] It's real in China for, I mean, it's been there for a decade.

[130] I've been to China.

[131] I've experienced, you know, life there.

[132] And the fact of the matter is that this would, that worldwide verification system would set something like that up.

[133] You can also overwhelm the U.S. system so that all of a sudden, you know, it has to start taking some kind of an emergency measure to deal with whatever problems.

[134] You know, we can talk about the crisis here in the United States.

[135] But holy crap, look at what's going on in the U.K. I was over there right after, right at the end of October.

[136] So on October 7th, we all know what happened in Israel.

[137] And then all these huge protests broke out, like pro-Palestine.

[138] So I had some places to go.

[139] I don't really give a shit about my surroundings all that much.

[140] I'm going to do what I want to do as long as I'm not like customer.

[141] So I'm walking against the grain up this, whatever they said, like 150,000 or something like that.

[142] People waving the Palestinian flag walking down the street, the other way on their march in London, because I had to get where I was going.

[143] London's in trouble, right?

[144] Like the UK is in trouble.

[145] When we start talking about this overwhelming the system, we're looking at these kind of, you know, much more generous social democracies.

[146] Sweden, Germany's hosed.

[147] I mean, their economies are possibly in free fall.

[148] The UK and... what, at that point, what does the solution look like, right? How could they fix that problem now? Belgium's a big one. I was riding with this guy. I went to speak at the EU Parliament this time last year. So I'm riding with this dude, and it turns out he's like the European James Bond. He's driving me from the airport, and he's like, oh yeah, you know, we've got to deal with this problem, I do all the security stuff or whatever. And he's talking to me about how you can get arrested if one of them starts a fight with you. When you do anything about it, it's racism, and you'll end up hauled before a tribunal, and it's happened to him.

[149] And it's like, we've got to start figuring our way to get them out, but it's like, how do you get them out?

[150] And I don't mean everybody.

[151] I mean the people who are causing criminal problems, the people who aren't trying to follow Belgian law or UK law or whatever else.

[152] And they're going on TV saying this.

[153] Like, you know, there's that Imam or whatever the other day that famously went on and was like in London.

[154] And it was like, you know, we're going to take this country.

[155] Like, we're not going to follow your rules.

[156] We're not going to follow your law.

[157] I don't remember what he said.

[158] So that's not exactly right.

[159] A lot of information passes between these ears these days.

[160] But the fact of the matter is, the question becomes when you have a crisis at that scale, what are your options for fixing it?

[161] And I think that that's part of the Cloward-Piven strategy.

[162] How do you end up fixing a problem that's at that scale?

[163] I think they're doing the same thing.

[164] To be honest, it sounds all crazy conspiratorial.

[165] But I think this is why it's been pedal to the metal

[166] with the transition stuff, the trans stuff.

[167] If you end up with a million kids, you've got a million kids.

[168] Like, that really are on the medical system.

[169] What do you do with them?

[170] What do you do with a million kids?

[171] And then their parents and their aunts and uncles, everybody, the whole system has to start bending around a reality that was kind of manufactured.

[172] And you can get some major changes.

[173] But it seems like this, if you want to go full tinfoil hat, there has to be a plan.

[174] So that means there has to be conversation.

[175] There has to be a bunch of people that agree to this.

[176] Like, who are those people and how do those conversations take place?

[177] Well, I mean, we do, who are the people?

[178] Well, again, I just point back.

[179] The Biden administration has to have had conversations.

[180] They petitioned the Supreme Court to stop Texas from enforcing its border.

[181] I would love to know what those conversations look like.

[182] Whoever is funding it at some point had to sit down at a table probably not exactly like this.

[183] It might not have as much cool stuff on it, but they sat down and they signed some

[184] contracts and said this is where the money is going to go.

[185] Do you think it could be that it's the federal government putting power over state governments to make sure that state governments don't say we can do what we want?

[186] Well, I mean, that's the fight between Texas and the federal government.

[187] So for sure, that's part of it.

[188] But I think there's the United Nations that's kicking this too, that's pushing this.

[189] I mean, so a lot of people don't understand, and I'm skipping around, the United Nations sees itself as a kind of global entity, 193 member states, blah, blah, blah, 17 Sustainable Development Goals to Transform our World, all that.

[190] But I'm going to skip over and talk about like Soros for a second, because we know that the Open Society Foundation has pushed a lot of this kind of stuff, too.

[191] And it's a lot of people don't understand Soros, or what is the open society that he's talking about?

[192] Well, it's based off of, a lot of people don't know, Soros' mentor was the famous Karl Popper, and Karl Popper wrote a book in 1945 called The Open Society and Its Enemies.

[193] And so the open society is what we've been taking for granted, basically, in the post -World War II era.

[194] And that's what we want.

[195] That's where it's a free society.

[196] It's a high trust society.

[197] It's a, you know, people can do what they want.

[198] They don't have to worry about, you know, whether they're going to get carjacked all the time or whatever else.

[199] And Soros is like, well, you could have that in the nation or you could have that where there's kind of one open society in the globe.

[200] So a lot of people start thinking that he's working with China, but he doesn't like China because China doesn't have an open society.

[201] That's not what he wants.

[202] But the idea that there's this line that comes across to south of Texas and New Mexico and Arizona and California where arbitrarily, so to speak, the United States says, this is our land and Mexicans have to stay out.

[203] He would be against that.

[204] That's not an... this should be like an open, Pan-American kind of mega-continent, kind of, in his mind, with one society.

[205] So what do you have to do?

[206] Well, you have to dissolve a border, and how can you dissolve a border?

[207] Well, make so many people be able to cross that border through changes legally and through flooding the system so that the border doesn't really mean anything anymore because borders are simple, right?

[208] What is a border?

[209] It's a line we draw on a map and we say laws on this side of this border mean this and laws on the other side of this border are different, right?

[210] U.S. has law.

[211] Mexico has law.

[212] And this line is where we have U.S. law versus Mexican law on either, you know, one step across.

[213] and now you're in another set of laws.

[214] That's what they mean.

[215] That's what borders are as a political entity.

[216] But if you can water that down, so it's like, well, there's so many people coming across, like, is there really a border?

[217] Right.

[218] That's the idea, because Soros' idea is a global open society.

[219] Everything in the whole globe, you know, maybe, I don't know if it's that extreme, but maybe you don't need passports, you don't cry.

[220] It's like the EU, but for the whole world.

[221] Wouldn't a better option be America, but for the whole world?

[222] I would say so, because you can look at Europe and see that the EU's not doing really well.

[223] Well, we're not sneaking into Mexico.

[224] No. In fact, you can just drive into Mexico.

[225] Mariana Van Zeller, who does that fantastic show Trafficked.

[226] Yeah.

[227] You ever watch that show?

[228] No. That lady is a gangster.

[229] Oh, man. She goes to the craziest place.

[230] She goes to Colombia and watches them make cocaine and then goes through the jungle with them when they have it on their backs.

[231] So she did one in Los Angeles where it turns out that cops, dirty cops in L.A., are confiscating weapons and then selling them to the cartel.

[232] And they just drive into Mexico with them because nobody checks you when you go into Mexico.

[233] So these guys have trunkfuls of AKs and they're just driving into Mexico and she goes with them.

[234] Holy crap.

[235] The whole episode is documenting this.

[236] And that's the, we're not sneaking into there.

[237] You could just go right into there.

[238] They're sneaking into here.

[239] With the best case scenario, is it even possible to have this everywhere?

[240] Well, that's what we wanted, right?

[241] And that was the whole idea of spreading democracy.

[242] But it doesn't totally seem like it worked.

[243] Yeah, no, it doesn't seem like it worked.

[244] And there are some big reasons for that.

[245] Well, powers.

[246] The power is a big one.

[247] And the fact is, when you start getting divorced too far away, Mexican issues being ruled over in, say, Ottawa is a little bit difficult, right?

[248] Right.

[249] But so there's that kind of stuff.

[250] But there's also a huge geopolitical.

[251] I hate misusing that word.

[252] I learned what the word geopolitical really means.

[253] It means politics of earth things like waterways, oceans, dams.

[254] Oh, interesting.

[255] And so it's like, I've always used it wrong too.

[256] I go totally like autistic every time I say the word now.

[257] And I'm like, damn it, I know it means something different because we all use it wrong, but I'm going to use it wrong anyway.

[258] There's a geopolitical move from China right now called the Belt and Road Initiative.

[259] And the Belt and Road Initiative, that's tied to BRICS.

[260] That's the idea is that the entire global south with China as its head is going to become the new epicenter, the superpower of the world.

[261] And it's going to be not just trade.

[262] I mean, China doesn't exactly trade on fair terms.

[263] They're going to go and basically exploit places.

[264] We'll build you a nice airport.

[265] We'll build you a nice port.

[266] We'll build you some highways.

[267] By the way, they all go straight to the mine.

[268] And we're taking all of your lithium when we come in.

[269] That's your deal.

[270] And now you're economically dependent on us.

[271] pretty standard game that they're playing.

[272] And that Belt and Road initiative actually is a competing interest to spreading democracy around the world.

[273] So I know Vivek Ramaswamy, you know, really hit this out of the park where he said, we went over to China and said, let's spread democracy to China.

[274] So in a sense, we bit off way more than we could chew, if you want to think of it that way.

[275] Let's spread democracy to China.

[276] And China was like, ha, ha, ha, yeah, let's see.

[277] And they flipped the table on us and made it so that if you want to play in, if you want to get in the Chinese market, so first they become the manufacturing base of the world, but then if you want to play in the Chinese market, what do you have to do?

[278] Well, the CCP puts up a firewall, and if you don't play by the Chinese rules, you don't get into China.

[279] So now Nike and all these big corporations and all these other NBA, I named Nike because it just keeps coming to mind, but there's a huge consumer market over there that's buying up stuff like crazy.

[280] That's one of the things I witnessed in China.

[281] Everybody's starting to have money, so they're buying up brand name stuff everywhere that they can all the time to show that they have some money now.

[282] And huge market.

[283] So they want into the market.

[284] The market's gigantic.

[285] It's the manufacturing base.

[286] So it's, you know, relationships are built.

[287] But if you want to play, you play by Chinese rules.

[288] So spreading democracy partly didn't work because we have to play by China's rules.

[289] And that's their belt and road initiative is meant to create a global, global south network.

[290] So we're the global north. That's, you know, South

[291] America, parts of Africa, a lot of like Indonesia, India.

[292] And then China, of course. BRICS just throws Russia into that mix, but otherwise that's who you're talking about.

[293] And China's setting itself up to be the kind of global superpower or hegemon of that entire project.

[294] And we're talking about, you know, the flow of trillions of dollars of goods and oil and energy and whatever every year.

[295] So that's a huge thing to play with.

[296] And it turns out, I don't think... if we take Vivek's line, we got outfoxed in the deal.

[297] So spreading democracy, you know, there are lots of these cultural reasons.

[298] Oh, they're not ready for democracy.

[299] I don't know.

[300] Maybe some places, maybe not some places.

[301] But there are other pressures, too, that we've been asleep to.

[302] We have not been paying attention as a country.

[303] Maybe some of our, like, State Department people have been paying attention to China the way that we should have been.

[304] Like, we should have been in the 80s and 90s like, oh, no, China, right?

[305] But we were like, oh, yeah, China.

[306] Okay, cool.

[307] Yeah, go, like, make all of our cheap stuff for us.

[308] Well, they've done an amazing thing in combining communism with capitalism.

[309] That's right.

[310] If you just have North Korea, you never develop a real superpower.

[311] Thank you, Joe.

[312] I beat this drum and I get called crazy all the time because what I'm trying to tell people is that communism is what's happening to this country.

[313] Okay.

[314] But it doesn't look like communism because it's like, how is Nike communist?

[315] Right.

[316] And I'm picking on Nike, how's Boeing?

[317] Let's pick on Boeing instead.

[318] How is Boeing communist?

[319] You know, all these... how is Disney communist, right?

[320] They're huge mega corporations.

[321] What did Google just lose over its stupid AI?

[322] $90 billion or something.

[323] It's something insane.

[324] Like, I didn't even know they had that much money to lose.

[325] And it's like, holy crap, you know, and stockholder value or whatever, or shareholder value.

[326] I think it was only $9 billion.

[327] Was it?

[328] I thought that was Bud Light.

[329] That was nine.

[330] No, no. Bud Light was 27.

[331] Look at what we're like haggling over like.

[332] Insane amounts of money.

[333] 11, 12 figure, you know.

[334] I know, right?

[335] Yeah.

[336] And so it's like at least 10-figure numbers of money.

[337] Like Elon kind of rolls in that department, but nobody else does.

[338] So anyways, what, where was I going on with this?

[339] Because this is huge.

[340] Oh, the communist.

[341] Why?

[342] How in the world are these huge things communist, right?

[343] So communism didn't work, right?

[344] Soviet Union sucked.

[345] North Korea sucked.

[346] Cuba sucks.

[347] Like, I'm sure it's like geographically beautiful, but we know those places are dysfunctional as hell.

[348] Right.

[349] We can go to the Eastern Bloc.

[350] They're still devastated in a lot of ways.

[351] They're still not all the way together.

[352] Like, communism didn't work.

[353] But if we think of like what Marx did, leading up to, say, 1917, when Lenin kind of took over, as communism 1.0.

[354] That never really even got off the ground.

[355] Then Lenin got it off the ground and you get the Soviet model, which is Soviet just means committee, by the way, if you didn't know that, it's like a ruling council or committee.

[356] So Soviet model takes over with what they called Marxism-Leninism.

[357] And that worked, kind of.

[358] It worked.

[359] It worked; they still had it in China till Mao died.

[360] They had it in Soviet Union until, what, 89, 90, 91, something like that when it fell.

[361] But what happened was when Mao died, like the Soviet Union wasn't doing great.

[362] It was starting to fall apart.

[363] A new model got picked up.

[364] And nobody's, we talk about Mao Zedong sometimes.

[365] And I would love to talk to you all day about Mao.

[366] That's my new research project.

[367] But we don't talk about his successor.

[368] His successor was Deng Xiaoping.

[369] And this is where I actually disagree with Vivek about what I was just saying.

[370] Deng Xiaoping had a saying that was, I don't care if the cat is black or white as long as it catches mice.

[371] And what he was talking about is I don't care if we use markets or we use a Soviet -style central committee to organize our society as long as China's economy comes back.

[372] That's what he really meant.

[373] And so Deng didn't come up with this new model to open the markets on his own.

[374] We didn't go to China necessarily just to spread democracy.

[375] We went to build China.

[376] And who's we?

[377] Well, let's name the names.

[378] Who was in the meeting?

[379] And there's a movie about some of these meetings. Some were in China, and there's not a movie about those, but there's a movie called Mr. Deng Goes to Washington about the ones that took place in Washington, D.C., so you can go watch the movie.

[380] I'm not making this up.

[381] Deng Xiaoping was the leader of China.

[382] He's already networking with Klaus Schwab from the World Economic Forum, in his spacesuit, but he meets with... and the list of people were Henry Kissinger, Zbigniew Brzezinski, allegedly T.H. Chan, David Rockefeller, and the sitting new emperor or whatever, CCP chairman of China, Deng Xiaoping, and they cook up this plan to open Chinese markets.

[383] And the plan was maybe to spread democracy into China, but I suspect it was mostly to get really rich.

[384] We open those markets, huge amount of money, giant multinational corporations are not tied to any geographical place, and they can get rich off their balls.

[385] Now, some of these guys, I think, were also ideologically motivated.

[386] The Rockefellers have funded communist crap all over the world for a very long time.

[387] China was communist.

[388] Deng Xiaoping said, I'm not opening the market for the market.

[389] I'm opening the market for socialism to make socialism productive.

[390] And so they had a, I think there was more of a plan there than we take into account, which means Vivek gives, I need a tinfoil hat.

[391] Vivek assumes that our motivations in building China were necessarily good.

[392] I think the motivations for building China were to create the pincer of a trap, what's called the Thucydides Trap in ancient kind of military strategy, that the only escape from would be to facilitate China's rise and decimate the West in order to avoid a nuclear-tipped World War III.

[393] And I think they knew what they were doing and were going to get rich on it.

[394] You think by spreading democracy, their idea was to reinvigorate China's economy so that China becomes a threat.

[395] Yeah.

[396] Really?

[397] Yeah.

[398] So China gets... so the Thucydides...

[399] That is so 4D chess, the back pages of Reddit conspiracy.

[400] Well, listen, we know that Klaus Schwab is kind of, if there are conspiracies, the James Bond villain, kind of not quite out of central casting.

[401] See the photo we have of them in the bathroom?

[402] Yeah.

[403] With the Darth Vader outfit on the space suit.

[404] I know, I'm the one who told you about his spacesuit.

[405] We put it up on the screen.

[406] Yes, yes.

[407] Do you know who Klaus Schwab's mentor was?

[408] I do, but I forgot.

[409] Henry Kissinger, who was in the same meeting.

[410] This is a Harvard plot. His father was a Nazi. I can't vet that for positive sure, but that's what I have heard. What is the truth of that? Let's find out. Who is Klaus Schwab's father? At least his father did something. Like, wasn't he the guy that was, like, bringing the nuclear technology for the Nazis to South Africa or something like that?

[411] Something crazy like that. I mean, I know the story vaguely. I knew it at one point. Look, you can't help who your father is. That's correct, and, you know, unfortunately you get born and your dad's a Nazi. I'm much less worried about Klaus being a Nazi than I am... like, he has an interview he gave where he's in his office, and behind him up on the bookshelf is a bust of Lenin. How the hell did that get there? Like, other than Jordan Peterson, who puts one of those up? Communists. And Jordan's studying them, and that's why he puts him up, as a reminder. No shade at Jordan, obviously. But, like, Klaus has got some, you know, big ambitions, I think, and his mentor was Kissinger. Well, he's such a strange guy. The way he talks about it, too. It's so right out of a movie. Like, this cannot be real. No one is really... you will own nothing and you will be happy, with that accent, and no one's freaking out. I think that my favorite ones are, he's having the conversation, he's like, yes, in some years we will all have the chips in our brains. Yeah. And so you will be sitting there and I will be sitting here and we will be having a conversation. There is a false attribution, so it's fake, inaccurate. This is from a book I read.

[412] Yeah, this is Reuters.

[413] Founder Klaus Schwab family tree shared online.

[414] So what is the inaccuracies?

[415] He was related to the Rothschild family.

[416] Oh, that was the fake one.

[417] Oh, yeah, because his mother is my super secret.

[418] Okay, so that's fake.

[419] That's not true.

[420] But the thing about his father.

[421] That's why I was getting, I mean, this is explaining what his father was too.

[422] His mother was Jewish.

[423] And his father did what?

[424] Well, if his father was supposed.

[425] Don't open that can.

[426] Well, I mean, that's the whole thing with Soros, though.

[427] You know, Soros was Jewish.

[428] and his uncle took him around as a young boy when they confiscated property from the Jews and he had to pretend that he was a Christian.

[429] Yeah.

[430] Did you ever see the interview with...

[431] So who is his dad?

[432] I mean, I'm trying to get to something that says it.

[433] Okay, we should really clarify that.

[434] You know, miss it.

[435] Okay, so who is this gentleman?

[436] Wilhelm.

[437] What did he do?

[438] Wilhelm, what did you do?

[439] Okay.

[440] Did Soros's or Klaus Schwab's father work for Hitler?

[441] False.

[442] Claim: George Soros worked for Hitler.

[443] Okay, but what did his dad do?

[444] I've got to research it.

[445] Because this book was, this book that I read was about elite power structures, and they go into the World Economic Forum.

[446] I wish I could remember exactly what they were saying.

[447] But it was something to the tune of who his dad worked with.

[448] Well, I'll just be clear, since you have the tinfoil hat right now, Joe, that my source for his mentor being Kissinger is a book that was published by the World Economic Forum called The World Economic Forum: The First 40 Years, which was published in 2011 to brag about how cool they've been.

[449] He also brags that in '78 he started making connections to Deng Xiaoping and trying to bring the stakeholder capitalism model, as he called it, into China, which is what China actually installed.

[450] It's this dirty fusion of neoliberalism, which is basically how do you get huge corporations to basically suck off of the government?

[451] That's the thing the left has been mad about for 50 years.

[452] How do you fuse that to communism?

[453] And China's the answer.

[454] And what I think is all this ESG stuff was constructed around it to make the West have it too.

[455] So environmentalism, social, what is it, environmental social governance?

[456] That's right.

[457] Yeah.

[458] That's ESG.

[459] Yeah.

[460] And that stands for, like, what is the goal of ESG?

[461] Corporate control.

[462] The stated goal.

[463] But the stated goal.

[464] So it is to create... it is to create a metric, a measurement tool, to assess the likely long-term viability of a corporation based on its environmental, social, and internal corporate governance policies.

[465] Long -term viability for the nation?

[466] No, for the corporation.

[467] For the corporation.

[468] Because here's what's going on is ESG was created at the United Nations in 2003 by a guy named James Gifford.

[469] And the point was, he said, well, there's at that point about $6 trillion of money that's sitting out there, it's people's pensions.

[470] It's like passive, right?

[471] Mutual funds.

[472] index funds, all this, mutual funds particularly, 401(k)s, there's state pension funds in particular, six trillion dollars in the world sitting out there that's just people's retirement funds gaining interest, playing in the market through this, you know, money management.

[473] And the question James Gifford asked was how, he was a forest guy.

[474] He was like, how do we apply that to saving the forest, save the trees, right?

[475] And so he came up with this idea that if we had environmental assessments, anytime you have a metric, you can use that metric in some way or another.

[476] or you can game that metric, if we had metrics to say, well, how environmentally compliant are companies, like kind of an extension of corporate social responsibility, they used to call it.

[477] If we can measure that, then what we can do is we can start directing, you know, we can say, well, companies that have a long-term... or that have good environmental policy have a better long-term portfolio. But these are 30-year investments because they're people's pensions, so that's long-term success that we're interested in, not boom and bust cycles in the market.

[478] So the stated ambition is not just to do what I said, but specifically to bring that passively invested money into what they call impact investing.

[479] In other words, to do activism with investor money by investing in, you know, green energy companies or green other environmental companies or socially just companies or, you know, companies with good governance.

[480] And in principle, at least the good governance thing should work.

[481] But the thing is, is corruption exists.

[482] I don't know how they neglected to account for that if we'd give them all the credit in the world.

[483] So like right now, it's super corrupt.

[484] I just did a podcast about this where I had this document.

[485] It's not something some mysterious document.

[486] It's on the Harvard website where they're talking about corporate bonuses, right?

[487] So it's a Harvard corporate law website document.

[488] And they're talking about corporate bonuses and the corporate bonus structure, and that your governance score, your ESG score, the G part, will go up if you give corporate bonuses to yourself for implementing ESG.

[489] That's just naked corruption, right?

[490] And so they can come in and say, well, you want a good ESG score, and they can make that important or whatever.

[491] I guess they have made that very important because everybody's doing it.

[492] And they say, well, if you want a good ESG score, you need to put an activist on your board or, you know, 30% women on your board or DEI requirements on your Boeing board.

[493] Or you have to have a good Corporate Equality Index score, which is

[494] published by the Human Rights Campaign, which means that you're not just having a non-discriminatory workplace for LGBT, but you're also promoting LGBT agendas.

[495] You're lobbying on behalf of bills one way or the other in the legislature.

[496] We'll tell you which ones.

[497] A couple of years ago, they told the airlines they needed to fly activists around to the Pride parades at reduced prices so they'd have more people at them.

[498] Oh, yeah.

[499] Why do you think Dylan Mulvaney's face was on a beer can?

[500] The whole fallout of the Dylan Mulvaney explosion at Bud Light, all of it was about the CEI

[501] score, because then the Human Rights Campaign came out and said, well, you didn't stand up for Dylan, so we're going to lower your score anyway.

[502] And they were like, oh, no, and then everything got all tossed up.

[503] These numbers mean a lot to people.

[504] So the stated goal was to create a set of measurements that they could use to justify taking trillions of dollars of other people's money and doing activist investing with it.

[505] And that all turned into the S is now DEI.

[506] It's woke.

[507] It's woke social justice.

[508] It's not social responsibility.

[509] It's whatever they want.

[510] Elon Musk bought Twitter and his social score for Tesla went through the floor.

[511] Like, what did that have to do?

[512] And then all of a sudden Tesla is a racist company they accused him of.

[513] Like, what are you talking about, right?

[514] They didn't like that he bought Twitter.

[515] Weapons manufacturers like Dick Cheney's Halliburton were, you know, social bad, bad, bad, bad.

[516] And then all of a sudden the conflict and Ukraine breaks out and like, oh, we need, we need missiles.

[517] And they changed the score basically overnight.

[518] That's a, because the social environment of the world changed.

[519] These are real things.

[520] Like, this is all verifiable.

[521] So I think it's an instrument.

[522] Maybe it wasn't meant to be in 2003.

[523] Maybe the guy just wanted to save the trees.

[524] But it's become an instrument of control and effectively a social credit system for corporations to force corporations.

[525] And that's what Larry Fink said about it on TV.

[526] He said, you could pull up the, I'm sure we can find the video and pull it up where he says that we're interested in forcing behaviors.

[527] And that's what we're doing.

[528] I want to get to that, but I still, I don't want to gloss over Klaus Schwab.

[529] No, of course.

[530] I want to remember that.

[531] And I got a George Soros one

[532] I'd love to not gloss over

too, because he had a crazy interview in 2004 nobody knows about, and I think you'll get a kick out of it. So hold what you were just saying about Larry Fink, we'll put that... Yeah, we're piling on Jamie. What I could find is this, uh, Newsweek. And what is it about? Whether he's linked to Nazi Germany. This is... I'll get down to here. There's a post... get this on the screen. That's not him. That's not his dad, so that's a fake photo, right? So that starts there. Okay. There is... he worked at this company, Escher Wyss. This is where, like, there's no proof, but it also says it's not definitive. But there's no actual... I would say it says the father, Eugen, was the managing director of a subsidiary of Zurich-based engineering firm Escher Wyss. The history of Eugen's relationship with Nazism in general is complex, but there's no substantive evidence of ties to high-ranking German leadership, particularly Hitler. No evidence. A fact check published by accredited German journalists at DPA used denazification records to uncover that Eugen Schwab was a member of some national socialist organizations, but that alone does not prove any relationship to German high command or a belief in Nazi ideology.

[534] But wait a minute, but the German national socialist organizations back then essentially were Nazis, right?

[535] Right?

[536] That is what it means.

[537] That's what it means.

[538] Like, that's what Nazi means.

[539] That's what Nazi means, right?

[540] National Socialismus, yeah.

[541] So this is a weird sort of glossing over.

[542] Yeah, it's dodging, right?

[543] Well, it says he doesn't have evidence of ties to high -ranking leadership.

[544] But that he says he was a member of organizations tied to the party.

[545] Well, just not Hitler.

[546] But hold this.

[547] While the Escher Wyss branch in Ravensburg, Germany, which Eugen managed, used prisoners of war and forced

[548] laborers.

[549] It's not clear whether the company was forced to do so by the Nazis or because of a lack of workers.

[550] Wait a minute.

[551] You just admitted 100 % that he's a Nazi because that's what Nazis did.

[552] They use prisoners of war and forced laborers.

[553] So they ran prison camps with probably Jews.

[554] So what does that mean?

[555] That means that's what the Nazis did.

[556] Or are we arguing over semantics?

[557] Well, they are.

[558] I don't think we are.

[559] But that's an incredible argument to say that he managed a plant.

[560] He managed the branch that used prisoners of war and forced laborers, but we're not clear whether he's a Nazi.

[561] This is a weird article.

[562] It's super weird.

[563] Newsweek.

[564] Well, maybe Newsweek was like, you've got to be real careful with what you say here.

[565] I mean, it's a bit of a damning accusation.

[566] Confidant of Hitler. And I think they're just saying there's not proof that he was that close to him.

[567] Okay, but this is weird, right?

[568] They're discrediting it by saying maybe he wasn't that close to Hitler.

[569] There's no proof.

[570] But what they're not discrediting is that he did exactly what was horrific about what the Nazis did in World War II.

[571] Yeah.

[572] And I saw the word plutonium up there.

[573] So the nuclear stuff that we were talking about is connected to.

[574] What a weird article.

[575] Very weird article.

[576] That's how much power is at the top.

[577] Yeah.

[578] Well, you have to write weird articles like that going, well, there's no real proof that him and Hitler were homies.

[579] Yeah, he was just a member of some national socialist organizations.

[580] Unlike reports, Hitler was not in his top nine on MySpace.

[581] That would be like, you'd think that they would write the article like about me because I've said like Make America Great again before, but I've never met Trump.

[582] So like, would they write the article like James has never met Trump?

[583] No, I got an SPLC profile.

[584] That's sort of the other way around.

[585] Did you know I'm an extremist now, by the way, Joe?

[586] I think I am.

[587] Oh.

[588] I think I've been labeled that.

[589] They put me in a category called general hate.

[590] So I sent a letter formally thanking them for the title.

[591] I am general hate.

[592] Like, it's like a war, you know, general.

[593] Yeah.

[594] I totally miss. Anyway, I'm funny.

[595] Well, it's just ridiculous.

[596] You're a brilliant guy and you're pointing out really important stuff.

[597] Do you know what one of the first things they go after me for on there is?

[598] It's like the second thing that they go after me for, that I made a series of tweets.

[599] So you're a comedian.

[600] You get it, right?

[601] I made a series of tweets mocking George Floyd on January 6th.

[602] On January 6th?

[603] Yeah, I pretended that George Floyd is like, you know, leftist Santa Claus.

[604] So I was like, if you fight for justice for George Floyd, the spirit of George Floyd will bring you presents on January 6th, miss. Oh, boy.

[605] It's just stupid jokes.

[606] I totally had forgotten that I had done it.

[607] And then it was on my SPLC profile.

[608] I'm like, oh, my God, these people.

[609] Jokes are on your... well, did you see about that

[610] Flemish guy who was part of the government, who just got sentenced to one year in jail for sharing racist memes in a private chat?

[611] I saw that last night.

[612] Yeah.

[613] Holy crap.

[614] A private chat.

[615] So if you got, like, a, you know, fucking iMessage chat group.

[616] Is that a private chat?

[617] Are they talking about that?

[618] Are they talking about social media platforms?

[619] Either way.

[620] This guy shared racist memes.

[621] Well, that was like when Tucker went to, Tucker Carlson went to Russia, which I'm.

[622] I'm kind of like, I don't know what that's about for sure, but Tucker Carlson went to Russia, and he finds out while he's there that the NSA is reading his encrypted signal chat.

[623] I have a theory about that.

[624] I don't think, if I was the government and there was a bunch of these companies that do something like that, I'd make my own company.

[625] Yeah, right, of course.

[626] Or I'd infiltrate all of them.

[627] Yeah.

[628] Come on, guys, it's not really encrypted, right?

[629] Yeah, totally.

[630] How the fuck do you know?

[631] a lot of people that trust those things.

[632] They'll say wild shit on those things.

[633] Hey, talk to me on WeChat or whatever.

[634] Yeah, it's fucking get the fuck out of here, bitch.

[635] Yeah.

[636] Let's make a WhatsApp group.

[637] Oh, yeah, that's WhatsApp.

[638] WeChat is the one that's the Chinese one.

[639] But Jamie, so what was the story behind that?

[640] The Flemish guy?

[641] Yes.

[642] I'm reading it right now.

[643] I was trying to find out where they found those things.

[644] I want to know what the tweets, what the memes were if they're any good.

[645] It says they were accused of using a chat group to exchange racist, anti-Semitic and other extremist comments, but I'm not finding it.

[646] Right, but is it, but they're saying memes.

[647] The problem with memes is it could just be funny.

[648] It could be like the Jews in the tunnel in New York City.

[649] Yeah, right.

[650] And something crazy, like they're encountering Gollum down there.

[651] And that's a racist meme.

[652] Well, it wasn't like the Pepe frog alone, like a racist meme or something.

[653] It's like a frog.

[654] Exactly.

[655] But the thing is, like, you could take that frog and put a Hitler armband on it, and now all of a sudden the frog is tied to Hitler.

[656] Yeah, exactly.

[657] Which is what they do.

[658] Yeah, that's exactly right.

[659] But it's also, people do use

[660] that frog for crazy shit for funsies.

[661] Yeah, for right.

[662] Because they're talking, they're shit posting.

[663] Yeah, shit posting is totally a thing.

[664] And shit posting is a thing.

[665] And you have to understand, these people don't even mean what they're saying.

[666] They're saying something, some of them might, but a lot of them are just making something that's so outrageously offensive that it's funny.

[667] Right.

[668] And they're doing it anonymously and they're sharing with people just for the lulls.

[669] Yeah, shock comedy.

[670] It's totally a thing.

[671] And then you got some, some dude, you know, that passes physical fitness test wearing a polo and some khakis, like, oh, we've got, we've got an extremist here, guys.

[672] Right.

[673] You know, it's weird that physical fitness and exercise and health is being tied to right-wing extremism now, or extremists.

[674] Yeah, I saw your gym.

[675] You're totally a lunatic.

[676] I must be a lunatic.

[677] Yeah.

[678] But that's, that was, there's many times they've tried to push things like that.

[679] You're like, what is the motivation behind this?

[680] This is just for clicks and

[681] outrage, it could be.

[682] Is there like someone who's actually saying that it would be a good idea if we connected health and fitness to right -wing extremism so that you would be scared to be fit and healthy?

[683] Like that's the full, you want to go full tinfoil hat.

[684] Who do you want to have a war with?

[685] Do you want to have a war with Trump supporters?

[686] Or do you want to have a war with the people who wear pink hats and are mad?

[687] Like our health ministers.

[688] Yeah, exactly.

[689] Which war do you want to go to? The people that are unhealthy, I'll fight them all day long.

[690] They're going to quit.

[691] They don't have any training.

[692] You just walk up a hill.

[693] Yeah, they're going to give up. Yeah, fight them from the top of a large mountain.

[694] That's where you make your base.

[695] No one's making it up there except fit people.

[696] I mean, that's probably the reason why they put civilizations up high, you know, make it really hard to get to them.

[697] Yeah, lots of advantages.

[698] This is stupid.

[699] It's a stupid thing.

[700] Everybody should be healthy, you fucking idiot.

[701] Like, what are you saying?

[702] Well, I'm going to take the tinfoil hat back.

[703] I think there's a strategy.

[704] I call it the politics of compliance.

[705] And I think that we've gone through it with everything.

[706] In fact, I think it's all they do.

[707] It's the same thing over and over again, right?

[708] Whether it's COVID, whether it's MAGAs as deplorables, which kind of backfired big time, right?

[709] And then whether it's all the identity politics, whether it's the environmental stuff, even with this, though, what they do, and this is the politics of compliance.

[710] I just did this for Robert Malone, who

[711] had me come speak at his International Crisis Summit.

[712] And I'm sitting there and I'm going to talk.

[713] It's like nine in the morning.

[714] I'm not awake yet.

[715] I'm not a morning person.

[716] I'm like, what the hell am I going to talk about?

[717] And so an idea comes into my head: the politics of compliance.

[718] So what it is is that you start off by saying, look, we're going to have this glorious better world, but there are people who are keeping us from getting there.

[719] Right.

[720] So there are the people who want to move forward into the glorious better world.

[721] But then there's the enemies of the people who are dragging their feet, the deplorables, the climate deniers.

[722] It doesn't matter if the climate change thing is true or not because there's a label now, right?

[723] The Christian nationalists, there's a label now, the racist, the transphobes.

[724] So we could have unity, but we can't have unity.

[725] You're making the sacrifice.

[726] You got the shot in your arm.

[727] You did what you were supposed to, but we can't open up a society yet because these other people are dragging their feet or resisting.

[728] So you have to have ways, and that's why the fitness thing, right?

[729] You have to have ways to identify who the people are that aren't going along with the program.

[730] So it's like you.

[731] You got blown up for this.

You're like, well, I got COVID, I feel like shit, I feel really, really bad. Did you know, by the way, last time I was here I went home with COVID, even though we did the test? You got COVID? Yeah, I went home, I had COVID. Yeah, I went home. You think... I think from when I went out to dinner after this, because I felt fine until I got home. So... Oh, right, from dinner. Like a couple hours later. There was somebody at dinner that had, like, a cold or something, so I think they had... but the COVID is, like... so I barely got sick, so I didn't know, right, that it was, like... the person I would have gone out to dinner with, I wouldn't have thought I was sick. I didn't even... I never really got really sick.

[733] That's part of the problem, too.

[734] But the thing is that you became an emblem of the thing that you're not allowed to talk about, right?

[735] The apple pectin, the horse paste, right?

[736] And so there's all those articles.

[737] That's why you got like all that drama.

[738] You got like kicked off stuff or whatever all happened to you.

[739] I don't remember exactly what happened.

[740] But you were the pariah, man. Why?

[741] Well, one of the things I remember you talking about was health and fitness.

[742] Like, I'm healthy.

[743] I'm fit.

[744] I got this set of drugs.

[745] I took it.

[746] Seems to work.

[747] I felt way better real quick.

[748] And they can't have that if they're trying to create this dynamic that all the people who are staying home and wearing a mask and, you know, cowering in fear primarily or later getting the shot.

[749] And more importantly, complying to pharmaceutical.

[750] That's why I call it the politics of compliance.

[751] Yeah.

[752] So all of a sudden you became an emblem of, well, you know, maybe you should go outside and exercise sometimes.

[753] Yeah.

[754] And that's a huge problem for that group of compliant people.

[755] And if you can whip them up or create conditions with misuses of power like many of our state governments and national or federal government did, Canada really did, and say, well, we have to keep everything closed down for your safety.

[756] And we could open it back up except disinformation right-wing extremists like Joe Rogan are out there pushing the wrong ideas.

[757] Well, in that case, you can get people to hate the person who's not going along with the program.

[758] That's why I call it the politics of compliance.

[759] This is what I could find about The Flemish story.

[760] So Belgium's far right prodigy gets prison term for inciting violence.

[761] So this goes back to 2018, so I'll sort of walk you through what I found.

[762] This is the sentencing that happened.

[763] I wish they showed the news.

[764] He got a year.

[765] Five other people in his group got suspended prison sentences.

[766] Their charges included hatred, racism, Holocaust denial, and breaching local gun law.

[767] It's the only notice of that I saw.

[768] What is local gun law?

[769] Is that like depicting guns in a favorable way?

[770] Like, what is their law?

[771] It could be that from this.

[772] Because if it's a meme.

[773] That's why, in that wild, like, gun law of memes?

[774] I love how good you're getting at picking apart their BS, though.

[775] It said here it showed them posting pictures of themselves holding weapons.

[776] Saying they're totally ready.

[777] Oh, okay.

[778] So posting photos, is it illegal to hold the guns?

[779] Like, I want to know if they're illegally possessing

[780] guns?

[781] Is that what they're saying with gun law?

[782] Or is it just photographs of the guns?

[783] This is the report from the 2018 documentary that got made about them.

[784] Some guy infiltrated them and got into their Discord, which I think is where they were sharing some of the stuff.

[785] So these are far -right, allegedly far -right people in Belgium.

[786] And I'll show you, not showing the audience this, but this shows some of the memes, I guess.

[787] Okay.

[788] Showing himself as.

[789] Yeah, which ones are illegal.

[790] That one's apparently...

[791] That Muhammad is a Lego puppet, a Lego toy.

[792] And this is this, what do you, Islamic harassment, reward, the heroic man with sex?

[793] Obviously, Nazi meme.

[794] When you go full gas.

[795] They're obviously Nazi memes.

[796] Mm -hmm.

[797] So these are problematic.

[798] You can be racist if there's, you can't be racist if there's only one race.

[799] Okay, that's like an anti -Hitler meme.

[800] Something like that with the gun, so it might have been

[801] the problem?

[802] Just holding the gun.

[803] So here's the thing.

[804] Is that gun illegal there?

[805] I don't know which one.

[806] That didn't explain.

[807] Let's Google, just pause for a second and open up a new tab and Google Belgian gun laws.

[808] I want to know if they have laws similar to, there's some countries that have a high population of people having guns.

[809] Firearms are generally not allowed in Belgium, but goods such as switchblades and pepper spray are also considered prohibited weapons.

[810] Okay, so you can't carry a gun?

[811] Click on that.

[812] Can citizens carry a gun in Belgium?

[813] See if there's any...

[814] See where it says it right there?

[815] Just click on that.

[816] Whether an arm is legal or illegal mainly depends on who owns it, sells it, or uses it.

[817] While most people are prohibited from owning or using automatic firearms, they are not illegal per se.

[818] The armed forces and the police may use them.

[819] Traders may procure these arms for them and arms collectors can own them.

[820] So if these guys were arms collectors, they could own them, but that gun was an automatic weapon, I think.

[821] I'm not really a gun expert, and I only got a quick glance at it, but it looked like an automatic weapon, at the very least, a semi-automatic weapon.

[822] Could have been that.

[823] It had a large magazine.

[824] Go to the photo again of the gun.

[825] Let me take a look at it.

[826] So either way, it seems like unless you're a security person, yeah, that's an automatic weapon.

[827] Unless you're a security person, it might not be.

[828] It could be an AR.

[829] Could be a semi-automatic weapon.

[830] I don't know.

[831] But at the end of the day, it seems like unless you're military or police, you're not supposed to own that.

[832] Right.

[833] So that could be the gun law.

[834] Like they had a photo of it.

[835] That makes me feel a little bit better than if it's just like a gun meme.

[836] Because what if it was just a meme, that'd be insane.

[837] Well, those memes, those are insane that they're using the Hitler one as an example.

[838] Because it's really like showing that like Hitler was crazy.

[839] Like, you can't be racist.

[840] This is my thing.

[841] We'll have only one race, so no one will be racist.

[842] Like, that's fucking ridiculous.

[843] I mean, that's one of the things that the SPLC accuses me of, though, is that I promoted the white genocide theory, which is not true.

[844] But what are you going to do about these things?

[845] Well, there's plenty of people that have said crazy things about white people lately that you're allowed to say that just drives me nuts.

[846] I just said that if the logic of CRT was played out to its conclusion that it would end in a genocide of whites, which is a completely different thing.

[847] That word if means a lot.

[848] If it was taken to its fullest example. I think there's also a problem with, you know, when you tell people that a group of people is responsible for things, a group that's just completely composed of individuals with completely different lives, and everyone's got different experiences.

[849] And we say that that group of people is either bad or that group of people was responsible for everything.

[850] Like as soon as you do that, you allow othering.

[851] That's right.

[852] Othering is the number one problem we have tribally, culturally, that we can look at other human beings as if they're not us.

[853] And this is what's going on in Gaza and Israel right now.

[854] That's what's going on with this guy.

[855] I'm not going to defend whatever.

[856] But this dude, like the counter reaction eventually to relentless identity politics is for the other side to start saying, okay, identity politics.

[857] Right or right.

[858] What it does is it creates more of itself.

[859] It's like it's contagious.

[860] It makes people more racist.

[861] Yeah, it does.

[862] It does.

[863] And when they feel like there's

[864] racism allowed against white people, that there's this double standard, then they get racist.

[865] There's a lot of people that do that, man, and it fuels it.

[866] It's fucking horrible.

[867] We should, we should abhor racism with everything we have. We should just treat everyone as individuals.

[868] That's right.

[869] Any arbitrary power, especially when it's applied corporately to groups, we should oppose it.

[870] In Tennessee, it's in our state constitution, the second article or whatever, Section 1, Article 1, or I got that backwards.

[871] No, Article 1, Section 2 is that the non-resistance, I think I can almost do it from memory.

[872] The non-resistance against arbitrary power is to be considered slavish, absurd, and against the good and happiness of mankind or something like that.

[873] So we should resist racism's arbitrary power.

[874] You don't know that guy.

[875] It's in color.

[876] It's arbitrary to dislike him or to exclude him or whatever.

[877] Whether he's white, whether he's black, whether he's, you know, Hispanic.

[878] It doesn't matter.

[879] Same thing with, like, sexism.

[880] You don't know what that woman's capable of, for sure.

[881] Let her try.

[882] It doesn't mean you change the standards, right?

[883] And so this is the pattern that has been exploited, and this is where the double standards came from.

[884] It's, you should give us access.

[885] And our sensibilities are like, hell yeah, we should give people access.

[886] Let them try.

[887] Let them in.

[888] You know, don't, don't exclude people.

[889] Racism sucks.

[890] Homophobia sucks.

[891] The sexism sucks.

[892] Misogyny is awful.

[893] Let them in.

[894] But then what happens is they say, well, you're not accommodating us.

[895] You're not accommodating.

[896] So it's like firefighters.

[897] Like, well, we got to lower the standard or military.

[898] We got to lower the standard so more women can pass the test.

[899] Well, now we've got a problem.

[900] Right.

[901] And so after you make the accommodation, then you've changed the political structure.

[902] And if you, it's one thing if that's just about people.

[903] But it's like almost all this stuff seems to forget that manipulators and sociopaths and psychopaths exist.

[904] Exactly.

[905] Because what they're going to do is they're going to come in and they're going to say, you have to change it for me. And you change it for them.

[906] And then they're going to say, you have to change it for me again.

[907] And then you change it for them again.

[908] And then, you know, an inch or two at a time, you're a mile down the road.

[909] And you're like, how did I get here?

[910] Right.

[911] And but that's the thing.

[912] It's like I'm all in this restorative justice thing in the schools.

[913] Oh, let's sit around and have a talk circle and talk it out.

[914] Two kids get in a fight or whatever.

[915] Somebody's doing some antisocial, something.

[916] Let's talk it out.

[917] They're the group and let's heal.

[918] All right.

[919] So it sounds a little hippie to me, but, you know, fine.

[920] Let's look at it.

[921] Some percentage, I would guess it's probably three or four, not very big of the population.

Just to throw a guess out there, because that's roughly where you start: what's the total number of psychopaths, borderline personalities, and so on.

[923] It's about three or four percent of the population.

[924] They're going to be like, oh, I can get away with this.

[925] Oh, we have to, I'm not going to get in trouble if I bust some other kid's head at school.

[926] I just have to sit in a talk circle and say, oh, I'm sorry, okay, whatever.

[927] And then it's all over and we've healed.

[928] Like, they're going to game, there are people who will game a system.

[929] And it's like this kind of like empathy-driven, airy-fairy,

[930] "if we just gave everybody money, there would be no crime" nonsense is driving us off of a cliff.

[931] And it's causing these fights in our schools through terrible policies, like restorative justice policies.

[932] A lot of it's in criminals.

[933] I remember all these articles back, you know, a year or two ago.

[934] I don't know if they're still publishing it.

[935] It's like if we just paid people not to commit crime, they wouldn't commit crime.

[936] Yeah, I've seen that.

[937] Like, what the hell are you talking about?

[938] It's like, have you ever done something edgy?

[939] It's fun.

[940] I'm not a criminal.

[941] It's a part of their identity.

[942] That, for real.

[943] Like, if you're in a gang, that's how you, like, validate yourself, too.

[944] Wasn't there something they were, they were trying to do this recently?

[945] They were trying to give people money to not commit violent crimes.

[946] There was, like, an actual policy that was being proposed.

[947] Yeah.

[948] Like, like, Maryland or something.

[949] Somewhere nutty.

[950] Yeah.

[951] Like, where you're like, what the fuck did you just say?

[952] And the root of it, it's like if you don't have, everyone wants a meritocracy.

[953] We all agree to that.

[954] We want a meritocracy.

[955] We want the best people to, and we want competition.

Which is, a little, allows people to get better, and it allows us to have the best products and the best thing and the best music and the best art. But what is this? I don't know, I typed in give people money to not commit crimes and that's a pop-down. This is it, the Dreamkeeper Foundation Fellowship will pay participants. I think this is it. A new program out of San Francisco aims to decrease gun violence by paying high-risk individuals at least $300 a month to stay out of trouble. That just means don't get caught. That just means don't get caught.

[957] And also, how do you make your money?

[958] $300 is not going to cover.

[959] In San Francisco especially, yeah.

[960] If you're selling meth, you're making a lot more than $300.

[961] That's right.

[962] Like, what are they going to do?

[963] That's ridiculous.

[964] But my point was where I was getting to is like, everybody wants a meritocracy.

[965] But if you can keep it so there's no equality of opportunity.

[966] If you can keep it that way, you're never going to really get a meritocracy, and that would be a better way to control it.

[967] Yeah, sure.

[968] I mean, this is full tinfoil hat.

[969] Like, the reason why all these social justice people are, like, so excited about pumping billions of dollars into Ukraine and billions of dollars into whatever's happening with Israel and Palestine.

[970] But zero talk about doing that to Baltimore.

[971] Zero talk about doing that to Detroit.

[972] Like, Maui.

[973] That's the only, well, Maui is a, you know, that's a different thing.

[974] But that is also another thing.

[975] But now, I feel you because the crime.

[976] Yeah.

[977] The crime.

[978] Like, like, hey, there's got

[979] to be a way to fix this.

[980] There's got to be some sort of a solution.

[981] But if you don't, if no, if we never get, not in 10 or 20 years, never get to equality of opportunity, you're always going to have a certain amount of disenfranchised people.

[982] You have a portion of your population for sure that's going to be in trouble.

[983] They're going to have problems.

[984] Then you always got solutions.

[985] You've always got, you've got opportunities.

[986] You've got like this little moving game.

[987] Yeah.

[988] If everything is even and then it's just complex.

[989] And then America strives to be the greatest utopian idea of what we'd hoped it would be.

[990] Well, then it's really difficult to control people because they recognize that freedom is one of the most important aspects of having this kind of amazing opportunity to do whatever the fuck you want.

[991] Brother, this is why I wear a tinfoil hat now all the time, basically.

[992] Except I'm not really afraid of the radio waves, so I don't really wear a tinfoil hat at all.

[993] I'm more afraid that this is intentional in a lot of places, in a lot of ways.

[994] Well, we know in those cities, we know.

[995] But if they played it out this much, have they thought about it and said, you know what, we want more crime, we want more illegality, and if we keep the chaos, then we keep passing laws, we get further down the road.

[996] Because if you read the Marxist literature, which is unfortunately my damn job, you can derive a number of different conclusions.

[997] One of these conclusions that you can derive absolutely is, do you know what repels a revolution in a country better than anything?

[998] Stability, social stability.

[999] So if you can destabilize a population, then you can get them to crave a revolution, or you can, like with the Patriot Act, you can get them to assent to sacrificing their liberties for security.

[1000] So if you can destabilize an area, then you can cause them to want to have radical political change.

[1001] We just saw this.

[1002] This woman was in the news on Fox News this morning, so I had to double check, but she's talking about, was she from Maryland?

[1003] She's in the government.

[1004] She says that she wants to burn the country to the ground so her ideology.

[1005] can rise out of it, out of the ashes, right?

[1006] So she said this publicly, and her, she's the equity coordinator in one of these cities.

[1007] We can, if you can find it, Jamie, you can pull it up, I don't know.

[1008] It's on Fox News, equity coordinator, burn city, rise from the ashes, you'll probably find it.

[1009] So this, we know who's causing these crime problems.

[1010] It's those DAs.

[1011] We know who ran the DAs, who paid the money to run them is the Open Society Foundation.

[1012] They call them Soros DAs.

[1013] We finally broke the spell saying that this very, very rich man and now his very, very rich son are using their very large amounts of money to do things that aren't necessarily great politically.

[1014] Well, they used to be able to say if you criticize George Soros, you're an anti -Semite, and that was what they always went with.

[1015] We got to go.

[1016] Like, you got to see this thing.

[1017] I don't want to overload Jamie again, but in 2004, Soros gave an interview to the LA Times, and you can actually look this up.

[1018] I thought, that can't be real.

[1019] It's real.

[1020] He said that he thinks he's a God.

[1021] He said that he always suspected that he might be a God, and he kind of controlled it for a long time.

[1022] And then finally, finally, he just kind of realized he is, and he kind of gave into it.

[1023] Is it possible that he was doing exactly what you were doing when you were making jokes about George Floyd and January 6th?

[1024] I mean, maybe I'm trying to think of a time where I would have talked to the Los Angeles Times in a deliberate interview and said, I'm, you know, I'm Zool.

[1025] Maybe he was a little drunk, and he was just like, why am I doing this bullshit?

[1026] interview.

[1027] I'm worth $30 billion.

[1028] I mean, he reads pretty intentionally, but it's possible.

[1029] Maybe George Soros was doing some shitposting to the L .A. Times.

[1030] I mean, I would shitpost.

[1031] I would love to shitpost these big journalistic outlets now, but 20 years ago, I don't know.

[1032] Well, now they're all falling apart.

[1033] If you want to shitpost, you better do it quick.

[1034] I mean, you know what I did in the past.

[1035] I did my shit posting.

[1036] You did a lot of shit posting.

[1037] I did some epic shit posting.

[1038] What was the thing that we were just looking up before that, though?

[1039] Where we looked?

[1040] The Fox thing?

[1041] Yes.

[1042] Yeah, because like, you should see this.

[1043] This is for real.

[1044] I was like, I saw it this morning.

[1045] Yes, the lady said burn it all down and have her idea of what society should do rise in the ashes.

[1046] It's unfortunately, like I took a screenshot and I can pull it up on my phone, but nobody can see that.

[1047] Well, there's a lot of people that don't have anything that haven't accomplished anything where that sounds like a good idea.

[1048] That's right.

[1049] So if you can make these people.

[1050] It's just the headline.

[1051] I'm trying to find the video.

[1052] My idea.

[1053] So my ideology can rise from the ashes.

[1054] The equity officer.

[1055] Well, I mean, it's like, okay, lady.

[1056] Equity.

[1057] But if you're, but if you've created a whole industry based on equity.

[1058] But also says, I don't want to work.

[1059] I don't want to work.

[1060] She also says, I don't want to work.

[1061] She doesn't want to work.

[1062] So we've got to ask the question, 100% for real, though, right?

[1063] Equity official wants her ideology to rise from the ashes.

[1064] What ideology is it that she wants to, and it's the equity one.

[1065] Well, what is that?

[1066] Right.

[1067] Do you know the definition of equity?

[1068] What is it?

[1069] It is an administered system in which, what's the word I'm looking for?

[1070] It's an administered system in which shares are adjusted so that citizens or participants are made equal.

[1071] So it's not equality of opportunity.

[1072] It's shares are adjusted so people are made equal.

[1073] So that's when you and I go to the range and I shoot the bow and I suck.

[1074] And you shoot the bow and you put it through the hole.

[1075] Then Jamie goes over and pulls my arrow out, moves it over three inches and sticks it in and like James tied you.

[1076] I can't beat you.

[1077] I have to tie you.

[1078] Right.

[1079] Or we pull your arrow out of the bullseye and we move it over and like put it like right below mine in like the third or fourth ring out.

[1080] Actually, I miss the target.

[1081] Let's not lie.

[1082] Let's not brag about my Twitter account.

[1083] It's worded slightly differently.

[1084] I can't wait for society to collapse so that

[1085] my ideology can rise from the ashes.

[1086] What's different about that?

[1087] Burn.

[1088] I mean, the word burn is different.

[1089] Rise from the ashes?

[1090] It says the same thing.

[1091] That's what we read.

[1092] No, there it is.

[1093] It's in this other quote in 2020.

[1094] So it's a different, so Fox has stitched some things together here.

[1095] Okay.

[1096] So a different quote is: already planning, been planning for how we will eat and live and grow after we burn it all down.

[1097] Well, I think the idea is that like there's enough money.

[1098] If you take away the money from the billionaires and distribute it evenly, no one has to work. Yeah, that's right. It's like a 12-year-old's idea of what to do with money. It's exactly what it is. And it's also, it's like, what, do you like phones? Okay, who do you think makes those phones? Who do you think designs those phones? Who do you think works really fucking hard to make sure that the new Samsung Galaxy S24 Ultra is better than the iPhone? Who the fuck? Do you need competition? Yeah. You, the person who does that, is somebody who's willing to lay it on the line for a huge reward if it works out.

[1099] They want a yacht.

[1100] They want a yacht.

[1101] That guy wants a yacht.

[1102] He's like, I can see myself right now in the fucking British Virgin Islands, party with my friends on this yacht.

[1103] That's what he wants.

[1104] So that's why he's willing to work so hard.

[1105] If you fucking get free money, you're not going to work that hard.

[1106] And you're not going to get the Samsung S24 Ultra.

[1107] You're not going to get that.

[1108] As in the thing is not going to exist.

[1109] Nobody has the drive to make it.

[1110] No one's going to make it.

[1111] You're going to be forced to have a...

[1112] Imagine that person who said that.

[1113] Imagine if that person had to design electric cars, had to put together a manufacturing plant, had to figure out, imagine, imagine.

[1114] Some fucking person who says, I don't want to work.

[1115] I want to burn it all down.

[1116] I don't want to work.

[1117] So my ideology can rise.

[1118] And you take them seriously, and this is a person that's in charge of...

[1119] This is a thought person, a person who's in charge of ideas, and a person who's

[1120] in charge of implementing some sort of a better system for society, for real?

[1121] Yeah, no, an ideology is a word people don't understand.

[1122] I have to read in this.

[1123] It's a cult.

[1124] That's the word I was going for.

[1125] Thank you.

[1126] It is.

[1127] It's a cult.

[1128] It's a cult.

[1129] Okay?

[1130] So what it is, ideology is a fancy word for a mythology that the society buys into.

[1131] It has a direction and it has activity.

[1132] It's a cult.

[1133] What we're looking at is the dynamics of a cult.

[1134] Everything will work out.

[1135] If everybody believes it, we know how it'll work.

[1136] Nobody else knows how it'll work.

[1137] You put us in charge.

[1138] Obviously, when it doesn't work, somebody else is at fault.

[1139] That's actually, do you know that that's actually, you can talk to people who still believe that the Soviet Union could have worked out?

[1140] And they say the reason the communist countries failed, because obviously, you know, there are catastrophes, hundreds of millions dead, nothing works.

[1141] They collapse.

[1142] They say that it was because there were capitalist countries pressuring them from the outside that prevented them from working.

[1143] So you can say, I've had this conversation.

[1144] So if it can only work if every country is communist, and they're like, that's right.

[1145] That's a global cult is what that is.

[1146] That's not real.

[1147] That's fantasy land.

[1148] And that's, I think, because equity means socialism, as I just told you, redistribute shares to make participants equal.

[1149] I think that that actually kind of shows that this is cult mentality that we're dealing with.

[1150] It is cult mentality, but it's ingenious.

[1151] This is what's ingenious.

[1152] This is the genius aspect of it, is that they've managed to cast such a wide net

[1153] over what it means to be progressive, that they've included all these radical Marxist ideas that everybody dismissed forever.

[1154] And they threw them all in with this gender stuff and LBGTQ stuff, and then they threw that all in with race.

[1155] And then they threw that all in with immigration and then somehow attached it to funding international conflicts.

[1156] That's right.

[1157] At the expense of the people, the poorest people, who could benefit the most from that money.

[1158] That's right.

[1159] And using other people's pension money to fund a lot of it or to get it off the ground.

[1160] And if you oppose it, you're fascist.

[1161] It's kind of brilliant.

[1162] It's sort of brilliant.

[1163] It's kind of brilliant.

[1164] Yeah.

[1165] It's either intentional or we're just so vulnerable to ideologies, which seems to also be the case.

[1166] It's why there's so many different sects of religions, even sects of Christianity, the fucking Protestants hated the Catholics forever.

[1167] What happened in Iraq with the Sunnis and the Shias.

[1168] This is always the case.

[1169] It's always the case.

[1170] People have like really rigid ideologies and the punishment for abandoning them or the punishment for stepping outside is death.

[1171] Yeah, that's right.

[1172] You're fucking dead, you bitch.

[1173] You're not one of us.

[1174] You're not one.

[1175] And that is what we're seeing in this country.

[1176] That's right.

[1177] We're seeing this weird leftist, progressive ideology with a super wide net that covers so many things, including all these industries that are set up to make it look like they want a better world where really they just want to dominate a sector of the market, whether it's green energy or agriculture or food or plant-based meat or any of this fucking psycho shit that they're trying to push all the time.

[1178] They're doing it for profit, and they're doing it under this super wide net of being a good person, being a progressive.

[1179] That's why I call it neoliberal communism, and I say that that Deng Xiaoping character we were talking about earlier is like he's the guy that nobody knows about, except I mean the Chinese do, obviously, but we really need to pay attention to what he cooked up and how what they have, whether it's World Economic Forum or UN or the WHO or their god-awful treaty to, you know, the health sovereignty thing.

[1180] Have you seen?

[1181] You know what I'm talking about, right?

[1182] What is that?

[1183] Well, let me finish the thought.

[1184] I'll come back to it.

[1185] They're copying that same model.

[1186] It's neoliberalism, which is how do you get huge corporations to be able to basically get tons of money and have monopoly power and make it off of the government?

[1187] That's why the Rockefeller guy would have been interested in all this.

[1188] And how do you do it with a communist ideology at the same time?

[1189] China is the model.

[1190] We're seeing it build out in the West.

[1191] This stuff like we now are seeing proof.

[1192] It just came out the other day that the Chinese are like funding the trans stuff.

[1193] They're like pushing it, right?

[1194] I just wrote a book.

[1195] I didn't even know that to put it in the book.

[1196] I wrote a book about the trans stuff.

[1197] It just came out on the 29th called The Queering of the American Child to talk about how schools have been turned into like indoctrination centers.

[1198] It all goes back to the not just Marxist, but Maoist strategy to make the world conform, that politics of compliance, to make the world conform to this new ideological vision that they have.

[1199] And it's got to be like we were saying, it's got to be religious to the people who believe it.

[1200] It is religious.

[1201] It's new values.

[1202] They even say that.

[1203] Klaus Schwab said, you can't rationalize, or you can't, how did he say it?

[1204] You can't rationalize values through the intellectual process alone.

[1205] It requires faith.

[1206] We've all seen children that grow up in religious cults.

[1207] We've all seen the horror stories of children that come from these radical religious cults and they escape when they get older and they tell the story of the indoctrination and what all they believed.

[1208] When I see a woman and she's got three trans kids, that is what I think of. I think of someone who is a full adherent and ideologically captured by this cult, to the point where they see value in having their child be a part of the LBGTQ movement because it looks good for them socially. It's like they have a flag on their fucking porch and they wave their kids around, and it's weird. And it's not everybody. No, it's not everybody.

[1209] But it's not everybody that has trans kids or a kid who thinks he's trans.

[1210] He's probably gay, and probably fine if you leave them alone and don't encourage them to castrate themselves.

[1211] Or a young woman with trauma or, you know, autism.

[1212] Autism going through her period, which is like what's going on in my body.

[1213] I don't like it.

[1214] If you have girls, that's, that's a traumatic experience for them.

[1215] It's very difficult.

[1216] It's confusing.

[1217] It's weird.

[1218] That's why we wrote this book, man. The whole book is like queer theory is the doctrine of a religious cult.

[1219] It's based on sex.

[1220] It primarily targets kids, and it's got barely anything to do with gay people, almost nothing.

[1221] And so.

[1222] Also, this idea that the only way to fix what's bothering you is surgery is crazy or hormones.

[1223] Oh, my God.

[1224] It's insane.

[1225] Well, in Europe, they've already, in the UK, rather, they've banned these hormone blockers now for children.

[1226] Yeah, the UK, the NHS just backed off of that completely.

[1227] It's like, okay, your move, United States, because this is serious.

[1228] It is serious.

[1229] And what's scary to me is that they have a socialized medicine system and they can back off of things, I think, a little bit easier than we can in America, when we've opened up how many gender-affirming care clinics.

[1230] There's a path, and I mean, I don't know how many legislators pay attention to the show, but they should take it seriously.

[1231] Missouri has kind of tread the way.

[1232] A lot of these states, there's 26 states that have tried to ban transgender care so far.

[1233] And they're getting sued.

[1234] Of course they're getting sued, right?

[1235] Of course they're getting sued.

[1236] Some plaintiff comes in.

[1237] The ACLU shows up with an army of lawyers and they're like, no, it's civil rights.

[1238] It's, you know, medically necessary, blah, blah, blah.

[1239] And then it's a battle in the court and it depends on who the judge is.

[1240] There's another way.

[1241] Missouri actually, more or less, stopped this stuff with one simple change to the law.

[1242] They changed the statute of limitations for medical harm.

[1243] So my thought is, if you're under 20 and you undergo some of this medical treatment, you have a 20-year statute of limitations.

[1244] Anybody who gets the surgery done under, you know, surgeries, hormones, whatever, under 20 years old, they have till their 40th birthday if they decide they regret it to file a malpractice suit, not to win the malpractice suit, but to file one.

[1245] The statute of limitations in Missouri as previously set was either two or three years, and they extended it, and I don't know exactly how long.

[1246] And it basically shut it down.

[1247] We could shut a lot of this down by, because America, like you said, works differently.

[1248] We could, we don't have socialized medicine.

[1249] We could shut this down through litigation.

[1250] And that litigation, all you have to do is open.

[1251] Open the, what are they called, rights to action. So, you know, let's say I'm, I'm in my 40s, so it's not like that. But if I'm a 19-year-old or 17-year-old or 15-year-old like Chloe Cole was, and I go and I get surgery, say I get my breasts removed or my, you know, genitals cut up or whatever, and then come my 27th, 28th birthday, I'm like, woof, I got talked into that, I shouldn't have done it, I feel like I was misled by my doctors, I want to file a lawsuit. Right now, usually you cannot.

[1252] I mean, some of these detransitioned teenagers are suing, you know, Chloe was and some of the, you know Chloe, right, Chloe Cole.

[1253] I've seen the story.

[1254] Okay, yeah.

[1255] So some of these detransitioners.

[1256] It's horrible how they get treated, too.

[1257] It's like people attack them.

[1258] They get attacked like traitors.

[1259] Like traitors, like, well, because they left a cult.

[1260] Exactly.

[1261] And listen to how they talk.

[1262] They talk like I was in a cult.

[1263] Like, I was completely convinced.

[1264] I had other problems.

[1265] This would solve all my problems.

[1266] I was affirmed at every step.

[1267] That's love bombing.

[1268] So in this, so there's this queer educator, which is a fucking weird thing to even say, right?

[1269] His name's Kevin Kumashiro.

[1270] And Kevin Kumashiro wrote this paper back in 2002 titled Against Repetition.

[1271] And in this paper, he actually says that the point of social justice and specifically queer education is to lead children into personal crisis and then structure their environment so they resolve the crisis toward social justice.

[1272] That's trauma bonding. That's cult recruitment. That should be a prison sentence. And they know they're doing it. They're leading them, I'm, they're leading them into personal crisis. It's hard to even say the sentence, and you get mad, uh, because these children. But it's so crazy that we've always protected children from influence. We've always protected children from bad decisions. That's why you can't get a tattoo, no, until you're 18 years old. They have to go after the kids. They have to, because they're soft targets.

[1273] You're 100% right, but they're after childhood innocence.

[1274] They say that too.

[1275] There are papers against childhood innocence saying that it's a social construct meant to protect some kids and not others and meant to preserve normalcy and white heteronormativity and all of this other crap.

[1276] That's why the term minor attracted persons freaks me the fuck out.

[1277] When you're trying to normalize pedophilia, you're trying to normalize people who want to fuck kids.

[1278] That's crazy.

[1279] And by the way, it's almost always men.

[1280] Like, you're, you are empowering the creepiest of creeps.

[1281] The creepy creeps.

[1282] The monsters of the male species.

[1283] The ones, if you want to talk about men, like men being toxic, the most toxic.

[1284] Yeah.

[1285] You're empowering.

[1286] Like the zero point zero one percent.

[1287] The evil ones.

[1288] The ones infected by demons.

[1289] Yeah.

[1290] They want them to go fuck kids.

[1291] You're empowering them by telling them that it's an identity.

[1292] didn't he?

[1293] Yeah, and I'll tell you, queer theory has, it is, actually, it doesn't just not have limiting principles.

[1294] It's opposed to limiting principles on principle.

[1295] Let me give you another definition.

[1296] This is a book called Saint Foucault, meaning the Michel Foucault, the postmodern guy we made fun of back in the grievance studies papers.

[1297] And so Michel Foucault is lionized in this book.

[1298] And this is the book where queer in queer theory gets defined.

[1299] David Halperin wrote it, '95 is the date.

[1300] And it's right there in the paragraph that he writes, defining queer.

[1301] It starts with these three words: unlike gay identity. It has virtually nothing to do with gay people. Why? He says because that's rooted in a positive truth. You're gay, right? That's a truth, you're gay. He says queer need not be grounded, he says, in any positive truth or any stable reality. It is whatever is opposed to the normal, the legitimate, and the dominant. What? So normal and legitimate, I mean, they're always after the dominant, so we can just... But isn't that local? Because, like, if you're in West Hollywood, the dominant is gay.

[1302] Well, they don't want normal, though they don't like, they don't like that either.

[1303] It's not about gay.

[1304] So you'd have to be opposed to the gay, you'd be straight.

[1305] They literally, it's, I hate using huge words on shows.

[1306] It's homonormativity is what they call that.

[1307] If you consider being gay as a normal part of society, that's homonormativity.

[1308] It's as bad as heteronormativity.

[1309] It's no better.

[1310] Because, again, it's real simple.

[1311] We said it earlier.

[1312] Stability repels revolutions.

[1313] They need radical people.

[1314] They need queer activists who want to destabilize the normal and undermine the legitimate at every turn.

[1315] Jesus Christ.

[1316] Like, I don't have to have a tinfoil hat for this.

[1317] We can just crumple that up and throw it in the trash.

[1318] This is black and white in their literature.

[1319] This is what they're like, hey, look what we're going to do.

[1320] Let's protect the boy lovers.

[1321] Gayle Rubin, 1984, I mean, you want the citation.

[1322] Like, these people are dead serious and they are opposed to the idea.

[1323] So this started by me saying they're opposed to limiting principles on principle.

[1324] What does that mean?

[1325] It means at some point somebody's going to say, you know what, you want to hump kids? No. That's a limiting principle. We draw the line at kids. That's a limiting principle. They're opposed, actually, to anything that tells them no. Anything in the world, including the world itself, telling them no. Oh yeah, you know, you technically can't be a man who becomes a woman, but let's just chop you up until you're close enough. The world's telling you no, but we're going to keep doing surgeries and hormones until it's kind of like yes. Jesus Christ.

[1326] Dude, it's, you, I told, I know, I know you're probably not the biggest fan of Charlie Kirk, but I was on stage with Charlie Kirk talking about this.

[1327] He wanted to talk about critical race theory with me, and I'm like, damn it, Charlie, and I'm getting frustrated sitting there.

[1328] Why would you say I'm not the biggest fan of Charlie Kirk?

[1329] I don't know.

[1330] I figure that, maybe you are.

[1331] I think he's very smart.

[1332] He's freaking smart.

[1333] Nobody gives him credit for that.

[1334] No, because they, they associate him with the white nationalist sort of right wing.

[1335] He's a different guy.

[1336] He is quick.

[1337] He's not who they label him to be.

[1338] He's very smart. He's very, very smart. And so Charlie and I were sitting on stage and I'm getting pissed at him, right, because it's like, I don't want to talk about critical race theory again. And so I'm like, okay, Charlie, we need to talk about queer theory. Like, what's queer theory? And I kind of explain the normal thing, like I just did, and he's like, whoa, you know. He didn't swear because he's a good Christian boy. But I'm like, Charlie, let's put it real simple: queer theory opens the gates to hell. And I think that that's like the best way to put it. This stuff they're pushing on the kids opens the gates to hell. Of course, this is why I'm in trouble, because I started saying okay groomer. You know, I got kicked off Twitter for okay groomer for like five months until Elon brought me back.

[1339] Thanks, Elon.

[1340] But why was I calling them groomers?

[1341] I get challenged on this.

[1342] People drag me in these interviews.

[1343] Well, you were part of that groomer thing.

[1344] Well, there's a paper that they wrote.

[1345] Everything I do is I read their stuff and I said something and people are like, that can't be real.

[1346] Turns out it is.

[1347] And so this paper is the drag queen story hour paper.

[1348] It's called drag pedagogy.

[1349] It's free access.

[1350] You can go look it up.

[1351] The title is drag pedagogy.

[1352] Read it for yourself. It's written by a drag queen named Lil Miss Hot Mess and a trans educator named Harper Keenan.

[1353] It came out in Curriculum Inquiry, which is a serious academic journal in curriculum for schools.

[1354] And at the end, they say that they're talking about the family-friendly aspect, right, the branding that it's family-friendly.

[1355] And what they say is that it's not so much that family-friendly is to sanitize drag.

[1356] That's not what it is.

[1357] It's... And actually, and this is their exact word.

[1358] So I ask people when they challenge me in this, what word do I use for this?

[1359] They say it is a preparatory introduction to alternate modes of kinship.

[1360] That's a direct quote.

[1361] Then they say a preparative introduction to alternate modes of kinship.

[1362] Yeah.

[1363] I don't know if you want to get that pulled up and on screen and show it, but it's for real.

[1364] The paper's called drag pedagogy.

[1365] It's easy to find.

[1366] It's the first paragraph in the conclusion if you need to scroll.

[1367] Then it goes on to say that the family, in family friendly, actually refers to a queer code for the queer family you meet on the street.

[1368] Their words.

[1369] I'm not exaggerating.

[1370] The queer community.

[1371] You abandon your real family for a queer family.

[1372] The last sentence in the paper says that they're going to leave a trail of glitter that will never come out of the carpet.

[1373] You know, I'm funny, and I could make stuff up.

[1374] I didn't make this up.

[1375] I'm like, so I, these guys are like, you can't say, okay, Groomer.

[1376] And I'm like, what word do I?

[1377] use for that.

[1378] And I've yet to have somebody tell me what word is better.

[1379] Of course that's the word for that.

[1380] Alternate mode of kinship.

[1381] And they say it's all about queer world making.

[1382] Same paragraph.

[1383] Queer world making has always been a project based in desire.

[1384] So for them, let's try to steelman this.

[1385] Imagine if you're a gay person and you were picked on in school, you always felt out of place, there was no one there that you could turn to that could say, hey, it's okay, you're just gay, and you know these kids are cruel, but the reality is there's a beautiful gay community that will accept you. Yeah. And to tell other kids, hey, this is just how this kid is born, and picking on him for that is no different than picking on someone for their skin color or where they're from. It's all gross. Don't do it. Yeah. And you could look at it that way.

[1386] Like they wish that there was a path for someone like them.

[1387] Yeah, I have three actual responses to that.

[1388] So first, we're not talking about sitting down and having the hey buddy talk with your kid who just did a jerk thing.

[1389] We're talking about drag queens in a classroom, which is a little bit more, which they call, in fact, a generative introduction to queer world making in the paper, which is pretty insane.

[1390] But secondly, in the paper, the immediate section before the conclusion, the last.

[1391] section before the conclusion.

[1392] So right before what I just told you, there's a section titled From Empathy to Embodied Kinship.

[1393] That's their title for it.

[1394] That's the title of the thing.

[1395] And they explain that this empathy route is a marketing strategy.

[1396] They use that for marketing to justify its inclusion, but its real purpose is to lead children to discover queer aspects of themselves.

[1397] So that's not what they're doing it for.

[1398] That's not the purpose.

[1399] So there must be another purpose.

[1400] And what's the other purpose?

[1401] I think it's a cult.

[1402] initiation ritual.

[1403] I think the point of the drag queen is to get the kids, exactly what they say in the paper, to start asking questions.

[1404] Why is that man dressed as a woman?

[1405] Do we always have to follow rules?

[1406] Can we do whatever we want?

[1407] Isn't this more fun?

[1408] And they start asking the questions and having the conversation and then the kids who show interest end up going off into, you know, the club after school where they get affirmed.

[1409] And I think that's where the cult initiation's going on.

[1410] And I'm dead serious.

[1411] I think the Drag Queen Story Hour was a cult initiation ritual for queer activism for our kids.

[1412] But then let's do the medical approach.

[1413] And I'm not talking about, you know, I got a PhD in math.

[1414] That was my background.

[1415] And so one of the things that I was shocked when I was learning math back in the, you know, 20 years ago was, here's a question for you, just ask, do you know why we don't do universal cancer screenings?

[1416] Why everybody doesn't go to the doctor every year?

[1417] Because, of course, they'd be able to make beaucoup bucks off this, right?

[1418] So wouldn't it be good if we did universal cancer screenings.

[1419] Everybody goes, they get the check, whatever it is, universal mammograms, whatever it happens to be.

[1420] There's a reason, there are a few reasons we don't do that.

[1421] It's the same reason, sort of, that we don't just give everybody, say, Ritalin because some kids have ADHD.

[1422] But what it is is that if we tested everybody for cancer, that test has a false positive and a true positive and a false negative and a true negative rate.

[1423] These are called the specificity and sensitivity of the test.

[1424] And if we screen every person, we screen every person, what happens is you actually end up with way more false positives, because there are way more healthy people than sick people, and some percentage of them turns out to be a far larger number. So that if you do that, what you end up doing is telling thousands of people per year that they have cancer when they don't, freaking them out, causing them to rearrange their lives. Plus it's expensive. And then some of them will have a false positive twice, and depending on the specificity and sensitivity of the test, it can be almost three times before you hit 50-50 as to whether you got a positive test.

[1425] So imagine you go and you get tests for cancer, screened for cancer, and it says you have cancer.

[1426] And then you get tested for cancer again and it says you have cancer.

[1427] And you go a third time and you have, but you only have a 50-50 shot of actually having cancer because of the way the populations break down.

[1428] You're going to be shitting your pants.

[1429] You're going to rearrange your life.

[1430] You're going to make some bad decisions.
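
A minimal sketch of the arithmetic behind that screening point, in Python. The prevalence, sensitivity, and specificity values are illustrative assumptions rather than figures from the conversation, and the repeated tests are treated as independent; with numbers in this range, it does take about three consecutive positives before the chance of actually having the disease reaches roughly 50-50.

```python
# Bayesian update for repeated positive screening tests.
# All numbers below are illustrative assumptions, not data from the episode.

def posterior_after_positives(prior, sensitivity, specificity, n_positives):
    """P(disease | n positive results), assuming the tests are independent."""
    p = prior
    for _ in range(n_positives):
        true_pos = sensitivity * p                # positive and actually sick
        false_pos = (1 - specificity) * (1 - p)   # positive but healthy
        p = true_pos / (true_pos + false_pos)     # Bayes' rule
    return p

prior = 0.001        # assume 1 in 1,000 screened people actually has the cancer
sensitivity = 0.90   # assume the test catches 90% of real cases
specificity = 0.91   # assume a 9% false-positive rate among healthy people

for n in (1, 2, 3):
    p = posterior_after_positives(prior, sensitivity, specificity, n)
    print(f"after {n} positive test(s): P(cancer) ~ {p:.0%}")
# Prints roughly 1%, 9%, and 50%: most single positives are false alarms,
# and it takes about three positives in a row to reach a coin flip.
```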

[1431] This is, so what don't you do in the schools?

[1432] You don't assume that a large population of the children are gay kids who are getting bullied and treat the entire population of school.

[1433] kids, like they're gay kids who are being bullied.

[1434] You figure out when somebody's being bullied and you deal with the person individually and you figure out when somebody's doing the bullying and you deal with the people individually.

[1435] And we've known this since time immemorial until, in my opinion, we've reinvented our policies in the schools to do this broad cult initiation.

[1436] We treat everybody like they're sick, which is exactly the opposite.

[1437] So there's the paper itself lying about it.

[1438] There's the logical understanding of it.

[1439] But then there's also, you don't broadcast or universally screen to deal with low propensity sicknesses.

[1440] It's just a terrible idea.

[1441] Well, that's very logical.

[1442] It turns out it was from a PhD math program.

[1443] So logic is, the logic is strong there.

[1444] Jesus Christ.

[1445] Yeah, because that was when I was getting, you know, I was getting prepped to teach statistics.

[1446] And it's like, these are the things you want to teach people in a statistics class, so they don't go make dumbass decisions because they don't understand how numbers work.

[1447] There's also this issue with the influence that people have on kids, just in general.

[1448] We're so flippant about who teaches kids.

[1449] And it should be a really difficult job to get, and it should pay really well.

[1450] And it's almost, if you wanted to go full tinfoil hat, again, you would think by design you would want the least motivated, weirdest fucking shitheads to be teaching your kids. Yeah. Because you would ensure that their education would suck. And especially if those kids, if those people, if you push the type of people to teach that were a part of this ideology that you're trying to push, then, you know, what's the best?

Well, these people aren't, they aren't doing well anyway, for the most part. They're not, like, super financially successful. If they're looking to push these agendas, in general, they're not, like, really excellent capitalists.

[1452] So you could probably pay them less, you know, and you can get these people, they want that job because they think they're a part of a movement.

[1453] That's right.

[1454] And so this has been something that's been going on for a very long time.

[1455] There's a couple of explanations.

[1456] One thing to say is even without the tinfoil hat, right?

[1457] Queer theory in its own, imagine that you have the hiring body, the administration at the school gets infected with queer theory.

Oh, well, I don't want to, you know, that guy might like kids, don't want to, like, assume, right, you know, I don't want to judge him. It lowers the potential to say, wait a minute, bringing in this weirdo who's throwing off red flags everywhere might be a bad idea. Or in this case, I guess, rainbow flags with the triangle cut out of it, that might be a bad idea, right? All of queer theory overrides your common sense, so it lowers the screening potential. It just makes the whole, like, it's like the fence is wider open, um, so good people will make more mistakes when queer theory has come in.

[1459] But then there's the fact that this actually was a lot of people don't understand.

[1460] We don't need a tinfoil hat to understand that the universities are fucked up.

[1461] Nobody does.

[1462] Look at them.

[1463] Holy shit.

[1464] Harvard, you know, let's name some more universities.

[1465] They're all messed up.

[1466] And the fact is, there's an, I know I keep throwing out sources, but there's this book I read.

[1467] It's called The Critical Turn in Education.

[1468] It was '16, '15, one or the other.

[1469] And right from the beginning, one of the things he's explaining is that people with their ideology of education, which is called critical pedagogy, actually had captured our schools of education virtually entirely by 1992.

[1470] That's the date the Marxists themselves say, this is when we got the schools of education.

[1471] So I tried to explain this in this documentary that I've got coming out in May called Beneath Sheep's Clothing.

[1472] And it's very simple.

[1473] If you get the colleges of education, then you're going to get the teachers.

[1474] And if you get the teachers, then you get the kids.

[1475] And if you get the kids in a generational strategy, you get the future.

[1476] And that was, they, they own the schools.

[1477] Like, they own the manufacturing plant where we build teachers and administrators.

[1478] They're called colleges of education.

[1479] They have a virtual monopoly on producing them.

[1480] And they have said, in their own words, that their ideology has run the schools since 1992.

[1481] That's, you know, if we're keeping track on our fingers, 32 years ago.

[1482] Well, this is what Yuri Bezmenov talked about in the 80s.

[1483] Yeah.

[1484] Yeah, it's why it's like I say what I just said and then Yuri in the trailer to that film, which is at like top of my Twitter, if anybody wants to see it.

[1485] Let's see it.

[1486] We go back and forth.

[1487] Yeah, it's like right at the top after my cool.

[1488] Let's see the trailer.

[1489] When does this come out?

[1490] End of May. Most people are blissfully unaware that all this is going on.

[1491] That's why I'm writing books and making movies where I could be going out and like enjoying my life.

[1492] Yeah.

[1493] Like I do like traveling around.

[1494] I like getting to meet people.

[1495] I've got, like, one of my, you know, I don't know if you knew Tiffany Justice is with Moms for Liberty.

[1496] No, well, yeah.

[1497] But she says all the time, who knew we were going to make so many cool friends in our 40s?

[1498] Right.

[1499] Is this the trailer right here?

[1500] Yeah, with the blonde lady.

[1501] That's Julie Beeling.

[1502] She wrote the book it's based on.

[1503] Even in the future, nothing works.

[1504] We'll see here.

[1505] It's spinning.

[1506] Oh, did I post it twice?

[1507] I'm an idiot.

[1508] There's a thing about communism.

[1509] When it comes knocking at your door, it doesn't say, hi, I'm here to impoverish, enslave, and murder you.

[1510] It says, I'm here to liberate you from oppression.

[1511] I thought of myself as a happy kid.

[1512] I had no idea that I was being brainwashed.

[1513] The KGB agents would go into the dirt and then rise in this right.

[1514] All of them is infiltrated.

[1515] This was a rape of the body of Christ.

[1516] You take over the colleges of education, then you take over all the teachers, then you take over all the students, and that's you get the future.

[1517] He said the ultimate objective of having government school was to destroy Christianity.

[1518] Those are his words.

[1519] People's war means to destroy the opposing country through unconventional methods.

[1520] And Khrushchev bragged about it.

[1521] We'll take America without firing a shot.

[1522] In other words, Marxism -Leninism ideology is being pumped into the soft heads of American students.

[1523] Without being challenged.

[1524] The result?

[1525] The result you can see.

[1526] That looks good.

[1527] What is that going to be on?

[1528] It's going to be, we've got a deal with Rumble, so it's primarily going to be on Rumble, and, you know, we'll spread it from there.

[1529] I was going to ask, it must be exclusively on Rumble.

[1530] I don't know that it's totally exclusive because we're talking about it.

[1531] Can YouTube host something like that?

[1532] Well, I hope so.

[1533] Would you get demonetized, do you think?

[1534] You know, let me just tell YouTube what I think about that.

[1535] I hope we do.

[1536] I hope we can put it on YouTube, and I hope we get demonetized, and we'll

[1537] put up a YouTube edit where we literally just, like, put up the YouTube emblem over the scenes they don't want shown and do the Charlie Brown wah-wah voice to the parts they don't want people to hear, and we'll put up a YouTube edit with a link to send people to the real thing. Like, to hell with them. Like, we can get around this censorship and turn it to our advantage these days. It's very bizarre that they would choose to demonetize something that's someone's legitimate opinion about a very worrisome trend. I mean, this is something that people should discuss, and to be able to discuss these things, especially in this wonderful world of open communication that we find ourselves in, you should be able to have both perspectives.

[1538] You should have the perspective of the queer theorists.

[1539] That's right.

[1540] And you should have the perspective of the people who say, this is where all this comes from.

[1541] That's right.

[1542] And if you don't do that, then you limit information.

[1543] And some of that information, especially the stuff that you're talking about that seems to be absolutely true and provable, you're not letting that stuff go through because it opposes your ideology, and that, in my definition, makes you a cult member.

[1544] That's exactly, right?

[1545] So the cult, I don't know if you know who Robert Lifton is.

[1546] Robert Lifton, he's kind of weird now.

[1547] He's still alive, but he was like a, you would talk about gangsters doing infiltration.

[1548] This dude was in Hong Kong in the 1950s.

[1549] And he started interviewing guys that were going through Mao Zedong's brainwashing prisons.

[1550] And then when they would get thrown out of China after they'd get out of the prison after three, four years of getting brainwashed, started interviewing them.

[1551] Like, what did Mao do?

[1552] How did he, like, literally the title of his book that he wrote off of this is called Thought Reform and the Psychology of Totalism: A Study of Brainwashing in China, which is xi nao in Mandarin.

[1553] I know a little Mandarin.

[1554] And so the idea is that they were doing what Mao called ideological transformation or ideological remolding in these prisons.

[1555] And he wanted to know how it worked.

[1556] What's the psychological dynamic?

[1557] And he said there are eight primary characteristics.

[1558] And God only knows if I could rattle off eight things from memory.

[1559] But the first one is milieu control.

[1560] In other words, you have to completely control the environment of the people that are within it.

[1561] And so you can't let them have outside information.

[1562] You can't let them get near people who are raising uncomfortable questions.

[1563] You have to say that those people are a danger to what Mao called democratic centralism, or we would say Joe Rogan's a danger to our democracy.

[1564] I think they said when you took ivermectin or something like that.

[1565] Right?

[1566] And so you have to control the environment people are in.

[1567] And then there's other things like mystical manipulation into a sacred science and all this other.

[1568] There's eight of them he has, which is finally at the end, which it's got doctrine over person, but he calls it like the expiration.

[1569] It's not the right word, but it's the dispensing of the person.

[1570] So the people who go along with it, who are in the cult are treated as people and the people who don't go along with it have to be treated as non-people.

[1571] And that's based, Mao Zedong gave a famous speech in 1957 where he actually said, to not have a correct political orientation is like not having a soul.

[1572] So you're, no better than, well, capitalist running dogs.

[1573] You're no better than the dogs.

[1574] Well, I mean, we like dogs, but you know what I mean?

[1575] How he would have meant it.

[1576] And so it's like you're no better than an animal if you don't go along with this.

[1577] And that's what you're saying.

[1578] That's a cult.

[1579] And all this bears the hallmarks of a cult.

[1580] And it feels like that is a natural pattern that humans fall into.

[1581] And I think particularly if you're not religious, I think one of the things about religious people is they've already got their thing, you know.

[1582] And hopefully it's one that promotes good values and it's a good thing.

[1583] But there's a part in the brain that wants that thing.

[1584] And atheists, they don't have a religion.

[1585] And so they find a social religion.

[1586] That's exactly right.

[1587] That's exactly right.

[1588] So they find it in their social circumstances, politics, economics.

[1589] And it always goes demonic when they do that.

[1590] I've been spending a lot of time, thanks to Charlie primarily, Charlie Kirk, I've been spending a lot of time paying attention to the tenets of Christianity and studying it.

[1591] And it's got a lot of good advice in there.

[1592] But you're 100% right that if you lack a religion, it'll get filled in with other things, for very many people.

[1593] I think there's a small percentage of people for maybe that doesn't apply, but there's a spot in your brain for it.

[1594] But the, the thing that, why do they go after Christians and Jews so hard everywhere they go?

[1595] And the reason is because they are completely.

[1596] completely committed to.

[1597] They're not, when you say, you know, they already have their thing, for Christians and Jews, that's not how they think about God.

[1598] It's not their thing.

[1599] That's something that's above everything else.

[1600] But the Muslims as well.

[1601] Well, Muslims too.

[1602] But Muslims, Islam's a little bit different because it's got a political element worked into it.

[1603] And I'm not trying to like throw shade.

[1604] I'm just saying that with all there is no, the state is never above God in Christianity and Judaism ever.

[1605] The state and God are somewhat intertwined.

[1606] or can be in Islam.

[1607] But it's not, so they're not quite identical, but it's true.

[1608] God is above the state, no question.

[1609] So when the state shows up and says to you, hey, you're going to do X, Y, Z, or else, and it goes against your religion.

[1610] If you're a Christian or Jew and to many, if it's not Islam, a Muslim, you're going to say, no, I have a higher duty.

[1611] And it's not to the state.

[1612] And if you kill me, I'm going to a better place so I don't care.

[1613] And that's the enemy of totalitarianism in a way that nothing else is.

[1614] The Confucian virtues of China don't have that.

[1615] Buddhism actually kind of doesn't have that.

[1616] Well, you can't tell the CIA this because they're going to co-opt the churches.

[1617] Well, that's what the documentary is actually about, is how the co-opting of the churches took place in the Soviet Union, actually.

[1618] Yes.

[1619] And so, and then how, okay, so you put it on the table.

[1620] I think that's what this whole stupid Christian nationalism thing is about.

[1621] I think that's part of the purpose of the Christian nationalist dialogue.

[1622] And I'm not exactly a Christian nationalist.

[1623] I'm probably one of its most vocal opponents.

[1624] You think it's like agent provocateurs?

[1625] That's right.

[1626] I think that they want to recreate something that looks like Charlottesville, you know, the very fine people on both sides incident.

[1627] Or like January 6th, they get somebody to do something stupid or violent or maybe they just run a narrative.

[1628] Well, there's always been people that, when they see these well-uniformed people walking around with Nazi flags, their faces covered, they're like, they're feds.

[1629] People always think that.

[1630] They always think that, which is a terrible thing to think, that your own federal government is involved in doing something to stir racial hatred, or at least give the image that racial hatred is being stirred, and then connect that racial hatred with people that just don't believe what you believe and believe in God.

[1631] Right.

[1632] And that's why this Christian nationalist thing, you know, it's a leap that's not very far in most minds from Christian nationalist to white Christian nationalist.

[1633] Right.

[1634] And it's so easy for them to say then that independent conservative churches, and I would say those in particular are a hot house for domestic extremism.

[1635] And then they start cracking down on that.

[1636] Maybe it's the FBI agents are going to church every week and they're writing down every single thing you say or every single thing you do.

[1637] Maybe it's that, you know, they start messing with the IRS status.

[1638] Maybe it's that they create other pressures with zoning or whatever else to make it so that independent churches are very difficult to do.

[1639] Because what they had in the Soviet Union, and I learned it, I actually didn't know it until the Timothy guy, the Russian guy in the film was telling us, they had what's called a registered church in the Soviet Union.

[1640] So the Soviets didn't get rid of the churches.

[1641] The Soviets created a fake church that was like, you know, Lenin, Stalin, Jesus.

[1642] They have a church in China.

[1643] It's called the Three-Self Church.

[1644] It's called the Three-Self Church.

[1645] It's super weird.

[1646] It's like, if you see pictures of it, it's super weird.

[1647] It's like there's a cross and there's like Mao.

[1648] It's like President Xi.

[1649] And it's like, what am I looking at?

[1650] It's so weird.

[1651] What is it called the three one?

[1652] Three-Self Church.

[1653] I'd have to remember if there's an S.

[1654] Oh, I need to see this.

[1655] And so they have a fake church, and they funnel people into it.

[1656] And then they persecute everybody that's got religion outside of the fake church.

[1657] And that was really the impetus for making the film.

[1658] And we said, well, we've got to talk about the schools, too.

[1659] Well, is that, what's fueling the Uyghur Muslim thing, where they're...

[1660] Well, communists don't like competing religion.

[1661] So they also do like slave labor, and they like making examples of people to keep things under control.

[1662] So if we just take it at face value that the C in the CCP stands for communist, this is perfectly in line with the way communists behave.

[1663] That's what they would do with the Christian dissidents in the Soviet Union.

[1664] They'd send them into gulags where their job would be, like, to carve a canal out of bedrock, working themselves to death in freezing conditions, only for the canal to not actually be deep enough or wide enough to do what it had to do.

[1665] So it's demoralizing that it failed.

[1666] But then it drains whichever, I forget which lake it was, the Aral Sea or whatever, gets drained because they're idiots and they don't know how things work.

[1667] They think that they can be the masters of nature.

[1668] But that's what they would do.

[1669] They would round up these dissidents and send them off to Gulag.

[1670] Gulag wasn't a concentration camp like the Nazis had.

[1671] It was a re-education camp where they were trying to re-educate you through doctrine combined with hard labor.

[1672] And if you died, you died.

[1673] Jesus Christ.

[1674] Yeah.

[1675] And that's where communism leads to, kids.

[1676] And that's where communism leads to.

[1677] And that's why other places too.

[1678] So here's what you want to, you know who I feel bad for?

[1679] And they get real mad.

[1680] I did this at Northwestern.

[1681] I told them this to their faces.

[1682] I did a talk at Northwestern University last year.

[1683] And they let a bunch of woke protesters in, and they carried on and yelled at me and, like, were mocking me, doing the loser sign on their faces while I was talking.

[1684] That's really effective.

[1685] It's hard to keep your train of thought while that's going on.

[1686] And they started cheering when I started talking about Mao, so they know what it is.

[1687] It is.

[1688] And I was like, cheer for your dictator.

[1689] And they started clapping.

[1690] And it was like really creepy.

[1691] But then, so I did this at Northwestern.

[1692] And I told them something.

[1693] And they laughed at me. But okay.

[1694] So Mao created in the mid -1960s a thing called the Red Guard.

[1695] People all know about this.

[1696] It was young people.

[1697] It was mostly college and high school students.

[1698] But it went down to little kids.

[1699] Xi Van Fleet was in China during the Red Guard, for example.

[1700] And she's got that book, Mao's America out talking about what that was like.

[1701] surviving that.

[1702] And the Red Guard went around destroying property, harassing people, turning people in.

[1703] They ended up rounding up the sitting president of the CCP, Liu Shaoqi, pulled him out, humiliated him, kicked him out to the countryside to die.

[1704] Right.

[1705] He came out.

[1706] He said, he's a chairman of CCP.

[1707] He comes out.

[1708] He says, am I not a citizen?

[1709] Can I not speak?

[1710] And these teenagers, by the thousands, were out there protesting him and said, no, you're not a citizen.

[1711] You're not a person.

[1712] You cannot speak.

[1713] And they ended up carting him off to die in the countryside.

[1714] Mao takes his power back.

[1715] So that's the end of that,

[1716] in 1967; it took about a year and a half.

[1717] So as soon as Mao gets back on the throne, he turns around in 1968, and what does he do with that Red Guard that was so loyal, that got him in?

[1718] Did he give him trophies?

[1719] Does he give him a spot in the party?

[1720] No. He said the Red Guard has become too radical and too left.

[1721] So he rounded them up and sent them off to the gulag to die.

[1722] And some of them were so brainwashed.

[1723] They said shit like, going to work with the peasants in the fields will make my brain even more red as they got on the trains.

[1724] So I told these kids.

[1725] So I told these kids at Northwestern, I said, listen, you woke kids, cheer all you want for Mao.

[1726] This is your future.

[1727] Stability is what repels revolutions.

[1728] So if they need destabilizing forces now, that's you.

[1729] But once they destabilize things enough to take power, that's as Mao phrased it, that's a new phase of the revolution.

[1730] That's called building socialism.

[1731] They don't need destabilizers anymore.

[1732] They got to get rid of you.

[1733] And I'm like, if you win, you get your revolution, you lose.

[1734] And I feel bad.

[1735] I honestly.

[1736] Honestly, I mean, I talk big, but I feel bad for these kids.

[1737] that got caught up in this, because if I'm right, that's their future.

[1738] It's probably a digital gulag, not a physical gulag.

[1739] Maybe they're going to have to go farm corn or something, but probably they're going to have to sit in their apartment.

[1740] It's like 200 square feet with their Oculus on, pretending that they're going nice places, as long as they fill out enough surveys to give data so that whatever the data machine is can collect the data that justifies

[1741] that, and sell it. By participating zero in real life, they build up carbon credits that the rich people can buy, because of this whole fake carbon economy, or sustainable economy, that they're trying to build around it. I think that's really, like, I legit think that that's these kids' future. They go woke, they break themselves, they go in service of the revolution, and then the revolution turns around and eats them too. Snake eating its own tail. Jesus Christ. And I'm like, I wish I could wake them up, but, man, they're in a cult. I seriously think they're disposable. And enough people are going to hear what you're saying that it's going to cause a stir, and then more people are going to share it and be aware of it.

[1742] And that'll help some.

[1743] But the problem is there's not many people like you out there that are saying this in an articulate and very well-informed way where it resonates with people.

[1744] And I realize like, oh, this is what's going on.

[1745] I think I'm just trying to be a good person.

[1746] I think I'm trying to be open -minded.

[1747] I think I'm trying to be kind and compassionate.

[1748] And really, I'm in a cult.

[1749] I mean, I've had it happen.

[1750] I've had it happen.

[1751] I had this one young lady at one point reach out to me and say something I said made her so mad that she went and, like, blocked me on social media, and that she went and spent months trying to prove that I was wrong, and then ended up concluding that I was right, and it de-radicalized her.

[1752] Well, the problem is people are so married to their ideas that it's almost impossible for them to look at something that is opposed to it without being angry at it or trying to pick holes in it instead of just, like, objectively trying to analyze it.

[1753] I was like, is this possible that this is true?

[1754] And isn't it something that governments and dictators and kings have done throughout history?

[1755] Haven't they done things in order to initiate more power?

[1756] Haven't they had false flags?

[1757] Haven't they created conflict that wasn't real in order for them to gain more power or start wars?

[1758] Yeah, they have.

[1759] What makes us think they don't do that anymore?

[1760] And if you're doing it in this digital battlefield that we're all currently involved in, that's what you would use: you would use social media platforms and you would control them, like the FBI was trying to control Twitter. Yeah, they were infiltrating social media organizations to suppress legitimate opinions and thoughts of actual experts. Yeah. And they were doing that at the behest of the government, which is fucking terrifying. Yeah, and illegal. But they found workarounds, and you know that this is a huge, huge risk. But, I mean, look, for these kids or whatever.

[1761] Like, let's look at three populations and say, maybe this will wake somebody up.

[1762] How are they going to treat you?

[1763] So the three populations are the revolutionaries themselves, the communists, we'll just look historically, and then say American classical liberals, right?

[1764] And then Christians.

[1765] So what's going to happen?

[1766] So you go woke, right?

[1767] And you're in this.

[1768] And then the revolution succeeds.

[1769] What have communists always historically done?

[1770] They always eat their own.

[1771] Yuri Bezmenov says that too, right?

[1772] He says, don't deal with those political prostitutes.

[1773] They know too much.

[1774] We'll line them up against the wall and shoot them.

[1775] That's what he says.

[1776] So your chances are bad at best under the revolution.

[1777] What are American, you know, good old Americans going to do if, you know, you come out of being woke?

[1778] So we don't have to talk about the revolution.

[1779] What are other woke people going to do to you if you stop being woke?

[1780] They're going to treat you like a traitor.

[1781] They're going to hate you.

[1782] They're going to destroy your social life, maybe your professional life.

[1783] What are normal Americans going to do?

[1784] Cool, you do you, right? Live your life, glad that you got that sorted out. And what are Christians gonna do? I forgive you. That's literally their religion. I forgive you, if you repent, come join our church if you want to, if you don't, I understand, right? You're welcome, no big deal. Like, if they're Christians who are Christians. I mean, I know that there's these Christian fascist dudes who are thinking they can pound their chest and, like, be big tough guys, but even the other Christians are like, that's not biblically sound, like Jesus didn't do that, right? So it's like, the woke are going to treat you like crap if you leave, so you're locked in.

[1785] If you come over and be an American, again, just a normal American dude, we're going to be like, cool, welcome back.

[1786] And the Christians are, if you go and repent of your errors or whatever and you decide to convert, are going to be like, they're going to celebrate you.

[1787] They're going to be like, praise God.

[1788] It's night and day different.

[1789] So revolutionaries destroy their own and everybody else, like you're saying, like just normal people who value, like what productive thing can you do?

[1790] Love you, great, welcome, are completely the other story. Well, that's an ideology to live your life by. The problem is if that ideology gets manipulated by the people in power as well. It's all dangerous. It's all dangerous because it's just what human beings do when they get into power. And if there was a radical right-wing religious sect that was in control of this country, we'd be just as scared.

[1791] Yeah.

[1792] As if there's a radical left-wing, progressive, woke organization like there is currently.

[1793] Joe, that's the history of the 1930s right there in Europe.

[1794] You had the communists who were screwing everything up and everybody was scared of the radical left.

[1795] And what was their answer was fascism.

[1796] I read all this Mussolini a couple of months ago.

[1797] I was like, well, I better read the other side.

[1798] And I'm like, this guy is, he's supposed to be the answer, but he's making an idol of the state.

[1799] Like the state is God in both situations.

[1800] and who are you?

[1801] You're a subject is who you are.

[1802] Yeah, you know, my friend Duncan Trussell, when the George Floyd riots were happening in California, he was like, dude, we're going to get a radical right -wing president.

[1803] That was his thought.

[1804] It's like, this is what's scary to me. That's scary to me, too.

[1805] That's just as scary, if not more.

[1806] When Christians, the really crazy ones that we're talking about that don't represent the actual teachings of Christ, when those people think that there's like a holy war, that they're a part of and that, you know, they have to oppose all the other people, and they're the ones who get to enforce the rules, and they're the ones who get to enforce what people say and can't do, and if you say, God damn it, you go to jail for a year, that kind of shit's real.

[1807] It is.

[1808] And that's what you see in some countries that have radical Islam, that's what you see in some countries that aren't open societies, air quotes.

[1809] Well, I mean, legitimately open ones, not, you know, George Soros's weird fantasy about it.

[1810] So it's like that's so important for people to understand because the line, you know, Solzhenitsyn said the line of good and evil cuts through every human heart, but so does the line of tyranny.

[1811] Then people who are afraid or they're angry or they feel like they've been robbed or cheated or oppressed can be radicalized really easily and become very angry.

[1812] Like Germans during World War II?

[1813] Right.

[1814] And what a lot of people need to understand is that if we put our tinfoil hat back on and we believe that there are people pulling strings.

[1815] I promise you, they do not care whether a radical left or a radical right breaks the Constitution as long as the Constitution gets broken.

[1816] Well, especially if you can get the radical left to behave in a way that was completely opposed to what the radical left was like 20 years ago.

[1817] The full trust of the pharmaceutical drug companies, support of the military industrial complex, support of international wars, as long as they're being supported by the Democrats.

[1818] Banks.

[1819] It's like, I'm sure Larry Fink has the best of intentions, you know, it's like, what are you talking about?

[1820] Was that Larry Fink?

[1821] No, Larry Silverstein was the guy who owned the World Trade Center, the big conspiracy theory about Tower 7.

[1822] Oh, yeah, something like that.

[1823] No, Larry Fink owns Black Rock.

[1824] Right, that's right.

[1825] Yeah.

[1826] I get my Larry's confused.

[1827] Yeah, well, there's a lot of Larry's.

[1828] But the idea that we could live in this world where if this stuff takes over that eventually they don't come for you.

[1829] It's so silly.

[1830] They eat their own.

[1831] It keeps going further and further down what you thought was acceptable.

[1832] And it changes the norms.

[1833] And it just keeps going.

[1834] It's just like with ESG.

[1835] They can change the rules tomorrow.

[1836] The real danger of ESG isn't that it's stupid and that it's control.

[1837] It's that it's arbitrary.

[1838] Somebody in some room, maybe it's Larry Fink, maybe he's got a little committee, I don't know, gets to decide that today Elon Musk is okay with ESG, and then tomorrow he bought Twitter and is for free speech and now he's not okay with ESG, or that Halliburton is bad and now it's good. Like, overnight, somebody gets to decide. So maybe, you know, the WHO treaty, we stopped talking about the WHO and I should talk about that. At the end of May, the WHO is meeting, it's some kind of an assembly, and they are deciding upon whether or not the WHO will have total, they just screwed up one pandemic, and then they say that they need to have total control of pandemic preparedness and public health.

[1839] But the thing is, it's not even just about diseases, right?

[1840] Because we know about like they screwed up COVID.

[1841] It was total global tyranny.

[1842] Imagine if they had the power where there is no Florida.

[1843] There's no free state.

[1844] There's no difference between Texas and California.

[1845] It's all whatever the World Health Organization says.

[1846] There's no difference between Florida and Canada or there's no Sweden, which, you know, did something different.

[1847] Everything has to be on the same page, but then they go further and they declare other things matters of public health, like gun violence is a public health threat, racial injustice, inadequate food systems.

[1848] It's literally a recipe for them to be able to declare total, total tyranny, particularly over anything that they can skew as a public health matter.

[1849] And so one of the things that they consider to be another kind of pandemic that's a public health risk is misinformation and disinformation.

[1850] So it explicitly calls for censorship of what would be misinformation and disinformation.

[1851] So now all of these, 193 or whatever those countries are, are supposed to sign over to the World Health Organization the ability, through a treaty that's not being ratified in the Senate like a treaty.

[1852] It's probably that Joe Biden will do it as an executive agreement rather than passing it with a two-thirds majority in the Senate.

[1853] So we have this treaty now that hands over the control of the states and of the United States as a federal entity to the World Health Organization, which is led by, I mean, Tedros is openly a Marxist, so like what the hell is going on with that, where they have this total blanket control over anything they can declare public health, including misinformation and disinformation.

[1854] One of the things they say, and I don't know if it's in the proposal or if it's in the documentation around it, is that we have a pandemic of too

[1855] much information.

[1856] We have to limit how much information that people actually are getting.

[1857] And this is like, that's like living in China.

[1858] This is proposed.

[1859] Has any country signed off on this?

[1860] I think Canada's, like, already gung-ho on it, but I think the meet, I don't know exactly how it works, but I think the meeting is at the end of May, and there is no full signing off until the meeting at the end of May. So we got like 11 weeks to, if, for example, we could just make it go through, you know, whatever Congress or whatever apparatus is where it has to be ratified in the United States as a treaty according to the Constitution, it's dead in the water for the U.S., because two-thirds of the senators are not going to go for this unless we're in a lot bigger trouble than I think we are.

[1861] 50-50 would.

[1862] Who the fuck is going to go for that?

[1863] Joe Biden.

[1864] Or whoever tells Joe Biden what to go for?

[1865] Yeah, whoever gives Joe Biden his shots.

[1866] What cocktail do you think they got him on when he goes and gives those speeches?

[1867] I don't know, but it's got to be something good.

[1868] I want to know.

[1869] I really want to know.

[1870] I want to know what he's doing.

[1871] Dude, I'm barely catching up to you on baby IVs of NAD Plus.

[1872] I'm not ready for these cocktails.

[1873] Well, I mean, whatever they give him, it must be extraordinary.

[1874] Because you could tell he's ramped up.

[1875] Oh, yeah, totally ramped up.

[1876] Yeah, he's ramped up.

[1877] And otherwise doesn't know where he is.

[1878] It's so strange.

[1879] My favorite, it's one of my favorite, you know, Trump, whatever else, he's funny.

[1880] One of the favorite things, I thought, no, maybe not, but top five favorite things he ever said was, he was in an interview and they said, well, what do you think of Joe Biden?

[1881] And he said, Joe Biden doesn't know he's alive on TV.

[1882] Yeah.

[1883] It was the funniest.

[1884] Oh, Trump's hilarious.

[1885] He does speeches, and he does stand-up in them, where he did, like, his impression of Joe Biden not knowing where to go.

[1886] Have you seen that?

[1887] No. You got to see this bit because I swear to God, it's like a fucking comic.

[1888] He's doing this impression of Joe Biden.

[1889] He always does this thing.

[1890] He always does this thing.

[1891] Like, you watch him, you're like, the guy's a comic.

[1892] He's hilarious.

[1893] Well, he's been on TV forever.

[1894] He knows how to shoot back.

[1895] He knows how to talk shit.

[1896] He knows zingers.

[1897] It's like where they...

[1898] He knows how to work a crowd.

[1899] So you think about it.

[1900] If he is a smart man, regardless of what you think about him, you got to realize.

[1901] The guy's been very successful, right?

[1902] Yeah.

[1903] Don't lie.

[1904] So this guy has also hosted The Apprentice forever,

[1905] been on television forever, and then he goes on tour.

[1906] So he starts doing stand-up, essentially, for four years.

[1907] Yeah.

[1908] He's been doing stand-up without the job as the president for at least three years.

[1909] And then before that, there was four years during the time he was president.

[1910] He's kind of doing stand-up.

[1911] Yeah.

[1912] And then before that, it's a year and a half that he's running for president that he's kind of doing stand-up.

[1913] Yeah.

[1914] So you've basically got a guy.

[1915] He's been doing stand-up for nine years.

[1916] Pull it up from the beginning.

[1917] But if I walk left, there's a stair.

[1918] And if I walk right, there's a stair.

[1919] And this guy gets up.

[1920] I mean, he's fucking doing stand-up.

[1921] So now new records are being set like gasoline.

[1922] Gavin has become Crooked Joe Biden's top surrogate, I think, because he doesn't think Biden is going to make it.

[1923] That's why he's doing it.

[1924] think he's going to make it, and it won't be him so easy. He's going to have a big fight, however, because there will be a lot of Democrats competing. It's going to be very interesting. But let's see. Look, some people say Biden's going to make it. Does anybody think he's going to make it to the starting gate? I mean, the guy can't find his way off of a stage. Look, here's the stage, but it goes a little further than that. Seen this stupid stage before, right? I've never. But if I walk left, there's a stair.

[1925] And if I walk right, there's a stair.

[1926] And this guy gets up.

[1927] Where am I?

[1928] Nah, he's terrible.

[1929] You know, I'm much tougher on him than I used to be.

[1930] Out of respect for the office, I was never like.

[1931] He's the most corrupt president, the most incompetent president we've ever had.

[1932] But when they indicted me and then again and again and again, I was never indicted.

[1933] Now I'm setting records.

[1934] Al Capone.

[1935] was not indicted so much.

[1936] Alphonse Capone.

[1937] If you looked at Al Capone in the wrong way, he'd kill you.

[1938] He was not indicted like me. I was never indicted.

[1939] I didn't know.

[1940] When they taught me at the Wharton School of Finance, they didn't talk about indictment.

[1941] No, it's a disgrace.

[1942] What's happening.

[1943] They weaponized elections.

[1944] They've done everything.

[1945] I mean, these are very bad people.

[1946] But I used to talk relatively nicely about them.

[1947] I wouldn't go out of my way.

[1948] I wouldn't say the things I say now.

[1949] Now I'm just all in because these people are bad and they're dangerous, and we have to stop it.

[1950] Okay, that's not it.

[1951] There's a thing where he does a thing about, he's probably doing it at multiple speeches.

[1952] Yeah, yeah, yeah.

[1953] When he talks about him, he's like pointing at somewhere.

[1954] Oh, yeah.

[1955] But it's like.

[1956] I don't know, that Wharton thing was pretty good, too.

[1957] It's pretty funny.

[1958] It is kind of crazy how many times they've indicted him.

[1959] Yeah.

[1960] It's pretty wild.

[1961] I actually hear, like, I fly a lot.

[1962] So I'm on planes a lot, and sometimes people talk.

[1963] and they like, I've heard several times, people are like, well, I'm a Democrat, but I don't, like, why does this keep happening?

[1964] It's kind of crazy because it seems like what happens in banana republics.

[1965] Uh -huh.

[1966] But just somehow or another, it's okay, the exact same thing.

[1967] Well, because protecting democracy.

[1968] Well, did you see when that guy from Shark Tank, Kevin O'Leary, when he was discussing this whole thing, is like, you're going to ruin real estate development in New York.

[1969] People are not going to want to do real estate deals there because this is how they do it.

[1970] Yeah.

[1971] When they say, my building is worth $400 million, you're supposed to say, no, it's worth $300 million.

[1972] Here's a loan on $300 million.

[1973] Like to say that that's fraud when he paid the loans back.

[1974] Yeah.

[1975] This is a, I mean, that is like the epitome of like, what are you doing?

[1976] Like, what are you chasing?

[1977] And what have you not chased?

[1978] What have you not chased down?

[1979] Can we go over what you have not chased down and you're chasing this down?

[1980] Is it possible that you're

[1981] doing this because this guy's running for president?

[1982] Right.

[1983] Because it kind of seems like it to the world.

[1984] Yeah.

[1985] It looks real...

[1986] It looks real like that.

[1987] Yeah.

[1988] It looks real like you're trying to prosecute your political opponents.

[1989] With these gigantic...

[1990] Letitia James with these gigantic, you know, I don't even know what it is, a settlement.

[1991] It's not...

[1992] $360-something million.

[1993] That's insane.

[1994] It's a lot of money.

[1995] That's a lot of money.

[1996] That's a lot of money.

[1997] Where's it go?

[1998] Because there's no victims.

[1999] Right.

[2000] That's a problem.

[2001] Like Elon tweeted that.

[2002] Yeah.

[2003] It's like, okay, where does it go?

[2004] It goes to her brag sheet is where it goes.

[2005] Well, it's just kind of bonkers.

[2006] And then you get the fucking Georgia one with that Fani lady.

[2007] Yeah.

[2008] The lady's in trouble.

[2009] She's in trouble.

[2010] She's in hot water.

[2011] She's in real trouble.

[2012] She's in real trouble.

[2013] I was actually in Fulton County the day where Trump came in and got indicted and did his mugshot or whatever.

[2014] It's pretty wild.

[2015] I mean, it's just nothing.

[2016] There's not a story, but I was there not at the courthouse.

[2017] I was just nearby.

[2018] And I was like, holy shit, I came here on this day, like of all days.

[2019] But, yeah, she's in trouble.

[2020] She's toast.

[2021] The whole story is amazing, though.

[2022] See her on the trial getting sassy.

[2023] To see her on the stand getting sassy and to see that her explanation was cash.

[2024] She keeps a lot of cash around the house.

[2025] Like, where did you get this cash?

[2026] Yeah, really.

[2027] Why do you have so much cash to pay for all these vacations and all that?

[2028] You paid them back?

[2029] Okay.

[2030] Yeah.

[2031] What?

[2032] That's like another, it's like a little kid's explanation.

[2033] Oh, I paid him back.

[2034] Cash that I just had laying around.

[2035] I just happened to have it, you know.

[2036] Well, the idea that it's a black thing, too.

[2037] That's what I was going to say.

[2038] Keep cash around.

[2039] They tried to come out and say, well, this is, you know, they're scrutinizing her because she's a black woman.

[2040] This is encouraging, though, because, like, two years ago, I think that would have worked.

[2041] And now it's, like, people, like, stop.

[2042] The other thing she tried was, I am not about to emasculate a black man. Mm -hmm.

[2043] What does that mean?

[2044] That is not an answer to a question.

[2045] That is not an answer to a question.

[2046] That is not an answer.

[2047] That's a way to throw up that race card and see you can get out of this question.

[2048] That's right.

[2049] Get out of jail free card.

[2050] Emasculate a black man. He just happens to be black.

[2051] We're just talking about what you did with the money.

[2052] Yeah.

[2053] Like, tell us about that money.

[2054] Who the people are.

[2055] Just who got the money.

[2056] What the fuck happened here?

[2057] Yeah.

[2058] Like, what's going on?

[2059] I call it the Iron Law of Woke Corruption.

[2060] It's so wild.

[2061] Totally.

[2062] It's so wild to see.

[2063] It's just very strange.

[2064] And it's very, here's what drives me crazy.

[2065] Like, how is all this, uh, DEI stuff getting into airplanes? Like, yeah, isn't that scary as hell? Isn't United run by a drag queen? Well, he did do that at least, Scott Kirby's the guy's name, which sucks because I fly on United a lot. But don't you want, like, the absolute best people, regardless of their sexual orientation, their gender, their color, their race, the very best people that you can get to fly the fucking planes? I do. And fix the fucking planes.

[2066] Wouldn't you like, I'd like, it would be sweet if we had the best people for the job.

[2067] You want to put the tinfoil hat back on?

[2068] I got an explanation.

[2069] Okay.

[2070] Okay.

[2071] So earlier I said that the goal is to de-grow the West and facilitate China's rise.

[2072] Okay, so what's happening?

[2073] Boeing 737.

[2074] Boeing, Boeing, Boeing.

[2075] We see all this DEI stuff at Boeing.

[2076] We see all these problems.

[2077] We just see this guy that committed suicide.

[2078] The whistleblower?

[2079] A whistleblower against Boeing who was saying some deep stuff, like that they were intentionally fitting bogus parts.

[2080] I don't know if this is true, but that's what he was alleging.

[2081] And then all of a sudden he, you know, decided it was a good day to kill himself, right before the deposition he was supposed to go to.

[2082] And so, I mean, it's weird timing.

[2083] But what's going on, he's saying, is that Boeing could be construed, let's suggest, as though it's deliberately committing suicide as an organization.

[2084] It's cutting corners.

[2085] It's locked in by this ESG, DEI stuff.

[2086] The easy question is, why is there DEI?

[2087] Because ESG.

[2088] It's the S in ESG.

[2089] But little do most Americans realize, in addition to scaring the hell out of people and getting people to fly less,

[2090] China just released a new jet like two years ago called the Comac C919 that is a direct competitor to the Boeing 737.

[2091] So maybe you kill Boeing and you allow American manufacturing of high-quality aircraft to fall, and then the Chinese competitor is now the thing on the market that doesn't have this bad rap sheet and this risk factor.

[2092] Maybe it's big, dirty international business that's actually happening.

[2093] Nobody knows about the Comac because how much do we pay attention to Chinese stuff.

[2094] They literally, it launched last year for commercial production.

[2095] That seems like such a tinfoil hat you're wearing.

[2096] I know.

[2097] I know.

[2098] But the problem is, that's how ESG works.

[2099] The degrowth strategy of the West and the trap.

[2100] But someone at Boeing must know this is going on.

[2101] And why would they ever allow that to happen if they're a corporation?

[2102] And they have shareholders.

[2103] Oh, but we're exiting shareholder capitalism for stakeholder capitalism now.

[2104] In other words, to answer to the ESG cartel, they are, I mean, the Harvard document, that

[2105] Harvard corporate law document that I was talking about earlier explicitly says that your governance score can go up for giving yourself corporate bonuses for installing ESG.

[2106] So you're the CEO, you're the C -suite of Boeing, and you're like, well, my business is going to get attacked on the market.

[2107] It's going to be hard to get lines of capital through these banks unless I'm ESG compliant and I get a gigantic bonus if I'm ESG compliant.

[2108] Well, let's just be ESG compliant.

[2109] ESG compliance starts telling you

[2110] you have all of these expensive regulations that you have to go through, and you have all of these DEI social justice things, you have to install all these administrators, you have to hire commissars, you have to hire DEI officers, ESG officers, those are like six-, seven-figure jobs. So you have all this stuff. So what is it to cut corners on the cost a little bit, to pull a broken piece out of the scrap and screw it onto the back of an airplane, or to hire people who are not really, like, they don't know what an impact wrench is, but they'll figure it out on the, you know, the tail portion of a 737 in a moment.

[2111] So you hear the left saying it's corporate corner cutting, it's corporate corner cutting, that's profits over everything.

[2112] Well, what if the market that they're running in is actually controlled in this ESG sense to where they have very few options and they get to reward themselves for installing it and are punished if they don't?

[2113] And I will wear this, I will put on the biggest, let's fold a tricorn, Revolutionary War tinfoil hat, and go, Joe, let's go.

[2114] Yeah, that's what I'm looking at now.

[2115] I'm looking at one of them sailboat-looking ones.

[2116] Hell, yes.

[2117] But that would mean they're intentionally destroying a company by sabotage and by a slow infiltration of these ideas to the point where you can get them to fit inferior parts on an aircraft.

[2118] That doesn't, it seems like there's got to be inspectors, right?

[2119] So the inspectors must be watching.

[2120] That's part of the scandal.

[2121] That's what this guy that committed suicide.

[2122] That's what he was saying, is that they were not inspecting correctly.

[2123] And part of the video that went viral of him talking was that him and his team went out there and they inspected and they found all these violations.

[2124] Let's see his video.

[2125] Let's see his video, because I've only seen him speak very briefly, but I saw the story and I was like, Jesus Christ.

[2126] Yeah.

[2127] And my first initial thought was this man was so embarrassed.

[2128] by the fact that he incorrectly said that Boeing was an evil corporation, and he decided to take his own life.

[2129] Because he knew that Boeing was amazing and that he had genuinely done a terrible thing, so he decided to take his life.

[2130] Yeah, that's a plausible explanation.

[2131] That seems most likely.

[2132] Because the other possibility is they killed him.

[2133] Yeah, that's...

[2134] He's telling the truth.

[2135] Yeah, that was going to be a problem.

[2136] That's a dark story right there.

[2137] The dark story is that they killed him because if he's dead, then they make billions of dollars.

[2138] And if he's alive, he could

[2139] fuck them up and cause the stock to crash and all kinds of other problems to happen, and a lot of investigations and all kinds of other stuff, if he's right. Have you heard of this thing, degrowth, by the way? No, I haven't. Um, do we have that video of that? I want to hear it, though. I want to hear about this degrowth thing, because this is also 4D chess that scares the shit out of me. It scares the shit out of me to think that there really is a puppet master. Well, there's a, yeah, committee probably. But council, soviet means council. Yeah. But that it's actually a... But then if you think about who the actual president is, you know he's not in charge.

[2140] So, well, who is it then?

[2141] So, like, we've agreed to let a bunch of people that we're not exactly sure who they are run the country.

[2142] And once you get that sort of a system in place, they'll do whatever the fuck they can to make sure that they keep that.

[2143] Because they could just keep him in there alive for four more years.

[2144] He's going to be even crazier three years from now.

[2145] People should look up the Council for Inclusive Capitalism while they're wearing their tinfoil.

[2146] I almost want him to be president for three more years just for stand -up.

[2147] Well, there is that.

[2148] I mean, I don't know how much longer he can go.

[2149] There is.

[2150] Yeah, so let's listen to this guy.

[2151] One, this is not a 737 problem.

[2152] It's a Boeing problem.

[2153] And I know the FAA's gone in and they've done due diligence and inspections to assure that the door plugs of the 737 are installed properly and the fasteners are installed properly.

[2154] But my concern is, what's the rest of their

[2155] airplane, what's the rest of the condition of the airplane?

[2156] And the reason for my concern there is, back in 2012, Boeing started removing inspection operations off their jobs.

[2157] So it left the mechanics to buy off their own work.

[2158] So what we're seeing with the door plug blowout is what I've seen with the rest of the airplane, as far as jobs not being completed properly, inspection steps being removed, issues being ignored.

[2159] My concerns are with the 737 and the 787 because those programs have really embraced the theory that quality is overhead and non-value-added.

[2160] So those two programs have really put a strong effort into removing quality from the process.

[2161] When I first started working at Charleston, I was charged with pushing back defects to our suppliers, and what that meant was I'd take a group of inspectors and actually go to the supplier and inspect their product before they sent it in. Well, I'd taken a team of four inspectors to Spirit AeroSystems to inspect the 41 section before they sent it to Charleston, and we found 300 defects, some of them significant, that needed engineering intervention. When I returned to Charleston, my senior manager told me that we had found too many defects and he was going to take the next trip. So the next trip he went on, he took two of my inspectors, and when they got back, they were given accolades for only finding 50 defects. So I pulled that inspector aside and I said, did Spirit really clean up their act that quick? That don't sound right. And she was mad. She said no, said the two inspectors were given two hours to inspect the whole 41 section and they were kicked off the airplane. Wow. Yeah. Yeah, so there are inspectors, sort of.

[2162] Well, that sounds like a money thing, right?

[2163] They were saying that quality is overhead.

[2164] Yeah, well, that's profits.

[2165] A whistleblower statement was made in 2017, I think.

[2166] Yeah, he was, what, doing like a deposition or something the other day when he was found dead in his car in the parking lot of a hotel.

[2167] So, but you said the profit thing there.

[2168] So I mentioned the Comac C919, and that's the direct competitor, the Chinese manufacturer, or new Chinese manufacturer, to the 737.

[2169] Well, there's a Comac C929 as well, which is a direct competitor to the 777 and 787.

[2170] And the 787 is the other one that you just mentioned.

[2171] And so if we, like I said, I don't know if you've heard of degrowth.

[2172] And degrowth is actually a model that kind of can avoid being communist.

[2173] But I read this book called Marx in the Anthropocene by this Japanese Marxist named Kohei Saito.

[2174] And it's subtitled Towards the Idea of Degrowth Communism.

[2175] And it talks about how what we need to do, and this matches the Marxists of the 60s, by the way, is that capitalism, American capitalism, needs to shrink.

[2176] We produce too much stuff that nobody really needs.

[2177] So what we need to do, well, I'll just tell you what Herbert Marcuse said in the 60s, was socialism has the right ideology, but it can't produce.

[2178] So we have to figure out how to make a productive socialism.

[2179] And I'm arguing that's what happened in China.

[2180] They figured out the code.

[2181] Well, how?

[2182] By opening up a kind of Potemkin market that the government really controls.

[2183] Well, then on the other hand, he said, well, capitalism produces.

[2184] His own words were it delivers the goods.

[2185] However, it's not sustainable.

[2186] It makes too much stuff, too much junk.

[2187] And so what we need is a reduction in our standard of living, a reduction in our amount of stuff, a reduction in, you know, energy and everything in the West.

[2188] And if you could somehow figure out how to make a more sustainable capitalism, then you're off to the races.

[2189] So what I was saying earlier is that when Kissinger and Brzezinski and Deng and Chan and Rockefeller were meeting, they were erecting the idea of this productive socialism for China, for China to take off with a Potemkin contained market.

[2190] Meanwhile, eventually the West would have to degrow so that we could have a system that's not going to outstrip the world's resources.

[2191] This is at a time when Limits to Growth from the Club of Rome was really big and really hot.

[2192] Klaus Schwab platformed it at the World Economic Forum in '73.

[2193] These guys are still around. That Paul Ehrlich and his Population Bomb was like a big thing.

[2194] And so these guys were thinking along these terms, and it was how do we degrow the West?

[2195] And so what I think we're looking at is, well, there's a Chinese manufacturer that can rise while the American manufacturer shrinks.

[2196] America might not be able to make its own jets, but we can buy them from China.

[2197] And China becomes more and more secure as the manufacturer for the world.

[2198] Meanwhile, the degrowth initiative, there's this program, or this research project, that was called UK FIRES, F-I-R-E-S, like fire, right?

[2199] And this was Oxford University, Cambridge, the government, the British government.

[2200] Like, this is serious.

[2201] And so this thing that came out published in 2019 was called Absolute Zero.

[2202] It's not called Net Zero.

[2203] It's Absolute Zero.

[2204] And it says that net zero is not enough.

[2205] We are not going to solve the climate change problem if we

[2206] only go for net zero carbon emissions.

[2207] We have to go to absolute zero carbon emissions.

[2208] And so it openly says, what are the initiatives?

[2209] No new concrete production, no new steel manufacturing, no container shipping.

[2210] I mean, you can actually look on the document and see it's like zero by 2050.

[2211] But it also says no fossil fuels and no air travel by 2050.

[2212] Zero.

[2213] Absolutely zero air travel by 2050.

[2214] And so how do you get to zero air travel by 2050?

[2215] How do you create a massive reduction?

[2216] Well, what else is going on besides the Chinese market going up?

[2217] Boeing looking bad.

[2218] Media, of course, is amplifying stories that are pretty routine.

[2219] Little things go wrong with aircraft all the time.

[2220] I've taken off a few times, you know, a flap or something gets stuck.

[2221] We have to turn around and land and they have to fix it.

[2222] This is national news when it happens.

[2223] So they're creating this image that is really scary.

[2224] But what are the airlines doing at the same time?

[2225] What is the new aircraft?

[2226] Have you heard of the Boom Supersonic?

[2227] Made in Colorado?

[2228] So it's like the new Concord.

[2229] Right.

[2230] Well, you can't fly those over land.

[2231] Those are Transatlantic only, right?

[2232] So the UK FIRES thing actually says no domestic flights whatsoever, but international travel will be reserved.

[2233] Well, it turns out the Boom Supersonic is a Concord replacement.

[2234] It's really fuel efficient.

[2235] It's really well designed.

[2236] Not going to throw shade at it.

[2237] So its operating costs are approximately similar to like a 777 or a 747, right, for the same distance.

[2238] The Concord was a disaster in terms of how inefficient it was.

[2239] So now you have, by 2029, 140-something, 150, something like that, orders for the Boom Supersonic internationally.

[2240] So they're planning on flying Boom Supersonics internationally, but, see, the bigger one seats 60 and the smaller one seats 45.

[2241] Well, a 747 or a 767 might seat 360.

[2242] So that's either six or eight times as many people flying at roughly the same operation cost.

[2243] So you do the math and the tickets are going to go up by six to eight times over.

[2244] That's not a difficult calculation.

[2245] to figure out, if they want to make the same profit. Which means who's flying? People who can pay eight times as much for a plane ticket, that's who's flying.

[2246] Nobody else is flying.
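
(A minimal back-of-the-envelope sketch of the seat arithmetic described above, assuming, as stated in the conversation, a roughly equal per-flight operating cost for a roughly 360-seat widebody and for the 45- and 60-seat supersonic variants; the symbol C below is illustrative, not a figure from the episode.)

\[
\frac{C/60}{C/360} = 6, \qquad \frac{C/45}{C/360} = 8
\]

So break-even ticket prices on the smaller aircraft would have to be roughly six to eight times higher, which is the point being made here about who ends up flying.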

[2247] So what you end up doing is, for the sake of the climate, you de-grow commercial travel. That's going to kill off a ton of business, but you don't need that.

[2248] You can do it by Zoom.

[2249] Wouldn't this podcast be so much more engaging if we were on Zoom screens?

[2250] Wouldn't we be having a great time and great relationship?

[2251] Don't you, have you ever watched Zoom?

[2252] I do a ton of them.

[2253] So I watch these interviews on Zoom.

[2254] It's like five minutes in and I'm having, like, suicidal ideation, like, do I really have to watch this? I don't really have suicidal ideation, my God, I'm going to get a million things. I'm just kidding, it's a joke, it's a joke. You get to stare down the camera. Funny you have to say that now. What, dude, if you make a, like, I swear to God, if I have to watch one more Zoom, if I have to be on one more Zoom call this week, I'm going to KMS, right?

[2255] If you say that on like any social media, you start getting emails that are like suicide hotline prevention, blah, blah, blah, blah, blah.

[2256] But it used to be a thing that people just said.

[2257] Yeah, for sure.

[2258] If this movie doesn't end soon, I'm going to kill myself.

[2259] Yeah.

[2260] It's fun.

[2261] Yeah.

[2262] Like, nobody's going to kill themselves because a movie went 45 minutes too long.

[2263] But no, I think this degrowth thing is serious.

[2264] The, what's it called?

[2265] the Monthly Review, Monthly Standard, one of these, it's a socialist magazine, published this article about degrowth, and they have their drawing of what it's supposed to look like, and it's supposed to go down to this thing that Klaus Schwab talks about, called a circular economy, Bill Gates talks about a circular economy, but literally their drawing is a spiral down to this little circle in the middle.

[2266] It looks like your society going down the drain, and it's like, how do you not make fun of this?

[2267] But I think that they're very serious to try to shrink the economy.

[2268] And I have my tinfoil hat, but I can tell you why that's the strategy also.

[2269] And it's to avoid the war.

[2270] It's to avoid China rising.

[2271] This is that Thucydides trap.

[2272] We've kind of started there.

[2273] The Thucydides trap was the idea that when you have a rising power, in the case it was from Thucydides, it was Athens, going up against an existing power, which would be Sparta.

[2274] In our day, though, it's China and the United States.

[2275] If the thing rises, eventually it's going to try to get regional or in this case global dominance.

[2276] China is going to try to become, when it becomes strong enough, the global superpower.

[2277] Well, if you want access to that market, which they did, you have to open that up and China is going to rise so that you get trapped into the threat of power struggle between two very wealthy superpowers eventually.

[2278] Well, how do you avoid the war?

[2279] Simple.

[2280] You take the existing power and sunset it while the other one rises.

[2281] So the sun is no longer rising in the west.

[2282] It's now setting over America and it gets to rise over the east.

[2283] We have a century of Asia.

[2284] So we build up Chinese markets.

[2285] We diminish American markets.

[2286] And I think that the whole ESG program, which, by the way, China is exempt from, is designed to do that.

[2287] How is China exempt from that?

[2288] Because they're a developing nation in the global south.

[2289] So the policies don't apply to them because climate change is super global or something.

[2290] How are they a developing nation?

[2291] Because they keep developing?

[2292] Well, imagine what would happen if you told them, no. They are the manufacturing base for the world.

[2293] What if you said you have to like, you know, start following?

[2294] You know, decent human rights protocols.

[2295] You can't not pay people for their labor.

[2296] Maybe don't kill people, don't disappear people anymore.

[2297] And at the same time, you know, instead of building something like 300 new coal plants, which is I think what they're doing, they're building a couple of coal plants a week in China.

[2298] They're building 57 nuclear power plants.

[2299] The U.S. is taking some offline, but we're building one in Georgia right now.

[2300] So you're creating the state of energy dominance for China because you've released them from all these expensive.

[2301] protocols.

[2302] And all you hear is, when people try to start a big company that could compete in the U.S., well, let's get the, you know, manufacturing for X, Y, or Z, take it out of the hands of China, bring it back to America.

[2303] Let's unoffshore some stuff.

[2304] Bring some American manufacturing back there.

[2305] They're like, whoops, too expensive.

[2306] DEI, ESG makes it too complicated, too expensive.

[2307] Everybody complains about it in the business world.

[2308] So if you don't comply with DEI and ESG, you can't get loans?

[2309] You have a diminished access to or worse interest rates for your short -term lines of capital.

[2310] Here, Jamie pulled this up.

[2311] It says the benefits of the UN's designation extend beyond the institution itself.

[2312] For example, the World Trade Organization allows developing nations to have longer periods of time to meet various financial and trade obligations.

[2313] The World Bank provides China billions in loans, even though China's income level would otherwise make it ineligible for such financing.

[2314] Uh -huh.

[2315] And then add in just again.

[2316] Imagine the World Bank said no. China, that's it.

[2317] We're cutting you off.

[2318] What would China do?

[2319] China would say pound sand.

[2320] China's going to be like, we're huge.

[2321] Ha, ha, ha.

[2322] We're going to do what we want, probably in Mandarin, because they're going to make everybody answer in Mandarin from then on.

[2323] God damn, man. And I'm like, I'm not going to say that, I mean, we talk about the tinfoil hat.

[2324] I'm, I can't think of a cleaner explanation that this is deliberate.

[2325] I've tried really hard to think of an explanation.

[2326] other than that this is on purpose, and they all start, like, spinning wheels.

[2327] It's, like, really weird.

[2328] But the tools are there, ESG, social credit in China, the whole thing.

[2329] But I think the Boeing thing is just another piece of this same puzzle.

[2330] It's destroy the manufacturing base and the wealth of the West and hand it off to what they call the Global South and China through its Belt and Road Initiative.

[2331] God, I hope you're wrong about this one.

[2332] I spend my entire life hoping I'm wrong about everything I think.

[2333] How often do you write? I write a lot. No. Are you... what do you mean, how often are you right? How often am I right? Correct. Sorry, I'm so in my own life, in my own stupid head, I thought you meant W-R-I-T-E. No, like, no, I'm writing two books at the same time right now, I really am, one about Maoism, but, um, how woke is Mao. But at any rate, um, am I right? Let's put it this way, it's easier to identify when I'm wrong. I do overcook the books occasionally, but it's not very frequently.

[2334] There's a whole joke online, James Lindsay was right.

[2335] What have you been wrong about?

[2336] Well, the far right likes to lord it over me. I thought that they were setting up, and I'm going to totally give myself an escape hatch for this, but I thought that there was for sure going to be a violent clash between probably conservative Christians and the LGBT thing somewhere around Pride last year.

[2337] I was talking about that leading up to, you know, through the spring of 23.

[2338] And there was obviously no incident of violence.

[2339] I was particularly concerned when all those Christians went to L.A. to Dodger Stadium and they protested the weird, what were they, the Sisters of Perpetual Indulgence or whatever they called themselves, the drag queens that looked like nuns.

[2340] And I thought, well, this is going to be it, right?

[2341] So I was wrong.

[2342] I overcooked there.

[2343] They did not.

[2344] Now, here's my escape hatch.

[2345] I think that when that shooter who was trans in Nashville, was it Covington?

[2346] Is that the name of the school?

[2347] Covington School Shooter.

[2348] I think that that changed the entire calculation.

[2349] I think Covington.

[2350] Was that Florida?

[2351] Yeah, I might have this wrong.

[2352] No, it's the one that was in Nashville, though.

[2353] Right.

[2354] Right.

[2355] And there were six people shot, three kids, three teachers.

[2356] But the one where they haven't released the manifesto yet.

[2357] But Stephen Crowder ended up leaking allegedly three pages of it.

[2358] When that happened, I think the entire country had, like, a take-a-breath moment, because you had this very disturbed young person who was in the transgender universe who went on a rampage. And you very infrequently see, she was biologically female, young women going on rampages. So why in the world... was she hopped up on testosterone? Was she, you know, was the test converting to estradiol through, you know, aromatase or whatever, and driving her into, like, you know, rage? Because that's a thing, right? Why did this happen?

[2359] Or she's just so frustrated by her ideology and stuff not going her way and she decided she flipped out and was going to get revenge.

[2360] I think that changed the calculation.

[2361] I think that they were priming the situation for violence and then the violence didn't come.

[2362] So I overestimated the potential for that circumstance.

[2363] And I was wrong about that.

[2364] See if I can think of some more instances.

[2365] But that's like a bold prediction.

[2366] Yeah.

[2367] To predict violence is a bold prediction.

[2368] You can be wrong about that.

[2369] But I mean like specific things.

[2370] that you believe to be true that weren't?

[2371] Other than the fact that I thought it'd be better to live without religion than with it, in the past.

[2372] Isn't that fascinating?

[2373] Yeah.

[2374] I've had the same sort of battle in my own mind.

[2375] You know, that was a luxury belief of like 90s kids.

[2376] Yeah.

[2377] Well, it was the idea that the atheists were smart and the other people were superstitious.

[2378] Yeah, I was totally dead wrong about that.

[2379] I had TDS.

[2380] I had straight up.

[2381] I was like on the floor like Trump derangement syndrome.

[2382] That's right.

[2383] I thought Trump was the end of the world.

[2384] 2016, '17. Yeah. When did you guys come on the podcast with those fake papers? That would have been the very beginning of '19 or the very end of '18. Yeah, because it came out in October '18, and you were fast. Yeah. I fucking loved it. To this day, we've talked about the dog park one like a hundred times. That dog park paper is on another level. God damn genius. It's god damn genius. But it's so crazy that so many of the things that you talked about in these fake papers were appreciated and applauded, and it just makes you realize the lunacy of these fucking people that are supposed to be in charge of higher education, that they didn't pick up on it. This is insane. You're talking about heteronormativity in dog parks. Yeah. What the fuck are you saying? What the fuck did you study? Yeah. You know, it won an award. Dude, that's so fun. It won an award. Yeah. What was the full title of the paper? It was, um, queer human reactions to queer performativity and rape culture in urban dog parks in Portland, Oregon.

[2385] Jesus Christ.

[2386] Oh, dude.

[2387] Some of those other ones, though, like we had the one, it was called In Through the Backdoor, where we said that straight men would become more feminist and more sensitive and less transphobic if they practice putting things up their own asses.

[2388] Oh, that's right.

[2389] That was called an important contribution to knowledge.

[2390] That's still my favorite thing ever.

[2391] An important contribution to knowledge.

[2392] I just love the titles.

[2393] We had a fat bodybuilding paper, and we called it Who Are They to Judge?

[2394] Because, like, bodybuilders are huge and fat people.

[2395] are huge and who are they to judge that one big body is bad and one big body is good?

[2396] Well, there's, I don't think that that merits a scientific paper, but if you wanted to do that and people wanted to see it, I would have no problem with it.

[2397] Like, if you, if you decided that we're going to go back to, like, the days of, you know, when you see these Rubenesque women in these paintings that are obese, eating grapes, that this was considered hot because it was really difficult to get fat back then.

[2398] Yeah.

[2399] You want to go back to that?

[2400] You want to go back to that?

[2401] If that's what your choice is, I have no problem with that.

[2402] You do what you want to do.

[2403] Yeah, well, best of luck to your dating pool.

[2404] But, yeah, but the thing is, like, to study that for a scientific paper and then to submit it and then have people give it a fucking award.

[2405] Like, what?

[2406] My favorite part of that actually is in the, I mean, the title, but in the way aftermath of that, this real neuroscientist wrote this paper like, no, there's no way that anybody could actually say that's absurd.

[2407] And so we wrote a paper back and we're like, no, it's really absurd.

[2408] And then he wrote another paper.

[2409] Like, there's no basis upon which anybody could say that fat bodybuilding is an absurdity.

[2410] His name is Jeffrey Cole.

[2411] Do you remember that weird phobia that came out and it went viral like 10, 15 years ago, where he was like, they discovered a new phobia of things with, like, little holes in it all over the place? Trypophobia.

[2412] I don't know, something like that.

[2413] No, there's like honeycombs or whatever, and it, like, weirds some people out.

[2414] Oh, really?

[2415] It's the guy who discovered that.

[2416] Oh.

[2417] went off on us.

[2418] Oh, my God, that's hilarious.

[2419] So there's no way it could be absurd that fat bodybuilding?

[2420] There's no basis upon which we could conclude that it's absurd.

[2421] Some people might think it's totally normal, so it couldn't possibly be considered absurd.

[2422] Well, some people could think it's totally normal.

[2423] Well, they could imagine that someone could get to a point where they appreciated fat bodies.

[2424] And they want to see different fat bodies and like, how did you build your fat?

[2425] You know, only lard.

[2426] I ate only lard.

[2427] That's in the paper.

[2428] It says it takes a long time to build

[2429] a fat body.

[2430] It takes even longer to build a politicized fat body.

[2431] But it does take long to build a fat body.

[2432] I don't know, man. If you were interested in doing that, like if you're interested in drinking yourself to death, like, I don't think you should do it, but you're allowed to.

[2433] And it's a project.

[2434] Yeah, it's a project.

[2435] And if you decide to fat body build yourself into a state of total biological decay.

[2436] I went off my diet and I'm like, like, because I'm doing that like meat thing now, so I'm like three days of just, you know, okay, I'll eat breakfast, okay, I'll have the dessert.

[2437] And I'm like, what the, how did I gain six pounds?

[2438] Like, what the hell is this?

[2439] Yeah, you can, you can cheat and get gone pretty quick.

[2440] I'm, I'm on it 90, I'd say like 95 % of the time.

[2441] That's about me too.

[2442] But last night I cheated.

[2443] Last night, Joe DeRosa brought me a sub.

[2444] He's got this sub shop in New York City called Joey Roses.

[2445] Yeah.

[2446] And he just put in, like, I guess he's got a stand out here or something.

[2447] He's got a pop up out here.

[2448] And so he brought over, some sandwiches for the club.

[2449] It'd be hard to follow that diet in this city.

[2450] There's a lot of food here.

[2451] There is a lot of great steakhouses, though.

[2452] That's true, too.

[2453] It's not that hard to follow.

[2454] The thing is, like, that's what my body craves for the most part.

[2455] Me too.

[2456] I feel like a thousand times better.

[2457] Yeah.

[2458] Yeah, I just think for most people, high -protein diets, or they just feel better.

[2459] High-protein, and you're eating real food.

[2460] And the most important thing is real food.

[2461] Real food, that's right.

[2462] I ate a lot of eggs, a lot of meat.

[2463] Yeah, it comes out of a chicken.

[2464] It doesn't come out of a box.

[2465] Yeah.

[2466] And I just feel way better.

[2467] I've done a bunch of different diets.

[2468] I've tried a bunch of different things.

[2469] Well, they're after that too, right?

[2470] You know, like no beef consumption.

[2471] Yeah, well, that's another one.

[2472] That's in the Absolute Zero paper, too.

[2473] No beef, no lamb, zero.

[2474] Absolute zero.

[2475] Because apparently that's really bad for the environment or whatever.

[2476] And the question is, like, how are you going to get people to go along with that?

[2477] The Salt Lake Tribune just put out an article yesterday.

[2478] I made fun of it on Twitter talking about the same thing.

[2479] It's like, we need to get no meat, no dairy, and then we can have, like, better diets.

[2480] How are you going to kill all those cows?

[2481] It's up to you.

[2482] You go do it.

[2483] Jordan Peterson says that it's proof that it's earth worshiping or Gaia worship cult because they're sacrificing cows to the weather.

[2484] That is wild.

[2485] Jordan, you've got a point, brother.

[2486] That's a very good point.

[2487] Sacrificing cows to the weather.

[2488] Like in Ireland, they passed a law where they had to kill like 200,000 cows.

[2489] That was what we were talking about.

[2490] Me and Jordan were talking about.

[2491] What the fuck are you guys talking about?

[2492] Yeah, well.

[2493] You're out of your mind.

[2494] You're stopping people from making food?

[2495] Are you fucking crazy?

[2496] And meanwhile, China's making thousands, or how many coal plants do they have?

[2497] 300 something.

[2498] I don't know how many they have.

[2499] I know how many they're making.

[2500] And I've gone over and breathed the air there.

[2501] Yeah, I've been over there and breathed the air.

[2502] Like on a bad day, like on a nice day, it's a nice day.

[2503] It's the same as usual, like here.

[2504] But three days of the week, it's like Blade Runner.

[2505] It's like, what the hell's going on?

[2506] It depends on which way the wind is blowing and otherwise, like, your life is literally poison.

[2507] Jesus Christ.

[2508] Like your eyes are burning for no reason.

[2509] Here's the worst part.

[2510] So you get off the plane.

[2511] If you've been to China, I don't want to waste your time.

[2512] No. Okay, you get off the plane and immediately you can smell it.

[2513] It smells kind of like glue and dirty cardboard and petrol.

[2514] You can smell the pollution immediately.

[2515] So, but, you know, about an hour in, you can't smell it anymore.

[2516] You're used to it.

[2517] Right.

[2518] Until the first time you go take a piss and you smell it again because it's in your blood.

[2519] Oh, wow.

[2520] And you're like, oh, no. Oh, wow.

[2521] You smell the pollution and your piss?

[2522] Like asparagus?

[2523] Yeah, like the first time.

[2524] Oh, my God.

[2525] Then you just kind of like, then you become completely used to it and you don't notice it anymore.

[2526] That's the thing about olfactory senses.

[2527] Like, you become accustomed to smells.

[2528] That's why people that live in places that have like, if you go past like a slaughterhouse.

[2529] Yeah.

[2530] Like, have you ever done that?

[2531] Yeah.

[2532] That fucking smell.

[2533] You're like, how do these people live with this?

[2534] I got lost in the smokestack part of Texas one time and there are some smells on the road even.

[2535] Yeah, it used to be New Jersey when you go past the factories in Jersey.

[2536] Be like, what the fuck?

[2537] And they're just billowing smoke out to the sky.

[2538] Just billowing smoke out to the sky.

[2539] This fucking smell.

[2540] Imagine this is your town, dude.

[2541] You got to get out of here.

[2542] I'll tell you what.

[2543] It's like, that's China.

[2544] So what I said when I went over there first time, so this is kind of relevant.

[2545] This whole like ESG model, the first time I went over there, as I said, I came home and people like, well, what's it like?

[2546] And I was like, well, I looked around and it's obviously communist because you can see weird shit where like people are like fake doing fake jobs.

[2547] Like it's obvious that they just are paid an income to look

[2548] like stuff, like a dude, like, sitting on his hands and knees hitting the ground with a hammer when the boss is around, like, doing nothing. Like, I went to a bank one time when I was over there to change, like, $200 so I could have some cash, and they were like, oh yeah, the bank doesn't change money on Tuesdays. And I was like, what? And then I got bumped into by this janitor, and that's, like, I guess taboo or something, because he was way too worried about having bumped into a customer, more than I thought he should have been. Maybe I just don't know the culture. So, he bumps into me, and I know, like, 10 things in Chinese.

[2549] So it was like, méi wèntí, which means, like, no problem.

[2550] And all of a sudden, you could see, you know, they all did the little, like, you know, you're not supposed to do racial microaggressions, but they did the little face.

[2551] They're like, you know, they did the whole, like, Asian surprise face, because I spoke Chinese.

[2552] Right.

[2553] So all of a sudden, the lady behind the desk was like, oh, I just remembered.

[2554] We do change money on Tuesday because now she thought I could go tell on her in Chinese.

[2555] Whoa.

[2556] I went to, I was at Starbucks, and they wrote white man on my cup.

[2557] Báirén, on my cup.

[2558] It's so interesting.

[2559] Ireland isn't culling cows for climate, but maybe it should be.

[2560] What the fuck?

[2561] Oh my God.

[2562] What the fuck, Elon's not happening, but it should be.

[2563] It's not a true story.

[2564] It's not a true story.

[2565] It's fake.

[2566] I mean, yeah, it said it came from Elon's tweet that came from something else.

[2567] And then one looked into, oh, here you go.

[2568] Like the rumors started here.

[2569] Okay, here it goes.

[2570] The rumors of Ireland's dairy cull landed in a media and online context primed by the Dutch case for outrage.

[2571] Case in point.

[2572] Musk's comment was in response to a tweet by a right-wing provocateur about a story in the obscure Wyoming publication called Cowboy State Daily that accused Ireland's government of bovicidal intentions.

[2573] That article, in turn, cited an op -ed from the British newspaper The Telegraph, railing against Ireland's alleged mooted cow massacre and warned in apocalyptic terms of an eco -modernist agenda to do away with conventional meat, altogether.

[2574] The Telegraph did not cite its sources, but it likely drew on an article published the previous day in the Irish newspaper, The Independent.

[2575] That story reported on the internal government document discussed above, including the proposal that 195,000 cows be culled over three years at the government's expense to help achieve its ambitious climate goals.

[2576] But hold on a second.

[2577] It goes on and continues about how it would be so hard to even do it.

[2578] But that story, but hold up.

[2579] Go back there.

[2580] That story reported on the internal government document.

[2581] So what is the internal government document?

[2582] They would need to cull 65,000 cows every year in order to meet the proposed climate goals.

[2583] So they're just saying that if we, there's no way to meet these goals.

[2584] The only way to meet these goals, in terms of the impact agriculture would have, is we'd have to kill 65,000 cows a year.

[2585] So they're not saying we should do that.

[2586] Right.

[2587] But they are at least saying that's on the table.

[2588] That's what I talked to these ranchers out in New Mexico not that long ago.

[2589] And they were telling me that that's the way all the policies are.

[2590] It's that to meet whatever the new environmental standard is so that you don't get somebody breathing down your neck or maybe you don't get fines or whatever, that they're actually impossible.

[2591] He said that the only way you could meet some of these is to have no cows and no people on the land whatsoever.

[2592] And I don't know if they're actually going to like move on that.

[2593] But that's, this is what I'm talking about, because it's not not in the UK FIRES

[2594] Absolute Zero document.

[2595] It's 100 % in there that this says no beef, no lamb at all.

[2596] So those have got to go.

[2597] By 20, I guess 50, there will be zero consumption of beef and lamb under the ambitious net zero, or Absolute Zero, I should say, climate plan.

[2598] By 2050, zero.

[2599] And so do they plan on making cows extinct?

[2600] Do they plan on keeping a breeding population that you could fucking just keep the species alive with?

[2601] What the fuck are they going to do?

[2602] I don't know, but they talk a lot about the emissions of those, but then they also say that when there was the massacre of the bison, that that was really bad.

[2603] And bison make a lot of emissions, but there was no, like, climate emergency from all the bison.

[2604] So, I mean, I don't know.

[2605] Well, the climate science is also a religion.

[2606] If you have anyone that goes over the actual data and differs with what the narrative is, that person is a crazy person and a climate denier.

[2607] A denier, that's right.

[2608] You can't even have discussions about the actual, like the real numbers.

[2609] You can't talk about the real history of the climate of Earth.

[2610] You can't talk about the dangers of global cooling.

[2611] If you just talk about the dangers of global cooling, you're a climate denier.

[2612] Yeah, you can't talk about whether we're in a natural warming cycle or if it's got, you know, or whatever.

[2613] Not saying that we are, I don't know, but you can't talk about it.

[2614] This is one fact, for sure.

[2615] We know 100%.

[2616] The temperature of Earth has never been static.

[2617] That's right.

[2618] It goes up and down.

[2619] Ever, ever, ever.

[2620] When they do those core samples and they go back thousands and thousands and thousands of years, it's never been static.

[2621] It's always been all over the fucking place.

[2622] And there's a bunch of variables that cause it to change.

[2623] Yeah, that's right.

[2624] They know that.

[2625] And they know that humans are having some impact.

[2626] We're having some impact.

[2627] What is the impact?

[2628] And how much of it should we throw the fucking society that we all live in into the gutter?

[2629] to try to fix.

[2630] Right.

[2631] Or hand over all of the power to a handful of unelected, yeah, unelected dictators.

[2632] Yeah.

[2633] These so -called stakeholders.

[2634] Like, why does Bill Gates know more about all of this than everybody?

[2635] Like, I get it.

[2636] He built Microsoft.

[2637] Like, he can do something.

[2638] He knows something.

[2639] Like, I'm not going to take that away from him, but, like, why is he, like, the god of vaccines and climate and, like, every other thing?

[2640] Because he built a fucking computer.

[2641] It's very weird.

[2642] It's all very weird because it's just, like, you don't want to think it's that on the nose.

[2643] You don't want to think it's like that on the nose that they're engineering the demise of freedom.

[2644] I actually get hopeful when I think that they are.

[2645] I'm much more afraid of it being just some random organic shit going off the rails than it is that there's some number of people who could be identified as criminals.

[2646] I worry about that.

[2647] I worry about it.

[2648] There is a lot of it is a random thing that just happens with human beings that are tribally opposed to each other.

[2649] Then maybe too wealthy or whatever.

[2650] And there's a lot of that, a lot of free time and a lot of easy.

[2651] living, and then it all just wraps up, like everything does. Like, nothing stays, like, this is a good way to behave. That's where religion comes in, because religion does tell you this is a good way to behave and these are the tenets that you should live by, and it's not like this thing that you should be escalating and pushing further and further. Yeah, it's like, not to get all, like, churchy, but I have been. I seriously, when I said earlier that I've been looking at the Bible a lot, looking at the Gospels, not just particularly but especially, I was reading the Gospel of Matthew the other day, the seventh chapter.

[2652] And I bet you never thought you were going to have this conversation with me. But I was reading it, and I'm reading about the, you know, the narrow, the way is narrow, the straight and narrow way, where he has that in Matthew 7, it's like, you know, wide is the path that leads to destruction, but the way that leads to life is straight and narrow.

[2653] And it's like, well, what is that talking about?

[2654] It's like you have to live well.

[2655] You have to treat each other well.

[2656] It's like you have to also repent when you mess up.

[2657] And people don't like to do that.

[2658] You can't just go along with the crowd because the crowd is going and some direction.

[2659] That's the wide path, and that leads to destruction.

[2660] But the way is straight and narrow.

[2661] When he says strait, it's not, like, straight; it's strait like a waterway.

[2662] Like, so that means that the edges are like right there.

[2663] And if you don't run the boat just right, you know, you're going to crash into the sides.

[2664] And it's a weird kind of pun or whatever in English, but it turns out that the word that he used in Greek, what it means is a narrow waterway.

[2665] And so it's like, you got to, it's very important that we, you know, we live like that.

[2666] And so what does religion do? Well, religion teaches people to, like, at least contemplate this crap. Like, why don't you stop for five minutes of your week and think, maybe there's some, like, ways to be a good person, right? And if you don't have a structure for that, then it's dependent entirely upon the ideology that you subscribe to, right? If it's an out-of-control ideology that may very well be controlled by foreign powers, right, they're using it to disrupt this country, and then, like, you've got to... it's a crazy thing to think, but that might really be what's going on.

[2667] I think it is.

[2668] I mean, I'll put the hat back on or whatever, but I'm not afraid of the radio waves is the problem.

[2669] I've got to put it like something else on.

[2670] Tinfoil hat's fine, though.

[2671] It suits its purpose.

[2672] Yeah, it suits the purpose.

[2673] It's seriously.

[2674] But then even that gets infiltrated.

[2675] So people have got to take it really seriously.

[2676] I like to tell people, this is my little like bit, right?

[2677] So I'll waste one of my bits.

[2678] But I tell people it's like, here's how communism is.

[2679] This is how freaking seductive it is.

[2680] So in again Matthew chapter 10, Jesus is talking and he says that I send you out and you have to be wise as serpents and gentle as doves.

[2681] It's a very famous, Matthew 10:16, it's a very famous verse.

[2682] So you have to be wise and wary like a snake, right, testing, the tongue testing, the air, knowing where you're going.

[2683] If I'm going to send you out into the world, you've got to be wise and judicious and discerning, but you also have to be gentle, right?

[2684] And so what do the communists do?

[2685] This is how subtle they are, Joe.

[2686] They come along and say, did you hear that?

[2687] Jesus said, be gentle.

[2688] And they leave out the other part.

[2689] And you're like, yeah, he did.

[2690] And then you got a bunch of, like, weak namby-pamby pastors who think it's about being winsome and being cool for their congregation and, like, not standing up for the truth any longer.

[2691] And that's bad.

[2692] And then what happens is stuff starts to go shitty.

[2693] And then you have Mussolini come along.

[2694] The fascist guy comes along.

[2695] And he's like, the problem is being gentle.

[2696] But no, the problem was that you're not being discerning and wise anymore.

[2697] So what happens is the communists take away half the commandment to suck you in and marry a truth to a lie or whatever.

[2698] And then the fascists overreact by throwing the principle out entirely.

[2699] But if people were grounded in their faiths and taking it seriously, they would realize, no, no, no, no, I have to be kind and gentle, but I also have to be wise as a serpent.

[2700] When the serpent's in danger, it doesn't hesitate to strike.

[2701] but it's only going to do that when it's in danger.

[2702] So it's like, that's crafty, man. Right.

[2703] And then look at, like, that's crafty to an adult.

[2704] Now imagine then when they do that to like a five -year -old, like with the stuff in schools or whatever.

[2705] Right.

[2706] Like, oh, well, people who look like you had a long history of causing a lot of problems in this country.

[2707] And they didn't say you're a bad kid to the kid, but they said people who look like you or your ancestors.

[2708] Speaking of my ancestors, we'll let the world know because my reparations bill will go through the roof.

[2709] Found out recently.

[2710] I'm seventh cousins with Robert E. Lee.

[2711] Whoa.

[2712] Yeah, like some family's doing the genealogy, and so now my reparation bill just went up to, like, Elon Musk level.

[2713] Wow.

[2714] That's crazy.

[2715] Yeah, I didn't know the guy.

[2716] Oh, of course.

[2717] Never met him.

[2718] Of course.

[2719] Which is like we're not mad at Klaus Schwab for his dad.

[2720] No. We're mad at Klaus Schwab for other reasons.

[2721] Yeah, it wasn't his choice.

[2722] It wasn't your choice that Robert E. Lee was how many generations?

[2723] Seven.

[2724] Seven.

[2725] Yeah.

[2726] Apparently, my line and he had a common grandfather, is the way that the seventh-cousin math works out.

[2727] It is kind of wild to imagine that just a couple hundred years ago, less.

[2728] The United States was involved in a war with itself.

[2729] Yeah, that's right.

[2730] Like, just...

[2731] That's how nuts we are.

[2732] We're so fucking nuts.

[2733] We'll fight each other.

[2734] Well, freedom's important.

[2735] Yeah.

[2736] Yeah.

[2737] I mean, liberty or death.

[2738] Yeah.

[2739] And that was the ultimate one, right?

[2740] Yeah.

[2741] I would think about that all the time now, man. Like, give me liberty or give me death.

[2742] I used to think when I was a kid, like, that's crazy.

[2743] That sounds ridiculous because there's so much.

[2744] No real issues when you were a kid.

[2745] But now it's like, no, that's legit.

[2746] I'm in.

[2747] Yeah, they said it for a reason because back then it was a whole different ball game they were playing.

[2748] Imagine someone trying to start a new country today.

[2749] Yeah, you're going to have a hard time getting off the ground.

[2750] Good fucking luck.

[2751] You think it's hard to get a DEI loan?

[2752] Yeah, no kidding. To start a manufacturing corporation in America. Imagine trying to start a country. Yeah, you better be compliant with, like, everything, or you're, like, some crazy rogue state or whatever. Right, imagine if, like, Iceland was for sale, or some country, Greenland. Like Greenland, you could buy Greenland. Imagine you bought Greenland. You're like, we're just gonna fucking let people be cool. Yeah, really cool, actually. Let people have a good time. Yeah. Well, that's a good spot to buy if the global warming fanatics are right.

[2753] Yeah, well, they're correct.

[2754] Greenland's the spot.

[2755] So are they buying up that property?

[2756] No. Hmm.

[2757] They're also, the thing is that a lot of these people that are pushing all this climate change agenda have homes on the beach.

[2758] Yeah.

[2759] And they're not getting rid of those.

[2760] And by the way, the shoreline hasn't changed.

[2761] Yeah, Plymouth Rock is like just barely above the surface.

[2762] You can still see where they wrote 1620 on it.

[2763] Look, the shoreline has changed throughout human history.

[2764] We know that, folks.

[2765] And you know when it changes the most? When there's a fucking ice

[2766] age.

[2767] Uh -huh.

[2768] That's the scary shit kids.

[2769] Yeah, and there's these weird things because it doesn't like change tomorrow.

[2770] It's not like that stupid movie.

[2771] It's like you can build these things called sea walls.

[2772] But it's also like every single thing that happens is being used as a device to control people.

[2773] And the fact that some people are reluctant to see that is very disturbing to me. It's like, hey, guys, something's happening here.

[2774] And same formula every time.

[2775] It could be that we're all going to have a better future, but there's these deniers that won't come along with us, so hate them.

[2776] Yeah.

[2777] Because it would be great.

[2778] And you're sacrificing, you're riding a bike to work instead of driving your car.

[2779] You're a good person.

[2780] And those assholes with their truck, you know, big diesel truck are ruining the planet.

[2781] I was watching this lady talk about this.

[2782] She was talking about how she loves having an electric car because she knows that it means I'm being a good person.

[2783] Yeah, that's right.

[2784] Contributing to the environment.

[2785] I just saw this thing that said that the environmental impact of electric

[2786] cars is actually worse overall than the environmental impact of a traditional combustion engine.

[2787] Is that true?

[2788] Because that sounds crazy.

[2789] I read the same thing as you, so my depth of knowing that it's true is equal to yours.

[2790] Just in all fairness, I drove here in an electric car.

[2791] I drive an electric car all the time.

[2792] Do you?

[2793] Yeah, I have a Tesla.

[2794] It's awesome.

[2795] Oh, okay.

[2796] Fucking rules.

[2797] I've seen a couple of the cyber trucks.

[2798] Do you got a cyber truck?

[2799] No, I have the Model S. Oh, yeah, those are fun.

[2800] It's great.

[2801] It's so comfortable.

[2802] It's easy.

[2803] It's fast as shit.

[2804] It's ridiculous.

[2805] It's like it makes other cars feel stupid.

[2806] They feel dumb because they don't move like that thing.

[2807] The thing moves like it's like teleporting.

[2808] It's bizarre.

[2809] It's bizarre what it can do.

[2810] It's easy to drive.

[2811] I don't like the fact the horn's not in the middle.

[2812] This is probably what we saw.

[2813] Electric vehicles release more toxic emissions, are worse for the environment than gas-powered cars: study.

[2814] This is in the New York Post.

[2815] And it says, it's amazing that they didn't ban this story.

[2816] Yeah, right?

[2817] From the New York Post.

[2818] Remember when they did that with Twitter with the Hunter Biden laptop?

[2819] How wild is that?

[2820] Electric vehicles release more toxic particles into the atmosphere and are worse for the environment than their gas -powered counterparts, according to a resurface study.

[2821] The study, published by emissions data firm Emissions Analytics, was released in 2022, but has attracted a wave of attention this week by being cited in a Wall Street Journal op-ed on Sunday.

[2822] It found brakes and tires on EVs released 1,850 times more particle pollution compared to modern tailpipes, which have efficient exhaust filters, bringing gas-powered vehicles' emissions to new lows.

[2823] Today, most vehicle -related pollution comes from tire wear.

[2824] Whoa.

[2825] As heavy cars drive on light-duty tires, most often made with synthetic rubber made from crude oil and other fillers and additives, they deteriorate and release harmful chemicals into the air, according to Emissions Analytics.

[2826] I do know they're heavier, and they wear on the roads faster.

[2827] Wow.

[2828] Because EVs are an average of 30 % heavier, brakes and tires in the battery -powered cars wear out faster than on standard cars.

[2829] Emissions Analytics found that tire wear emissions on half a metric ton of battery weight in an EV are more than 400 times as great as direct exhaust particulate emissions.

[2830] For reference, half a metric ton is equivalent to roughly

[2831] 1,100 pounds.
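
(A quick back-of-the-envelope check on that quoted conversion, my own arithmetic rather than anything stated in the article or the conversation: half a metric ton is 500 kg, and at about 2.2046 pounds per kilogram that comes out near 1,100 pounds, so the figure is consistent.)

\[
0.5\ \text{t} = 500\ \text{kg}, \qquad 500\ \text{kg} \times 2.2046\ \tfrac{\text{lb}}{\text{kg}} \approx 1102\ \text{lb} \approx 1{,}100\ \text{lb}
\]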

[2832] That's something that someone had told me a long time ago about cities.

[2833] The thing about the pollution is it's not just the emissions.

[2834] It's brake dust.

[2835] Yeah, yeah.

[2836] You're breathing in brake dust.

[2837] Because if you've ever, like, touched your car, like your wheels after you drive it for a while, when you're cleaning your car, you get brake dust.

[2838] It's all over the inside.

[2839] That goes out in the air.

[2840] Oh, yeah, totally.

[2841] And that's fucking... you know what doesn't do that?

[2842] On carbon brakes, you have, um, like, uh, those, uh, what is it, carbon ceramic disc brakes. Uh, they don't seem to do that. Are they, like, more environmentally friendly? Are carbon ceramic disc brakes more environmentally friendly than regular? Because your wheels don't get all fucked up like that, you don't get brake dust all over your wheels, like, nasty black. No, no, it's interesting. It's more expensive, and they put them on, like, high-performance cars. Yeah, but is it more environmentally friendly? Because it seems like it would be if you're not getting the brake dust.

[2843] I'm like, where's it going?

[2844] Is it just not making dust because it's a carbon fiber pad and then the brake?

[2845] So does it just work without making dust?

[2846] Does that even make sense?

[2847] What I read about these EVs besides getting their materials to make the batteries is that they don't, they're not like reusable.

[2848] There's like no used EV market.

[2849] Right.

[2850] Like nobody wants to buy a used one and then replacing the batteries if they, you know, wear out is a disaster.

[2851] It's very expensive.

[2852] Yeah.

[2853] Yeah, just close to the price of the car itself sometimes is what I've heard.

[2854] I don't know.

[2855] Really?

[2856] Yeah, they can be extremely expensive.

[2857] So there's no, like, there's zero aftermarket.

[2858] So, like, where do they go?

[2859] Do they have, like, electric car graveyards, like, with the windmill blades, where they just kind of bury them in the dirt?

[2860] Like, I don't know what happens to them.

[2861] There's a significant reduction in brake dust compared to metal blend pads.

[2862] Significant reduction.

[2863] But they are way more expensive, aren't they?

[2864] Yes, they're way more expensive.

[2865] Yeah, it's like everybody's got to ride around on, like, expensive

[2866] Porsche brakes.

[2867] Yeah, but I mean, if you think about all the other things that we do for the environment, if carbon ceramic brakes are a possibility, like how much more expensive?

[2868] Does it make a car $500 more?

[2869] Like, is there a way that they can produce them in mass?

[2870] Is there a reason why they haven't done that?

[2871] I mean, that seems to be alone, a solution, at least for electric cars.

[2872] If you'd say, you're spending the money to get a Tesla, they're fucking expensive already.

[2873] If someone's going to spend $120,000 on a car, won't you spend $122,000

[2874] and get carbon ceramic brakes that won't pollute the atmosphere nearly as much?

[2875] Yeah, well, it would seem to stand to reason.

[2876] Yeah, that's a wild statistic, but that lady was not aware of that.

[2877] She's like, I'm doing really amazing.

[2878] It's important to this quick thing I just pulled up.

[2879] It says it almost takes a month to make each one.

[2880] Holy shit.

[2881] Whoa.

[2882] That's probably why.

[2883] That's a lot of investment to build.

[2884] Well, ceramics are complicated if they're high tech.

[2885] Wow.

[2886] Holy shit.

[2887] An average of, I don't know, $10,000 per brake.

[2888] Did that sound right?

[2889] Whoa.

[2890] I don't know if that's right, but it's just, yeah, you're looking somewhere in the $10 ,000 range for a set of rotors.

[2891] Wow.

[2892] Holy shit.

[2893] Maybe it's a little bit more than that.

[2894] Yeah.

[2895] Damn.

[2896] Yeah.

[2897] But isn't there another way?

[2898] If they have carbon ceramics and they're doing it for that, isn't there some other kind of compound that they could do that's comparable?

[2899] Doesn't it seem like someone should be able to figure that out, if that's literally the source of our major form of pollution?

[2900] I bet they're trying to figure that out.

[2901] Yeah, what am I retarded?

[2902] Yeah, some guy's definitely trying to figure that out.

[2903] What the fuck is wrong with me?

[2904] I'm like, why doesn't anybody figure it out?

[2905] A lot of R &D.

[2906] But like, it still doesn't, like, answer their question.

[2907] If people want to drive an electric vehicle, like, okay, fine, who cares?

[2908] Right.

[2909] But it's like, why do we have to get rid of gas ones if they, if the emissions are negligible compared to their brakes, whatever the brakes happen to be?

[2910] Right.

[2911] So if their brakes are most of the pollution and the emissions are like basically nothing, I think that emissions is one of those words, that they just say it and then everybody has to do what they say because they said emissions.

[2912] Think about the emissions.

[2913] Think about the emissions.

[2914] Right.

[2915] And they're not taking into account brake dust.

[2916] Yeah, there's so much else that's going on.

[2917] It seems just a little bit fake.

[2918] Well, it's definitely fake.

[2919] If that's true.

[2920] If that's true, that's something that really.

[2921] But the scary thing is, and they say, then we must take all cars off the road.

[2922] And everyone stays in a 15-minute city and bicycles everywhere.

[2923] It's good for you.

[2924] So Buttigieg said a year or two ago that their goal was by 2030 to get to net zero.

[2925] That's the buzzword, automobile deaths.

[2926] How do you get to zero automobile deaths, Pete?

[2927] Stop people driving cars.

[2928] That's the best way.

[2929] Yeah, basically it.

[2930] Like, turns out that stuff happens.

[2931] Good Lord.

[2932] Yeah, so it's like...

[2933] Good Lord, James Lindsay.

[2934] Don't come here with good thoughts and tidings for the future.

[2935] I am the most optimistic person in this stupid culture war, Joe.

[2936] You're the most optimistic person that knows what you know.

[2937] Well, okay, that's fair.

[2938] Yeah.

[2939] No, I actually think, like, I see these guys bungling so much.

[2940] Like, Joe Biden's a bungler.

[2941] He is a bungler.

[2942] Yeah.

[2943] Like, I got asked at this Christian event one time, this person's, like, wailing, and they're like, you know, if God is real, it's almost like, if God is real, why do we have to have Joe Biden?

[2944] And, like, the only answer I could think of on the spot was because people have to be able to see, like, dude's pulling the curtain

[2945] back for an awful... like, what the hell is going on? Having that guy as president is fascinating. And when they expose, like, when Karine Jean-Pierre, however you say her name, yeah, when she tweeted accidentally from her own account as Joe Biden, it's like, oh, look, how about that. And when you see that lady who, when she's the White House press secretary, answering questions, it's so ridiculous. It is. Imagine, imagine that that's the person that's pulling strings. And then it's like, they invite, like, well, what was that guy that was stealing women's luggage?

[2946] Yeah, Sam Brinton.

[2947] Like the whole administration.

[2948] Yeah, they're out of their fucking minds.

[2949] Like, you know, whatever shade is deserved and no more, but we got the Admiral Levine, and we just see the pictures and you're like, what the hell?

[2950] And she's in charge of health.

[2951] He.

[2952] He's in charge of health.

[2953] We'll be in trouble for that, but.

[2954] Whatever.

[2955] That person, that crazy person, she's in charge of health.

[2956] That unhealthy looking person.

[2957] Yeah.

[2958] Is in charge of health.

[2959] Yeah, for, I mean.

[2960] Whoa.

[2961] Hey.

[2962] Hey, maybe it is a problem.

[2963] And China must be laughing.

[2964] Oh, I look, I kind of admire their long game.

[2965] I think it's very impressive.

[2966] Well, it is.

[2967] You know, listen, I am not Chinese, but if I was in China, I would be proud of what my country's doing to America.

[2968] A generational strategy.

[2969] I think they're killing it.

[2970] My experience on the ground in China is that roughly half are, like, with what's going on, not with it being against America, but with that system.

[2971] And roughly half would very quietly whisper when I was there, do everything you can.

[2972] Because if we lose America, we lose everything.

[2973] God.

[2974] So there's a sizable portion of Chinese that know that they can bug out to America, but if America goes, there's nowhere to bug out to.

[2975] Isn't that wild, that wearing a Make America Great Again hat can get you punched?

[2976] Yeah.

[2977] That's how, and it happens to be red, which seems at least slightly symbolic.

[2978] Yeah.

[2979] The whole thing is bananas.

[2980] We are in like the pinnacle of bananas time.

[2981] James Lindsay, I'm very, very glad you're out there.

[2982] I'm glad that you know as much as you know and you can talk about these things the way you do, and that you have a personality that seems to enjoy some of this conflict.

[2983] Well, I like a little bit, and I like the absurdity.

[2984] I'm not going to lie.

[2985] I think at the end of the day, it's easy to remember.

[2986] This is all really funny.

[2987] It is very funny unless it's tearing your life apart.

[2988] And then it's not so funny for you.

[2989] But it's human, the human folly of it all at scale, at the scale that we're witnessing is kind of amazing.

[2990] It's tremendously amazing.

[2991] It's also kind of amazing when we know as much as we know about human nature.

[2992] You know, we know as much as we know about the benefits of hard work and work ethic and discipline and all of these things that we've always praised people for in the past, and they are now being dismissed as racist or sexist or Islamophobic or whatever the fuck it is.

[2993] White supremacy culture.

[2994] Whatever the fuck they can label it with.

[2995] It's like they're trying to diminish strength through a very obvious sort of ideological scheme.

[2996] And it's weird.

[2997] It's weird to watch.

[2998] It's weird to watch human folly play out like that.

[2999] And so many people accept it and adopt it.

[3000] Yeah.

[3001] It's a fun project.

[3002] Is that what the Bible was talking about when they said the meek shall inherit the earth?

[3003] Yeah, it might be.

[3004] It was definitely what the Bible is talking about where one prophet after another stands up in the Old Testament.

[3005] And was like, listen, you screwheads, you're way off the track.

[3006] And if you don't get in line, God's going to punish you.

[3007] And so what did they do in almost every case, not quite every case?

[3008] They go after the prophet, right?

[3009] Like the prophets didn't have a nice, easy ride.

[3010] Maybe a couple of exceptions to that.

[3011] But the prophets got, you know... were like, hey, guys, we got to get back to, you know, living the correct way.

[3012] And they bullied the prophet instead.

[3013] So it feels kind of like living in Bible stories sometimes.

[3014] It does.

[3015] I feel like if we were on Spotify, I would ask you to.

[3016] queue up Johnny Cash, God's Gonna Cut You Down.

[3017] So can we just play that just for the Spotify people and say goodbye to the YouTube people?

[3018] We can't do that?

[3019] Not really?

[3020] All right.

[3021] I'll listen to it when I get out of here.

[3022] You should too.

[3023] What was that?

[3024] I just want to play it for everybody else.

[3025] Who's everybody else?

[3026] I'll play it for us.

[3027] Just for us?

[3028] All right, let me hear a little bit of it.

[3029] I didn't know Chris Rock was in that video.

[3030] All right, we'll edit that out.

[3031] Yeah. Uh, hey, thank you. Appreciate you. Thanks for being here, man. And thanks, thanks for having so much information that you can just give people a roadmap that I really don't think is available in a lot of places. Well, appreciate that. Thank you very much. All right, bye, everybody.