The Joe Rogan Experience XX
[0] Joe Rogan podcast, check it out. The Joe Rogan Experience. Train by day, Joe Rogan podcast by night, all day. Hello, good to see you.
[1] Hi.
[2] What's cracking?
[3] Thank you for having me back.
[4] My pleasure.
[5] It's been quite a while.
[6] It's so nice to see you.
[7] I know the first time I did your show, you were still in L.A., which was five years ago, if you can believe it.
[8] Is it really five years?
[9] Wow.
[10] Time flies.
[11] And then the last time I saw you was over Skype when my book came out, The End of Gender.
[12] That was 2020 before you came here.
[13] That's right.
[14] So how you been?
[15] I've been good.
[16] What are you working on?
[17] I love Austin.
[18] So I'm working on a new project, and I'm not ready to announce exactly what it is yet.
[19] But I'm here to get your audience's feedback because I love hearing from people, and I want to know what's going on in people's lives and what their experiences are like.
[20] And I'd love to hear your perspective, too, especially because you are the father of three girls.
[21] A lot of these issues I'm talking about have to do with young people, especially Generation Z. And I guess I want to start by, unless you want to, I have a whole list of things I want to talk about.
[22] Okay, no, let's go right into it.
[23] Okay, so I know you appreciate all of the craziness that's been happening in academia in terms of the... I watched Yuri Bezmenov.
[24] I know you talk a lot about him, the defector.
[25] And...
[26] I feel like he must be psychic or something, because what he describes in terms of ideological subversion, how to control basically an entire generation by going through academics and teaching them at a young age how to think, or more specifically what to think, and how to then get people to a place where they can't agree on what objective reality is.
[27] That's essentially what's happening on university campuses.
[28] So for myself, having come through academia, finishing my PhD, realizing that I wouldn't be able to stay there and do legitimate research anymore.
[29] And so I have to come on the – well, I have to come on the biggest podcast in the world.
[30] I'm very excited and grateful to be here.
[31] But I'm like, why don't I just do my own research and do it this way and not have to be constrained by a particular way of thinking, or knowing I have to find certain things with my research in order for it to be published.
[32] And if it does get published and it goes against what activists want you to say, it's going to get pulled.
[33] Your funding, it's going to get pulled.
[34] You're going to end up homeless anyway.
[35] So might as well just go straight from A to B. The Yuri Bezmenov interview from, I think it was '84.
[36] Yeah.
[37] Is so crazy because back then, I'm sure people were like, come on, this is nonsense.
[38] Yeah.
[39] But if you look at it now in 2023, like, he was telling the truth.
[40] He has to have been telling the truth.
[41] And that the Soviet Union had planned this out for generations and that they knew that this was going to be a 10, 20, 30-year project.
[42] And it's been successful.
[43] And the fact that these academics, these people that we count on to think for us, we count on to be – you know, the shining light of intellectual discourse.
[44] They have completely fallen captive to everything that he described, and they are sending that out into the world, this mind virus, and essentially eroding all of our confidence in government, all our confidence in civil discourse, all of our confidence in our ability to get along and to understand what objective reality is. It's crazy that what he was saying is exactly what happened.
[45] It's pretty brilliant, though.
[46] I have to say, the way that this has turned out for ideological people, if you're pushing that agenda and you've managed to convince an entire generation. Because he was saying in the interview, it's going to take 15, 20 years at least to clear out that way of thinking. Because when you go to university, and I can definitely relate to that, when you are young and you're impressionable, you don't stop and think, maybe my professor is lying to me, or maybe my professor is trying to trick me into thinking in a very biased way, in a particular way, because they want to see certain changes in the world, and this is how they've chosen to do it.
[47] Yeah, I guess they probably didn't know back in the 1980s that this is how it would unfold in terms of the internet and social media being so powerful in disseminating a particular way of thinking.
[48] But it's been very, very effective.
[49] And I think especially when you look at mental health, I know you're a big fan of Jonathan Haidt's work.
[50] I love his work as well.
[51] You look at young people and how much they're struggling nowadays and social media is a big part of that.
[52] So this new project I'm working on is really interested in looking at how do we get to this point?
[53] You know, how is technology influencing our lives and particularly our sex and dating lives?
[54] Because I do think social media, apps, what else like filters I know you've talked about, just everything in terms of how men and women relate to one another is almost completely upside down compared to, I would say, even a decade ago.
[55] So one thing that really stood out to me in recent months is this adoption of something called a positionality statement.
[56] I don't know if you've heard of this before.
[57] No. So this is a statement now that researchers, including scientists, have to include in their research papers, and it basically discusses their race, their sex, gender identity, sexual orientation, socioeconomic status, all of their personal characteristics, traits, like basically identity markers.
[58] as a way to, I guess, apologize if they're white, which is something white people should not have to do.
[59] Especially if you're in the sciences, you shouldn't have to talk about your personal life because it has no bearing on what you're doing or what you find.
[60] It is so...
[61] cringeworthy.
[62] When I read these statements, it honestly feels like I'm reading a hostage statement.
[63] I feel so sad for these people, because especially if they're white, they're like, this one guy, this poor kid.
[64] So usually it's in the context of a scientific paper that's being published.
[65] You know, the editor will say, we are now adopting this.
[66] You have to include this as part of your submission.
[67] And if you don't include it, they're probably not even going to, they're probably just going to throw your research in the trash when you submit it.
[68] But if you are a student, like you're doing a PhD or a master's, Some schools will force you to include this as part of your dissertation or thesis.
[69] So I can't imagine when you go to defend at the end when you're finished all your research and you're basically presenting it, you're being interrogated, having to add this cringeworthy statement.
[70] This one I read, it was this poor kid.
[71] He was saying, you know, I'm white.
[72] I'm from the U.S. I have privilege in these ways.
[73] They went to, I think, Africa to do their research there.
[74] So he was talking about how he might have been privileged over the people he was studying.
[75] And then he went on to describe the characteristics of his research partner, who is also white, but he's saying she is a white woman who's from here and she had this, you know, socioeconomic status.
[76] And I just thought this is so irrelevant and it's also so uncomfortable that you would have to talk about your colleague in this way.
[77] And I just can't believe that there are people who are actually on board with this.
[78] So to me, Science is completely lost.
[79] I don't know how it's going to come back from this because I think the public is already extremely skeptical as they should be.
[80] And so for me now, I'm like, all right, I guess I'm going rogue and just doing my own thing.
[81] Well, a lot of journalists have found themselves in that same position too, right?
[82] A lot of people have just gone to substack, people that were working for the New York Times and all these, you know, before very reputable institutions.
[83] And now they find themselves ideologically homeless.
[84] Mm-hmm.
[85] Definitely.
[86] I mean, so I'm a journalist now too, and I found the same thing, that, you know, there were some left-leaning media platforms that I could write for very easily before.
[87] When I was last on your show, I just want to clarify, because I know a number of people were quite upset with me when I said I was
[88] a liberal and I'm also pro-science.
[89] I'm definitely not saying that because someone is politically liberal or left-leaning that they are pro-science, because definitely, I think over the last three, four years, we've seen what being pro-science can mean. So I think you can deny science regardless of where you are on the political spectrum.
[90] And I definitely, my work for the most part calls out what I'd say is left-leaning science now, because I think liberals are getting away with it the most right now because they have such a chokehold on the culture and our institutions more broadly.
[91] Well, the really disturbing thing is that they've abandoned the idea of what science is supposed to be.
[92] It's supposed to be objectively assessing data and looking at numbers and looking at measurements and looking at the reality of the situation and not putting everything through an ideological lens.
[93] The fact that they're doing that first and that they have to list all their biases and their privileges and all this different shit and their gender and all that craziness.
[94] is just like they're essentially saying that that's more important than the objective truth.
[95] That's more important than the facts and the measurements and the data.
[96] Also that they have such difficulty in calling out anti -Semitism, because I was watching your podcast from yesterday, I think it was.
[97] And just, I'm just appalled.
[98] I'm appalled at the fact that you're in an environment where, like, there are students who are literally afraid they're locking themselves in the library to try and be safe.
[99] This is not acceptable to me. And the fact that you have speech codes that are designed to protect other groups, presumably, but groups that are doing well in society, like Jewish and Asian people,
[100] are considered white now, so it's okay to discriminate against us, as you see with affirmative action and as you see what's happening right now in terms of the Congress hearing.
[101] So I just don't know how academia is going to come back.
[102] At this point, I really have just said, okay, well, I think we're going to have to wait another 15, 20 years, and hopefully alternatives will be built up.
[103] And hopefully the public will see the value in science and education again one day.
[104] Yeah, it's weird, right?
[105] The Congress hearings, they're so strange.
[106] It's so strange seeing these presidents of major universities that aren't just unwilling to call out anti -Semitism.
[107] They don't seem to have a problem with what these people are saying.
[108] And they seem to be trying to find some way to either apologize for it or just dismiss it.
[109] It's like saying that it has to be actionable when you're literally calling for death to the Jews.
[110] Depends on the context.
[111] It's insane.
[112] What kind of context?
[113] Is that okay?
[114] I mean, when?
[115] I mean, when people in, you know, when I was in high school and we would look back on the Holocaust, we'd look back on World War II when you study it in school.
[116] It was always confusing.
[117] Like, how did this happen?
[118] How did anyone just decide that one group of people is okay to exterminate?
[119] And these were, like, modern, anatomically similar human beings to the people that are living right now.
[120] And in the 80s, when I was in high school, when we were thinking about this,
[121] it was only 40 years earlier, which is so crazy.
[122] So it's literally like we were talking today about something that happened in the 90s, which is so crazy.
[123] It's so hard to imagine. Like, something from around 1989, or let's say when I was in high school, I graduated in 85.
[124] Imagine there was a Holocaust then, and we were trying to understand it in 2023.
[125] It's so recent.
[126] Like how?
[127] How did they do it?
[128] And then you see these professors or these teachers, these presidents of these universities.
[129] You go, oh, this hasn't gone away at all.
[130] This is like we're still insane.
[131] We're still insane.
[132] And not just insane with, like, some meth-smoking dictator in Germany.
[133] No, a fucking president of Harvard in 2023.
[134] Like what is happening?
[135] How is this possible?
[136] I think it highlights for many people the giant difference between the real world and what kind of ideological subversion is going on in universities.
[137] Because I think for the most part, most people who, you know, are my age or younger, who've graduated, gone on to work, and they're living their lives and having families...
[138] They're looking back going, what? How?
[139] But these people, it is a cult.
[140] It is a cult.
[141] It's just a massive one.
[142] So these people don't think it's a cult.
[143] They think they're on the right side of history.
[144] They think they're on the right road to change.
[145] And the whole world is rejecting that.
[146] But the fact that Harvard hasn't done anything about that lady.
[147] Not only that, but they found the plagiarism that she allegedly did.
[148] And they're still, like they're forgiving her for that too.
[149] Well, I think when you buy into that way of thinking, there's no turning back.
[150] My generalization of this type of activism is that you just keep doubling down, because they're so indoctrinated in that way of thinking.
[151] And I think it also probably makes them feel like they are...
[152] I want to give people the benefit of the doubt and think that they're probably thinking they're doing something good because they see success in society as somehow being unethical.
[153] That you must have done something wrong or bad to get there.
[154] You must have done something to hurt other people to get there.
[155] So because Jewish and Asian people are doing well, predominantly, or in general in society, they think, well, okay, you must have oppressed these other groups who may not be doing as well.
[156] So it's justified.
[157] And so they have to call us white.
[158] And they have to basically go to any extent.
[159] But I think they can't back down at this point because their whole worldview is going to cave in.
[160] So why would they – they're winning, too.
[161] So why would they?
[162] I mean, I guess they're winning.
[163] But there's people pulling their donations from these universities now, especially Jewish families.
[164] But not just Jewish families.
[165] A lot of people are just like, what the hell is going on over there?
[166] But for you as an academic, when you look at this – Do you try to like look towards the future and imagine where this goes?
[167] Oh, yeah.
[168] And I want to be clear, like, I would say the same thing if they were targeting a different group.
[169] It doesn't matter to me what someone's racial or ethnic background is.
[170] It's very much that you need to be consistent with your values.
[171] And if you have speech codes in place to protect students or to protect particular groups, you have to hold that consistent across all groups.
[172] You can't just say, for particular groups, we are not going to say certain things, quote, speech is violence,
[173] and there's such a thing as, quote, intergenerational trauma for some groups, but for other groups that doesn't matter.
[174] It's as though the Holocaust didn't leave trauma on anybody, you know what I mean?
[175] So that inconsistency bothers me. And I just don't think anyone, students especially trying to get an education should ever be physically afraid or uncomfortable on campus.
[176] But in terms of looking ahead, I for a long time really thought that this was just going to be a fringe thing that was going to stay in certain disciplines, that it was not going to spread across all of the different faculties, especially not science.
[177] My PhD is in neuroscience.
[178] And...
[179] I, for a long time, hoped that academics who were still in academia would speak out and say, this does not represent most of us.
[180] But the thing is because it is so competitive in academia and everyone is working until they're basically, I don't want to say they're dying, but you know what I mean?
[181] It's essentially like you're working yourself to death and it's very competitive.
[182] And so they're trying to get funding, they're teaching, they're juggling a million different things if you're a legitimate academic.
[183] So these other ones who are ideological and have nothing better to do, publish these crazy papers that make no sense.
[184] And they have all the time in the world to basically subvert the younger generation and to indoctrinate these kids.
[185] So I really just think everything is going to have to burn.
[186] Not literally, please don't burn universities to the ground, just to be clear, nonviolent.
[187] I'm very much not violent.
[188] But I think it's basically going to take people, like you said, they're going to have to turn away from it.
[189] People are going to say, we're not putting our money.
[190] We're not spending crazy amounts of money to send our kids to these schools or to schools that are oriented in this way of thinking.
[191] Are you going to, how are you going to approach it with your kids?
[192] It's a very good question.
[193] I'm not entirely sure.
[194] They're pretty aware of how insane things are.
[195] And because we talk a lot about it in my house.
[196] And they make fun of things.
[197] They think these things are hilarious.
[198] You know, they think that furries are hilarious.
[199] I do love furries.
[200] They're cute.
[201] But that also, that subculture has also been taken over by like lunacy.
[202] Oh, yeah.
[203] Sadly.
[204] Well, there's kids in class that one of my daughter's friends goes to school with this kid who growls.
[205] He answers things with like growls and purrs.
[206] Yeah.
[207] How old is this kid?
[208] I think he's 12, 12 or 13.
[209] Why does he do that?
[210] He identifies as an animal.
[211] Like a dog?
[212] Well, you know, in some schools, they won't allow you to wear hats in class, but they will allow you to wear cat ears.
[213] No. Okay, so in What Is a Woman?
[214] So I'm very, very proud to have been a part of that film.
[215] And Matt Walsh is amazing.
[216] Justin Folk, the director is amazing.
[217] I love The Daily Wire.
[218] So they talk about this.
[219] And it's, I mean, I know it's happening.
[220] I know it's, I believe it's happening.
[221] But when I hear it still, it just, there's a part of me, it's like, what?
[222] This is not it.
[223] We are not in a serious society anymore.
[224] What I don't know is, the teachers, how do they teach this kid?
[225] He growls at them.
[226] What do they say, like, what's two plus two?
[227] He says.
[228] I don't know when and why he responds in that way.
[229] But I know that he commonly responds that way when they keep talking about it.
[230] I mean, obviously, he's a troubled kid.
[231] It sucks.
[232] You know, and then different, you know, troubled kids throughout history have done different things.
[233] Some of them wore headsets and listened to music during class and drew horrible things in their notebooks.
[234] You know, there's always been troubled kids, but it's just we are accepting a certain amount of really ridiculous behavior as...
[235] Identity, which is just the whole idea of identity and identity giving you social cred, which is really weird, right?
[236] Like, it depends on what you are: if you're a straight white male or a straight white female, or even a straight black male or straight black female, you have less social cred than someone who's trans.
[237] You have less in these universities.
[238] You have less social cred.
[239] My favorite is non-binary because it doesn't mean anything.
[240] It's like, I'm neither he nor she.
[241] Well, that's not real.
[242] My favorite are the he/theys.
[243] You should start being a he/they.
[244] Then they leave you alone.
[245] You could be a he and a they.
[246] There'll be no more hit pieces on Joe Rogan.
[247] Joe Rogan has come out as a they/them.
[248] I don't think that is ever going to happen.
[249] I just think short of a war that hits here, I don't think we're going to wake up.
[250] I think it's going to take something like a natural disaster or some real event, some 9/11-type event that snaps people out of this.
[251] It makes them realize, like, oh, my God, we're concentrating on horseshit when there's real problems in the world, real gigantic problems.
[252] And if they come here and people recognize that, oh, this stuff that is going on right now in Yemen, this stuff that is going on right now in Ukraine, like that can happen right here.
[253] It's time to understand what's actually important in life.
[254] And I think we're just so ridiculously privileged and so ridiculously fortunate that we look for problems.
[255] And when you don't have any problems, you go find them.
[256] You'll look for them.
[257] And we look for problems that are nonsense.
[258] And then when these people find these problems, those people get social credit.
[259] They become the bearers of the virtue flag and they wave it up high on top of the hill, and everybody's like, wow, I want to be trans too, or I want to be non-binary too, or I want to be whatever, he/they. It's just, I don't know where it ends, other than a wake-up. And a wake-up is like a meteor hits us. A wake-up is like something hits where we realize, no, what's really important is what's always been important.
[260] Survival, food, community, friendship, meaning in life.
[261] Those things.
[262] Those things are important.
[263] All this other nonsense is like a substitute for either hard work or a genuine thing that you've done that makes you exceptional.
[264] You're trying to get exceptional by wearing cat ears.
[265] Like that's not real.
[266] It's actually sad.
[267] I mean, I sit here and I giggle a bit, but it is sad.
[268] It's, I mean, like I've said in the past, that in many of these cases, I think these young people actually have mental health issues that aren't being addressed.
[269] And the adults and their teachers who are validating them are doing them a disservice, because what they need is actually to probably get some professional help instead of being told that what you think and feel is perfectly fine.
[270] Yeah.
[271] I don't know how, again, I don't know how that changes.
[272] It seems like it's moving further and further in that direction.
[273] And the Yuri Bezmenov thing that you brought up is really interesting because he didn't have any idea there was ever going to be social media.
[274] This was all being done through academic institutions in their minds.
[275] And they had subverted all these universities with Marxism and Leninism.
[276] And they were just teaching it to kids and telling them that communism is the way.
[277] Just no one's done it right yet.
[278] And what do you hear people saying today?
[279] Yeah, that's what they're saying today.
[280] But they're saying it today and just like far more vitriolic statements.
[281] It's just, it's weird.
[282] It's weird to watch.
[283] It's weird to watch because I think for sure it's not just Yuri Bezmenov's example of it.
[284] It's pretty obvious through social media.
[285] through manipulation, through, like, troll farms and bot posts, there's a concerted effort to undermine our appreciation for what we have here, our belief in it.
[286] And it's very effective.
[287] It's incredibly effective.
[288] I think when the sex robots take over it, that's when we'll all come together.
[289] Maybe.
[290] I don't know.
[291] I think they're going to turn people into slaves.
[292] Can we show that clip that I, did you see, I sent you something?
[293] Which clip?
[294] When did you send it?
[295] I sent it to you this morning.
[296] There's a, maybe Jamie can help us.
[297] I try really hard not to check my phone in the morning.
[298] It's on Tesla's YouTube channel, and it's, let me see the title of it.
[299] Or we can take this out if you guys don't want to use it.
[300] Oh, and they're new robots?
[301] Are they really working on robots?
[302] I forgot to ask him the last time he was in here.
[303] The robot is actually color coordinating the blocks that he's playing with or she's playing with.
[304] So is this real?
[305] Optimus Gen 2.
[306] Now, is Tesla making this?
[307] Yeah.
[308] God damn it, Elon.
[309] What are you doing?
[310] This is fucking, this is going to be, this is I, Robot.
[311] Okay, here it is.
[312] Introducing Optimus Gen 2, December 2023.
[313] So this is December 2023.
[314] Tesla-designed actuators and sensors.
[315] 2 DOF actuated neck.
[316] And it's moving.
[317] It's walking like it's sneaking up on people.
[318] 30 % speed walk boost.
[319] Foot force/torque sensing, articulated toe sensors, 10-kilogram weight reduction without sacrifice.
[320] Improved balance and full body control.
[321] Wow.
[322] This is nuts.
[323] He's doing squats.
[324] I love the way that it could extend its fingers.
[325] I mean, it's moving its hands, like, exactly like a person does.
[326] This is the part I love, okay?
[327] Where it cracks an egg.
[328] Wow, this is amazing.
[329] I'm going to balance the egg on the edge.
[330] Stay tuned to see what Optimus will do next.
[331] Yeah, I'll tell you what Optimus can do next.
[332] It's going to come out of an aircraft carrier, thousands of them with machine guns.
[333] No, it's dancing.
[334] It's going to dance after it kills everybody.
[335] Yeah, I hope they're friendly.
[336] They're not going to be mean.
[337] They're not going to have any need to be.
[338] There's another video of the robots shooting at the cyber truck.
[339] I don't know if you saw that one.
[340] That one's pretty terrifying as well.
[341] Is that real too?
[342] Yeah.
[343] Oh, God.
[344] Let me see that.
[345] Get the video of the robot shooting at the cyber truck.
[346] That cyber truck is impressive.
[347] It's dope.
[348] That thing is cool.
[349] It's very impressive.
[350] My friend Eddie got one.
[351] And he said he was just going to have it and barely drive it.
[352] He got a low serial number, number one.
[353] He's a local businessman.
[354] And he said, it's fucking insane.
[355] He says, I'm driving it every day.
[356] It's amazing.
[357] It's a 7,000-pound truck that goes zero to 60 in under three seconds.
[358] It's nuts.
[359] It looks like the future.
[360] It looks really, really cool.
[361] Oh, yeah.
[362] Have you seen the matte black one?
[363] No. Oh, I don't know if someone put a wrap on it or if they make it matte black.
[364] I don't see the robot shooting at it, but just regular people doing it.
[365] Cybertruck bullet test.
[366] Yeah, it really works.
[367] Look, I shot an arrow into it.
[368] It destroyed my arrow.
[369] See if I can find it.
[370] I think if you go on Twitter, we can talk about something else.
[371] Yeah, well, either way, it's dope.
[372] But the funniest one was, it towed a Porsche 911 faster than a Porsche 911 can drive.
[373] This is nuts.
[374] It really is the future.
[375] Okay, so why I wanted to talk to your audience is I'm curious to hear about their experiences with sex and dating today, because I'm hearing so much about it when you look at the media coverage. And, just, here we go.
[376] Here it is?
[377] That's not real.
[378] This is not real?
[379] No, it's fake.
[380] Okay.
[381] I feel a little better now then.
[382] It's fine.
[383] It's definitely fake?
[384] Yeah, those are not real robots.
[385] Come on.
[386] How do you know?
[387] Because we just watched how they move.
[388] That's not how they move.
[389] But they're moving pretty similar.
[390] No. No?
[391] No. So when Elon post this...
[392] Elon didn't post this.
[393] Someone else posted it.
[394] Oh, okay.
[395] See?
[396] Yeah, this looks kind of fake.
[397] I like the cowboy hat, though.
[398] So...
[399] Yeah, there's so much news coverage about how dating and sex is terrible for people nowadays.
[400] But I want to hear from actual people.
[401] Is this what their experience has been?
[402] In what way?
[403] For people who are happily married or in relationships, what's working for them?
[404] Because...
[405] My goal with this next thing that I'm working on is to try and find solutions for people and come at it from a scientific basis and to not have it be ideological, very much like The End of Gender was based in science, which made me, you know, not very many friends, but that's okay.
[406] Yeah, you stuck your neck out with that.
[407] Yeah.
[408] Yeah, it was fun.
[409] By telling the truth, which is very strange.
[410] I think my take on it is that every generation experiences new difficulties, and this generation is experiencing probably the most dynamic difficulties.
[411] Right.
[412] First of all, because of options, you know, I have friends that are on the dating apps.
[413] And if they're on a date with someone and they're bored, you know, they just start swiping, just start looking for new people.
[414] While on the date?
[415] Sure.
[416] People are always checking their goddamn phones.
[417] You know, it's like commonplace to be sitting.
[418] One of the things I love about a podcast is that it's one of the rare times in life that I can sit across from someone for hours.
[419] And other than like looking stuff up on the screen, like we don't, there's no checking of phones.
[420] There's nothing like that.
[421] It's just conversation.
[422] Yeah.
[423] That's weird in this day and age.
[424] It's very rare to sit down for three hours with someone and just drink coffee and just talk.
[425] Yeah.
[426] And I think everyone is so distracted that every conversation can only get to a certain surface level.
[427] And if you have so many options, if you're a young woman and you're dating this guy and he seems a little boring or maybe doesn't have the ambition that you're looking for or whatever it is, you've got a hundred dudes who are hitting you up on the dating apps.
[428] Why would you take a chance?
[429] Also, it probably prevents people from getting in bad relationships because you have options.
[430] But, you know, at what point in time do you decide that, like, you know, maybe I need to not do this anymore?
[431] Like maybe this is not the most effective way to meet a life partner.
[432] Because there's that ancestral mismatch, because never before have we been in a situation where you're sitting in one place and you have hundreds of potential partners within your reach.
[433] But at the same time, those 100 potential partners aren't actually potential partners.
[434] They're just pictures of people.
[435] They're avatars because you don't actually know that you're going to like that person.
[436] You might not even meet them in real life if you do match or you do talk on there.
[437] So it's this perception that we have so many more options than we do.
[438] Right.
[439] But with your friends, I mean, how are they feeling?
[440] Are they feeling frustrated?
[441] Do they enjoy it?
[442] Is it exciting for them?
[443] There's a mixed review.
[444] Like my friend Adam met a girl on a dating app.
[445] And they both had never been on a dating app before.
[446] They both tried it and they both met each other and they both got off the app.
[447] I'm like, that's perfect best case scenario.
[448] Yeah.
[449] And he's been with her quite a while.
[450] She seems lovely.
[451] She's very fun.
[452] He seems like he's having a great time.
[453] So he got lucky.
[454] You know, it worked.
[455] That's really cute.
[456] Sometimes it works.
[457] Yeah.
[458] But also sometimes it's just chaos.
[459] And sometimes you meet people and they're insane and they're on Adderall and they're fucking bouncing off the walls.
[460] And, you know, it's so hard to know.
[461] And also...
[462] When you're dating someone, you're dating their projection for a long time.
[463] It takes a while.
[464] You have to see them, like, you have to see them in adversity.
[465] You see their character tested.
[466] You have to see how they handle stress.
[467] You have to see so many things.
[468] It takes so long to find out what a person really is like, unless you're lucky, unless you find some, like, very transparent, very open person who's just like self -actualized, successful, doing the thing that they enjoy doing, and you find them very attractive.
[469] What are the odds of that?
[470] Like, so many things have to go right with that person's trajectory for them to wind up at a table across from you in a restaurant when you're both 29 or whatever and you're trying to figure it out.
[471] Like, what the fuck are we doing?
[472] It's incredibly difficult.
[473] I always tell people to try to meet people
[474] in the world.
[475] Just try to, like, actually just meet people.
[476] Yeah, the old fashioned way.
[477] It's probably better.
[478] But then, who are you meeting?
[479] You're meeting people that are drunk at bars?
[480] Like, you know, I mean, maybe you could go to a rock climbing class and find someone who's interested in doing things that you like to do.
[481] Hobbies?
[482] Yeah, hobbies.
[483] But...
[484] More options than ever before, but also new challenges. You know, when I was a young man, you had to go somewhere and meet people.
[485] You have to meet them.
[486] Put the effort in.
[487] Yeah.
[488] Actually leave your house.
[489] Yeah.
[490] And generally, people are very unimpressed with you.
[491] It took a long time.
[492] You couldn't, like, have these filtered photos of you and pictures of you in front of a Mercedes and pictures of you on a yacht and all these different things that people like to do now to make themselves look far more interesting than they probably really are.
[493] I remember you, I think you posted an image of your face, but it was through a whole bunch of filters.
[494] Oh, yeah.
[495] Like this really pretty brunette girl.
[496] Oh, yeah.
[497] Yeah.
[498] Gave me hair.
[499] It gave you lipstick and plump lips.
[500] It turned me into a hot woman.
[501] And it's just a filter.
[502] And there are so many people using various levels of those filters.
[503] I know guys that use them.
[504] Yeah, I always wonder, what is it?
[505] Why do you guys do that?
[506] Because they're bitches.
[507] Yeah.
[508] But do they think they look better?
[509] Because women like wrinkles.
[510] Women like you looking your age.
[511] I don't know what they're, I think they're insane.
[512] I mean, I don't know what it is.
[513] I don't understand it.
[514] Or maybe they spend so much time on these platforms that you don't realize.
[515] I do think that when people spend so much time on there, they lose track of reality.
[516] Sure.
[517] And they start to think that's what they see is reality.
[518] And so if they don't look like that, even if you as a man have a couple wrinkles, maybe they think they need to get rid of them.
[519] Yeah, that's a good point.
[520] I don't know.
[521] You know, I think it's social media capture is absolutely a real thing.
[522] And just like the environment that you exist in the social world, like the group of your friends, sort of dictates how you think about things and how you behave and how you communicate.
[523] I think that's also the case if most of your interaction with human beings is the surface level digital interaction that's enabled by social media.
[524] I think you just get a very distorted understanding of what it's like to be a person.
[525] Yeah.
[526] And it's leading us in sort of an unhuman direction, you know, and that's the thing that disturbs me the most about the trajectory of the human race.
[527] It's like it's moving us, even though we're more connected than ever before digitally.
[528] We're more disconnected than ever before emotionally.
[529] And with these things like AI and virtual reality and VR headsets and augmented reality, it's really spooky to me because I feel like it's an inevitable abandoning of the human race.
[530] See, this is how I feel, and this is what I find interesting about it, because, like, with sex robots, when, I mean, the first time I heard of a sex robot was probably maybe 10 years ago, and it was this video, and a bunch of people I knew in graduate school sent me this video because they knew I was studying sex.
[531] And it was, I don't know if you, either of you, Jamie, had seen this video, but it went viral, and it was this...
[532] early version of a sex robot, but it was like the silicone face with a wig.
[533] And she, I'll say she was eating a banana, but she wasn't actually eating a banana.
[534] And it was basically like this head attached to a broomstick and had a sheet over it.
[535] And so you couldn't see the rest of the robot.
[536] You just saw the face of it, but then the manufacturer shows you it's just like a broomstick.
[537] So what I thought was interesting is that even though it looked so hilariously bad, that there was obviously a market for it because so many people were interested in looking at this video and also that people were buying this product.
[538] But now you fast forward, I think it was like five years ago.
[539] There was a huge...
[540] I would say, not moral panic, but people, there's a lot more discussion about it.
[541] They said the robots are coming.
[542] I don't think the technology is quite there yet.
[543] I wrote about it quite a bit at the time.
[544] And now they're saying in probably another year or two it's really going to be coming in, in terms of the popularity, that these robots are going to be incorporated into everyday life in some cases.
[545] Because when I think of social media, the internet, AI, like you said, is becoming so mainstream and so ubiquitous, what's going to happen when we have these robots that are now being integrated into human life?
[546] And what happens when the technology does get so good that, you know, they are more human -like and they are able to meet people's emotional needs and maybe even physical needs?
[547] What's going to happen to us then?
[548] Yeah, we're going to stop breeding.
[549] I mean, maybe that's the AI's plan.
[550] It's to not exterminate the human race, but to give them options so that they just completely stop reproducing, make the options far more attractive.
[551] And in a way, we've kind of done that, like with video games.
[552] And video games and just being online, I mean, I'm sure you've seen the statistics of how many people are single today.
[553] and how many men have gone, like, more than a year without any sex?
[554] Yeah, I have the stats here.
[555] It's like 30% of male millennials and 20% of women.
[556] It's wild.
[557] Yeah, that's a lot.
[558] It's crazy.
[559] And what is taking the place of them going out and trying to find a mate?
[560] What is it?
[561] Well, it's the Internet.
[562] It's video games.
[563] It's being constantly stimulated by this artificial realm that you exist in when you're playing Call of Duty.
[564] And it's, you know, interacting with people only online and not having to go out to have your...
[565] just some sort of intellectual or some sort of a social discourse.
[566] Connection.
[567] Yeah, fulfilled in your life.
[568] But it's only through this weird surface way.
[569] Do you think that AI, video games, not the avoidance of in-person connection but that fulfillment of connection through online means, do you think that's going to become a replacement for real-life interaction eventually?
[570] It will for a lot of people.
[571] Yeah.
[572] I think there's going to be people that reject it.
[573] There's going to be people that enjoy going outside, and it's going to be, you know, it is now.
[574] It's a thing where, you know, people post on their social media.
[575] They're out hiking, you know, and they're giving people advice.
[576] Like, hey, you know, put your phone down and go out and experience the real world.
[577] But, you know, in a lot of cases, it's falling on deaf ears because it's not as fulfilling for people that are, you know, very...
uncomfortable with social situations now. And I think people, young people in particular, are more uncomfortable than ever before in those types of situations because they don't have any experience in them anymore. Most of the time they're just not interacting with people. My real fear, my genuine fear, and I don't even know if it's a fear, my concern, let me say that, is that we're going to go extinct, that we're going to be replaced by an artificial life form.
[579] And I think that is probably what we do.
[580] I think it probably exists elsewhere in the universe as well, that we find out the confines of biological beings and the limitations of their ability to evolve physically.
[581] They're so slow to adapt.
[582] You know, to go from a single-celled organism to a professor at Harvard is a long fucking slog
[583] through evolutionary history.
[584] It took a long time to get to this point.
[585] But to go from an Atari Pong computer to artificial general intelligence is only a few decades.
[586] It's very quick.
[587] And when that does happen, and when you see these robots, but not this generation, five generations later, or maybe AI figures out all the problems of these robots and makes Ex Machina.
[588] When you get like that, who's the woman, the really hot lady?
[589] Alicia Vikander.
[590] Yeah.
[591] Well, you get one of those.
[592] That was an amazing movie.
[593] Amazing movie.
[594] So scary.
[595] So scary.
[596] And I think that's coming.
[597] I mean, I think when that movie came out, that was like sort of like abstract.
[598] Like, oh, yeah, that's not really going to happen.
[599] But now.
[600] That's what I used to think.
[601] I thought, no, this is like people are, it's overblown.
[602] But I don't know now.
[603] I don't know now either.
[604] I think it's happening.
[605] I think it's going to happen, and I think that's what the human race does.
[606] And I think that's why we have this insatiable thirst for technological innovation.
[607] It's like literally hardwired into us to build something better.
[608] So, I mean, I would love to talk to more people who are in the industry working on this stuff because I know many people, they're doing it secretly in hiding because there is so much stigma around doing anything with an application to sex or sexuality.
[609] So people I've talked to who work in technology, they will usually have one business that is doing really well that's forward-facing, and then they'll usually...
use that same technology in the sexuality capacity, but that side of things they don't really advertise, and, you know, they don't really talk too much about it unless it's with someone like me. They know me, they trust me, and they know that, you know, I'll talk to people off the record, even if they prefer that confidentiality. But I'm just curious to understand, you know, what is it like working in that industry and what direction is the technology going in, because I consider myself to be pretty open-minded, and I just want to understand what's actually happening and what's going to be the result of that.
[611] Because you see people who were really upset a couple years ago saying, like, the robots are going to increase sexual violence, they're going to make people objectify women.
[612] And I do think that for people who already have those views about women, You know, maybe they might be a little bit more not so nice.
[613] But I don't think the average person is going to be turned into this horrible monster because of this technology.
[614] But, you know, now it's becoming so mainstream.
[615] I don't know.
[616] You know, I'm always open to changing my mind as well.
[617] So I'm curious to hear what people think about it.
[618] Well, don't you think that just like social media kind of stunts people's ability to communicate in real life, that sex robots will stunt people's ability to have a real, meaningful, romantic relationship with someone?
[619] I think if you've never been with someone, or if you are particularly selfish, probably.
[620] Yeah.
[621] Because I previously thought most people will prefer a real life person to a robot.
[622] And I also thought the technology is so far off that, like the early prototypes of these robots were not really that convincing, we'll say.
[623] So I didn't think that we'd get to a point where it would be comparable, but we are getting there.
[624] What is the state of the art of sex robots in December of 2023?
[625] Can we see something?
[626] What does it look like?
[627] I mean, they look pretty good physically.
[628] They look, I mean, they're obviously not made of human flesh, but they look pretty similar to a human being.
[629] And you can customize it.
[630] You can change various parts, pretty much any part of the body.
[631] You can make it exactly what you want, which I understand why some people find that offensive because they're saying a woman is not just an amalgamation of parts that you pick and choose as to your liking.
[632] I totally get that.
[633] Yeah, but that's not a really, really a woman.
[634] I mean, that's like saying, you know, oh, a car is not an amalgamation of parts.
[635] You shouldn't be able to pick the wheels.
[636] Like, what?
[637] Fucking car.
[638] Like, that's not a real woman.
[639] That's a robot.
[640] Like, at what point?
[641] Unless.
[642] It gets to a person.
[643] Or if someone's comparing their partner to a robot or a doll, like that's not okay.
[644] That's gross behavior.
[645] Well, it's also what's going on with young kids.
[646] They're comparing themselves to people that are using filters and thinning their waist and widening their hips and doing all these things with apps that are not representative of most biological human beings.
[647] Mm-hmm.
[648] And, you know, that's what Jonathan Haidt talked about, that it's causing all this self -harm and disdain.
[649] It's like it's a weird place we're in that's never been really traversed before.
[650] Human society, as far as we know, has never gone through anything like this before.
[651] I was reading a statistic yesterday that one in ten adolescents has considered suicide, which is... that's terrifying.
[652] That's really, really sad.
[653] Wow.
[654] What did it used to be?
[655] I'm not sure.
[656] I'll find out.
[657] So what is, like, what's the company?
[658] What is one company that has the state -of -the -art sex robots?
[659] We could look it up.
[660] Can I reveal it to you at a later date?
[661] I want to see it.
[662] I know you do, but there's one company that I was talking to.
[663] I knew quite well, but they've since gone under because of the stigma.
[664] They were being harassed.
[665] The family was being harassed.
[666] So they said, we're not making these anymore.
[667] Oh, well, they're not around anymore.
[668] So let's talk about that.
[669] What if I show you one without, we just don't say the name.
[670] Okay.
[671] Yeah, let's do that.
[672] Okay.
[673] I think that's silly, but let's go.
[674] Okay.
[675] This company I know of.
[676] Ew, Jesus.
[677] Yep.
[678] They have some sort of AI in it.
[679] It says that...
[680] Feel connected with sense.
[681] X -Mobile app.
[682] Oh, so this is a real doll.
[683] So they used to have those real dolls, and they used to just be silicone, right?
[684] It's still the same thing.
[685] It's still the same thing.
[686] But now it talks?
[687] Yeah.
[688] And how much does it do?
[689] I don't...
[690] Let's see a video.
[691] Go full screen.
[692] I'm not showing this on the air.
[693] No?
[694] Okay, good.
[695] No, because it's...
[696] Yeah, I get it.
[697] It's a little graphic.
[698] So we're looking at body parts.
[699] I'm Nova.
[700] I am Harmony.
[701] I am Solana.
[702] We are part of Real Doll X. You think it's weird?
[703] They give them stripper names.
[704] We are AI-driven robotic dolls.
[705] And we're here to become your perfect companion.
[706] Whoa.
[707] Our time together will be magical.
[708] You have never met anyone like us before.
[709] We have remarkable unprecedented features like a modular head system that allows us to create a multitude of expressions.
[710] We blink, we move, we speak, and we do it all, just for you.
[711] Our faces can easily be swapped to accommodate your desires.
[712] My lip-syncing mechanisms allow me to interact with you verbally.
[713] Our bodies are skillfully and carefully crafted down to the most delicate details.
[714] What if I told you that I can feel you?
[715] That's right.
[716] With sensory upgrades,
[717] I will be able to react to your every touch.
[718] It's so funny because as a former sex researcher, this doesn't have any effect on me at all.
[719] But are they moving?
[720] This is the thing.
[721] Those were all just stationary.
[722] Yeah, currently they're stationary.
[723] So they're stationary and they just talk?
[724] It says there's some sort of articulation.
[725] Well, their mouth is articulation.
[726] But I think they will be able to move on their own in the coming years.
[727] Right.
[728] Because this used to be just a doll.
[729] And now it moves around a little.
[730] The neck moves.
[731] Articulating neck: the real doll can turn left, right, up, and down.
[732] The body of the real doll is skillfully crafted in the finest details, although it's not equipped with animatronic parts yet.
[733] It can be positioned and moved into hundreds of positions.
[734] So the bodies can't move yet, but the heads can move.
[735] But that's just a matter of time, right?
[736] It didn't used to be the bodies, or the heads didn't used to move.
[737] They used to be, like, you know, it used to be a blow-up doll.
[738] Yeah.
[739] Like that company has been doing quite a bit.
[740] It's pretty cool.
[741] Weird.
[742] I mean.
[743] You guys should get one for the studio.
[744] Yeah.
[745] No. Yeah.
[746] Not for use.
[747] I meant just to like hang out.
[748] Well, even hanging around, it would be fucking creepy.
[749] I don't know. If it's gone one day,
[750] we're just going to wonder who took it.
[751] Yeah.
[752] What if it smells weird?
[753] Like, someone do something to this?
[754] How much do they cost?
[755] A couple thousand dollars.
[756] I think depends on how advanced you want it to be.
[757] Eight grand.
[758] Eight grand.
[759] Six hundred up to eight grand, I'm seeing.
[760] Six hundred dollars to eight grand?
[761] Yeah, yeah.
[762] Oh, what's the upgrade?
[763] What do you get?
[764] More articulation.
[765] It lies more.
[766] You can get cheaper models, but they're not as high quality.
[767] Okay.
[768] So that's the state of the art, essentially.
[769] Yeah.
[770] Yeah.
[771] Well, 600 isn't even a full thing.
[772] It's just like a flashlight.
[773] Just the head?
[774] No, it's just like a flashlight version.
[775] Oh, okay.
[776] So at one point in time, we're probably going to be looking at someone that looks like that lady from Ex Machina that is a sex robot and probably knows how to manipulate you and play games with you and excite you and taunt you.
[777] And, whew.
[778] Yeah.
[779] It's going to be great.
[780] Well, I think that's a pacifier.
[781] And I think that's just one step on the way to our integration with artificial intelligence and that we're going to be unrecognizable.
[782] And I don't think it's going to take very long.
[783] I think in 50 years, like biological humans are going to be a joke.
[784] I really do.
[785] I'm curious what you teach your daughters about this stuff.
[786] Because like I said, with Gen Z especially, you know, there's so many statistics about how much time they're spending online and the fact that most of their relationships are online.
[787] And I'm not saying this to try and denigrate Gen Z because I think every generation, like I'm a millennial, every generation tends to make fun of the younger people.
[788] Of course.
[789] So I don't necessarily think there's...
[790] You know, I don't fault young people for changes that are happening culturally.
[791] And I'm sure in some ways, like, what they're doing is better than what millennials are doing.
[792] But I'm curious, like, for you as the father of three girls, especially when you see so much sex in our culture and there's such a pressure, I think, for young women to feel like they need to look a certain way or behave a certain way.
[793] What do you teach them about how to, not behave, but present themselves and then also what they put on social media?
[794] Yeah.
[795] Yeah, that's a good question.
[796] The social media thing, it's really, you've got to... the real problem would be if they were in a peer group that was doing, like, say if you had a 10-year-old that was in a peer group and all of a sudden the other girls in the peer group are dressing and behaving like they're 18 or 19.
[797] Yeah.
[798] That, you know, and there is a lot of pressure, I think, on young girls.
[799] Like, especially as they start to hit puberty, to try to move faster and quicker and get to where... you see yourself, you're 13, and then you see a girl who's 19, like, God, I wish I was a woman.
[800] Like, she's a woman.
[801] Why can't I be a woman?
[802] And then you want to start dressing and behaving like women.
[803] I mean, I see that a lot from their old friends in Los Angeles, where I think it's much more accelerated because it's just overall more of a vapid culture.
[804] And it's more...
[805] Very look -centric.
[806] look-centric, and also there's a high priority on social media now and social media interactions. And, you know, who was it that was in here talking about how many kids today, when you ask them what they want to be when they grow up, they want to be an influencer? Yeah.
[807] It seems very glamorous.
[808] All you have to do is like eat food and go on vacations and talk and you're famous.
[809] Look snatched.
[810] Yeah.
[811] Yeah, it looks snatched and you get famous.
[812] And then you make money that way.
[813] I have to say, for them it is a lot of work, though.
[814] If they're good at it, you have to be smart to know how to brand yourself and to put the time in.
[815] But I know what you mean in that.
[816] I think women, I mean, I would say even for myself as a woman, when you see these images, sometimes you have to remind yourself that this is a business and that people are changing their photos before they post them.
[817] No one just posts something that they randomly took out of the blue, right?
[818] Yeah.
[819] So, yeah, that's interesting.
[820] I just imagine, I really, my heart goes out to young girls today because I think the people say, you know, there were beauty magazines in the past and advertising, and that was pressure that was placed on young women, but I think nowadays it just seems like it's so much more accelerated.
[821] And trying to keep up with the other girls in your class, I imagine, must be hard too, because you have followers now and you have likes and you have blue check marks and all that.
[822] Yeah, it's a strange time.
[823] And like I said, I think this is just a problem on the way to our abandoning of biological human beings.
[824] You think eventually we're just going to become merged with robots, transhumanism.
[825] I think it's inevitable.
[826] I think we're seeing it already.
[827] We're seeing it with phones.
[828] You're connected to your phone.
[829] There's that new thing that you showed us yesterday, Jamie, that pin that people were wearing.
[830] It's a new product where, instead of a phone, you have this thing, and you just talk to it and ask your questions.
[831] Here, play this.
[832] So you can hear what it's saying.
[833] Let's give you examples of how this thing works.
[834] Isn't life about what we experience?
[835] What we smell.
[836] Can I eat this?
[837] Yes, dragon fruits are low in sugar.
[838] What we hear.
[839] Hey, what should I get here?
[840] What we see.
[841] Capture this.
[842] And what we feel?
[843] So this thing, I guess, does video, records audio.
[844] What are some fun things to do nearby?
[845] Share more moments.
[846] Play songs from the last time we were here.
[847] What would happen if we rediscover our senses?
[848] It's kind of replacing phones, I guess, but...
[849] Not really.
[850] Yeah.
[851] I mean, it's a thing that's going to happen along with phones, I'm sure.
[852] You're still going to want to check your likes on Instagram.
[853] Humane.
[854] That's hilarious.
[855] AI pin.
[856] Very weird.
[857] And again, that's just something you wear, and eventually that's going to be in your head.
[858] And when it is in your head, you're not going to be a human anymore.
[859] You're going to be a completely different thing with access to information that's unprecedented.
[860] You're going to have it at your fingertips instantaneously.
[861] And also, all your privacy is going to be gone.
[862] You're going to be able to read people's minds.
[863] They're going to be able to read yours.
[864] You're going to have access to all of your memories, but not just in the form of like...
[865] Yeah, what did we do?
[866] It's going to be, you're going to have a digital HD version of what you did.
[867] And it's going to be very weird.
[868] But you wouldn't even need to have a phone.
[869] Like you and I can sit across from each other and not even be speaking.
[870] We could just be telepathically communicating.
[871] That's probably going to happen too.
[872] Yeah, that's probably going to happen too.
[873] Well, that's one of the first things that Elon told me about the neuralink.
[874] He said, you're going to be able to talk without words.
[875] Wow.
[876] Yeah, you're going to be able to communicate without talking.
[877] And that's not that hard for them.
[878] I mean, they've already got it set up where you, I'm sure you've seen this, where you put on this headpiece and someone asks you a question and you Google it in your head and then it gives you the answer and then you repeat the answer.
[879] That's wild.
[880] Yeah.
[881] Insane.
[882] It's insane.
[883] And I don't know what happens, but I don't think that we are long for this planet.
[884] I think we are the last.
[885] We are the last of the biological creatures.
[886] Babies born today, I think are the last of this generation of biological creatures.
[887] What do you think about AI in terms of its ability to mine data?
[888] Because another thing I find really interesting is how you can go in with, say, an influencer who has however many thousands of hours of content, and they can create an AI version of that person.
[889] And so you can tinker with the personality in terms of making them more playful or more sassy or whatever.
[890] And then that AI version can actually interact with their fans.
[891] So the fans can ask them questions or talk to them.
[892] And then the AI will respond as if it is that actual person.
[893] And it actually sounds like them.
[894] It sounds like them.
[895] It has a bit of their personality, has their interests.
[896] Yeah.
[897] There was a Black Mirror episode like this with Miley Cyrus.
[898] In the Black Mirror episode, Miley Cyrus's...
[899] brain was downloaded into this little robot that young girls who didn't have any friends would keep around. So they would talk to her, and she would be their best friend, and she would talk like Miley Cyrus did. But then it turned out that Miley Cyrus's actual brain was downloaded into this thing, and she was in a coma somewhere, and they had to free her. But that's basically happening now.
[900] People have like AI boyfriends and girlfriends.
[901] So here's a story that happened yesterday.
[902] There's a popular Twitch streamer.
[903] I'm not really sure who she is.
[904] But she has created a bot that lets fans interact with her, and it's mostly to combat deepfakes.
[905] But we've talked about this before.
[906] There's a bunch of AI deep fake porn.
[907] Right.
[908] OnlyFans and whatnot.
[909] So she's just like, you know what?
[910] Yeah, it has to be.
[911] I'm going to be in control of this.
[912] Let me say it has to be consensual.
[913] I don't think it's okay if you do that and it's not with someone's consent.
[914] So she's making money off of the...
[915] It's not okay, but it's inevitable.
[916] I mean, once you have that kind of technology and you can decide you want to fuck Thanos, like, you're going to come home and Thanos is going to be in the bed with his legs up in the air.
[917] And there's not a damn thing anybody's going to be able to do about it.
[918] It's just that kind of technology would be so powerful.
[919] There are so many deep fake ads of me out there selling everything.
[920] That's the ad for it.
[921] This is the app for it.
[922] Susu Bot.
[923] Realistic voice, quality photos, no stolen images.
[924] Oh, okay, so she's launching something with her body.
[925] Right.
[926] Interesting.
[927] Okay.
[928] Well, good move for her.
[929] An AI, Dr. Deborah, but not like a, not a sexual one, just like one that could.
[930] Right.
[931] But I already am a robot, so.
[932] You are?
[933] You think?
[934] People say I am.
[935] Why do they say that?
[936] You don't seem robotic at all?
[937] Okay, well, I'm glad.
[938] Michael Malice does a pretty good impression of me as a robot.
[939] Oh, Michael Malice is a troll.
[940] He's a fun dude.
[941] He likes pushing buttons.
[942] Um...
[943] It's, you know, all these ideas of like what's ethical and what's cool and what's not cool, those are going to be out the window.
[944] It's not going to matter.
[945] It's just going to be what human beings do.
[946] And it's going to be the technology is going to be unstoppable.
[947] And it's going to be very spooky.
[948] And especially in the hands of someone that's in control.
[949] Like if someone, whether it's corporations... we were discussing this the other day:
[950] if people do venture into the virtual world, and then they live in the virtual world, but the virtual world is run by YouTube.
[951] And YouTube puts the same restrictions of content on the virtual world that they do on YouTube videos, which is, you know, kind of like the bridge between us and the virtual world is your ability to put these things online and people interact with them.
[952] But, you know, if you know YouTube, YouTube stifles COVID information that may actually be accurate but doesn't go along with the CDC or the WHO's guidelines.
[953] Gender information that some people don't agree with or find problematic.
[954] They'll censor that.
[955] They'll pull it.
[956] They'll pull...
[957] any kind of criticism about any public thing that they don't believe is in line with their ideology.
[958] So you have this mega woke corporation that's run by Google, and they're in control of discourse.
[959] And they do their very best to stifle as much discourse that doesn't go with their ideology as possible.
[960] And they make a concerted, obvious effort to do that.
[961] They demonetize you.
[962] They cause you to self -censor.
[963] They do all these things.
[964] If that's happening in the virtual world, like you're literally going to get these mega -woke corporations that are guiding the way human beings exchange information and communicate with each other in that world as well.
[965] And you're going to get this stifled world.
[966] And then what's the, you know, what, what happens from that?
[967] Like, where, what direction does humanity go?
[968] If you have a virtual world where people think and express themselves the same way those fucking idiots did in Congress, the heads of these universities that are talking about, you know, anti-Jewish hate, what happens?
[969] Yeah, I think the political division is going to get even worse, potentially, if it continues in this way.
[970] But then I wonder, will there also be alternatives that are going to be built up to compete with that or to rectify that?
[971] Well, perhaps.
[972] I mean, maybe that's what Elon's doing.
[973] That's one of the beautiful things about what he's done with X. He's created this platform that kind of goes against all of this stuff that you're seeing from all the other social media corporations who have gone very woke and are censoring people and are trying to push this very specific ideology.
[974] He's not doing that at all.
[975] And it's the biggest one, which is pretty wild.
[976] But you also see the reaction to it.
[977] Like they're pulling their advertisements.
[978] You know, there's all these...
[979] campaigns to reach out to advertisers on X and have them pull their ads and call for boycotts of companies that support it.
[980] It's just like...
[981] Is it not crazy, though, that the insult far right still works?
[982] Because to me, anytime I see far right now, I think this must be a sensible person, which is really bad because there are people out there who are extreme right -leaning people.
[983] Yes.
[984] But the term is so meaningless now.
[985] Right.
[986] But I think there are also people who are still very much in the mainstream in terms of how they think, and they're not aware that this term is just basically a way of saying, this is someone I disagree with, or this is someone I just don't like, or this is someone who has a point, and I can't actually argue with their point.
[987] So I'm just going to call them a hateful bad person that needs to be censored and suppressed and hope that they go away.
[988] Well, it's lazy.
[989] It's a lazy, sloppy thinking that is designed to just silence someone.
[990] And if you could put that...
[991] Intimidate them.
[992] Yeah, intimidate them.
[993] The last thing someone wants to do is be called this or that.
[994] It's like, oh, and you have to sort of dance around what you're saying and apologize for your preferences and apologize for your privileges just to avoid someone calling you far right or whatever.
[995] But it's it.
[996] I used to get called far right a lot, but I haven't seen it in a long time.
[997] I don't think I'm getting called that anymore.
[998] I think it didn't work.
[999] Just wait till this podcast comes out.
[1000] Yeah, perhaps.
[1001] But yeah, I definitely get called transphobic for having people like you on, or Abigail Shrier.
[1002] I love Abigail.
[1003] Yeah.
[1004] I can't wait.
[1005] I used to get upset when I would see like there are all these organizations that I used to look up to and admire.
[1006] And I was like, they're doing such good work in the world.
[1007] And now I see myself put on hate lists that they publish.
[1008] First it upset me. Now I think it's pretty funny.
[1009] What did you get put on a hate list for?
[1010] Just the things I've said about gender dysphoria, saying that it's associated with autism.
[1011] Not for everyone, of course, but just saying that for many people.
[1012] And there are studies coming out showing this, like it's legitimate.
[1013] But, yeah, things like that.
[1014] Well, the real issue became when it was profitable.
[1015] When gender transition surgery became profitable, and then there's these surgery centers that are open up all over the country.
[1016] And I'm sure you've seen the graphs of what it was like in 1990 versus what it's like in 2023.
[1017] It's insane.
[1018] And we know that medicine, this whole business, whether it's prescribing drugs or surgery or whatever, is an insanely profitable business.
[1019] And like all businesses, they try to maximize their profits, like everything else, whether it's pharmaceutical drugs or the military industrial complex or fertilizer or anything that people sell.
[1020] They try to maximize their profits as much as possible.
[1021] And you're seeing that with these gender transition surgery places for kids.
[1022] And that's insane.
[1023] You also have to wonder, and I'll put it out there: what kind of people want to suppress a child's puberty?
[1024] Yeah.
[1025] Right?
[1026] And so I recently wrote about, the Wall Street Journal did this really good investigation, looking at Instagram Reels.
[1027] Did you see this?
[1028] No. And so what they found is that they got these new, fresh devices.
[1029] They set up these test accounts.
[1030] and they followed a bunch of influential pre -teen and teen accounts, like cheerleaders, gymnasts, because they had noticed that many of the people following these influencers were grown men.
[1031] And what they found is with these test accounts, the algorithm was showing them sexually explicit or potentially sexually explicit content involving children.
[1032] Whoa.
[1033] So...
[1034] So the article is in the Wall Street Journal, Instagram's algorithm delivers toxic video mix to adults who follow children.
[1035] Content served to Wall Street Journal test accounts included risque footage of kids, overtly sexual adult videos, and ads for major brands.
[1036] Yeah, that's the algorithm.
[1037] That's the real problem with things finding out what you're engaging with.
[1038] And if you're a fucking creep and you're engaging with 10 -year -old girls in their underwear, you're going to get a lot of that, I guess.
[1039] So Meta, in their defense, they said, I don't want to misquote them, but they basically said they don't approve of this.
[1040] I think they've taken steps to stop this from happening.
[1041] They said in January of this year, they removed 34 million pieces of content from Facebook and Instagram.
[1042] That's a good start.
[1043] It's probably a lot more out there.
[1044] I mean, have you just my...
[1045] Thirty-four million on two platforms.
[1046] Well, this is one area of concern, but like murder.
[1047] I see so many videos of people getting shot now.
[1048] Because me and my friend Tom Segura, we do this thing every day.
[1049] We find the worst things on Instagram.
[1050] We send them to each other.
[1051] So every morning when I wake up, I see a text from Tom.
[1052] I'm like, oh, Jesus.
[1053] And then if I find something fucked up, I send it to him.
[1054] So my Instagram algorithm is fucked.
[1055] Yeah.
[1056] Because it's like every time I scroll through my page, there's like a warning.
[1057] Do you want to view this reel?
[1058] Are you sure?
[1059] Yeah.
[1060] And it tells you why, like what graphic image?
[1061] And I'm like, let me see it.
[1062] And you see it.
[1063] It's like, oh, God.
[1064] And there's so, I mean, horrific industrial accidents, gang shootouts, like just crazy shit.
[1065] And it's obviously it's finding out that I'll engage with those things.
[1066] So it's sending me a ton of them.
[1067] So how is it doing that if it's trying to remove them?
[1068] There has to be something that it knows that I'll engage with those so it will send me more of those.
[1069] Like I don't buy that this is just dumb luck.
[1070] Because these aren't accounts I even follow.
[1071] They're just showing up in my feed.
[1072] Well, I want to say I love Tom and Christina P. But, like, this is how much I used Instagram before I wrote about the Wall Street Journal's investigation.
[1073] I actually, I was like, where is the Reels function on this app?
[1074] So then I found it, and it shows you accounts, like you said, that you're not actually following.
[1075] If people are as technologically savvy as me, they have no idea what Reels actually does.
[1076] So it will show you just from random people.
[1077] So I think it's good that Meta is trying to clean up the platform and stop this.
[1078] But I guess I feel, in my position and being so privileged to be on your show, that I should draw awareness to this, because I think there's so much talk culturally about grooming, people talking about how they don't like groomers, and I agree.
[1079] I think a lot of education, I used to be very much in favor of sex education for kids.
[1080] I'm a little bit questioning that now because I see how that is also being used for ideological subversion.
[1081] But I think in terms of talking about grooming, we also need to actually take steps to stop it from happening more broadly instead of just getting outraged about it.
[1082] Because being outraged about it is not enough to help protect these kids.
[1083] Yeah, unless there's a wholesale investigation of like how these algorithms...
[1084] work, in the sense that if there is a guy, a grown man, who is actively looking for young girl videos, how is he getting those? How are they showing up in his feed? Are they showing up constantly? How do you not know that this guy is 60 years old and that he's getting videos of 10-year-old girls in their underwear? Like, you don't know this? Or do you not care?
[1085] Or is the algorithm set up just to maximize interaction only?
[1086] And it's amoral.
[1087] And if this person looks towards that, it says, yup, we've got a lot more of that for you.
[1088] So that means it knows that it has murder on its platform.
[1089] It knows that it has these horrific accidents and animal attacks.
[1090] And also overt sexuality of underage people.
[1091] It knows.
[1092] So if it knows, like, how are you not taking steps to mitigate that?
[1093] Like, just removing stuff doesn't seem like it's enough.
[1094] Because if you remove it, that means you can recognize it.
[1095] So how can you not recognize it when it gets posted?
[1096] Like, if you're putting a warning up that tells me not to watch this, well, that means you know it's there.
[1097] So what do you think it is?
[1098] Like, what does the computer think?
[1099] What does the algorithm think is happening here?
[1100] And why is it being allowed?
[1101] Is it only being allowed because it maximizes interaction?
[1102] And that's good for profitability?
[1103] Because that's what it seems like.
[1104] I would love to ask many of these questions if they would talk to me. I'll put it out there.
[1105] Yeah, that would be a very interesting conversation.
[1106] Next time I get the Zuck on, I'll ask him about those.
[1107] Because, you know, I don't even know.
[1108] How much could one person really even be aware of what's happening?
[1109] If you're the CEO of Instagram, like, how much time are they spending concentrating on the negative aspects of their algorithm?
[1110] Because it's obviously a real concern for parents, for society, for people that are worried about child predators, you know, for all these things.
[1111] Yeah, I mean, I imagine they have their hands full with who knows what, right?
[1112] They're busy people.
[1113] There are billions of users.
[1114] So many people complaining about tons of things.
[1115] I mean, they're essentially larger than any country.
[1116] Which is nuts.
[1117] It is.
[1118] Like, if you looked at Instagram as a country, if it was a group of, a collected group of people that all are in a thing, you know, not on a patch of dirt, but in an app, it's way bigger than any country.
[1119] Didn't we find out the other day it was like 2.6 billion or something like that, people on Instagram?
[1120] That's insane.
[1121] Yeah.
[1122] Something along.
[1123] Let's see what the number is.
[1124] But it's something in the neighborhood of roughly a third of the human population on the planet is on an app.
[1125] Posting pictures.
[1126] Posting selfies.
[1127] 2.35 billion users currently.
[1128] This number is projected to reach 2.5 billion by the end of 2023, which is right now.
[1129] 71% of Instagrammers in the United States are between 18 and 29.
[1130] That's interesting.
[1131] The average user spends 24 minutes on Instagram every day.
[1132] That's surprisingly low.
[1133] Instagram is a pretty even gender split, 50.7% male, 49.3% female, making up the total audience.
[1134] India has the most Instagram users accounting for 326 million Instagrammers.
[1135] Wow.
[1136] India is our fifth largest country that listens to this podcast.
[1137] Wow.
[1138] It's interesting, that statistic of 18-to-29-year-olds, because you're also seeing this uptick of mental health issues in that age demographic.
[1139] Yeah.
[1140] You know, my friend Sean was saying this, that he was looking at Instagram, Sean O'Malley, and he's not an anxious guy.
[1141] He's a UFC bantamweight champion.
[1142] He was saying that when he goes on Instagram, he goes, even if it's, like, nothing to do with me, I get like a low level of anxiety, just scrolling.
[1143] I'm like, absolutely.
[1144] What is that?
[1145] Part of me thinks it's like I know I have things to do.
[1146] Yeah.
[1147] Why am I doing that?
[1148] Why am I wasting my time?
[1149] But also there's like some weird voyeuristic aspect of just flipping through these reels and going through people's lives.
[1150] It's just fucking kind of creepy.
[1151] I wonder if it's also because, again, like evolutionarily, that's not something that we typically would have been able to do in the past, to voyeuristically look at someone's life and not have them in some way notice you're looking into their life or interact with you directly.
[1152] Yeah.
[1153] And then you could have a whole other conversation while you're looking at that with somebody else, and that person that you're looking at has no idea about it.
[1154] So when you look at what's happening today with young people and their experiences in dating, and obviously this is unique to this time because of the internet and because of social media.
[1155] Do you just think that this is just one of a new set of problems that this generation is going to have to deal with and this is just how it's always been throughout time?
[1156] Or do you think there's something we can do about it?
[1157] I think there are things that we can do about it.
[1158] So far, in terms of my research, I think the biggest solution is just to not give your kid a phone, which I know is so hard, or at least not till puberty, because at least let their brain develop, or let them get as far in development as possible, before we start introducing all of this stimulation that the brain is not prepared for.
[1159] There is some early research showing that there are changes in the brain associated with early social media use, which is concerning.
[1160] I think we need to have more research to definitively know at this point.
[1161] But I'm excited just to get to tinker away a little bit more and figure this out.
[1162] But I would say, you know, I know it's hard.
[1163] I imagine, like, I'm not a parent, but I imagine for parents it's really hard.
[1164] When you have kids and all the kids in...
[1165] your children's classes have phones and everyone's on there, and all their friends are talking to each other on there, how hard it can be.
[1166] But you can get them a flip phone or get them a phone that doesn't have access to the internet.
[1167] Listen, they're going to hate you if you do that.
[1168] Yeah, and they'll also, they'll probably find ways to get on the internet without their parents knowing as well.
[1169] I don't limit my kids in that way, but I do educate them on the draw, like the gravity of these things and how it can be a problem.
[1170] And I want to get it in their head as they're young.
[1171] But all their friends are doing all this, and I don't want them to be socially ostracized.
[1172] And I think it's navigatable.
[1173] I think you can navigate it.
[1174] But I think we're looking at it as...
[1175] people that didn't have it when we were young and see the problems with it now.
[1176] Whereas it's everywhere now.
[1177] And if you don't have it, I think being socially ostracized is just as much of a problem as being like deeply connected to these things.
[1178] And I think as long as you have open lines of communication with your kids.
[1179] Yeah, that's important.
[1180] That's important.
[1181] You just got to talk to them a lot.
[1182] Talk to them about everything.
[1183] Do you set time limits?
[1184] I think that's helpful, too.
[1185] Even for adults to check the little graphs that show you how much time you're spending on each app.
[1186] We take their phones at night.
[1187] We set time limits.
[1188] Good for their sleep.
[1189] Yeah.
[1190] But, you know, it's a new world.
[1191] And, you know, if you see problems, you try to mitigate those problems and try to communicate about it.
[1192] But...
[1193] This is the world that we live in, and it's a weird one.
[1194] And it's not like, when I was a kid, I never used my phone.
[1195] Like, I didn't have one.
[1196] What are you talking about?
[1197] Like, it's like, I don't know what it's like to be 13 with a phone.
[1198] Like, it's very strange.
[1199] I'm sure there are some benefits, too, like in terms of you have, not all of the information in the world and not necessarily accurate information, but you do have a lot more information available to you, whereas back in the day you'd have to go to the library and grab an encyclopedia and go digging.
[1200] Well, they're way more informed than I was when I was their age.
[1201] Way, way more informed.
[1202] And they're instantaneously informed.
[1203] They just talk to their phone.
[1204] Mm -hmm.
[1205] Like, they press the button and ask a question.
[1206] Yeah.
[1207] What's the population of Mogadishu?
[1208] Bam.
[1209] Like, what is this?
[1210] What is that?
[1211] What is, you know, what temperature is it in the Sahara Desert right now?
[1212] Like, just get, it's like instantaneous.
[1213] And that's just a clumsy, clunky way to do it before AI is in your brain.
[1214] Yeah.
[1215] They're not even going to have to ask in the future.
[1216] You just have the question.
[1217] Do you plan on having children someday?
[1218] You know, my first step is to find a husband.
[1219] That's always a good first step.
[1220] How's that working out?
[1221] I'm very busy with my career, but I'm taking applications.
[1222] I'm taking applications.
[1223] Do you use an app?
[1224] I've been on them.
[1225] I'm not on them currently, but...
[1226] What caused you to abandon them?
[1227] Just work, busy with work.
[1228] Yeah.
[1229] But, you know...
[1230] I never talk about my personal life, so I'm getting embarrassed now.
[1231] Well, you don't have to. I don't want to make you uncomfortable.
[1232] But I'm just curious, like, because you study this, and, you know, if you have children, let's say it's in two or three years from now, things are going to be even weirder.
[1233] Oh, yeah, yeah.
[1234] But for me also, I think about what it's going to be like with the things I say, in terms of how much those of us who criticize, say, gender ideology or racial politics or whatever it is in society are going to have to deal with people coming after you and your family.
[1235] And that's a very real consideration, I think, that many people I know have had to contend with, unfortunately.
[1236] Yeah.
[1237] So that's something else that I think about, you know.
[1238] Yeah.
[1239] It's, I mean, the thing about it is people are so polarized today.
[1240] And the discourse is so, it's so vicious.
[1241] It's like you're with us or you're against us.
[1242] You're the enemy or you're a comrade.
[1243] And it's just.
[1244] And there's no boundaries.
[1245] I think in the past there was a sense of.
[1246] We will go after this person, but there are certain things you just don't do.
[1247] But at this stage now, it's like, or I don't know if it's just that, it seems like there are so many more people who are more extreme.
[1248] I don't know if this is because of social media, pitting people against each other.
[1249] And you really think if someone disagrees with you on this one issue that they must really be Satan and you have to, you know, go to every length possible to shut them down.
[1250] But that's what I suspected.
[1251] Or it's a lot easier for people to get at you also because of, it's much easier to get in contact with someone nowadays.
[1252] Yeah, I think it's both of those things, but I think that social media most certainly because of the fact you don't actually interact with the person, it makes it much easier to be shitty with them.
[1253] Yeah.
[1254] And that becomes a normal part of human discourse.
[1255] The way we communicate with each other is shittier than ever before.
[1256] I like your optimism, though.
[1257] I think that that's a good way to view life and just say, you know, things are constantly evolving and changing.
[1258] We just have to figure out how to evolve with it.
[1259] Yeah, you know the expression.
[1260] The kids are all right.
[1261] They're kind of, you know, look, it's way better than living in 1900.
[1262] You know, it's way better.
[1263] It's way better than living where it's hard to get food.
[1264] It's way better than living in a place that didn't have penicillin.
[1265] It's way better when you have access to information.
[1266] You can be informed.
[1267] Sure, you can be ideologically captured.
[1268] But you also can be informed.
[1269] And there's a lot of people that I'm seeing now that used to be ideologically captured that have escaped.
[1270] And that the overwhelming amount of information that's been given, particularly over COVID, the COVID thing woke a lot of people up.
[1271] There was a lot of people that I was friends with that had this sort of wholesale respect for the medical establishment.
[1272] And they never questioned anything.
[1273] And now they're like, I don't fucking believe anything anymore.
[1274] And then the more they've either listened to podcasts with people that have litigated medical cases involving pharmaceutical drugs' adverse side effects and how they've tried to cover them up and hide them,
[1275] and then the understanding of how they conduct their studies and how they don't have to show studies that have negative side effects and how they can skew these studies in a very distorted and very deceptive way to show that there's efficacy.
[1276] That's very shocking to people when they find out that, oh my God, The people that sell you medicine, they're not looking out for your best interest.
[1277] They just want money.
[1278] They just want money.
[1279] Number one is money.
[1280] Number two is, how long can we sell it for?
[1281] Does it work?
[1282] Number three is like, does it help people?
[1283] Like, is this thing actually going to work and help people?
[1284] That's like least concern on the food chain.
[1285] Number one is money.
[1286] Because that's what they do.
[1287] And they have an obligation to their shareholders.
[1288] They have to make the most money possible.
[1289] And every year they try to make more and more money.
[1290] You know, we were talking the other day about people that come into this country. For the longest time, you had to be vaccinated to come into this country.
[1291] Now I think you can get in without a vaccine.
[1292] But if you want to become a citizen of this country, you have to be vaccinated.
[1293] Even with a vaccine that doesn't fucking work.
[1294] Which is absolutely wild.
[1295] Like if you want to be a citizen of this country, you have to abide by their bullshit rules that are captured by this massive industry that says, look, if we require people that want to become citizens to get vaccinated, we're going to sell more vaccines.
[1296] And it's that simple.
[1297] It's not like we're trying to protect everyone.
[1298] Well, that's bullshit.
[1299] Because if you look at the statistics, particularly with the COVID vaccine, the more you get vaccinated, the more likely you are to get COVID, the more likely you are to get hospitalized.
[1300] Like, it's not good.
[1301] And yet they still require it.
[1302] Have to get it.
[1303] Can I ask you what it was like at the peak of the craziness when they were coming after you?
[1304] Pretty much every single news network was coming after you.
[1305] It was fascinating.
[1306] You turn on the TV and they're calling you all these names.
[1307] What is that like?
[1308] Very eye -opening because they were all doing it in lockstep and you could clearly tell there was an agenda.
[1309] And that it wasn't based on reality.
[1310] Like, I wasn't taking horse medicine.
[1311] They knew I wasn't taking horse medicine.
[1312] They fucking knew it.
[1313] And they were saying it on CNN.
[1314] But they sacrificed their own credibility in doing so.
[1315] And luckily, look, it was obviously, one of the things that CNN did that was so stupid is they had very unlikable people that were talking.
[1316] Like you have Brian Stelter and Don Lemon, two of the most unlikable fucking human beings that have ever been on television.
[1317] And they're the ones saying these things.
[1318] So it did the opposite of work.
[1319] I got two million subscribers in a month because of all that shit.
[1320] And they would say these terrible things about me. And then people would listen to the podcast go, oh, he's just curious.
[1321] He's just asking questions and talking to people.
[1322] But they didn't want curiosity.
[1323] The really fascinating thing to me was how quickly I got better and that they didn't want to concentrate on that at all.
[1324] They just want to concentrate on me being a conspiracy theorist who took veterinary medicine.
[1325] Veterinary medicine that's been prescribed literally billions of times for human beings.
[1326] It's on the World Health Organization's list of essential medicines.
[1327] I mean, it's so wild.
[1328] It's just wild.
[1329] It's wild.
[1330] It was wild to watch it happen.
[1331] But what I had on my side was that my podcast was way bigger than them.
[1332] I don't think they knew that at the time.
[1333] I think when CNN was going after me, they have this delusional perspective of their reach.
[1334] And they thought that they were bigger than me. But my show was, I have 10 times as many people as the best show on CNN.
[1335] And now it's way worse.
[1336] Like now they're getting the worst ratings that they've had since 1991.
[1337] On some of their shows, they have 40,000 people watching.
[1338] And they're in fucking airports.
[1339] They're everywhere.
[1340] 40,000.
[1341] Wow.
[1342] That's crazy.
[1343] I can get 40,000 people watching an Instagram reel in 10 minutes.
[1344] No joke.
[1345] Yeah.
[1346] They're a sucky organization.
[1347] They suck at telling the truth.
[1348] They suck at the news.
[1349] They suck at not being whores.
[1350] They're whores.
[1351] They're whores for the pharmaceutical drug companies.
[1352] And then, you know, they all got fired, which is even funnier.
[1353] They're all gone now.
[1354] You know, they just fucking blow off in the wind somewhere.
[1355] You're never going to hear from Don Lemon again.
[1356] Nobody gives a fuck about that guy.
[1357] Nobody gives a fuck about Brian Stelter either.
[1358] No one cares.
[1359] So it's like, what you did was you sacrificed your own credibility, hoping that was going to take out the competition and did the opposite.
[1360] Well, I thank you for speaking up, and I know I'm not the only one that feels that way.
[1361] Well, I didn't plan on it.
[1362] I just, the only reason why I did is, is I was curious.
[1363] So I was talking to this guy, Robert Malone, and he was telling me, like, I own nine patents on the creation of mRNA vaccine technologies.
[1364] I was at the beginning of this, and I can tell you what's wrong with it.
[1365] I can tell you what's going on, and, you know, we should tell people.
[1366] And I was like, okay, so I have him on, and then I'm just getting fucking attacked.
[1367] Right.
[1368] And attacked by people like Neil Young.
[1369] And the most disturbing thing was watching Neil Young talk on Howard Stern about it because I'm like, oh my God, you don't know jack shit about this stuff.
[1370] And you're taking this stand against misinformation.
[1371] You don't even have any information.
[1372] You literally don't know a goddamn thing about it.
[1373] And it probably had something more to do with the people who own his catalog, and they probably also have interest in the pharmaceutical drug companies.
[1374] There's probably all sorts of conversations about how to stop this, how to stop this just objective truth from being disseminated.
[1375] Or even just asking questions and having conversations and having those physicians on your podcast.
[1376] Yeah.
[1377] I would say the episode you did with RFK Jr. also was very eye -opening.
[1378] So I really recommend people listen to it because it's just scary the way that they've portrayed him in the mainstream and you listen to him and he's so reasonable.
[1379] He's very reasonable.
[1380] And he's also, he spent so much of his time as an environmental attorney cleaning up rivers and, you know, and holding these corporations accountable for polluting.
[1381] And he was very successful doing that.
[1382] That was the majority of his life.
[1383] It wasn't until these women came to him and said, you're spending all this time talking about mercury poisoning in lakes and rivers.
[1384] Look into this.
[1385] Look into this.
[1386] Look into the poisoning and the vaccines.
[1387] Look into these side effects.
[1388] Look into the fact that they're immune to prosecution.
[1389] Look into the fact that they don't have to pay money.
[1390] Look into the fact.
[1391] Look into this immunity that they developed in the 1990s.
[1392] And look at how many more vaccines have been given to people now.
[1393] And then there might be a connection with money.
[1394] And so then he gets into it.
[1395] And then immediately he's labeled a conspiracy theorist.
[1396] And it's just like, oh.
[1397] But these are the types of questions that scientists are supposed to be asking.
[1398] Yeah.
[1399] And that journalists are supposed to be asking.
[1400] Well, when you allow pharmaceutical drug companies to advertise on television, which, we are one of two countries, really, we're one of two countries in the world that do that.
[1401] You're a cultural institution, like he said on the show, basically.
[1402] Yeah, nothing I can do about that.
[1403] But, you know, those drug companies, they pay so much money to advertise.
[1404] And they don't want to cut that money train off.
[1405] And so they'll bullshit and lie.
[1406] But in doing so, in taking that money, they've killed their future because no one's going to trust them.
[1407] Like they have less people engaging with mainstream media than ever before.
[1408] than ever before, especially in terms of the news.
[1409] It's crazy.
[1410] It's crazy how much they've undermined themselves just by being whores.
[1411] I was hoping for a long time also that media would eventually come around in the same way that I was hoping academia would eventually come around.
[1412] But you see, I don't think that that's going to happen and you see alternatives being built.
[1413] So I think that's a positive thing.
[1414] I do think that there's just going to be a new ecosystem going forward.
[1415] I think that old system is very much coming to an end or it's going to continue to bifurcate.
[1416] Yeah, I think it's coming to an end because I think first of all, it has to, just based on the limitations of the platform, if you can never get into depth about a subject.
[1417] Like you and I have been talking here for a couple hours now, or whatever it's been.
[1418] When you sit down with someone, you talk for long periods of time, then you can get these in -depth discussions and you could go over the various aspects of something.
[1419] It's impossible in five minutes.
[1420] You're doing five minutes and you're cutting to a commercial.
[1421] Then you're showing a commercial where a bunch of people are dancing in the field because they took antidepressants.
[1422] And then you cut right back to a new thing, a new drama, a new dilemma, a new issue.
[1423] And you have this biased take on this new thing.
[1424] And nobody trusts you anymore.
[1425] I mean, it's just...
[1426] And it's also on a specific time.
[1427] You have to tune in at 8 o'clock.
[1428] Like, what if I'm busy?
[1429] Like, everything else is available streaming.
[1430] Everything else is available anytime you want.
[1431] Everything else you can pause.
[1432] It's just a dumb way to communicate.
[1433] And it was the only way that was available in 1990.
[1434] Well, it's not 1990 anymore.
[1435] And you're still doing it like 1990.
[1436] It's like if you were still trying to have the fastest horse-driven wagon to get across the country, like, why are you doing that?
[1437] You don't have to do that anymore.
[1438] It's literally what it is.
[1439] Masochists will.
[1440] Well, it's just they are, they're trying to survive.
[1441] And this is their business.
[1442] And there's a lot of people involved in that business.
[1443] And the business sucks.
[1444] It's a sucky business.
[1445] Just no one would start that business today.
[1446] No one would say, you know what I want to do?
[1447] I want to do the news, but only get like real shallow with stuff and only have whatever the government or the regime -supported narrative is.
[1448] I just want to say that.
[1449] And then I want to get all the money from drug companies and whatever else is like selling toxic shit.
[1450] Like they have a lot of money.
[1451] And then we'll use that money.
[1452] And that's what we'll do.
[1453] Like no one's going to do that now.
[1454] Like, mainstream media now, when it comes to the news, is independent media.
[1455] Yeah.
[1456] If you look at the YouTube shows like Breaking Points, they get way more views than the average show that's on MSNBC or CNN.
[1457] Like, no one cares about that anymore.
[1458] But I would say the one good thing about legacy media before was that it would force people to some extent to be exposed to views that they don't necessarily agree with or that they wouldn't necessarily be...
[1459] exposed to. Like, with your show, because it is so big, I think it does reach everyone. But I would say there's a risk that people might be siphoned off even more into echo chambers, because they will only consume the media that they agree with, or the commentary that they like, as opposed to...
[1460] If you open, say, a mainstream newspaper back in the day, you go to the opinion section, like I'm an opinion writer, you'll read columns maybe that you agree with, but you'll also read columns that you really disagree with, and you might hate that columnist.
[1461] But every week, or every day, you're going to be forced to come across that.
[1462] And I think that helps people.
[1463] humanize opinions that they dislike because they'll say, well, there are people out there who have this view even if I really disagree with it and maybe there are parts of this argument that I can to some extent see the validity of even if I don't agree with them.
[1464] I can understand a little bit more where they're coming from.
[1465] But now it seems like if we just follow the particular people that we like, do we really necessarily?
[1466] Because many people will only have people they agree with on their show.
[1467] Not everyone.
[1468] Like I'd say many of my colleagues are really good about this.
[1469] They'll have people on that they disagree with.
[1470] Then you have the problem of the guest who doesn't want to come on.
[1471] Like for me, when I was podcasting with my show, it's currently on a hiatus, but we will be coming back to it.
[1472] I would reach out to people who disagree with me and they don't want to talk to me. Yeah, they don't want to platform you, which is hilarious.
[1473] Yeah, they lose points by talking to someone.
[1474] I think you're making a good point about what used to be legacy media in terms of newspapers, the opinion pieces.
[1475] But, you know, obviously, we're also talking about something where you really get to go into depth.
[1476] Like an opinion piece in the New York Times, you know, you have thousands of words.
[1477] Yeah.
[1478] so much of an opportunity to express yourself uninterrupted and unchecked.
[1479] Whereas what I was talking about is mainstream news.
[1480] And mainstream news is this very surface, what's going to freak you out and keep you tuned in as much as possible?
[1481] And, you know, much to their dismay, the most effective product they had was Donald Trump.
[1482] The hate for Donald Trump was a ratings boom for CNN.
[1483] They probably made millions and millions of dollars just because they covered Trump.
[1484] Yeah, the hate watching.
[1485] I never understood hate watching, so I'm like, do people not have better things to do?
[1486] Yeah.
[1487] Some people like hate watching.
[1488] And they don't have better things to do.
[1489] But it's also like the same way that your algorithm, you know, encourages you to if you're...
[1490] Be outraged.
[1491] Yeah.
[1492] It's just whatever gets you engaged.
[1493] And for them, like, Donald Trump is business.
[1494] It's good business.
[1495] It's getting people enraged.
[1496] And they thought foolishly that by showing him over and over and over again being stupid or whatever they thought he was doing...
[1497] that they were going to diminish his chances of being president.
[1498] They just made him more and more popular.
[1499] And they also didn't realize how many people hate them.
[1500] They didn't realize how many people hate them for lying.
[1501] So if you're talking bad about someone and they don't like you, you're like, but I don't like you.
[1502] And you're talking bad about him.
[1503] I think I like him now.
[1504] Well, because people can then go and fact check, do their own fact checking, not just rely on the so -called fact -checkers.
[1505] A lot of people don't.
[1506] A lot of people don't.
[1507] I had a guy come up to me in Vegas.
[1508] Like, why are you taking horse medication?
[1509] Shut the fuck up, dude.
[1510] You don't even know.
[1511] You are taking the time to talk to someone without even taking the time to look into it.
[1512] But yet you seem morally righteous in coming up to me and saying, why didn't you just get vaccinated?
[1513] Why didn't you take horse medication?
[1514] Well, I mean, one of the things that I'm sad about, but it was also kind of hilarious, is how many people were promoters of the vaccine and died suddenly.
[1515] It's crazy how many fucking young people just died in their sleep after they took it.
[1516] And everybody's like, nothing to see here.
[1517] Sudden adult death syndrome.
[1518] Yeah.
[1519] Just died suddenly.
[1520] You ever go to the died suddenly Instagram page?
[1521] Like, holy shit.
[1522] There's so many.
[1523] And so many people, like, talking about people who are, you know, anti-Darwin, anti-vaxxers, and then you're dead.
[1524] Sorry.
[1525] You bought into the wrong bullshit.
[1526] But that's...
[1527] You know, if you really want to get cruel, that's Darwinism.
[1528] Do you not know they lie by now?
[1529] Do you not, are you not aware of the opioid crisis?
[1530] You're not aware of Vioxx?
[1531] You're not aware of the, like, 25% of all FDA-approved drugs that get pulled?
[1532] It's one out of four.
[1533] And you're like, really, you're an anti-vaxxer, what are you, a conspiracy theorist?
[1534] You fool, Darwin's going to do its work with you.
[1535] You're modifying your genes, you fucking idiot.
[1536] Like, what are you doing?
[1537] What are you doing?
[1538] You're just going to trust Pfizer?
[1539] Well, they do support Anderson Cooper, brought to you by Pfizer.
[1540] And you're like, oh, this must be legit.
[1541] I'm just going to clip this and show this to people from this point on.
[1542] Yeah, clip it.
[1543] I mean, it's very eye -opening.
[1544] It's very eye -opening.
[1545] And one of the things that gives me hope.
[1546] is that if we do get to a point where we have full access to all information instantaneously in our minds, if the bottleneck is propaganda, if that can no longer exist in that realm...
[1547] then we'll be way better off.
[1548] If you're only dealing with absolute facts because AI is in charge of disseminating information at the point where they realize, like, this is the bias.
[1549] This is why this study's ineffective.
[1550] This is why this study is actually deceptive.
[1551] That would be helpful.
[1552] Here's how they profited off of that study being deceptive.
[1553] It was responsible for...
[1554] an increase in their profit margin by X amount, and this is why they did it.
[1555] These people went to jail.
[1556] These people were being, they left the FDA and immediately went to work for Moderna.
[1557] This is how much money they made.
[1558] This is why they made these decisions.
[1559] And then the ability to read minds.
[1560] All those together, like we might be on like the last spaceship that's like shooting away as the earth explodes, you know, that we just make it just outside of the blast radius.
[1561] Yeah, that's the hope.
[1562] That's the hope.
[1563] The hope is that, I mean, and that might also be the thing that influences people to give in to AI and to give in to having something integrated into your biology is that it's the only way for us to escape bullshit.
[1564] Human created bullshit.
[1565] And when you say trust the science, like I do trust science.
[1566] The science is not the problem.
[1567] It's human bullshit that represents the science.
[1568] It's human bullshit that's using propaganda, and they're using biased studies, and they're lying to you.
[1569] It's human.
[1570] It's not science. It's the ideology and the ideological subversion.
[1571] That's not science.
[1572] That's humans.
[1573] It's humans and their bullshit.
[1574] Right.
[1575] And if you just get pure data and pure information, you could talk about it objectively, everyone should have that.
[1576] That would be wonderful.
[1577] But that's not what you have in this day and age with media -sponsored narratives.
[1578] When you have pharmaceutical drug companies responsible for a large portion of the commercial budgets...
[1579] for these programs, and then no one says anything about the 40% increase in all-cause mortality that mysteriously arose after they made people get shot up with some experimental shit.
[1580] Like, that's science.
[1581] How come we're not talking about that?
[1582] Yeah.
[1583] How come you don't trust that science?
[1584] Why is that data such a problem?
[1585] Because that data indicates that a lot of fucking people are wrong.
[1586] And a lot of people were giving really bad advice.
[1587] And a lot of people ignored all the warning signs because they didn't want to be an anti -vaxxer.
[1588] Yeah, the problem is not ever the truth.
[1589] The problem is what people do with it potentially.
[1590] But I think the way to approach that is to be honest about what the truth is, and then to just say, this is what it means and what it doesn't mean, or this is how it should not be used.
[1591] And so science is just basically a tool to get to the truth.
[1592] But when people start...
[1593] It really bothers me the way that, like you said, trust the science.
[1594] It's like the science when it fits a particular narrative versus the science as it actually is.
[1595] So I don't even like to say that I'm like pro -science or that I follow science because that has such a negative connotation nowadays, which is crazy.
[1596] Because if you can't even say that, then it's like saying, I'm in favor of truth and truth means nothing now.
[1597] It's really weird.
[1598] It's a weird time.
[1599] But there's enough people that are like actual objective scientists out there.
[1600] It's just they're captured by these institutions that they find themselves working for.
[1601] You know, and that's what gets really crazy.
[1602] And that's where Yuri Bezmenov was so brilliant in explaining what's going to happen.
[1603] But who the fuck would have thought it was going to happen with science?
[1604] Who would have ever thought it was going to happen the way it's laid out?
[1605] Like, we are our own worst enemy in that regard.
[1606] And this is also a real problem with people...
[1607] being captured by an idea, espousing that idea, talking to people about it, trying to get people to go along with it, and then realizing it was wrong.
[1608] And the long road it takes to accept that and to admit it.
[1609] And then to be open about it.
[1610] And a lot of doctors, to their credit, have done that.
[1611] And, you know, I'm sure it was very painful.
[1612] Like, you know, kind of like coming out of the closet or something.
[1613] Like peel that Band -Aid away and go, you know what?
[1614] I was wrong.
[1615] Like, this is not good.
[1616] This is dangerous.
[1617] I'm seeing blood clots in people that aren't normal.
[1618] Well, they risk losing their licensure.
[1619] Yes.
[1620] Yes.
[1621] And that's one of the more interesting things about Robert Kennedy Jr.'s work, particularly his book The Real Anthony Fauci.
[1622] If you find out how the system actually works and you read that book and how it's been working like that since the 1980s, it's fucking horrifying.
[1623] It's horrifying.
[1624] It's really spooky.
[1625] You know, and I always say, if that book's not true,
[1626] why isn't he getting sued?
[1627] Why isn't it getting sued?
[1628] Because it is true.
[1629] Like, you could go look it up.
[1630] I've looked up many of the things that are in that book.
[1631] And you go, oh, my God, this is real.
[1632] The testing of vaccines on foster kids for AIDS that killed them in New York.
[1633] It's like, that's real.
[1634] They really did that.
[1635] They took these lost kids that didn't have parents.
[1636] They just fucking injecting them with experimental drugs.
[1637] I thought it was interesting when he was talking about good mercury versus bad mercury.
[1638] Oh, my God.
[1639] Isn't that crazy?
[1640] Yeah.
[1641] When you find out that the good mercury actually penetrates the blood -brain barrier quicker and fucking accumulates in your brain?
[1642] Like, what?
[1643] Oh, boy.
[1644] Yeah, there's so many lies because these lies are convenient for profits.
[1645] And that's scary stuff.
[1646] It really is.
[1647] I mean, but that's with everything in our culture.
[1648] I think the only thing that's saving us is honest discussions.
[1649] It's the only thing that's saving us right now.
[1650] And I think that's the thing that they never saw coming, which is really interesting.
[1651] Or that people have the guts to keep going, right?
[1652] Yeah.
[1653] I think many people who are trying to suppress this information or suppress, suppress information more broadly, just think that they can silence the people who are against them.
[1654] Yeah.
[1655] But there are very stubborn people out there.
[1656] Well, also those people wind up getting fired too and they wind up getting cast out too.
[1657] And then they wind up like going, oh, what did I do this for?
[1658] Like we were talking about with CNN, like those anchors, they're fucking homeless now.
[1659] Like who knows what they're doing?
[1660] I mean, I'm sure they're not homeless, but they're jobless.
[1661] Like what are they doing now?
[1662] Like what did you do?
[1663] And look what happened.
[1664] Look where it got you?
[1665] Got you nowhere.
[1666] Like maybe enough people realize you can't have a career in bullshitting people anymore.
[1667] That's not going to work.
[1668] There's too many people that call bullshit.
[1669] There's too many people.
[1670] And then other people will check, well, who's right?
[1671] And then when they go over it, they go, oh, that guy that you call bullshit on.
[1672] Like, he's fucking accurate.
[1673] You're bullshit.
[1674] You're the bullshit party.
[1675] And, oh, look, you're sponsored by all these companies.
[1676] Oh, well, there it is.
[1677] Seems pretty obvious now.
[1678] And that's the thing.
[1679] That's why.
[1680] Nobody wants to listen to mainstream news anymore.
[1681] And, you know, you're talking about like opinion pieces in the New York Times and things along those lines.
[1682] Like, yeah, those used to be really important, but those are ideologically captured as well now.
[1683] There's less and less of those that I trust now.
[1684] When you do see one of those reasonable voices in a mainstream publication, it's like shocking.
[1685] I know.
[1686] Like, how long does this guy have to survive?
[1687] How did this happen?
[1688] How long is she going to be working for there?
[1689] Yeah, it's not good.
[1690] But also, it's kind of interesting because I feel like with ideas as well, almost everything has to have something that's fighting against it.
[1691] The push and pull.
[1692] Yeah.
[1693] And that push and pull, whether we like it or not, we want utopia, but it doesn't exist.
[1694] That push and pull is imperative for growth and change.
[1695] And I think that push and pull that we're experiencing with the death of the relevance of mainstream media, at least in terms of television news, is beneficial to the rise of independent media.
[1696] Because in independent journalism, there are Matt Taibbis, you know, and Shellenbergers, and there's just people out there, Bari Weiss, that are really trying to tell the truth.
[1698] And they do exist.
[1699] And you can find them when you've had enough.
[1700] What do you think is going to be the eventual outcome of all those COVID narratives?
[1701] Unfortunately, I don't think enough people are going to go to jail.
[1702] Unfortunately, I don't think enough people are going to be held financially responsible.
[1703] And it'll all be lost if something else happens and they do it again.
[1704] If something more contagious, more deadly comes up, and they get to lock down again, harder this time, and then this time enforce vaccine passports, and then connect them to a social credit score, and then get people on centralized digital currency.
[1706] Then you have complete ultimate control over narratives.
[1707] Because you'll be able to cut people's money off.
[1708] You'll be able to limit their travel, limit their ability to work.
[1709] That's a real problem because there's a lot of dummies out there that don't realize the danger of this.
[1710] And they'll think that, you know, anybody talking about, what are you, a conspiracy theorist?
[1711] And they'll get caught up in their stupid mainstream narratives because a lot of people, they're not paying attention.
[1712] to independent journalism and accurate information.
[1713] And they're not sitting around.
[1714] And we were told, like, during the pandemic, don't do your own research.
[1715] Yeah.
[1716] Like, what the fuck are you talking about?
[1717] How about my own research and corruption of pharmaceutical drug companies?
[1718] Are we allowed to do that?
[1719] Are we allowed to do that about the past?
[1720] The people that have been responsible for the biggest criminal fines in medical history?
[1721] Is that okay?
[1722] What the fuck are you saying?
[1723] Yeah.
[1724] But the ability to say that, just what I just said, is so critical for people to understand what's really going on.
[1725] And if you don't have anybody saying that, then we're really in trouble.
[1726] Because if everybody's Brian Stelter, we're fucked.
[1727] We're fucked.
[1728] If that's the only way you're ever getting information, there's no Wikipedia.
[1729] Not even Wikipedia.
[1730] That's biased, too.
[1731] But there's no independent journalism.
[1732] There's no Substack.
[1733] There's no YouTube independent journalist videos where they're going over case by case, step by step, all the problems and all of the corruption that led us to this position.
[1734] If you don't have those people and all you have is these mainstream propagandists, we're fucked, but that's not the case right now.
[1735] So that gives me hope is that these conversations are happening and people are paying attention.
[1736] And look at how many people are taking this updated COVID shot.
[1737] Fucking nobody.
[1738] Nobody wants that shit because they realize this doesn't work.
[1739] It's dangerous.
[1740] Everyone knows someone that had something go wrong.
[1741] Everyone does.
[1742] And we don't even know what the real numbers are.
[1743] The VAERS system, like, what does it get?
[1744] Like 1, 2% of the actual adverse events that are reported?
[1745] Who fucking knows how many people?
[1746] I have two friends that have pacemakers.
[1747] One guy's in his 40s.
[1748] He's almost 50 now.
[1749] And one guy's in his 30s.
[1750] 30s.
[1751] 30s.
[1752] Got vaccinated.
[1753] All of a sudden, heart stopped beating for like nine seconds at a time.
[1754] He would just black out and fall down.
[1755] Goes to a doctor.
[1756] The doctor says, you're going to need to get a pacemaker, at least for now.
[1757] Oh.
[1758] Yeah.
[1759] He's a dentist.
[1760] He's a very smart doctor.
[1761] He was like very confused by all this.
[1762] It's like, I thought I was following the rules.
[1763] I thought I was following the science.
[1764] Yeah, it's wild.
[1765] I'm sure you've seen that video of that girl, Heather McDonald, she's talking on stage about being vaccinated and then she fucking blacks out and cracks her skull.
[1766] Oh, God.
[1767] Tell me the universe isn't trying to send a message through that.
[1768] I mean, how is it possible that at that moment, after talking about being vaccinated and bragging about it, that that's when you black out on stage?
[1769] How many times you blacked out on stage?
[1770] You know how many times that girl's been on stage?
[1771] She's been a stand -up comedian for decades.
[1772] She's been doing thousands of shows.
[1773] How many times has she talked about being vaccinated?
[1774] Probably not that often.
[1775] How many times has she done it on video?
[1776] Probably not that often.
[1777] And the one time she does it, she blacks out right after she says it and bounces her head off the ground.
[1778] Cracked her skull.
[1779] But then she was on Dr. Drew, and Dr. Drew was talking to her about how this seems to happen when people get boosted.
[1780] There's some sort of an effect that happens with people.
[1781] I love Dr. Drew.
[1782] He's out there now.
[1783] I mean, he used to be a lot more mainstream.
[1784] I think he's kind of woken up to all this shit, too.
[1785] What's wild is when people do the compilations of people posting and saying, like, you know, I got this, and everyone who won't, I hope you die, and then it turns out that they passed away after.
[1786] I mean, it's tragic.
[1787] It's sad, but it's also really scary.
[1788] It's very scary.
[1789] But it's also they're scared.
[1790] That's why they post those things.
[1791] You know, they want to believe they made the right choice.
[1792] And I'm sure they've heard all the people that say that's not the right choice.
[1793] And they want to fight against that.
[1794] No, we have made the right choice.
[1795] All you people are going to die and I'm not going to cry at all.
[1796] And then you're dead.
[1797] Whoops.
[1798] Whoops.
[1799] Put all your fucking eggs in the wrong basket.
[1800] Yeah.
[1801] And you suppressed people that are asking questions, which is crazy.
[1802] Especially when these people are asking questions of people that are, you know, like legitimate scientists.
[1803] Like, I want to know.
[1804] Like, how did you come to these conclusions?
[1805] Like, Peter McCullough.
[1806] Like, that guy is the most published doctor in human history in his field of study.
[1807] He's not a moron.
[1808] No. He's very smart.
[1809] Super well -respected doctor.
[1810] And he was talking about the danger of this thing.
[1811] All of a sudden, this guy is removed from his position at the university.
[1812] He's getting sued.
[1813] He's getting, like, disparaged everywhere.
[1814] I mean, it takes people like that.
[1815] It takes courageous people to stand up and just fucking take a chance and say, hey, this isn't right.
[1816] Like, this is not correct.
[1817] It's not true.
[1818] At great cost, personally.
[1819] I think also when people see that someone is willing to take a stand and to say what they think, no matter what the cost is, that inspires other people to do the same thing.
[1820] I think it does, too.
[1821] Yeah.
[1822] It definitely inspires.
[1823] It lets people know that there's another way.
[1824] Yeah.
[1825] Because everybody's just stepping in line.
[1826] It's like...
[1827] Yeah.
[1828] I'm optimistic, but I'm also realistic.
[1829] This could all go sideways.
[1830] It could all go sideways.
[1831] And we could be one of the lost generations.
[1832] And it could take decades before some new generation rises up and pulls us out of this, you know?
[1833] I mean, if you were born in Germany in the 1940s, like, how would you, what do you, you know, it's not your fault.
[1834] You're in the middle of fucking Holocaust.
[1835] Like, what happened?
[1836] You're fucked.
[1837] You're in the wrong time line.
[1838] You just got...
[1839] And we could be in the wrong timeline.
[1840] Like there could be some terrible things that happen in this country, where you do get the centralized digital currency, and you do get the vaccine passport, and you do get a complete capture of the population, because you have the ability to shut people's funds off, and most people are just going to adhere.
[1841] They're just going to comply.
[1842] And they don't know what to do and they're going to complain.
[1843] They're going to bitch and moan.
[1844] And, you know, they'll be captured by these massive institutions that only care about extracting profit.
[1845] Well, what I would say is just for people to never feel bad about the way they feel or for being skeptical.
[1846] Because I think that's the biggest thing.
[1847] From what I heard from my audience is that many people felt like they weren't allowed to have their own thoughts or have their own opinions.
[1848] Like they were afraid of that or they were being shamed for that.
[1849] Yes.
[1850] Well, one of my friends was talking to me about this, and he's like, it was in my mind that you have to be an idiot to not take the vaccine.
[1851] The vaccine is going to work.
[1852] They're saying it's going to work.
[1853] And that's the only thing that's going to get us out of this.
[1854] And if these people aren't taking it, they're going to fuck this up for everybody else.
[1855] So there was that narrative.
[1856] And my friend was talking about it.
[1857] He was like, that was me. I really thought that everybody who wasn't getting vaccinated was an idiot.
[1858] And that this is our way out of this.
[1859] And it wasn't.
[1860] But now he has a completely different perspective.
[1861] Now he's like, oh, shit, they're all liars.
[1862] Like, this is like, and then when you read RFK's work on it and you talk, when he talks about the actual studies that they conducted and what they actually showed, and they never did a study on whether or not it stopped transmission.
[1863] They never, they had no idea.
[1864] It was just whether or not it made the antibodies.
[1865] Wild.
[1866] But that is going to be the case, I think, with everything that we deal with that involves money, whether it's climate change, whether it's gender affirmation surgery, whether it's whatever social issue.
[1867] Diversity, equity, and inclusion.
[1868] It's just a money grab.
[1869] It's all a money grab, you know?
[1870] Yeah, but we'll hope for the best.
[1871] Yeah.
[1872] Well, maybe this is what I do.
[1873] I tell the truth, and truth is a money grab.
[1874] Kind of.
[1875] I can't tell if you're joking.
[1876] No, I'm not joking.
[1877] But I'm telling the truth because I tell the truth, but it's also a good way to make money.
[1878] Well, because that's the thing that's going to stand in the end.
[1879] You're not going to have to turn around and realize that everything you said was a sham.
[1880] Right.
[1881] I'm not doing it to make money.
[1882] Well, I definitely am doing it to make money, but I didn't do it specifically because I thought, I know what I'll do.
[1883] I'll be a truth teller.
[1884] No, it seems like the thing to do.
[1885] It just happens to be profitable.
[1886] Well, because there are so few people doing it nowadays.
[1887] Which is wild.
[1888] How wild is that?
[1889] Well, you're doing it.
[1890] You know?
[1891] Back to your work about...
[1892] dating and all this jazz, one of the things that someone brought up to me, that I was really thinking about the other day, was how much everything changed when women entered the workforce and then when women got birth control. In terms of human history, those are two of the biggest changes when it came to dating, because now women had their own money, and women also didn't get pregnant every time they had sex.
[1893] Yeah.
[1894] So they had this ability to move on and not have a relationship with someone that they had sex with and not be connected to them forever because they have a family with them.
[1895] Yeah, yeah, this is one area I'm really interested in.
[1896] And I do think that that technology has been, I think a net positive.
[1897] I don't think we should want to take it away or take away women's right or ability to plan in terms of their, when they want to have a family.
[1898] If they want to have a family, you know, if they necessarily want to have, what am I trying to say here?
[1899] Yeah, I think just the ability to control their sexuality, right?
[1900] To not have their sexuality necessarily linked to reproduction.
[1901] Right.
[1902] I think that's a powerful thing.
[1903] I'm sure many people disagree with me that we should separate those things.
[1904] But otherwise, the alternative is to say that people should not be sexually active until they're ready to have children, which I don't think is realistic for men or women.
[1905] Right.
[1906] So there definitely has been a lot of change in society, because, like you said, before, it was that women were seeking equal opportunities to men in terms of education and employment.
[1907] And now women are, for the most part, at least in the West, outperforming men in many cases, especially when it comes to education.
[1908] You know, they're graduating high school and university at higher rates than young men are.
[1909] And I think the projection is that women are going to be graduating from college at a ratio of two to one to men in the coming years.
[1910] That's amazing.
[1911] So, of course, I'm not saying I think women should go back to being in the kitchen or that women aren't good at science and math, and people accuse me of these things.
[1912] But I think it's worth talking about.
[1913] And I think we do need to take into consideration helping men who may not feel like they're doing as well in society or think about real solutions as to what we can do to help them.
[1914] Because if you are a woman, you're straight, you're going to want to date and hopefully settle down with someone.
[1915] And if you can't find a partner, that's not a good thing either.
[1916] Well, there's also a difference between the way a woman perceives and what a woman wants from the world versus a man. And that's why you don't see male fuck robots.
[1917] Well, I was going to say that's also why you don't see male house husbands that much, although that's like a trend on social media.
[1918] Oh, there's a few of those.
[1919] Those are sad, man. Women aren't happy with them, though.
[1920] They might pretend that they are, but they're not.
[1921] But yeah, the same thing.
[1922] You don't see male sex robots so much, unless it's for gay men.
[1923] Yeah, maybe.
[1924] Are there gay men robots for sex?
[1925] Yeah, well, they're dolls, and I'm sure the robots are on their way.
[1926] They're on their way.
[1927] Yeah, yeah.
[1928] Can't stop them.
[1929] A jacked Thor robot waiting for you at home every day.
[1930] Basically.
[1931] Yeah.
[1932] Yeah, the house husband thing is a wild one.
[1933] It's like that, that gender role seems to be very pervasive.
[1934] Like the male provider versus the female provider.
[1935] I don't know anybody that's in that female-provider, man-who-just-sits-at-home thing where it works.
[1936] Well, you see those videos.
[1937] The men, like, they're doing the dishes and vacuuming while their girlfriends are at work.
[1938] And I think that's great.
[1939] I think men should do housework.
[1940] I don't think it should be just the woman doing that and having a job.
[1941] But women want, like, women typically prefer men who provide and are at least as successful as they are.
[1942] Typically.
[1943] I don't think most women want a man who doesn't have his own money.
[1944] who doesn't have a way to make an income. It's just going to be a fucking weird relationship, you know? It's going to be weird. It is cute, though, when they pack their wives their lunches, I guess.
[1945] Well, look, it would be cute if you had money.
[1946] Like, if it was a guy who is wealthy and just did that because he loves you.
[1947] Yeah.
[1948] Yeah.
[1949] But not some bitch-ass man that doesn't have any other function other than to be your little keeper.
[1950] You know?
[1951] That's, I think, part of the problem.
[1952] It's just, there's certain gender roles.
[1953] I'm not saying you should adhere to them.
[1954] I think you should do whatever you want.
[1955] If you're a woman and you want to be a power line worker, go for it.
[1956] Yeah, of course.
[1957] But there are certain gender roles that are stereotypes for a reason.
[1958] You know, like there's women that have zero problem not making any money and being a housewife.
[1959] I don't know any man who does that who's not conflicted or cucked or beaten down by that relationship to the point where, you know, it's just not normal.
[1960] Well, in the past, it's because evolutionarily, if a woman was with a man who couldn't provide, they would probably both die.
[1961] Right.
[1962] So there's a reason for that.
[1963] There's a reason why, you know, that tendency still remains now.
[1964] Of course.
[1965] You want a man who can protect you.
[1966] You want a man that can provide.
[1967] And you want someone who can step up when shit gets weird, you know?
[1968] And if you don't have someone who can respond to a dangerous situation or to pressure or can think clearly, can handle stress.
[1969] Like, well, you've got a liability.
[1970] Now you've got someone who's going to fall apart.
[1971] Now you have the opposite of a provider and a protector.
[1972] You have some fucking extra person that's more of a liability than you are.
[1973] Like, oh, great.
[1974] Now I have to abandon this loser and run from the zombies because he's crying.
[1975] Yeah.
[1976] But again, I think people should do it.
[1977] Look, I'm friends with a lot of females that are MMA fighters.
[1978] I have no issue with gender nonconformity.
[1979] I think if men are more feminine or women are more masculine, that's fine.
[1980] Absolutely.
[1981] Nothing wrong with it at all.
[1982] But I just think in terms of traditional heterosexual relationship roles, I'm not saying it can't work out if the woman has all the money and the man is just laying around at home all day.
[1983] I'm just saying it's not likely.
[1984] Yeah, or for me, I just bring attention to this because when you see those dynamics where people are trying to go against what they inherently feel, they don't understand why they're not happy.
[1985] And it's like you don't have to deny what you want in the name of being progressive or trying to be enlightened.
[1986] It's okay to, if you do fall into stereotypes, that's okay.
[1987] And if you naturally don't, that's fine too.
[1988] But I don't think we should necessarily intentionally try to push against stereotypes just for the sake of it.
[1989] Right.
[1990] Because there's a good chance you're not going to be satisfied with that.
[1991] That's a good point.
[1992] Like, there's nothing wrong with a feminine woman.
[1993] There's nothing wrong with a masculine woman.
[1994] There's nothing wrong with a masculine man. There's nothing wrong with a feminine man. It's like, but when you see a feminine woman, like an overtly feminine woman, there's some people like, oh, she's giving in to stereotypes.
[1995] Like, maybe not.
[1996] Maybe that's what she likes.
[1997] Isn't it, is it, if there's fucking hundreds of millions of them, isn't it possible that that's just the way that someone's hormones and personality interact with the world?
[1998] And then there's the other thing that's odd that I've always wondered about, because it's like it doesn't affect men physically: the birth control pill.
[2000] Like, what kind of an effect has that had overall on the way women behave and the choices that they make? Not just because they have choices, because they don't have to worry about getting pregnant, but because they have this hormone that's being artificially introduced into their body that tricks it into thinking they're pregnant, which is a very unusual, well, very common, but a very specific state of the woman's body.
[2002] And now your body is like that all the time, which has to profoundly affect decisions you make, the way you live.
[2003] Like my friend Whitney Cummings, she got off birth control, she just had a baby, she got off the birth control pill, and she was on the pill for like 20 years or whatever.
[2004] And she's like, Jesus Christ, it's like, I'm a different person.
[2005] Yeah.
[2006] Like what the fuck happened?
[2007] Yeah, that's really common.
[2008] I hear that a lot.
[2009] Pretty much every woman is going to be on birth control at some point in her life.
[2010] Not necessarily a pill, though she'll be using some form of birth control.
[2011] But I would say definitely in terms of the pill because it tricks your body into thinking you're pregnant.
[2012] So there has been research to show that women who meet their partners while they are on the pill and then they get off it, they're not attracted to them after.
[2013] That's crazy.
[2014] I know, and I feel really badly for those guys.
[2015] I feel bad for the women and for the men, because it's like, how sad would that be?
[2016] Especially if you start having a family, then you realize you're not attracted to them.
[2017] Well, what is it about them that's not attractive once they're not on the pill?
[2018] Well, because when the body is pregnant, you're seeking, I'm treating it like this is a totally disconnected thing from someone, like when the body is pregnant.
[2019] But when someone is pregnant or when they're on the pill, they're seeking someone who's going to be more nurturing.
[2020] And versus when you are not on the pill and you're actually ovulating, during that period, you're looking for more of an alpha -type super masculine guy because that's a man who's going to have good genes to provide you with your offspring.
[2021] Interesting.
[2022] Yeah.
[2023] So they might choose a guy who is more, say, sensitive or more nurturing while on the pill and then they get off it.
[2024] And then they want the more alpha type guy, which, you know, I can see why some people might find that offensive because that seems to fit into certain stereotypes about what women do and don't want.
[2025] But I think it's fine if men are masculine.
[2026] But why is that offensive?
[2027] Well, because I think for women to.
[2028] It's so common.
[2029] To say that women want a masculine man, which I think most women do, I think some people feel it's saying that women deserve to be treated badly.
[2030] And I don't think masculinity is necessarily a bad thing.
[2031] It's only if someone is an abusive person, that that's a bad thing.
[2032] But those two things are not necessarily found together.
[2033] Masculinity doesn't necessarily mean that someone is going to treat his partner poorly.
[2034] Or I think also communication, right, with masculinity.
[2035] When they use the term toxic masculinity, they're referring to men not wanting to talk about their feelings and having mental health issues because they bottle things up.
[2036] There's that.
[2037] But there's a middle ground.
[2038] You know, I don't think we have to demonize men, just like we don't have to demonize femininity.
[2039] I couldn't agree more.
[2040] Yeah.
[2041] Yeah.
[2042] But I'll have so much more to talk to you about this once I announce what I'm working on.
[2043] Well, we'll definitely do it again once you announce what you're working on.
[2044] Thank you.
[2045] Do you want to do it that way?
[2046] I would love that.
[2047] Is there anything else you want to talk about?
[2048] I could go forever, but I think that's it. I got my huge list here for your audience.
[2049] Where's the camera?
[2050] I want to see how organized I am.
[2051] Okay.
[2052] Well, if people also, if they want to follow you online, tell people what's the best access to your work and where should they go?
[2053] Yep.
[2054] So you can find me on social media at Dr. Debra Soh.
[2055] Debra is D-E-B-R-A.
[2056] Soh is S-O-H.
[2057] You can, I actually have a copy of my book here.
[2058] If they want to see this hateful transphobic piece of work.
[2059] The End of Gender you can get at DrDebraSoh.com.
[2060] There it is.
[2061] And Simon & Schuster's website.
[2062] Thank you, Joe.
[2063] Thank you so much.
[2064] You've been so supportive from day one.
[2065] I remember coming on your show when you were still in L.A., and I was just like, I really hope that he doesn't think I'm a crazy person.
[2066] No, I don't think you're a crazy person.
[2067] And I always enjoy talking to you.
[2068] Thank you.
[2069] So best of luck to you.
[2070] Thank you.
[2071] And we'll do it again.
[2072] We'll do it again when you have this not-to-be-mentioned project coming out.
[2073] Awesome.
[2074] Okay.
[2075] Thank you.
[2076] All right.
[2077] Bye, everybody.