The Joe Rogan Experience XX
[0] Joe Rogan podcast, check it out.
[1] The Joe Rogan Experience.
[2] Train by day, Joe Rogan podcast by night, all day.
[3] Oh, hello, Seth.
[4] Hey, Joe.
[5] Nice to meet you.
[6] Nice to meet you, in person, finally.
[7] Yeah, man. It's, what has this been like for you?
[8] The Babylon Bee rise and attack and all the chaos.
[9] When did you guys start?
[10] 2016.
[11] What was the impetus?
[12] The impetus was, uh, nobody was doing it. You know, there's nobody that's, like, doing satirical comedy from, like, a conservative perspective, I guess. Um, Adam Ford's the guy that founded it, and, I mean, I don't know, there was a void there. Nobody was filling that void. So he publishes this site using, like, a WordPress template and puts out some articles, and they go viral so quick. Like, within two months he's getting millions of visits. So, I don't know, he just had a sense that, like, somebody... you know, there's so much comedy, like, the left dominated comedy.
[13] They were just dominating it.
[14] Nobody, there was no answer to that.
[15] That's a good question.
[16] I mean, so everything, like all of these institutions, like, you know, the media, education institutions, corporations, all these things.
[17] They're all dominated by the left.
[18] So, comedians, though, I mean, as you know, there's been this opportunity, this big opportunity to kind of, like, step in and provide
[19] comedy that makes jokes that the left isn't willing to make. And so they were dominating for a while, but now I think things have shifted, because you've got all these rules about what you can and can't joke about, and the people who are willing to make jokes that kind of sidestep those rules, they're, you know, they're meeting a demand. Yeah, the meme space, though, has always been very right wing in a lot of ways, because it's, like, the thing to make fun of. Because since the media has been so dominated by the left, whenever there's, like, a narrative that just gets pushed, that sort of ignores logic and ignores reality, there's, like, a thing that happens where someone goes, yeah, but what about this? And that has been, like, the meme space. Memes have always been, like, very funny. Like, some of the really funny Trump memes and some of the very funny anti-Biden memes and COVID memes, they were kind of in that vein. Yeah, well, you know, when you've got a narrative that's being advanced and it's being pushed on everybody, you know, like, I don't know, my personal take on it is that the comedian's job is to, like, poke holes in it, you know. Yeah. Try to find its weak spots, try to find, like, the hypocrisy, try to expose whatever absurdity is there. Like, you can't just buy the narrative as it is, you got to challenge it in some way. Comedy is a great way to do that. So, I don't know, I think comedy that challenges the narrative is key, and that's what people are going to find funny, because it's like you're trying to hold people in positions of power accountable.
[20] That's like, that is punching up, right?
[21] That's what comedy's supposed to do, we're told, allegedly.
[22] Comedy is supposed to be funny.
[23] This whole punching up, you know, when people, cheers, by the way.
[24] Cheers, salute.
[25] Cheers, dude.
[26] My favorite far right extremist.
[27] Who's never voted Republican?
[28] This, um, it's like this whole idea of, like, punching up or punching down.
[29] Things are just supposed to be funny.
[30] One of the best bits of all time.
[31] is Sam Kinison's bit about starving people in Africa.
[32] And it's the most punching down bit in the history of comedy.
[33] I mean, he's literally talking about starving children.
[34] Right.
[35] About him sitting at home, you know, trying to enjoy his dinner, and Sally Fields is on TV, asking him to donate money to starving kids.
[36] And he's like, why don't you?
[37] You're right there.
[38] Right.
[39] You know, like this is the whole bit.
[40] Why don't you send someone like me who says, hey, we just came 5,000 miles with your food.
[41] It occurred to us there wouldn't be world hunger if you people would live where the food is.
[42] I love it.
[43] It's so good.
[44] You live in a fucking desert.
[45] Right.
[46] I mean, it's a great bit and it's totally punching down.
[47] Well, I mean, so when I said that, when I said, you know, we're supposed to be punching up, I'm saying that's what they say.
[48] Right.
[49] I'm not saying I agree with that.
[50] They don't even say that, though.
[51] Comics don't even say that.
[52] No, comics don't say that.
[53] The people who are critical of us say that, you know, because that's, I mean, that's the reason we're in Twitter jail right now.
[54] Supposedly we punched down.
[55] You know, like we made a joke about somebody who's in a marginalized or oppressed class and it's considered hateful conduct.
[56] Well, you called
[57] Rachel Levine the Man of the Year, when she won the... where was it? Did she win the Man of the Year? It's, that's like, what a rigged game. Like, when Caitlyn Jenner, she was a woman for six months, she got Woman of the Year, right? What the fuck? Imagine if you're a woman for 40 fucking years. Dave Chappelle has an amazing joke about that. But it's just, the whole thing is, it's these narratives. They get pushed, and they're bizarre in that they force compliance. Like, you can't have a nuanced perspective, you can't look at it... If you can't make fun of this idea that someone could be a woman for six months and then win Woman of the Year, right, like, what the fuck are you talking about? You have to be able to make fun. I mean, so what happened was USA Today did a story where they named several women of the year, and Rachel Levine was one of them. So Rachel was named, is, you know, a transgender health admiral in the Biden administration.
[58] And they named Rachel Levine one of their picks for Women of the Year. They had several that they picked for Women of the Year.
[59] So there were women and then there was, you know, transgender.
[60] And we did a joke about how Rachel Levine was our pick for man of the year.
[61] Right.
[62] And, you know, that was considered misgendering by Twitter's policy, which is hateful conduct under their policy.
[63] And so unless you delete that, you guys are in Twitter jail.
[64] Unless and until we click delete on that tweet, yeah, we're in Twitter jail.
[65] And this is the problem.
[66] It's like the delete button says, you know, you have to, you acknowledge that you engage in hateful conduct when you click delete.
[67] And I'm like, The reason I refuse to click the button is because I don't agree that I engage in it.
[68] Well, first of all, I agree with you.
[69] You know, going back to what you said a moment ago, like, comedy should be funny, right?
[70] Right.
[71] Like, when we're sitting there trying to think of jokes, like, the thing that should be going through our head is, is this funny?
[72] That's the question we should be asking ourselves.
[73] Right.
[74] Is this funny?
[75] Not is it targeted at somebody who views themselves as marginalized or oppressed and they're going to come after me and try to destroy my life and career?
[76] Because if I'm trying to think in those terms, or if I'm thinking in the terms of, I'm up here, they're down here, I shouldn't joke about those people, they're beneath me... you know, like, that's so condescending to have that thought. And if I was in a marginalized community, if I try to put myself in the shoes of somebody who's considered marginalized today, I wouldn't want anybody trying to protect me from jokes, like, I can't handle it, like my skin is too thin to handle a joke. That's condescending too. But it's that thing where, you know, my friend Morgan Murphy got kicked off Twitter forever too, because she got in some sort of a debate, I think... I don't remember who it was about, but she was basically, she's a feminist, and her problem was that transgender women are entering into these female spaces and sort of dominating them with these almost, like, male perspectives on female issues, and it pisses her off.
[77] She's like, we have to acknowledge that women are a real thing.
[78] And then people were like, you know, but this is a real woman.
[79] She's like, a man is never a woman.
[80] And she posted that on Twitter, and they said, you know, this is hate speech.
[81] This is your, whatever you're doing, you're, what are you, misgendering.
[82] And so they banned her for life forever.
[83] For that, which is crazy.
[84] The fucking Taliban is still on Twitter.
[85] I know.
[86] Well, there's so many things that you can say.
[87] And this is where it's weird.
[88] The content moderation conversation is a big conversation that needs to be had.
[89] When you're talking about, like, well, what should these platforms be concerned with when they're talking about content moderation?
[90] Right. And, you know, in my mind, when you think of, like, Section 230 and its provisions, the language that's in there, and, you know, like, what they get immunity for when they're engaging in content moderation, it's like, what's in view there is, like, lewd and indecent content. You know, like, things that wouldn't be appropriate in the actual physical town square. Um, death threats and things like that. Stuff like that, not even lawful speech. I mean, obviously, you know, there's a place for, like, taking that kind of stuff down. Harassment, like terrible harassment, where you're, like, sending somebody to somebody's address and telling him to go kill that person.
[91] I mean, there's obviously things that should be moderated, but what you see is so much of that stays in place, especially if it's coming, if it's aimed at the targets that are acceptable to harass.
[92] Yeah, but it's aimed at the right.
[93] Yeah.
[94] So much that remains in place.
[95] But then opinions, like, okay, you see, like, you know, a family-friendly drag show where kids are, like, tipping these dancers and stuff like that.
[96] And you call that grooming behavior, you know, now all of a sudden you're banned for that.
[97] Now the family-friendly drag show isn't
[98] considered lewd and indecent.
[99] It's not, that's not moderated.
[100] It's so, it's the criticism of it that gets moderated.
[101] That's a little wild.
[102] But yeah, that's, you know, that's the, that's the forced conformity.
[103] It's the forced affirmation.
[104] You know, this idea, why, if, you know, Twitter can have whatever policy they want for content moderation, delete my tweet if you don't like it.
[105] Take it down.
[106] You know, like, they can delete it.
[107] Why do they, why do I have to delete it and say that I acknowledge that I engage in hateful conduct?
[108] Doesn't that go, like, a step beyond just content moderation? Or... I think their idea is that if you delete it, they're giving you the power to come back.
[109] Like, just follow the rules, and you can come back, and they're giving you a doorway.
[110] Instead of just banning you forever.
[111] They're saying, look, we have an option for you to come back.
[112] I mean, but they could easily do that.
[113] They could delete the tweet and say, just don't do it again.
[114] You know, or they give me a warning and say, okay, we've deleted your tweet.
[115] It's already up.
[116] You already said what you wanted to say.
[117] And this way, you can say more shit now.
[118] You know what I mean?
[119] I don't think we would last much longer.
[120] I think it'd be only a matter of time before we had another one.
[121] And, you know, we'd get a permanent suspension or something like that.
[122] Well, did you see that with Alex Berenson?
[123] Yeah.
[124] He got reinstated.
[125] I know.
[126] That's kind of a new...
[127] In court.
[128] That's a new precedent.
[129] Yeah.
[130] It's fascinating because he was correct.
[131] Yeah.
[132] And now when you look at the data in court, because basically what he was saying has all been proven in terms of studies and scientific, whatever, you know, data that has been accumulated over the course of the last two years on COVID vaccines and lockdowns and all these different things.
[133] He was correct.
[134] Right.
[135] And it's been found out now that the White House actively contacted Twitter and tried to get him banned.
[136] And now he's going to sue the White House.
[137] Right.
[138] Which is wild.
[139] Which is one of the arguments that people on the right make, that these are not private companies just engaging in regular business.
[140] These are like they become state actors when you have the government behind them saying, okay, it'd be unconstitutional for us to block this speech ourselves, but we can outsource it to this privately owned third party and they can do it.
[141] They can't do that.
[142] Right.
[143] That's still considered government censorship.
[144] That's where it becomes like a First Amendment issue beyond just, you know, saying these are private companies that can do whatever they want.
[145] Right.
[146] Like what happened with the Hunter Biden laptop thing.
[147] Right.
[148] Right.
[149] That is an egregious assault on reality.
[150] Yeah.
[151] I mean that that we deserve to have all the information at our disposal.
[152] Right.
[153] In terms of, like, what is actually going on, what has been done?
[154] Is there evidence of corruption?
[155] And if there's evidence of corruption and it's censored by a company that is obviously not just in contact with the current administration and, you know, previously the Democratic Party, but what they're doing is working with them.
[156] They're doing their bidding.
[157] And that's where it gets really weird because it is so biased in one perspective.
[158] Right.
[159] And they're not just objectively disseminating information based on whether or not it's been proven to be true.
[160] No, they're suppressing information that's true because it fucks with their desired result, which was to get Trump out of office.
[161] Right.
[162] And that collusion between the government and these private companies, ultimately that's going to end up coming back to bite them because they're not going to be able to moderate.
[163] There's going to be some kind of pushback on that.
[164] There's going to be some kind of legal change or something.
[165] Yeah.
[166] But, I mean, like, to answer that question more directly, like, why haven't we deleted that joke?
[167] I mean, we haven't tweeted since March.
[168] Right.
[169] We have 1.5 million followers.
[170] We can't reach on Twitter.
[171] We haven't tweeted since March.
[172] Why not just delete the joke?
[173] Well, for one thing, I said we wouldn't.
[174] But, I mean, the main reason is because I don't believe that the truth is hate speech.
[175] I don't want to play along with this game that, like, you know, they make, if you go to the hateful conduct policy on Twitter's website, you pull up the hateful conduct policy, it starts out with like this ringing tribute to free expression, right?
[176] They say that Twitter is supposed to be a platform for free expression without barrier.
[177] Those are exact words.
[178] They say without barriers.
[179] And then you scroll down in the hateful conduct policy, and it's talking about, like, misgendering and dead naming and all these things.
[180] The dead naming one's wild.
[181] It's wild.
[182] I mean, you've got to go back and rewrite history.
[183] Bruce Jenner won the Olympics.
[184] Like, if you say Bruce Jenner, right?
[185] Like, that's dead naming.
[186] If you go to the Wikipedia page about that, it probably says Caitlyn, right?
[187] It says Caitlyn won it.
[188] You got to go back and rewrite that.
[189] But I just don't want to go along with a system where, you know, they say they offer you a platform for free expression.
[190] But then in reality, and it's supposed to be without barriers, but then in reality, they have these ideological terms that you have to agree to.
[191] And so they're making it, especially from like a comedian's perspective, you know, when I talk about like poking holes in a popular narrative, they're rigging the system so you can only affirm the narrative.
[192] You have to affirm it.
[193] Like we try to poke, we try to speak a truth there.
[194] We try to basically say, hey, look, this is a male person.
[195] And if you consult your dictionary today, it might change tomorrow.
[196] But if you consult your dictionary today, like, a man is an adult human male. Like, this joke has, like, a grain of truth to it, and you're not allowed to say that on Twitter.
[197] So like, I don't know.
[198] I feel like it's a protest.
[199] It's like, look, I think you should be able to say that two and two make four and you shouldn't bake into your terms that two and two make five and you have to say that or else you can't tweet on this platform.
[200] It's like I don't want to be on a platform like that.
[201] I'd rather stand up and say, you know, look, we're not going along with that.
[202] And did you guys have the same post on Instagram?
[203] We did.
[204] Yeah.
[205] Didn't hit us on Instagram.
[206] Interesting.
[207] And Facebook?
[208] Yeah, we didn't get hit on Facebook for that one either.
[209] So it's just Twitter.
[210] So is Twitter the worst with that stuff then?
[211] No. Twitter's the one that's forcing us to delete something ourselves.
[212] I mean, like, Instagram is, you know, Instagram, Facebook, we've had plenty of issues with them with, like, getting flagged for, like, inciting violence with a stupid joke.
[213] What did you incite violence with?
[214] What was the joke?
[215] We did a joke about how during Amy Coney Barrett's confirmation hearings, we did a Monty Python joke about how she was being compared to a duck to determine whether or not she was a witch.
[216] And then the caption said, like, we must burn her.
[217] And that was like we said we must burn her.
[218] So like the automated system flagged that as like a threat.
[219] And then we appealed it and somebody manually reviewed it and upheld the ruling that it was incitement to violence.
[220] We're like, this is a Monty Python joke.
[221] This is crazy.
[222] But we've had that stuff happen on all those.
[223] It happens everywhere.
[224] It's just Twitter is the one that's like going a step beyond and saying you have to acknowledge you did something wrong and delete this.
[225] And that's where it's like a little bit different.
[226] I think you guys getting banned from Twitter was one of the influences that led Elon to want to purchase Twitter.
[227] I mean, he hasn't talked about it publicly, but I know he had a real issue with it.
[228] I mean, I think it's one of them.
[229] Yeah.
[230] I mean, I wouldn't say, and I'm never taking credit for it, like, oh, we are the reason that Musk did it.
[231] Because some people have said that, you know, like, oh, he did
[232] this to save the Babylon Bee. I don't think Elon Musk is putting tens of billions of dollars on the line to try to save the Babylon Bee. But I think he's genuinely concerned about speech.
[233] You know, he's called Twitter the de facto town square.
[234] I think he's right.
[235] I mean, I think these platforms are the town square.
[236] And if free speech doesn't exist in the town square, then something's got to be done about that.
[237] So I don't know.
[238] I think that it factored in, it's one of those things.
[239] It's like, okay, the Babylon Bee can't even make jokes on this platform.
[240] Like, this is not a free speech platform.
[241] Well, it's not as simple
[242] as a private company anymore.
[243] It used to be it's a private company.
[244] They have their own rules.
[245] But when it's the number one platform for distributing information by average citizens, which is what it is.
[246] It's a little bigger than that.
[247] I don't know what the response to that is.
[248] I don't think the response to that is let's get the government involved and regulate it.
[249] But I think there's a responsibility that they have, and this is what Elon believes, that they have a responsibility to... you know, he's a free speech absolutist.
[250] He said, if this is what you guys are, because it is what they are, they are the town square, you have a responsibility to allow everyone to communicate.
[251] Otherwise, you create this divisive environment where it just divides the country even further without the ability to discuss things.
[252] Without the ability for people to criticize that post that you guys made about Rachel Levine or laugh about it or make other memes or all these different things.
[253] Without that ability, then you're going to get more people angry, more people to feel isolated, disenfranchised, and it creates a problem that we already have.
[254] It accelerates.
[255] It throws gasoline on a problem we already have.
[256] And that problem is this country is divided in a lot of ways.
[257] And it's divided in a lot of ways because of the narratives, the media pushes.
[258] The fact that the vast majority of mainstream news and media is leaning to the
[259] left, and the ones that are on the right, you know, it's like, what do they have?
[260] They had OAN News and Newsmax, and it's just not that effective.
[261] They're not that, they're just too goofy.
[262] And so they were too easily criticized.
[263] They're goofy.
[264] Like, the people that were on there were not like the best representations.
[265] We're not talking about Ben Shapiro.
[266] We're not talking about, you know, intelligent right-wing pundits, Matt Walsh, these guys who are intelligent right-wing pundits and
[267] influencers. That's not what it was. These guys were goofy, and, you know, it's easy to criticize, it's easy to say, oh, we need to shut that down. But that's the worst thing you can do for everybody. The worst thing you do for everybody is to make an echo chamber, and that's essentially what their policies are doing. Well, and you end up with echo chambers on both sides, because you have the people who leave and go to, like, another platform, and they're just talking amongst themselves, because nobody, like, left of center came with them. They're not interested in a free speech platform where, like, conservatives can speak freely and give their opinions and say the word groomer, you know. They don't want to be on a platform where that's allowed now. They'll end up in their own. Yeah, they banned it. Yeah. Who's banned it? Twitter, all the big tech companies, in concert, all at the same time. It started, I think it started on Reddit, so, like, Reddit stopped allowing you to call this behavior grooming, and then the other ones kind of followed suit. Didn't we discuss this, Jamie? Wasn't it... is it in certain rooms? They did, they've banned the term groomer. Because what about heterosexual groomers? What about men who go after, like, really young girls and befriend them and groom them? Because that's real. Yeah, that's real, and it's always been disgusting. Well, it's all real, but I mean, the term itself is now designated a slur. How is that possible? I just don't understand why you would throw the baby out with the bathwater. They banned the okay-groomer guy. Twitter has a ban on calling transgender people groomers. Oh, but what about groomers that are transgender? What if they're real? I mean, there are people that groom people, that's a thing. Well, okay, so if we're talking about, like, a family-friendly drag show, right, like, how's that possible? How do those terms even work together? You know, it's like a family-friendly porn theater. But it's not a transgender person that's performing. It's usually, like... a drag queen is typically a gay man who's dressed as a woman. He's not necessarily transgender.
[268] He doesn't identify as a woman.
[269] He's just, that's the show is to dress like a woman, right?
[270] That's what drag is.
[271] It's not, it's not transgender.
[272] Yeah, but sometimes they are.
[273] Sometimes they do consider themselves.
[274] Sometimes they do.
[275] But how would you know?
[276] If you're just, if you're just watching this on Twitter and you see it and you say, I think that's grooming, you're not necessarily targeting a transgender person.
[277] You don't even know if they're transgender.
[278] For all you know, it's a, for all you know, it's a straight man who's, who's been hired to do this drag performance.
[279] You don't know.
[280] But it seems like a piece of duct tape over a leaky dam to say that you have to ban the word groomer, right? It seems so crazy. I just can't... because that's... to not ever use that term for trans people, but what about a trans person that is engaging in grooming behavior? Right. That's... look, I'm sure that the vast majority of trans people do not support pedophilia, right? So if someone is a pedophile but also trans, wouldn't it behoove them, wouldn't it help their cause to, like, say, like, this is not good, this is bad, this is not what we want, we don't want people like this connected with us? Yeah, you think so. Yeah, you would think so. I don't know. I mean, as far as, like, taking it to the point where you ban this language, it's not the answer. I think the answer is, okay, look, you know, if there's a huge swath of the population who has a problem with this behavior... and look, I can speak for myself, I can tell you why. I think when I use the word groomer, if I use it in a tweet, I'm referring to behavior, okay?
[281] I'm like, it's a criticism about somebody's behavior around kids.
[282] Right.
[283] It has nothing to do with that person's identity or like who they sleep with or the color of their skin.
[284] Right.
[285] It's not targeted at a person for their identity.
[286] Right.
[287] And we conflate these things.
[288] I think so much, you know, with the left's kind of heavy-handed censorship, they do a lot of this conflating, where they take criticism of behaviors and treat it as criticism of identities and people.
[289] It's not the same thing.
[290] And you have to be allowed to criticize behavior because there genuinely is, like you said, real bad behavior that actually harms people and we're not allowed to talk about it.
[291] Well, that's the problem also when they have this sort of blanket free pass for people who are trans, where you're finding people that are sexual abusers with a long history of being sexual abusers, like people that have literally been incarcerated for various sexual offenses, sexual assault, and these people are going into women's locker rooms and saying that they're trans now and pulling their dicks out.
[292] And, you know, there are a lot of people that just genuinely are transgender people that would like to use a woman's locker room.
[293] But there's also people that are, they're sick.
[294] They sexually assault women, and you're giving them access to women in a very vulnerable place where there's no protection for them at all.
[295] And that was, you know, that LA case... where was it, the massage place?
[296] Yeah, yeah.
[297] Yeah, that was the case.
[298] That person had a history.
[299] Pull up that story.
[300] They were just exposing themselves, right?
[301] Yes.
[302] I think there was a mother in there with like a teenage daughter.
[303] Exactly.
[304] And that was in LA, right?
[305] Yes.
[306] Was it Envy Massage Parlor?
[307] Is that what it was called?
[308] but that person who did it had a record.
[309] I mean, it wasn't, it was pretty clear that this person had already engaged in some seriously problematic behavior.
[310] Right.
[311] Yeah, but even if they haven't, in a context like that, where you've got, like, a biologically male person in a women's space like that, and a mother's in there with her child, it's like, you know, regardless of how that person identifies, it's not an attack on them to say, hey, look, you know, like, I don't want my daughter seeing a naked
[312] man's body in the locker room. And so, you know, it's a tough conversation, when you're talking about, like, that person may not be an offender of some kind, they may not be trying to put anyone in an uncomfortable situation, but, you know, you still have the concerns. It's the same thing, like, with sports too, you know. Well, some people might not be, but some people definitely are, and some people have a history of it. Indecent exposure to a child, charges filed against trans woman over LA spa. And so now they've... this is 2021, but what, did the person do something in the past?
[313] They have a history of this.
[314] See if you can find that.
[315] This person, I'm 99 % sure.
[316] Yeah, there you go.
[317] Police said she has a criminal history.
[318] Yeah, she's a registered sex offender.
[319] Okay, Merager has been a registered sex offender since 2006.
[320] So that's 16 fucking years of being a registered sex offender as a result of convictions
[321] for indecent exposure in 2002 and 2003.
[322] So 20 fucking years.
[323] Wow.
[324] This is crazy because you're giving someone who is a sex offender access to women where they can do the exact same thing where they were arrested for.
[325] And now they're celebrated and you're going to have people protest for them.
[326] Right.
[327] It's fucking crazy.
[328] Awaiting trial on seven counts of indecent exposure that were first filed
[329] in 2019, according to court records.
[330] After the video alleging someone exposed himself went viral, the spa became the target of right-wing demonstrations.
[331] Well, I bet a lot of those people weren't right -wing.
[332] They're probably just fucking parents, which many chided as transphobic.
[333] Many did.
[334] Oh, you got sources?
[335] Many.
[336] Look how they write these things.
[337] What is this?
[338] L .A. Times.
[339] Fucking, of course.
[340] Became the target of right-wing demonstrations, which many chided as transphobic, after extremist groups such as the Proud Boys...
[341] See what they're doing here?
[342] They just, they're shitting in the punch bowl by saying transphobic, extremist groups, Proud Boys. They're connecting them all together.
[343] The recording, which surfaced in late June, showed an irate customer arguing with employees after she said she had seen a customer with a penis in an area that's reserved for women.
[344] The Wilshire Boulevard facility has some gender-separated areas with changing rooms and jacuzzis.
[345] Now, what they'll try to do, and this is the thing, it all goes back to this punching down conversation with comedy.
[346] It's like, we can't joke about this situation.
[347] You can't joke about it because the target of the joke will be perceived as a marginalized person.
[348] Right.
[349] In truth, and this is the truth.
[350] This is the reality.
[351] And we all need to start recognizing it.
[352] The people that you're not allowed to joke about or else you lose your career, you get canceled, you get banned from the public square, those people have tremendous amounts of power.
[353] They have tremendous.
[354] It's scary power, honestly.
[355] If you can't even joke about somebody or you get penalized and punished and you can potentially lose your job, then who's marginalized, who's oppressed?
[356] Like, who's on the outside there?
[357] Who's having to, like, hold back what they really think and feel and silence themselves and, like, do the tyrant's work for him by engaging in self-censorship?
[358] You actually have a situation where it's the exact opposite of what they say.
[359] The people who have a problem with this are the ones who, and the people who are supposedly in the position of power are the ones who are most vulnerable.
[360] and the ones who are most likely to have all the powers that be destroying their lives.
[361] I think we're in the middle of the fog of war.
[362] It's like a fog of war in this culture war, and there's so much chaos going on.
[363] And, you know, so many people are trying to score a victory for their perspective and their ideology that they're missing the big picture on this, that free speech and freedom of speech and free expression have always been very important for sorting out what's right and what's wrong.
[364] And it's not good for anybody when you silence that.
[365] If someone makes a joke on Twitter, like, if you guys did something that was truly offensive, you would lose audience members.
[366] Some people who supported you would not support you anymore.
[367] Happens with every joke.
[368] That's, but that's supposed to be the free market.
[369] Right.
[370] That's supposed to be the free market of ideas.
[371] Like, if your ideas suck, people go, oh, I don't like them.
[372] And then they stop paying attention.
[373] And they can mute you, by the way.
[374] If we put out jokes that you don't like and they're offending you, you can block me, you can mute me. Like, the platforms have given you the power to decide, like, who you listen to, right? But why do you have to de-platform me and, like, take me down? They'll say that you're encouraging harassment. Right. And so it's not just you... this is the escape clause, right? It's not just you, but by doing what you're doing, you're encouraging others to attack. Which is crazy. Right, it's considered harmful. Well, I would take issue with that idea too. I would say that, you know, so much of, like, this effort... I think a lot of what these policies come down to is safety. You know, you're saying it would be harassment, it would be intimidation, it would be a call to violence, or people could perceive it as a call to violence.
[375] And these are, you know, these people might get attacked.
[376] They might face some kind of, you know, physical intimidation in the streets as a result of whatever.
[377] So it's a, it comes back to safety in it.
[378] And I don't know.
[379] In my, in my view, in my thinking, you know, I think it's healthy.
[380] to be willing to laugh at yourself. I think it's healthy to be exposed to ideas that challenge you, even harsh criticism. I think it's healthy to be exposed to harsh criticism, jokes at your expense. All of these things, like, build character in a person. You know, like, we have all of these methods in place now to, like, insulate people and keep them safe from ideas that might hurt them or jokes that might offend them. Is that really better for people? There was an interesting study that was done on that. Because here's the problem with the transgender thing, and this is the elephant in the room: it's easy to make fun of a very obvious male that wants to be a woman.
[381] Yeah, they, they, you know, if you attack them or you, I shouldn't say attack, if you mock them and belittle them, like there's nothing they can do that turns them into the physical form of a female.
[382] Like, if Rachel Levine was attacked in jokes, or someone criticized her or mocked her in jokes, it's not like fat shaming, right? Like, there's an argument with fat shaming, like, the reason why you're upset is because you've eaten yourself into this position. You know, this is your own doing, and you can actually not eat yourself out of it, you can exercise your way out of it, and many, many, many people have done that, right, where they've actually become smaller again. It's almost the same argument you would say about people who are handicapped. Like, if you mock a handicapped person, they're not going to go, you know what, I shouldn't be handicapped anymore, right, this criticism is really getting me to the point where I'm going to be mobile.
[383] Right.
[384] They can't do anything about it, right?
[385] So she can't necessarily do anything about her physical appearance.
[386] And, you know, that is the, that's the argument.
[387] It's, but it's a slippery slope.
[388] I'm just speaking in general terms about this idea that, like, doing everything that you can in your power to moderate speech to keep people safe from ideas or jokes that might hurt them is not necessarily helping them.
[389] In fact, I think it can be harmful.
[390] I was going to mention this study that was done by this nonprofit group in New York.
[391] And they were taking a look at the playgrounds in New York.
[392] And they were trying to answer the question of whether or not the playgrounds had been made too safe.
[393] And they actually determined that they had.
[394] It was this weird thing.
[395] Like all these playgrounds had been like redone so that they had really cushy, soft flooring.
[396] And you couldn't really fall from any heights or get hurt on these playgrounds.
[397] And what they found was that it was actually teaching kids that, like, falling on the ground doesn't hurt you, and, like, that doesn't help kids. You know, they learn on the playground that they don't get hurt when they fall, and then they go climb a tree over a sidewalk, and it does hurt, and it shocks them, and they're learning the hard way. You know, now they've got a broken arm. I think that some of these efforts, it's one of these things, it's just like a self-defeating thing. You know, you try to create a safe space, a safe environment, and in some cases I think you actually do more harm, because you're protecting people from... like I said before, I mean, like, being the target of a joke, like, I don't know.
[398] I don't want anybody telling me that like, oh, you can't joke about me because you might hurt me. Like, I think that's offensive.
[399] That's more offensive than any joke you could tell to my face.
[400] Well, the other argument in your case with the Rachel Levine thing is that we're being forced to say something that some people don't agree with.
[401] Right.
[402] This is not, like, they're forcing this opinion as fact.
[403] and there's a narrative that a transgender person is a woman, and a lot of people support that, but a lot of people don't support that. So there's a debate. There should be a debate. There's not a debate, but there should be a debate. And if you want to say that that's the Woman of the Year, well, then it accelerates the debate, because you're, like, you're kinging your checker piece, you know? Right. You're doing something where you're not just saying that this is a woman, but this is the best woman we have, which is kind of wild.
[404] That is kind of wild.
[405] That a male that becomes a female or becomes a woman is the best woman we have.
[406] Right.
[407] How offensive is that to women?
[408] It's almost like you better not criticize.
[409] We're going to go even further and further and further, and we're going to take this to this crazy place.
[410] But these are the best women we have.
[411] There's just such a tremendous distinction and difference between somebody going on Twitter and tweeting something like all trans women, all trans
[412] people should die.
[413] Someone's going to take that clip out of context, by the way, and say that I said that.
[414] I'm sure there are.
[415] But like, if somebody went on Twitter and said that, I think that's wrong.
[416] You shouldn't say that.
[417] Of course.
[418] You shouldn't say that about everybody.
[419] Right.
[420] You shouldn't say if you target some group out and say these people deserve to die, they should be put in gas chambers or something like that, right?
[421] That's horrible.
[422] But if you say, if you make a joke, a motorcyclist identifies as a bicyclist and sets a world record.
[423] Right.
[424] You know, you're making a joke to make a point.
[425] You know, it's a funny joke.
[426] It's a funny joke.
[427] It went viral.
[428] It got shared millions of times, and we were criticized for being, like, antagonistic towards these communities by making jokes like that. And it's not the same thing as saying, like, it's not okay to be trans. What it's saying is, it's not fair to the bicyclists to have a motorcyclist competing against them. Yes. And so you're comparing that to transgender athletes versus biological females. Exactly, most people do. Which is a valid point that's not rooted in hate, right, at all. In fact, it's rooted in concern and compassion.
[429] When you try to say that this is a hateful position, I turn that on its head, I turn it around.
[430] I say, well, look, you know, like, what about women?
[431] Like, who's showing compassion and concern?
[432] Who's trying to protect women and keep their sports, their sports?
[433] I mean, well, not only that, what is the percentage of people that are upsetting the far larger percentage that are biological females that are competing against this person to protect one person's feelings by affirming them as a woman?
[434] You're making all these other people victims of unfair athletic events, right? Because that's what it is. You know, somebody just sent me this study... they've done studies on the performance of various people, whether it's trans people or not. Someone just sent me this, I'm going to send it to you, Jamie. You know, there's clearly some kind of advantage for some sports, and to say that any differently after you see what happens with, like, Lia Thomas, or whoever the bicyclist is, and, you know, some other athletes that have started to dominate in those spaces... it freaks biological females out. Because if you're a male and you've had testosterone pumping through your body most of your life, all of your life until, like, a couple of years ago, and then you transition, like, that's not the same as being someone who's never gone through puberty. It's not. Right. And that's why the swimming organization changed the threshold for transgender athletes, people that have not transitioned after 13 or 12.
[435] So avoiding males who have gone through puberty, where it would give the minimal amount of biological advantage.
[436] And people talk about outliers.
[437] They always want to talk about outliers.
[438] They always want to talk about someone who is like the elite of the elite of female athletes, the freak of the freaks.
[439] But there's a giant difference between the elite of the elite of females and the elite of the elite of males.
[440] Like, if you're going to do this thing where you're going to the far end of the spectrum, you've got to go all the way.
[441] Because I don't give a fuck what the elite of the elite female boxer is.
[442] She will never be Mike Tyson.
[443] Mike Tyson in his prime was the elite of males at the far end of masculine, the elite of females at the far end of feminine.
[444] We have to look at the full spectrum.
[445] The elite of females, the full end, can't compete with non-elite males.
[446] It's a giant difference.
[447] And when you add in someone like who's the elite of...
[448] The elite of males, you get a chance to see the actual spectrum.
[449] Didn't we have a boys' team beat, like, the U.S. women's soccer team?
[450] Yes.
[451] Listen, there's, you can take 10 high school boys in this country that are competing in track and field, and they would be the greatest female athletes of all time.
[452] These aren't Olympians.
[453] These aren't world class runners that are in the world championships.
[454] These are just five to 10 high school boys.
[455] Right.
[456] It's not fair.
[457] And that's all we're saying. It doesn't mean that you're not a woman. It doesn't mean I won't call you her. It doesn't mean I won't say whatever name you want, because I will. I don't give a fuck. I want you to be happy. If you say your name is no longer... it's Sathina now, and you want to be a woman, and you really believe it, I'm like, okay, I don't give a fuck, dude. Right. I think it's crazy, though, when women defend it. And I just saw a woman, a woman on Twitter the other day, she was saying, look, I'm 6-4 and 240 pounds, I would destroy other women in rugby. I guess she's, you know, in a country where rugby's popular.
[458] That's the example that she pulled out.
[459] She wouldn't destroy a 180-pound man. No, and this is nonsense.
[460] And I mean, no, I think she was 6-2.
[461] She's 6-2.
[462] She's not 6-4.
[463] 6-2, 240.
[464] And saying that she would dominate these women, I'm like, well, okay, I mean, anybody who has a size advantage in contact sports, whether it's men against men or women against women, you have a size advantage.
[465] That's an advantage.
[466] It's literally what I just said.
[467] It's the outlier.
[468] Exactly.
[469] It's the outlier.
[470] And she's using herself as an example of that.
[471] And I said, you know, it's like humble bragging.
[472] Yeah, I don't know how.
[473] I don't know how you use that as a justification for saying, well, therefore, it follows that.
[474] We should allow men to come into women's sports and dominate them all, including the 6-2, 240-pound woman.
[475] They're making a rational argument.
[476] They just haven't played it out to the end.
[477] The rational argument is that there are outliers in female spaces.
[478] There's outliers, there's outliers, there's athletic outliers.
[479] There's outliers in everything, intelligence, hand -eye coordination.
[480] There's outliers.
[481] But there's outliers in males that far exceed the outliers of females.
[482] Kamala Harris just said we all have the same capacity.
[483] Did you see her say that?
[484] We all have the same capacity.
[485] We just haven't realized it because, you know, equity says, unless I'm mocking.
[486] Was she talking about economics?
[487] Or was she talking about life success?
[488] She's talking about, yeah, life success, opportunity.
[489] You know, we don't all start on equal footing, but we all have the same capacity.
[490] So if we put ourselves on equal footing, we'll all reach the same result.
[491] It'll be equity.
[492] Yeah, that doesn't work.
[493] But what we should do is make it so that it's not so hard for people who live in disenfranchised communities.
[494] That's a real concern that we're not addressing.
[495] And I think if you wanted to really give people the best chance in life, don't give them a fucked up childhood.
[496] Figure out a way to somehow or another revive communities and give them a sustainable future where you don't have a long history of gang violence and crime and drug sales and violence.
[497] We cannot deny that it's a big difference growing up in the suburbs of, you know, fucking the Hamptons versus growing up in Baltimore.
[498] It's a fucking giant difference.
[499] It's a huge difference.
[500] But you don't level the playing field by like, if someone can't see over the fence, you don't level the playing field by cutting out the legs of the person who can.
[501] So that they both can't see.
[502] Yes.
[503] You're talking about a funny meme.
[504] Yeah.
[505] It's a great meme.
[506] It is a good meme.
[507] But we can do the other thing and make.
[508] this is what I always say about America.
[509] Wouldn't it be better if we had less losers?
[510] Right?
[511] It would be better for everybody.
[512] Well, what's the best way to get less losers?
[513] The best way to get less losers is to help people get the fuck out of where they're at.
[514] And there's some people that just got a shit roll of the dice.
[515] And a lot of conservative people don't want to recognize that.
[516] They don't want to talk about that.
[517] There's this narrative, this pull yourself up by your bootstraps.
[518] There's people that don't have fucking shoes.
[519] There's people that they got a terrible roll of the dice.
[520] And by the time they're working and integrated into a system, they're 18 years old or whatever they are, that's 18 years of a fucked life.
[521] Yeah.
[522] And that can be fixed.
[523] That can be fixed, just like we have enough money to send $40 billion to Ukraine and hire 87,000 new IRS agents.
[524] You know what else we have money for?
[525] We have money to revitalize cities that are fucked.
[526] And we've never done it.
[527] We know, we ignore it.
[528] But should that be done by the government or privately?
[529] And I would think that, you know, with a lot of conservatives who are often criticized for.
[530] that mentality, that, oh, equality is just, you know, making sure everybody has the same opportunity, uh, nobody needs a leg up, um, you know, these people should pull themselves up by their bootstraps... I do think that, generally speaking, Christian conservatives are very compassionate and do a lot of charity work. Yes, they do a ton of charity work. Yes. And so they're willing to put in their own time volunteering and donating money towards causes that help with those things. You look at, like, crisis pregnancy centers, for example, which Elizabeth Warren wants to shut down for some reason.
[531] I mean, these are helping women in need, and she wants to shut them down.
[532] I mean, these are people who are volunteering their time, their resources, their money to help people who are in a tough spot.
[533] And it's completely charity, it's kindness, it's love and compassion.
[534] But it's always, you know, always painted with a brush of, oh, yeah, you know, you're on your own.
[535] We only care about children before they're born, not after they're born, you know.
[536] But I do think, honestly, an argument can be made that conservative Christians are the most charitable people there are.
[537] They're very charitable people.
[538] I think the problem with someone like Elizabeth Warren's idea is that you're shutting down a place that has an ideology that millions and millions of people believe in, that life is sacred, and saying that this is somehow an assault on a woman's right to choose, whatever her decision may be.
[539] They're worried about other influences.
[540] They're worried about pressure.
[541] You're not even allowing for a choice.
[542] If you shut down the alternative, what's the choice?
[543] You're right.
[544] You're right.
[545] And that's where I'm pretty absolute when it comes to that.
[546] Yeah.
[547] And that's free speech amongst every subject, not just a woman's right to choose or abortion laws.
[548] Yeah.
[549] And this idea that you can't have someone who is a Christian who talks to another person who's a Christian and maybe they were on the fence about something and you convince them to have a child.
[550] and it's the best decision they have ever made in their life and they love their kids so much.
[551] They couldn't imagine they were thinking about getting an abortion.
[552] That's real, too.
[553] That's real too.
[554] There's also women who have been raped who should not have to fucking carry some rapist's baby.
[555] There's women who have been sexually assaulted before the age of 14.
[556] There's also, hold on though.
[557] But hold on, don't stop me. That's real too.
[558] And we all have to agree.
[559] We have to agree on both of those things.
[560] There are also, though.
[561] I'm not going to argue with you on that point, but I will say there are people who have been born of rape and are alive right now and are pro-life, and they go around speaking, talking about how I had a right to live, and they will go out there and make an argument, a pro-life case.
[562] And they're a rape, they're born of a rape.
[563] You don't have a right to tell a 14-year-old girl she has to carry a rapist's baby.
[564] I'm just saying.
[565] I'm just saying that.
[566] But you understand what you're saying?
[567] Yeah, I understand, I understand what you're saying.
[568] But I'm saying, like, you don't have the right to tell my 14-year-old daughter she has to carry her rapist's baby. Do you
[569] understand that?
[570] To look that woman in the eye who was born of a rape.
[571] Do you understand that?
[572] That's a 14-year-old child.
[573] If a 14-year-old child gets raped, you say that they have to carry that baby?
[574] I don't think two wrongs make a right.
[575] I don't think murder.
[576] I don't think murder is an answer to, I don't think murder fixes a rape.
[577] What if we're talking about an abortion when the fetus, like literally it's like six weeks, four weeks, three days?
[578] What if you just tested positive, just now?
[579] Positive for pregnancy?
[580] I don't.
[581] Well, I just disagree that you
[582] What if it just happened today?
[583] can, like, draw a line on when you can't, once life has begun, at the very moment... Like, if someone came inside of someone and they cracked the egg, and then, bam, they took Plan B, you shouldn't do that? Uh, well, I mean, if it's preventing the pregnancy from occurring, it's different. It's an abortion. That's what Plan B is. It makes your body abort the conceived pregnancy. That's what it does. Is it? I mean, I'm pretty sure. Let's Google it. I know that women used to do something similar.
[584] If it prevents the conception, it's different than if it's terminating.
[585] No, no, it's terminating.
[586] I'm 90% sure it's terminating.
[587] I think it's after conception.
[588] That's the whole idea about it.
[589] That's why it's plan B. Plan A is don't get pregnant.
[590] All I'm saying is it's real.
[591] What you're saying is real, and those are tough situations.
[592] It's also real that sometimes these babies are born and they do, and they grow up to be real people with feelings.
[593] They're alive.
[594] They're humans and they're pro -life.
[595] Can we click on that so I get the full sentence? Because it seems like it keeps going.
[596] All right, here it is.
[597] The morning-after pill is a type of emergency birth control (contraception).
[598] Emergency contraception is used to prevent pregnancy for women who've had unprotected sex or whose birth control method has failed.
[599] The morning-after pill, no, no, no, is intended for backup contraception only, not as a primary method of birth control.
[600] Morning after pills contain either Livano gestural, Livano gestural.
[601] or ulipristal acetate.
[602] Levanorest, Levanoresgestrol.
[603] You want to try it?
[604] Try that.
[605] Try that word.
[606] Levanargestrel.
[607] Levantal.
[608] Is available over the counter without a prescription, but go to the part where we're reading in the synopsis that it, yeah, here it is, right there.
[609] Keep in mind the morning-after pill isn't the same as blah, blah, blah, also known as RU-486, or the abortion pill.
[610] Wait, wait, wait, wait, morning-after pills do not end a pregnancy that has implanted.
[611] They work primarily by delaying or preventing ovulation.
[612] Yeah, we're talking about... I was talking about a different thing then.
[613] I was talking about RU-486.
[614] This drug terminates an established pregnancy, one which the fertilized egg has attached to the uterine wall and has begun to develop.
[615] Okay, so forget about Plan B. What about RU-
[616] 486?
[617] It's the same question.
[618] The same question is if someone knows they're pregnant or if they test positive for pregnancy and they take a pill that can get rid of that, like, the day of you're against that I would say I would lay it out like this I would say it is wrong to intentionally kill an innocent human life abortion intentionally kills an innocent human life therefore abortion is wrong and I don't think any of the I don't think any of the examples of like oh well how developed is it you know can it can it think is it conscious can it dream can it feel pain so for you it's the moment of conception I think that if it's a human life an indis a distinct human life then I think it's wrong to to end its life So you think that once, do you think that, like, once the conception happens, there's some sort of a miraculous event, like, at the very moment, like, you could literally get to the point where the sperm cracks the egg.
[619] If you could scoop that egg out right there, would that be abortion?
[620] Well, I mean, at some point, you're going to have to say there was a magic moment that happened because you believe that we eventually become valuable humans, right?
[621] Well, listen, where's the moment where you think the magic happens?
[622] Let me tell you my perspective on this, because I've said this multiple times, but it bears repeating.
[623] I think abortion is a very human issue in that humans are, we're messy.
[624] Yeah.
[625] And it's a very messy issue.
[626] It's a, it's complicated.
[627] Bill Burr has a very good bit about it in his last comedy special where he says, I agree with your right to choose, but it's also killing a baby.
[628] Right.
[629] You know, and it's a very well.
[630] I like that bit.
[631] It's a good bit.
[632] It's fantastic.
[633] You talk about the oven, you know, baking something in the oven.
[634] You know, when you talk about, like, someone who's at six months or nine months — when it gets to that, it gets crazy. That's like, you're literally killing a baby, a baby that could exist outside the womb. What if rape produced it and it's eight months old in the womb? It's a good question, but that's also what makes it very, very messy. It's a tough conversation. You know, there was a story that came out recently that someone had said that this woman got in trouble for having an abortion because they got hold of her Facebook messages.
[635] And then my wife sent me the actual story, that the actual story was, it was a late -term thing.
[636] She was trying to poison the baby.
[637] But the actual story is that she took medication online.
[638] She had a miscarriage because of that medication.
[639] She took stuff to kill her — to start the abortion, rather.
[640] And then her and her mom buried the stillborn.
[641] Investigators seeking a warrant said they later learned that she
[642] and her mother had bought the oral medication online to end her daughter's pregnancy. That information was gathered in part from... but find out how far along she was.
[643] Because I think she was far along.
[644] And that's, you know, that's where it gets crazy.
[645] 23 weeks.
[646] Perhaps as long as 29 weeks.
[647] Wow.
[648] Under Nebraska law, abortions are legal up to 20 weeks.
[649] Yeah, she was past that threshold.
[650] How many months is that? That's six months? What's 29 weeks? That's really long. So that's where we're at. That's that area we're talking about, where it's like, this is this weird place where it's like, okay, that's a baby. Yeah. This is not like a clump of cells, that's an actual baby. Right. That's why it makes it such a crazy issue. I mean, when you start talking about harmful misinformation — I mean, I'm, as you can tell, I'm pro-life. Yeah. And so, you know, when we start talking about harmful misinformation and the types of things that are considered like that, that I say or that we tweet, or the jokes that we make that are considered harmful misinformation, I'm like, well, what about calling that baby a clump of cells?
[651] I think that's harmful misinformation because then you're encouraging people to kill it like it's nothing when it's actually a human life.
[652] It's a developing human life.
[653] I think abortion is health care the way that rape is lovemaking, if we want to use rape as an example.
[654] I think it's I think they're opposites.
[655] And it's like, these are euphemisms
[656] that we use.
[657] You know, we use the word health care.
[658] We're talking about a procedure that ends an innocent human life, and we're calling it health care.
[659] That's like calling rape lovemaking.
[660] And this is why it's such a human issue, because I see what you're saying.
[661] And I think that if Christianity had been any other religion other than, you know, I mean, Christianity is the most mocked religion.
[662] Like, we want to look at religions with respect and dignity, whether it's Islam or Hinduism, we look at those with respect and dignity.
[663] And even if they have practices that we don't agree with, we sort of give them this leeway that it's a part of their religion.
[664] Christianity.
[665] Christianity is the most openly, easily mocked of all religions.
[666] It's the most derided.
[667] For whatever reason, well, we tolerate a lot of it.
[668] Plus, we also, you know, I think one of the things that was refreshing to me about the Babylon Bee before I got involved in the bee when I saw it for the first time, I liked the self -deprecating humor.
[669] I liked the willingness to go after our own and make fun of ourselves.
[670] That's important.
[671] Because I think that's really healthy.
[672] I think it's very, everybody, look, like, if we were able to laugh at ourselves, we wouldn't have people breaking down crying on TikTok because one of their students used the wrong pronouns for them, you know, like, it's, we're so sensitive.
[673] We take ourselves so seriously.
[674] Like, we can't even laugh at ourselves anymore.
[675] Isn't that kind of an audience capture thing, though?
[676] We're running onto the stage and slapping comedians in the face when they tell jokes.
[677] That's a different story. That's a different story. I mean, but I think with these people — the thing about it is, if you go on Twitter or any kind of social media and you have a story like that, where it's really, you know, if you're a man with a beard and you have blue hair and you say you're a woman, and your teacher calls you a man — or your student, rather, calls you a man — and then you want to cry on TikTok, can't you see why that's kind of an issue? And I know you're going to get a shit ton of support from people that say you're right, and that's why people do it. They do it because they know that there's, like, a lot of love in it with that narrative. There's a lot of love if you say that. So if you put that out there, you will get a lot of people supporting you, but then you also get a lot of people attacking you, right? And then they have to smear those attackers and these hateful people, hateful comments and transphobic comments online. Because a male with a beard and blue hair who thinks he's a woman, because he decides he's a woman, and is just fully biologically male, and is in a class with a kid, a little kid — not me, I would definitely call him a woman; whatever your name is, I'll call you whatever you want.
[678] A fucking five -year -old doesn't understand this.
[679] Especially if this five -year -old grows up in a Christian household.
[680] Maybe they don't discuss these things.
[681] But we don't respect that religion the way we respect other religions.
[682] It's very interesting because in this particular argument, and again, you and I are opposed in some ways about this, but I think what we agree with is that what you are trying to say is that all life is valuable.
[683] All life is valuable.
[684] From the moment of conception, it's valuable. It's all valuable. And it's so important that we be this loving Christian community — and I don't think you have to be a Christian to hold that view, by the way, and I think there's plenty of pro-life atheists who would say... You know, when I'm saying you guys, as Christians, that's how you think about it, right? You guys follow the guidelines of your religion, and that's what's in the guidelines of your religion. It's not like you're pushing it on other people, right? This is just what you're promoting, what you're saying, right? And the only thing that's fucked is the right to choose. That's the only thing that's fucked. To force that onto a kid — to me, it's chaos, that's crazy, doesn't make any sense. But what you're saying, other than that, is, like, life is valuable. Like, yes. And people who almost were the victims of abortion, and they weren't — they went on to become these amazing people, and we would have lost them. Sometimes it's a failed abortion. Like, there's people who've survived, like, a saline abortion, and they're damaged as a result of it. But they lived, and now they're born.
[685] They usually go on, ironically enough, to become pro-life activists.
[686] Oh, well, that's crazy.
[687] Yeah.
[688] That's wild.
[689] But it makes sense.
[690] I mean, that's what made you.
[691] Yeah.
[692] Wouldn't you be a pro-life activist?
[693] Probably wouldn't be.
[694] Yeah, of course you would be.
[695] It's crazy.
[696] It's crazy how that works, right?
[697] Yeah.
[698] It's just, human beings are a weird fucking creature.
[699] You know, and this is one of the battlegrounds.
[700] of the two ideologies.
[701] This is where they get together, it's with abortion.
[702] It's one of the most heated battlegrounds, because the people that are pro-life have this — in many ways, it's a loving view that all life is sacred, right?
[703] That would be the best way to live.
[704] And then the people on the other end of it see where this is going if you tell a woman what she can and can't do.
[705] And tell a woman what her decision is.
[706] If she decides that at two weeks, it's not a fetus, it's not a, it's not a baby, it's not a life, it's a clump of cells.
[707] Like if she makes that decision and she wants to move on with her life, that's, I don't think we should have the ability to tell someone what they can and can't do like that.
[708] But again, when it gets to like where it's six months, it gets kind of crazy.
[709] Right.
[710] Like that's actually a baby.
[711] And I appreciate that you distinguish the two, because I would say, you know, when you talk about outliers in sports, for example — those outliers being used as examples to try to shove an argument through, right? That's done with abortion all the time. Or you use the example of the teenage girl who gets raped, right? I mean, that's tragic. Genuinely tragic, awful, terrible stuff. Super small percentage of abortions. I mean, most abortions are contraception. It's like, you know, they're used as, like, late contraception. It's, I got pregnant, I don't want a baby, I'm getting rid of it. That is the vast majority of abortions. But yes, I do think what's interesting about this topic — because, you know, you go back to, like, the harassment, intimidation, content moderation, free speech, all of that stuff.
[712] You know, depending on where you land on this issue, you can say almost whatever you want.
[713] After Roe v. Wade was overturned, the kind of stuff that people are saying about the Supreme Court justices and how they should never know peace again and the harassment and intimidation, all of that's perfectly acceptable.
[714] All of that's perfectly acceptable.
[715] All that's evil.
[716] All that stuff is evil.
[717] All that stuff is evil — showing up at their houses. And essentially, what they did was... Law is a complicated thing, where you look at rulings and you go over decisions and you try to find out if it applies to what you're talking about right there.
[718] I don't understand the argument.
[719] Like I would have to go deep, deep, deep into the argument.
[720] But I think their position was that the way Roe v. Wade was structured, it was not compliant with the law.
[721] Is that correct?
[722] Is that a good way to describe it?
[723] I think so.
[724] Yeah, they weren't saying, I mean, it's a mischaracterization to say that they like banned abortion.
[725] They didn't do that.
[726] They basically just said like the states can determine this.
[727] This is not like there's no, there is no federal protection.
[728] There was no constitutional right to it that's explicit in the constitution that was inserted in there.
[729] And so they're saying, we're going to toss that out and put it back to the states to make their own abortion laws at the state level, because this was imposed on the Constitution, not derived from the Constitution.
[730] That's essentially...
[731] So for a conservative perspective, this would be a good thing because this would give people the ability to make their own decisions without having the federal government dictate something.
[732] Right.
[733] Yeah.
[734] And move to a state, you know, that has, you know, this way it's like not federally mandated.
[735] You can live in a state that's pro-life.
[736] You can live in a state that's pro-choice, you know, like you can make your own...
[737] That gives you more leeway, too, to decide where you want to live.
[738] But the problem people have with that is if they're stuck in a state where it used to be legal, And now all of a sudden it's not and there's no federal protection for it.
[739] So they can't get it.
[740] And so then you have someone who's forced into keeping their baby.
[741] And it's the same sort of argument that we were talking about before.
[742] Like if you're a young person, if you're a 20, you're 40, whatever.
[743] If it's your body and it's your choice to decide whether or not you keep that baby, if you take that away from them and they would have to move to another state.
[744] Like, you're giving people a very complex scenario. It's a series of hoops that they have to jump through that they didn't have to jump through before. I mean, laws are there to protect life. You know, I don't know — when you talk about your body, there's another body at stake at that point, right? There's another... But you think there's another body at stake instantly? Well, yeah. I mean, yeah, there's another — a developing one. Yeah, a developing one. Yeah, yeah. Its size and location and degree of development is different, obviously, but that's true of a two-year-old from a 30-year-old. I mean, we go through development for a very long period of time.
[745] Richard Dawkins once tweeted something about how a human embryo at whatever stage is indistinguishable from a pig embryo.
[746] Richard Dawkins said that?
[747] Yeah, yeah, see if you can find that.
[748] And I was like, that's crazy because the human embryo will become a person.
[749] Like, how can you even say that?
[750] It doesn't make any sense.
[751] Like that just because it looks real similar.
[752] One's a human, a developing human and the other is a developing pig.
[753] Yeah, it's nonsense.
[754] It's nonsense.
[755] It's not an alien.
[756] It's not an animal.
[757] He must have been drinking.
[758] With respect to those meanings of human that are relevant to the morality of abortion, any fetus is less human than an adult pig.
[759] That's silly.
[760] Wow.
[761] It's a human in development.
[762] That's even worse than I thought it was.
[763] I was more charitable with my description of that tweet.
[764] That's ridiculous.
[765] Oh, I tweeted right on it.
[766] Did you reply to it?
[767] Yeah.
[768] Oh, I was having a disagreement with Frankie Boyle.
[769] As I said initially, it's certainly more human than a pig that has zero potential to be a person.
[770] Right.
[771] That's ridiculous.
[772] It's a ridiculous statement from a really smart guy.
[773] And I would say, I would just argue that the, and we can move on to a different topic if you want to get off this one.
[774] But I would say that it's a person with potential, not a potential person.
[775] What's more of that?
[776] I think you got to, yeah, sure.
[777] Yeah, no, I see what you're saying, man. And this is why people need to have, cheers again, sir.
[778] See what we just did, we had a peaceful disagreement.
[779] Yeah, we did.
[780] That can be done.
[781] Respectful disagreement.
[782] But what we would agree on, I think 100%, is that we should be able to have that debate.
[783] 100%.
[784] There should never be like terms of service on Twitter that say you can't criticize abortion.
[785] You can't criticize transgender ideology.
[786] You can't joke about these things.
[787] Yeah.
[788] Not that abortion is a funny topic.
[789] It's not funny, but it is funny when Bill Burr.
[790] talks about it.
[791] Yeah, yeah.
[792] But it's only funny because Bill is a genius at like carving out a perspective on things.
[793] Right.
[794] You know, right.
[795] He is.
[796] Yeah, he's good.
[797] But I think it's, you know, what we agree on 100% is that there should be — you should have a conversation.
[798] I don't think — this effort, the content moderation effort, which is not aimed at the lewd and the indecent but just at opinions that the powers that be don't like, that effort is limiting what information we have access to, limiting what we can talk about.
[799] And we're supposed to.
[800] And we're supposed to be better informed as a result?
[801] You know, the argument that they give is that we're somehow going to be better off because we don't have access to all this misinformation. But you're better informed by hashing it out and talking through it and saying, we might learn something.
[802] We might change each other's minds by engaging in debate.
[803] And you take that away from everybody.
[804] And I think it's the most valuable thing in the world.
[805] I think it's the reason that Musk got involved — not because of the Bee. Musk got involved because he saw that threat to speech and the free exchange of ideas.
[806] Because where else are you going to have it?
[807] If you're not going to have it on Twitter or Facebook, where else are you going to have it?
[808] I think with Twitter the problem is in the format, because think about what you and I just did — this would be horribly frustrating to do through text.
[809] Yeah.
[810] You would have to think about the exact wording of your tweet.
[811] You'd have to respond to what I said.
[812] Character limitation.
[813] And, you know, and maybe you might insult me. Get a little jab in on me. Maybe I insult you back.
[814] Get a little jab in on you.
[815] And we're not in front of each other having a conversation like this.
[816] The beautiful thing about having a conversation with someone is that you're right there with them.
[817] And if you're a good person, you don't have to be in an argument with someone.
[818] If you could have a conversation with someone without being in an argument with them, you can't.
[819] I mean, I supported my position, you supported your position.
[820] We just talked.
[821] Right.
[822] And we can't be communicating.
[823] You're going to have a really hard time doing that on Twitter.
[824] Like even someone who's like a genuine kind person.
[825] Like I find myself to be a genuinely kind person.
[826] I try myself, I try very hard to be a genuinely kind person.
[827] I really do.
[828] And so, like, if I'm engaging with people on Twitter, I don't want to get into one of those things, right? I don't want to, like, shit on them. I don't want to fuck with them. I'm not interested in dunking on them. I could do that — I could ratio the hell out of them if I have to shut someone up — but I'm not engaged in that sport. But some people do, and the problem with that is that becomes sort of a way that you learn how to share ideas and communicate. And it favors being mean. It favors, like, taking things out of context, and it favors dunking on people. All of that wouldn't happen if you talked to that person in real life.
[829] The problem is we're taking what's essentially a complex exchange of ideas and philosophies and interactions.
[830] You're doing it with words.
[831] You're also doing it with tone.
[832] We're looking at each other.
[833] Clearly, I respect you.
[834] You're a really nice guy.
[835] I think you're very intelligent guy.
[836] I like talking to you.
[837] I don't want to get in an argument with you.
[838] I don't want to hurt your feelings.
[839] But I also want to say what I'm saying and I want to see where we have middle ground and where we disagree.
[840] But in person is the way to do that.
[841] We're designed for it.
[842] We're not designed for text.
[843] You know, it's just, it's bad for us.
[844] Yeah, yeah, I get that.
[845] Plus, now it's like a preserved written record of how mean you just were to somebody.
[846] Exactly.
[847] You know?
[848] Well, it's just.
[849] It doesn't fade away.
[850] It's there forever.
[851] It's not how we're designed to communicate.
[852] If we say something in a heated argument and I regret it, I can apologize to you, and it fades away.
[853] It goes away.
[854] Yes.
[855] You know, it doesn't fade away when it's a written record on the Internet.
[856] Well, it's just like having a conversation, right?
[857] Sometimes you say shit and you're like, oh, why did I say that that way?
[858] Yeah.
[859] Because you're thinking out loud, and sometimes you're not good at it.
[860] You're just, you're running with words and you're trying to figure out how to structure them together.
[861] It doesn't always work out well.
[862] Right.
[863] But when you do that on Twitter, the problem is now it's printed and published and it's there forever.
[864] It's just like a dumb thing that you might have said yesterday in real life.
[865] It's not like this is something that you're like staking your intellectual reputation on.
[866] I've researched this thoroughly and these are my opinions.
[867] No, you're just fucking tweeting something.
[868] Right.
[869] But the problem with fucking tweeting something is it's a weird way to communicate.
[870] It's permanent record that is casual thought.
[871] Right.
[872] You know, and that's a lot of it.
[873] I mean, some people put more effort into it, and they really, like, structure their tweets — and those people are mentally ill, and they should fucking go find actual physical hobbies
[874] a real human being should engage in.
[875] You don't spend an hour on each tweet?
[876] Thinking through it.
[877] Well, I mean, if you had something important to say, yes, there's nothing wrong with that.
[878] My point is that that's, the problem is people do that all day long.
[879] Right.
[880] That's their sport.
[881] Their sport is like getting in arguments and virtue signaling on Twitter.
[882] And I know people who have lost their fucking minds doing that.
[883] And I recognize it the same way I recognized when I was in high school, my cousin's friend started selling coke.
[884] And I noticed that guy was losing his fucking mind because he's doing coke all the time.
[885] It's the same kind of thing.
[886] I know people that I'm friends with and I'll go to their Twitter page and they're insane now.
[887] All they're doing is tweeting about the Democrats — pro-Democrat this, pro- — they're talking about obscure congressmen in South Dakota.
[888] It's wild shit.
[889] And their career has completely fallen apart.
[890] Like, their career has fallen apart.
[891] There's very few of them that are really successful that engage in this kind of constant, all -day -long, psycho behavior.
[892] It's all people that are, like, literally flailing.
[893] They're mentally ill. And they don't see it that way.
[894] They see that somehow they're out there fighting a good fight, like one tweet at a time.
[895] It's wild.
[896] It's madness.
[897] I do think that, that engaging in those forums, if you have something to say on issues that you think are important, rather than keeping it in and holding it to yourself for fear of whatever backlash you might get or getting sucked into an argument, I do think there's benefit to that.
[898] It's just, you know, in moderation, right, not taking it too far.
[899] Right.
[900] And making it, like, your thing — like you're on Twitter all day, like debating people, or like trying to beat people over the head with whatever your political ideas are or your moral ideas or whatever.
[901] But I mean, advancing, trying to — you know, like the Bee, it's interesting.
[902] What we do with the Bee, satire itself, is like — on one hand, you're just trying to make people laugh, but you are also trying to make them think.
[903] You're trying to engage the ideas of the day, right?
[904] Like the satirists — like the way that, uh, the way that The Onion defines satire in one of their, like, encyclopedias or whatever is, it's being a smart ass while saying it's for a higher purpose.
[905] And that's funny, because the satirists will tell you it's for a higher purpose, right?
[906] They'll tell you like we're trying to like speak truth through these jokes.
[907] We're trying to like tear down bad ideas and address them.
[908] And I think that's true.
[909] I think there's a benefit to that.
[910] I think it's why it's why we want to be in the conversation is to be playing.
[911] Like I — we don't have a mission statement at the Babylon Bee, but if we did, I would say it's to ridicule bad ideas.
[912] Ridicule them.
[913] Do you know what a heyoka is?
[914] Have you ever heard that term?
[915] I do not.
[916] It's a Lakota term for a sacred clown.
[917] They had a character that was in their tribe that would make fun of everything.
[918] He would make fun of the chief, he'd make fun of the best warriors, he'd make fun of the women, he'd make fun of the children, make fun of everything.
[919] That's great.
[920] And the tribe had, they had a philosophy that anything that could not be made fun of was bullshit.
[921] You had to find the holes in things.
[922] And one of the ways you would find the holes in things is to make fun of them.
[923] And then there was an important part of the tribe.
[924] And they were the sacred clown.
[925] And that's a heyoka.
[926] And that's a satirist.
[927] That's the Babylon Bee.
[928] That's a good stand -up.
[929] That's a lot of comedy.
[930] A lot of meme comedy is coming from that perspective.
[931] Right.
[932] They're mocking something that is available to be mocked.
[933] There's some things that are not available to be mocked, right?
[934] Like a horrific murder of a good person.
[935] It's not available to be mocked, right?
[936] We all agree.
[937] Because we don't want that happening, and this is not something we support as a society.
[938] But when it gets down to debatable ideas, then you've got to find out how mockable is that debatable idea.
[939] And when it comes to, like, Rachel Levine winning woman of the year, that's pretty fucking mockable.
[940] It's mockable.
[941] It's not just funny.
[942] And to say it's not mockable doesn't, you know, like to say it's hateful to mock it.
[943] Right, right.
[944] That's a mockable idea.
[945] Right.
[946] But I also think that there's a moral obligation to mock some of these things.
[947] A moral obligation.
[948] I would say the absurd has only become sacred because it hasn't been sufficiently mocked.
[949] I think we have crazy ideas, crazy ideas that comedians to some extent bear the responsibility for becoming popular because they were too afraid to mock them.
[950] They were too afraid they would get canceled.
[951] They didn't want to make fun of it.
[952] You know, like kids, kids are so impressionable.
[953] Kids don't have like, kids don't have like a theological foundation or a philosophical foundation.
[954] They can't ward off bad ideas.
[955] Like they just absorb what.
[956] whatever you throw at them.
[957] But let me stop me there because you're talking about my tribe now.
[958] Yeah.
[959] Here's the thing about comedians.
[960] We make fun of things we think are funny.
[961] Right.
[962] So we don't have an obligation to decide that something is funny.
[963] And if you say that we have an obligation, like who is our representative that has that obligation?
[964] Hold on.
[965] They're individual artists.
[966] Some of them are absurdists.
[967] Some of them are guys like Zach Galifianakis or like Mitch Hedberg that just write non-sequitur jokes.
[968] Like they're not responsible.
[969] for anything.
[970] The idea that comedians are responsible for mocking something.
[971] Well, if there is a comedian who sees something there and he wants to talk about it on stage, then he's responsible for making it funny and it's an important subject and it's something that you can mock, just do a good job on it, make sure it works good.
[972] That's the responsibility of the individual, but it's not like we have a, but it's not like we have a committee.
[973] We're not like a government organization.
[974] No, no, no, no. We're not like the FDA.
[975] We approve bad drugs.
[976] We're not that.
[977] We're a group.
[978] We're a group of artists.
[979] But we have an obligation as comedians to be funny.
[980] Yes.
[981] So when you said — okay, so you were talking to Gina Carano recently, and you talked about how woke shit is the funniest shit.
[982] That's what you said.
[983] Woke shit's the funniest is the funniest.
[984] And we make fun of that stuff.
[985] So ridiculous.
[986] It's so ridiculous.
[987] And somebody's got to make fun of it.
[988] I think at a minimum, the comedian has an obligation to be funny and not dance around those things that deserve mockery.
[989] And so you can look at it from two perspectives.
[990] If you want to be funny, you got to go after the stuff that's really funny, and you shouldn't dance — you shouldn't try to avoid that just to, like, not ruffle feathers. But the key is just whether or not you think it's funny, right? That's the key. Yes, yes. It's like, some people don't have a joke on something. Like, I don't have an abortion joke, right? But Bill Burr had a great one. Right. I didn't think the idea of it — like, it didn't pop into my head as good subject matter. It popped into my head as a problematic human situation.
[991] So when I look at comedy, like I have to decide what I want to talk about based on what I think is funny.
[992] It can't be any other thing.
[993] Now, if I look at something like Rachel Levine winning woman of the year and think it's hilarious, I would probably do a bit about that, but Chappelle had already done that bit about Caitlyn Jenner winning woman of the year and compared it to Eminem.
[994] And it was hilarious.
[995] It's a brilliant bit.
[996] So that subject's dead to me now.
[997] So I move on.
[998] But that's how we do it.
[999] So, like, if there's a thing that I think is funny, and I decide to talk about it on stage, then I agree with you.
[1000] Then I have an obligation to see it through.
[1001] So I would put satire in a different category than just generic comedy, just jokes for the sake of jokes.
[1002] You know, The Onion's thing about, you know, it's being a smart ass and saying it's for a higher purpose.
[1003] Generally, throughout the history of satire, especially like political satire, right, that's dealing with the issues.
[1004] of the day in the culture.
[1005] The idea is to, I heard somebody define it as, you know, satire weds wit with moral concern.
[1006] Okay.
[1007] So you're taking, you're looking at what are the social cancers?
[1008] What are the, what are the things that are bad for society?
[1009] Mm -hmm.
[1010] And you're finding a witty way to excise them, to cut them out before the cancer can, can kill the host, right?
[1011] Sort of.
[1012] Also, a lot of times you're just mocking things.
[1013] You are.
[1014] You are.
[1015] A lot of times you're just mocking the mundane and the silly and the deserving of it.
[1016] But you're also finding those things that are like dangerous, that are harmful, the social cancers.
[1017] And you're saying, okay, look, we're not running around.
[1018] We're not an attacker with a knife trying to stab and hurt somebody.
[1019] We're more like a surgeon with a scalpel trying to cut out something harmful so healing can happen.
[1020] And so there is like the satirist still has like this mission that goes beyond just, I want to make people laugh.
[1021] And abortion is a great example of this because abortion is not a funny topic.
[1022] But we've done satirical jokes.
[1023] like Bill Cosby claims sexual assault is only 3% of what he does.
[1024] And Planned Parenthood defends him for that reason.
[1025] Because Planned Parenthood claims that abortion is only 3% of what they do.
[1026] So we'll do like an abortion joke like that to show, you know, the absurdity of Planned Parenthood trying to get off the hook by saying this is only 3% of what we do.
[1027] By saying, okay, what if Bill Cosby said sexual assault is only 3% of what I do?
[1028] So we're making a point there that's not just like going for laughs.
[1029] You see what I'm saying?
[1030] I see what you're saying.
[1031] And what you guys are doing is trying to speak to a voice you don't think is being heard, and that's why it's been very successful. Because there's not a balance in narratives when it comes to, like, left or right in this country, and when there's unspoken stuff — because now we're banned, right — and there's unspoken stuff that you can't talk about that a lot of people wish they could talk about, those are the things that you can poke fun at, and they get a big response, and they get shared a lot.
[1032] I mean, obviously you guys are still on Instagram.
[1033] You have a lot of people on Instagram and you're still on Facebook.
[1034] And I think there's some, there's some sort of, uh, we got banned on TikTok recently.
[1035] Did you?
[1036] Yeah.
[1037] It's probably good.
[1038] You should be off TikTok anyway.
[1039] No, appeal, permanently banned.
[1040] I know.
[1041] I know.
[1042] I was like, I don't even know that we want to talk about that that much.
[1043] We don't even like it.
[1044] We're not even making a big deal that we got banned on TikTok because maybe we shouldn't have been in the first place.
[1045] Do you ever read the terms of service?
[1046] No, not in detail, but I've read it on the podcast.
[1047] We read it — me and Theo Von read it out on the podcast.
[1048] It was bonkers.
[1049] It's bonkers.
[1050] They get access to your microphone.
[1051] They get access to all your keystrokes.
[1052] They get access to other computers that aren't connected to your phone.
[1053] Why does Apple allow that in the App Store?
[1054] I have no idea, but that is one circumstance where Trump was correct.
[1055] When he was talking about banning that, look, that is Chinese spyware that's dressed up as the most addictive social media app ever.
[1056] It's wild.
[1057] It's a Trojan horse.
[1058] Yeah.
[1059] When they back-engineer that shit and find out exactly what's in there, they're freaking out.
[1060] They were like, this is like the most invasive app we've ever discovered.
[1061] Right.
[1062] It's so crazy and it's so addictive.
[1063] It's fucking genius.
[1064] I say to China, well played.
[1065] You got us.
[1066] And now they're trying to turn Instagram into TikTok basically, essentially just like morph it into that to compete with it.
[1067] Well, you know, I mean, that's what they always do, right?
[1068] That's where stories came from.
[1069] Right.
[1070] Yeah.
[1071] They always find the best features that some other application has.
[1072] They all do that.
[1073] Twitter did that a lot.
[1074] Remember fleets?
[1075] They did fleets for like a minute.
[1076] Well, they're doing substack on Twitter, right?
[1077] They're allowing people to pay, which is interesting, pay to be like a super tweeter or something like that.
[1078] Right.
[1079] Super supporter, super follower, something like that.
[1080] Yeah.
[1081] Yeah.
[1082] Hello.
[1083] It's like, they all do that.
[1084] They all, I guess that's what you have to do.
[1085] The thing about these companies is that these are all companies that, you know, they have stakeholders.
[1086] There's, they're stockholders.
[1087] It's a lot of responsibility.
[1088] Yeah.
[1089] They have to do.
[1090] If a CEO doesn't capitalize on some opportunity where there's a really popular thing that keeps happening, like, that's why Instagram is always pushing the videos.
[1091] They're so, the reels, they want Instagram reels to be like the most important thing on Instagram.
[1092] Right.
[1093] The other things don't get nearly as much traction.
[1094] Right.
[1095] Now I go to — I go to make a new post, and the first thing that's an option is a reel.
[1096] Like, I don't want to just post a reel every time, you know.
[1097] Well, they know.
[1098] They know what's going on.
[1099] They know that them TikTokers, they got them.
[1100] TikTok got them.
[1101] They came along out of nowhere with the most addictive app ever, that's also spyware, and no one cares. It's not good for teens either. Not healthy. Well, all that stuff is bad for kids. Do you keep your kids off it?
[1102] No, I don't. I don't think my children should not be exposed to things that are in the world. I think the more you protect them from certain — I think you have to have communications with them about it. I talk about it openly. I've described — my daughter came home the other day, she goes, one of my friends is so mad at you, because her mom watched a video of you talking about the terms of service on TikTok and she made him delete it off his phone.
[1103] It was hilarious.
[1104] I go, good, you should delete it off your phone too.
[1105] I'm like, they're listening to me right now, honey.
[1106] This is dangerous.
[1107] But that's how I deal with it with my kids.
[1108] I give them the opportunity to make their own decisions.
[1109] I think it's very important.
[1110] And I don't think — in this world, keeping your kids off social media, you might think that's good, and it's probably good to regulate it, and it's probably good to have discussions with them.
[1111] But everyone's on social media.
[1112] This is the new world.
[1113] This is keeping kids from listening to rock and roll.
[1114] That's what this is.
[1115] This is keeping kids from wearing skirts.
[1116] We're in a new world.
[1117] The new world involves social media with most people.
[1118] The key is going to be how do you engage with people and how do you treat people the way you would treat people if you were in front of them?
[1119] If they were a genuinely nice person and you're being genuinely nice to them.
[1120] We should discourage cuntiness everywhere, including cuntiness in comments, cuntiness in tweets, cuntiness on Facebook.
[1121] That's not good.
[1122] And to encourage that kind of communication, that shit is going to come back on you.
[1123] You're setting a tone, and you see it with so many people that these attack dogs, they develop like a fan base of other attack dogs, and then someone goes after them.
[1124] They go after them, they start attacking them, And they hate it Because everybody hates that It's a shit way to talk If you were having a fucking dinner conversation At your house with a couple of buddies And one guy brought his friend And his friend is just an insulting asshole He just wants to mock everything you do And shit all over you Talk about shit you did in seventh grade And you're like, get this fucking guy out of here Sounds like a good time He'd be like get this fucking guy out of here This guy's rude It's this That leaves room for mockery Leaves room for comedy but I think what we have to be really careful with is we're setting a tone for communication because most of the communication that people do today where it's with other folks that they don't know a lot of people are like the majority of their communication is on social media that's crazy.
[1125] It is scary, though, that teens are, like, suicidal because of, like, what they see on social media. I mean, I don't think rock and roll ever had that kind of — you know, like, letting your kids listen to rock and roll, there was a lot of — that was a debate at one time, right? But I don't know, social media has the ability to put kids into very unhealthy mental states. That's what they thought about rock and roll. Yeah, it's just the exact same thing they thought when they wouldn't let Elvis shake his pelvis on television. Yeah, it's the same exact thought. They're like, oh my God, rock and roll is going to ruin my children. This is the world your children are living in. Your children are young human beings. This is the world they live in.
[1126] Give them a strong set of morals and ethics.
[1127] Show them by the way you talk to people, by the way you behave that you're kind and thoughtful.
[1128] Show them that you work hard, that you have discipline, and that you admit your faults.
[1129] And you're always trying to do better.
[1130] Show them that.
[1131] And they can apply that to all things in life.
[1132] But to try to pretend that we should all live in a world that is completely alien to the world that we currently live in seems to me to be ridiculous. That doesn't mean you should be on, like, Chinese spyware, right? But if my kids want to do that, that's what they're gonna fucking do, right? Because they enjoy the fun of doing TikToks and shit, you know? There's a real world that we live in, and that real world is fucking complicated. I think the most important thing is to let your kids understand this shit is complicated. Yeah. Let them understand that there's a lot of games afoot, there's a lot of things going on, there's moving pieces.
[1133] There's a lot of narratives that are being projected that aren't accurate and they do it on purpose and these are people that are in the fucking highest echelons of media.
[1134] They're putting out lies and nonsense.
[1135] They're acting as propagandists for the state.
[1136] They're doing what administrations want them to do versus what their journalistic ethics should compel them to do.
[1137] And this is happening left and right, and we know that 75% of all advertisements on television are from pharmaceutical companies.
[1138] That's fucking wild, right? That's wild, that the vast majority of what people see on television has to be in line with the hugest advertising budget that the world has ever known. That's some wild shit right there, and that's the pharmaceutical companies. Really. And we're all cool with that. Equipping kids, though, to deal with this stuff — I absolutely agree with you. I totally agree with you. And it also goes back to the point about mockery. I think that kids should see you modeling good behavior, but they should also see you mocking ideas that deserve to be mocked.
[1139] Yeah.
[1140] Because otherwise, they're going to take those ideas seriously.
[1141] They're going to think that there is such a thing as a transgender three-year-old, you know, just because the boy picked up a Barbie for two seconds.
[1142] Well, there's so many stories of people who were, when they were younger, thought they were a boy, and then they grew up and just became a tomboy and then became a regular woman.
[1143] And they're like, oh, what if I had lived today?
[1144] Oh, and what a crazy time for gays and lesbians, like lesbian women.
[1145] who are a little bit more masculine in how they dress and how they feel, but they don't, they don't identify as male, you know?
[1146] They're attracted to women.
[1147] They are themselves women.
[1148] They identify as women.
[1149] But they're not like, girly, you know?
[1150] How crazy is that term you just said, what's that?
[1151] They identify as a woman.
[1152] That's like a normal term now.
[1153] Yeah, yeah.
[1154] Isn't that wild?
[1155] What is a woman, Joe?
[1156] What is a woman?
[1157] That documentary was fantastic.
[1158] It was.
[1159] It's really good.
[1160] It's really good.
[1161] I would encourage everybody on the left, on the right, in the center, libertarians — watch that documentary, What Is a Woman. It's very interesting, because he does the best job I've ever seen of having, like, a poker face through the entire thing. Yeah. He's just asking questions, asking questions, not being adversarial. Yeah. And he's letting them go. Why? You notice how, like, there's a couple of conversations in that documentary, too, where there were just questions, just simply questioning what is true. Like, what do you believe, and why do you believe it? Like, what's the truth here? And it actually got people irritated to the point where they wanted to kick him out of their office.
[1162] Oh yeah.
[1163] And I'm like, you know, this is what happened.
[1164] This is how you react like when you've spent your entire life and your current job is to suppress the truth.
[1165] The mere question about what is true is enough to upset you.
[1166] Yeah.
[1167] That's crazy.
[1168] That one politician.
[1169] He wasn't being adversarial.
[1170] The one politician that ended the conversation quickly.
[1171] Shut it down.
[1172] Yeah.
[1173] I don't remember what his question was, but it was pretty innocuous.
[1174] It wasn't the, like, sniping question.
[1175] Right.
[1176] There was nothing nasty about it.
[1177] It was just asking questions.
[1178] Like, you should be able to have answers to those questions.
[1179] You should be able to have answers to those questions to people who agree with you.
[1180] Right.
[1181] And you should have — if you hold that idea in your head and you're a politician, and you might actually vote on these things, and you might have to actually have a say on these things, and you might be able to promote these
[1182] ideas, you should tell me what you think. About everything. About crosswalks — I want to know what you think about stop signs. I want to know what you think about everything. You should be able to tell me. If this is something that you have an actual — you're a professional politician, you're supposed to have a fucking opinion on these very important issues. So this is an important issue, and somebody brings it up, and maybe they oppose you — you can't talk to someone who opposes you? That's preposterous, right? You have to be able to. That should just immediately disqualify you from being a politician. Someone should just say, whoa, whoa, whoa, you can't say we're not going to talk about this. If you wanted to talk, and then this comes up, and you don't want to talk about this — I'm interested in what you think about the... The question that I get all the time is, you know, oh, isn't your job easier now because this is such a target-rich environment? You know, like, there's so much — like, we can't even decide what a woman is. You know, like, a Supreme Court Justice nominee was asked, what is a woman — and she is a woman, by the way — and she refused to answer that and said, I'm not a biologist. It's like — this kind of — you can't make it up.
[1183] You can't make it up.
[1184] And we're doing satire and comedy.
[1185] I think, in my opinion, you know, like when people say it's a target -rich environment, yeah, of course, you know, some of these things are easy to make fun of.
[1186] You know, a lot of these things are easy to make fun of.
[1187] But, like, you could literally just publish what he says verbatim, and it's funny.
[1188] It's hard to exaggerate.
[1189] It's a Mike Judge movie.
[1190] Yeah.
[1191] It really is.
[1192] It's like idiocracy.
[1193] It's making comedy harder.
[1194] I don't know if you've seen.
[1195] Did you see the spreadsheet that I shared that has our 70 — it's now 76 — 76 jokes we made that came true?
[1196] Oh, no, she'll pull that up.
[1197] Yeah, we tell, I don't even know.
[1198] It was on my, uh, it's got to be somewhere.
[1199] It's on my Twitter.
[1200] I don't know how it'll directly link.
[1201] It was a spreadsheet, a Google sheet that I shared.
[1202] Jamie is the wizard of Google.
[1203] He will find it.
[1204] 76 jokes we've made that have come true.
[1205] And it's things like, you know, Gavin Newsom named U-Haul salesperson of the year.
[1206] And then, like, Fox News puts out a story that, like, U-Hauls are running out of trucks so fast, because people are fleeing for Texas, that they can't keep up with the demand. You know, and so it's like — but you did this way before. We did it a month before they actually published. There you go, there it is. That was quick. So the left column is the joke, and then the right column is a real story. Let's see what they are. Can you — you can't zoom anymore from here. There we go. Blind boy needs — whoa, that's huge. Oh, look at those pop-up windows. They fucking occupy everything. Okay, scroll up a little so I could read this. There we go. Oh, it just keeps going.
[1207] Oh.
[1208] What is going on with these pop-up windows?
[1209] You fuck.
[1210] Make that a little lower.
[1211] A little smaller rather.
[1212] There we go.
[1213] New York Times praises Soviet Union for unprecedented gender equality in labor camps.
[1214] Click on that.
[1215] How is that?
[1216] Well, that's our article.
[1217] And then the real story is like over in the right column.
[1218] Is that it?
[1219] Yeah.
[1220] Okay, click on that.
[1221] Biden's pick for banking regulator once praised Soviet Union for having no gender pay gap.
[1222] Wow.
[1223] So...
[1224] It's not in camps, but it's pretty goddamn close.
[1225] It's close.
[1226] So I think some of these are like, it's like a partial, we call it a partial fulfillment of a prophecy, you know, like it comes half true, and then sometimes it comes like totally true.
[1227] And then sometimes it comes even more true than like it's even more exaggerated.
[1228] Look at it says there.
[1229] Say what you will about the old USSR.
[1230] There was no gender pay gap there.
[1231] Market doesn't always know best. What the fuck are you talking about? Right. And here's the thing that's gross about that, the grossest part: she knows the gender pay gap is not as simple as, you're a carpenter, she's a carpenter, right, you both do the exact same quality work, she makes a hundred dollars an hour, you make 150. That's not what it is. No, no. What it is, is men choose different jobs. Yeah. Like welders and architects and whatever. And they work more hours, and women get pregnant and they have babies and they can't work. There's a lot — they make decisions because they're mothers, that they don't want to, the husband can pay all the bills — they make decisions within the household more regularly, so that, across all humans, they make less money overall.
[1232] But it's not the same people making the same job.
[1233] Like if they do the same work, why wouldn't you just hire all women?
[1234] Because they're just as good and you could pay them less.
[1235] That's so dumb.
[1236] But it's a lie.
[1237] It's not dumb.
[1238] It's a lie.
[1239] Because people know, and a lot of people don't know this.
[1240] I had a friend of mine who was arguing with me about it.
[1241] Right.
[1242] Like we were talking about divorce settlements, and he was saying that he thinks that maybe it's to make up for the fact that there's a gender pay gap.
[1243] I go, what is the gender pay gap?
[1244] We had a conversation about it.
[1245] He goes, well, women get paid, you know, 75 cents for every dollar a guy makes.
[1246] I go, doing the same job?
[1247] He goes, yes.
[1248] I go, no. No. He goes, really?
[1249] I go, no. No, it's different jobs.
[1250] Right.
[1251] That's what it is.
[1252] Now, if you can say that there's some sort of a gender bias within those jobs.
[1253] By the way, I would call that harmful misinformation.
[1254] You know why that's harmful?
[1255] Look at all the resentment it creates.
[1256] Look at all the strife.
[1257] Look at all the division it creates.
[1258] You end up with a political divide.
[1259] You end up with people like acting like they're getting the short end of the stick and thinking of themselves as a victim when they're not.
[1260] It's just not true.
[1261] It's a misleading narrative.
[1262] Right.
[1263] And it also flies in the face of real gender discrimination that probably does happen in some jobs.
[1264] Right.
[1265] So it fucks that up.
[1266] Right.
[1267] Because it like paints this unrealistic narrative.
[1268] You could highlight an actual real case, like real gender discrimination, whether it's with specific fields or specific companies where, like, there's an old boys' network that controls who succeeds and who doesn't succeed.
[1269] And it's not based on merit.
[1270] It's just based on, you know, cronyism, then it makes sense.
[1271] But if you're saying that and you know it's not true, and you know it really is that women choose different jobs, and that men go into different fields of work that are sometimes more dangerous — and men are much — Jordan Peterson talks about this as well. You know, that men are more likely to die, they're more likely to get murdered, they're more likely to commit suicide.
[1272] There's more of them in prison.
[1273] Yeah, yeah, it just goes on and on.
[1274] But specifically, when it comes to work, they get killed at work more often.
[1275] They die on the job.
[1276] It's like, and they choose these paths based on, you know, a lot of, like, traditional male characteristics.
[1277] More often than women, because they're roofing more often than women are.
[1278] Probably right.
[1279] Yeah.
[1280] But that's the whole point, is that he's saying these are dangerous jobs
[1281] that these men gravitate towards.
[1282] Right.
[1283] And also, like, very physical jobs.
[1284] Like, when was the last time you saw a female garbage man?
[1285] Right.
[1286] And I'm sure they exist.
[1287] I'm sure there's a garbage woman out there.
[1288] I'm sure she's mad at me later now.
[1289] Motherfucker, I listen to you while I'm working now.
[1290] I hate your ass.
[1291] She needs a transition.
[1292] I've been working in Tallahassee, Florida slinging garbage for 16 couple years.
[1293] I'm sure there's a woman like that out there.
[1294] Probably got a cigarette hanging out of her mouth.
[1295] But for the most part, it's garbage men.
[1296] Yeah.
[1297] Masonry, you know, like heavy duty construction
[1298] jobs — the vast majority are men, you know, walking up on those fucking beams, 30,000 feet in the sky. They're not that high, but you know what I mean, like the skyscrapers.
[1299] That's like flying out to do.
[1300] A lot of those are men, and a lot of people that, you know, gravitate towards certain fields that require extreme competitiveness in the work environment, you know, 16-hour days, like lawyers and doctors.
[1301] Oh, I've heard Jordan Peterson talk about that at length and the lack of appreciation that's there, too.
[1302] You know, like, men are, like, men are making things work.
[1303] They're fixing things that break.
[1304] They're building things that we need, you know, and they're working themselves to the bone to do it.
[1305] And there's, like, very little appreciation for those types of jobs that they're doing that are just backbreaking labor.
[1306] There's very little appreciation in terms of, like, what impact it would have on society if they didn't exist.
[1307] Yeah.
[1308] Imagine a world with no plumbers.
[1309] Right.
[1310] You know?
[1311] Right.
[1312] Yeah, bro.
[1313] Right.
[1314] Imagine the world with no electricians.
[1315] You have no idea what the fuck is going on in that box.
[1316] Right.
[1317] He got to call a guy.
[1318] Right.
[1319] You know, imagine a world with no carpenters.
[1320] Like, how do we make a house?
[1321] We have to make our own house.
[1322] What?
[1323] Yeah.
[1324] No framers.
[1325] What?
[1326] No cement guy.
[1327] I've got to figure out how to mix cement.
[1328] What the fuck is to?
[1329] I've got to run a toilet line?
[1330] No, we're going to make an outhouse.
[1331] Right.
[1332] We're going to shit in a hole in the ground.
[1333] I'm going to keep moving every six months.
[1334] Yeah.
[1335] Go down real quick.
[1336] Real quick.
[1337] Yeah.
[1338] But, you know, that's like teachers, right?
[1339] Teachers are one of the most important parts of a child's development —
[1340] the education they're exposed to when they're a child.
[1341] But we don't think about that as being that valuable.
[1342] We don't pay them very much.
[1343] We treat them like shit.
[1344] Right.
[1345] You know, it's not a great job in terms of like the financial reward.
[1346] It's not very celebrated.
[1347] We only think about them when they suck.
[1348] We're only mad at them when they do something wrong.
[1349] Like teachers are responsible.
[1350] Like I've had a lot of bad teachers.
[1351] Same with law enforcement, by the way.
[1352] Absofuckinglutely.
[1353] I am — well, I'm of the opinion that most cops are good people.
[1354] That's why most interactions that people have with cops, I think cops are representative of human beings, and most human beings are good people.
[1355] Most, the vast majority.
[1356] That's why you can go to the mall.
[1357] That's why, for the most part, without mass shootings, you can go to dinner at a restaurant, you can go.
[1358] Because most of the time, people are great.
[1359] Most of the time, even when they have disagreements, it's rare, unless you're hanging out in the wrong bars.
[1360] But most of the time, people are great, and I think that's the case with everything.
[1361] I really do.
[1362] I agree with that.
[1363] I think that's the case with most things in life.
[1364] Do you think, though, going back to that, our jokes keep coming true and all this nonsense, do you think comedy's harder now or easier?
[1365] Is it a target -rich environment that's easier to make fun of, or do you think it's more challenging?
[1366] Well, you're going to get criticized more, you know, but that's part of the job.
[1367] You know, this idea that comedy's under attack, you know, it means it's just people's opinions.
[1368] They're expressing their opinions on something.
[1369] They think you suck, or they think you're rude, or they think this and that. They're allowed to have opinions.
[1370] The problem is not that.
[1371] The problem is when you want to suppress a person's ability to say something that you would criticize.
[1372] That's where our problem lies.
[1373] A problem doesn't lie in criticism of comedy.
[1374] I think that's important.
[1375] I think criticism is important in everything, and you might agree with it.
[1376] You might disagree with it.
[1377] You might think that criticism is ridiculous.
[1378] It's completely out of context.
[1379] You're taking it out of line.
[1380] This is not what you're saying and this is really what they're saying.
[1381] Or you could say, you could look at it and say, I agree.
[1382] It's valid.
[1383] I think that joke sucks.
[1384] I think that comedian is mean.
[1385] You're allowed to think that too.
[1386] That's a part of being a person.
[1387] There's certain people that are so goddamn sensitive, they believe in microaggressions.
[1388] The slightest little look that someone can give you is she's microaggressing me at work.
[1389] And you could go to your fucking human resources person and make a formal complaint that someone's enacting microaggressions on you.
[1390] There's where the debate lies.
[1391] Is it better, is it better for society?
[1392] Is it healthier for us?
[1393] Is it better for that person if we mock that or if we coddle it?
[1394] Do we mock it or do we coddle it?
[1395] Microaggressions are the best example.
[1396] It's not right to coddle that.
[1397] If you say something really fucking dumb to a guy at the office, he should be able to go, okay, and just walk away.
[1398] Right.
[1399] That should be his right.
[1400] And if you want to go to human resources and say that, you know, you're going through a simulated pregnancy and he didn't support you.
[1401] And you were telling him about how you're in the, you know, third trimester of your simulated pregnancy, and he's like, okay, and he just walks away. And you've got a beard and you have blue hair and your name is Alice now, and for the last six months you've been Alice and you've been pregnant. What am I supposed to do? Right. You tell me what I'm supposed to do there. I can't go, okay? Yeah. You know, if you tell me that you're a fucking psychic and that you're an intuitive and that you're tuned into the world, you're an empath, you're telling me all this, and you're getting together with a bunch of people that don't believe in possessions and they're all polyamorous, and I go, okay, and I walk away, it's the same goddamn thing. You're saying something that's outside of the norm. You're talking cuckoo talk. You have so many people who would argue with you and say that you have a moral obligation to affirm, affirm, affirm, affirm. It's the right thing to do, it's the compassionate thing to do, it's the loving thing to do, is affirm. It's compelled speech. Because compelled speech is always dangerous. Yeah, it's always dangerous. It is akin to, uh, you mean, that's a problem some people have with, like, the Pledge of Allegiance. They think that's compelled speech, but I think that's patriotism, and I think, you know, with a good Pledge of Allegiance and a good idea of what we represent as a person, it's like a mantra that we can chant.
[1402] I could see their argument, though.
[1403] But I think that some of the proponents of compelled speech only want their side to win.
[1404] But what if it comes back against you, man?
[1405] That's Hitler, right?
[1406] That's fucking Mao.
[1407] That's Stalin.
[1408] When it comes back on you and there's a dictator saying something that you disagree with, but they're compelling you to say it.
[1409] They're compelling you to do it.
[1410] Just because you think that it's a kind thing to do to transgender people.
[1411] If you allow that right to exist in modern society, it will go into other things.
[1412] And if we go south, if something goes bad, if there's a civil war or a nuclear war with another country and only half of us survive and we get some hardcore, really fucking authoritarian type dictator running this country, they will turn that shit on you.
[1413] And the idea that that's never going to happen is preposterous.
[1414] The idea that that can't happen.
[1415] It's happening right now in other parts of the world.
[1416] That shit could happen.
[1417] What's happening in China could happen here.
[1418] If we allow a centralized digital currency and we allow a social credit score system and shit goes south and we have like an uprising against the government.
[1419] And so the government has to lock down and put more rules in place and they decide whether you can and can't travel based on your tweets, that shit could happen right here.
[1420] And the same sort of compelled speech that you would think would be compassionate towards causes that you support.
[1421] The problem with that is that could be applied to almost anything.
[1422] And in worst case scenario circumstances, which is what we always have to think about.
[1423] And if you set a precedent for it being okay for the powers that be to enforce those rules, then what if the powers change?
[1424] And suddenly it's somebody who disagrees with you.
[1425] Or this is the problem that the left has.
[1426] The left is redefining things every moment.
[1427] The goalposts are shifting.
[1428] The goalposts are always shifting for what's acceptable and what's not.
[1429] You know, conservatives tend to conserve, right?
[1430] At least that's what they should be doing, is preserving what they believe is true and good, right?
[1431] But then the left also is going after what they think is true and good, but it's a moving target all the time.
[1432] And so when the target changes, all of a sudden your view that you tweeted out like a year ago is bigoted today.
[1433] Don't you think that's a classic power struggle, though?
[1434] I think that's a classic power struggle between people who are in power and people who want to be in power, between two opposing parties.
[1435] If you made one team, you made them wear a blue jersey, the other team you make them wear a yellow jersey, the fucking yellow ones kind of kick the shit out of the blue ones.
[1436] Those blue ones are all pussies, and the blue ones think the yellow ones are going to quit.
[1437] That's just how people operate, man. And if you give people an ideology, a rigid ideology that they can follow.
[1438] And this is my problem with calling yourself left or right.
[1439] There's a lot of shit on both sides I agree with.
[1440] But when you give yourself a rigid ideology, then you subscribe to that ideology.
[1441] That becomes you.
[1442] And then you defend it, because you're defending your identity, you're defending your way of life against all these mindless hordes.
[1443] But if you think it's true, then what's wrong with defending it?
[1444] If you think it's true, then you should stand by it.
[1445] The problem is when you have, it's not rigid.
[1446] The problem is when it's moot.
[1447] Okay, what's the point?
[1448] My point is that people just do this.
[1449] Right.
[1450] They just do this.
[1451] They just decide, I'm a conservative Christian.
[1452] I'm an atheist liberal.
[1453] People just decide things.
[1454] Yeah.
[1455] And then they find ideas within there that they can sort of espouse, and the more you do it, the more you get love from your community and the more it kind of rationally makes sense to you.
[1456] And the more you're vehemently opposed to anything that's opposed to that.
[1457] Right.
[1458] The more you find the sworn enemy of the GOP.
[1459] You dig it in.
[1460] Everybody's like, yay.
[1461] And you've got a rainbow flag in your Twitter.
[1462] Yay.
[1463] You're doing a great job.
[1464] You're doing the right thing.
[1465] You're getting all the right.
[1466] Or on the other side, You know, whatever, whatever conservative ideas that you attach yourself to, you'll only hook, line, and sinker buy into those, whether it's First Amendment, whether it's Second Amendment, whether it's whatever different issues, border control, whatever different issues that people have.
[1467] And everyone loses all nuance.
[1468] Everyone is just fighting for one ideology or another ideology.
[1469] And then people switch.
[1470] How about those wild fucks?
[1471] They go, I've seen the error of my ways and I'm going to the other side.
[1472] And everyone's like, come on over, come on.
[1473] over, and people get excited, and then all of a sudden they have a completely different philosophy than they had just six or seven years ago. Like you did when you went far right. Yes, that's what they say. If you even defend liberty, freedom, freedom of speech, you're far right. Yeah, well, I'm so left it's hilarious. To call me far right is preposterous. You know, it's so dumb. But I like guns, you know. I believe that human beings should have the ability to defend their home and their property from bad people. Because I think bad people are a real thing, you know. And I think this idea that you're going to somehow or another make the world a better place by getting rid of guns that are owned by law-abiding citizens, and that the people that commit the crimes are going to follow those rules and they're not going to have guns anymore.
[1474] Well, you're talking about a population of guns.
[1475] It's larger than the population of humans.
[1476] Right.
[1477] That's a lot of guns to keep track of.
[1478] Do you guys keep track of all the bullets?
[1479] You're still allowing people to make bullets?
[1480] What are you saying?
[1481] You want me to give up my guns for what reason?
[1482] Right.
[1483] And it's like, you fucking, you fucking little dicks, you just want to keep your guns.
[1484] Like, no, I want to stay alive.
[1485] I want to stay alive.
[1486] If it's between a bad person or me, I want to be the person that makes that decision.
[1487] Right.
[1488] I want to be the person who's trained in firearms, who knows what the fuck he's doing.
[1489] And I don't want to be helpless.
[1490] But if guns are banned, then the bad guys won't have them.
[1491] And then you won't have to defend yourself with them.
[1492] Yeah, people have made that argument.
[1493] It's just too many people.
[1494] But the problem is not guns.
[1495] The problem is mentally ill people.
[1496] That's the problem.
[1497] The vast majority of people who have guns are just like the vast majority of most people.
[1498] They're good people.
[1499] Yeah.
[1500] I think that's across the board.
[1501] Our problem is with mental health.
[1502] I wrote a tweet about that once, that we don't have a gun problem.
[1503] We have a mental health problem described as a gun problem.
[1504] Yeah, I agree with that.
[1505] That is what it is?
[1506] It's a mental health problem.
[1507] And you've got to dig into it.
[1508] What's the root of that?
[1509] Where are we generating all this?
[1510] Why do we have a mental health crisis on our hands?
[1511] You know, what are the root causes of that? Dig into that, and nobody wants to.
[1512] Well, they do that with population densities.
[1513] You know, the population density rat studies they do mirror human behavior.
[1514] When they have rats, and there's, like, a lot of room for the rats to roam around, and they behave normal.
[1515] They behave like rats.
[1516] But as soon as you get too many rats into an area, they start developing all these, like, weird ticks.
[1517] Some of them sit in a corner and rock themselves back and forth.
[1518] They become way more aggressive; there's way more conflict.
[1519] They start caring about what pronouns you use.
[1520] Baa!
[1521] They start, you know, there's too much content.
[1522] There's too many different, fucking opinions and ideas and humans and that's what's causing a lot of what we're seeing right now.
[1523] And then it's compounded by social media because we're communicating in this ineffective way.
[1524] And also, a lot of people have never been taught how to communicate in a kind way, in a friendly way, how to have a conversation with someone that you disagree with and just talk to them like a genuine human being.
[1525] And try to see their perspective and try to find merit and steel-man their perspective.
[1526] You know, that's a very important thing for all human beings. Like, if you're a person who's a liberal and you don't have any conservative friends, I feel bad for you. If you're a person who's a conservative and you don't have any liberal friends, I feel bad for you. Because you're not exposed to a variety of different people, and there's some people that think ridiculous shit, but then they think really admirable shit, and it might be the same person. Well, and you may have some ideas that deserve to be challenged, that you should let go of, and they're never challenged because you never talk to anybody intelligent who has an opposing view, yes, that you might actually learn something from.
[1527] Yes.
[1528] Yeah, this is a strange time because I think there's almost too much to keep track of.
[1529] You know, there's a fucking 100 million streaming television shows.
[1530] There's so much content to be absorbed from the internet.
[1531] There's so much content.
[1532] And people are constantly being distracted, whether it's by social media, whether it's by real life.
[1533] And to form, like, concise opinions, to have, like, a real understanding of why you believe something and what you believe, it takes a lot of time, and it takes these kinds of conversations, the way we have this conversation. Like, you and I put our phones away, sit across from each other, and talk. And it's weird, because we're doing it for a podcast. It's, like, a way that we probably wouldn't do it in real life. If we were doing this in real life, we'd probably go to dinner, we'd probably have some food and talk, and we'd laugh, we'd talk some shit, but we probably wouldn't get heavy into something like that, like whether it's abortion or, you know. If we saw a different perspective, we'd probably just let it go.
[1534] Yeah, well you don't want to get contentious at dinner.
[1535] Exactly.
[1536] Yeah, but it's like, like, the beautiful thing about the conversation we had was I don't even know if it was really considered contentious.
[1537] No. We didn't see eye to eye on some aspects of it.
[1538] We do see eye to eye on other aspects of it.
[1539] That is what I think most of the country is on most things.
[1540] I think the problem is when people get rigid and they subscribe to only one ideology because they think they're supposed to.
[1541] And they don't formulate their own actual opinions on things.
[1542] And they don't want anyone else to be able to have a different opinion.
[1544] That's where...
[1545] And that can happen on both sides, for sure.
[1546] Yes.
[1547] And that's where, you know, I'm in full support of you guys.
[1548] And I think what Twitter did is wrong.
[1549] I really do.
[1550] I think it's bad for them, too.
[1551] It's bad for their reputation.
[1552] It's bad for goodwill.
[1553] You know, I think allowing conservative people to talk and joke around about stuff is just as important as allowing progressive people to make Trump memes.
[1554] I mean, you know, how much fucking hilarious comedy was made about Trump?
[1555] You know, it was great stuff.
[1556] You know, let me tell you, my favorite joke, actually, on that sheet that we were looking at for a moment was a Trump joke.
[1557] We did a joke about Trump in 2019. It was a headline about Trump.
[1558] And it was quoting him, I have done more for Christianity than Jesus himself.
[1559] And it's just like, you know, we're stuffing words in his mouth that he's never said.
[1560] That's hilarious.
[1561] We're playing off his ego.
[1562] You know, he's claiming to have done more for Christianity than Jesus himself.
[1563] Doesn't he say nobody loves the Bible more than me?
[1564] Probably, yeah.
[1565] Didn't he?
[1566] So, me and, didn't me and Whitney Cummings go over these things?
[1567] But he, so, I mean, it was probably in the context of him saying that that we made this joke.
[1568] Who knows?
[1569] But anyway, this goes viral, right?
[1570] And people on the left are sharing it like crazy because they want to believe that he really said something like this.
[1571] They're just eating it up.
[1572] And so it goes mega viral shared millions of times.
[1573] So Snopes gets involved and they fact-check it and they rate it false.
[1574] So like Trump never said this.
[1575] He did not claim to do more for Christianity than Jesus himself.
[1576] Then you fast forward to 2021.
[1577] And he calls into some radio show.
[1578] And he tells the host of this radio show that I've done more for Christianity and religion in general than any other person in history.
[1579] Oh, my God.
[1580] So two years later, three years later, he's literally, and he took it a step further.
[1581] No. Yeah, because he didn't say, I've done more.
[1582] Our joke was he'd done more for Christianity than Jesus.
[1583] He actually said he's done more for Christianity and religion, all religions, than any other person in history.
[1584] He took it a step further.
[1585] So, I mean, I think it's a funny example for a couple reasons.
[1586] I mean, it got fact-checked, you know, it was us making fun of Trump.
[1587] You know, we're always accused of being like, and I'm sure you're probably accused of being like a Trump supporter, even though you're like, you know, everybody, if you disagree with them on anything, they'll label you far right, they'll put you in Trump's camp, and there's no escaping that.
[1588] You know, you're aligned with him.
[1589] Right.
[1590] And we make these jokes about him, and they don't even realize we're joking about them because they think it's true.
[1591] And then it comes true.
[1592] It's just unbelievable.
[1593] Well, The Onion made a joke about Bernie Sanders supporting or accepting my endorsement that he shouldn't have done that because he shouldn't want to win.
[1594] Or something. It was way better than what I just said.
[1595] I forget how it went, but it was hilarious.
[1596] Yeah.
[1597] Because, like, that was the first thing I was canceled for.
[1598] It was, like, saying I support Bernie Sanders.
[1599] Right.
[1600] And I'm like, how can you call me a Trump supporter when I say I support Bernie Sanders?
[1601] Right.
[1602] That's the dumbest fucking comparison ever.
[1603] They're a little bit different.
[1604] It doesn't make any sense.
[1605] They're a little different.
[1606] I supported Bernie's idea about taking a little bit of stock speculation, taking a small percentage of that, and using it for health care and student loans and all that shit, and public education.
[1607] I'm like, that would be a brilliant idea.
[1608] It's a brilliant idea because it's a fraction of a penny from each transaction.
[1609] Yeah.
[1610] It's amazing.
[1611] It's a great idea.
[1612] And he said it would generate an immense amount of money.
[1613] He didn't believe it would have an overall effect on the economy.
[1614] I don't know if he's correct because I'm not an economist.
[1615] I'd love to talk to an economist that would tell me that it's bullshit, but it was a fascinating idea.
[1616] That's what I supported.
[1617] I'm like, that seems like a good idea.
[1618] Because in a lot of ways, I'm kind of a diehard hippie.
[1619] Like, I really think that we could all get along together and do better.
[1620] But I'm also a realist when it comes to human nature and discipline and people's, you know, willingness to cave in to bad ideas and to self-loathing and to, you know.
[1621] And that's why I tell people, like, it seems like a trivial thing, to, like, exercise. But I'm like, it's one of the most important things you're ever going to do, because it's hard to do, and you need more hard physical things to do in your life. Everybody does. It keeps you robust, it keeps your mind working well. Resilient. It keeps you resilient. But more importantly than that, it's great for your mental health. Because if you do something really fucking hard, look, it could be a 90-minute yoga class, a hot yoga class. Those are the best.
[1622] If you do that, that's so fucking hard to do that the rest of your life will seem easier.
[1623] Right.
[1624] And that's what it's all about with people.
[1625] If you think about people that come from like a really bad childhood and they have this incredible willpower, where the fuck do you think they got that from?
[1626] They got that from being beaten down.
[1627] Right.
[1628] You know, and if you do, you can do that to yourself.
[1629] Those things, it applies to yourself.
[1630] And there's too many sedentary people in this country.
[1631] And those people are upset by almost everything because their body is awash with fucking.
[1632] chemicals and hormones and corn syrup. They don't know what the fuck they are. They have no real foundation for who they are as a human. They don't understand their human potential. They never push themselves past the boundary of where their inner bitch wanted them to quit. That's important for you. It's important for human beings, for all of us. You can't force that on anybody. You can't force anyone, no, but you can encourage people to do it. Right, you can incentivize it. Well, incentivize it how? I don't know. I mean, I'm not saying, like, a disincentive, that would be like a penalty if you don't do it, but incentivize it positively somehow. Yeah, a lot of workplaces are trying to incentivize getting healthy. That's a good, you know, they'll give you some kind of benefit for, like, if you work out, if you use the gym. You know, and then the people that don't work out, right, claim discrimination. Yeah, like, you're fat-shaming me, I have a hormone problem. Yeah, you probably do have a hormone problem if you fucking eat terrible food and get to be 600 pounds. You're going to have a hormone problem. You're going to have a lot of problems. Yeah, your whole body's a cholesterol problem.
[1633] Yeah, you got a lot of issues.
[1634] How stupid, though, that you mentioned that was like the initial cancel attempt on you because you voted for Bernie Sanders or you supported, you endorsed him, right?
[1635] It's like, imagine going to a comedy show and you're like, you see like, you know, there's a comedian that's on, and you've heard of him and it's supposed to be a funny show, so you go to buy tickets and then you ask, you want to know first off, wait a minute, who did this guy vote for?
[1636] Like, how's that relevant to whether or not you're going to enjoy the show?
[1637] I want people to ask that question.
[1638] Who cares?
[1639] No, no, no, it's important.
[1640] because if you want that answer, I don't want you showing up. Right. I want them to ask that question. Please ask who I voted for. I know, but it's just the stupidest question. It's a great question. Weed those people out, I guess. Yeah, but I mean, what a stupid thing, to be like, what, you're not going to laugh at his jokes if he voted for someone that you don't like? Exactly. His jokes are funny or not, you know. Yes, exactly. They stand on their own. They're funny or not. And that is what we're supposed to be doing. And this, who do you support, I don't want to support who you support, that was one of the crazy things after Biden won, where people were calling for blacklists of people who supported Trump, people who publicly endorsed Trump or talked about Trump. They were talking about making them unhireable. I'm like, do you know how crazy that is...
[1641] Accountability Project, they called it, or something like that.
[1642] But you know how crazy that is to do to fellow Americans like to try to remove their livelihood?
[1643] You're coercing them. You know, that's a disincentive. You're penalizing them for not going along with what you want.
[1644] Where do you think this is going?
[1645] You think the children are going to suffer?
[1646] The person is going to lose their job?
[1647] What if they become homeless?
[1648] Like what is going to happen?
[1649] What kind of a physical abuse is going to happen to these people?
[1650] What horrible things are you enacting on people that are in the orbit of that person, like a part of that person's life?
[1651] Like if you cancel that person, who knows what kind of devastating effect that's going to have on the rest of their family?
[1652] What if that person commits suicide?
[1653] What if because they lost their job, because they get canceled because they supported Trump?
[1654] That's real shit.
[1655] Right.
[1656] People have killed themselves for very minor attacks on Twitter.
[1657] I mean, some people are very vulnerable.
[1658] And if you want to be this super sensitive, woke, kind, compassionate person, you're supposed to apply that to everybody, okay?
[1659] You're not supposed to only apply that to people whose opinions agree and align perfectly with yours.
[1660] You're supposed to look at people that are Trump supporters or whatever supporter they are that you disagree with, DeSantis supporters, and you're supposed to find common ground with them and find out why they like that person, and try to have a communication line with them where you're both being kind and friendly and just trying to talk about stuff.
[1661] Well, you're certainly going to have better success changing, if your goal is to change their mind, you can have a lot better success treating them as a person than vilifying them and calling them names.
[1662] Right, but punishing them by removing their livelihood.
[1663] Right.
[1664] Because they maybe in your eyes were incorrect about their political support.
[1665] That's crazy.
[1666] That sounds more like fascism.
[1667] It is fascism.
[1668] Yeah.
[1669] I mean, that's literally, what is the definition of fascism?
[1670] Because a lot of times it gets equated to the right, but I don't think it's supposed to be.
[1671] I think it's supposed to be whoever's in charge.
[1672] Like if Mao was on the left.
[1673] State authority.
[1674] It's going to have to do with state authority.
[1675] So they're going to try to differentiate it from fascism by saying, well, you're just being held accountable for your actions by people who don't want, you know, it's the market working itself out.
[1676] Right, but you're doing that at the benefit of the state.
[1677] You're acting as an actor for the left-wing party.
[1678] That's what you're doing.
[1679] If you're penalizing people for, they're going to lose their job.
[1680] They need to show, by the way, the last date this definition was edited, because they change these things all the time.
[1681] But look at this part.
[1682] A tendency towards or actual exercise of strong autocratic or dictatorial control.
[1683] Early instances of army fascism and brutality.
[1684] Forcible suppression of opposition.
[1685] Yeah.
[1686] There it is.
[1687] I mean, that's part of the definition.
[1688] Yeah.
[1689] Okay.
[1690] A political philosophy, movement, or regime such as that of the fascist, Fascisti?
[1691] Fascisti?
[1692] Oh, God damn, these fucking problems.
[1693] They always get you.
[1694] They're like, oh, not right now, bitch.
[1695] That exalts nation and often race above the individual.
[1696] Well, there you go.
[1697] It stands for a centralized autocratic government headed by a dictatorial leader.
[1698] That easily could be the leader of a fucking social media platform.
[1699] Severe economic and social regimentation, equality of outcome, and forcible suppression of opposition, banning you from social.
[1700] media for disagreeing, banning you and forcing you out of your job for disagreeing.
[1701] Like, it's aligned with that type of thinking.
[1702] It's a control -based, problematic thing that we've always agreed leads to horrible results.
[1703] It doesn't lead to good results if it's applied to a good cause.
[1704] It's still a bad philosophy.
[1705] It's still a bad thing to completely discourage or attack opposition like that and make it so that they can't talk.
[1706] It's not good for anybody.
[1707] And I know it's not dictatorial in terms of it's not actually the government.
[1708] But, God damn, they're so in line with them.
[1709] It's so obvious that they'll do things that benefit the side that they want to.
[1710] And that's never been more clearly expressed than during the Hunter Biden time.
[1711] That's wild shit, man. Leading up to an election, you're insulating your preferred candidate from criticism.
[1712] And it's the New York Post.
[1713] That's where it's wild.
[1714] It's one of the oldest newspapers in the country.
[1715] That is a long established newspaper.
[1716] Yeah, they talk a lot of shit.
[1717] They have funny headlines.
[1718] But it's New York.
[1719] You kind of have to have funny headlines.
[1720] It's part of the charm of those papers.
[1721] Jack Dorsey came out and said they messed that up, right?
[1722] Didn't he make a statement?
[1723] Say Twitter messed that one up.
[1724] Hey man, Jack Dorsey is a different animal than Twitter itself.
[1725] Yeah.
[1726] He really is.
[1727] Jack Dorsey is an interesting guy, very thoughtful guy.
[1728] And I think it's great that he stepped down.
[1729] He stepped away from Twitter.
[1730] His philosophy is that, it should be decentralized and he's working towards doing something like that now.
[1731] Yeah, something new.
[1732] Yeah, he also wanted to, I mean, he had an idea that wasn't popular amongst the other people of Twitter.
[1733] He wanted to have a Wild West Twitter.
[1734] He wanted to have a regular Twitter and then, like, whee, a 4chan Twitter.
[1735] Yeah, yeah.
[1736] That would have been crazy.
[1737] I wonder which one would have been popular.
[1738] If there was no, like, suppression of Wild West Twitter, that would have been where everybody was fucking shooting around.
[1739] I mean, are we talking, are we talking like even unlawful speech or?
[1740] No, I think they were always going to have no doxing, because 4chan has stuff like that.
[1741] No doxing, no calls to violence.
[1742] Nothing, you know.
[1743] I mean, the First Amendment doesn't protect all speech.
[1744] Right.
[1745] It's like that's really in line with the First Amendment.
[1746] I think that was the idea behind it.
[1747] Right.
[1748] That they were going to have your ability to express yourself.
[1749] Regardless of the language you use.
[1750] What's wild about Twitter is like you can show porn.
[1751] That's really wild.
[1752] The content moderation should be, that idea behind giving them immunity to moderate content was to be able to take down things that are like objectively indecent or lewd or obscene.
[1753] You know, like content that...
[1754] Well, I think it was mostly to protect people first.
[1755] I think it was more about doxing and harassment.
[1756] I mean, it's all in there.
[1757] It's in the language of Section 230.
[1758] Right, but that's where they started it.
[1759] They started it in response to doxing, harassment.
[1760] They didn't respond to pornography.
[1761] But all that stuff is there.
[1762] Yeah, but they don't have a problem with that.
[1763] They don't have a problem with pornography.
[1764] It's very interesting because, like, I follow...
[1765] They don't have a problem with harassment either, by the way.
[1766] I mean, look at Libs of TikTok.
[1767] And, like, Libs of TikTok will simply showcase, like, what's happening.
[1768] They'll post a video of something.
[1769] They'll say, oh, there's this family friendly drag show coming up, you know, like, and this is a flyer that's being publicly advertised, you know?
[1770] And, and then they will go hard, like, doxing and intimidating and trying to harass and shut up this account that's drawing negative attention to things that they don't want to receive negative attention.
[1771] It's like they're doing what they say, you shouldn't be allowed to.
[1772] do, but it's justified when they do it because they don't like the activity or the speech of the person that's saying it.
[1773] See, the thing about Libs of TikTok is people look at it and then they make this defense.
[1774] And this defense is this is not indicative of the greater whole of educators or of liberals.
[1775] You've found an egregious example of a far left loon and everybody is now going to attack everyone in that group.
[1776] But that's not, you can't, but it's not like what Libs of TikTok is posting is fake. Right, this is the thing, these are real. Yeah, and they're much more prevalent than people realize. Well, we're dealing with a lot of humans, that's the other problem. We're dealing with 330-whatever million people we have in this country. You're going to find a lot of really ridiculous people, and if you highlight those really ridiculous people, it does have the unintended consequence of forcing people into thinking that's happening everywhere around them, and then people who are not doing anything remotely like that get lumped into that same group.
[1777] That's what people are scared of.
[1778] And their overreach, their overreach to do that is to ban the word groomers.
[1779] Their overreach is to stop your ability to express yourself.
[1780] The overreach is to ban libs of TikTok.
[1781] That would be the overreach.
[1782] Well, the way that they lump themselves into the group, like, Libs of TikTok won't explicitly lump everyone into the group.
[1783] They'll just showcase the content.
[1784] Yes.
[1785] And then, but they will lump themselves into it by defending those things.
[1786] They'll say, well, why can't a teacher talk to children about sex and gender?
[1787] Why can't they demand that they be addressed by certain pronouns?
[1788] Why can't they be the confidant that children come to because their parents won't accept them for who they are?
[1789] So they're the mom and dad now and they're going to be loving and affirming of that child.
[1790] Like they defend it.
[1791] So they come and defend it.
[1792] So it's like it's not like these are outliers that they're saying, oh, no, no, we want nothing to do with that.
[1793] We don't believe in that.
[1794] They actually defend it hard.
[1795] They go in hard defending it.
[1796] Media Matters, all these people, they go in hard. And they suggest, what are they defending? They're defending the insane videos and the behavior. I think we have to talk about that on a specific case-by-case basis, because some of the videos, those people are insane, like, the thing they are saying is insane. I would like to see if they defended that. Yeah, no, what they'll defend, I mean, I gave a couple examples. They'll defend, for example, teachers coming out to their children and talking about sex and gender, talking to them about critical race theory, and giving kids an assignment where the white kids have to move to the back of the room and the black kids move to the front, to separate them and show how white kids are privileged. Now we're going to have the black kids at the front, and the white kids need to be quiet right now. In classes? Yes, they're doing that in classes. Wait a minute, what classes are they doing that in, where they make all the white kids sit in the back of the room as an assignment? That was an assignment. Yeah, it was an assignment. Yeah. But they defend this stuff, and we actually highlighted more. Okay, but if you wanted a kid to feel what it would feel like if you were a black child living in the early 60s, if you wanted to express how wrong that is, wouldn't you do it that way?
[1797] Like, if for an exercise, you're not talking about, like, if this was like every day, I would say that it's fucking crazy racial discrimination.
[1798] But for an exercise, to let kids, white kids know what it would have been like, to let everybody know.
[1799] Even the black, hold on, even the black kids, to let them what it would be like.
[1800] Let's imagine if this is 1963, this is how we would have to do it.
[1801] Okay, so I want all the white kids to sit back there.
[1802] and all the black kids to sit here.
[1803] And then you would say, you see how terrible that would be?
[1804] If I treated you only by something you have zero control over, what your ethnicity is, by the color of your skin and not by the content of your character, as Martin Luther King would want you to do it.
[1805] And then you could actually, like, bring the kids together with an understanding that at one point in time there was horrific racism, and it was prevalent all throughout America, and that people like Rosa Parks did have to sit at the front of the bus and get arrested so that people understood that this was going on, that people did have to sit at that counter and let people know that this is a real thing that's going on, and this is contrary to the way we should all feel, yeah, about human beings. I agree with that 100 percent, if the lessons were ever framed in the context of the content of your character mattering more than the color of your skin. It's totally not, though, because we're looking at the lesson materials, like, what's being exposed is a lesson where it's talking about how white people are privileged. You know, your skin, your guilt. It's all about white guilt. It's all about whites are an oppressive race.
[1806] They're privileged.
[1807] And to acknowledge your privilege, you need to do it.
[1808] So they're teaching about a present issue.
[1809] But a lot of it is about the history of slavery.
[1810] That's the reason why they're doing it that way.
[1811] No, no, no, no. These are critical race.
[1812] These are critical race theory lessons that are trying to say that currently, in our current system, the system privileges white people and not black people.
[1813] You're guilty just because of your skin color for being white and you need to be quiet right now.
[1814] But it's because of the history of slavery, right?
[1815] That's how they look at it.
[1816] Well, that's how they look at it.
[1817] This is the example.
[1818] But they're not suggesting that all that matters is your character, not your skin color.
[1819] They're suggesting your skin color matters very, very much.
[1820] I understand what you're saying.
[1821] But what I was saying was I could see how that exercise would work.
[1822] Yeah.
[1823] That's why I was saying it that way.
[1824] If you had a really good teacher and they explained it in that way...
[1825] I think it'd be less objectionable for sure.
[1826] You would say this shouldn't be the case with white people in the back and it should be the case with no one.
[1827] You should be judged as an individual.
[1828] How about one Libs of TikTok did, where kids, kindergartners?
[1829] Kindergarteners.
[1830] What's that, age five, kindergartners.
[1831] Yeah, six.
[1832] Five, six, first grade.
[1833] Sent home with a masturbation assignment.
[1834] What?
[1835] The masturbation assignment was to find a private place in your home where you can touch yourself without being disturbed.
[1836] Is that real?
[1837] Real.
[1838] Jesus.
[1839] I might have retweeted that.
[1840] And I'm saying, what?
[1841] Masturbation assignment for kindergartners.
[1842] I mean, this kind of stuff, it's like.
[1843] Shocked but not, because I did read this one thing where they were talking.
[1844] about kids that were preteens, and they were describing, like, different ways that people have sex. And I'm like, I just don't necessarily think that's your place. It's a complex thing that I don't necessarily want taught by someone who I don't even know. I don't know what you're like. I don't know how you're going to say this. Are you going to say this in a way that's promoting a certain thing? What are you going to do? Because children are very malleable, and most people want to be the ones, other than, you know, the world itself, but in terms of, like, authority figures, explaining things. You don't want someone explaining things to your kid that you absolutely don't agree with, right, or that you don't think they're ready for yet. Yes. Right, and you want that to happen on your timing. You want it to be, look, I've had some great teachers, but I also had some fucking dummies, some real dummies that said some stupid shit to me, you know. Yeah. I remember, I mean, a teacher kicking me out of the room, telling the class, don't laugh, because he's never going to amount to anything. Oh, well. Like, Mr. Rogan's never going to amount to anything.
[1845] And I remember like, why would you say that to a fucking eight -year -old?
[1846] Now, did you internalize that?
[1847] Is all of this?
[1848] Is this, your whole trajectory now in your career is to stick it to that teacher?
[1849] I'd be like, ha -ha.
[1850] But it was me cracking a joke in class.
[1851] But the whole point is like, you shouldn't say that to kids.
[1852] If you're a fucking teacher, you shouldn't say to a kid, you're never going to amount to anything.
[1853] That's a crazy thing to say to a kid.
[1854] Not a thing about it.
[1855] I think I was 14.
[1856] But it's not a thing that you should say to a kid.
[1857] It's a rude, shitty thing.
[1858] And if you're in a, and they just, that's what they're going to get from you?
[1859] You know, that's crazy.
[1860] That's crazy talk.
[1861] And that is a problem with expressing any complex ideas that may have a very nuanced, there's nuance to all of those conversations.
[1862] And you might not want this person, who's not that bright, not that good at expressing themselves, and very biased, to explain something to your kid that's going to cause an argument at home. Because then the kid's going to bring something up and you're going to be like, where did you hear this from? Or to tell your kid, you know what, you don't have to confide in your parents about that, don't tell your parents about this, come to me about these issues. That's where it gets crazy. That is grooming behavior, and you can't even call it that anymore, but it is. I mean, that follows right in the definition of it. You're most certainly tutoring someone towards your perspective. Yeah, you're most certainly, well, and you're positioning yourself as a mentor and confidant, right, you know, somebody that they can trust more than their own parents. Yes.
[1863] And maybe their parents are dicks, right?
[1864] So maybe you can get a leg up on the parents.
[1865] They like you even more than the parents.
[1866] It's just, it's a weird thing because children are so influenced.
[1867] They're so easily influenced by their environment.
[1868] Children are so malleable.
[1869] Right.
[1870] Depending upon the area they grow up in.
[1871] They'll have different personalities or different accents, rather.
[1872] They'll have different things that they gravitate towards, different sports, because the community enjoys them.
[1873] Kids are so goddamn malleable with almost everything. They're malleable with religion. I think that's why this transgender craze, and it is a craze, it's legitimately a craze. The numbers are off the charts with all these young people that are now identifying as either non-binary or some other gender. That's Abigail Shrier's point, and it's just gone off the charts. And Bill Maher was talking about this actually recently, where he's like, why are all these kids all of a sudden born in the wrong body? Right, you know, is it really that we're just more accepting now, and that's why? Or are we actually influencing kids in this direction?
[1874] Well, the idea that we're not influencing kids is ridiculous, because we influence them with everything, you know?
[1875] I mean, this is one of the arguments about violent movies, right?
[1876] Or violent video games.
[1877] We're influencing kids.
[1878] Violent songs.
[1879] Don't play Ozzy Osbourne backwards.
[1880] Remember all that shit?
[1881] The whole reason for, like, a parental guidance warning on a rap album from, like, the 1980s when Tipper Gore was promoting that shit, the whole reason is you're influencing kids.
[1882] That's the whole reason.
[1883] The whole reason is, like, if you're an adult, you can go to the R movie.
[1884] But I don't want a 17-year-old seeing it because it'll influence the kid.
[1885] It'll fuck with you.
[1886] You can go to R movies at 17.
[1887] Oh, is it 16?
[1888] 16, you can't.
[1889] I'm thinking of NC -17, right?
[1890] NC -17, you have to be over 17 to go.
[1891] Right?
[1892] Yeah, I think so.
[1893] Yeah, it's the whole thing is, you know, it's just no one wants to look at both sides of it.
[1894] No one wants to look at both sides of it.
[1895] Everyone wants to think that you have to like buy wholesale all the ideas of either the left or the right depending upon which group you align with.
[1896] Which group do you decide you align with?
[1897] That's where it gets weird with people. Because if you're, like, a progressive, open-minded, compassionate person, and then you watch two professors having a conversation about how kids under 13 should be forced to have sex, because we force them to do a lot of other things.
[1898] We force them to clean the room.
[1899] They do a lot of things they don't want to do.
[1900] Why shouldn't we force them to have sex?
[1901] Like, that's a real conversation I saw on Libs of TikTok.
[1902] That's fucking wild.
[1903] That they're trying to make an intellectual argument about whether or not children should be forced to have sex.
[1904] Right.
[1905] And about how they've been that way throughout human history.
[1906] Yeah.
[1907] Like this arbitrary age that we put on people.
[1908] I think it should concern us more than anything else this effort to stamp out not just the dissent, not just the different opinions.
[1909] but the jokes. It's the jokes, you know. Like, the fact that we can't joke about this stuff is really disconcerting. And I know you can, I mean, you know, your show is, well, you know, it's the censorship from these platforms that are hosting your content that you put it on, and the control that they're exerting over these conversations. I think that our ability to be able to laugh at these things that deserve to be laughed at, it's so vitally important for the health of society that we not stop that and shut that down. I agree. I think it's super important.
[1910] I couldn't agree more.
[1911] It's very important whether you agree with those people or not.
[1912] You got to have discussions.
[1913] Oh, I mean, both sides deserve more mockery than they're currently receiving.
[1914] Yes.
[1915] Both sides are more.
[1916] We need more Babylon Bees.
[1917] We need more of them.
[1918] You know, there should be more entering the space and feeling free to say what they want to say rather than everyone learning that, oh, if you follow in the Babylon Bee's footsteps, you're going to get shut up.
[1919] And I'm glad you guys mock Trump, because if you didn't mock Trump, I mean, if you didn't mock some of the more ridiculous shit that guy says, you know, it's like, he's wide open for it, just as Biden is wide open for it.
[1920] If you stop people from mocking Biden for being old because it's ageist, you're out of your fucking mind.
[1921] He's the leader of the goddamn free world and he's shaking hands with ghosts.
[1922] Right.
[1923] Like this is wild.
[1924] This is wild shit we're seeing in real time.
[1925] Yeah.
[1926] And if you tell me that's off the table, you're out of your mind.
[1927] You're out of your mind.
[1928] Oh, you're a MAGA supporter if you talk about those things.
[1929] Oh, really?
[1930] How about you just seeing reality?
[1931] This is fucking crazy.
[1932] If you see one of them Kamala Harris speeches, which is like, time is the passage of time based on seconds and then minutes.
[1933] The significance of the passage of time.
[1934] If you don't mock that, you're not paying attention.
[1935] Her comments, she just made about the space telescope that we put out there.
[1936] She was like, I had a very intellectual response to these images when I first saw them.
[1937] And it was, wow.
[1938] An intellectual response she had.
[1939] Well, I think she's probably joking.
[1940] No, no, no, she was dead serious.
[1941] She was dead serious.
[1942] That was, well...
[1943] She went on about how space is, you know, we've accomplished this.
[1944] We've done something, and now we need to continue doing something, because space, we were there, we accomplished it, and now we need to also continue to accomplish it.
[1945] It's like, what do you say?
[1946] Does she not have a script that she can follow?
[1947] Can someone not write a speech out for her that she can just read?
[1948] Well, people keep quitting.
[1949] Hasn't she had, like, a shit ton of people quit?
[1950] Yeah, yeah.
[1951] It's like a nutty number.
[1952] Yeah, right?
[1953] How many people have quit?
[1954] How many staffers?
[1955] I've seen articles about staffers just, you know, dropping like flies.
[1956] Find out how many staffers.
[1957] Maybe they just let her make her own speeches.
[1958] Like, you go.
[1959] Go ahead, do your own thing.
[1960] She's got a teleprompter at all these events.
[1961] Maybe she doesn't want to.
[1962] Maybe she wants to freeball.
[1963] Maybe she's working on her act.
[1964] It's awesome, though.
[1965] Well, what it is is it shows you whether or not a person's real.
[1966] Because a real person has real ideas.
[1967] Like, if a person is actually just trying to be themselves and tell you this is my take on things, you'll get that from their words.
[1968] But when they just say nonsense.
[1969] Nonsense is because they're not being real.
[1970] There's nothing there at all.
[1971] Staff Exodus continues as top advisor, a speech writer.
[1972] Oh, there you go.
[1973] She lost her speech writer.
[1974] That's recently.
[1975] So how many people have left?
[1976] That's the answer.
[1977] Does it say?
[1978] Speechwriter was also departing after fewer than four months on the job.
[1979] Can you imagine you write this fucking groovy speech, make her look like a wizard.
[1980] And she goes up there, the time that we're enjoying is different than the time of other times.
[1981] Well, people were not enjoying their time.
[1982] There is so much significance.
[1983] 13 key staffers have left the VP's team in as many months, including chief of staff, chief spokesperson, deputy press secretary, deputy chief of staff, communications director, director of digital strategies, director of advanced, director of, what is that, advanced, director of advance.
[1984] Oh, director of advance, deputy director of advance, director of press operations, deputy director of public engagement, speech writing director, oh my God, national security advisor.
[1985] Everybody's like, fuck this, I'm out of here.
[1986] Now, is that when it says they left, does that mean that they mean they resigned all of them?
[1987] Or were they like, or were some of them let go?
[1988] That's a good question.
[1989] I don't know.
[1990] Yeah, I think when they say left, they meant quit.
[1991] It sounds that way.
[1992] fired.
[1993] She's like a female Trump.
[1994] You're fired, Jetson.
[1995] Right.
[1996] The whole thing is wild, man, because it's like that is a type of person that can't speak.
[1997] They can't just speak about their position on ideas.
[1998] No. They got to kind of dance around with bullshit because they're bullshitting you, right?
[1999] Like if that was Tulsi Gabbard and you asked her to give her opinion.
[2000] She'd give you a thoughtful.
[2001] A very thoughtful response.
[2002] Very well-phrased response about any particular.
[2003] important issue, because she actually thinks about them and she formulates her own independent opinion. She's very impressive. Yeah, she formulates her own independent opinion of those things, and that's what's terrifying to her, well, about her rather, that's what's terrifying about her to the government. Like, you can't control her. Right, you can't just tell her to play ball. She actually has ethics. Right, she actually has, like, a very clear opinion on things, and she's not willing to go along with the hive mind. I got to tell you another joke that came true.
[2004] It was about how Kamala's staff was hiring Hillary Clinton's staff as consultants to try to make Kamala more likable.
[2005] It's like the last person in the world you go to for likability is Hillary Clinton.
[2006] And then it came true.
[2007] Actually, a month later, a report came out that her staff had reached out to Hillary Clinton's staff to try to figure out how they could make Kamala Harris more likable.
[2008] It's a month later.
[2009] Well, imagine, though.
[2010] Why Hillary Clinton?
[2011] Who goes to Hillary Clinton for likeability lessons?
[2012] This is why.
[2013] You don't go to Hillary Clinton.
[2014] You go to the people who made Hillary Clinton so likable that she could run for president.
[2015] Because imagine you just let her go on her own.
[2016] Oh, yeah.
[2017] No. I mean, if she didn't get any public feedback, you'd just get her honest opinions about things.
[2018] Oh, my God, it would be a bloodbath.
[2019] Oh, she'd just tell you whatever the polls say she should say.
[2020] Probably.
[2021] But then there's moments where you catch her.
[2022] Like, you remember that moment when she was being interviewed?
[2023] And it was right after, in Libya, the rebels killed Gaddafi.
[2024] Yeah, yeah.
[2025] She's like, we came, we saw, he died.
[2026] Right, right.
[2027] And she laughed.
[2028] She had that comment about Benghazi, what possible difference does it make or something, what difference does it make?
[2029] Yeah.
[2030] She said the basket of deplorables or something, or irredeemables or whatever.
[2031] Wasn't that her?
[2032] Deplorables.
[2033] Was that her?
[2034] Yeah, that's, yeah, absolutely was her.
[2035] That's not that bad.
[2036] That's just, you know, stupid politically because you're mocking literally like half of the people in the country.
[2037] but the thing about, like, mocking this guy who we kept in power being, like, ruthlessly murdered by those rebels. Like, you can watch it, you can watch the video where this guy shoves a knife up Gaddafi's ass. Right. You can see his face while this guy runs up behind him with this knife and shoves it up his ass, and he's in full shock. He can't believe they have him, he can't believe he's captured, and he barely reacts to this guy shoving a knife up his ass. I mean, you can see that, you can see his dead body, the rebels. Like, anybody that would think that's funny, that scares me, that that person would be in any sort of position of power. Right, right. Because there's something off with a person like that. Look, if you're happy that that person's gone and you want to express that in a sober way, right, you want to say the world is better off without Muammar Gaddafi running Libya, okay, that's different. But if you're laughing about a person who got a bayonet up his asshole, I mean, you've seen it, right?
[2038] I mean, he's a big ass knife.
[2039] It looks like one of those knives that's on the end of a rifle and he just shoves it up his ass.
[2040] Like that's not something that anybody should ever laugh at.
[2041] Even if it's to the worst person, the worst person dying like that, like that's not funny.
[2042] I mean, maybe it'll make you happy if that person's a killer and they've killed a bunch of people and someone runs up behind them and shoves a knife up their ass.
[2043] I'm glad that guy did that.
[2044] Fuck yeah.
[2045] Yeah.
[2046] Okay, but laughing is a crazy way to react.
[2047] Is it?
[2048] A reaction?
[2049] To someone getting brutally murdered?
[2050] Not very likable.
[2051] Doesn't make you very likable.
[2052] I feel like Kamala Harris' response to everything is laughter.
[2053] She throws her head back and cackles at the most serious thing.
[2054] Well, she doesn't do that anymore.
[2055] You notice that?
[2056] She's better about it.
[2057] She's learned how bad it was, how bad a look it was.
[2058] It was constant.
[2059] It would be a serious topic she'd be challenged on.
[2060] Her initial response was to laugh nervously before she would reply.
[2061] Well, she's mocking things, right?
[2062] And she's mocking things, knowing that she's got this army of supporters. Like, that's the thought process behind it, like, mocking it. Yeah, like Tucker Carlson throwing his head back and laughing at something, but less gross. Like, his is less gross. Yeah, the way she does it is, like, it's fake. It's always fake. Like, there's no reason to be laughing in that moment, you know? It's just, like, this strategy that she has to just mock ideas that don't align with the ideas that she has. It's just not good. And then it's also to just, like, portray this sort of persona of being a very happy, fun person who's laughing a lot.
[2063] That's good too.
[2064] Yeah.
[2065] Right?
[2066] So there's that.
[2067] But it's fake.
[2068] It's just a weird act that a person could put on.
[2069] Yeah.
[2070] And she doesn't do it as much anymore, which shows you that's fake.
[2071] Yeah.
[2072] She's been reined in.
[2073] Well, she's been exposed.
[2074] Maybe it was Hillary Staffers.
[2075] Yeah.
[2076] Yeah, maybe they said, listen, we told Hillary, you're going to stop laughing about murder.
[2077] What is this?
[2078] What's in this?
[2079] Coffee, you want some?
[2080] Coffee, sure, yeah.
[2081] Get some of that Black Rifle's finest.
[2082] All right.
[2083] Thank you, sir.
[2084] My pleasure.
[2085] Yeah, I don't understand why we can't get more really good candidates, the people that, whether it's congressmen or senators.
[2086] It's good, solid stuff, right?
[2087] Yeah.
[2088] Black.
[2089] But it is, it's just.
[2090] hard to, it's hard to find people who want that job, you know? I mean, who the fuck wants to have your life picked apart like that? Who the fuck wants to be the target of at least half the country hating you? A lot of people who want power. Yeah, willing to put themselves through that for the power. That's the problem, right? And that's something we don't want to acknowledge: the type of people that want those jobs are not really the type of people we want to have those jobs. Yeah, yeah. I remember in Gladiator, when Marcus Aurelius is going to make Maximus his successor, he goes, I want you, not my son, to be my successor. He goes, with all my heart, no. And he's like, Maximus, that's why it must be you. It's because you don't want it. Yeah, yeah. What do you think is going to happen with Trump? What do you know about the raid, the FBI raid? I have no inside knowledge, I don't know. I mean, he calls you? I think, uh, I think he did retweet us a couple times. Um, he retweeted me a couple of times, too.
[2091] Hilarious.
[2092] I was laughing my ass off.
[2093] I was like, the fucking president retweeted me. I think that the people who are arguing that he's been given a ton of fuel, like they just poured rocket fuel in his engine.
[2094] I think that's absolutely true.
[2095] I mean, if you just look at the fundraising that he's done off the back of this already.
[2096] Right, but what did they...
[2097] Absolutely.
[2098] Yeah, but I don't mean in terms of politically, I mean, like, legally.
[2099] Like, what did they find?
[2100] Oh.
[2101] And is he actually in trouble?
[2102] Because I think the goal was to try to knock him out of the 2024 elections, right?
[2103] By trying him for crimes.
[2104] What did he do?
[2105] I don't know.
[2106] Do you know what he did, Jamie?
[2107] Has it been absolutely released what they caught?
[2108] I don't know that yet, but they're holding on to the boxes of documents.
[2109] Is it really about confidential information that he shouldn't have had in his home that was so important?
[2110] They couldn't just ask for it.
[2111] They had to go in there and get it.
[2112] Well, I think the problem is having it, right?
[2113] Because if you have it in an unsecure location, meaning unsecure in terms of the government's protection.
[2114] It's not locked up in archives.
[2115] It's not in a place that's very difficult to access.
[2116] You have control personally over the access to something that's top secret.
[2117] If that's the case, then that's a problem because that safe could be open.
[2118] People can get in there.
[2119] People can get the code.
[2120] They can copy it.
[2121] They can send it to China.
[2122] Yeah, but do you think that's a genuine concern?
[2123] Or is it they want to find something, anything that they can use to prevent him from running again?
[2124] I think both things are valid.
[2125] I think if they're just doing that and they're using the FBI in a way that they would never use it against Hillary Clinton, and they're going after him in a way they would never go after Ghislaine Maxwell's client list.
[2126] Oh, my God.
[2127] Then I think...
[2128] There's no interest in that.
[2129] Right.
[2130] Then we have a real conversation.
[2131] But it doesn't mean that there shouldn't be a real conversation about should someone have access.
[2132] Now, I don't know what the files were.
[2133] I have zero idea whether they were okay for him to have, or declassified.
[2134] I don't know.
[2135] But I think the argument would be, if you're not supposed to, if there's a fucking whole chain of command about classified documents, this is the law on classified documents.
[2136] And you decide to violate that law because you think you can.
[2137] Right.
[2138] I just want to keep them.
[2139] And you just keep them in your safe.
[2140] I don't know if that's what happened.
[2141] But if that is what happened, someone needs to be held accountable for that.
[2142] You're not supposed to do that, right?
[2143] Right.
[2144] You're not above the law and you can't decide that you're not going to follow the law
[2145] because you know better, right?
[2146] And I don't know if that's the case.
[2147] I think where people lose, where they don't care about that, is because they're like, okay, you know, if you're going to be selectively enforcing laws like that and just turn a blind eye to Hillary deleting emails that have been subpoenaed and all of that, and turn a blind eye to Hunter Biden, trying to act like this is not a story until you're forced to admit that it is.
[2148] It's the double standard that makes everybody say this is persecution.
[2149] For sure.
[2150] And so even if it's just, even if there was something that was done that was wrong, they're still choosing to be selective about going after him in a way that comes across as them going after him, doing what they wouldn't normally do to someone on their own side.
[2151] If it was Hillary Clinton's home, they'd have no interest in what's in her safe.
[2152] Because she'll kill them.
[2153] Yeah.
[2154] Jokes.
[2155] Yeah, I see what you're saying.
[2156] I see both sides, though.
[2157] I see the side that if you're an anti -Trump person and you find out that he's doing something that's against the law, you'd want to prosecute it for it.
[2158] Yeah.
[2159] I see both sides.
[2160] Yeah.
[2161] I really do.
[2162] I don't know the specifics of the Hillary Clinton email thing in terms of like what those files were.
[2163] But if they're the same classification of files as like he had, you could make the argument they were more vulnerable because they were on a regular laptop.
[2164] Well, and she still destroyed it after a subpoena.
[2165] Right.
[2166] Right.
[2167] I mean, imagine if Trump was subpoenaed for this information and instead of handing it over, he burned it.
[2168] Right.
[2169] That is where shit gets really squirrelly.
[2170] Yeah.
[2171] It gets really squirrelly.
[2172] And it's like what punishments are there for that?
[2173] Is it zero?
[2174] There's nothing.
[2175] Nothing, not a fucking thing.
[2176] And everybody's just like, you know, whatever.
[2177] Doesn't she have a hat for sale that says, but her emails?
[2178] But her emails.
[2179] She's capitalizing.
[2180] Good for her.
[2181] Yeah.
[2182] Good for her.
[2183] I wonder if, like, amongst people that are on the right now a black hat with white letters will get your ass kicked because, you know, like the MAGA hat?
[2184] Yeah.
[2185] Like, you could have a MAGA hat that said anything.
[2186] Like, I saw a lady get maced in the face because she had a hat on that said, it was a red hat with white letters that said, make Bitcoin great again.
[2187] And she was at one of those protests.
[2188] There's a video.
[2189] They thought it was a MAGA hat.
[2190] Yeah.
[2191] And some guy maced her in the face.
[2192] He pepper sprayed her in the face because she had a red hat with white letters about Bitcoin.
[2193] That's assault, by the way.
[2194] It is assault.
[2195] Well, that happened a lot at those fucking anti-Trump protests.
[2196] It's just the problem is that he was such a divisive character that he became a great enemy for the other side.
[2197] He wasn't like a statesman who, like, you know, you could criticize his
[2198] policies and his positions.
[2199] You could say he's heartless, all you want.
[2200] But he represents the United States in a statesmanly way.
[2201] And like, no, Trump's, you know, he's, he's a wild guy that, like, encourages people to hate him.
[2202] Do you think DeSantis would, uh, demand or command more respect from the left?
[2203] And the left still hates him, but they don't hate him the same way they hated Trump.
[2204] They try to, but he's more reasonable.
[2205] He's very, like, level in the way he talks about things.
[2206] He's firm, though.
[2207] I mean, he hits, he's, yes, but you know what I'm saying?
[2208] Like, he's, he's.
[2209] Yes.
[2217] He doesn't get, he's not, um, he's not an insulting, like, uh, character.
[2218] Like, Trump's a character.
[2219] Right.
[2220] Like, part of what he's doing is, like, doing comedy.
[2221] It's like he's doing stand -up when he's up there.
[2222] I mean, when he makes fun of Biden, when he makes fun of other people.
[2223] He's doing fucking stand -up.
[2224] He really is.
[2225] And he kills.
[2226] It's, he's got this thing, you know, and that thing is like everybody's with him is fucking really with him.
[2227] And everybody's against him is really, really.
[2228] against him, and he, like, encourages it, you know? That is what I think is not good, that part of it. Like, I get where he comes from, because, like, that's what made him, is fighting against the haters, you know. But when you're a president, like, that's a different role. That's a different role than being the fucking host of The Apprentice. Right. And a lot of people hoped that when he got in there, he was going to abandon that and just be like, common sense, get shit done. But no, he's on Twitter calling his ex-girlfriend horse face.
[2229] Saying things about Kim Jong-un, calling him Little Rocket Man. He's a fucking wild dude.
[2230] What was it he said about Rosie O'Donnell?
[2231] Oh, yeah.
[2232] Did he call her a fat pig?
[2233] Something like that.
[2234] Something like that.
[2235] I don't even remember exactly what he said, but it was nasty.
[2236] Oh, he says horrible shit.
[2237] He calls her a loser all the time.
[2238] Yeah.
[2239] And people think it's hysterical.
[2240] They do.
[2241] And they do have something like, you know, the but her emails thing.
[2242] But his mean tweets, you know, like, he could sell shirts and stuff about that, because it's like, you know... Yeah, well, a lot of people will say, yeah, I don't approve of his tweets.
[2243] I don't like his tweets.
[2244] No, a lot of people would say that. A lot of people on the right will say that, and this is where I think they have hope in DeSantis, that he wouldn't do the same kind of things. He would never tweet things like that. He would never call Kim Jong-un Little Rocket Man. All that shit was hilarious. It was funny. He would never call his ex-girlfriend horse face. Like, that kind of stuff is, you know, in many people's eyes, and in a lot of conservatives' eyes, unbecoming of the commander-in-chief. Yeah, yeah, that's the argument against it. You know, the argument for him was his economic policies, that people think were better.
[2245] And if COVID hadn't hit, we would have been in a better place financially.
[2246] And the argument is as the economy grows, it gives more opportunities for everybody.
[2247] And everybody sort of does better because the economy is doing better.
[2248] And this is, like, the anti-Marxist, anti-socialist argument: that a strong economy where these businesses are killing it is better for everybody, because then there's more jobs, there's more opportunity, there's more...
[2249] and then other people say, well, no, because it's a small percentage of people that are getting the most benefit.
[2250] Right, right.
[2251] Yeah, it's hard to get there.
[2252] Like, they're playing a game, and this is the game.
[2253] The game is, it's like a giant game of monopoly.
[2254] Like, you're trying to get the most amount of resources.
[2255] That's what they're doing.
[2256] Yeah.
[2257] And if you want to cap that, you're not going to get people playing the same game with the same.
[2258] I'm not saying they should be able to steal.
[2259] But I'm saying, like, competition, like, if you want to be a Rupert Murdoch, if you want to be some, maybe that's a bad example.
[2260] If you want to be someone who's like some head of some gigantic industry that's worth billions and billions of dollars, like Jeff Bezos, buying the biggest yachts and fucking flying around the biggest jets, like that guy played a game and he got to the highest level of the game.
[2261] And it's the same game that most people are playing.
[2262] That game is do the best for yourself financially.
[2263] Now you can say that he did it at the expense of unions.
[2264] You can make all these arguments that I would probably agree with.
[2265] You could say he did it in a way that undercut family
[2266] businesses, and go, maybe, maybe. Yeah, it's a good argument. But at the end of the day, he's playing a game. It's a legal game, and he got way ahead in that game. And you can say, well, now he's at the head of the game, he's got too much influence over the other players of the game. Yeah, that's how the game works. It's a fucked-up game, right? But it's way better than communism. Right. It's fucking way better than the Soviet Union. It's way better than what's going on in China. It's fucking way better than what's going on in North Korea. It's way better this way. Well, think about how convenient it is that anybody can order anything they want from Amazon at any time.
[2267] You know, have it delivered straight to our house.
[2268] I'm a fan.
[2269] Yeah.
[2270] Yeah.
[2271] And I'm not in any way, shape, or form an anti -Bezos guy.
[2272] I'm not, or anti -industry guy.
[2273] There's a way to have ethical capitalism.
[2274] It's totally possible.
[2275] It can be done.
[2276] You know, and it can be done by regulating things correctly.
[2277] It could be done by, you know, whatever, whether it's tax structure or whatever it is.
[2278] It can be done.
[2279] It can be done.
[2280] It's just, like, you're playing a very specific game, and that very specific game is always going to encourage people to win. And they're going to try to make more money every month, they're going to try to make their fucking stock the biggest thing so they can get that bonus. And that's what they're doing. They have an obligation to that. If you want to say that game sucks, you want to say you don't want to play in that game, that's fine. But that is the game that this country runs on, and to just throw the fucking board up in the air, I don't think is the solution. Yeah, well, and I mean, usually the people who are saying that are people who aren't very good at playing the game, or they're early on in the game, you know? Maybe they're just out of college. Which makes sense, if you're just out of college and you're making $24,000 a year, and you're looking at people that are worth billions, and then you're seeing homeless people, and you're a kind, compassionate young person, right? And these professors are, you know, teaching about Leninism and Marxism, and you start to espouse these ideas. Like, I get it, I get all of it. I understand how, I just don't get how, personally, I don't get how, as a young person, you look at someone like a Jeff Bezos and all the success he's had, and you're not inspired to go out there and work your butt off to, like, get to that level, versus feeling envious or resentful and saying, oh, that's not fair that he did that.
[2281] But it's natural.
[2282] It's natural to feel envious or resentful.
[2283] I never had those thoughts.
[2284] I always looked at successful people, and I'm like, you know, like, what was, what did they, how did they operate?
[2285] How did they think?
[2286] Did you get that from your parents?
[2287] What kind of books do they read?
[2288] What kind of, I don't know that was, you know, my parents, my parents didn't instill in me, you know, a business mindset.
[2289] You know, my dad is a pastor. I grew up in the church. Most of what they instilled in me was values. Most of what they instilled in me had to do with values and faith and doing the right thing, not, you know, getting ahead in the business world. They were never like, oh, we want our son to go to the finest business school and become successful. They judge success in much different terms than, you know, how big your bank account is, which is great and important, but... And when I, as a young person, when I was in college or before, looking at people who were successful, I never, I never... And we weren't successful.
[2290] My dad's... Pastors don't make a lot of money unless you're the pastor of a big megachurch.
[2291] It's telling people they'll be healthy and wealthy if they just give you more money.
[2292] You know, my dad didn't pastor church like that.
[2293] So he had a very meager salary.
[2294] And we had a very, you know, middle class, lower middle class upbringing.
[2295] And so, I don't know, I always... But I always looked at wealthy, successful business people, entrepreneurs, with, like... I had huge admiration for, like, what they were able to accomplish, how brilliant they are, how hardworking they are, what they put into that.
[2296] Well, it's kind of interesting because we celebrate that in a lot of other areas.
[2297] We celebrate that in sports.
[2298] We celebrate that in art. We celebrate the overachievers.
[2299] The problem is that comes in a lot of people's eyes with victims, right?
[2300] Like the overachieving capitalist comes with victims.
[2301] Victims, the environment is a victim.
[2302] The people are the victim.
[2303] The lower class is a victim.
[2304] There's, like, a lot of bodies along the way, in
[2305] their eyes.
[2306] And also, we're talking about how malleable people are.
[2307] I mean, they're, if you're in the university system of 2022, you're 100 % at least being exposed to a lot of these socialist ideas and Marxist ideas and very progressive left -wing ideas.
[2308] And they're more, they're more popular than conservative ideas.
[2309] Yeah.
[2310] You're going to have a lot more victims if you get rid of that system and do it a different way.
[2311] But, by and large...
[2312] But, by and large, though, the university systems are leaning towards the left, right? And so these people, they're looking at this. They just left their parents' house. Maybe they disagree with their parents, maybe the dad's an asshole. And now they're at this university, and this professor, who's so eloquent and so interesting and so well-read and well-traveled, and they're saying these things that are so opposed to the way they grew up, but make so much sense. Like, oh, I'm going to do the right thing now, and I'm going to fight for this. Right. Oh yeah, it's tapping into people's sense of right and wrong.
[2313] Yes.
[2314] And yeah, and trying to say, look, you don't want to, you don't want to leave a trail of victims in your wake trying to climb to the top or whatever.
[2315] Right.
[2316] You can't get rich in business the way that Bezos did without offering something of value and employing a ton of people, which is a tremendous good.
[2317] And it's just like anything.
[2318] Like when you look at, you know, there's tradeoffs to everything.
[2319] Free will has tradeoffs.
[2320] The fact that you have freedom to do whatever you want means that you can be loving, good, kind, and charitable.
[2321] But guess what it also means?
[2322] It means you can take the wrong turn, become a criminal, take advantage of people.
[2323] Be a scumbag, mistreat people, be cruel to people, mean to people. That's the downside of free will. But the upside, the upside more than makes up for that. The fact that any of this good stuff is possible makes up for the fact that the bad stuff is possible. You're never going to have any system that you could possibly think of that doesn't have downsides that are offsetting to the positives that are worth having. Yes. No, I think, I think you're correct. And I think that, looking at it the way you're looking at it, like, to say that you grew up the way you grew up is very interesting, because I think having a firm set of ethics and morals when you grow up is very advantageous. Because you realize, like, later in life, if you do the right thing, you'll feel better regardless of the result. Like, if you do the wrong thing but you benefit, you feel guilty. It's bad for people. Like, if you're a person that is, maybe you're living in an institution where you're working at some sort of a corporation, and you poison the environment in order to succeed and make more profit, and you hit your goal, but then you realize that there's a dead lake in Ecuador now and all these people are suffering and getting cancer.
[2324] That's happened before, right?
[2325] That's the bad side of it.
[2326] And you either become a callous sociopath, or you've got to find another line of work.
[2327] Because a lot of people just become callous sociopaths, and they continue that behavior.
[2328] So the way you grew up is so beneficial, because having a strong, firm, you know, foundation of ethics and morals is what we should all strive for.
[2329] It doesn't mean you can't succeed.
[2330] It just means you shouldn't succeed at the expense of things that you know are evil.
[2331] And that is the problem with unchecked capitalism.
[2332] That's the problem when they're allowed to go to third world countries where there's no regulations, do wild shit and pollute rivers and fuck up the environment.
[2333] They do that and they do that because it's profitable.
[2334] That's our problem. What you are saying, and what a lot of Christians who grew up this way are saying, is that it's more important to do the right thing.
[2335] That's the most important thing.
[2336] I think we should all agree on that.
[2337] I think we should, yeah.
[2338] And it's going to benefit you in business, too, to some extent.
[2339] I mean, will you necessarily be as successful as somebody who's, like, cutthroat and doesn't care if their employees are underpaid or not well treated or you're taking advantage of somebody in a process or ripping somebody off?
[2340] I mean, you're not going to be as maybe successful as them, but you bring values.
[2341] You honor your word.
[2342] You know, you make good on your promises.
[2343] You do those types of things, and that reciprocates in ways that are valuable in business.
[2344] And that comes back to help you.
[2345] I mean, if you leave enemies in your wake all over the place, like you cut off, you damage and destroy relationships because you're dishonest in business.
[2346] You take advantage of people or you poach from them in nasty and unethical ways.
[2347] Like, I mean, you're going to create more problems for yourself than you solve, even if you get ahead in the short term.
[2348] They think they can stay ahead of that, you know.
[2349] A good, maybe not the best example, but if you have a really good product, you can stick by your, your ideas, is Chick-fil-A. And a lot of people oppose Chick-fil-A for their anti-LGBTQ ideas and, you know, gay marriage and stuff. But they don't open on Sunday. They're leaving billions of dollars on the table. Yeah. And they're like, Sunday's the Lord's Day. Yeah, which is crazy for a fast food place to have that take, but meanwhile, they're everywhere. Yeah, they're fucking killing it. Every time I go by, Chick-fil-A has a giant-ass line. Delicious food. You always crave Chick-fil-A on a Sunday. Oh, it's like the only time I think I want Chick-fil-A is Sunday, and I'm like, I can't go, it's closed. Nope, nope, nope. Jesus said don't eat. I don't know. You want what you can't have. I don't know why. It's bad. You can still go to church and still work. It's still possible to keep Chick-fil-A open. But they don't want it. I know. So look, it's not, it's not that I align myself with Chick-fil-A's values. But I'm saying, like, if you have a great product, you can stick by your thoughts, even if they're not the best thoughts. You can stick by your ethics and your morals, and you can still be successful if you have a great product. So that's where the idea of, like, ethical capitalism could come in, you know? Like moral, ethical capitalism. Like, it's still okay to compete in the marketplace, but it's not okay to lie. It's not okay to pollute environments. It's not okay to do some of the shit that we know the corporations have done, lie about studies and do different things where they try to proclaim their innocence. It's not okay to pretend that you're a free speech platform and then moderate political viewpoints that you don't like. How about that one? Agree, put that one on the list. Agreed. Yeah, and that is, a lot of people would think it's less consequential than corporations doing evil things, but it's still bad.
[2350] It's not the optimal way to do things.
[2351] But I think part of the problem is, I'm going to keep coming back to this, but I think part of the problem is the format itself.
[2352] The way people communicate in that text format is just shitty.
[2353] It's just a shitty way to get thoughts across to people where you're going back and forth with them.
[2354] It just leaves too much to the imagination.
[2355] It's too many openings to be an asshole.
[2356] It's not a good way for human beings, who are designed to communicate looking each other in the eye, reading emotional cues, reading tone, and, you know, the context of the conversation.
[2357] That's how people are supposed to talk.
[2358] In any other way, it's just, I don't think it's good for you.
[2359] It's the reality of the world we live in, though, it's just like with your kids and social media.
[2360] It's like you can't, there's never going to be a physical town square anymore where you go and have, like, a debate about the issues of the day, like, with your neighbors.
[2361] But you should at least encourage
[2362] people to not engage with people like that.
[2363] I think that idea needs to get out there in a way that resonates with more people, that it's not good for them either.
[2364] It's not good for anybody.
[2365] Like when you, when I know I have friends who do that Twitter beef back and forth shit with people.
[2366] And when they do it, they have anxiety all day.
[2367] You know, I was talking to this one friend.
[2368] He's like, I couldn't walk down the street 10 feet without checking my phone to see who replied.
[2369] So I'm walking around, like almost bumping into people.
[2370] Right.
[2371] Just freaking out.
[2372] Yeah.
[2373] You know, because he's in some sort of a weird Twitter conflict with people.
[2374] It can raise your blood pressure, too. You get all anxious, you get annoyed. He couldn't sleep. You said he couldn't sleep. Snap at people. I've had that happen, where, like, I'm engaged in some Twitter spat with somebody, and my kid comes up to me, my son will come up to me and ask me for something, and I'm like, not now, and dismissing him, you know? And it's like, what did I just do?
[2375] You know, I just treated this Twitter spat with more respect than my own son. Yeah. That's not healthy. It's bad for you. Yeah, it's bad for you. And I understand, if you have a point, if someone's saying something that's not true and you want to correct it, or you want to argue against it, or you want to say something about them, like, this is so hypocritical because of this, I get it, I get it. I just, I just think it's a shit way to communicate. That's why I don't do it. It definitely detracts from our ability to see other people as ends in themselves and not as means to ends, or as, you know, an object to dunk on so that we can score some kind of points, yeah, and, uh, generate more followers or likes. Um, you know, people don't exist for that purpose. No. They shouldn't be abused for that purpose. Um, you know, we, I don't know.
[2376] That's one of the areas where I think a faith like the Christian faith comes into play and seeing people as more than just simply organism, conscious organisms.
[2377] But like everybody's made in the image of God and that we all have an inherent intrinsic value.
[2378] And we shouldn't be disrespected for that reason.
[2379] I don't know.
[2380] I think instilling values like that is going to be a lot more beneficial in those kinds of arenas than trying to tell people what they can and can't say. Right. No, I think so too. I just don't know how the two of them find common ground right now, because right now the ideological battleground is so fucking rigid. There's trenches dug in the center and everybody's got guns pointed at the other side. It's like, how do we make that thing where the Germans and the Europeans played soccer? Was it World War I? They broke the trench and they, uh, did they, did they do Christmas? Yeah, they played soccer and hung out. It was a big deal, because, um, you know, they had a truce and a ceasefire, and I think it lasted a few days, and they went back to shooting at each other. Wow. Find out that story, it's a crazy story. I believe it's World War One. That's kind of profound, honestly. Well, most people don't want to be engaged in a fucking gunfight with people they don't even know, right? They're not even sure why they're doing this, and they're doing this because their leaders have told them to do it. Like, there's times where that's noble to do, like Nazi Germany sweeping across Europe.
[2381] Yes, 100%.
[2382] I'm not opposed to that, but I'm saying most of those people did not want to be there.
[2383] Most of those people did not want to shoot at some person.
[2384] They don't even know.
[2385] And most of those people don't even agree with the thing they're being forced to enforce because they're fucking peasants.
[2386] They're just some guy got pulled out of the fields and they put him in a uniform and gave him a rifle and they're sending them to the front line.
[2387] That's the reality of war to those people.
[2388] So when they got together and hung out for a couple of days, they became human. It must have been so fucked when they started shooting at each other. Yeah, yeah. See, did you find it yet? The first story I found was about, like, myths on it, so I was trying to make sure some stuff was real. But I just, I think there's photographs of it. Yeah, it happened. It's just, there's some stories that have been, like, uh, exaggerated. Yeah, yeah, yeah. So pull the story up, though, because we gotta end soon. Let's see what we got here. Soccer in the Trenches: Remembering the World War One Christmas Truce. So it's real. So, a German lieutenant in the first war,
[2389] he disappeared forever in the Soviet Union
[2390] in the second. In 1999,
[2391] his son Rudolph found his dad's diary in the attic.
[2392] This is what Zemich Sr. recorded for Christmas Day 1914.
[2393] A couple of Britons brought a ball along from their trenches and a lively game began.
[2394] How fantastically wonderful and strange.
[2395] The English officers experienced it like that too.
[2396] That thanks to soccer and Christmas, the feast of love, deadly enemies
[2397] briefly came together as friends.
[2398] It was one of several impromptu soccer matches played between British and German soldiers in no man's land that Christmas.
[2399] For one day, and in some sectors of the line, for several days, the enemies made a spontaneous peace.
[2400] A century on, these games transfix Europeans.
[2401] In quotes, we all grew up with the story of soldiers from both sides, putting down their arms on Christmas Day, says Prince William, president of the English Football Association.
[2402] No wonder, because this extraordinary story suggests an alternative history of the 20th century.
[2403] Many people, including some veterans of the war, have doubted that these games were ever played.
[2404] The story seems too good to be true.
[2405] Indeed, Geoff Dyer, in his 1994 book, The Missing of the Somme, dismisses it as myth.
[2406] Some historians believe the truth is somewhere in between.
[2407] Others contend that the impact of the games has been overstated as we witness the Premier League, the FA, and other organizations commemorate the moment.
[2408] It's hard to happen.
[2409] It seems like it happened.
[2410] And that is a good story.
[2411] It's a horrible ending, but it's a good moment in a terrible story.
[2412] Yeah.
[2413] I mean, it's a great ending, I guess, in terms of the result of the war.
[2414] But it's a horrible ending for those people.
[2415] They had to go back to shooting at each other when they realized they had common ground and they could just hang out together.
[2416] So you think the liberals and the conservatives can have a day of peace and come together on Twitter?
[2417] No. Lay down their arms.
[2418] I would hope they could do it individually one -on -one in real life.
[2419] Yeah, I'd like to see more of that.
[2420] People would learn how to do that, but I think it's a learned skill.
[2421] Yeah.
[2422] I mean, I think if a lot of people have a conversation about something that they disagree with, even in real life, they're so used to communicating in this kind of shitty way that I think they would just engage in the standard way and scream at each other.
[2423] I mean, those Karen videos that you see, people yelling at each other about wearing a mask, or yelling at each other about, you know, you support Hitler, whatever it is.
[2424] Like those crazy conflicts that people have in real life, that's not the way to do.
[2425] it either.
[2426] It's just, human beings are in this weird stage of information overexposure and social media and just an incredibly volatile world right now.
[2427] I mean, there's so much uncertainty and there's so much anxiety that people have about international conflicts for the first time in a long time.
[2428] People are genuinely worried about our relationships with China and Russia.
[2429] It's scary shit.
[2430] And so people are just ramped up with anxiety.
[2431] already. And then, you know, even if you get them together in public, they might scream at each other. But I think that if people could learn how to not do that, learn to just communicate, I think we could get along a lot better and we could find common ground. I think that's what we all want. I don't think we're ever going to come to a time in this world where there aren't conservative people and liberal people. I like what you said earlier about steel-manning your opponents instead of straw-manning, you know? Like, actually giving them the benefit of the doubt that maybe they're actually a rational, thinking person who's considered, you know, the evidence or whatever and has reached conclusions in good faith, and respecting that. And even if it's different than what you believe, like, respect that and be willing to have a dialogue with them about it, without the assumption that they came by their views in bad faith, or that they're stupid, right, they're ignorant, they haven't, you know, they just haven't done enough research, and just belittle them. You know, like, you're never going to get anywhere with that with anybody. Agreed, agreed. Well, listen, man, I appreciate you coming by here. It was really fun to talk to you. I appreciate, appreciate your website.
[2432] You guys make hilarious memes.
[2433] It's fucking really funny shit.
[2434] Thanks.
[2435] And I'm glad you're out there.
[2436] And I wish you'd get back on Twitter.
[2437] I really hope they let you back on.
[2438] You know, I was hoping that when Elon, if he bought it, and maybe he still will.
[2439] Maybe they'll force him to buy it.
[2440] Maybe.
[2441] If he, you know, opens that up a lot more and lets a lot more freedom of expression on both sides.
[2442] Yeah, we'll see.
[2443] I hope that that happens.
[2444] Or if that doesn't happen, there's got to be some other solution.
[2445] Because I think Elon is absolutely right.
[2446] This is the town square.
[2447] And if it is the town square, then we do have to have some consideration for First Amendment rights, because otherwise we don't have free speech privileges in the town square.
[2448] Do you think there's any potential for a real third-party site, a second site, like something that mimics what Twitter does, but has a better, like, sort of open-minded approach that doesn't just get dominated by right-wing people or dominated by... I mean, there's been attempts.
[2449] Parler was a great attempt initially, and then look what happened with it, you know, Amazon and Apple and everybody.
[2450] What did happen?
[2451] They got, they got deplatformed because they were blamed for January 6th.
[2452] It was this whole thing.
[2453] It's like, oh, all this hatred and all this planning was happening here.
[2454] So they're not on the app store anymore?
[2455] They got back.
[2456] They got back.
[2457] But, you know, people, it's, people moved on and went to other things, whatever. You know, like, Parler's still there.
[2458] Parler's still...
[2459] But the problem is this.
[2460] And this is what I say over and over and over again, when people ask me about alternative platforms, There's a place for these platforms.
[2461] I think they should exist and I think they should offer.
[2462] They should try to honor the free speech principle.
[2463] The problem is the left doesn't want free speech.
[2464] I say this all the time.
[2465] They don't have a problem with hate speech.
[2466] They just hate speech.
[2467] When they say misinformation, they mean like opinions they don't like.
[2468] When they say hate speech, they mean opinions they don't like.
[2469] They mean bigots are people they don't like that have opinions they don't like.
[2470] So they're not going to want to be on a platform that honors free speech.
[2471] You have to force it on them essentially.
[2472] It has to be by the law.
[2473] It can't just be, oh, here's a free speech platform.
[2474] Let's all go there.
[2475] Yeah.
[2476] That's not going to happen.
[2477] I wonder.
[2478] You know, I wonder, but I think one of the ways that that could happen, and one of the only ways, in terms of having a platform, is if Elon buys Twitter.
[2479] Because he really would open it up.
[2480] I think most of the people that are addicted to Twitter, the progressives, the left -wing people, they're going to stay on it if they're not censored.
[2481] They're going to stay on it.
[2482] I think they'll stay on it.
[2483] And if they can develop these little environments where they can block everybody they don't like and, you know,
[2484] limit comments to people who follow them.
[2485] They could still sort of regulate their own feed, regulate what they...
[2486] Block who they want.
[2487] If you have 8 million followers on Twitter, where are you going to go?
[2488] I mean, you're going to leave Twitter to go to some other, start your own platform.
[2489] Right, that's the problem that conservatives have had.
[2490] It's like it's hard to build a platform, especially if it's going to be just for your own...
[2491] Like, they're not going to have 8 million followers if it's just for leftists.
[2492] Right.
[2493] They're going to have a lot less than that.
[2494] They've got followers from all sides.
[2495] Right.
[2496] That's the value of Twitter.
[2497] Right.
[2498] And that's the problem with something like Parler, is because the right-wing people go over there when they get kicked off Twitter, or if they don't think Twitter's supportive of them, and then it becomes so right-leaning that the left-wing people don't want to go over there.
[2499] Right, and the left -wing, and this is the thing.
[2500] I think that, you know, there should be, like we're talking about, we're very idealistic in this conversation, talking about how people should behave and how they should treat each other and it's, you know, is this really going to happen?
[2501] I would love to see more people.
[2502] I love when I hear people like, when people ask me who my favorite comedians are, my favorite comedians or anybody who's like willing to make the jokes, you're not supposed to make and speak the truth and stand up for free speech and this nonsense cancel culture stuff pushing back on that when Bill Maher is talking about the importance of free speech he's very hard on Twitter he talks about how they do need a new sheriff like it's been run poorly someone like Bill Maher would be happy I think I don't know I don't even know if he knows what Parlor is but he'd be happy to join a place like Parlor and bring other people from the left with him because they could actually benefit that discussion they could provide a counter to those other arguments more people from the left should be willing to jump in the pool and swim with others not the men in the women's pool by the way But, you know, the ideological pool.
[2503] The ideological pool.
[2504] I see what you're saying.
[2505] I mean, I...
[2506] Thanks for the laugh.
[2507] I don't know where this is going to go, but I hope it goes in a good direction.
[2508] Yeah.
[2509] I think, like all things, this is a very disruptive technology.
[2510] It's shaking up the world, and I think, well, hopefully, cooler heads will prevail.
[2511] Yeah.
[2512] We'll find a rational solution.
[2513] But I think a lot of it depends on these kind of conversations.
[2514] Yep, absolutely.
[2515] So thank you very much.
[2516] The Babylon Bee, or TheBabylonBee.com.
[2517] BabylonBee.com.
[2518] Subscribe and support us.
[2519] I mean, we have, you know, we're getting deplatformed left and right.
[2520] You guys have a podcast as well, right?
[2521] We got a podcast, YouTube channel, putting out a lot of video content.
[2522] Beautiful.
[2523] Thank you very much, Seth.
[2524] Thank you.
[2525] Appreciate it.
[2526] Bye, everybody.