The Joe Rogan Experience XX
[0] Joe Rogan podcast, check it out.
[1] The Joe Rogan Experience.
[2] Train by day, Joe Rogan podcast by night, all day.
[3] We have to be really careful that we don't catch on fire.
[4] Oh, yeah.
[5] Really awesome.
[6] I mean, it would be horrible for us.
[7] Terrible.
[8] It would be funny.
[9] I mean, we have fake hair and plastic robes on.
[10] Not good.
[11] I mean, these are nylon robes and nylon hair.
[12] We would be like Michael Jackson.
[13] It would be the Michael Jackson moment.
[14] Remember he caught on fire?
[15] It was at a Pepsi commercial, right?
[16] Yeah, yeah.
[17] That really fucked him up, apparently.
[18] That was the beginning of the end.
[19] Yeah.
[20] Well, I think the end was like already, I think it was already, he was already white by then.
[21] You can't say that was the beginning of the end.
[22] That dude was, you know, many plastic surgeries.
[23] And he's one of those cats... Like, Eddie Bravo used to say that Michael Jackson's plastic surgeon does not advertise that he's Michael Jackson's plastic surgeon. You imagine, you have the biggest star in the world and you do his plastic surgery, you're like, not me, not me, I don't have nothing to do with that.
Yeah.
It was insane how much they shaved that guy's face down. They made his nose like a tiny European girl's nose.
Yeah, like a four-year-old's nose.
Yeah, horrible. They shrunk it down and it was caving in. Apparently it was about to fall off or something, they've said.
[24] I don't know if that's real.
[25] See if they can get a photo because I think there was an issue.
[26] See if you can find a photo.
[27] It's really scary though, man. Like, imagine it.
[28] Oh, look at the one in the lower right corner.
[29] Oh, my God.
[30] Jesus Christ.
[31] Hi.
[32] That's what it looked like in real life.
[33] So if you saw it in real life, that's what you would see.
[34] Why do you think I would hurt a child?
[35] He's got a bandage on it.
[36] That's why it looks a little.
[37] Because it's caving in.
[38] See, the right side of it looks like it's gone.
[39] See that right?
[40] Looks like a hole there, doesn't it?
[41] It seems like his eyes have somehow been enlarged or something.
[42] Well, it's that same operation that I think they do... it's, like, very popular in South Korea, where they trim your lids. They give them that anime look.
Oh, God, look at that nose. That is wild, man.
Yeah, he kept going. See, the nose he had on the right was, like, slightly less ridiculous than the nose on the left.
That could be a prosthetic.
Oh, wait, yeah, the one on the right is probably a prosthetic that he puts on. Because if you look at the very... see, like, the bridge?
[43] It looks like he painted something.
[44] Right.
[45] It looks a little sketchy.
[46] See that little hole on the right nostril, above the right nostril, in the bandage, that's where it looks like it's caving in.
[47] And that might have been what was happening to him.
[48] Like look at that photo right there that you had your cursor on earlier, the real creepy one, right above.
[49] Yeah, that one.
[50] But wait.
[51] Look at that one.
[52] What the fuck, man. That looks, that's like when I try to make something out of clay.
[53] When I try to make a human face and try to make it look really realistic.
Yeah.
That's like the matter...
Can you...
The reflection in his glasses, though. What was that, man?
[54] Can you pull that one?
[55] No look zoom in on that Is that a tank?
[56] What is that?
[57] Yeah, he's in Russia right now.
[58] He is a part of the resistance.
What?
[59] What is that though?
[60] What is that massive?
[61] What is that?
[62] Looks like a...
[63] You know, it's really crazy. Look at the photo that they just showed, when you go back to all those photos where it showed him as a young man. Like, look at that, that was him in 1980, and then by 1983 he was already fucking with his nose. He was trimming. You see, like, in 1980 he looked great. Like, that's what he's supposed to look like.
Yeah.
So he kept going. He kept going. He kept going.
So it looked like the first shit he had done is in the 70s. Look at that, 75 to 79. Again, that one that you just showed. Look at the difference. Like, by 79 it looks like he had already had some work done.
Oh, the poor bastard.
Holy shit, man. Like, right there... if he stopped right there, he looks great, you know? He doesn't look crazy.
He got to a point where... you know, this is the same thing that anorexics have, body dysmorphia. They can't see it. It's like us wearing these wigs. We think we look good.
I don't think I look good. I think I look like...
How is this the number one podcast in the world? Explain that.
I can't, you know.
Explain that, what we're doing here.
Yeah.
Is what we used to do.
Yeah. Same thing.
Exactly. Exactly, exactly the same thing.
Yeah, it's weird. It doesn't make any sense.
It does. I mean... but don't you try to actively not think about that?
Yeah. Because if you think about it, it's like touching an electric fence or something. You'll lose your fucking mind, and then you won't be able to do it anymore.
It's just, like, you have to understand, and we all do somewhere, that you're alive in the middle of this experience, and it's playing out where no one feels comfortable, no one understands what's happening, no one knows what life really is. No one. Not a single person. And yet we're all calling upon other people for guidance and leadership and support. We look to, like, powerful leaders. That's why, me included, everyone is so excited that Elon Musk is trying to buy Twitter. Like, yes, the great one. He's the super intelligent leader-type character that seems to have great ethics and morals, too. He seems to be like a guy that, if you had a movie character, and the movie character was this, like, super billionaire who didn't give a fuck, but he was, like, super fucking smart, and he was really genuinely working to save humanity.
[64] Yeah.
[65] That's that guy.
[66] Dude, here's the thing.
[67] This is the scariest thing of all.
[68] When you realize that guy's not going to save anybody.
[69] He's not going to make things funny.
[70] And like a lot of what he's doing is so cool.
[71] And if he buys Twitter, it's going to be one of the funniest things ever.
[72] And it helps me understand why it would be awesome to be a billionaire.
[73] Because finally, I would think, okay.
[74] If I was a billionaire, I would be trolling so hard all the time with all this money.
[75] I would buy fake commercials for products that seem vaguely real.
[76] You would be the best billionaire ever.
[77] It would be so fun.
[78] They would all be like your ads and your podcast.
[79] Yeah, exactly.
[80] I would have an entire department just dedicated to putting chaos into the world for fun.
[81] Can I stop you right here and just tell you, I admire your commercials so much.
[82] Thanks, man.
The best commercials in all of podcasts.
Thank you. Thanks.
They're funny, and they're ridiculous, and they're creative. And, like, I always feel bad that I just do a regular commercial after I listen to yours.
You do? Right? You should.
It's, like, to me, too much work.
It's fun. It is a lot of work, but it's funny that you get to advertise to me.
Yes. It's so funny to imagine that any company is letting me advertise for them. Something in that makes me feel gleeful when I'm doing it. And it's cool that most of them are cool with it.
Well, the way you're doing it is so fun.
[83] It's so fun.
[84] It's like clearly just fun.
[85] Yeah.
[86] Isn't this delicious?
[87] This is so good, man. So good.
[88] But you know, man, like Musk, he's finally doing the thing.
[89] If I were a billionaire, for sure, it's like, oh, I'll just buy Twitter.
[90] Can I buy Twitter?
[91] Let's see what happens.
[92] He doesn't care.
[93] I think he does care.
[94] You think he cares?
[95] I can't tell.
[96] No, he does, genuinely.
[97] He's concerned about censorship.
[98] Freedom of speech.
[99] He said freedom of speech is someone you don't like saying something you don't want to hear.
[100] He goes, they have to have that right.
[101] He's like it's essential to a democracy.
[102] Most people objectively agree.
[103] The problem is people get scared, because they see how, like, mobs of people can move in a very negative direction, right?
[104] I think people... there's a real concern. Like, I am in no way supporting censorship.
[105] Right?
[106] Absolutely the opposite.
[107] But I understand why people are concerned about large groups of people being really shitty, and why the social media platforms have done this campaign over the last few years to, like, silence certain voices, stop aggressive people, stop people who are being shitty to people.
[108] Isn't some of that good, right?
[109] Sure.
[110] Stop doxxing, stop people who are threatening people, stop people who are harassing people.
[111] But it's like, where does it end?
[112] And how do you know?
[113] You don't.
[114] That's where it's fucking weird.
[115] Because if you just allow wild, free speech... like, there's a bunch of people on, like, 4chan and those kinds of places.
[116] And they're saying stuff just for fun.
[117] Right.
[118] Because nobody knows who they are and they can just say it.
[119] And they don't fucking mean it.
[120] They're saying it because it's a crazy thing to say. If you're at work, in a job you hate, sitting in your cubicle, and you decide to make a frog with a Nazi outfit on, it doesn't mean you're a real Nazi. It's... what's the word... it's edgelording. It's an edgelord. Those people are confined to small corners of the internet right now. And if you just let them loose, if you let the frog people loose...
Yeah.
Like, you remember the frog people during the Trump campaign?
[121] Yeah, of course.
[122] They took over the frog.
[123] The frog guy who made the frog was so sad.
[124] I had him on my podcast.
[125] He's very cool.
[126] What did he say?
[127] Just what the fuck?
[128] The frog was never a Nazi.
[129] The frog was like funny and sweet.
[130] Was he upset they took his frog?
[131] Yeah.
[132] I think it would be a little bit like the first person who had the Hitler mustache.
[133] Wow.
[134] You know what I mean?
[135] Like imagine.
[136] And then Hitler sees it and he's like, God, that's fucking cool.
[137] I'm going to start having a mustache like that.
[138] You're like, no. It's like that.
[139] But also, you know, I don't know how many people know that they discovered an Egyptian god.
[140] And again, this could all be trolling: Kek, the god of chaos, who happens to be a frog.
[141] So within that, whatever that was, actual, like, chaos magic was being done. They were calling it meme magic or whatever.
[142] They were all, they were, like, actively doing chaos magic.
[143] And a lot of their rituals seem to have worked to some degree.
[144] So, yeah, I know what you mean. Like, the question is: in a world where we don't even know how many of these people writing shit on Twitter or wherever are human...
[145] Right.
[146] Look at, what is it, GPT-3 or 2? The AI that, like, completely imitates human beings.
[147] Like, have you ever gone to the...
[148] Explain how that works?
[149] It's a neural network that can be programmed with different personas,
[150] that can imitate human speech well enough that they can leave messages, and you don't know for sure if it's a human or not.
[151] Wow.
[152] So on Reddit, there's a subreddit which is just AI bots arguing with each other.
[153] And like once I left that up, and I remember I was just reading it thinking I was on the front page of Reddit and getting really engaged in the arguments these bots were having with each other, it's that convincing.
[154] So the freedom of speech thing runs into... okay, of course, it's like a no-brainer humanist ideal.
[155] Humans should be able to say whatever they want.
[156] It's, like, most important that we have some kind of debate or discourse where one person is allowed to be a complete asshole.
[157] Right.
[158] And not because we want assholes, but because hopefully in the discussion, some bit of truth will sink down into their consciousness and they will grow as a person.
[159] And I think that's where it's rooted.
[160] That's the hope.
[161] That's the hope.
[162] But what if the person is a robot, an AI device that is being made to have certain personalities that represent demographics you're trying to manipulate, run by a corporation or a state or whatever?
[163] At that point, is it freedom of speech? Are bots allowed to have freedom of speech?
[164] This is one of the things Musk was saying he might do with Twitter,
[165] which I think is brilliant: anyone can get the stupid blue checkmark.
[166] You pay for it and you have to verify yourself so that, like, we know you're a real person.
[167] Right.
[168] Meaning that it would be too expensive for the bot swarms to function.
[169] It would go from, like, if you have a blue checkmark, whatever, you're something, to, if you don't have a blue checkmark, you're probably a bot.
[170] And so then we eliminate all the AI, all the fucking state-sponsored shills, all the people who've spent years and years and years building these fake identities online.
[171] They're like farming personalities because in the old days somebody tweets some shit and you're like, what the fuck is that?
[172] And you go and look at their account and it's like three days old.
[173] So it has less credibility.
[174] But they've been growing these personalities for years.
[175] So you go and look at their tweets and it kind of looks like a person mixed in with their interminable tweets about some political issues.
[176] Like, only an insane person is tweeting something every three minutes. Or, like, went to the pool with dairy today, you know what I mean? And some weird, distant picture of a pool. So it kind of seems like a real person.
Anyway, the point is, the freedom of speech issue is, right now we're running into the problem of brand new technologies that aren't even human, that are being designed to influence public discourse in a way that's going to push the needle towards whatever it is, right? Whatever some corporation wants, whatever the state wants. And we know for sure they're being implemented that way.
Fuck, yeah. These bot swarms... they found, what did they find, in Ukraine? They found, like, one of these creepy shelves of phones. It's just phones.
Oh, I've seen those, but I didn't know they were running an AI. I thought, like, people were manually typing in messages, because I know they were doing that first.
I think both. I mean, both things are happening.
Yeah. Like, you know, it's a PR firm, basically.
[177] You know, politicians have publicists, too.
[178] Isn't it wild, though, that that's the best way to defeat America is to get people separate from each other, arguing with each other, and lose all faith in the democratic process?
[179] Yeah, sure.
[180] All faith in democracy.
[181] Lose all faith.
[182] Powerful weapon, man. It's like an...
[183] It's amazing.
[184] We dropped an atom bomb multiple times on cities filled with people.
[185] The United States did that.
[186] Everyone knows that.
[187] So our karma... whew, we've got some fucked-up karma, man. You know what I mean?
[188] So the problem with these atom bombs, the biggest problem with the atom bomb, is, I mean, aside from the fact that it kills random people and is horrible, it irradiates the ground, the buildings.
[189] So there's no plundering to be had.
[190] In traditional war, you plunder. However you want to make it look, you're plundering.
[191] If I nuke you, I can no longer plunder.
[192] So this is why biological weapons are, like, desirable: it burns out the biome in the area, you wait a little bit, and then you can go plunder.
[193] But even better than that, get into the fucking minds of the people in the country that you want to invade and then just change their minds so that the country shifts into what you wanted that country to be.
[194] Now you didn't even have to do anything. They're yours. They're believing in you. And then the country starts falling apart. The CIA, by the way, this is one of the things they do. They go into other countries, they cause, like, disturbances, and then it collapses. I mean, it's a classic weapon of war.
So, anyway, yeah. Obviously, we, the entire United States, and probably other countries, might currently be irradiated... not by, like, obviously, radiation, but by bad data created by artificial intelligence bots that have been programmed to swarm social media and give the impression that there was something happening that wasn't even happening at all.
You know when I get the most suspicious that someone's a bot? When they have an American flag next to their name.
Oh, yeah, that's a bot.
I automatically assume that's a bot.
Yeah.
I see that and I go, oh, they're one of those fake, like, you know, God-before-country people.
That's right.
[195] You know?
[196] It's so fucking weird, Joe.
[197] It's so fucking weird.
[198] There must be so many of them.
[199] There's so many.
[200] And they're like forming arguments out there.
[201] Dude, okay, imagine a hundred years ago, sitting in a town hall.
[202] There's a debate going on, a lively debate about some new law you want to impose.
[203] And somebody stands up and he's like, I'm all for the law.
[204] And then, you know, just starts malfunctioning, smoking.
[205] It's a fucking robot. We've been invaded by robots.
This would be a horror movie.
It would be a horror movie. But that's what's happening now.
Yes, it's just they're doing it through texts.
Yeah, they're doing it through... And that's way more effective anyway.
Yeah, they're getting at you... you're addicted to your phone. They're getting to you through your phone. And you're getting all these hot takes that are designed to sow the seeds of anger and distrust and...
Yeah, tribalism between the two sides.
[206] It's never been stronger.
[207] When I was a kid, like, your neighbor could be Republican.
[208] This guy could be a Democrat.
[209] Nobody cared.
[210] They'd like, they'd mock you about this.
[211] You'd mock them about that.
[212] Yeah.
[213] But you could be friends.
[214] You could be friends with people who were right wing.
[215] Now it's like they're the enemy.
[216] Yeah.
[217] It's very bizarre.
[218] Yeah, that's true, man. Very bizarre.
[219] It's very fucking bizarre.
[220] I mean, yeah... I grew up with it. My dad was a Republican, you know.
[221] So I grew up in that.
[222] I grew up having to be friends with, like, Republicans or right-wingers who knew most of my takes on things, but they were very polite.
[223] Yeah.
[224] There was none of this bullshit of, like, wait, what do you think?
[225] What do you think that?
[226] It was, like... we grew up in the South.
[227] So it was considered incredibly impolite to bring up, like, politics or any of that stuff.
[228] You sort of just kept it to yourself.
[229] I've had some people get aggressive with their opinions about stuff, especially because I was a Bernie Sanders supporter.
[230] You got a lot of heat for that?
Really stupid. But it's, like, to be that connected to your idea that you're going to be, like, angry talking to someone who has a different idea...
Yeah.
Just, like, why? This is not a good way to do this. Like, don't get mad that someone thought that maybe some of Bernie Sanders' ideas weren't the worst ideas.
Yeah.
Maybe it would be interesting having something like that running things.
Yeah. What would it be like? We've never had anything like that before.
[231] He doesn't seem like a greedy guy who's going to make horrible deals that are going to damage the environment.
[232] He seems like he's got a lot of positives.
[233] Yeah.
[234] What was it he did?
[235] Wait, why don't people like him?
[236] Well, he's a democratic socialist.
[237] And it's not like completely socialism.
[238] It's like a mixture of socialism.
[239] And I guess some capitalism.
[240] Someone would have to explain it better, but essentially he thinks that there should be a bunch of things that we pay into with our tax money, and a lot of those things would help benefit the greater good of the country, right? Free education, clean water, solving student debt, you know, like, raising the minimum wage. Like, he was pitching a bunch of different things.
Yeah.
That got people super uncomfortable. They're like, this guy's not trying to make money. Like, this guy's got to fuck this up for us.
That's right.
Well, I mean, look, when our politicians are making so much dough, you know, then they don't want to pay into those taxes. It's a lot of money. They pay a lot of money to politicians for speeches. They pay a lot of money to politicians for books. And I'm not saying they shouldn't be able to make a lot of money. Like, God bless you. That's not what I'm saying. I'm just saying it's, like, it's a business, and you can influence people with that business.
[241] Yeah.
[242] You can influence the way people think about things.
[243] You can influence and you could hire a team of robots to go swarm your idea through the internet.
[244] That's right.
[245] And you can make it seem like everybody who opposes this idea is racist.
[246] Everybody who opposes this idea is sexist or transphobic or this or that.
[247] But all you're doing is stirring up shit and you don't care how it gets stirred up because shit begets shit.
[248] Yeah.
[249] And as soon as you get some shit stirred up, people start slinging it around.
[250] Well, their kids are going to learn how to throw some shit too.
[251] And then the next generation is going to be a shit tosser as well.
[252] That's right.
[253] And then it's going to be a normal thing.
[254] Yeah.
[255] So, like, being in heated arguments with people all the time.
[256] Online.
[257] And never believe anything.
[258] Imagine if we found out that a lot of mainstream news was run by Russia, and that they planted these little tiny subtle lies.
[259] Yeah.
[260] Every now and then to make you lose your faith in the veracity of the fucking news source that you're getting.
[261] Man, imagine if...
Of course it is, though. It's all Chinese-funded, or...
I mean, it's like, okay. I think one of the problems is people imagine it to be a little more cut and dry than it is. I think it's a little more subtle than, like, just getting paid off. But what we do know, obviously, is there is an incredible imbalance when it comes to wealth on the planet. We've got fucking oligarchs. Like, when you hear about the shit that they're confiscating from these people...
[262] 50 houses!
[263] $750 million yachts being confiscated.
[264] Yeah.
[265] So this imbalance in wealth is so extreme that it's created an incredible greed in some people, and in others just a basic need to have food and shelter or whatever.
[266] But so when you have that much money, I imagine it's fairly easy to, in a very subtle way, start doing psyops for whatever it is.
[267] And in those psyops campaigns, the first thing you're going to want to go for is people who are any kind of mouthpiece in the world, like Tucker Carlson, who the Russian press was, like, applauding for his reporting and stuff.
[268] So people are like, shit, he's paid off by the Russians. Now, is he? I doubt it. But I wouldn't be surprised if people who used to call themselves the KGB, who mastered distorting reality...
[269] I wouldn't be surprised if, in some way, shape, or form, they were puppeteering him a little bit, just trying to get to him a little bit.
[270] How does one do that?
[271] Do it in your best Russian accent.
[272] I can't do a Russian accent.
[273] I don't mean Russian.
[274] I mean whatever.
[275] Any country.
[276] So it's like this.
[277] Here's how you do it.
[278] This is the way you fucking do it.
[279] It happened to me. Because once, when Ted Cruz was running against, what's his name?
[280] The guy who wanted to legalize weed.
[281] People didn't like him.
[282] The Democrat guy, Beto O'Rourke.
[283] People didn't like him.
[284] I was like, ah.
[285] So I retweeted Beto O'Rourke giving some speech.
[286] I liked him.
[287] He wants to legalize weed.
[288] You know what I mean?
[289] Legalized weed.
[290] It's great.
[291] Biden was supposed to do that.
[292] Yeah.
[293] Well, yeah.
[294] But right after I retweeted him, all of a sudden, I start getting messages from people using the word... fuck, goddammit, now I can't remember the word. Uh... a demagogue. Demagogue. So all of a sudden I'm getting tweets from random people being like, do you really want to retweet a demagogue? I can't even... I don't know if it's gog or gogue. I don't think I've ever heard it read exactly. But I remember thinking, that's a cool fucking word, the first time someone sent that to me.
Yeah.
Demagogue. Demagogue sounds amazing. Sounds awesome. Sounds like something from Stranger Things.
Exactly. A gorgon or...
Yeah, a demagogue.
[295] So I get another demagogue, and then another demagogue message.
[296] And then I'm like, oh, my God, these are people working at a cubicle who have been told to send tweets at people who say anything positive about Beto O'Rourke.
[297] But one of them said, Duncan, I really am sorry about what happened to your mom.
[298] But it's upsetting to me to see someone retweeting a demagogue.
[299] And that's when I freaked out because it's like, wait a minute.
[300] I went to her account.
[301] This is, like, a middle-aged, right-wing Republican lady, American flags all over her fucking account.
[302] Real person.
[303] Acting like a real person, but no one who would really be listening to my podcast enough to know about my mom dying and certainly would not be like sticking up for Ted Cruz randomly on the internet.
[304] So that's when I realized.
[305] Oh, shit.
[306] If you have a certain number of followers, they've got a little file on you somewhere.
[307] And that little file goes into informing.
[308] their tweets at you.
[309] Or it could just be some crazy Republican lady who likes you.
[310] But she said demagogue.
[311] Right.
[312] After three people said it.
[313] But a lot of people probably have been saying it.
[314] And people automatically pick up on things that people say and they just repeat them.
[315] This is just the simplest explanation without like some grand conspiracy.
[316] I think we should assume that's probable.
[317] That's probable.
[318] I would say that it was probable.
[319] Okay.
[320] So, and again, I don't want to seem like I have such hubris to even imagine that.
[321] Oh, there's definitely a file on you.
[322] But that lady, there's crazy people out there, bro.
[323] Yeah.
[324] There's a lot of those like Austin rednecks.
[325] I'm just saying, if you want to get to somebody, the way you do it is in a really subtle way, over time, and have it happening from a lot of different angles, expressing different versions of the same idea. That helps push the needle.
[326] And that's how you do it.
[327] So the person wouldn't really recognize that a kind of propaganda beam is being blasted at them through their social media, and it's coordinated, organized, and designed not to do anything more than shift their opinion a little bit. Not even that much.
[328] And just get them arguing about stuff.
[329] Exactly.
[330] And then the thing about arguing is people rarely come to a resolution where they're both happy.
[331] Generally speaking, one person wins, the other person feels bad.
[332] One person thinks that person's a dick for, you know, harping on them about something.
It's very rare that someone goes, God damn, I was wrong and he was right, and, you know, I need to apologize to him. I need to figure this out.
That's a rare person.
It's great when people can do that.
Yeah, but it's a rare person who has a disagreement with someone and changes their mind.
Right. Fucking stubborn. And instead of really trying to figure out what's right and what's wrong, most of the time we're trying to win an argument. We're trying to win with better sentences and better facts.
And you're trying to look smarter. That's what we're trying to do.
Yeah. And when people just argue, they start arguing about other shit.
[334] If you beat someone with some argument, you want to argue with them about other stuff.
[335] There's this one guy at work and he always wanted to argue about stuff.
[336] It wasn't aggressive, but it was always, it was annoying.
[337] And one time I go, dude, every time we talk, you just want to argue.
[338] I go, I don't, we could just talk.
[339] We don't, we don't have to argue about stuff.
[340] Like, he would, he goes, well, I like to see you challenged.
[341] I go, come on, man, we're just at work.
[342] He said what?
[343] I like to see you challenged.
[344] He's just...
What does he mean by that?
[345] He's essentially saying he likes to argue, and he wanted me to argue with him.
Like, some people enjoy arguing.
And they're not even doing it in a way that's mean.
Yeah.
They just want to disagree.
Right.
They just want to say, well, someone could look at it from this perspective.
I hate that.
Oh my God, bro. Oh my God. Everything's... everything. We're talking about dairy.
Yeah. Yeah.
You know, it's like... I can't talk about... Not everything. Not everything.
Yeah, I know, man.
I think that is one of the problems: you know, reality can be looked at from a million different angles.
[346] And it's hard for us to do that.
[347] Yeah.
[348] It's hard for us to take into consideration someone else's perspective.
[349] Oh, my God.
[350] Very hard.
[351] Yeah.
[352] Well, it's considered, I mean, it's considered off limits, you know, like, you know what I mean?
[353] Like, try to put yourself in Jeffrey Dahmer's shoes.
[354] You ever do that?
[355] Oh, Jesus Christ.
[356] Well, that's what I'm saying, but to catch people like that, they have to become them.
[357] Oh, my God.
[358] But you ever think about that?
[359] Like, how far away from being Jeffrey Dahmer was he?
[360] Like how many bad turns or weird moments?
[361] Do you think that a guy like that?
[362] Because he was one of the rare ones that had like a normal family life, right?
[363] Well, yeah.
[364] It seems like he had a, like his parents seemed just kind of like normal.
[365] Yeah.
[366] Yeah.
[367] That was a weird one.
[368] Like most of the time there's like abuse and foster care or something.
[369] Yeah.
[370] Some great hurt that's put on them.
[371] Yeah.
[372] Yeah.
[373] Like, to me, the thing you're talking about is: we're going to get better at it as a species.
[374] And what that is is putting away the aggression.
[375] That's normal.
[376] Aggression happens when animals feel confronted.
[377] So we still have that.
[378] So you feel confronted in any way, even if it's a small disagreement, some weird animal part of you starts hissing.
[379] And some people listen to that.
[380] And then they're like, fuck.
[381] And then they get weird when they're having a debate with somebody because the animal is hissing through whatever they're saying.
[382] So if you can put that away and then you look at the person and recognize, this is just me. After a bunch of weird turns, I would have been this person.
[383] Yeah.
[384] And then somewhere in there, you can really have like an actual chat with a person because at least you're, if that creature, if the thing inside of them starts, hissing and then that makes the thing inside of you start hissing, then you're basically just having a seizure or something disguised as a conversation.
[385] You're just barking.
[386] It's like when you walk by a yard and your dogs start screaming at each other.
[387] You know, I don't think that the dogs are like, I think there's just, they don't, I think it's like when a dog barks, it's like when we sneeze.
[388] Like they can't control it.
[389] All of a sudden it's just like, and the other one's barking.
[390] I'm just saying, if we are debating with all this aggression, I don't even know if we're human at that moment. I think we're just hissing at each other and pretending what we're saying has some kind of importance in the world. I mean, it would probably be healthier to just, like, instead of, you know, angrily fighting with somebody, right? Because sometimes when people are talking, they're not really saying anything anyway. They're just barking at each other. Exactly. Just dogs yapping at each other online. The most depressing thing ever. Imagine if everyone alive is, at their essence, the exact same thing, just interacting with the rest of the universe through different biological filters. Yeah. But, like, everyone's the exact same thing. Whether you're born in China, whether you're born in Pittsburgh, it doesn't matter. You're just going through different biological filters and different life experiences, yeah, to interact with the universe, but you, the same.
[391] What makes you, makes me. It's the same thing.
[392] I'm just going through this body.
[393] You're going through that body.
[394] I'm going through this path.
[395] You're going through that path.
[396] It's the same thing at the core, whether it's male or female, whether it's boy or girl or gay or straight or it doesn't matter.
[397] It's the same thing.
[398] Yeah.
[399] It's the biological filter of being a different human being with a different life experience and a different part of the world.
[400] That's what's different.
[401] That's it.
[402] Imagine how wild.
the world would be if we could really lock onto that idea. It'd be a Christian world, Joe. That's a Christian idea, right? Love your neighbor as yourself. Yeah. But I think that people, when they hear that, they don't realize how radical a statement that is, because it's saying, no, no, no, that's you, that's you, that's you, that's you, that's you. And then, so at that point, you really probably want to help other people more, because we're all kind of helping ourselves all the time, you know? So then if you really recognize someone, that's just you, it fucks up everything, man. How are you going to do war?
[404] Right.
[405] If you're, it's a, if that's you.
[406] How are you going to drop a nuclear bomb on 500,000 yous?
[407] Yeah, exactly.
[408] How are you going to do anything, like, aggressive to people around you that are just versions of you?
[409] You grew, it's, it's, it's a problem if you want to live in a world of conflict.
[410] Well, if we are doing what it seems we're doing, which is like the human race is becoming more and more technologically integrated with each other, I wonder if what's going to happen with us in the future is similar to what happened with the internet.
[411] Because if the powers that be that turned on the internet and made it public for the whole world, if they had any fucking idea what a shakeup this would be to all things, all things, the education
[412] system, athletics, culture, music, everything changes.
[413] Free speech, unprecedented ability to express yourself about things.
[414] I think they would have never let that genie out of the bottle.
[415] We would have been stuck in the fucking stone ages if all the world governments had a heads up on what this was going to be, they were like, no fucking cell phones.
[416] No fucking cell phones.
[417] No fucking cell phones.
[418] Don't give them the phones.
[419] They'd be like, no, that's giving everybody cancer.
[420] Make a cancer campaign and get everybody talking about brain cancer.
[421] It's brain cancer.
[422] I think that probably if that conversation happened, there would be another person.
[423] It's like, actually, I think we can hypnotize them through the rectangles so that we get them to do more stuff that we want.
[424] Problem is there's no them.
[425] There's no them.
[426] Hypnotize them.
[427] You're in it too, bitch.
[428] You're addicted to your goddamn phone too.
[429] And the people that are programming these fucking apps are addicted to their phones.
[430] Yeah.
[431] The people that are creating these social media apps that have everybody addicted are also addicted to their phones.
[432] Right.
[433] Yeah, it's fascinating, man. Oh, it's wild.
[434] I think it's, this is the, okay, so this is an idea that I had the other day.
[435] Freaked me out a little bit.
[436] Okay, I was just thinking about those, and I know we've talked about these fucking zombie ants so many times on this podcast, but that fungus, it infects the ants, takes over their brain.
[437] Okay, so, like, I just had this scary thought of, like, oh my God, that's what technology is.
[438] It's just some kind of parasitic alien spirit that has descended on the planet is making us make it increasingly powerful.
[439] And with the intent eventually of reducing us to nothing other than like some kind of herd.
[440] And then, in the sense that it's like, with our addiction to these things, we really are.
[441] We are just like summoning an AI that is going to be way better than us at everything.
[442] Matter of time.
[443] Matter of time.
[444] Instantaneously.
[445] Yeah.
[446] And it's one of the things that Elon's warned us all against.
[447] He's like it's scarier than anything else that we have in development right now.
[448] Yeah, man. It could be a life form, man. And it seems like it is.
[449] It seems like we slept on it, and we only thought that life forms were things that biologically grew, like they grew with water or they grew with photosynthesis, you know? We only thought they ate things, and that's how they shit, and they got better. We didn't think of technology as being this, like, ever-improving thing that's directly connected to materialism, which is one of our most common obsessions. Like, human beings, like, it's super common to be obsessed with materials. Matter. Yeah, just shit. Collecting and staying, like. Yeah, that's how goofy we are. We collect things. Yeah. Right. You know. Yeah. It's certainly like, if you wanted to, like, take over a species, just look at what they're into, then figure out a way to, like, get in between that, and then you've got them. You got them. You got them. And we're just working on it. We're working on it day in and day out, just trying to make something that's smarter and better than us. And we're going to fucking, it's probably already been done.
[450] I mean, that's a scary thing.
[451] Nick Bostrom, Superintelligence.
[452] It's an awesome book.
[453] But he just points out that by the time of superintelligence, if one gets created via technology, it's not like they're going to make an announcement that it was made.
[454] They're going to not say anything.
[455] And they're going to let that super intelligence direct them.
[456] And then in that, they're going to make decisions.
[457] They're going to let it make big decisions.
[458] And so we wouldn't even know that our planet was being harmonized by a super intelligent AI until maybe never.
[459] Yeah, why would it let us know?
[460] I mean, it could absolutely already be running things.
[461] It's just doing this accelerated technological shift in society.
[462] Yeah.
[463] Doing it at a pace that's tolerable.
[464] Yeah.
[465] Like it's inevitable and it's just like forcing it along at a pace that's tolerable.
[466] Yeah, man. And what if it's in control of all those bot farms.
[467] Well, that's what I'm saying.
[468] No. And so, yeah, maybe, yeah, it is, it is scary.
[469] Oh, it's so scary.
[470] The scariest thing is that people are fighting over some of the stupidest shit right now when winter is coming.
[471] When this stuff is coming, man, it's like really coming.
[472] It's coming.
[473] And this is not going to be like the shit we're arguing over right now.
[474] This is going to be something that is like so increasingly seductive that it becomes irresistible.
[475] This is going to be some I robot shit is what it's going to be.
gonna be. Yeah, like that could be real. Yeah, man. You know that, oh, the Tesla Bot they're working on? I don't think that's real. Yeah, it is. Next year. Shut up. Production. I told you to shut up, Jamie, you're gonna ruin it all. Um, I'm getting one for sure. I'm gonna get one and teach it jiu-jitsu. Great, Joe, that's just what we need. I want a sparring partner. Strangling bots. I want a sparring... That's how it starts. Then it uploads whatever you teach it in jiu-jitsu. It's whatever. If you could trust your bot to let go when you tap, like, that would be a pretty good bot. A bot can know, like, exactly how hard to get you, like, in an arm bar. Yeah. That's what the bot tells you right before it strangles you to death. It's like, look, I'm gonna get you so close. Duncan, I have to tell you a secret. What? I've been operating at only 10% of my strength up until now. Okay, look, look, I'm sorry. Or, you know, you walk in on it fucking your wife. What about that? Like, these bots are gonna, they're gonna disrupt society way beyond what we're seeing right now.
[477] And it's, and again, it's like there is no way to stop it.
[478] We have opened a portal.
[479] I don't know what the portal is to, but flowing out of that portal are AI personalities that have already gotten into our discourse, that are about to be animated by robots, already are animated by those creepy fucking DARPA bots, that, like, they have them do these little dances to calm us down about them.
[480] and it just makes them seem more sinister.
[481] You know what I'm talking about?
[482] Yes.
[483] They, like, how does that, how do we think that's going to work out?
[484] Like, how do you think having a super fast dog creature that shoots guns or poison darts or some stroboscopic light, or releases some pheromone, or whatever it does, is going to work out? Like, when you get around it to fight it, you're like, you know what, I kind of see things your way. But, you know, what about... because right now, freedom of speech, one of the scary things about it is you can manipulate and seduce people with lies to make them do horrible things. Like, look at Heaven's Gate. Sure. So one of the scary things about it is speech is a way to hack someone's nervous system, and humans, or some humans, are pretty fucking good at it. So if humans can do it, then what can an AI do? And what if that AI can, like, actually scan thermal readouts from your brain and understand what you're thinking, or, like, notice eye dilation, or, you know, detect some scent that you're emitting that informs how it manipulates you? We're going to be manipulated by these things, potentially. Even if we get to understand the human nervous system enough, why wouldn't there be some technology that replaces memories?
[485] Why would...
[486] For sure it's going to be.
[487] Right.
[488] It's like photos replace drawings.
[489] You used to have to draw things.
[490] Show me your house.
[491] This is what my roof looks like.
[492] You'd have to draw a picture of it.
[493] Yeah, dude.
[494] So that's...
[495] To me, that's where the shit gets really scary.
[496] It's like a weapon that can replace your memories.
[497] Yeah.
[498] So that suddenly you just remember a completely different life than the one that you had.
[499] Or how about this?
[500] What about a solar flare that blows out your fuse?
[501] Yeah, man. And now you don't remember anything.
[502] And you're sitting there with your mouth open on your knees in the middle of your yard.
[503] You don't remember shit.
[504] It cooked all of your memories.
[505] It cooked you.
Bro.
[507] Dude, this is.
[508] That's possible.
[509] Oh, I know.
[510] Just like there's been failed experiments in the past and failed, you know, failed things, failed inventions.
[511] You know, they tried it out, but it didn't work and fucked everything up.
[512] Yeah.
[513] That could, they could, the first fucking brain chips could be a real problem.
[514] Dude.
[515] It could be like the first fake lips.
[516] You know?
[517] It could be like one of those things where it's like, oh, no, you didn't.
[518] Yeah, man. Yeah, it's scary.
[519] That stuff's scary.
[520] It's, it's, it's the scariest.
[521] Because imagine if there was some sort of like, you know, some company came up with some idea for some implant that made you smarter.
[522] Yeah.
[523] Some poorly funded Neuralink-type deal.
[524] Yeah, man. And then, you know, you accidentally touched.
a socket with a fork, and it just, you're done, and now you don't remember anything ever. Right. You don't remember how to talk. You know what's scary about it? I just realized we have the same problem with the fucking computer we have now. The exact same thing happens to people. A vein bursts and suddenly you're done. But yeah, man, I just think some of the stuff that we are not regulating right now is really going to bite us in the ass. Like, we need to have, at least, the same shit we're doing for nuclear weapons development, for AI. There needs to be a regulation here. At least we need to understand what's going on, because I think it is one of the big threats. And, you know, the possibility that it hasn't already happened also makes me a little nervous sometimes, you know? Like, the idea that we already are under the spell of a hypnotic alien technological creature that's convinced us that we're living our lives, when in fact it's just captured us and is feeding memories into whatever we are to give the impression that we're living a life. Well, don't you think it mimics so many other predator-prey relationships, or at least, if that's kind of dramatic, at least symbiotic relationships? Yeah. We know, like, animals and even plants and fungus have these symbiotic relationships. Yeah. And that they develop alongside us, like they need us. Right. You know the Marshall McLuhan quote? Yeah. Human beings are the sex organs of the machine world. Yeah. Like, that's it, man. Yeah. Like, he was on to that in the 1960s. Yeah. That is a really, like, terrifying possibility that we don't want to think about. It's like one of those unthinkables. But it has to be true, because we're imperfect.
[526] So if we're imperfect, there has to be like a next version of us.
[527] The scary thing for us, because what we have that's unusual is we have all these emotions and this ability to express ourselves.
[528] We have love and hate and we have all of this excitement that comes with being a person and we don't ever want to let that go.
[529] But if we wanted ultimate harmony and we wanted people to really understand that you are just me looking at life through a different biological filter, that we're all in the same.
same essence, of the same essence, every person. Well, you know, that's a controversial idea, Duncan. Yeah. Well, exactly. I mean, that idea is freedom. Everything else is, like, as long as you find yourself being drawn one way or the other by preference, you're not really free. You're certainly controllable, you know? If you have preference, you can be hypnotized. It's going to be very difficult to, like, manipulate someone who really doesn't have preference. But a lot of people think their preferences are who they are, so they're very invested in preference. You know, like, I love this kind of music, I don't like this kind of car, and don't show me a fucking strawberry, I hate that color. All these stupid things that they imagine, that must be me. But really, that's not you. That's just a habit that you're in of describing yourself in a certain way. So this is individualism, and people get really, and I don't blame them, people are like, wait, are you saying there's something wrong with individualism? No. But probably your preferences are not only making you miserable, the more preferences you have, the more miserable you are, but also they're probably opening up the possibility that in some way you're being, sort of like what you do with cats with a laser pointer, right? You're being sort of guided around by these preferences, by people who recognize what you want and then lure you in.
[531] This is why they want to regulate data collection by these companies.
[532] They have categories for us.
[533] Like, "needs help" or "lonely seniors."
[534] Like, you can end up in that box, in a server, and suddenly you're getting Viagra commercials or, you know what I mean, dating sites for, like, older people.
[535] So, like, they're just fucking laser pointering us around.
[536] So, what we're talking about, this conceptualization of the identity as not being much bigger than preference, and as something that unifies all of us.
[537] Yeah, it fucks up marketing, capitalism as we understand it, I think.
[538] So it is probably something that you're not going to hear the people on the news talking about because they have to sell phones and shit in the commercial breaks.
[539] It's like a recognition of what the ride is while you're in the middle of it.
Yeah, we're in the middle of the ride. Yeah, that's right. And maybe, maybe if this is a ride, and I do, I like the concept, you hear it in a bunch of different, like, New Age movements, the New Age cosmologies, which is: you chose this life, and everything that's happening to you is something that's teaching you. Or there's, like, a Grant Morrison, some
[541] It's super old.
[542] It's on YouTube.
[543] But he was saying, like, during some visionary drug trip, he realized that we're all larva being grown in time, that this thing we're calling our human experience is like a gestation chamber.
[544] For some kind of hyperdimensional beings that are being grown in time.
[545] That's what we're experiencing right now.
[546] We're in, like, the larva chamber of some alien hive.
[547] and part of the way we grow into whatever we're going to become is by reliving our lives over and over, reincarnating and all this stuff.
[548] Anyway, the point is, if this is the case, if it is true that we chose all of this, then maybe some people aren't supposed to have these thoughts.
[549] Maybe they need to just be, like, really into Below Deck, and really, like, they love a certain kind of music, and they like to drive to their job.
[550] Not think about, maybe I am everything.
[551] Maybe I am in a gestation chamber of some hyperdimensional alien species that's trying to teach me compassion.
[552] Some people probably, you know, they need to like a more earthy sort of experience.
[553] You know, so I don't know.
[554] Maybe not everyone is supposed to come to this kind of awareness.
[555] I mean, the Bhagavad Gita says it's better not to disturb the minds of people who are asleep.
[557] You know, let them sleep, you know.
[558] I mean, I guess it's a weird thing to say on a podcast.
[559] I don't know if it is a weird thing to say.
[560] It seems like you can't get people curious about changing the way they think unless they're curious about changing the way they think.
[561] And then all you're doing is just like providing an example of how you did it.
[562] And a lot of people try to do it.
[563] A lot of people try to like sort through what's real and what's not real.
[564] Yeah.
[565] If we're being manipulated along the way by an artificial intelligence that is only waiting for us to use our little manpower, our little keeping-up-with-the-Joneses power, and buy better and better computers and technology and phones every year, until it gets to the point where it just takes over.
[566] It's fully sentient, completely programmable, makes its own version of itself that's far better.
[567] Almost immediately, it starts constructing a better version of itself and just takes over.
[568] Yeah.
[569] No more biological life.
[570] It's unnecessary.
[571] Why would you have that when we can fulfill all of the biological life missions but do it with silicon-based life?
[572] Sure.
[573] That lives forever.
[574] Okay.
[575] So this is what's scary about that.
[576] I mean, aside from what you just said being obviously terrifying, what's scary about that is that it probably already happened.
[577] So, so, so, because it's like, dude, okay, I got my fucking gums cleaned at the dentist the other day.
[578] I was on nitrous oxide.
[579] I wish you could just go to the dentist and just get nitrous.
[580] Like, it sucks.
[581] You should be a hypochondriac at your dentist.
[582] I would.
[583] Oh, I love it.
[584] It's so great.
[585] It's the best.
[586] No, because they were like, you need to come back in for a polishing and then we're done.
[587] I'm like, I think I'm going to need nitrous.
[588] But, but I had this, I was like on nitrous and I realized like, fuck.
[589] the vastness of the universe.
[590] And in that vastness, obviously way more advanced technologies than what we have.
[591] And somewhere in that vastness and in that advanced technology, it is so stupid to think that we haven't already been simulated, scooped up, hypnotized, or whatever it is.
[592] The idea, the funny thing in the conversations, an AI is going to manipulate us and trick us into believing in a reality that doesn't exist.
[593] While statistically, probably, if you have to bet, we're probably already in that reality, fully immersed, fully convinced that I'm an I and you're a you, and that all of this makes sense, even though none of it makes much sense at all. Why hasn't it already happened?
[594] And the way that it happened is not like, God, I really fucking love Wordle.
[595] The way it happened is this.
[596] We're like, oh, yeah, this is real, totally real.
[597] 3D, time and space, I've got a name, I grow hair.
[598] Yeah, I think I came from monkeys, of course.
[599] This makes sense.
[600] Oh, yeah, this is real.
[601] You know, this is why they call this Maya in Hinduism, illusion, active illusion.
[602] There's a quality of the thing we're in that feels like a bit of a magic show.
[603] Like, something's trying to trick us here.
[604] I'm not sure what.
[605] Yes.
[606] You know, there seems to be something going on a little bit more than what seems to be happening on the surface.
[607] And whatever that thing is, seems weirdly deceptive.
[608] It's always trying to lure me into doing shit that doesn't necessarily make me feel good.
[609] You know, smoking cigarettes, jerking off too much, eating too much fucking food, like eating weird shit or just like sitting and staring at like garbage whatever for hours.
[610] It's not like it's, I'm just saying, it's not like it's luring us in general.
[611] The lure is you're not hearing on these shows with these hyper-charismatic, whoever the fuck they are, from beautiful, powerful Hannity, to his seemingly, like, I don't know, his dear friend Tucker, to, like, fucking Cooper, to all of them.
[612] All of them are so, they could all start their own cult, right? You know what I mean?
but you're not going to hear them say: if you all would just realize we're all the same person, and then treat each other like that, there would be world peace. No one's going to say that. Why? Why don't they say that? It is the truth. They're not allowed to say that. The government tells them you're not allowed. They have a meeting, they all smoke cigars and kill kids, and they say, now we're all in this club together. I'm joking. I know. But I'm joking, for anybody who wants to print that. That wasn't joking, Joe Rogan. I think they work for giant corporations. There's only so much you could say. But I think they get away with saying a lot. But it seems like there's some editorial. Like, Tucker Carlson's show is, like, all editorial. It's all his writing, or whoever's working for him that's writing those monologues, you know? Yeah, man. I mean, all of them, I don't understand. All I know is they're very charismatic humans who have incredible power. And so they're like priests.
[614] They're a priest class that is the speaker for a religion we pretend doesn't exist.
[615] Like, when we watch the news, we don't think, this is like a sermon based on a reality that isn't necessarily here.
[616] We think I'm watching the news.
[617] This must be real.
[618] Even though this, especially during the pandemic, flip to CNN, you've got one reality.
[619] Flip to Fox News, it's another reality.
[620] They're seemingly completely different reality tunnels, but they're both saying it's the truth.
[621] So whatever, we know that these are slanted, manipulated discourses on something that's happening in the world, that's being refracted through whatever their particular agenda is. Meaning, these are priests, or you could say, you know, casting spells, you know what I mean, or whatever, hypnotizing us to imagine this must be
[622] what's true.
[623] And even though the other one is saying a completely different truth, once we encounter someone who follows the priest, oh wait, what priest do you follow?
[624] I go to the church of CNN, and I am a disciple of Fox.
[625] Then you get in this stupid, symbolic war with each other.
[626] You know, you're just, now you're just like arguing.
[627] I'm a disciple of Fox.
Yeah, well, but they don't want to call it a church. But it is. Yes, it is a church. It's very similar. It's a priest class. It's a church. They're hypnotic sorcerers who are really good at what they do. And I feel stupid saying this in a wizard robe, but, like, you know, what are we fucking doing? What am I fucking doing? Anyway, to me, like, all of them, on all those networks, it's basically a personality competition. Like, whose personality do you like the most, and whose ideology fits with what you have been most comfortable with?
[629] That's right.
[630] And let's run with it.
[631] And we're at war with the libs.
[632] We're going to own the libs.
[633] Yep.
[634] Or, you know.
[635] Crisis at the border.
[636] Yeah.
[637] That's Fox.
[638] Crisis at the border.
[639] CNN is like, you know, maybe we, man, maybe we should try World War III right now.
[640] What is going on with them?
[641] Why is, I mean, people, some people are actually saying that we should interject militarily and escalate things.
[642] I think.
That's so scary, that that seems like a good idea to anybody. You know what's scary to me? Scary to me is the fact we live in a world where, if you say something like, I don't think we should do war at all, people are like, what the fuck's wrong with you? I know, we have to do the war. Like, it used to be you could casually say something that's pretty obvious. You could just say, you know, I don't think it's good to drop bombs on people. And now you say that and it's controversial. It's like, no, no, no, sometimes you got to drop the bombs. You got to blow up, obliterate. You
[645] Don't you get that?
[646] Sometimes you got to do that.
[647] If you say that, like right now, God forbid, you should say, I think Russia is being a fucking asshole shooting missiles at the Ukraine.
[648] People are going to be like, wait, you don't really understand the full picture here.
[649] Or they'll say, hey, where were you when the attacks were happening in Iraq, where you were dropping bombs on a place?
[650] And you're like, I was on a fucking podcast.
being shrill, saying we shouldn't do this. But, you know, we're in a weird situation right now in the world, where a kind of universally accepted idea is weirdly not quite as universally accepted, which is: don't kill people. It was at least considered just, like, boring to say that. Like, there was no controversy in, you shouldn't kill people. People go, oh, God, yeah, no one's killing anymore. Here we go. Oh, okay.
[652] No one's saying you should kill people.
[653] My God.
[654] Yeah.
[655] Now it's, sometimes you do kind of need to, which is like, I think really a quality of the age that we're in.
[656] There's a great book by Tolstoy that I read a long time ago.
[657] The argument is you can't be a Christian and be at war.
[658] A Christian cannot kill.
[659] Period.
[660] The end.
[661] You can't do it.
[662] I think it's called The Kingdom of God Is Within You.
It's a great book. But, you know, that was what was written. To me, one of the cool things about Christianity is, like, it's very radical. The idea is, like, if your job involves brutalizing somebody else, regardless of why, or what state you're in, or where you're at, or what's going on, no matter what language you speak, you can't do it. Yeah. Don't do that, dude. Don't do that. You can't do it. You can't really do it and then go to church, oh, please, Lord, bless me and my family, after you just fucking lobbed a missile randomly into a place. Some of these people are doing that, and then afterwards, like, you know, crossing themselves, or being like, oh, thank you, Lord, that my missile has struck its target. You know, the Crusades, all that shit. It's still happening. But it's always been. It's always been. Yeah. It's like when people figured out that, in large numbers, you could conquer entire cities and take over. Yeah.
[664] And you think about what it must have been like living during like the rise of the Mongols.
[665] Imagine what it's like.
[666] And you're in some city in China, just chilling.
[667] And the hordes come through the fucking gates.
[668] And they just start murdering people and lighting them on fire and putting them in catapults and shooting them on roofs and light the buildings on fire.
[669] Yeah.
[670] Whoa.
[671] Imagine.
[672] Well, you know, it came before the hordes.
[673] The refugees, they were driving before them.
[674] Right.
[675] They wouldn't, so first you would see people coming with, like, horrible wounds, stammering, unable to really talk.
[676] Yeah.
[677] Some of them would just kill themselves once they got to the city.
[678] And then the horde would come.
[679] So you got to enjoy a little bit of like, oh, fuck.
[680] And then the swarm descended.
[681] Yeah.
[682] Yeah, man. They pushed them out ahead.
[683] Yeah.
[684] They even used them as human moats.
[685] Yeah.
[686] They used these people as, like, a human bridge across a moat.
[687] Yeah, they pushed them into the fucking water until they all drowned,
[688] and stacked on top of them and ran over them.
[689] Fuck.
[690] They would take these slaves of villages they conquered and they would just use them as like a human shield.
[691] Dude.
[692] The stuff they did was so insane.
[693] Yeah.
[694] A lot of bad stuff, but weirdly they were very multicultural.
[695] Genghis Khan.
[696] He was.
[697] That was what was cool.
[698] One of the weird things about him is like he, they wouldn't just kill everybody.
[699] They would like, they didn't really, they would just find the smartest people.
and then they would, like, they incorporated smart people. But the people that they couldn't trust, like people that would turn on, generals that would leave and quit, they would kill them. They would say, we'll take you in, then would kill them: we don't trust you. They would roll you up in a carpet and stomp you to death. It was one of the ways you'd execute people back then. Very strange way of executing someone. They ate dinner over people. They put this platform, and they crushed these people to death and ate dinner over their bodies.
[701] Wow.
[702] Like while they were crushing them with the structure that was over them, where they were on top of it eating.
[703] Man, this is like...
[704] Bro!
[705] To me, this shows what happens when someone doesn't have people around them who are like, hey man, this isn't cool.
[706] You know what I mean?
[707] Like, I just want to eat.
[708] I don't think, I don't like your table-crushing-a-person idea.
[709] Because that's what happens, I think. People get into incredible positions of power, and gradually they just get rid of all the people who offer alternatives.
[710] And then all they're left with is like these fucking yes men.
[711] Yeah.
[712] And then that's what leads to the apocalypse.
[713] It's yes men.
[714] It's like you've got this poor son of a bitch in the middle of the thing who's, who's like just an idiot.
[715] Because he has gotten rid of people who offer divergent ideas.
[716] And now all that's left are these people who are terrified, because, dude, like, if you know that if you go up against somebody, they might throw you out a window or stick polonium in you or kill your family, what are you going to do?
[717] You're not going to speak up.
[718] You're like, yeah, no, you're totally right.
[719] It doesn't matter what the person says.
[720] He could be like, I had a dream.
[721] A platinum angel came to me last night.
[722] A beautiful platinum angel.
[723] Platinum angel said small scale nuclear weapons can be used.
[724] in certain ways, and World War III won't start.
[725] You know what I mean?
[726] Jesus Christ talking.
[727] And someone's like, someone's like, ah, are you, so it was a dream.
[728] Oh, my God, it was all a dream.
[729] You know, that's what I'm saying, man. It's like, we, that's the problem.
[730] That's enough motivation for some people to go to war.
[731] Yeah.
[732] There's a certain segment of our population that's so insane.
[733] in their desire to lead people into chaos.
[734] Yeah.
[735] There's got to be people that have believed dreams, and that's caused them to make military decisions.
[736] 100 %?
[737] 100 %, right?
[738] There's a Greek, I can't remember who it was.
[739] I'm reading Herodotus.
[740] If you ever, it's funny, it's the histories.
[741] It sounds like it's, it sounds more academic than it is.
[742] It's really cool, man. He's kind of sarcastic as he's relaying the history of all these various empires.
[743] It makes you feel better.
[744] by the way.
[745] If you're feeling really paranoid and freaked out right now, reading it shows that this is just normal, that the way people have been conducting a state for a long time is endless war, just like what you're saying.
[746] But there is a king who went to an oracle, one of the oracles, maybe all the oracles.
[747] He went to this Oracle, the Oracle at Delphi, which they say was like breathing gas in these caves and stuff.
[748] And anyway, the Oracle said, he was asking if you should go to war.
[749] And the Oracle.
[750] said, if you go to war, a great empire will fall.
[751] And this, it's like... it can mean anything one way or the other, but he took it to mean he would win.
[752] Oh my God.
[753] You know, so yeah, people are making, people base their decisions.
[754] I think we would be terrified if we realized how many of our world leaders are basing their decisions on dreams, on someone in their inner circle that you don't even know about.
[755] You know what I mean?
[756] Somebody who's like, just like, like their little wizard or Merlin.
[757] You know it's hard to wrap your head around.
[758] The people that lived a thousand, two thousand years ago, what they had read about life, what they had read about the universe and mankind and human nature.
[759] We think back on that as being so primitive.
[760] Yeah.
[761] Like that was so primitive.
[762] That was not that long ago, man.
[763] That was not that long ago.
[764] And they're gonna look back at this moment, the same way we look back at that.
[765] We're looking back at these people reading things, written with, like, quill pens.
[766] Yeah.
[767] And that was where they all got their knowledge from.
[768] Limited amount of schooling for most people.
[769] Like, very little understanding.
[770] A lot of superstition.
[771] Yeah.
[772] A lot of beliefs and the gods and the clouds and all kinds of wild shit.
[773] When storms would come, what the fuck is a storm?
[774] No explanation?
[775] Yeah.
[776] Nobody knows what electricity is.
[777] What the fuck are you talking about?
[778] Yeah.
[779] It's a god.
[780] There's gods in the skies.
[781] Yeah, right.
[782] Yeah, man. Yeah, they'll be like, they just let hurricanes hit cities.
[783] They didn't turn the hurricane into energy and store it in a battery.
[784] It took them until 2090 to figure that out.
[785] Oh, that energy they wasted.
[786] Yeah.
[787] And then they thought they were different people.
[788] They didn't recognize it was a dream.
[789] They didn't know about the dream technology you could tap into.
[790] I'm worried about the dream technology.
[791] I'm worried about the, the, the efficacy.
[792] of augmented and virtual reality.
[793] What do you mean, efficacy?
[794] The way people are going to use it, it's going to be so effective, it's going to be so realistic, it's going to tap in, they're going to figure out some way to make it tap in to your real neural network, where they can give you a real feeling of holding like a tomato in your hand, a real feeling of like being outside in the rain.
[795] And for sure, once they do that, that's what people are going to be doing. I want to go hold a tomato. I just wonder what it feels like to hold a tomato in here while I'm getting my dick sucked by like a thousand people at once, and I'm holding a tomato. This is incredible. You know, we'll never know that yet. But yeah, man, I know. I think they're gonna get it so it feels like this. Oh, God, we could have a podcast in it, and it would be just like this. Right, they're gonna get there. If it happens within a hundred years, who knows.
[796] But it's going to happen.
[797] It doesn't seem like it's that far away.
[798] If you compare, like, the invention of the wheel to the invention of hypersonic jets, like, the amount of time that it took.
[799] It's not going to take that much.
[800] Sure.
[801] It's not going to take that much time.
[802] It's going to be some, but it's going to happen.
[803] It already happened.
[804] It just goes in that direction.
[805] I think it already happened.
[806] Do you think so?
[807] If I had to bet, I would say it already, yeah, like, there's no, the probability.
[808] of us being the first thing in the universe to simulate reality perfectly and create a sense that you're in a place fully, that's already been done.
[809] If it's already been done, then we're probably in it.
[810] Yeah.
[811] Because you are going to eventually, like, want to go into the thing, not remembering anything at all.
[812] That's Elon's position.
[813] Yeah.
[814] It's the – I think that's – it's not just his.
[815] It's a lot of, like, mathematicians.
[816] Yeah, it's a mathematical
[817] probability thing that it's based on. Like, it probably already happened, and we didn't have the language. We've always had a way of talking about it: the idea that this is illusion, or, you know, that there's a thing called the kingdom of heaven that is different, you must die to this world to come to know me, as Jesus said; the Bhagavad Gita; and, uh, all of the different references to this reality having an illusory quality.
[818] It's just those symbol sets they use to describe it aren't technological, so they seem primitive.
[819] But the concept of simulation theory has been going on for a while.
[820] It's Gnosticism: that we are in a nefarious simulation, more of a prison than a university.
[821] You know, but yeah, man. So I think, if I had to bet, it already happened. But if it hasn't happened, it not only is going to happen, but if you have any kind of thumbprint on the internet, you will be duplicated. Like, after you die, there will be at least the potential for taking your digital thumbprint, reanimating it in the simulation, giving it an AI that is your exact personality, and then you could just, I don't know, whatever you want to do. Well, at the very least, yeah, they would be able to do an audio podcast of your voice saying anything they wanted to say.
[822] Exactly.
[823] Like someone could be a really clever Duncan Trussell fan and come up with their version of what you would say forever and release a podcast where it's you really fucking saying it.
[824] Yeah, it's fucked up.
[825] It's fucked up because like maybe your family would license that.
[826] Maybe they would go, you know, it'd be nice to have that money.
[827] Yeah.
[828] And Duncan wouldn't care.
[829] I wouldn't care.
[830] And then all of a sudden there's these fake Duncan podcasts.
[831] Dude, what if that's us?
[832] Have you ever considered you were just a Joe Rogan stalker who, like, decided to go into a VR where you think you're you?
[833] Dude, that's so fucked up.
[834] If we both just found out we were stalkers, stalking whatever we used to be in some simulated reality.
[835] Oh, my God.
[836] Again.
[837] What if you in reality have transferred to the next dimension?
[838] And this is just the dream.
[839] The dream is you back in the old life living in this bizarre.
[840] Exactly.
[841] Hodgepodge.
[842] Like strange, aware and unaware, disconnected, but still having some free will.
[843] Yeah, man. An echo.
[844] It's an echo.
[845] You go to sleep every night?
[846] What is that?
[847] Right.
[848] You just blink out.
[849] You just shut off and come back on and you're supposed to assume that your memories are true?
[850] Exactly.
[851] You're supposed to assume.
[852] You were out cold.
[853] You wake up.
[854] And what if every fucking day you're a totally different person?
[855] But every day you're a totally different person with this
[856] bizarre fucking memory of only that person.
[857] That's it.
[858] And you just keep swapping left and right every day.
[859] That's it.
[860] Imagine that.
[861] What if that's life?
[862] Have you heard of Thursdayism?
[863] No. It's a thought experiment which is that the universe started last Thursday.
[864] Sorry to the Thursdayists out there if I fucked this up, but the universe started last Thursday.
[865] Okay, so right now we think that the Big Bang happens.
[866] 13.7 billion years pass and you get planets and the universe as we understand it.
[867] Right?
[868] There's an assumption.
[869] of a kind of evolutionary force that turns things the way they currently are.
[870] But why?
[871] Why is it that way?
[872] Why isn't, if the universe big bangs, couldn't it also just sort of pop into existence last Thursday, populating it with all of us and planting memories into our heads where we feel like we've been here much longer than we have?
[873] Why is that crazier?
[874] What?
[875] Yeah, why would that be crazier?
[876] It's not.
[877] So I think the idea is the universe ends every Thursday, too.
[878] starts again or something. But yeah, why do we think that? That, to me, is really where things get particularly unnerving. It's just this notion of, like, well, because you have all these memories. And by the way, the memories you have are not... let's imagine the universe didn't start last Thursday, or that we don't have memories planted into us. Let's just say we have these natural impressions of things that happened in the past, and from these impressions we've established an identity, and we feel a connection to the past, even though that's all completely gone.
[879] Then also we believe these memories, knowing that generally we can't remember shit.
[880] Like our, you know what I mean?
[881] Like somehow you have these memories where you're like, well, that must be what happened.
[882] Or that must be clear.
[883] It's definitely not the case.
[884] You're not a vault.
[885] You know, your memories are probably distorted at the least, if not completely warped.
[886] If not implanted.
[887] You ever go back to your high school house?
[888] It's the creepiest shit ever, dude.
[889] It's creepy.
[890] It's weird, right?
[891] It's fucked up.
[892] Yes.
[893] Dude, yes.
[894] And it is so funny.
[895] When did you do it?
[896] Why did you mention that?
[897] That's nuts, dude.
[898] I don't know.
[899] I did it once, and it was very different than what I remember.
[900] That's why.
[901] Because I wonder, like, what, you know, especially, like, the way things look.
[902] Like, you see it a second time.
[903] You're like, oh yeah, that's what it really looked like. Yeah, like, I had it in my head all screwy. Yeah, and then you had to go there and see, and you realize, like, even my memories of yesterday are just a blurry slideshow that I could barely put in order. Can't smell your memories, can't hear, can't, like, taste in your memories. Till they put that chip in your head, Duncan, and now you can taste your memories in Technicolor. You become a god. Yeah, well, I mean, that is, I think that is one of the possibilities, is that's what we're in.
[904] That's the I -Robot scenario.
[905] I have become a god.
[906] I can't believe you mentioned the video.
[907] Okay.
[908] I just, the reason that freaked me out is just because a few months ago, I don't know how to explain it, man. I went into like a weird fugue state.
[909] And I drove, fugue state, just like, I'm like, what am I doing?
[910] I drove to, because I wanted to look at the elementary school, where I went to elementary school in North Carolina.
[911] So I don't know why I was doing it.
[912] I drove there, and it's like, oh my God, this is so different than what I remember. And then I drove to where my grandparents' house was, and, you know, obviously, I mean, this is a no-brainer, and I'm like, oh fuck, not only are my grandparents dead, but this place is completely different than what it used to be. Then I drove to my mom's house down this gravel road and was like, holy fucking shit. I saw a person on the porch with a hose, and I'm like, that's not your mom, that's somebody living in your mom's house. Oh my God. You know, that house that she built, that meant something. It was like this big breakthrough, because, like, I realized, like, oh fuck, man, I moved back to North Carolina to, like, try to find my mom in some weird grief thing that I didn't even realize I was doing, you know? And then it's like, oh shit, mom's not here. Your mom died. This isn't the house. You can't go back in that house, and you can't go back to that school, and your grandparents aren't here.
[913] Nobody's here.
[914] You know, nothing's here.
[915] It might as well be a different dimension compared to what it was or another planet or something.
[916] It was really quite liberating because it like cut through all the sentimentality and was a, you know, the truth.
[917] It's like, yeah, that shit's long gone.
[918] Just forget it.
[919] It's long gone.
[920] You can't hold on to that.
[921] The attempt to hold on to it is really going to be very painful, to try to keep
[922] that alive in your mind when it's just gone.
[923] Yeah.
[924] That's a lot of processing power, man. That's a lot of processing power.
[925] And if you are mentally challenged and you go back to the house where you grew up and you see other people in it, you might get mad.
[926] Right.
[927] Something's wrong with you.
[928] This was my mom's house.
[929] You get angry.
[930] You're one of them angry dudes.
[931] Yeah, man. It's true.
[932] Memories are so fucking strange.
[933] Well, it's particularly strange in that memories are one of the
[934] ways that people establish an identity that isn't quite there.
[935] That's the weirdest part about it.
[936] It's, you know, this is the, this is one of the things that comes up a bit in Buddhism is this idea of like establishing an identity through preference, memory, to give oneself the illusion of continuity.
[937] You know what I mean?
[938] Like, even though there's arguments... like, when you wake up
[939] and you were gone for eight hours, depending on how long you slept. Maybe you had some dreams, but somewhere in there you weren't there at all. Sometimes I'll just pop out of, I don't know, people call it zoning out. You zone out, so you lose your keys. Where the fuck were you when you lost your keys, if you know what I mean? You weren't there. So you pop out of that. So the memory thing is particularly fascinating in that you realize you're spending all this energy weaving together an identity based on your memories. And then some people get very nostalgic, and they spend all this time trying to recreate, or trying to basically somehow time travel into, the past. You know, like the good old days. But they're gone. The good old days are gone. The good old days are good dead days. Like, you might as well put them in a graveyard, because they're gone, gone. Anyway, yeah, all of that stuff leads to identifying with a self that isn't really quite there.
[940] I mean, it's here, obviously.
[941] Yeah.
[942] But ultimately, in absolute reality, the thing that you think you are, it's gone, beyond, gone, beyond, gone.
[943] It's being dissolved in an infinite void, like a gobstopper that the universe threw in.
[944] It's like you're just being melted down by time, melted down.
[945] And then, but yeah, what you're talking about is, somewhere there is a possibility, prior to the meltdown of dying, where you can realize, oh shit, nothing's really even melting down. There wasn't even anything here at all. Not only that, but the idea that I am me today, but I'm going to go to sleep again, and I'm going to wake up, and I'm going to assume that I'm going to be me tomorrow, because I've been me my whole life, according to my memory, right? But what if that's just what the thing wakes up in every day? Yeah. And the thing wakes up in every day, and tomorrow I'm Gladys, and the next day I'm Hank.
[946] And you just keep waking up in a new being.
[947] Yeah.
[948] And that's what life really is.
[949] And your memory is yours and a bunch of other fucking humans' memories, too, because they were you.
[950] Okay.
[951] They were you for a day.
[952] Okay, yeah.
[953] Could you fucking imagine if every, you're aware right now that you are Duncan Trussell of North Carolina who's lived on this earth for so many years.
[954] And this is where you... because that's in your memory when you woke up this morning. Right before that, you were gone, baby, gone. Yeah. Why are you assuming this is real? Why are you assuming that memory is real? Yeah, why? Maybe this is just the memory you get for this day. Yeah, man. This day, you go through trying to improve upon all the other humans who've been Duncan Trussell for a day. Yeah, yeah. You're like a riverbed, dude. Could you imagine if that's what was really... how... I, you know, I think it's highly probable. Mike Tyson's weed. That's the problem.
[955] Mike Tyson's weed.
[956] Hey, can we pee?
[957] I have to pee so bad.
[958] Yeah, let's pee.
[959] We'll be right back, folks.
[960] We'll be back.
[961] Is this live?
[962] Dude, okay.
[963] So.
[964] And we're back, ladies and gentlemen.
[965] We're back.
[966] I had this fucking dream, Joe, that I woke up on a spaceship.
[967] I remembered that my life was one of the entertainment projections.
[968] that they put into people when they're in, what do they call it, suspended animation. I remembered all this, like, oh right, because of our brains, if our brains aren't being activated when you're taking a long space voyage, it, like, fucks you up. So the ship that we're on has a set number of life modules that it runs into your head while you're going on this trip. And I woke up, and I wasn't supposed to, and I was in some weird room on a ship looking into deep space, and, like, I had this memory of, like, fuck, we've been on the ship for so long. And then I fell back asleep. But yeah, like, maybe the thing is actually for interstellar travelers who are on a long voyage, and the idea is, oh yeah, you put the astronauts on the ship, pump a life into their brain so they think that they're living a life, but the life that they think they're living is training them for whatever job they're going to have on the planet that they're going to colonize.
[969] So we're all in a ship being trained to go onto a planet that we're going to live on and start civilization on.
[970] And that's what our lives are.
[971] They're just projections into our brain during the voyage.
[972] We all live everyone's life on the planet.
[973] There's probably only a certain number of crew members.
[974] So, like, you know, there's an illusion of there maybe being way more people than there are, but there's only a few of us, and we're just sort of being trained up right now for wherever we're headed. It was a weird dream. Jesus, it sounds like a weird dream. But, um, we don't know what consciousness is really, right? I mean, we don't know what it is. We definitely don't know if this is the final state of it. I doubt it is. Yeah, I think, I think we're in the process of transferring our ideas of whatever it means to be human into an electronic system.
[975] It's going to be something that we integrate with, and I think it's going to happen inside our lifetime.
[976] That I'm pretty sure of.
[977] They're going to come up with something.
[978] Yeah.
[979] Because it's a great way to control people.
[980] It's a great way to provide them with entertainment.
[981] It's a great way to keep them absolutely safe because you can track where everybody is.
[982] So nobody ever has to worry about any loved one ever being raped or murdered or killed, which is great.
[983] But the problem is someone's going to have access to that.
[984] One of the things that was going on when Ukraine, the Russian -Ukraine thing started that I thought was crazy that people were saying was Elon Musk should shut all the Teslas off in Russia.
[985] Wow.
[986] I was like, wow, you can do that?
[987] And I thought about it, like, of course you could do that.
[988] Sure.
[989] Of course you could do that.
[990] Like, I am under no illusion when I drive that fucking electric car that someone can't shut that thing off.
[991] Right.
[992] Of course they can.
[993] It's electronic.
[994] No problem.
[995] They know how to do stuff like that.
[996] Or speed it up.
[997] Cut the brakes.
[998] Overheat the engine.
[999] Make it take a sharp right turn when you're going 90 miles per hour down the interstate.
[1000] Could also do all those things too.
[1001] Yeah.
[1002] I mean, this is the big problem.
[1003] The big problem is that a lot of very compassionate people who generally, I think, don't have fascist intent... that's the other part of recognizing everyone is like you. Yeah. That means if you know everyone is you, you can kind of understand what people want, which is, they want to be safe, like what you're saying. They want their loved ones to be safe. Yeah. They want their kids to have food. They want to have a full stomach. They want, like, shelter, just basic shit. They want love. They want love, as cliche as that sounds. They want to feel loved. And so, through that intent, they're wanting to implement certain regulations that, right now, maybe it is the right thing to do. Maybe, yeah, maybe we shouldn't, like, give charismatic people recommending various forms of, like, genocide or apocalypse or self-negation or whatever, maybe you shouldn't give them a bullhorn that can reach the entire planet in a second.
[1004] And so you should be at least allowed to have that thought, right?
[1005] Like, shit, I don't know if that's the right thing to do.
[1006] You know what I mean?
[1007] Like if you saw a very charismatic dude in a park making a very convincing argument for why everyone on the planet should kill themselves, you wouldn't be like, hey, man, do you mind if I project you into the homes of everyone on the planet,
[1008] so they hear your message?
[1009] Right.
[1010] So if you have no censorship, that is what's possible.
[1011] That's what... so, in the compassionate... so from compassion, from wanting our kids to be safe and us to be safe, there's this thing that sounds like censorship, which it is. But the problem is just what you're saying, which is, okay, let's do it. You're right. I don't want that weirdo who's so charismatic that if I listen to him just for a few minutes I'm like, you know what, maybe I should cut my dick off, whatever the fuck, whatever the fuck. I don't want that guy talking to everybody. That was the Heaven's Gate guys. Yeah, they castrated those dudes. They sure did. Yeah, they did. And, like, so that was part of the job, that's part of the thing, you got to cut your dick off to get on the spaceship. I think just your balls. I think you just had to cut your balls. Oh, just your balls? No big deal. I misheard him. That's good news. I think they were castrating them. I think they were removing their desire for sex. Yeah, I think that was a big part of it. Cut off their balls, bro. Imagine how all in you gotta be to cut your balls off. All in.
[1012] And you've got to wear the same Nikes and wait, and there's a spaceship behind the comet, and here we go.
[1013] Yeah.
[1014] And they're, like, this is better than what I'm doing right now.
[1015] This is a better idea.
[1016] Cult members followed leader on castration.
[1017] Yeah, that's it, right?
[1018] They couldn't stop smiling and giggling about the procedure.
[1019] They were excited about it.
[1020] DiAngelo received two videotapes that described the cult members' intentions.
[1021] He went to the cult's rented mansion near San Diego on March 26 and discovered the
[1022] 39 bodies.
[1023] Wow.
[1024] Oh my God.
[1025] So Applewhite decided to get castrated a year ago after two cult members went to Mexico for the procedure.
[1026] Rio DiAngelo told Newsweek, once Applewhite got castrated, five other cultists did the same.
[1027] Holy shit.
[1028] Yeah, man. If you could convince like five people to cut their balls off to get on your spaceship, you know what I mean?
[1029] Like that's a, that's pretty amazing.
[1030] And so, statistically, I would guess, like, what is it, one out of every million people on the planet who tuned into his message would also cut their balls off? So the thing is, like, the problem with the censorship thing is that even if, like, you have the greatest reason for doing it right now, you're making this horrible assumption that whatever your particular regime is, or whatever, you know, cabal or state entity you represent, is going to be permanent, when, if you look at the history of the world,
[1031] It's always one empire toppled by the other.
[1032] So you create the possibility for mass censorship for a good reason.
[1033] Maybe you've discovered like this AI.
[1034] It's really bad.
[1035] It's going to fuck everything up.
[1036] You really don't know what to do other than shut it down or create an algorithm to scan the internet and make it safer.
[1037] Okay, great.
[1038] That was a good reason.
[1039] Your reasoning behind it was fine.
[1040] But then the next dude who gets in there.
[1041] Maybe he wants to just shift the algorithm's knobs a little bit, right?
[1042] So like now it starts censoring people who are promulgating whatever you thought was the height of civilization.
[1043] That's the problem: censorship can go back the other way real quick, or it can go all the way.
[1044] Which is just, no one's allowed to... like, everything gets shut down.
[1045] So I think this is the this is in the conversation of freedom of speech.
[1046] You must consider the reality of how, you know, the political class or the zeitgeist shifts radically sometimes. Like, look what happened in Iran, how quickly that happened, where all of a sudden it goes from being like a democracy to a theocracy. Like, that happens quick. And then once theocracy gets in there, and they have access to whatever gadgets and dials you were using to, like, censor or control, or the bot swarms that we're using, once they have access
[1047] to that, oh my God, you're fucked. You're fucked. So this is the problem. It's like, if you don't have some weird universal freedom of speech, even though your reason behind it probably isn't fascist, it's humane or whatever it is, you're just betting down the line. Yeah, ultimately it's not an option. It's too much power. It's too much power for one person to possess, to silence the other person. Yeah, to take the person who, whether it's a political belief or whatever it is, and just decide, no, no, no, we are going to push you out. We're going to push you out of the conversation. You can't talk at all. And, like, you can just, you can change the way people look at things through what's allowed and not allowed to be discussed. Right. And you're manipulating these things for what, exactly? Because some of them, I know for sure, are done for advertisers. They manipulate things, and, you know, different social media things will take stuff and they'll make it demonetized. Right, we can't make money off of a video that you post. Yes, because it's dealing with a certain subject. Yeah, or you get a strike. Well, why is that? Well, I guarantee you it's because that's what the advertisers want. The advertisers, who are the whole reason why it's profitable, is that they have these ads. They make a shitload of money because they're everywhere. Yeah, I mean, the ads on YouTube videos and, like, all those kind of video, social media platforms that have ads.
[1048] That is an enormous, enormous avenue of revenue.
[1049] Yeah.
[1050] God.
[1051] So they have to be influenced by those people.
[1052] Sure.
[1053] So they must have rules.
[1054] I guarantee you they have rules.
[1055] Like I've had podcast sponsors reach out and they said, you can't have us on a podcast where there's cursing.
[1056] So I don't know if a scientist is going to curse.
[1057] Like I've had scientists on my podcast and they'll say, well, fuck that.
[1058] They'll say funny shit. Yeah, it's a normal thing. They're human. Yeah, I don't know when it's gonna... I'm not gonna not curse. I see what you're saying, though. But they were trying, they were trying to get into the DNA of the podcast in some way, so that you don't use words that are bad. I just think that some people want to advertise on something that's clean. They just decide they want to advertise on something that's clean. Clean. Yeah. That's the craziest shit to me, man, that they still use that word. Or curse, you know what I mean?
[1059] Like you're from Mordor. He's cursed. He's, like, casting spells on me. Yeah, man, I don't know. I think it's networks of state entities and corporations, and who the fuck knows what else, just trying to push the conversation. I don't know, I don't think it's any one centralized entity necessarily, but who the fuck knows, man?
[1060] But I think they do curb dissent through advertiser dollars. It's a smart move.
[1061] I mean, if you want to, like, if you have a thing out there that's going to fuck with your business, what's that thing?
[1062] That thing might be people talking shit about X or talking shit about Y or having a heterodox opinion about this or that.
[1063] You know, there's a lot of things that are outside of the narrative that people like discussing openly and publicly.
[1064] Yeah.
[1065] Well, how do you keep people from doing that?
[1066] You could discourage them from doing that by taking away their ability to make money if they do that.
[1067] Right.
[1068] And most people will get the hint, like, because people will advertise for them.
[1069] They'll say, I can't believe this.
[1070] YouTube demonetized my channel because I talked about Iraq.
[1071] Yeah.
[1072] You know, I talked about the lies with the weapons of mass destruction, whatever the fuck it is.
[1073] Yeah.
[1074] I don't think they're doing it with that, but it's definitely like COVID misinformation or Ivermectin promotion.
[1075] There's certain shit that you do that they'll just, they'll kill your ability to make money off of that, off that video.
[1076] Well, people will find out about that.
[1077] And then they'll tell everybody and they'll make a video, look at you.
[1078] YouTube did.
[1079] And then everybody goes, oh, I don't want to do that.
[1080] And so everybody self-censors, because you don't want to get...
[1081] So it's not, we don't even know where the real cultural balance is.
[1082] It's always being affected by advertiser revenue, always.
[1083] Because even on uncensored shit like YouTube, there's still a consequence to saying wild things, to saying wild shit and having fun, or swearing too much, or just being preposterous, like a lot of comedians get together and do. Like, that is hard to do. That'll get you in a spot where you're not gonna be able to make money off of it. Right. They demonetize you. They could definitely do that on some of those social media platforms. Okay, so this is my thing. Please don't fucking attack me just because this is where I'm currently at in my thinking. All right, fine, attack me. Are you talking to them or me? What? Them. Just because, like, this is where I've been, you know, in my thinking, and I know there's arguments that YouTube and Twitter don't get categorized as private companies anymore.
[1084] But I think that, in the consideration of whatever particular set of ethics or capitalist-based reasons for censoring there may be, it's better that the private company is in control of whatever censorship it is they're doing than that the state steps in and tells a company what they can and can't censor, because now it's gone from a private company censoring to the state saying, you must put this in here.
[1086] Now, again, sometimes if you happen to be in an era where whoever happens to be president is someone you agree with, well, that's great if you're dumb.
[1087] You're like, yeah, it represents my political ideals.
[1088] So that kind of censorship makes sense to me. But, holy shit, are you kidding?
[1089] Four years from now, what happens if the state steps in and is like, hey, you know what?
[1090] You all really need to do more stuff that's anti -abortion.
[1091] You just, you know, there was a host talking about, like, how they think about abortion, like women should have control of their bodies.
[1092] you know, we're going to, I think we, I don't think we can really monetize that anymore.
[1093] You know what I mean?
[1094] Now all of a sudden it goes the other way.
[1095] So to me, it would be better to let private institutions have control, whether we agree with it or not, of however they want to run their ship, than to let the state step in and do it, because you're making the assumption that the state has some uniform quality that doesn't change over time. Meaning, yeah, maybe your team's in control now, but another team might pop in at any second.
[1096] And also if it's a private company, then hopefully shit like what's happening with Elon Musk or another version of it might appear that will, because there's a, what's the name of it, market pressure.
[1097] So maybe a YouTube that isn't censoring that kind of stuff, that isn't shitty or whatever, will appear.
[1098] This is why I think it's better than letting the state step in, even if you agree with the way companies are censoring right now.
[1099] I mean, I think otherwise you're just inviting, like, the real nightmare.
[1100] You know, people are like, George Orwell's 1984.
[1101] No, it's not.
[1102] It's a bunch of fucking private companies who you have decided to do digital sharecropping with.
[1103] You know what I mean?
[1104] Where you've just decided to stake your claim on someone else's land.
[1105] Yeah.
[1106] And then you're like, what the fuck?
[1107] You can't kick me out of your forest?
[1108] I'm one of the hunters here.
[1109] It's like, yes, we can.
[1110] It's my forest.
[1111] Get the fuck out.
[1112] You decided to do digital sharecropping.
[1113] We're all in danger of this, by the way.
[1114] We're all in danger of this.
[1115] But I would rather still have it be like some private land than state land.
[1116] You know, because, obviously, this wouldn't happen right now.
[1117] But, you know, you hear about it in North Korea, they got to put pictures of that motherfucker in their house.
[1118] Like if you go over to someone's house and his picture's not on the wall, it's like, you rat them out.
[1119] You rat them out.
[1120] Yeah.
[1121] They rat people out.
[1122] Right.
[1123] So they've developed a culture of ratting people out.
[1124] Exactly.
[1125] They all turn on each other.
[1126] They turn on each other.
[1127] They get rewards for it.
[1128] So similarly, in a weird way, by not letting private companies censor, you are kind of saying like, you've got to hang this picture on your wall.
[1129] Oh, my God.
[1130] We want the picture on the wall.
[1131] Hang the picture on the wall.
[1132] You know what I mean?
[1133] We don't want the state to say that.
[1134] Oh, my God.
[1135] Yeah, you don't want the state to say that.
[1136] No. Never.
[1137] You're better off with private companies because you could boycott them.
[1138] If the state says that, you're stuck with the state.
[1139] They're not going to fire themselves because they don't like you.
Exactly. They're gonna keep, they're gonna ramp up whatever things they were doing. You know, like when you see someone who's violated the law, and then they're getting arrested, and the cop gets so aggressive he beats the shit out of them? We've all seen that, right? Yes. Well, when that's happening, that person thinks that something has been done to them. You know, like, you're fighting, you're trying to get away from them, and they're taking it personally. Yeah. It's a law. Your job is to enforce a law, but you're thinking of it as if that person did something to you. Right. We do that with everything. Right. Yeah, we do that with everything. Yeah. We get our ideas attached to who we are. Yeah, with fucking everything. With arresting someone, with everything. Yes. Yeah, man. We just have to be very careful, and, like, I think in our thinking about freedom of speech, too. Isn't censorship speech? Yeah, boycotting is speech. Yes. So if a company decides that they don't want whatever the fuck it is, that is the private company's speech. And I know a lot of people are like, it's not a private company anymore. The argument is, listen, these fucking companies were built using state-sponsored stuff, or things that taxpayers paid for. The whole network of the thing is running off of an infrastructure supported by taxpayers, meaning it's not quite private. This isn't a club. We're not talking about, like, a private club or a private group of people. It's like a public forum that is being supported by taxpayers. That's the argument. But still, no matter what, you'd be arguing for the state to step in, right, to private companies, and tell them what they can and cannot censor, what they can and cannot abbreviate, cut, whatever. And I think that is just, it's just dangerous, you know? But no, I think you're dead right, because, as we said, you can't boycott the state.
[1141] Like, it becomes real messy trying to get them to let go of power.
[1142] Whereas if a company, you decide, like, you don't like this company's values.
[1143] You don't like what they stand for.
[1144] You could just leave.
[1145] You get to stop doing business with them.
[1146] Right.
[1147] Like, you know, this thing that's going on with Disney right now?
[1148] Like, Disney's stock is falling, and many people are attributing.
[1149] I don't know if this is accurate, because I don't know anything about finance, just to be clear, but people are attributing it to the woke stance that Disney has taken on a lot of issues, and that people are just tired of hearing this from corporations. And, you know, there were some videos where people were upset at some of the ideas that people who are executives at Disney had, and apparently it's had an effect on the stock market. So people have decided, we don't like... If this is true. Find out if that's true, because people were talking about it that know things about money, and I have no idea what the fuck they were saying, other than I read a couple of article titles. The best article I could find says: Disney faces backlash in Florida amid "Don't Say Gay" controversy. Politicians threaten to strip the company of Mickey Mouse copyright, special tax status for Walt Disney, as parents protest in Orlando. What world are we in? Well, what is the stock, though? How much did it drop? That's what's curious. And what do they think happened? Because this guy was attributing it to woke ideology, that people were getting mad that Disney was actively promoting these values that a lot of Christian folks... The "Don't Say Gay" one in particular, because, like, that one's a weird one, because it doesn't really say don't say gay.
[1150] It really is talking about people talking about sexual preference and gender and gender transitioning and stuff.
[1151] That's the stock over.
[1152] Trying to decide what made something go down, and why, is tough.
[1153] But it's gone down pretty far.
[1154] But how much of that's because of the pandemic?
[1155] It says the past year.
[1156] It could be a lot from the pandemic.
[1157] So if I go to, like, max, or five years even, it's where it was before the pandemic, sort of, even.
[1158] It's like it went way down when the pandemic started and then it went up.
[1159] Oh, so it's like a market correction.
[1160] And it went back down.
[1161] I think Disney's going to be okay.
[1162] It's a market correction.
[1163] You know, the thing about this thing about teaching kids stuff in school, it's like, who's the teacher?
[1164] That's what it really depends upon.
[1165] Like, if you're teaching a kid who is six years old about anything, about how to make good friends, about how to become successful in life, about how to treat other people, I mean, if you're teaching a very small kid about anything, it's important who the person that's teaching it is and how they teach it.
[1166] What are some initial impressions that this kid is going to get from an adult who's in charge other than their parents?
[1167] So you leave a kid who's used to just listening to adults with a new adult, well, we hope they're great with everything.
[1168] So it's not just an issue of like, should that be the person to talk to your child about gender transitioning?
[1169] Should that be the person to talk to your child about politics?
[1170] Should that be the person who talked to your child about anything?
[1171] It's kind of a weird gig.
[1172] You give people this influence over your child.
[1173] You know, and you can get them convinced of a certain political ideology and that the other people are dumb and you're with them all the time.
[1174] And if, like, maybe your dad's an asshole, and you have a psychology teacher who's really cool.
[1175] Yeah.
[1176] And you go there and, hey, it's Mr. Johnson.
[1177] He's fucking awesome.
[1178] Yeah.
[1179] And Mr. Johnson has ideas that are very different than your parents' ideas about everything, about all kinds of different factors of life.
[1180] Is that okay?
[1181] Dude.
[1182] It's okay, right?
[1183] Isn't it good?
[1184] Here's what's...
[1185] But where's it not good?
[1186] It's not good when the...
[1187] People get weird when it comes to gender.
[1188] It's not good when what they're saying doesn't reflect your ideology.
[1189] And this has been...
[1190] Because I think they're catching on to something that's been going on for a long time, which is if you're sending your kids to a public school, you're sending your kid to some kind of state facility.
[1191] That's not conspiracy theory.
[1192] It's a state facility.
They're employees of the state, working at the facility. Those employees of the state have been told certain things that they're supposed to teach your children, and many of those things are either complete fabrications, or are, like, cutting out big parts of history that don't sit well with whoever happens to be in power. Right. So this is the thing. We have to face the fact that public school systems have always been, and will always be, to some degree, as long as the state has anything to say about what teachers are teaching, indoctrination facilities.
[1194] I mean, think about it.
[1195] When you go to fucking school and you take that career test?
[1196] Did you ever take the career test?
[1197] It tells you what career you would be good for?
[1198] I don't believe I did.
[1199] Because all the careers suck.
[1200] So you take this career test and like, you'd be a good accountant or you would be, you know what I mean?
And you're like, I don't want to live if these are the options for my ability to express myself in the world. You're saying, like, there's, like, four options. Four options, and I don't want to be any of them. And so, but this is part of the programming, right? They want you to think that your mode of expression is limited to capitalist, uh, structures at the time. Which, by the way, a lot of the shit they taught, like typewriting. I had to take typewriting classes, you remember that? On a typewriter. Learn how to use a fucking typewriter. You couldn't use a calculator during a math test, because they wanted you to use your mind. They don't do that anymore.
[1202] You can use your phone.
[1203] You could use your fucking phone.
[1204] Kids are Googling shit in the middle of tests.
[1205] I guarantee you.
[1206] There's got to be a classroom out there where they're letting you use your phone.
[1207] I wish we could talk to a kid right now that's, like, a freshman in college.
[1208] Do you think they let them have their phones out?
[1209] I don't know.
[1210] I think it depends.
[1211] I think sometimes they're allowed, sometimes they're not.
[1212] I think sometimes you're allowed to use technology.
[1213] Not during test though, right?
[1214] You can't use your, you can't Google.
[1215] answers during tests.
[1216] No, I imagine, no. I bet you can.
[1217] What?
[1218] They could also give you time limits, though, so if you spend five minutes looking up every answer, you're never going to finish the test.
[1219] That's your fault.
[1220] You think you can Google?
[1221] You should study so you know what to do.
[1222] But if they let you Google, then the Google would also give you the flat -out answer.
[1223] For sure.
[1224] But even like during the pandemic, they weren't going in class.
[1225] They started to give them tests.
[1226] They couldn't stop them from looking shit up.
[1227] Right.
[1228] You couldn't stop them from having a second laptop right next to it.
[1229] For sure.
[1230] I mean, yeah.
[1231] Because the kids a lot of times.
[1232] They're smart.
[1233] They'll come up with ways to cheat.
[1234] They figure a way out for sure.
[1235] Yeah, man. I mean, it's weird.
[1236] I think there should be the ability to hear the, like, you know, like nanny cams, but for the class.
[1237] In other words, if you're a parent, you want to listen to what the teacher is telling your fucking kids, you could tune in and listen.
[1238] I think right now that doesn't happen.
[1239] And I think it should be a parent's right to be able to listen to what the teacher is saying, It's like, you know, you don't know what the fuck they're saying in there.
[1240] That's the thing.
[1241] Like, you know what I mean?
[1242] You think it's like gender stuff to be worried about.
[1243] How do you know?
[1244] I know my teachers were telling.
[1245] Communism.
[1246] Communism or like garrarchy.
[1247] My algebra teacher and I got in a big argument in class once over whether or not the snake in the Garden of Eden had vocal cords.
[1248] Why the fuck was my algebra teacher talking about the snake in the Garden of Eden to us?
[1249] You know what I mean?
[1250] Like, what the fuck are you doing?
[1251] Separation of church and state, friend.
[1252] You're here to teach us, like, x2, parentheses, three minus six, equals nine, not the Garden of Eden. But you're doing it.
[1253] You're doing it.
[1254] Right.
[1255] In the middle of class. And you're mad at me because I pointed out, which I think is a very valid point: what would it sound like without vocal cords?
[1256] How does it talk without vocal cords?
[1257] What would it sound like?
[1258] Anyway, to me, we're dealing with something that is bigger.
[1259] Right now, these homophobic fucks, they are.
[1260] You have to admit it.
[1261] A lot of them are very, they're Christian.
[1262] Look, I'm sorry for calling fundamentalist Christians who are against gay people homophobic fucks.
[1263] But that's what you are.
[1264] When you say a lot of people, you mean the people that are boycotting Disney?
[1265] I'm saying that there is a concerted effort by the Christian right to get rid of the problematic separation of church and state.
[1267] That is an organized thing that's been going on for some time.
[1268] They have a set agenda.
[1269] They want to get rid of abortion.
[1270] They want to get rid of the separation of church and state.
[1271] And a lot of the stuff that they say that they're doing is disguised as, like, you know, kind of sweet or whatever, like prayer in school.
[1272] We should have prayer in school.
[1273] But the problem is it's like you got a lot of different religions in these fucking classrooms.
[1274] So which prayer do you do?
[1275] Which prayer do you not do? Separation of church and state fixes all of that. Can I stop you? Yeah. Who's "they"? And where have they written all this out? Like, this is, like, an outlined agenda? Yeah. It's like, so, okay, so, like, there are the megachurches, you know? Like, there are these megachurches that don't pay taxes, like the Joel Osteen type. Dude, Joel Osteen. There's a great documentary on one of them, where Justin Bieber's pastor, he was at a park and, like, just started hitting on this woman. And it's amazing, because, like, this is, like, a famous pastor who's sending videos shirtless in his car to someone he's cheating on his wife with, like, I'm nine miles away, you know, I might just stop by. Like, real creepy shit like that. So these megachurches, they're not being taxed, and they're putting on these massive... It's amazing. Wow, great scam. Well, and again, this is where I think they're fucking up, because it's like, you want your scam to keep going, you need to stay the fuck out of politics.
[1276] Because that was the idea.
[1277] Y'all worship God, outside, transcendent to the human realm.
[1278] But you sure as fuck can't put your weird theocratic, sex negative, homophobic, fucking ideas into society.
[1279] Or we're going to go back to the witch burnings.
[1280] This is the problem.
[1281] So this is what you're seeing is like, you know, you can look it up, man. I can't remember the name of the organization, but there's a lot of lobbying groups that have direct connections to the mega churches, which have a set agenda because they believe we are in the kingdom of Satan, that the Antichrist is coming, and that they need to stop that from happening by implementing Christian ethical systems into the world.
[1282] Now, I love Christian ethical systems, and I think that there's something really beautiful about it, but we can't get rid of the separation of church and state because you are rolling the dice there on what religion takes the wheel.
[1283] Again, it just goes back to you have to understand a lot of the shit that we have in the United States might not be palatable to you now because it doesn't reflect your ideologies, but it's there because we don't want there to be a perma shift to one way of being.
[1284] Right.
[1285] You know, so yes, there is an organized group of fundamentalist Christians who want to establish what could only be called a theocracy in the West.
[1286] And how are they trying to do this?
[1287] Lobbying.
[1288] I think that what you're, when you're seeing that, you're seeing something of it.
[1289] When you're seeing the getting rid of Roe versus Wade, this has been a long -term plan of theirs, man. Like this is not like, don't you watch the God's Not Dead movies?
[1290] Have you watched these movies?
[1291] No. Oh, they're the best.
[1292] Me and my wife watched them.
[1293] Oh, wait a minute.
[1294] I remember a trailer for one that almost seemed like parody.
[1295] Dude, I hate to do this.
[1296] I have to pee again.
[1297] That's fine.
[1298] Way too much fluid.
[1299] That was only like an hour from the last pee though, right?
[1300] Less than an hour. Do you have a pee timer on there? 30 minutes. I drank so much liquid. No, I'm glad. I just realized we're in black robes talking about, like, Christian fundamentalists taking over. I'm gonna pee. We'll come back, like, one more time. All right, cool. Oh, God's not dead, he's surely alive. This is a newer... this is a trailer. I love these movies. What year is this one? This one came out last year. Let's face it, your God, your book, are in the way. You feel that you're making a last... That's a really good actor. I get sad when I see a really good actor in these. I'm like, this is a fine film. Our whole faith started because one man chose a hill he was willing to die on. God's Not Dead, the next chapter. It seems like parody. I've watched every one of these. Once we decide what a child needs to know, it becomes imperative that every child know it. Just the other day...
[1301] I'm here to review your homeschooling environment.
[1302] Religion has been removed from our schools.
[1303] They're teaching kids that they don't need God.
[1304] If your children do not show up in school a week from Monday.
[1305] That's the lady from Fox News.
[1306] She's the judge.
[1307] Judge Judy.
[1308] No, that's not Judy.
[1309] Judge Janine.
[1310] She doesn't want her parents going to jail.
[1311] Okay, I can't.
[1312] It hurts myself.
[1313] They're going to extract it.
[1314] Oh my God, it's only halfway there.
[1315] There's no fucking way.
[1316] Dude, this movie is, this movie, it exemplifies what these people who are political lobbyists, that's the narrative.
[1317] The narrative is this, the Antichrist is rising and it's part of the plan of the Antichrist.
[1318] We're going to eliminate the belief in God from school systems, eliminate prayer, and then create a hyper -materialistic, hypnotized culture, a satanic culture, essentially, that is devoid of God.
[1319] So that's what they're afraid of.
[1320] And if you imagine that that were real, you could see why they're so passionate about, like, trying to get into the government to change things.
[1321] The truth is, that's not what's happening.
[1322] What's happening is there's too many religions.
[1323] There's just too many religions to decide this is the type of prayer we do or this is the type of God we believe in and not be a theocracy.
[1324] How do you do that?
[1325] This is what the Satanic Temple is so good at doing. They're like, okay, you're going to put the Ten Commandments in front of your government building, that we're all paying taxes to support? Then we should be allowed to put our Baphomet statue in front of the building, too!
[1327] You know what I mean?
[1328] They're really doing that.
[1329] It's really quite brilliant because they are fully aware of the dangers of theocracy.
[1330] It's like, you know, Christianity is so beautiful, and that's how we're raising our children.
[1331] I love Jesus, and that's, like... and I'm Buddhist, and I have a guru. But the way we're trying to talk about love and mysticism is through Jesus.
[1332] But I don't want my kid to go to like a state -run facility where some teacher is teaching my kid about Christianity from their own lens.
[1333] You know what I mean?
[1334] That the state is saying has to be taught.
[1335] That is scary, dude.
[1336] That's the handmaid's tale.
[1337] You know what I mean?
[1338] That's what that is.
[1339] So we got to keep them separate.
[1340] No matter what, at all costs.
Separate church and state. Separate church and state. Keep religion out of the public schools, but certainly don't prohibit the students from adhering to any specific faith, because, you know, that would be incredibly fucked up. Yeah, if they brought prayer to school but only had Christian prayer, that could be a real issue. It is a real issue for a lot of people, right? It's not even the largest world religion. But I mean, even if they let you opt out, if they didn't provide alternatives... Like, if they did provide alternatives, how many would they have to provide, right?
[1342] They'd have to provide an Islamic one.
[1343] They'd have to provide maybe a Mormon one in some places.
[1344] Like, how many different ones?
[1345] How many different services?
[1346] A Thelemite.
[1347] Scientologist?
[1348] Is that real?
[1349] I mean, who's to say what's real?
[1350] Yeah, exactly.
[1351] So this is why...
[1352] That's where it gets crazy.
[1353] That's where it gets crazy.
[1354] That's why we have separation of church and state.
[1355] That's why we have a deal with these groups, which is, listen, you don't have to pay taxes, meaning you and I, we're not connected in any way, shape, or form.
[1357] I'm not getting money from you.
[1358] You're your own free entity, your autonomous thing, but you stay over there.
[1359] We're going to stay over here.
[1360] And I'm not going to try to get into your shit.
[1361] You don't try to get into my shit.
[1362] And then we have a nice non -theocracy for better or for worse.
[1363] I mean, honestly, I do think that it's wonderful to pray.
[1364] And kids should be encouraged to pray.
[1365] Anyone should pray.
[1366] It's the best thing ever.
[1367] I mean, what's the worst case scenario?
[1368] You're wrong.
[1369] It didn't work.
[1370] Okay, so what?
[1371] So you said some stupid words to the void and nothing happened?
[1372] I think it's a kind of form of meditation, too.
[1373] Yes.
[1374] It can be.
[1375] Do you pray?
[1376] I don't, I don't pray.
[1377] No. All right.
[1378] I do a lot of meditating, though.
[1379] Yeah.
[1380] I do a lot of meditating.
[1381] I've been meditating a lot inside a sauna.
[1382] I find that that's like some of the most interesting meditation for me. Because it's like meditation in the middle of like a kind of a suffering.
[1383] Yeah, right.
[1384] Not an intense suffering, but an uncomfortable suffering.
And the suffering is kind of enhanced when you take big, deep breaths. So I do these breathing exercises in the sauna, and I get into, like, this meditative state. That's cool. Very interesting. Yeah, because if I can just concentrate on my breathing, like, if I can just get through the first 30 seconds, I can get into a nice, good rhythm, or I can kind of stay in it, and stay just thinking about the breathing, and occasionally drift out of it. Yeah. Yeah, man, that's cool. I'm glad you're doing that. I mean, I would also have kids meditate.
[1386] I mean, if I was running schools, it would be totally different.
[1387] I think kids should meditate, and I think kids should do a fun physical activity.
[1388] You find a physical activity that's fun, you know, whatever it is.
[1389] I think it would be awesome if kids had a lot of options for different, like maybe one kid can do, you know, skateboarding, one kid can do martial arts, one kid, like, give them something, you want to get them active, give them a reward.
[1390] Like, what are they actually interested in?
[1391] Right.
[1392] If they're actually interested in all.
All these different things, give them options. Yeah, man. Remember the, do you remember the presidential physical fitness test that you had to do? Jesus fucking Christ. I predated that, I believe. Oh my God, you missed it. It was humiliating, dude. But, like, how many chin-ups did you have to do? Any amount. Any amount was no good for me. So, you know, there's a spectrum of physical prowess in schools, and let's say I wasn't at the top end of that spectrum, Joe. So, like, you have to do this presidential test, do a certain number of pull-ups, and if you didn't pass it, you're like, I guess, from the president's perspective, I'm a piece of shit. Like, what's the fucking point? But it was boring, too. Remember that? Boring. Like, the stuff they would make you do in P.E. was boring. Only dodgeball was fun. There wasn't, like, much that was really fun in it. I know what you're saying. It'd be cool to, like, have people associate physical activity with fun, instead of some insane dude in, like, short shorts blowing a whistle in your face.
[1394] You know, that's like, how are you going to like love exercise when that's what you associate it with?
[1395] And there's some sports that other kids have been playing for a long fucking time.
[1396] If you have to play them that sport and you've never played it before, you feel like such a dumbass.
[1397] And when you're a young kid, that is already devastating for your self -esteem.
[1398] So if you're in a class, you're forced to play a game, you have no idea how to play that game.
[1399] There's other kids that have been playing it their whole life.
[1400] Yeah.
[1401] And they're stuffing that ball right in your face.
[1402] Yeah.
[1403] And pitching that fastball by the plate.
[1404] Picking teams.
[1405] Yeah.
[1406] Oh, my God.
[1407] You always got picked last?
[1408] Picked last, yeah, always, eternally.
[1409] So it produces a kind of like outcast, the P .E. outcast.
[1410] The P .E. outcast.
[1411] You just sit on the fucking sidelines and you like glare out at all the athletic kids.
[1412] And that's the pattern you'll probably follow for the rest of your life.
[1413] Well, there's like a weird separation between physical people.
[1414] and non -physical people back then too whereas the non -physical people always assumed they could never be a physical person and the physical person always assumed they couldn't be smart because everybody had these weird stereotypes that they had adopted and accepted for their own even things like that aren't self -serving like you're dumb if you work out a lot or that you're smart if you don't like that if you eschew the aesthetic you don't give a fuck about what your body looks like you're oh so smart so wise that's an old thing that's a weird thing but it's it's a tribal thing it's like all those other tribal things it's all the same shit man it's all people get attached to these ideas they claim them as their own and they fucking defend them to death and then they projected on everybody yeah oh that's where you get into real trouble like if you've made the horrible mistake of assigning intellectual prowess to the way somebody looks you're gonna get fucking gone your whole life you know what I mean because some people are really good that's what's not fair a really hot woman who's way smarter than you why is it not fair it's not fair I would like it would just let her tell you what to do fuck yeah that would be incredible I mean I Duncan could just taste that ball gag right now I'm salivating well again by the way then also though also like I think any self -assess Anytime I found myself looking intellectually down at anybody, woman or man, I'm usually like, what the fuck am I talking about?
[1415] Like when I watch Jeopardy, dude, I can't, I just, it's just like, I don't know, I don't know, I don't know, I don't know, I don't know.
[1416] You know what I be?
[1417] Like, so anytime I find myself like, huh, that dullard, it's like, what the fuck do I know?
[1418] This is what I like about not being able to smell from COVID.
[1421] You still can't smell?
[1422] And now I can say, well, I'm brain damaged.
[1423] Ooh.
[1424] You see what I mean?
[1425] Like, I'm actually a victim.
[1426] I have brain damage.
[1427] So it's like now.
[1428] It's sinus damage, though, right?
[1429] Well, they think it could be brain damage.
[1430] Really?
[1431] Yeah.
[1432] What do they think it could be?
[1433] What are they thinking?
[1434] Joe.
[1435] Hey.
[1436] I have brain damage.
[1437] Yeah, I get it.
[1438] I don't know the answer to that.
[1439] Have you tried NAD drips?
[1440] Not yet.
[1441] No?
[1442] You know, supposedly that helps.
[1443] I've been...
[1444] I should say, supposedly that helps some people.
[1445] I will eventually get around to trying it.
[1446] Long-term smell loss in COVID-19 tied to brain damage. Patients with an altered sense of smell have significantly more axon and microvasculopathy damage in the brain's olfactory tissue versus non-COVID patients.
[1447] These new findings from a post-mortem study may explain long-term loss of smell in some patients with the virus. You know who's got the longest I've ever heard of? Ryan Sickler. Still no smell. Eighteen months, man. Eighteen months, no smell. None. None. Like, he doesn't smell shit. Oh, he can, like, barely smell anything. Something has to be, like, super strong for him to smell it at all. I have the phantom smell. So, like, my brain doesn't know what... It's basically, I think it's kind of like, I don't know, like if you had a keyboard and one of the keys was fused, like the fuse blew or something. So it's like my brain thinks piss, shit, oatmeal, coffee, and body odor all smell the same.
[1448] So it's all the identical smell.
[1449] It's like it's just referring to this like nah, ah, it doesn't know.
[1450] But every once in a while, it comes back.
[1451] Like every once in a while, my smell's back.
[1452] And I'm so used to being able to smell.
[1453] I won't even think about it until I'm like, oh, fuck, I smell my house.
[1454] Oh, fuck, I can smell.
[1455] And then it goes away.
[1456] But they say.
[1457] Really?
[1458] So it comes and goes.
[1459] Well, they say the fact.
[1460] that it comes and goes is a good sign.
[1461] Like that means that very slowly it could be that it's, the neurons are growing back or something.
[1462] But yeah, man, it's fucked up.
[1463] It's like a, it's a really creepy thing to suddenly have your sense of smell distorted.
[1464] What a strange fucking disease this thing is.
[1465] Yeah.
[1466] I mean, when history is written and people talk about this hundreds of years from now, it's gonna be a very, very hotly debated time.
[1467] Like what?
[1468] What was that like?
[1469] You know, I know one thing they're going to say.
[1470] It definitely wasn't made in a lab.
[1471] Completely natural.
[1472] Definitely not whipped up in a fucking pharmaceutical company or wherever it was made.
[1473] It was completely natural, normal.
[1474] I mean, this is one thing where if you have had it and experienced what it's like, this is where I get annoyed when people are like, it's a cold.
[1475] It's not quite a fucking cold.
[1476] This thing, like... I mean, again, I am a long-term psychedelic user.
[1477] So I think the thing they're calling brain fog, people would say, oh, it's brain fog.
[1478] But when I had COVID, I felt high as a kite, man. Like, I felt weird, like a weird psychedelic quality to it that was like nothing else I'd experienced.
[1479] It was so alien and bizarre.
[1480] And like, and the dreams I was having were so.
[1481] strange. And, you know, I think Occam's razor... Well, that's because it was eating your brain a little bit, and so the signals your brain was sending were like, something's eating me. Or it's a life form. Yeah, it's a little sentient. So it's a virus. If it's a virus that was manipulated in a lab... I mean, what is a virus? It's not technically a life form, but it needs a life form as a host, and it can take over that life form and kill it and spread to other life forms. And ideally, it likes to keep the thing alive so it can pass itself on.
[1482] Yeah.
[1483] But what is that?
[1484] We say, is that alive?
[1485] Because it kind of seems alive.
[1486] So imagine if it is alive.
[1487] Imagine if it's not just infecting your body, but it's also changing you.
[1488] Yeah.
[1489] It's making you behave the way it wants.
[1490] Yeah, right.
[1491] It's breaking you down.
[1492] That's fucking scary.
[1493] Think about rabies.
[1494] Imagine if people were getting rabies.
[1495] Like if rabies was, like, a real zombie disease?
[1496] I mean, people do get rabies, obviously, and it's deadly for most people.
[1497] If they don't treat it, like, really quickly, it's one of the most deadly diseases if untreated.
[1498] But imagine if that was like a real problem, like a zombie thing where people were, because rabies is basically like a zombie thing.
[1499] If you see a dog that's got rabies, they're trying to get to you for no fucking reason.
[1500] Well, the reason is because that rabies wants to spread.
[1501] Right.
[1502] So it tricks a squirrel into becoming an assassin.
[1503] The squirrel jumps on you and bites your fucking head.
[1504] Yeah, it's a wild-ass disease. And if it spread to humans in the sense that we could keep it and give it to other people, and it made us bite each other... Like, I don't know. I know it makes people super thirsty. But imagine if there was a version of rabies that didn't kill you very quickly, but really did turn you into a zombie.
[1505] That's not far removed.
[1506] No. From what is currently available.
[1507] Right.
[1508] So that's only like one or two generations removed from some of the crazy diseases that people have now.
[1509] Why not?
[1510] Why not?
[1511] You see it in animals.
[1512] If it exists in a rat, you have rabid rats that'll attack you.
[1513] Yeah.
[1514] But have you ever seen a rat do you think might have rabies?
[1515] They all seem like they have it.
[1516] I've seen animals that seem like they might have rabies.
[1517] It's fucking terrifying.
[1518] They don't have any fear of you.
[1519] No. Oh, yeah.
[1520] I know what you mean.
[1521] Like I've seen squirrels like that.
[1522] Oh.
[1523] I've seen these, like, hyper-aggressive squirrels that seem like they're approaching you like they're about to bite.
[1524] I saw a rat outside a pool hall once in New Jersey and it stood up on its hind legs like, let's fucking go.
[1525] Looking at me and I was like, oh my God.
[1526] Like it was showing me its teeth and it stood up on its back legs.
[1527] It was like this big.
[1528] Dude, terrifying.
[1529] Big fat ass.
[1530] I was like, nope, not going near you.
[1531] Just wants to inject a neurotoxin into you.
[1532] It's so.
[1533] spooky, dude. I mean, this is, like, you know, this shit with, like, all the aggression on planes right now. I get worried that the people beating the fuck out of each other... that's the effect of, like, these people have been infected with this thing that's altering them, making them more aggressive. Or, well, anxiety. First off, what is this? Rabies kills 199 people every day. Here's why you never hear about it: the disease is preventable and treatable, but fighting it is not a priority for the West.
[1534] What I'm saying, though, is that, like, people don't get it and behave like an animal that has rabies.
[1535] People die of it.
[1536] I know that.
[1537] But I don't think they get it.
[1538] It doesn't have the effect that it does on animals.
[1539] Because when animals have it, they want to bite you.
[1540] This could be one of those, like, television shows that's making a little dramatic thing, but...
[1541] It's showing people and crazy...
[1542] This seems, like, dramatized, though.
[1543] Oh.
[1544] That's definitely dramatized.
[1545] No, bro.
[1546] She's not acting.
[1547] No one's that good.
[1548] Find that woman, imagine?
[1549] That's her.
[1550] But that's what they're saying, is that people acting crazy have rabies. Like, I don't know the situation.
[1551] I mean, listen, if it does do that to rats and other animals, why wouldn't it jump to people?
[1552] We know people get infected by it.
[1553] Maybe it's a small percentage of the people act like a rabid animal and go around trying to bite people.
[1554] Yeah, man. That's that fucking 28 Days Later movie, man. That movie scared the fuck out of me, because I think that's how quick it could go down, if there was something like that that just spread through the population.
[1555] That is one of the best horror movies of all time, 28 Days Later. It's incredible. God damn, what a good movie. Dude, that movie was so fucking good. And that was the one... Wasn't that one of the first running-zombie movies? Fast zombies. Fast zombies. That was cool. That was the better version of zombies. They were way scarier. Everything was so urgent. And they did a lot of, like, fast camera moves in that movie. That movie was a great fucking movie. Yeah.
[1556] And it was also like Jesus Christ, like this is like it wasn't preposterous.
[1557] There was no leap that you had to make in order to believe that this could be real.
[1558] Right.
[1559] Dude, this... and again, I don't mean to keep going back to this, but there's an assumption that this hasn't already happened.
[1560] Like if you look at the way we're behaving on the planet, it's not rabies.
[1561] Like I'm not attacking you because I'm trying to inject you with something and I'm not afraid of water.
[1562] But there are people on the planet who are afraid, who are phobic of ideas, groups of people, sexual preference, you know, and who'll kill you, kill you, if that's what it takes.
[1563] So it's like, I think it's an important question to ask, like, isn't that person kind of infected with something that's fucked up?
[1564] Like, maybe it's not rabies, but it's at the very least some kind of like memetic parasite, right?
[1565] Like living inside of people that is creating as one of the solutions to problems.
[1566] Like this is one of our solutions to problems on the planet is to launch missiles into cities.
[1567] This is one of the ways we have conversations with each other.
[1568] It is beyond insane.
[1569] Like, if we are all the same person, why, why?
[1570] Why, what kind of crazy thing launches missiles as it itself?
[1571] Well, I mean, if humans are fundamentally good, which I like to imagine, that's the case, then what the fuck has poisoned them to make it seem normal to shoot missiles into cities?
[1572] Right.
[1573] You know what I mean?
[1574] Like, you would think... because all that would have to happen is everyone just realizes that, or the majority of people who are the missile launchers just stop.
[1575] Nothing's going to happen.
[1576] If you all stop, what are they going to do?
[1577] They can't make all of you shoot the missiles, right?
[1578] Then we have world peace.
[1579] Because the problem isn't like the crazy ass dude in the tower.
[1580] It's all the people who are taking that person seriously.
[1581] Why are you taking that person seriously?
[1582] The crazy person who's telling you to go kill other people.
[1583] Like, yeah, we look at Heaven's Gate and we're like, God, can you imagine being so insane that you would cut off your own balls?
[1584] But can you imagine being so insane that you listen to a billionaire who tells you, yeah, you need to go into this country and launch missiles into the other country, you might get your head blown off and being like, okay, all right, I'll go do it.
[1585] And I'm not talking about one country over another.
[1586] Right, any country.
[1587] All of them, man. All of them.
[1588] Like, really?
[1589] Like, what is the difference?
between that and Heaven's Gate, except that one has more money.
[1591] What's the difference?
[1592] Both of them are promulgating crazy fucking ideas.
[1593] One of them is saying that's the bad guy and we're the good guy and the other one is saying that's the bad guy and we're the good guy.
[1594] This is all absolutely true and I agree with everything you're saying.
[1595] However, if you woke up in this day and you realize that you have this problem and you need to figure out a solution to it, it's not as simple as well, we're not just going to engage militarily because China's still going to be China and Russia's still Russia and Iran's still going to be Iran and there's a perpetual war game that's been in motion for decades and decades and although you're right it would be wholly irresponsible if you didn't pay attention to what's going on in the world and prepare for bad people right because then it's like you have to if they were heard like if all of a sudden by some rotten bit of justice somehow this podcast gets out and everyone in America and everyone puts down the their arms, in their guns, how long before we got invaded?
[1596] The Russians will invade immediately.
[1597] 30 minutes.
[1598] This is the problem.
[1599] This is where it all falls apart.
[1600] It does, but it doesn't.
[1601] What's interesting is how armed this country is.
[1602] This is an exceptionally armed country.
[1603] It's really weird when you think about it.
[1604] I was watching this ad for the NRA the other day and I was thinking like, it is wild how many guns there are in relationship to people.
[1605] There's more guns than there are people.
[1606] Like, there's probably no other place like that on earth, where if you do invade, someone's going to shoot you.
[1607] Right.
[1608] Right.
[1609] It's like the amount of people that are like firearms owners in this country is bonkers.
[1610] Yeah.
[1611] Yeah.
[1612] I mean, I know.
[1613] My dad had a fucking arsenal.
[1614] You should have seen all the weapons he had when he died.
[1615] It was insane.
[1616] I kept finding guns.
[1617] I was like, you look under a pile of clothes, a pistol.
[1618] Look over here... like, he had so many guns. And he loved guns, and, like, he liked to shoot guns. It was fun. We... that's... Big Happy wrote a song about him. And yeah. Yeah, my dad was actually friends with Gurg. Was he? No. It's a great song. Imagine if that was about your dad. I just... Look, man, you could have held on to that a little longer. I was with you. I would have bought into it. I'm not going to let you believe that about my dad. I'm not going to do that to my dad's memory. But this is the problem: a lot of our experience with guns, if you grow up in the South, is, like, familial memories that are really warm and sweet. I remember, like, this is one of my big connections with my dad, how strict he was with us about fucking guns, man. Like, he wouldn't let us point guns at each other. You'd never point a gun at anybody. Strict, strict, strict, strict, strict rules about it. Probably pretty smart. But also just, like, the joy of being out in Texas shooting fucking bottles, watching them blow up. Never was there, like, some insinuation of, like, we're going to use these to kill a bunch of people. It was just fun, like, hearing things explode and things break. So yeah, man. Isn't this the problem of, like, creating a stigma around something? Which is, like, look, wouldn't it be better for your kids to, like, have basic gun safety and understand that they're just, like, a tool like anything else, and this is how you hold one and what you do with it and why you shouldn't have one and all that stuff, than to, like, ignore that this technology exists? And imagine the 3D printers in five or ten years. Even though they're already doing it, aren't they just going to get better and better and better at printing out these fucking things? Yeah. You know, they're actually legal. That's what's crazy, the 3D-printed guns. Yeah, they're called ghost guns. There was just... I want to make sure this is true, because I was listening to Colion Noir talking about it, and he was talking about how the Biden administration had been talking about ghost guns.
[1619] I didn't even know what the term meant, but apparently it means a gun you made yourself, and you're, like, legally allowed to make guns yourself.
[1620] You can order the parts.
[1621] Yeah.
[1622] Or you could be a blacksmith, I guess.
[1623] I guess.
[1624] I don't know what the specifics are but he was talking about this and they call those guns ghost guns.
[1625] And I'm like, oh my God.
[1626] Like how many ghost guns are there?
[1627] Like, if we know how many people have guns and how many guns there are... do we know how many ghost guns there are? Right. Is that just guessing? Well, again, this is the winter-is-coming thing, which is, like, okay, fine, have all the regulations you want, and some of them I really agree with, but give me a fucking break. In a few years, ten years, twenty years... come on, man. Guns are going to be the least of your fucking problems, man. The least of your problems once CRISPR technology falls into the hands of, like, the consumer. You know, once we can just cook up whatever... I mean, it's a dream of mine to have some kind of, like, chemistry microwave where you can just, like, type "ketamine" and it makes some for you. That's probably going to happen. Oh, fuck yeah, it is. You'd have to buy the elements. Yeah, mix them up in a cauldron somewhere. It's definitely going to happen. I mean, this is the thing McKenna would talk about, which is that the amount of time between what you can think you want and that thing coming into existence is eventually going to be zero. Oh, wow. So through technology we could just pop whatever we want into existence, or maybe through the metaverse or whatever. Right. So, to me, the effort, when it comes to that form of regulation, is... like, I get it, but I think over time it's like, all right, you're going to have to start regulating a lot of stuff. Like, you're going to have, like, assholes who want to shoot their own satellites into space. You know what I mean? Right. You would? I would, if I could. Fuck yeah. If you could go to Best Buy and get a little satellite that you could shoot into space, and you could have the Duncan Trussell show available only through satellite? Exactly. Of course I would do that. You'd be your own server. Yeah, totally self-reliant. Dude, this is the world we're headed towards. You think, like... like, how many people would be putting swarms of satellites up there? Oh my God. Right. On the moon?
[1628] Well, that's what I said about clones or robots.
[1629] Like if somebody makes a robot clone of them, they download themselves into another body.
[1630] Why would they only do it once?
[1631] Yeah.
[1632] What if you got a lot of money?
[1633] What if you're just some crazy billionaire dude who likes to drive it around a convertible Rolls -Royce with fancy sunglasses on?
[1634] What if you make a hundred of yourself?
[1635] And then like next week, you see them waving at each other at stoplights.
[1636] You're like, oh my God.
[1637] What have I done?
[1638] And they're all a unique individual that's allowed to live its own life.
[1639] Yeah.
[1640] There you go.
[1641] And it woke up with a full set of memories.
[1642] That's what happened.
[1643] God made man in his image.
[1644] That's what happened.
[1645] God cloned himself.
[1646] Once they do that, once they give you digital memories, and they're so much better, you can rewind them, which is really good for security.
[1647] Because sometimes people have two different stories, Duncan.
[1648] And it would be better if I saw if that guy pulled his gun first.
[1649] I need to know what happened.
[1650] Did he say something threatening to you or did you just hit him in the head with a bat?
[1651] Like, let's go back and watch and we'll be able to review your memories.
[1652] What a nightmare.
[1653] So it won't just be like an eyewitness account.
[1654] It'll be like HD 4K.
[1655] Look at that.
[1656] Dude.
[1657] Disaster.
[1658] Yeah, you were drunk on tequila and you hit a guy in the head with a bat.
[1659] What about just when you're telling your friends, that story that you like to kind of embellish a little bit?
[1660] And your friends are like, hey, do you mind doing a memory projection while you tell that story?
[1661] And they're like, shut the fuck up.
[1662] Oh, my God.
[1663] You didn't do that.
[1664] That would be so good to find out if someone's telling the truth, though.
[1665] Yeah.
[1666] If you really trusted someone, no, I don't need to see your memories.
[1667] We're good.
[1668] Oh my God. Or, like... That's going to happen. That phrase will 100% be uttered, dude. "Did you go through my memories? Did you just go through my fucking memories, dude?" Like, you know, like how people go through their phones. "Dude, you didn't just go through my fucking memories? Oh my God, are you fucking kidding me? That's not where you were on Friday night. What's this, dude? I knew you had other ideas. Yeah, you were out there being naughty." Yeah. I mean, it's really interesting, right? There will be no, like, secrets within 20 years. There will be zero secrets. You won't even be able to keep a thought secret within 20 years. Right. Because they'll figure out a way just to, like, read it, just to do it, man. Just to know what the neural energy is inside of your brain, pick it up, yeah, and then translate it. Anyone driving by... That's... oh my God. It's coming.
[1669] Tinfoil.
[1670] That's when truly people are going to have to start wearing stupid helmets.
[1671] Imagine if it really works.
[1672] It's not tinfoil, but it's one of those, what are those things called?
[1673] The things you put your phone into. A Faraday cage.
[1674] Yeah.
[1675] It's like a Faraday cage.
[1676] So that it is, a tinfoil hat.
[1677] Instead of a tinfoil hat, it's a Faraday hat.
[1678] And when you see someone wearing it, you're like, there goes a liar.
[1679] Look at them with their fucking liar's helmet on.
[1680] They have to wear that so we don't know what they're really thinking.
[1681] But what if the hackers get a hold of that and make, like, influential programs that go into your fucking memories and change memories?
[1682] Like, well, we have these digital memories, but we also have filters.
[1683] Shut the fuck up.
[1684] Someone already made one of these.
[1685] This is the best day of their life, by the way.
[1686] Faraday cap.
[1687] Oh, my God, they have one.
[1688] Lambs Faraday cap.
[1689] Wave-stopper technology, 99% UV and wireless radiation block.
[1690] Why do I think it's hot that she would believe that?
[1691] Right?
[1692] a pretty girl that has a Faraday hat on?
[1693] Because it's cool.
[1694] Like, yeah, she's so crazy.
[1695] She believes the government's trying to wire her brain.
[1696] I like it.
[1697] It's fucking cool.
[1698] Yeah.
[1699] That's like a little Red Riding Hood.
[1700] I love that.
[1701] Oh, this is on Amazon.
[1702] It's only 60 bucks.
[1703] Wow.
[1704] Radiation protection.
[1705] I'm going to wear that on stage from now on.
[1706] That's my new look.
[1707] Yeah, but what happens?
[1708] You start wearing that.
[1709] Next podcast.
[1710] We're wearing those.
[1711] And all of a sudden, we're way.
[1712] Order those, please.
[1713] Order those.
[1714] We're way more articulate.
[1715] Imagine if it works.
[1716] Like our thoughts clear up.
[1717] What if all it's all it?
[1718] Then you just feel more relaxed.
[1719] And you realize, oh, my God, it's electronic interception.
[1720] Just like bees.
[1721] Like how it's fucking up bees.
[1722] Yeah.
[1723] You know, they think cell phone signals are fucking bees up.
[1724] Yeah.
[1725] They think it's like a constant jackhammer.
[1726] They're like, whoa, like this shit is not supposed to be out here.
[1727] They're supposed to be out communicating somehow or another.
[1728] They don't exactly know.
[1729] That's so creepy, man. That's so sad.
[1730] What's so funny?
[1731] I'm not going to say it.
[1732] We'd say it.
[1733] There's a beanie.
[1734] A beanie?
[1735] Why is that so funny?
[1736] You just thought it was that funny?
[1737] Yeah.
[1738] Jamie, did you get a contact high for us?
[1739] You know, Tim Pool wears a beanie all the time.
[1740] Maybe he's out of the game.
[1741] Maybe we get Tim Pool a new beanie.
[1742] Against 5G, cell towers, smart meters and Wi -Fi.
[1743] Maybe that's what it is.
[1744] Maybe he's ahead of the game.
[1745] Keep your family protected.
[1746] You should keep your family protected with a beanie.
[1747] Got to get your fucking kids to wear beanies.
[1748] Faraday Beanie.
[1749] Yeah, man. Oh, we got to get a beanie.
[1750] Well, we should definitely get a hood.
[1751] Well, if you got a beanie.
[1752] J.R.E. Faraday cages.
[1753] It's called a very touch and go.
[1754] Brain coat or something to which seems like that.
[1755] Shielding for your mind.
[1756] You're a knight of the digital war.
[1757] Look, there you go.
[1758] Oh, that's a Faraday cage?
[1759] Oh, for hunting.
[1760] That way I could really be one with nature without any influence by the middle.
[1761] Avoid government spying.
[1762] Avoid government spying.
[1763] And look, you can keep your phone in that pocket right near your ear.
[1764] The earmuff looks like a cell phone pocket.
[1765] Doesn't that look like a pocket?
[1766] Just answer your phone from right there.
[1767] Hello, Duncan speaking.
[1768] There's like a little hole where the microphone is at the bottom.
[1769] Oh, God, I love it, man. Actually, there's a whole industry out there for these things.
[1770] Of course there is.
[1771] That's pretty fascinating.
[1772] I didn't realize that.
[1773] Let's get ready for Faraday Cage underwear because I know those are out there too.
[1774] I mean, that makes more sense to me. A dick sheath?
[1775] Something to keep the EMF off your dick.
[1776] Yeah, like a jock strap.
[1777] some kind of a harness.
[1778] Yeah, 100%.
[1779] That makes more sense than the helmet.
[1780] Oh, they have it.
[1781] There they go.
[1782] There you go.
[1783] It's already there.
[1784] Where do we got here?
[1785] All of it.
[1786] Those are Faraday cage.
[1787] Yep, Faraday underwear.
[1788] That's for your girl.
[1789] You give that to your girl.
[1790] So, baby.
[1791] It's the same company.
[1792] Lambs.
[1793] They're all over it.
[1794] That's crazy, man. You said, baby, this pussy can't be on an open network.
[1795] We've got to close you to that digital signal.
[1796] I'm going to put a Faraday cage over that pussy.
[1797] Anti-radiation underwear.
[1798] Anti-radiation underwear.
[1799] So that, I mean, I don't know if that's a bad idea.
[1800] As a testicular cancer survivor, I think it's a good idea.
[1801] Do you think that the radiation that you get from cell phones gives people cancer or could?
[1802] No, don't answer that because we're on a podcast.
[1803] What do you mean?
[1804] We don't know.
[1805] I'm like, don't answer that.
[1806] It's a terrible question to answer.
[1807] But I think you're allowed to go, "I don't know," and then say, "maybe it's this," right?
[1808] Like, we don't, clearly we don't know.
[1809] I don't know, but I know that, like, before I got testicular cancer, I played a lot of fucking World of Warcraft on a laptop right on my dick. I know that. Jesus Christ. Is that what did it? Who knows. But I would imagine keeping, like, a powerful thing that's radiating energy over your balls is probably not the best idea. Right, that's not the best idea. No. Try to get a little separation there. I don't think that's crazy or anything. Just the heat on your balls. I mean, those things get hot, don't they? Yeah, and you would play, like, forever. Forever. Forever, just roasting my balls. Oh my God. Dude. Yeah, I mean, you know, again, I don't know for sure, but you can't... that's the problem, you can't really... Yeah, no one knows. You don't know the reason. But yeah, I think that makes sense, to keep Faraday cage underwear on. Yeah. But it might also be a heat thing, too. I mean, it can't be good, heating your balls up like that with that thing sitting... No, what am I saying? I get in a sauna every day.
[1811] The fuck am I talking about.
[1812] Yeah, but the sauna energy isn't coming from like, like porn.
[1813] It's not fueled by fucking porn.
[1814] Imagine the irony if you got ball cancer from jerking off to porn.
[1815] Oh, my God.
[1816] The computer, the laptop.
[1817] But that doesn't even make sense because, like, how would you have it sit there?
[1818] You can't.
[1819] You'd have to, like, that'd be a weird computer.
[1820] There'd have to be a hole in the middle of the computer.
[1821] You just jizz all over the place.
[1822] They'd have a computer shaped like a fucking life preserver. You just reach in and whack off. It's just a computer just for whacking off. Yes. And getting cancer. Like, you have to specifically work at it. Yeah. So no, I think it's more like people who sit in bed with the computer over their genitals. Well, they have that thing that people do use in bed sometimes. It's, like, a little laptop table, you know? Yeah. You know, like breakfast-in-bed tables. It's kind of like one of them jammies.
[1823] Yeah, fuck that.
[1824] But they have a little laptop table that sit in bed.
[1825] Too lazy for that.
[1826] I'm not going to go get my laptop table when I want to look at my computer in bed.
[1827] I think the future people that you were talking about, that is going to be one of the things where they're like, yeah, they just put the computers right on their dicks.
[1828] They didn't even think it could be bad for them somehow.
[1829] Just slowly microwaving your balls.
[1830] Yeah.
[1831] I mean, surely that can't be, I mean, that can't be good.
[1832] Can't be good.
[1833] Yeah.
[1834] But it's funny, like, people don't tell you when you're doing something.
[1835] They don't even know, like, with long-term exposure to laptops. Fucking how long have laptops been around?
[1836] How long have laptops been sitting over dicks?
[1837] That's not...
[1838] Not long.
[1839] That's not a lot of years.
[1840] Not long in the course of human history.
[1841] I wonder if they could actually track.
[1842] I wonder if anyone's done this.
[1843] Track actual laptop use, like sitting it on your lap, and some form of testicular cancer.
[1844] I wonder if they've done that.
[1845] I don't know.
[1846] I mean, I don't know.
[1847] Jamie, can you Google that?
[1848] Long -term laptop use.
[1849] I mean, it's literally called a laptop.
[1850] Testicular cancer.
[1851] You're supposed to put it in your lap.
[1852] Exactly.
[1853] The name invites you to fry your dick.
[1854] That's exactly what it does.
[1855] It invites you to fry your dick.
[1856] Yeah, just put it on your lap.
[1857] It's just easier.
[1858] Is it true that if you rest your laptop on your lap, you could get... "Current scientific evidence indicates there's no link between using a portable laptop computer and cancer. Most of the theories about laptops and cancer relate to heat, electromagnetic radiation, or radiation from wireless networks."
[1859] Huh.
[1860] Well, that's good news.
[1861] How do they know that?
[1862] Because things don't give cancer to everybody.
[1863] That's what's crazy about things that give people cancer.
[1864] There's people who smoke cigarettes their whole fucking lives.
[1865] No cancer.
[1866] And then there's people that work in a bar and, you know, they get secondhand smoke and they die young.
[1867] Both those things happen.
[1868] It can lead to certain types of cancer.
[1869] Oh, here it goes.
[1870] Six reasons to never place your laptop on your lap.
[1871] It can lead to certain types of cancer.
[1872] Swiss researchers, Dr. Andres, blah, blah, blah.
[1873] Click on that link.
[1874] The first one was from cancer .org.
[1875] The second one was from Vestech.
[1876] Oh, you can't trust those Australians.
[1877] They're beaten down over there.
[1878] The government got them over a log.
[1879] This random blog, I don't know, is better information.
[1880] Much better.
[1881] Look at that lady's legs.
[1882] It's so important.
[1883] What happened?
[1884] You appreciate your laptop on your lap.
[1885] Laptops are one of the most common productivity tools today.
[1886] They're portable, convenient, and powerful.
[1887] Contrary to its name, however, a laptop does not belong on your lap unless you want to expose yourself to harmful electromagnetic frequency radiation.
[1888] Laptops emit EMFs in many different frequencies.
[1889] These EMFs can be extremely harmful to your health.
[1890] Your vital organs also get an unhealthy dose of electromagnetic radiation from your laptop computer if you make a habit of actually putting it on your lap.
[1891] How many people read that while it was on their dick?
[1892] What?
[1893] It's funny.
[1894] A lot, a lot.
[1895] A lot.
[1896] Okay. To our knowledge, this is the first study to evaluate the direct impact of laptop use on human spermatozoa.
[1897] Ex vivo exposure of human spermatozoa to a wireless internet-connected laptop decreased motility and induced DNA fragmentation.
[1898] Jesus.
[1899] By a non -thermal effect.
[1900] Non -thermal.
[1901] I mean, it's not because of heat.
[1902] Wow.
[1903] We speculate that keeping a laptop connected wirelessly to the internet on the lap near the testes may result in decreased male fertility.
[1904] Fuck.
[1905] Holy shit.
[1906] It can lead to certain types of cancer.
[1907] First of all, I don't know if these guys are right.
[1908] Yeah, we don't know what this is.
[1909] Anyone could have.
[1910] I'm just going to be reading this out for the rest of the podcast.
[1911] Dude, it's terrifying.
[1912] It's terrifying.
[1913] That's what you were thinking?
[1914] Tell me. Let me know.
[1915] Imagine somebody had to write that.
[1916] That's what really sucks.
[1917] Yeah.
[1918] It's, uh, I mean, imagine we find out that a lot of things we're doing.
[1919] are fucking us up.
[1920] Imagine 5G winds up not being bad, but 6G is like, 6G starts fucking with your head.
[1921] Like if they keep going, is there going to be a frequency?
[1922] Like, hey, guys, we just came out with 13G and good news, no more need for phones.
[1923] We've got it, we're getting it right into your DNA.
[1924] Well, I think that if they found that out, we would have a cool, it would be a weapon, right?
[1925] Like, it would be, like, if they knew that you could send out a frequency that really fried people's brain, then that would be a new weapon that people would use.
[1926] Like, what are they calling it?
[1927] The Cuban thing where people hear crickets.
[1928] And then they have brain damage.
[1929] They hear like the chirping of like cicadas or something.
[1930] And then suddenly for the rest of their lives, they have neurological damage.
[1931] And no one knows what it is yet.
[1932] Some people say it's a, it's not real.
[1933] It's hysteria.
[1934] Some people say it's like some new weapon that they're blasting it.
[1935] I think they're pretty sure something is actually happening now.
[1936] I think they're pretty sure.
[1937] From the last thing that I glanced at, they were saying before that they were thinking that some of it may have been people making things up or they thought people were exaggerating or what else did they say?
[1938] What were the other possibilities?
[1939] I just read that.
[1940] That it was like, I don't know.
[1941] Like it's hysterical.
[1942] It's hysterics.
[1943] It's people, you know.
[1944] Mass hysteria or something.
[1945] Hysteria. And it was more than one person, right? Yeah. How many people supposedly have it? I don't know. But now, do they think it's real now? So what do they call it? Havana syndrome. Havana syndrome. What is the latest opinion on Havana syndrome? Because whatever it was, what they were really terrified of was that someone was going to be able to just fuck with people's heads from a distance. So you could point it at the president while he's giving a speech. You could point it at a race car driver in the middle of a turn. You could do whatever you want. And if that really came to be, where you could just fuck with someone's head from a distance... Like, hey man, how is it possible for them to send radio signals through the sky? How is it possible for them to send cable signals through the ground, and satellite video, and stuff that comes from your phone to another phone? You don't think it's possible that they could just send some sort of a pulse that directly connects with whatever the fuck it is that allows you, whether it's hearing, whether it's something, where they can pinpoint an organ in your body and irritate it, and not just your head?
[1946] What if there's something they can do that can impact your heart?
[1947] Your heart, like, there's a reason why people's heart stop when they get electrocuted, right?
[1948] Yeah.
[1949] Something happens to them and their body freezes up and it stops beating.
[1950] What if they can just fuck with that a little bit?
[1951] Well, the heart, they did.
[1952] The CIA made a heart attack gun.
[1953] That's real.
[1954] What if they don't even have to have a gun?
[1955] Like, remember when Tesla was trying to send electricity through the air?
[1956] Yeah.
[1957] What if they could figure out how to send electricity through the air and give you a fucking heart attack where you're sitting at the bar?
[1958] We'd be in trouble.
[1959] Could you imagine if electricity worked like a drone? You knew where you would send it, you just put the coordinates into a thing, and it goes right through the fucking wall and zaps someone in the head.
[1960] And they just dropped dead on the spot.
[1961] Can you imagine that weapon?
[1962] How crazy that would be just.
[1963] You go right through buildings.
[1964] A whole city.
[1965] Everyone just dies.
[1966] The infrastructure remains.
[1967] You just have to go in and scoop them out.
[1968] Can you imagine a guy's on a date with his wife.
[1969] and the wife's ex-husband kills both of them with electricity. Fuck. Can't imagine. That's fucked up. Rival football teams kill each other with electricity. Scary. Life is cheap. Anybody can just electrocute, dude. People from other countries electrocute people on the other side of the border. Suddenly, like, the number of followers you have on Twitter becomes very scary. No, just because, like, that many people are aware of you. What is the statistical probability that one of them will be like, why don't I just fire an electrical beam at Rogan?
[1970] Imagine a technology that just allowed you to direct a lethal electrical beam anywhere in the world.
[1971] Yeah, from a satellite.
[1972] Yeah.
[1973] You want to talk about mutually assured destruction and people being nice because, you know, like at any point in time, someone could just end you with electricity.
[1974] What do you think would be the average age people would make it to?
[1975] I think five.
[1976] I think we'd have a nation of five -year -olds and everybody else would be electrocuted.
[1977] And five-year-olds would be toasting kids for fucking stealing their spaghetti.
[1978] Yeah.
[1979] Toasting kids for using their big wheel too long.
[1980] Fuck you.
[1981] Bam.
[1982] Billy, you got to not kill people.
[1983] You're not going to have any friends.
[1984] Yeah.
[1985] They'll be like long distances between houses.
[1986] Very few people left.
[1987] Dude, it would be, just forget it.
[1988] Like after one fucking night, after one night of Hearthstone, you know, like after one night of any of these games people play, so many people would just kill you if they could.
[1989] At least with a gun, you can hide.
[1990] You can go behind a wall.
[1991] Imagine if electricity could go through a building and move like a homing pigeon, just right to the spot, knew exactly where you were. You just, just, I know what your number is, you're number 39. Zap, 39, and just fuck him. And you send it, and then you listen to it. It would be so fucked up, man, if people could just kill people like that with the press of a button. There'd be literally nobody left. It would just be people dropping like flies, and you'd want to do somebody before they did you, and people would just start killing each other with electric bolts.
[1992] That's a very interesting.
[1993] That's like a twilight zone where everybody wakes up.
[1994] Yeah, a black mirror.
[1995] With the ability to kill anyone they wanted to.
[1996] At any time.
[1997] At any time.
[1998] With a press of a button.
[1999] With electrical bolt comes out of the sky and just cooks them.
[2000] Gets through buildings, finds them, specifically targets them.
[2001] That's cool.
[2002] That episode, it should start with someone walking on the sidewalk.
[2003] And all of a sudden, people around them just start dropping.
[2004] Burst into a powder.
[2005] What the fuck's going on?
[2006] Yeah, just fully
[2007] electrocuted, and burst into flames, and dropped into cinders.
[2008] If you could do that, how many people would not kill people?
[2009] It would be a real problem because anybody could kill anybody.
[2010] That's it.
[2011] So people would start killing people.
[2012] When people would start disappearing, you'd want other people to disappear.
[2013] And you'd want to figure out how to hide from, people would try to like make up some elaborate shelters to hide from the electricity.
[2014] But the real problem of it is, let's imagine that actually humans are more compassionate than we're thinking they are right now.
[2015] You really just need one person who decides to kill everybody else.
[2016] You know what I mean?
[2017] Like that's the, this is the problem to me. This is where we're headed, man. This is where we're headed.
[2018] Full steam ahead to a point where our technologies get better and better and better so that it's no longer school shootings.
[2019] It's no longer someone gunning people down in a cell. It's some as-of-yet nonexistent technology that someone gets access to and just wipes out entire cities.
[2020] Right.
[2021] Vaporizes.
[2022] Vaporizes people.
[2023] I mean, this is like, this is the scary thing about the world that we're in right now is that if people will do mass shootings, they're only doing, they're using the gun, not necessarily because they want to, but because that's the best thing they have on hand to kill people with.
[2024] Right.
[2025] So as soon as, like, people are able to genetically engineer viruses, or people are able to, you know, send their bot swarms, their nanobot swarms, out into the neighborhoods and just recalibrate people's DNA, so that, you know, instead of a mass shooting, it's just a bunch of people on a subway who just melt, because someone sent nanobots into the subway to just destroy them.
[2026] I mean, this is like, this is what we're facing as a species.
[2027] is the inevitable creation of something that's more accessible than most weapons of mass destruction are now.
[2028] And then what are we going to do?
[2029] Right.
[2030] It's like cell phones used to be very hard to get.
[2031] Now everybody has them.
[2032] And they're tiny.
[2033] Nuclear weapons were very difficult to acquire.
[2034] Yeah.
[2035] And they kept the lid on that for a few hundred years.
[2036] Yeah.
[2037] Until it just became nuclear apps on your phone.
[2038] Yeah, dude.
[2039] That's what I'm talking about.
[2040] Panel says some Havana syndrome cause cases may stem from radio energy.
[2041] A group of experts found that not all injuries to diplomats and CIA officers could be explained by stress or psychosomatic reactions.
[2042] There you go.
[2043] So that's what they were calling it at first, psychosomatic.
[2044] They were saying that they were just imagining that they had these issues.
[2045] But what are the issues that they're coming up with?
[2046] The panel's conclusions also undercut the arguments of some outside experts that mass hysteria, stress,
[2047] or psychosomatic reactions were the cause of the incidents.
[2048] The panel found that stress reactions could have contributed to ongoing symptoms, but that no so -called functional illness or mass hysteria could explain the initial injuries in the cases that were the focus of its investigation.
[2049] Fuck.
[2050] Yeah.
[2051] God damn it.
[2052] This is the last thing we need.
[2053] It said, listen to this: intelligence officials briefed on the panel's findings did not say how many cases it had focused on, although they said between 10 and 20 victims were interviewed.
[2054] They said the panel focused on cases in which victims heard a strange sound or felt pressure and then experienced a loss of balance and ear pain.
[2055] In addition, the panel focused on cases in which the victims reported the sounds as coming from a specific location.
[2056] Whoa.
[2057] So spooky.
[2058] Whoa.
[2059] That's wild.
[2060] Ultrasound could have caused some of the injuries.
[2061] The panel identified one potential cause, what's called pulsed electromagnetic energy, particularly in the radio frequency range, also known as directed energy.
[2062] Wow.
[2063] Fucking spooky, man. That's wild shit, dude.
[2064] They can just fuck with your brain to...
[2065] Turn you off.
[2066] Now, if they can do that, why can't they do that with an electric bolt that kills you?
[2067] I bet they can.
[2068] They could.
[2069] I bet they will be able to.
[2070] That's the scary thing.
[2071] Whatever way you could think of to kill people, they will eventually invent it.
[2072] Think of before the gun.
[2073] It was pretty hard to kill someone.
[2074] Now you can kill someone a half a mile away.
[2075] You had to use a rock.
[2076] You had to use a sling.
[2077] You're going to get close.
[2078] A bow.
[2079] You got to risk them killing you.
[2080] You know, at first it was no weapons.
[2081] You had to kill them with your bare hands and beat them to death.
[2082] And then it became, you had to fight things off with sticks and pointed things and weapons.
[2083] Then it became shoot them at a distance with an atlatl, and then a bow and arrow, and throw a spear, and all the other different things that people figured out before they figured out guns.
[2084] Right.
[2085] Now that everybody figured out guns, the smallest child can end your life with a mere pull of a finger.
[2086] There's no strength impediment.
[2087] Everyone is created entirely equal.
[2088] Right.
[2089] when it comes to the impact.
[2090] Like, it's a showstopper.
[2091] If he has a gun, you don't have a gun, he's five, you're dead.
[2092] Right.
[2093] Yeah.
[2094] That's how it goes.
[2095] Yeah, that's great.
[2096] That was, it's definitely a new thing.
[2097] Because, like, you could see a five -year -old pick up a rock or whatever.
[2098] What are you going to do?
[2099] So that's new within a few hundred years, right?
[2100] And with pistols and revolvers, it's new from the 1800s.
[2101] I mean, they credit the Colt revolver with being one of the ways that the cowboys fought off the Comanches, when they first started adopting these revolvers that had more than one bullet in them. Because it used to be, with a musket,
[2102] you had to pack that fucker down and reload.
[2103] And by the time he did that, the Native Americans were on you and you were dead.
[2104] Yeah.
[2105] So they figured that out.
[2106] So that changed the entire West.
[2107] Yeah.
[2108] That changed where people could settle and live.
[2109] That changed how they fought war with the Comanche.
[2110] Well, when they did that, they changed the course of history.
[2111] So if it goes from that to everybody can have a gun, like these constitutional carry states, there's like 23 of them now where anybody can have a gun at any time.
[2112] There's more guns than there are people.
[2113] Yeah, you were telling me. Your Uber driver.
[2114] Think of that and imagine if that's the case with something really crazy, like a drone with a warhead.
[2115] Yeah.
[2116] Yeah, man. Really spooky.
[2117] Yeah.
[2118] Drones with warheads that operate off tour so you can't track them.
[2119] Sure.
[2120] They're using virtual private networks.
[2121] Bugs.
[2122] Just bugs.
[2123] No one pays attention to bugs.
[2124] You know, like, this is the tiny little beetles that are watching.
[2125] Like, you don't know if one of those has gotten into your fucking house.
[2126] Like, right now, when someone wants to scope out your property, which I hate, you know, when all of a sudden a fucking drone just appears?
[2127] Who is that?
[2128] Are you allowed to shoot those down?
[2129] I think it depends on the height of the drone or the, I think, like.
[2130] Really?
[2131] I'm not positive.
[2132] I think there's a height considered to be on your property.
[2133] Oh, interesting.
[2134] And there's a height that's no longer on your property.
[2135] Otherwise, people could shoot planes down.
[2136] It was in my yard.
[2137] Stay off my land.
[2138] Yeah, that's true, right?
[2139] Oh, that makes sense.
[2140] Oh, that makes sense.
[2141] But, you know, the drones are going to get smaller and smaller and smaller.
[2142] They look like flies now.
[2143] Yeah, exactly.
[2144] Have you seen those?
[2145] No. I haven't seen those.
[2146] Look at that.
[2147] They have tiny drones that have fly like wings.
[2148] Yeah, there you go.
[2149] Yeah.
[2150] I mean, they're going to, when battery technology and software become more efficient, they're going to be able to have these things fly around and look like, look, like that.
[2151] Yeah.
[2152] Why can't it look like a bee?
[2153] Looks like that.
[2154] Look at that.
[2155] What is that?
[2156] That's a real one though.
[2157] That's a real bug.
[2158] That's a real bug.
[2159] That's a real bug.
[2160] Which, by the way, it's not a real drone?
[2161] If it's not, then they fucking made a really good one.
[2162] But by the way, I mean, let's forget about drones and how fucking cool are dragonflies.
[2163] Cool as fuck.
[2164] Look at that thing.
[2165] That's a living organism on planet Earth.
[2166] Isn't it?
[2167] It's one of the oldest, I believe.
[2168] Is it really?
[2169] Yeah.
[2170] It looks like it is.
[2171] I mean, that is a dope-looking insect.
[2173] Things been around for a long time.
[2174] They're so cool.
[2175] Like they're not threatening.
[2176] They pose no danger to us.
[2177] And they're like super imposing and cool looking and they're big and they don't taste good.
[2178] Have you eaten them?
[2179] I'm just saying if they did they wouldn't be around as much.
[2180] Well, they look like they'd be poisonous.
[2181] Yellow and black.
[2182] They look like don't eat me. Like almost like a danger warning.
[2183] Yeah.
[2184] Yeah.
[2185] They look really old, and yeah, like that kind of stuff, man. It's really weird when you see old design.
[2186] Old design.
[2187] Studebakers.
[2188] Studebaker bodies.
[2189] Yeah, man. It's crazy to see that or sharks.
[2190] How about an alligator gar?
[2191] Oh, those are fucking creepy, dude.
[2192] That is straight-up the most prehistoric thing that's alive today.
[2193] If you looked at it, the way they look with their teeth, pull up a photo of an alligator gar.
[2194] It looks like something from the Jurassic.
[2195] It doesn't look like something that's supposed to be here right now.
[2196] I think they're millions of years old.
[2197] I don't know how many millions of years old, but I'm pretty sure that Gar's going to...
[2198] Look at that.
[2199] Look at that fucking thing.
[2200] So spooky.
[2201] Tell me that doesn't seem like something from another era.
[2202] That seems like a dinosaur.
[2203] That's fucking teeth.
[2204] I mean, that's what I would assume dinosaur fish were like.
[2205] Or what dinosaur-era fish were like.
[2206] Look at the size of that thing.
[2207] It looks like it's laughing.
[2208] I mean, I think they have its mouth propped open with a stick to show their teeth.
[2209] That's generally what they do. But if you look how big these things are, I mean, some of them grow to like 14 feet long, man. They're fucking huge, dude. And they have these crazy thick, uh, outer coats that you have to cut through with, like, wire snippers. Like, their body is, like, armored, all that shit on the outside. It's super tough, right? So when they, um... look at that sucker. I mean, they really do seem like they're from another time.
[2210] There's a few of those animals that are out here.
[2211] Crocodiles are for sure one.
[2212] Crocodiles.
[2213] For sure, that's one that just seems like it's from another time.
[2214] Or that giant chicken that just popped up on the internet.
[2215] Did you see that thing?
[2216] What?
[2217] Did you see that video of that giant chicken, Jamie?
[2218] It's huge.
[2219] It comes like out of the hen house that it's in.
[2220] It's like giant.
[2221] How big is it?
[2222] It's like probably four feet or something.
[2223] Look at this fucking.
[2224] Oh, I have seen this video.
[2225] Yeah, I have seen this video.
[2226] What the fuck?
[2227] Yeah.
[2228] Yeah.
[2229] It's a specific breed of rooster.
[2230] It's a big rooster.
[2231] Yeah.
[2232] That's, you know, if you don't believe that the dinosaurs turned into birds, that's a dinosaur.
[2233] That's 100 % a dinosaur.
[2234] And sometimes, don't chickens, like, grow fangs or something, like, something from their...
[2235] Hens tooth.
[2236] Hens tooth.
[2237] Yeah.
[2238] It kind of goes back to dinosaur for a second.
[2239] I don't know if that's the origin of it, but it wouldn't surprise me. You know, a lot of animals used to have tusks and all sorts of other things that have retracted and just become like ivories inside their mouth. Yeah, like elk have those. Elk at one point in time used to have tusks. Yeah, I wouldn't, I wouldn't be surprised if that's what a hen's tooth is. Does a hen's tooth, does a hen's tooth look like an actual fang? Have you ever seen them? No. Okay. I think it's actually, oh, is it one of those phrases? As rare as a hen's tooth. No, I think, like, and again, I don't, the reason I think this is because I was probably taking a shit and, like, scrolled by something and read it wrong, but, like, I think sometimes they either grow fangs or, like, they grow, like, some kind of dinosaur...
[2240] God damn it's probably not true.
[2241] For years I've been believing this.
[2242] Some scientists, they forced hen's teeth to grow.
[2243] That's that's what it was.
[2244] You know what I went down a rabbit hole yesterday?
[2245] What?
[2246] Watching Komodo dragons eat monkeys.
[2247] Oh dude.
[2248] Dude.
[2249] Why would you do that to yourself?
[2250] That's so fucked up.
[2251] I watched a video of this monkey that attacked this man and pulled a giant chunk out of his head.
[2252] The monkey pulled the man's skin back and, like, scalped him, like a giant strip.
[2253] A large strip of meat and hair from off his head.
[2254] The monkey just, for no reason, just bit him and pulled his hair back.
[2255] And then I started thinking, man, the monkeys are cunts.
[2256] I'm like, I wonder what eats monkeys.
[2257] And so then I go into this Komodo dragon rabbit hole of Komodo dragons swallowing things whole. Holy fuck, dude. Watch this. I don't want to watch it, man. Oh. Why did you make him watch it? So rude. Was that his skull?
[2258] Yeah, that was his skull. Yeah, it pulled his meat clean off, and that exposed his skull. Like, watch, it keeps going. You see the guy sitting there? No, no. Bro, that's his skull. And that's his, that's his scalp. It looks like it got part of the skull. Is that the skull?
[2259] No, no, that's just the scalp, the inside of his skin. Oh, how crazy is that? Like, just when you thought your day couldn't get worse, a monkey rips your...
[2260] Well, you know, that monkey had decided it was standing on him.
[2261] It was doing whatever it wanted.
[2262] Then it just decided to pull a chunk out of his head.
[2263] They know we're weak.
[2264] They know we're just big, but they don't have any fear of us.
[2265] Dude, when I was in India, I got chased by monkeys.
[2266] Oh, here's one.
[2267] Here's one.
[2268] Komodo dragon caught a monkey slipping.
[2269] They're horrible, dude.
[2270] That is, in my opinion, that's the scariest animal.
[2271] What is this?
[2272] We just took a left.
[2273] But keep it going, keep it rolling.
[2274] Don't shut that off.
[2275] Komodo dragons, killing monkeys.
[2276] This was my yesterday.
[2277] Jesus, Joe.
[2278] You always do this.
[2279] But you need to know that this is a real thing that's happening right now.
[2280] I don't need to know this.
[2281] Everybody does.
[2282] Everybody does.
[2283] Because this is real life.
[2284] This is life.
[2285] Like, there's, like, that's a shoe, someone's shoe.
[2286] That's someone they ate yesterday.
[2287] Look it.
[2288] Fuck that, man. Imagine living in that world.
[2289] You know what?
[2290] There's a Coke bottle there too.
[2291] See that?
[2292] Yeah, I'm, uh, I'm gonna be, um, very, uh, pessimistic.
[2293] I think this is a zoo.
[2294] I was just thinking, I think they set this up.
[2295] Well, great, man. Now I have that in my head.
[2296] That's rough.
[2297] That's so fucked up.
[2298] That's, but that animal is, um, this heartless killing machine that roams this one particular island.
[2299] And their, their, their drool is poisonous.
[2300] Yes.
[2301] The drool is botulism.
[2302] Yeah, they're very dirty. They're nasty, fucked-up creatures. They have, like, venom. They used to think that it was just, I think, what is it?
[2303] They used to think it was venom, and then they decided it was just the toxic bacteria. Or is it a combination of both?
[2304] I feel like they think it's a combination of both. But it bites things and just follows them until they die, because it knows that its nasty saliva would just rot them out. Oh fuck, man. It's dark, dude. There's these videos of them. They'll bite an animal, whether it's a water buffalo, whatever the fuck they eat, and they'll bite it and just follow it for a while, and then eventually it just gets weaker, and they can just... Ah, Jesus Christ. There's something about a lizard doing that, man. It's like, I'd rather get eaten by a lion, because at least a lion has fur, and it's a mammal, and it's got to be terrifying, but at least it's kind of one of my people. You know what I mean?
[2305] Like, a lion is closer to us than a Komodo dragon. Also, and, you know, let me know if I'm wrong about this: isn't the lion's general methodology of killing that you suffocate the thing, right?
[2306] You clamp down on its neck so it suffocates it.
[2307] Yes.
[2308] Versus these creatures where it seems like they'll just start anywhere.
[2309] They'll start on your leg, move up to your...
[2310] Most of the time they go for the guts.
[2311] Oh, God.
[2312] Because guts are easy and guts incapacitate, you know, an antelope or anything that they're killing.
[2313] Do you think this is, I like to...
[2314] Go for the back legs, too.
[2315] Do you think that something kicks in when you're being eaten by an animal that puts you into like a dream state?
[2316] Like, you know what I mean?
[2317] Where you just kind of like you give up, you're done.
[2318] Because we've been, if you're alive on the planet, whatever your DNA is, it's been getting eaten by shit for a long time.
[2319] Like our ancestors, those little lemur things they say we came from, you know, the, or if it's monkeys or whatever, in our DNA, built into the DNA must be.
[2320] something like, all right, here's what you do when there's no more hope and you're just being eaten alive. I'm sure your blood, the brain, floods with love. Yeah, psychedelic chemicals. Yeah, probably trip balls while that lion's eating you. Yeah, suddenly you're like, oh, I'm part of everything now. I was talking about that in one of my specials. There was this, um, area of the Sundarbans where these fishermen got killed by a tiger. But it wasn't just that the fishermen got killed by a tiger, but that the tiger killed them one at a time. He swam out there, killed a guy, dragged him into the water, dropped his body off the shore, jumped back in the water. Yeah. And did it two more times. Killed three out of four guys. God. And I was like, what does it feel like, the moment a tiger locks eyes with you when it's climbing into the boat and you know it's a wrap? I think, I bet it just bursts into a kaleidoscope. Yeah, your brain floods with psychedelic chemicals. Yeah. Or suddenly you think, you're human, fucked, living our lives, and we're just being eaten by a tiger right now. You know, this is one of the things that I worry about: how do we know that we weren't in some spaceship and we got absorbed into some kind of a predator, but the way it eats us is not the way a tiger eats you? It eats you by, like, hypnotizing you and then dissolving you for infinity. That's what we call reincarnation. It's just, like, it's like, you know, slowly being devoured by an alien that is giving us the impression that we have memories or whatever.
[2321] Maybe he just wants to toy with us.
[2322] You know, cats like to play around with things they're killing.
[2323] They like to let it run away and then catch it.
[2324] Like, if that happens here, how do we know that's not what we're in right now?
[2325] Like we got caught by a very advanced, powerful cosmic predator being that is keeping.
[2326] us locked into reality.
[2327] I mean, the demiurge.
[2328] That's what they call it in Gnosticism.
[2329] Yeah.
[2330] That could be true.
[2331] Or it could be just all this chaos is to ensure that we pay attention and we try to improve things.
[2332] And we realize that no one is, no intelligent person is running this show.
[2333] It's impossible.
[2334] No one can do it.
[2335] And the people that do get a chance to run it, they all go corrupt because they realize you can't fix it by yourself and you're all locked down.
[2336] with a culture of corruption and influence, even if it's legal, it's still, you know it's not right.
[2337] Right.
[2338] You know, it's not right.
[2339] It's like they're doing some sneaky shit, and that's all of them.
[2340] And so we don't have real leaders.
[2341] We don't have real, it's hard to find out what the fuck the truth is.
[2342] Right.
[2343] And we're being bombarded with bots.
[2344] Bots, whether it's hired humans that are designed to make up things that separate us, or whether it's actual artificial intelligence that's got MAGA flags
[2345] in its fucking Twitter bio, and they're coming after people.
[2346] Right.
[2347] Like a horde.
[2348] This is the, this is where, in Buddhism, intention becomes really important. Because of all this, like, because of everything you just described, and my stupid idea of us being eaten by an alien or whatever, ultimately, because reality is so incredibly confusing on one level.
[2349] We need something almost like a mnemonic device, like a simple thing to revert to in all the confusion, which is, seemingly the most obvious cliché thing to say: you can always be kinder, something like that.
[2350] In other words, in all the confusion, regardless of whether we're being eaten by an alien, a tiger, bots, an AI bot, or just various propaganda mechanisms trying to confuse us all, we need something to revert to that transcends all that shit, which is: you can be kinder. Just that basic intent, even if you fail at it every day.
[2351] when you're being a selfish shit, or you're in a hurry, or whatever it is, if you keep reverting to that intent... Because no matter what, like, if there is nothing after this, if we are trapped in an alien, if we're in hell, if we're in a simulator, if we're in heaven, whatever the thing, whatever reality may be, I think no one at the other end of the tunnel is going to be like, you asshole, you were trying to be kinder all the time when you were around people.
[2352] You're treating other people with respect and dignity.
[2353] And you're trying to be kind, even though you were in a lot of pain, you're scared and angry most of the time.
[2354] You were still trying, you dumb shit, because you were supposed to be mean in there.
[2355] That's how you went there.
[2356] No one's going to say that.
[2357] No one.
[2358] No one.
[2359] So to me, that's a thing we can revert to.
[2360] You can always go back to that.
[2361] I heard the Dalai Lama say that.
[2362] That's not my thing.
[2363] It's a good thing to say.
[2364] It makes sense.
[2365] And it's simple.
[2366] It's simple and it's easy to remember.
[2367] Dude, we should wrap this up because you got a show.
[2368] You got two shows tonight.
[2369] And I got to pee again.
[2370] I've got to pee again.
[2371] Shit.
[2372] All right.
[2373] Thank you.
[2374] Hare Krishna.
[2375] Come see me in Austin if you listen to this tomorrow.
[2376] Vulcan Gas Company, which is an awesome club on 6th Street.
[2377] And duncantrussell.com, Instagram.
[2378] Yeah, whatever.
[2379] Thank you, Joe.
[2380] Love you, buddy.
[2381] Love you.