Armchair Expert with Dax Shepard
[0] Welcome, welcome, welcome to armchair expert.
[1] I'm Dan Rather, joined by the Emmy-nominated Miniature Mouse.
[2] Oh, my gosh.
[3] I was going backwards again.
[4] Yeah, you were.
[5] This is an Experts on Expert edition of the show, and we have been longtime interested in conspiracy theories.
[6] Yes.
[7] And boy, did we find the perfect person.
[8] David Farrier is a New Zealand journalist, and he directed this amazing documentary we love called Tickled, and he also has a show on Netflix called Dark Tourist.
[9] But today he is here to talk to us about conspiracy theories.
[10] And you can go read all of his great work on this topic at webworm.co.
[11] That's where he breaks down conspiracy theories.
[12] I have to say, of the many episodes we've done, this was the most just, like, titillated I ever was.
[13] Yes.
[14] This is such a wormhole, conspiracy theories.
[15] Like they just go and go and go and you can get really sucked in.
[16] And I also feel like there's a big degree of Baader-Meinhof since we talked to him.
[17] I feel like it's popping up everywhere and everyone's curious about all these different conspiracy theories.
[18] So good timing.
[19] Please enjoy David Farrier.
[20] Wondery Plus subscribers can listen to Armchair Expert early and ad-free right now.
[21] Join Wondery Plus in the Wondery app or on Apple Podcasts.
[22] Or you can listen for free wherever you get your podcasts.
[23] Hey, Dax, how are you?
[24] So good, David, really excited to talk to you.
[25] I think this will be one of the juicier interviews we do.
[26] Well, you talk to a lot of interesting people, so if you're saying that, I will take that compliment.
[27] And run with it.
[28] I mean, I don't think we ever talk to somebody where we have, like, a deep curiosity in the topic per se.
[29] I mean, obviously sometimes, but we really want to know about a couple things you know a ton about.
[30] And we're just really excited about that.
[31] Oh, good.
[32] And hello.
[33] I know I popped in.
[34] Sorry.
[35] Hi.
[36] You did.
[37] Surprise.
[38] And I'm Dax.
[39] I knew you were an expert on conspiracy theories, but I did not know until I started researching you that you directed Tickled, which we've talked about a bazillion times.
[40] Loved it.
[41] Yeah, yeah.
[42] And which was a real blast, by the way, because that was this tiny thing that we made in New Zealand.
[43] And so it was very surreal to hear it being talked about just out of our city and out of our country.
[44] So that was cool.
[45] It was so good.
[46] I got to say it's one of these, it was on HBO here originally.
[47] I think now it lives on maybe Hulu.
[48] So people should check it out.
[49] It's called Tickled.
[50] I'll watch any doc.
[51] And I got to say, I passed your doc like seven times.
[52] I'm like, I don't know... Tickled?
[53] I don't want to watch dudes get tickled.
[54] Like, that's not my jam.
[55] And then finally like, fuck it.
[56] I'm going to give it a shot.
[57] I'm going to watch dudes tickle each other.
[58] And then what unraveled was one of the most exciting real-time unravelings in a doc.
[59] You know, so rarely do the events happen in real time.
[61] You know, I was so lucky to come across that because it did unravel in real time.
[62] And I think we captured the spirit of that in the film.
[63] You know, like, as an audience, I think you felt that all this weird shit was just suddenly happening in front of you.
[64] And yeah, I feel really proud how we captured that.
[65] Because it was.
[66] It was a very strange thing.
[67] And it's funny you bring up the name, because I think, if I had my time again, Tickled is a bit of a misleading name, because people do just think it's about a tickle fetish, which is fine.
[68] Some people don't want to watch 90 minutes about a tickle fetish.
[69] Yeah, five minutes maybe, but 90's a bit much.
[70] So I think, like, I kind of wish we'd called it something else.
[71] So when people like you came across it, you didn't flick past it.
[72] Because I hate to think how many people are just like, oh, God, I'm not watching that.
[73] Well, and even when it starts, you think for sure that's the road you're on.
[74] I mean, not for long, like for maybe the first seven minutes, but you're just kind of seeing the videos and you're like, wow, A, these videos exist.
[75] This is new info to me. And then, not only are your real-time discoveries so wonderful.
[76] I mean, I put it on par with, like, The Jinx, and a couple of these others. Dear Zachary was just the worst, most heartbreaking documentary ever. The notion that it sets out to be one thing, and then all this real-time stuff happens.
[77] Anywho, the post-doc story is as interesting as the doc, which is to say these follow-ups are so fascinating.
[78] Yeah, well, we had this situation where we could never get to the main.
[79] like big bad during the entire most of the feature you know was trying to get to this guy we see him briefly but then you know we took the film to sundance and in the audience was like one of the minions like one of the bad guys sort of turned up and that created i mean that was great publicity for our premiere because you know it sort of creates a certain buzz in the air when this guy kevin shows up who's you know and i spoke to an audience member who was sitting next to kevin in the cinema and his scene coming on where I'm secretly recording him and he's kind of threatening me. And then apparently Kevin started making like angry kind of noises under his breath.
[80] And the audience member turns and like clocks that the guy sitting next to her is the same as the guy on screen and was just sitting there feeling like, oh my God, like what is going on?
[81] It's a real life pop out, like a horror movie pop out.
[82] The bad guy sitting next to you in the theater.
[83] We released in, like, a small number of art house cinemas in the States.
[84] And at one of those screenings, David D'Amato shows up, who is, like, the focus of the film.
[85] And suddenly Dylan, who I made the film with, Dylan hands D'Amato a mic.
[86] And he just has this Q&A session with him.
[87] And the audience, like, you see everyone's phones come out because they sort of can't believe what's happening.
[88] And it was just the whole thing was wild from like discovering the tickling videos and discovering there was a thing called competitive endurance tickling.
[89] Yeah, let me just repeat that.
[90] Competitive endurance tickling.
[91] It's a real thing.
[92] Well, but it's not.
[93] But it's not.
[94] Yeah, it's not.
[95] I mean, you know, there's these videos of these young athletic guys in Adidas sportswear on top of each other, tickling.
[96] And it kind of, when I first saw the videos, I sort of thought, oh, I mean, maybe this is a sport.
[97] Like, Ultimate Frisbee's a sport, and that's kind of weird.
[98] So maybe someone has just made tickling into, like, an odd sort of thing.
[99] But it became apparent that it was, like, pretty homoerotic, and it was pretty fetishy.
[100] And so the second I reached out to them, I was like, I'm a journalist in New Zealand.
[102] I'd love to sort of interview you about the sport you've created.
[103] And their immediate reply was like, we don't want to deal with a gay journalist.
[104] That was just so aggressive straight away.
[105] And that's the instant I think that I knew there was something more happening.
[106] Yeah, well, the guy, and I don't want to give away the documentary because I really want people to check it out.
[107] It's so fantastic.
[108] Tickled.
[109] But I don't think it's too much to say that obviously there's a person behind funding these videos.
[110] To your point, a lot of sports are homoerotic.
[111] Oh, I mean, I'm from New Zealand.
[112] Like, rugby is our main thing.
[113] And I don't think I've ever seen a more homoerotic sport, you know, with guys, like, all rucking into each other.
[114] It's kind of mad.
[115] Oh, scrumming.
[116] I mean, scrumming's got to be code for a group fucking somewhere in some country.
[117] Like, let's get a proper scrum together tonight.
[118] So, no, like, it wasn't that outrageous to think that maybe, you know, tickling was in someone's world sort of a competitive thing.
[119] Like, sure, like winners and losers.
[120] you know, like how much you laugh and don't laugh and what position you're in.
[121] But, yeah, looking at those videos, you know, like, you can sort of tell when something's a fetish when you watch it.
[122] Like, you know if it's a bit off and if it's just something kind of innocent or if someone's taking some kind of sexual pleasure from it.
[123] You just kind of know.
[124] And watching these tickling videos, you're just like, okay, this isn't just a tickling contest.
[125] This is more than that.
[126] But no, it's not. And thanks for watching it.
[127] Like, it's so nice.
[128] It's always nice when you make something and people watch it.
[129] One of the interesting things, and again, maybe it's not interesting in that it's so prevalent: there's so many of these senators who have been at the vanguard of getting some anti-gay legislation passed who then get discovered in an airport bathroom.
[132] I mean, it's happened a dozen times in my lifetime.
[133] So I guess in that way it's not shocking, but it's so interesting, yes, that the response was immediately homophobic.
[134] And then clearly the person who's deriving pleasure from this and funding it, is he not probably in a position where he has to be outwardly homophobic?
[135] I mean, you could speculate, you know, about so much of David D'Amato's life, but he was clearly someone who was incredibly closeted about whoever he was.
[136] And I wish there was a world where he could just be, you know, happy in his own skin, but his whole way of sort of viewing the world was from his computer and he would find these elaborate ways to get young men to tickle each other on camera.
[137] But then if he was ever accused of being involved with that videoing or setting it up, he would just lash out with the most homophobic stuff.
[138] You know, anyone who is extremely homophobic, I think there's always something in them that they're not happy about themselves in some way or they're sort of covering for something.
[139] It's that classic thing.
[140] Like it's almost like so stereotypical in a way.
[141] You know, we're so used to seeing this.
[142] You're totally right.
[143] It can go to crazy unimaginable depths.
[144] Have you watched the Roy Cohn documentary that's out on HBO right now?
[145] No, it's on my list.
[146] I need to.
[147] When we finish this, I don't care what you have scheduled, fucking cancel it.
[148] Because you want to talk about someone who like got so twisted in the, you know, the separation between their lives and their presenting image.
[149] At the bottom of all that craziness is the victim of a society that made his inclination a proclivity and a secret.
[150] So it's like I ultimately could find redemption in this guy at any point because I recognize he's a victim of circumstance.
[151] Yeah, absolutely.
[152] And I think, you know, sort of, I don't want to sort of do spoilers, but where the film sort of ends up, we talk to a family member and kind of get an idea of where he's come from and why he's like this.
[153] And yeah, he absolutely was a victim of society and he was in a position where he didn't feel happy being himself.
[154] And whether that came from his father or colleagues or just society in general, there's a timeline where none of this would have happened.
[155] He would have been happy with himself.
[156] He would have been open.
[157] In saying that, you know, he did have a certain sort of part of him that really loved abusing people for the fun of it.
[158] I mean, I think part of his whole thing, like part of what made him get off on it was that he had power over these boys, and he could kind of ruin their lives whenever he wanted to.
[159] As someone who lives in a world where they will have no control over a huge chunk of their life, who then desires mass control, again, I'm pretty sympathetic to that.
[160] I'm not, it's not an excuse, but it's a compelling explanation to me. No, 100%.
[161] And, you know, after the film came out, I've sort of kept in touch with some of his old colleagues and people he taught with at various schools.
[162] And they had a very similar position to you.
[163] They were like, it's just incredibly unfortunate.
[164] We can't condone what he did, but we understand it on some level.
[165] So I totally agree.
[166] Yeah.
[167] Okay.
[168] Now, I also just want to plug for a second Dark Tourist, which I've yet to start.
[169] But now that I know you're behind it, that is going to be what I'm going to clear my schedule for.
[170] So you also have Dark Tourist on Netflix.
[171] Oh, thank you.
[172] Yeah.
[173] And what is?
[174] Yeah, spent a year traveling around the world, kind of looking at dark tourism.
[175] And I guess it's like the anti-holiday; like, most people would sort of want to go somewhere, to a nice beach or something.
[176] But there's a number of people out there that just love going to, say, somewhere where some huge atrocity has happened.
[178] And so that's dark tourists.
[179] Now, meaning people who are like sex tourists or people who are actually fascinated with like genocides?
[180] What are we talking here?
[181] Kind of more the genocide side of things.
[182] I mean, I suppose it's the kind of people that would want to go to Chernobyl because they are either fascinated by the history or they get a kick out of it.
[183] You know, we did an episode in Colombia looking at narco tourism, and spent some time with Popeye, who was Pablo Escobar's hit man, who's turned into this kind of celebrity over there. Like, people love him because of Narcos, which was a Netflix show.
[184] Well, people love Pablo.
[185] I mean, half of that country absolutely loves him to this day.
[186] Yeah, and people around the rest of the world who don't really fully understand the circumstances or have just seen a TV show, love him.
[187] We spent a day looking around the prison Pablo was kept in, which was basically just a big mansion.
[188] Well, he built it himself.
[189] We should add.
[190] Yeah, built it himself.
[191] It was, you know, he could sort of do whatever he wanted.
[192] He had a pretty good deal.
[193] But, you know, we were walking around there with Popeye, who, you know, this is a man who's murdered over 300 people, and he killed his girlfriend and their unborn child.
[194] And he is just objectively a bad person.
[195] Like, there's no arguing.
[196] And sure, circumstance, again, he came into a certain sort of world.
[197] He's an atrocious human being, just objectively.
[198] Yeah.
[199] Yeah.
[200] And yet, you know, you're walking around with him, and he's so jovial.
[201] And you're like, oh, it's that character from that show.
[202] And, you know, you had some American tourists come up, and they were, like, in awe of him.
[203] They were getting, like, selfies; they were so excited to see him.
[204] It's like they'd met Beyonce or someone that was just this huge name.
[205] And you're like, no, this is like an awful human being.
[206] And so I guess part of the show was trying to explore just those weird places we end up in society where we put an emphasis on things we perhaps shouldn't.
[207] Yeah, yeah.
[208] I got to say, I might have been one of those disgusting Americans, like, wanting to shake his hand.
[210] I have, like, such an interest in that whole world.
[211] Oh, I mean, the thing, we did a whole show about it, so we're kind of just as guilty of that, you know?
[212] I mean, we gave him a platform, so that kind of comes into the whole discussion as well, I think.
[213] Okay, the only personal bit of information I want to get out there about you is that you were born on Christmas Day in Bethlehem.
[214] I was.
[215] That sounds like a joke, but it's true.
[216] I'll clarify that that is a town in New Zealand.
[217] It's kind of wild.
[218] You could be your own conspiracy theory.
[219] Yeah, I could.
[220] I think when people come at me for rallying against sort of conspiracy theory culture, I think that's something they can kind of point to, that I'm probably a crisis actor or making things up.
[221] Because, like, who's born on Christmas Day?
[222] Who lives in Bethlehem, right?
[223] You're the first person I know, so that is suspicious.
[224] But yeah, I can confirm that one.
[225] Okay, now you're an investigative journalist, so I understand that you'd have an interest in anything.
[226] But why in particular did conspiracies start to interest you?
[227] Yeah, I think the sort of things I've always been drawn to, whether it's sort of the world of tickled or my journalism or dark tourist, is kind of these subcultures of life.
[228] And I think the whole conspiracy movement over the years has always been fascinating to me. I mean, I was obsessed with The X-Files growing up.
[229] Mel Gibson's Conspiracy Theory at the time was really a big film in my life.
[230] But I've always been fascinated by, I guess, belief, and why people believe these things.
[232] And, you know, I was raised in a pretty conservative Christian home.
[233] And so...
[234] Well, you would be, being Jesus.
[235] Being the Christ child, yeah.
[236] It's probably your fault just to add, you know, I don't want to point fingers.
[237] I should explain that as well.
[238] I was born on Christmas Day.
[239] And your mother's a virgin.
[240] Yeah.
[241] For quite a long period of time.
[242] Now, I think conspiracy theory culture kind of ticks every box that I find engaging: you know, why people believe what they believe.
[243] And I guess I particularly started digging into it over lockdown in New Zealand, because we were in lockdown for a couple of months, and we suddenly had a lot of the conspiracy theories that had been propagated in America, like 5G being a weapon of mass destruction, and COVID being no worse than the flu, and fake.
[244] We had that stuff propagate over here pretty quickly and so we had this incident where a number of 5G cell phone towers were set alight and this was exactly what we were seeing over in the States and over in the UK.
[245] They were damaged.
[246] Are you saying that?
[247] Oh, okay.
[248] People lighting them on fire.
[249] Like people were like torching towers and burning them down because they thought that 5G was being installed by the government.
[250] Whilst COVID had us all distracted, they were secretly rolling out these 5G networks, which were either to, A, distribute COVID to the population, or B, infect us with some sort of disease that wasn't COVID: cancer, a number of other options.
[251] And so people started burning them down.
[252] And so, like, I just saw this weird fusion of COVID conspiracy beliefs merging with anti-5G rhetoric. And I started writing; I've got this little newsletter called Webworm, and I just started writing about conspiracy theories a lot this year, especially sort of based around QAnon, which is this big motherlode of conspiracy theory that's come out.
[254] Yeah, that's the one we most want you to explain to us at length.
[255] We don't understand.
[256] Oh, gosh.
[257] Because I realized that I had been participating in it in this bizarre way.
[258] Oh my God, how?
[259] I started getting this rash of comments on my posts about, why aren't you talking about child trafficking?
[261] Why are you staying silent on child trafficking?
[262] I was like, well, first of all, I'm not.
[263] I had the fucking lawyer for the Epstein victims on four days ago.
[264] So, A, I am talking about this.
[265] And then I just thought, it was enough that I was like, this is an abnormal amount of comments saying, why aren't you talking about sex trafficking?
[266] And they were all very accusatory.
[267] Like, I was an unethical person.
[268] And that I was purposely ignoring this issue, that there was some intention behind it. Right.
[269] And I was getting defensive.
[270] I'm like, well, first of all, I can't talk about every fucking issue.
[271] There are a billion issues.
[272] And because I talk about one doesn't mean I don't think another one is valuable. Because a guy studies cancer doesn't mean we don't need a doctor to treat nail fungus.
[273] Like, I reject this thought process.
[274] But anyways, I then came to realize what the fuck it was all about.
[275] But anyways, I want you to walk us entirely through QAnon, but I just have a couple more setup questions, if you'll bear with me. I happily accept your questions.
[276] And I love that someone like you who has a decent number of followers, suddenly seeing a similar thread come through would be fascinating.
[277] Because it's not just a couple of people giving you shit about something.
[278] It's like walls of people being like, why are you keeping the trafficked children secret?
[279] And you're like, what?
[280] What is happening?
[281] And in today's climate where you can be, you know, I don't want to overreact, but there's some canceling that exists.
[282] And occasionally you're like, oh, my God, if I tripped the wire of canceling, like am I?
[283] Is there a movement building to get rid of me because I'm not talking about sex trafficking?
[284] So, like, on some level you've got to take it somewhat seriously. And it wasn't until we had Bill Gates on that it fucking blew up, man. When we have on a great episode with the biggest star in the world, we'll get a thousand comments.
[285] And all of a sudden we had 2,800 comments in a day, and I'm like, did that many people listen and love it?
[286] I started glancing, and they're all about that:
[287] He is a pedophile.
[288] I'm like, what the fuck are they talking about?
[289] Like, that was my real hard, up-close realization that there is some movement, some percentage of this country, that thinks Bill Gates and me are pedophiles and we're swapping people at our Illuminati meeting.
[291] And I was just like, this is the most crazy one I've ever heard in my life.
[292] Yeah, it's wild and it's wild how Bill Gates has been brought into this whole discussion.
[293] I mean, and it's amazing that you have someone with his brain on the show and a million interesting things to talk about.
[294] Oh, and he's solving our biggest problems. If there was any guy that should be hoisted up on a sedan chair and paraded around the country, it's this man who's donated $30 billion to fixing sanitation and energy.
[295] It's just bonkers, but yes.
[296] And, you know, I'm sure there are real criticisms you could come out with about him, but that is not one of them.
[297] In fact, he's trafficking in children.
[298] I mean, what's been wild to watch with QAnon is that it's something that started, you know, on sort of the darkest recesses of the Internet,
[299] forums that are just a bunch of idiots, you know, shitposting and winding each other up.
[300] And we're in this situation today where it spreads so wide that influencers on TikTok, your audience, are suddenly believing that the world is run by a group of elites who are trafficking in children and sort of doing every extreme evil act you could possibly think of; that Donald Trump knows this and is one day going to sort of walk up to a podium and announce who these evil people are; and they're all going to be magically sent to prison, you know.
[301] Yeah, FBI Task Force.
[302] Yeah.
[303] You know, this is something that started in the darkest corners of the web three or four years ago, and we're at this point now where people are just casually talking about it on Instagram like it's a real thing.
[304] And I think Save the Children is a really fascinating development because, you know, if you see a Save the Children rally, it's very difficult to sort of go, that's a bad thing.
[305] You know, we all want to save kids.
[306] It's brilliant.
[307] And I should clarify.
[308] I should clarify that, you know, of course there are children that are being trafficked and it's awful.
[309] And yes, Jeffrey Epstein did traffic in children and it's awful.
[310] But what sort of this whole conspiratorial online brand has come up with is that there's a shadowy cabal of evil that is running the world, and they've taken that whole idea of child sex trafficking and run with it. To QAnon followers,
[311] the reason kids are being trafficked is that they're being trafficked en masse; they're being trapped in underground bunkers; they're being tortured and terrified so that they create adrenochrome, which is then drained out of their blood; and that is then injected into the bloodstream of politicians and Hollywood actors and the Hollywood elite in order to prolong their lives.
[312] So when you go to a Save the Children protest, that is what you're engaging with, not the other thing.
[313] Hold on one second.
[314] Even if you believe that you can extract some youth serum from children, let's say you believe that, I accept that.
[315] Why on earth would Bill Gates need to fucking kidnap kids?
[316] He could pay kids for their fucking youth serum.
[317] If there was really a youth serum, you could approach the family and go, listen, man, I'll give you fucking $5 million for an ounce of that youth serum, and everyone would be happy.
[318] Yeah, I mean, part of why this stuff is spreading is that it's born on the internet and it's people picking apart evidence that they find.
[319] And so it's this interactive kind of game almost that people are playing.
[320] So you look at something like, do you know Wayfair Gate?
[321] Were you across that?
[322] No. Wayfair Gate?
[323] So Wayfair Gate, there's an online store called Wayfair, and it basically sells like home and furniture and cabinets and cabins and stuff.
[324] I just heard a bit of this.
[325] I'm so excited for you to tell us about this.
[326] Yes.
[327] And so what happened?
[328] Someone came across Wayfair's site, and they saw that some of the cabinets they were selling were, like, super expensive.
[329] They also noticed that, oh, some of the cabinets have human names like, I don't know, Eric, Sarah.
[330] Sandra, I don't know, it's a bit weird, whatever.
[331] And so from this, they put together this idea.
[332] And when I say they, I mean internet researchers and people that believe in QAnon and conspiracies in general, saying that Wayfair was basically an open portal for child sex trafficking, again.
[333] And, you know, of course, if you're going to be involved in this stuff, of course you're going to do it through a public-facing store, and you're going to name the cabinets after the actual children that you're selling. Like, it's all... So this turns into this huge thing, propagated over social media as part of the Save Our Children movement. Okay, so really quick. So I was told this, and now I'm remembering, I bought it. But I don't know that I bought it in terms of children. I think maybe it was pitched to me as, like, any kind of illegal things, and I thought, oh, well, that's brilliant. Yeah, just like on eBay. You really want cocaine, and so the people on the inside know if you order this air purifier and it's $12,000, you're getting a kilo of coke in the mail.
[334] That, to me, seemed brilliant.
[335] And I don't even think the person who told me is a crazy conspiracy theorist. I believed that until just now.
[336] What?
[337] No. I don't know that I believed the sex trafficking part.
[338] So why were things $10,000 on that site?
[339] They just were.
[340] No, no. I know.
[341] It's one of those things.
[342] It's one of those trade sites where anyone can have a store on there.
[343] So, you know, sometimes you go to Amazon, you look for a book.
[344] It's super expensive, or an item's super expensive, because of whoever's selling it.
[345] Someone was selling some stuff and they did just have an outrageous price on it.
[346] It wasn't the price of a child.
[347] It was just someone being greedy, wanting a lot of money for their cabinet.
[348] Wait, wait, real quick, real quick, are they saying if you buy this Susan cabinet, you're buying Susan the kid?
[349] Like, it's a way to buy the child.
[350] Yeah.
[351] But how do you even know what the kid looks like?
[352] How do you know if it's the one you want?
[353] I mean, I don't even understand how you can make a good purchase.
[354] You go, oh, the cabinet's Elizabethan, so the kid's Elizabethan.
[355] Why are people doing this?
[356] If it's a blonde piece of furniture, you know it's a white child you're buying.
[357] I need to know why.
[358] I don't understand.
[359] Well, no, we're going to get into that. But let's keep it juicy, because I have some sobering, not very fun opinions on what the appeal is.
[360] And I have to imagine you do too.
[361] But my real quick question is, is QAnon an outgrowth of Pizzagate?
[362] Because Pizzagate was this similar story, right, where liberals were trading in kids and they were doing all the deals at a pizza parlor.
[363] People probably remember this, right?
[364] And then a real human being showed up there with a gun.
[365] And by the way, kind of back to your documentary guy.
[366] It's like, if I really believe there was a room full of children being abused, I'd show up with a tank.
[367] So I don't not understand the reaction.
[368] I don't understand how that was plausible to the person.
[369] But what I thought was really clear is that in the wake of that event, when the man was arrested, I thought it was like the consensus was that was preposterous.
[370] This was a really bad conspiracy theory.
[371] And I thought it was a lesson to be learned by people.
[372] And now I see that, no, it kept its momentum right up and just changed names.
[373] Is that what happened?
[374] Pretty much.
[375] I mean, I think for a period, a number of people saw that Pizzagate was clearly outrageous, you know, the idea that Hillary Clinton had children locked in the basement of Comet Ping Pong.
[376] It's hard not to laugh out loud when you say this stuff.
[377] It is.
[378] It really is.
[379] No, there wasn't even a basement there.
[380] You know, it was all clearly incorrect.
[381] But what happened after that is
[382] that this character called Q popped up on the internet, on 4chan, this terrible board that has morphed and changed into a new thing now.
[383] And Q said that they were basically, like, a deep state operative, like, very aware of what was going on in the world.
[384] And they started making Q drops, which were essentially these really bizarre coded messages. They're so weird to even talk about, because it's, like, coded language and these little clues.
[385] So like Trump talking today, look to the left.
[386] What do you think?
[387] 7-78-9, question mark. You know, like, just incoherent sentences. But for whatever reason, a certain type of person on the internet decided that Q was actually someone with hidden knowledge about what was going on in the world and started to create these Q drops. Really quick: was the assumption that this insider was within the cabinet of our current administration, or just at some level of privilege? At a level of privilege, and opinions shift on exactly what that privilege is, but they are basically a heroic figure who is telling the truth and is, you know, a whistleblower.
[388] He's a whistleblower.
[389] You know, he or she or they are a whistleblower, and they are slowly, in coded language, revealing this information.
[390] So what that does, because it's, like, kind of engaging and interesting, is you get a whole lot of internet researchers surging on board, pulling apart these things, trying to find clues as to what it means and taking meaning out of things. You know, when this happened with Pizzagate, you had these leaked Hillary Clinton emails, and there were little clues in there about pizza and serviettes and certain things.
[391] And that was all put together eventually to mean that Hillary Clinton had children locked up in a pizza basement.
[392] I'm going to pause you again for one second.
[393] I'm so sad that you live in New Zealand.
[394] I want to hang out with David.
[395] It's so fucking... we could talk about this every day.
[396] Okay, sorry, sorry.
[397] I keep interrupting you, but it's like, I'm getting so excited I'm on the next plane to fucking the North Island.
[398] Okay.
[399] Stay tuned for more Armchair Expert, if you dare.
[400] We've all been there.
[401] Turning to the internet to self -diagnose our inexplicable pains, debilitating body aches, sudden fevers, and strange rashes.
[402] Though our minds tend to spiral to worst -case scenarios, it's usually nothing, but for an unlucky few, these unsuspecting symptoms can start the clock ticking on a terrifying medical mystery.
[403] like the unexplainable death of a retired firefighter whose body was found at home by his son, except it looked like he had been cremated, or the time when an entire town started jumping from buildings and seeing tigers on their ceilings.
[404] Hey listeners, it's Mr. Ballin here, and I'm here to tell you about my podcast.
[405] It's called Mr. Ballin's Medical Mysteries.
[406] Each terrifying true story will be sure to keep you up at night.
[407] Follow Mr. Ballin's Medical Mysteries wherever you get your podcasts.
[408] Prime members can listen early and ad-free on Amazon Music.
[409] What's up, guys?
[410] It's your girl Keke, and my podcast is back with a new season, and let me tell you, it's too good.
[411] And I'm diving into the brains of entertainment's best and brightest, okay?
[412] Every episode, I bring on a friend and have a real conversation.
[413] And I don't mean just friends.
[414] I mean the likes of Amy Poehler, Kel Mitchell, Vivica Fox, the list goes on.
[415] So follow, watch, and listen to Baby, This Is Keke Palmer on the Wondery app, or wherever you get your podcasts.
[417] Yeah, so like what Pizzagate proved is that online researchers that are a little bit, you know, they're unhappy in their own lives.
[418] They want to kind of take some control.
[419] They want to figure out what's going on in the world.
[420] They're very partial to taking some clues and just haphazardly aligning them with other facts and other things they've seen and coming up with these elaborate theories.
[421] And that's how Pizzagate happened.
[422] And then with QAnon and the Q drops, it's like every day or every week there's a new Q drop.
[423] It's a new thing to analyze and decode, and suddenly you've got all these wild theories floating around, to the point now where, you know, COVID is fake, 5G is going to kill you, adrenochrome needs to be drained from the blood of children to keep the Hollywood elites alive.
[425] You know, I liken it to an alternate reality game and there was this thing like in marketing and movies for a while.
[426] I think when A.I. came out, Steven Spielberg did this for that film, where there's a movie poster, and hidden in the movie poster was the name of a person who wasn't an actor or a director; it was just a weird name.
[427] If you Googled them, you got taken to a website.
[428] There was some secret numbers.
[429] Those were coordinates for a place.
[430] If you go to the coordinates of that place, there's like a USB stick.
[431] And, you know, it's like an interactive real world game with clues.
[432] And I want to add, I see the appeal of being a sleuth.
[433] It's really fun to be a sleuth.
[434] It's really fun to figure something out.
[435] There's like a puzzle involved.
[436] And in doing so, again, we'll get into this at the end, but I do think that these conspiracy theories, that the people that are most vulnerable to them seem to be just, anecdotally from my life experience, people who feel very excluded from the system that seems to be rewarding other people.
[437] I think that's like a big component of it.
[438] So if you can add in this notion that you're smarter than these assholes who have created this system that's excluding you, I can imagine great comfort in that.
[439] Absolutely.
[440] It's a way to sort of take control of the narrative.
[441] And if you're living a life where you're not particularly happy, maybe you don't have a lot of money, and society has kind of given you a pretty bad rap, then sure, you want to solve why that is.
[442] And what QAnon provides is this perfect story where you've got a group of global elites who are out to take over the world.
[443] And if you take that to the extreme, I mean, what they're talking about is that like Satan is behind it all.
[444] So it's like this big spiritual battle, which is why there's this weird alignment between evangelicals and Pentecostal Christians and QAnon, because they're this group that have been raised on the idea that, yeah, of course, here on Earth there's this giant battle playing out between good and evil.
[446] And that's exactly the narrative that Q and on is serving them.
[447] So that's why there's so much alignment between Trump, the evangelicals, and QAnon.
[448] It's like this perfect kind of very weird fusion.
[449] I was telling my wife, I was about to interview you, and she got excited and she mentioned, and I agree, which is like, we're uniquely primed for this in a sense because it really starts with Santa Claus.
[450] I hate to be critical of it, but it's like, I tell you that there's a guy in the North Pole who somehow goes to every single house in one night.
[451] And you go, not possible.
[452] And I go, no, it is possible.
[453] And your better part of your brain is going, no, no, no, it's not possible.
[454] And then I have this huge fucking reward.
[455] If you buy in, I'm going to give you all these presents.
[456] And you're like, fuck it.
[457] It's worth silencing that critical part of my brain because I want those presents.
[458] And then you go, Easter bunnies coming in two weeks.
[459] And you're like, what?
[460] There's a bunny that travels around and hands out candy?
[461] Don't worry about that.
[462] You're going to get a big -ass Easter basket full of chocolate and Halloween.
[463] And it's on and on.
[464] And so we prime people to ignore the critical part of their brain for some reward, be it eternity in heaven, absolution of your sins, whatever the thing is. It's like, we've got a nice pattern in it, you know, from the get-go.
[465] Yeah, and we're very good at leaning into certain things that we believe in and ignoring anything that will counter that.
[466] So I think you find it really strongly with, like, young Earth creationists, who believe that the Earth is 10,000 years old.
[467] And, you know, it's usually they're raised in a family that will tell them this from day dot.
[468] So they grow up with this belief system.
[469] They believe in it.
[470] They believe in eternal life.
[471] And what they're faced with then, just in their day-to-day life, is a lot of counter-evidence to that, saying that, hey, the earth isn't 10,000 years old.
[472] It turns out, you know, dinosaurs were real, and all these facts show that what they've been told isn't necessarily true.
[473] And I feel bad for them.
[474] I've had these arguments with people, and it's like, the archaeologist faked the bones.
[475] Oh, no, no, the bones are there, but God put them there to trick you.
[476] And I'm like, okay, so your God is actively trying to trick you.
[477] And that's cool with you.
[478] That's just interesting to me. But no, it's very similar logic
[479] to QAnon, where you are given this figure, Q, who's giving you this information. It's almost like a scripture that you get, that you then have to decode and figure out and make true to yourself.
[480] It's playing to all the same things.
[481] It's a prophecy.
[482] Yeah.
[483] And, you know, Trump plays into that and he loves it and he talks all that stuff up.
[484] You know, he's actively retweeting QAnon accounts.
[485] You know, he's sort of giving the big thumbs up to all this stuff.
[486] And here's where we get into.
[487] And by the way, I am never critical of the president on this podcast because I have more important thoughts than my politics.
[488] So just in general I don't do it, but I'm going to break tradition right now to just say:
[489] One of the very fascinating things about him is I am so often wondering, is he in on it or does he believe it?
[490] Because how could he be the head of state of the most powerful country in the history of the world, with all the info?
[491] Nobody has more info in the world than Donald Trump.
[492] How could the guy with the most amount of information on planet Earth retweet a QAnon thing?
[493] I'm so curious whether he believes it or not.
[494] No, no, it's to rally people around him and his base loves this stuff.
[495] I mean, that's, he knows what he's doing.
[496] Well, part of their narrative, which confused me when I was reading through those Bill Gates attacks, is he's perpetuating the fake COVID disease.
[497] Also, he wants population control.
[498] This is a big attack on him.
[499] He wants population control.
[500] And they think he has said that in so many words.
[502] Right.
[503] I mean, that's a perfect case of how conspiracy thinking works.
[504] You know, one thing Bill Gates said, got taken out of context, spread super fast, and suddenly he's someone who wants to actively depopulate the planet, which is why you got all those people flooding your comments when Bill Gates appeared on the show.
[505] It's like selective things being pulled out in exactly the same way that an evangelical Christian will selectively pull out information that supports their point of view.
[506] It's survival, and it's a worldview that makes them feel safe and in control of reality.
[507] And it's exactly the same with someone who has this belief in QAnon and that one day Trump will step up, he will announce the truth.
[508] All those bad, terrible people will go to prison.
[509] All the Epsteins and the child sex trafficking will stop.
[510] Everything will be glorious.
[511] It's like a rapture.
[512] It's all the same stuff that you get.
[513] It's all the same beats.
[514] I'm going to make a joke now, but I believe it.
[515] It's like, you're going to throw all the entertainers in prison and then the fucking smartest people in tech.
[516] What a boring-ass world you're going to end up with.
[517] Say goodbye to the TV show you like.
[519] You know, have you thought that part through?
[520] It is so selective, though, like you said, because if we're talking about real child trafficking, Epstein, who we know and there's evidence and, you know.
[521] A hundred victims, maybe more.
[522] And Trump is in every other picture with Epstein, with his arm around him.
[523] But he's fine.
[524] And he's fine?
[525] Like, the irony that it's child trafficking, and Trump is going to, like, have
[526] uncovered this, yet he's buddy-buddy with the one child trafficker we actually know about, like, right, currently. Yeah, but that logic doesn't sit with your typical conspiracy theorists, because they'll get that little list of passengers on the plane, and they'll see that name, and they'll see it as confirmation that they're complicit in everything Epstein was doing. That's, like, the logical jumps that they will make. And it reminds me of, like, a really obvious thing. And I think the guy that pointed this out to me was Mick West, and he's the guy that created the video game Tony Hawk's Pro Skater, and he made, like, a shit ton of money off that game.
[527] He retired and now he debunks conspiracy theories full time and he's a really wonderful human being.
[528] But he sort of made the point to me that not one QAnon follower or researcher has actively saved or helped anyone.
[529] Like if you look at who took Epstein out, it was an incredibly hardworking journalist at a Miami newspaper who slaved away for years.
[530] That is the reason that Epstein was locked up.
[531] You know, you look at the team that outed the Catholic Church and all the child abuse that went on there.
[532] That wasn't an internet researcher.
[533] That was like a team of journalists who worked that story for years.
[534] There's not one time that a conspiracy theorist, a QAnon follower, has actively created any positive change in the world or proven that anyone is a pedophile.
[535] It just never happened.
[536] Great point.
[537] And then the only people that have a track record of doing it are enemy number one, fake news, right?
[538] Yeah, exactly.
[539] So it's this very weird dichotomy, again, that Trump has helped sort of push out that, you know, journalists bad, media, bad, fake news, bad, you know.
[540] And it's unfortunate in a way; sort of the trickle-on effect of Epstein being arrested is that it just propagated this whole conspiracy that everyone is in on it, you know? And that's really unfortunate, because what we should take from that is that, God, maybe we should respect journalists more, because they're the ones that found out what Epstein was doing, and it's the reason we're talking about it.
[541] And it's a reason you watch that Netflix documentary about it, you know?
[542] And I'm going to add another thing about it, which is what it exposed is not sex trafficking.
[543] It exposed income inequality.
[544] Epstein cannot exist in a world where $100 isn't life-changing to a young person.
[545] The takeaway of that is that story can't happen in Sweden.
[546] It can't happen in Austria because nobody is so desperate that $100 can change their life.
[547] Yeah, and you're coming back to the idea that it's a systemic problem.
[548] It's not this one -off.
[549] It's like, where you're talking about David D'Amato, like, the reason he felt terrible about being a gay man, it's society telling him that. And it's society that's setting us up for someone like Epstein to operate.
[550] You're completely right.
[551] Like, we're taking all the wrong takeaways from this stuff.
[552] If our strategy is to wait for people to pop up and then bust them,
[553] that's just a really dumb strategy.
[554] The strategy is what is causing it?
[555] How are these people empowered?
[556] How do we dismantle the thing that empowers these people to prey on people?
[557] Let's look for the preventative cure and not the band-aid.
[558] 100%.
[559] It's the problem with the criminal system, right?
[560] It's like, locking people up, that's not what we should be focusing on.
[561] We should be focusing on how to keep people out of there in the first place.
[562] It's like we've got it the wrong way around, always.
[563] Absolutely.
[564] Now, is there any way to extrapolate how many people believe in this QAnon thing?
[565] And then part two to the question is, is there an ultimate plan for them?
[566] No, it's really difficult to get the numbers.
[567] There's different people doing polling on QAnon belief, and you get wildly different results.
[568] For my Webworm newsletter, I talked to a professor at a Miami university, and he basically said belief in QAnon is not changing at all.
[569] All that is happening is that people are talking about it more in the media.
[570] And I think to a degree, there's a certain element of that that is right.
[571] We are talking about it in the media a lot more, and people are more aware of QAnon than they were three years ago.
[572] But then another poll came out a couple of weeks ago that said the opposite.
[573] QAnon belief is absolutely rising, and more and more people are believing it, and most of them are Republicans; this poll said the opposite of the other poll.
[574] And I tend to lean into the second poll, because of what we're all seeing around us. Like, people weren't flooding your comments about Bill Gates on Instagram,
[575] you know, two years ago. This is spreading.
[576] This is everywhere now.
[577] And I think, you know, exact numbers, we're not going to get them because it's just too difficult online looking at what accounts are real people, what are bots, what are not.
[578] I'm so glad you brought that up because I actually took the time to often go to the pages of the people who are saying these things.
[579] And what I notice is...
[580] Oh my God.
[581] In general, there's almost no posts.
[582] If there are posts, it seems like they've been pulled from someone else's photos.
[583] And all of the handles, the @s, the names involve multiple numbers, right?
[584] And so what I started wondering is, having read this New York Times article that Russia actively planted the seeds for the anti-vaccine movement in the U.S., I had to wonder, are these being computer-generated, and is there state involvement in it?
[585] Oh, absolutely.
[586] There's, yeah, no, I mean, there's no doubt that it is bigger than just Americans
[587] jumping on and believing this stuff, to the extent that we're seeing the number of posts online.
[588] So there is absolutely more going on to this, I think, yet to be revealed because it's just so hard to track.
[589] But this whole, I mean, look at what we're seeing from this stuff.
[590] We're seeing active movements now to like remove masks to get Trump back in because he's fighting against all this.
[591] Essentially, he's being held up as the savior in all of this.
[592] And a lot of that plays into what these Q accounts are saying and the narratives they're putting out there.
[593] So bad actors absolutely factor into this.
[594] There's no doubt.
[595] This is so tricky, though, right?
[596] Because it feels like a little bit of an impossible middle ground or a solution.
[597] Because to them, I think what they'd say is, oh, Russia is meddling.
[598] It almost sounds like our own conspiracy theory.
[599] That's what I'm saying.
[600] To them, our reality sounds like a conspiracy theory.
[601] And so how do we reconcile fact and truth when we're literally living in two different ones?
[602] Yeah, I mean, that is a big thing to factor into this, is that objective reality and truth, that does not matter anymore.
[603] And I think that is something, and I know everything I say just comes out as sort of raving liberal.
[604] But one of Trump's really, something he's very good at doing is saying whatever he wants and just actively going with it.
[605] And if people disagree or it's proven that what he said is incorrect, he is just so good at just boldly going on to the next thing and it's all forgotten about.
[606] And so this idea of truth, thanks to him, has really, like it has gone out the window.
[607] Like truth doesn't matter anymore.
[608] It's who's got the loudest voice and who's got the best story?
[609] And at the moment, I fear the best story is this giant good versus evil battle where a group of global elites are keeping kids locked in cellars underground, you know?
[611] Like, that's the more exciting story, and it's what people are latching onto.
[612] And it's got no semblance in reality at all, but it's what people are choosing to believe in.
[613] Well, I'm going to give you another self -centered story, but it'll make it really quick, which is I got in a kind of public motorcycle accident at the racetrack a few weeks ago, and I was injured, and it kind of made news.
[614] And I just, my wife happened to show it to me. It was at the bottom, and there was like maybe eight news stories lined up.
[615] And it was like, you know, from CNN, ABC.
[616] And all of them, I thought, were a pretty fair extrapolation
[617] from the only info that existed on it, which is what we had said on this show.
[618] And in that, I said, you know, I was at fault because I was passing a bunch of dudes and a guy turned in earlier than I was expecting, whatever, I recognize I'm the one passing.
[619] You've got to assume the worst.
[620] Yes.
[621] The Fox News headline said, Dax Shepard: I'm completely wrong.
[622] And I thought, oh. I'm proud of the fact that I can own my responsibility in this mishap.
[623] And yet I could see that, to them, I had done the cardinal sin, which is I admitted weakness.
[624] I was fallible, and I fucked up.
[625] And the best way to take me out at the knees was that I had owned a mistake.
[626] I was just like, oh, that's an interesting aspect to this.
[627] Yeah, all the subtlety was just completely sucked out of the discussion that had taken place.
[628] And it was just like, I'm wrong.
[629] Dax is wrong.
[630] This man's wrong.
[631] Headline.
[632] Yeah, because what a lot of people admire about Trump is that he just, he never says, I'm sorry or I'm wrong.
[633] So I understand that that's the part of the appeal.
[634] So it's like, yes, painting me as some weak person who would admit I'm wrong, it was like the worst sin.
[635] Yeah.
[636] And I think, yeah, that is a big part of what we're facing, that subtlety and nuance have completely gone out the window.
[637] This email says this, so this is clearly what it means.
[638] There's no pulling apart the information in any kind of intelligent way.
[639] It's just blunt and black and white.
[640] I mean, you watch the discourse in the States at the moment and you can see how polarized it's getting on both sides.
[641] Oh, yeah.
[642] And it's both sides just leaning into it.
[643] You know, both sides leaning into their own bullshit.
[644] But I would argue that the bullshit on QAnon's side is definitely, definitely maxing out at the moment.
[645] Yeah.
[646] Okay.
[647] I want to go just through a couple more of these popular conspiracy theories.
[648] I mean, there are so many.
[649] There are so many.
[650] I mean, the interesting, I mean, America kind of lives and breathes this stuff.
[651] I mean, I think, you know, back when we were debating whether the moon landing happened or not, or JFK, we were kind of rooted in some kind of reality and some kind of intelligent discussion.
[652] But then I think things started shifting around 9/11, because people were connecting on internet message boards and they were doing their own research, and that's kind of where that idea came out from.
[653] And very quickly things spiraled, and you started seeing things emerge around 9/11, and certainly more recently, ideas that are much more mean-spirited in conspiracy theory culture, you know, this idea of crisis actors, for example.
[655] So, you know, someone's been in this horrible incident.
[656] They've had a child killed in a school shooting.
[657] And suddenly, you've got a parent who is being painted on online conspiracy forums as being, you know, acting the whole thing, or their kid's an actor, or, you know, the Boston Marathon bombing, you know, that guy that lost a leg, he's an actor.
[658] All these really... and things shifted somewhere along the way, where I think now it's just much more mean-spirited and much less about
[659] trying to get to the truth, and more just focusing, again, on what makes a compelling crazy story. And at the moment, the crazier it is, the more likely it is to, like, rise up the ranks in a forum, on Reddit and on terrible boards, and suddenly that idea is propagated out, you know? It makes a jump from 4chan to Reddit to YouTube a year later, and influencers are talking about it, you know? So the way this stuff spreads is insane. I read through some of your articles on your website, which I want people to go to, which is webworm
[660] dot substack dot com.
[661] Is that the best place to go?
[662] That's me. Yeah, that's me. Webworm dot substack dot com.
[663] Also, I got a domain for it.
[664] I got webworm.co, because dot com was way too expensive.
[665] And dot CO was like a thousand times cheaper.
[666] So you can do that as well.
[667] I read ads with dot CO and I fucking hate it.
[668] It sounds like I'm leaving off the M and I'm like, people must assume I meant dot com and I've messed it up.
[669] So true.
[670] Okay, but yes, one of the things that was in there, which you just touched on, is Reddit, because for whatever reason, I cannot comprehend what Reddit is, first and foremost.
[671] I don't really understand what it is.
[672] I've gone on.
[673] I got confused.
[674] I don't know how to find anything.
[675] So just tell me what Reddit is and tell me how it works its way up the chain, and then what role an influencer can play in all this.
[676] I mean, Reddit has great stuff on there and terrible stuff on there.
[677] It just depends what part of the site you're on.
[678] So their whole board, their politics board, is particularly intense, and they've got certain areas of Trump interest that are intense as well.
[679] But the idea is that it's survival of the fittest.
[680] So you'll post something on Reddit, and people will either upvote it or downvote it.
[681] So the popular stuff rises to the top really quickly.
[682] And that's the stuff that will get picked up by other people.
[683] Like if you or I occasionally went there, we would just see the stuff that was right at the top, the stuff that had been voted up by the community.
[684] The, like, trending stuff.
[685] The trending stuff, yeah, and it operates in a similar way to the likes of the 4chans of the world, which are a much darker place on the internet, that morph and change into other things all the time.
[686] But it's the same idea. And there's a documentary out at the moment, actually, called Feels Good Man, about the Pepe the Frog meme.
[687] And that is a thing that started on 4chan.
[688] Say the name again of the documentary.
[689] It's called Feels Good Man. Feels Good Man. It was at Sundance this year.
[690] It's one of the best docs I've seen in a long time.
[691] And it's, it's... we're getting sidetracked here.
[692] No, we're not, we just got boners.
[693] We both have raging boners because you just said there's a new dock and we're junkies.
[694] It's about this really beautiful, quiet, mellow man who created this cartoon called Pepe the Frog.
[695] And so he had this frog, he had a whole lot of comic books he created around it at his home.
[696] Suddenly that frog got taken over by 4chan and by the internet.
[697] And eventually it morphed into a place where it ended up being classified as a hate symbol by the Anti-Defamation League.
[699] And Pepe now is just, you know... so this beautiful creation this man had suddenly got taken over by the internet, suddenly got fully out of his control, and this documentary tracks how that happened.
[700] Oh, wow.
[701] And if you want to find out about 4chan and about how these communities work, it's a really compelling place to start because it goes to some crazy places.
[702] Ooh, okay, must see.
[703] Yeah.
[704] And now that you say it, every time there's a standoff in this country on state land, I always see that frog being flown.
[705] There's also, I'm really into off-roading, and the off-roading community is largely very right-wing.
[706] And so the flags I see on people's off-road vehicles, I notice there's, like, a snake everyone has, and I think it's like a Don't Tread on Me. There's like some kind of...
[707] Yeah.
[708] You'll see a lot of Pepe.
[709] I mean, the whole documentary is about this cartoonist trying to win his symbol of love and innocence back again after it's been co-opted by the right.
[710] It's like the Hindu swastika, kind of.
No, totally. It's just, you do not associate it with the original thing that it was meant to be, you know? And it's gone. Really unfortunate for the person that created that thing. How does it build from Reddit up to, like, an influencer, to finding itself in a news cycle? It's a really good question. And I mean, I just observed, we've got this celebrity chef over here in Australasia called Pete Evans, and over the last probably six months he's gone from posting beautiful photos of food and what he's making on his Instagram to purely just QAnon, like, fucking madness all the time. Like, it's absolutely bonkers. And, you know, an example I saw here in New Zealand is we've got, like, the New Zealand version of The Bachelor, which is very funny, because it's like, just imagine a Kiwi-ised version of your American show. It is very funny. Well, let's start with: there's only, what, three million of you, right? There's six million sheep and there's three million Kiwis. No, but it does become problematic, because some of these people should not be on TV, but we don't have any people to choose from.
[712] Yeah, yeah.
[713] The pool is small.
[714] But anyway, the winner of season one of The Bachelor, Art Green, he's like a celebrity over here in New Zealand.
[715] He's got, like, really good abs.
[716] He's really good looking.
[717] He's got, like, a paleo food company.
[718] So he starts to run a health podcast, a wellness podcast with his wife.
[719] And they'll have, like, dietitians on, and they'll have, like, people teaching you how to, like, look as good as these people look.
[720] But then, suddenly out of nowhere, he booked Pete Evans.
[721] And so from Art's perspective, you know, Pete Evans was the chef and he was kind of into health as well.
[722] And so he had Pete Evans on the show.
[723] And pretty quickly, his podcast pivoted from being about fitness and food to the fact that COVID is definitely fake, and it just went into complete madness.
[724] And you've got this influencer who runs this podcast with not really any idea of what was happening, no way to really know what this conversation was even about.
[725] Because if you haven't had these conversations.
[726] It's hard to have them, right?
[727] Because you're talking about stuff that is just so unusual, so quickly.
[728] And you're not armed with any rebuttal because the whole topic's new to you.
[729] So it's not like you have something in the chamber to fire back with.
[730] No. And so I think it's just an example of we're in this world now where QAnon exists primarily online where it's being spread.
[731] Influencers are nothing but online.
[732] And so occasionally an influencer is going to, like, stumble on something like Save Our Children.
[734] And I think a really interesting thing has happened as well over on TikTok where, you know, Black Lives Matter happened.
[735] And that was an incredibly worthy movement that was so important.
[736] But I think a lot of people got on board with that in a certain way just to look like they were on board with the right thing.
[737] Like, you know, you see all the black squares on Instagram, right?
[738] I would argue a lot of that is just people... and it's a tricky discussion to get into, because I'm sure it was coming from a good place.
[739] But what's happened since then is that people are very quick just to jump on movements, especially if they're an influencer, because it makes you look good, it gets you clicks, it gets you follows.
[740] And so part of why Save Our Children, which was very QAnon-adjacent, part of the reason that spread so quickly is because influencers jumped on and spread it to their fans and other influencers, and suddenly it's everywhere.
[741] And so I think there's this really weird situation we're in at the moment, where you've got people online who are popular that don't really have any idea about why this stuff is happening, and they see Save Our Children and it seems like a very worthy thing to get involved with.
[742] They don't know that pretty soon they're going to be up against people that believe that, you know, kids are having their blood drained in basements.
[743] So Save Our Children and QAnon and Pizzagate, they're all very similar.
[744] All under the same umbrella.
[745] And I mean, that's one thing that QAnon has done, is given, like, a big master conspiracy theory for anything.
[746] So any crazy conspiracy theory now, it'll fit pretty neatly under QAnon.
[747] Right, right.
[748] And what about, I know this isn't one that you've written about, or maybe you have and I missed it.
[749] But Flat Earth, this is one of these things where it's like, I heard about it maybe on an episode of a show about football, right?
[750] These guys are training.
[751] It's like a pre-training docu-series, and two dudes are arguing about whether the Earth is flat or not.
[752] And I'm thinking, this can't really be a conversation that's happening.
[753] How could you operate a car and also think the Earth's flat?
[754] There's such a disconnect.
[755] Yeah, we had a flat earth conference here in New Zealand last year, and a number of American speakers turned up.
[756] One of them had been in a Netflix documentary about flat earthers.
[757] And, you know, it's funny because that's a really good example of a conspiracy theory that is disprovable, like, very quickly.
[758] Like, it's not a difficult thing to do.
[759] A flight.
[760] You got to get on an airplane, see the curve.
[761] Yeah, that's what you got to do.
[762] Yeah, and you've got to ignore a lot of science if you want to keep believing that.
[763] And yet there are people out there that do actively believe in it.
[764] And I hate to say it because I feel like I'm knocking on a very specific group of people.
[765] But a lot of flat earthers, they do have that evangelical kind of background.
[766] Yeah, it's tied in with anti -science.
[767] Yeah.
[768] You see this again and again and again.
[769] And there are definitely evangelicals and Christians who are not into any of this stuff.
[770] I want to make this very clear.
[771] Yep.
[772] We know many of them.
[773] Yeah.
[774] Yeah, as do I. And it's unfortunate, I think.
[775] But it plays into that.
[776] It's brains that are very used to believing in something that they're told, and that gives them structure.
[777] And I think to a level as well, it gives them access to something that feels like secret, unique information.
[778] And it makes them feel special.
[779] Well, there's also community involved.
[780] That's comforting.
[781] And you've got friends.
[782] Yeah, it is absolutely a community.
[783] And obviously, we're living in a world where community is less and less and we're living online more and more.
[784] And if you want direct contact with people, it's really good to meet people who have a shared interest. And yeah, when that shared interest is that the earth is flat, those are your people, and there you are, and they're very passionate. And, you know, what I found, it was kind of depressing, because I went along to the Flat Earth Conference here in New Zealand, because I was curious about what they were about, and you very quickly find out that they're adjacent to all the other conspiracy theories you don't want them to be into.
[785] You sort of hope that they won't be into everything, but they are. You know, I was talking to one of them, and they were convinced that the mosque shooting here in Christchurch was fake.
[786] You know, we had a terror attack here, and that very obviously happened, but this flat earther was like, no, they were crisis actors.
[787] And at that point, I sort of backed away from the conversation because there's this unfortunate thing as well where I think 10 years ago, I think you could probably believe in a theory and not necessarily believe in another theory.
[788] You could think that maybe 9/11 was an inside job, but you maybe thought, yeah, we probably did land on the moon.
[789] But unfortunately now, if you believe in one, there's a tendency to just believe in them all.
[790] And I think that's, it's super scary, man. Stay tuned for more armchair expert, if you dare.
[791] Well, and here's one that, for about nine hours, I was considering being true.
[792] A friend of mine who believes in science was like, well, you know, the COVID thing, Wuhan is where their chemical weapons plant is.
[793] And they believe that escaped.
[794] And I'm like, totally plausible.
[795] I mean, as soon as you're fucking around with, you know, engineering viruses to kill enemies, very possible it could get out, right?
[796] So there's a good eight hours where I was like, oh shit, well, and not even angry at the Chinese, just like, well, I know we have a chemical weapons facility.
[797] I hope ours don't get out, you know?
[798] And I was just like, yeah, that could happen with a chemical weapons facility.
[799] No, and I think it's, you know, you've got to be open to different things.
[800] And I think it's not a case of just instantly shutting things down.
[801] You're allowed to think about things and analyze things.
[802] But as new information comes to hand, you've also got to be aware that maybe you were wrong about this theory, or that you can change.
[803] And I think there's this really fascinating thing with Trump supporters where it's like everything is true and nothing can be false.
[804] They build unfalsifiable propositions.
[805] That's, like, what they're masters at.
[806] Like, I feel your most rabid Bernie supporter could occasionally still be capable of going, ah, Bernie kind of sort of missed, he made a misstep on that.
[807] Or an Obama supporter that's like, yeah, we love Obama.
[808] He's our man, but, like, that whole drone thing wasn't so great, was it? Or being anti-gay marriage. It's like, yeah, it was a fucking bummer, man, that was hard to stand with him during that. But I think, I feel like with a lot of Trump supporters, it's all or nothing. And I think that kind of bleeds through into a lot of this conspiracy stuff, where it's all or nothing. So there's a group of people out there that would sort of hear, oh, chemical weapons plant in Wuhan, and just dive all into that and never back out. And I think that's what's super alarming as well. By the way, they're tasty. That's like a tasty secret. If nothing else, it's like, oh, that could be a plot of a movie, you know? Yeah, and there is this appeal to having secret knowledge, and if a conspiracy theory gives you this inside knowledge and understanding, it does make you feel good. Yeah. Okay, so I want to talk about now... I'm going to attempt, you don't have to join me, but I'm going to attempt to have some compassion for those people, and I'm going to attempt to try to understand them. And one thing I heard that I thought was a really compelling thought was on Carlin's Hardcore History podcast.
[809] He said that the reason people are susceptible to conspiracy theories, and particularly the JFK one, is that the notion that one crazy bastard could alter the course of history is so scary that one guy, Lee Harvey Oswald, if he had a conviction and determination, could fuck up world history.
[810] That's such a scary idea knowing there's seven billion of us, that what is much more comforting is, no, that doesn't happen.
[811] It was like, this was an arm of the CIA, there were high-level people, this was, you know... that is just a lot more comforting.
[812] And so that's one aspect of it that I'm really sympathetic to, that some person can alter so many lives is very scary.
[813] I 100 % agree with you on that.
[814] It's much easier to think that there's some evil power out there that is responsible for all this, than the multitude of more subtle, more alarming things that can take place.
[815] Yeah.
[816] And then the other thing, which I already brought up a little bit, is I have a very close friend who has had drastic ebbs and flows with success.
[817] And I have noticed, it's unmistakable, the correlation that when the ebb is out and he feels very disempowered and rejected from our system and not thriving, I notice there's a lot more conspiracy theories.
[818] I just hear them over and over again.
And then I'll notice, like, it goes well for him for a while, and then I don't really hear them anymore.
[820] And I just can't help but think that as people feel very excluded from systems, and they look around and it looks like all their neighbors are thriving under this system, and they go on Instagram and everyone's thriving, it's very hard to accept that I'm doing something wrong.
[821] I'm a good person and why wouldn't I be invited along with all these other people?
[822] It's more comforting to think there is a conspiracy that is out to fucking exclude me. I feel that's a lot easier to stomach than, I have failed, and that's why I'm not included. Or I even look at your high school. High school is the saddest fucking place in the world for 15% of the people there. It is just a beating on the chin from the second you arrive to the second you leave. And if I'm those people, yeah, man, this is unfair. And why is it unfair? There has to be an explanation. Well, it has to be these rich people who profit from us losers.
[823] And again, so I see that certain things are more fertile for this than others.
[824] And I want to know if you have thoughts on that or if you've even observed that.
[825] Yeah, yeah, absolutely.
[826] I think it is people who are isolated and worried about the way their own life is unfolding or unraveling around them.
[827] And I think there's no better example than COVID happening.
[828] You know, suddenly this thing hit.
[829] People are suddenly financially stretched.
[830] They're scared.
[831] Life is incredibly difficult for them.
[832] They are locked in their home.
[833] They are online all the time.
[834] They're connecting with people who are also feeling like this.
[835] And I think it's no small coincidence that we're seeing a lot more talk about QAnon, and we're seeing so much talk on our Facebook feeds and our Twitter feeds from people, and we've all got friends that have fallen down this conspiracy tunnel, because people are really scared.
[837] We're more scared than we've ever been.
[838] Everything is incredibly uncertain.
[839] The economy's uncertain.
[840] We don't know when life is going to go back to normal.
[841] I think it's dawning on a lot of people that things aren't magically going to go back to the way they were before for quite some time.
[842] We're living in a different world now.
[843] And just watching people jump on these conspiracy bandwagons since April, certainly here in New Zealand, it's because people are terrified.
[844] You're completely correct.
[845] Now, we have this stereotype about New Zealand, and I'm the biggest offender of it.
[846] I did a movie in New Zealand for four months, and I don't think I've ever fallen in love with another country more.
[847] What was the film?
[848] What was the film you shot here?
[849] It was called Without a Paddle.
[850] I know you own it and watch it monthly.
[851] But we shot all over the North Island, and we were kind of based in Wellington, and I just loved it.
[852] I think it's a little shocking.
[853] And we have a similar stereotype about Canadians.
[854] When you meet a super conservative Canadian, you're like, wait, I didn't know you guys existed.
[855] You have national health care.
[856] Is it shocking to you that that element exists even in New Zealand or not?
[857] You know, we have an election coming up pretty soon as well, and I've been super alarmed to see politics here echoing politics in the United States.
[858] We've got a party that arrived called the Public Party, and it's fascinating to me because it's run by this New Zealand musician.
[859] And, you know, Billy T.K. Jr. is his name.
[860] Really good guitar player.
[861] Watching his Facebook over lockdown, we were on lockdown in New Zealand because of COVID for two months.
[862] And so we were in our homes.
[863] We were very limited in where we could go.
[864] And you watch this guy over those two months go from talking about zero politics, or zero conspiracy thinking, to nothing but that, to the point where now he's registered a political party.
[865] He's getting big turnouts all over New Zealand.
[866] And it's all based around the idea that COVID is a conspiracy, it's fake, 5G is evil, just all the QAnon talking points.
[867] We now have a political party here in New Zealand and people actively following them.
[868] And that's something I never would have imagined happening here because we're pretty, we call things pretty straight, I think, in New Zealand.
[869] We're less likely to fall for something that's clearly incorrect.
[870] But we're seeing that here now.
[871] So I don't know how far that's going to spread.
[872] But I guess what I'm saying is, I mean, maybe take solace in the fact that it's not just America that is embracing this kind of lack of truth; we're doing it here in New Zealand as well.
[874] There's a global populist movement, and you're seeing more and more kind of tyrannical, dictator-ish leaders popping up in different countries.
[875] Oh, yeah, you just look to somewhere like Brazil.
[876] You know, there's so many places where leaders are popping up who are anti-truth and just do not care about their fellow man. Yeah, and do you have a theory, an armchair theory, on why that is?
[877] I'm inclined to think it's the just ever -expanding income inequality.
[878] Do you have a take on why populism's on such a rise?
[879] I think you're completely right.
[880] The divide between rich and poor is at a level we've never seen before.
[881] I mean, and I think we're becoming aware of that in ways that we weren't before.
[882] I mean, I was sent something the other day, a graphical representation of Jeff Bezos's wealth.
[883] And it was basically a little timeline where you'd speed along.
[884] And you just kept scrolling for like, I think I was scrolling for like 10 minutes, just getting through his wealth.
[885] And like my wealth stopped at like 0.2 of a second in.
[886] And 10 minutes later, I'm still scrolling, you know.
[887] And so we're very aware of that.
[888] And I think when the scale of it suddenly hits you in the face, it's just really alarming.
[889] And I think people are aware of that.
[890] And I think there's just much more uncertainty than there has ever been before.
[891] I think, you know, we've got this big thing looming over us of global warming and realizing that we've probably missed the boat on sorting things out for future generations.
[892] Like we have completely messed with the planet.
[893] Income inequality, just society in general and how we're living is just, it's incredibly warped.
[894] And I think humans, there's more and more of us.
[895] We're more and more connected.
[896] We're more and more aware of how strange it is to be alive and experiencing life in the way that we are.
[897] And I think extreme views are just going to become more and more alluring. Easy answers, people claiming that they have the full take on the truth, and moving away from any sort of subtlety or discussion that has more layers to it beyond good, bad, you know, evil. Yeah. Another thing is, I think you see an entire generation of people who were raised on a promise, particularly in this country, which is: you bust your fucking ass, you own a home, within that home you grow, well, then you retire, you're happy.
[898] And now you get to the point where you've peaked out, you've fucking killed yourself.
[899] And now you're telling me I can't drive an SUV.
[900] Now you're telling me I can't air -conditioned my home.
[901] Now you're telling me, like, there's just all these things where it's like, no, no, no, no, I bought into this shitty program where I fucking gave, I donated my life to a cubicle, but I'm supposed to get X, Y, and Z. And now you're telling me I can't have X, Y, and Z. Again, emotionally, not to excuse it, but I understand the emotions of it.
[902] And there are people doing it very hard.
[903] And, you know, you look to the sorts of people that buy into conspiracy thinking.
[904] And they would tend to be people, you know, lower on the socioeconomic scale.
[905] They're people that have less money.
[906] They're really terrified.
[907] They've been badly done by.
[908] And they're looking for answers.
[909] And this stuff is, you know, a conspiracy will offer them that answer because it shows them who to be angry at.
[910] Yeah.
[911] When they profiled this sovereign citizen group, I don't know.
[912] Are you aware of that group here in the States?
[913] No, no, I'm not, no. They reject the federal government.
[914] They won't carry licenses.
[915] They get in all these disputes with law enforcement people, and they issue all these lawsuits.
[916] That's kind of their weapon: there are certain politicians that have, like, over a billion dollars in lawsuits stacked against them right now.
[917] And it gets so cumbersome, you have to respond.
[918] Like, it's a great terrorist technique to fuck with people.
[919] So in this 60 Minutes profile, they broke down that in the high 80 percents, all members were white males who, within the last 18 months, had had a middle-class job that went away.
[920] And that's the part where I had to go, I mean, these people are fucking crazy and I don't want anything to do with them.
[921] And I can find some compassion in that.
[922] I can see that someone's really desperate.
[923] And I think it's important to view things that way, because on a very simple level, you know, if you've got a friend that has suddenly been red-pilled and they're suddenly spouting out a lot of untruths and sort of dangerous health misinformation, for example.
[924] It does no good to just barge in there, calling them stupid, calling them names, shutting them down.
[925] That does no good for anyone.
[926] You need to try and understand them.
[927] And again, speaking with Mick West, this conspiracy debunker, he told me that, like, the main thing you've got to do is, like, find your common ground as a human, even about conspiracy theories.
[928] So don't just come in there and tell them that they're an idiot.
[929] Tell them, yeah, like, I have distrust for institutions as well.
[930] And I think that, you know, Watergate was a real thing.
[931] You know, I have some misgivings about 9/11, and perhaps that was used as an excuse to go to war.
[932] Just put your things out there and find a middle ground and then have that discussion.
[933] You know, don't just shut it off, try and understand it.
[934] Big business does have big levers in our government.
[935] There's no question about it.
[936] Big Pharma makes decisions.
[937] The FDA is regularly misled, willingly, by corporations.
[939] So yeah, there's plenty of real shit for us to agree on that needs addressing.
[940] Yeah, and there are, you know, there are conspiracy theories that turn out to be very real.
[941] And I think we need to keep having those discussions as well.
[942] Yeah, because that was going to be one of my questions, because one of your newsletter topics was how do we talk to people we love who are in deep?
[943] Yeah, yeah.
[944] Part of it is coming to the table with the idea of being open-minded and finding common ground.
[945] And another thing I got told, as sort of a good way to talk to conspiracy theorists, is called steelmanning.
[946] So instead of like a straw man argument, it's a steel man argument.
[947] So again, don't go there and say, you know, that theory about adrenochrome is so stupid.
[948] You should stop thinking that.
[949] You go and learn that conspiracy theory a thousand times better than they know it.
[950] So like spend a week online, like learning it better than they do.
[951] And then when you're talking to them about the adrenochrome and the fact that children are being drained of their blood in underground tunnels, explain it back to them in such great detail.
[953] A, it shows them that you are not just shutting them down and that you actually know what you're talking about.
[954] Well, you hear them.
[955] You hear them, too.
[956] And hopefully in hearing that bat shit theory explained back to them, they'll kind of go, oh man, like that, when it's out of my own head, like that does sound kind of crazy.
[957] Like, I wonder if that is a real thing.
[958] And so steelmanning is something that I think is a really good technique to keep thinking about.
[959] I couldn't agree more with that because one of my big complaints about my side of the political spectrum is I think life on planet Earth operates on these two very specific levels.
[960] There's one that is science and facts and empirical.
[961] And then there is an equal chunk of your life that is as important, which is your emotional experience on this planet.
[962] And so I so often think we on the left will try to defeat climate deniers' arguments with, you know, the data.
[963] And we don't ever stop to question, what is the fear underneath of all this?
[964] How could I help this person not be afraid of this thing?
[965] Like, how can I not get hung up on that?
[966] And really, as a human who would like to help another human, really hear what the fear is underneath of it, and make a sincere effort to try to help the person out of that fear.
[968] Because if you're a xenophobe who hates Muslims because you're afraid of terrorism, telling them they're more likely to get struck by lightning than killed by a terrorist doesn't do anything; that's operating up on this level, and that's not what's driving them personally.
[969] And so I just think we don't want to address the emotional level that's driving so much of our life.
[970] And it really needs addressing.
[971] Oh, yeah, and completely.
[972] And we've just been talking, you know, so much of our conversation just now has been about the fear that gets people and the point of view that gets people to this place.
[973] And you think you can suddenly defeat that by throwing at them a bunch of articles that you've read that are really good articles.
[974] But that's not going to do a thing.
[975] Like we're humans.
[976] Like we thrive on emotion and we're emotive creatures and we want to connect with people.
[977] And if you can't do that with someone that has different beliefs to you, there's no chance.
[978] So as heady as you can get about things, and as much as you can have the facts on your side, you're not going to, you know, you're not going to convince a young earth creationist that maybe the earth isn't 10,000 years old by, like, screaming science at them.
[979] No, no. You could take them to an archaeological dig.
[980] It's going to do nothing.
[981] But I think if you start with someone who's like, let's say they're Islamophobic and you say, I'm terrified of bears, which is something I'm really terrified of, okay?
[982] I think about bears way more than I need to.
[983] I've never actually interacted with one.
[984] I've seen a couple in the wild.
[985] And if I could just bond with them and say, yeah, man, I'm so fucking afraid of bears.
[986] It kept me from going to Yellowstone on this vacation.
[987] We were in a float boat once and I was like, everyone was enjoying it and I was panicked and blah, blah, blah.
[988] And you know, I read this thing that I'm more likely to get killed by getting hit in the head with a coconut.
[989] And I've laid under a million palm trees on vacation.
[990] And I just try to use that as like, either you got to be afraid of coconuts or you can't be afraid of bears.
[991] You got to pick.
[992] Like, to me, that's the conversation to have when you're dealing with someone who's afraid of a terrorist attack.
[993] I think humor is a really good way to have these conversations as well.
[994] Like, you don't have to be this weighty, heavy, person having this discussion, you can use humor because that's something, I mean, that's something I've found in documentary that connects so well.
[995] Such a good way to get information across is through humor, and that can help when you're having these kinds of conversations as well.
[996] If you're all laughing, you're all suddenly leveled on the same playing field and you might be able to connect over some of these bigger, more important ideas.
[997] But you should, yeah, try and have that one -on -one human connection.
[998] Don't forward them links.
[999] Don't text them abuse.
[1000] Like it's kind of got to be in person if you've got any chance, I think.
[1001] Yeah, David.
[1002] Wow.
[1003] How fun.
[1004] Can we exchange email?
[1005] Like, I want to hang with you.
[1006] The next time you're in the States, I want you to, like, stay at our house or something.
[1007] I'm really vibing you.
[1008] I really enjoy you.
[1009] I love that you Kiwis put a fucking egg on everything.
[1010] I challenge someone to go to New Zealand and order anything that won't have a soft-boiled egg on it.
[1011] And I love it.
[1012] Yeah, you pop an egg on it.
[1013] An egg makes everything better.
[1014] I'll have a bowl of rice? Rice Krispies? Sure.
[1015] We'll pop an egg on it for you.
[1016] You didn't ask for it.
[1017] Here it comes.
[1018] Yeah, I love the idea of coming to shoot a production in New Zealand and having this very short, extreme experience of a country like this.
[1019] I would just love to be inside your brain with the imagery you have of our country.
[1020] David, I got into the All Blacks.
[1021] They were doing amazing.
[1022] This was 2003, and the All Blacks were on fire.
[1023] I got into the haka.
[1024] There was a bunch of Maori crew members on the movie.
[1025] I'd ask, what does the haka mean?
[1026] And it's like really primal.
[1027] Yeah, totally.
[1028] Yeah.
[1029] I mean, it's an amazing thing to watch before a match.
[1030] And yeah, it's pretty terrifying to the other team.
[1031] And we're very good at it.
[1032] Yeah.
[1033] And at first I was like, this is the silliest pageantry.
[1034] And then I watched them do it to the Australians.
[1035] And by the end of it, you could see them going like, oh, we're kind of fucked.
[1036] What kind of voodoo is about to get unleashed on this field?
[1037] It's spine chilling.
[1038] I mean, we're very, it's very cool to see that on the world stage before games because it's such an important part of New Zealand.
[1039] I mean, we're unique in so many ways, but you watch a haka in person and it's just, like, spine-chilling.
[1040] You know, it's pretty special.
[1041] I've always wanted to see like an American football team go up against a rugby team and like vice versa.
[1042] I would love to see that.
[1043] So it would be such a sight.
[1044] Yes, yes, yes.
[1045] Okay, well, David, everyone, I urge to please go to your website, which is webworm.co, and watch Tickled, and we're going to watch Dark Tourist.
[1047] Oh, can't wait.
[1048] And this has been so fun.
[1049] No, thanks for having me on.
[1050] It's a pleasure.
[1051] Stay safe over there in beautiful Los Angeles.
[1052] We will do our best.
[1053] The entire place is on fire, but we will march forward.
[1054] I know.
[1055] A friend sent me some photos out the window.
[1056] It looks fucking chaotic.
[1057] Oh, it's orange outside.
[1058] It's just bright orange, and we already have a plague, and now we have fires.
[1059] So as soon as those locusts come, you know, that's the full apocalypse there.
[1061] Do please stay safe because it is, yeah, it's awful shit.
[1062] So just be careful, et cetera.
[1063] Yes, I appreciate it.
[1064] It's such a weird situation to be in.
[1065] And I really appreciate you thinking of me and being into my stuff.
[1066] And thanks for being into tickled.
[1067] And for tackling conspiracy stuff, because I honestly think this is one of the biggest issues we're going to be facing is just this complete lack of truth.
[1068] And if we can't get that back, I think we just don't stand a chance.
[1069] So thanks for playing into this stuff.
[1070] All right.
[1071] Well, be well.
[1072] Yeah, lovely to meet you both.
[1073] Yeah, and the door is always open to you.
[1074] So the next thing you make, please come here and talk about it.
[1075] Thanks so much.
[1076] I'd love to.
[1077] Thank you, both of you.
[1078] Bye.
[1079] Bye, David.
[1080] Bye now.
[1081] See you.
[1082] And now my favorite part of the show, the fact check with my soulmate, Monica Padman.
[1083] Christian Bale.
[1084] Christian Bale.
[1085] Was so good in the Cheney movie.
[1086] We were just discussing.
[1087] We just had really delicious rice dishes from Sqirl.
[1088] Sqirl in Los Angeles is a...
[1089] Spelled with... without any vowels, or one vowel?
[1091] S-Q-I-R-L.
[1092] Every time I try to search it, I don't know how to spell it.
[1093] I'm like, I know it's missing some vowels, but which?
[1094] You don't like it when things are spelled funky or not spelled phonetically.
[1095] I don't have a great grasp of how they're spelled just normally.
[1096] Yeah.
[1097] Yeah.
[1098] I know that.
[1099] So we loved David.
[1100] Oh, my God.
[1101] We loved him.
[1102] This is not the right word or adjective, but it was almost tabloidy.
[1103] Like it felt...
[1104] Oh, scintillating.
[1105] Scintillating, scintillating, yeah, titillating.
[1106] That is an interesting parallel because I do think what people get out of tabloids, I mean, a lot of things.
[1107] But one is just like a peak inside a world they don't know.
[1108] And that is what we experience.
[1109] And they're kind of conspiracy-driven, because, right, like, it's been on the cover of, like, magazines.
[1110] I've seen it seven or eight times that Mila and Ashton are getting divorced.
[1111] Right.
[1112] And that's just proven to be like a conspiracy theory.
[1113] There's nothing behind that.
[1114] Yeah, they'll just lie.
[1115] Well, and one of them was driven by her being on the podcast.
[1116] Do you remember that?
[1117] What?
[1118] Somehow something she said ended up making headlines.
[1119] And two weeks later, I saw the cover that they were getting divorced.
[1120] And I was like, hmm, well, this time I'm just going to check in and make sure everything's hunky-dory.
[1121] Yeah.
[1122] And I text.
[1123] Yeah.
[1124] And I'm like, oh, my God.
[1125] Now, it's ridiculous.
[1126] Oh, my God.
[1127] That's so crazy.
[1128] But, yeah, there are some similar traits.
[1129] And at the apex, the National Enquirer, you've got reports of aliens, you've got the ape child, you've got a wolfman, yeah.
[1130] Of all the conspiracy theories, which one do you think you're most likely to believe?
[1131] Probably aliens, just because mathematically, there's billions and billions of stars and billions of galaxies.
[1132] Can we even call that a conspiracy theory?
[1133] Well, in that, I've not seen evidence of it ever that's like definitive.
[1134] But there is some, isn't there some evidence?
[1135] that there's life on other planets?
[1136] No. No, not that I'm aware of.
[1137] I know that, like, SETI's been listening for years for some kind of radio transmission emanating out of one of these planets. But I do know that in our own galaxy, they've looked at all the stars, and they've tried to see if any one of the stars had a planet that was within the right range that it could support life, where it wouldn't be too hot or too cold.
[1138] Yeah.
[1139] And that in our own galaxy, there are none.
[1140] But there's billions of galaxies and there's super galaxies and galaxy clusters.
[1141] And so, you know, as Eric was saying yesterday when we were at the beach, they say there are more stars in the universe than there are grains of sand on planet Earth, which seems not possible.
[1142] My brain kind of stops.
[1143] Yeah.
[1144] It stops working at a certain point.
[1145] As soon as you leave the planetary system or whatever, our solar system.
[1146] Oh, maybe even before that.
[1147] When we were all on vacation together and we were looking up at the stars and Eric was very into it.
[1148] Yeah.
[1149] And he was teaching us stuff.
[1150] He has the app and he's explaining things.
[1151] And I cannot understand.
[1152] I think I'm a fairly intelligent person.
[1153] Oh, I'd say super intelligent.
[1154] It stops.
[1155] It stops.
[1156] Yeah, we all have blind spots.
[1157] Mine's those stupid fucking music notes, the hieroglyphics.
[1158] Oh my God, you hate them.
[1159] I mean, I cannot comprehend that.
[1160] To save my family's life.
[1161] And you try.
[1162] Oh, yeah.
[1163] I took trombone in sixth grade.
[1165] Oh, my God.
[1166] What a dumb instrument I picked.
[1167] I told you why I picked it.
[1168] Maybe I forget.
[1169] I was in fifth grade and we had an assembly where the junior high kids came that were in band to kind of show off to get us interested in joining band when we got to junior high the next year.
[1170] Yeah.
[1171] And the guy took the trombone and he went, Oh, yeah.
[1172] And it was that one move.
[1173] I was like, fuck, I got to be able to do that.
[1174] Forget the fact that I love drums and I wanted to play drums.
[1175] I just, I saw him go, Oh, God.
[1176] And then I took it and I regretted it from the second I did it.
[1177] And then I could not understand music writing and note reading.
[1178] Could you do that move, though?
[1179] Oh, good.
[1180] Anyone can do it.
[1181] I didn't need to take a semester of it.
[1182] I could have just fucking bought one at the goodwill.
[1183] Oh, so you fulfilled a fantasy, though.
[1184] You should just count that as a pro.
[1185] Chalk it up to a bucket list item I scratched off at 11.
[1187] Yeah.
[1188] Okay, I wanted to look something up because I think our friend Laura was just telling me that they just found life on Venus, or something?
[1189] I guess they found some chemistry on Venus that is the precursor to life.
[1190] Okay.
[1191] I think that's what it is, but I'm not entirely sure.
[1192] I'm not finding very many reputable.
[1193] Also, Eric was saying that apparently some Navy pilots and some military people have recently come out saying, no, no, no, we have seen many spaceships.
[1194] Like many people in the military, yes.
[1195] Wait.
[1196] So, yeah, we really broke it down yesterday, Eric and I, because I was like, here's my problem with this Area 51: they've got these alien bodies, and they've got a couple ships that are in various levels of degradation.
[1197] So first we were going, like, well, A, I find it a little convenient that they landed in America.
[1198] Right.
[1199] Like, because if they landed in Guam, I think Guam would be like, check out this shit we got.
[1200] It would be exciting and they, you know, it would be good for their country.
[1201] So the odds that it landed here is a little suspicious.
[1202] But then we thought, you know, maybe they have it.
[1203] And it's a technology they don't yet understand, but they're very committed to learning the technology.
[1204] And then it would be a great advantage for us as a country to have this technology once we figured it out.
[1205] So it would be better for them to keep it quiet and not have the international committee of scientists looking into this because then it would be some shared technology, which again is bad.
[1206] If we have a cool technology, everyone should have it.
[1207] Anyway, that's a side note for now.
[1208] Right, right.
[1209] But it is very in our nation's interest, selfish interest, to maybe learn what this technology is and how to use it, so that we would have a leg up on everyone.
[1211] Wait, what technology are you talking about?
[1212] Well, the power source of their spaceships.
[1213] It can't be internal combustion engine and it can't be nuclear.
[1214] They have some power source.
[1215] Wait, so something landed here?
[1216] That's what they're saying.
[1217] That's the theory of Area 51.
[1218] And in fact, Joe Rogan had this scientist on who is documented as having worked at Area 51.
[1219] And this guy sounded so sane and rational that I was actually like, is this possible what he's saying?
[1220] And he was saying, they have this orb in there that's in a glass thing and they don't know how it works, but they've got it sealed up.
[1221] And then they have these other components of the spaceship that are kind of like sealed up and they're trying to understand how it works.
[1222] Would he be able to just openly talk about it if it was real?
[1223] Well, this guy has had lots of crazy things happen to him.
[1224] And he's been, people have tried to silence him.
[1225] So I don't, yeah, I could.
[1226] But also he could be a paranoid...
[1227] Schizophrenic, or, yes, totally.
[1228] I mean, I don't, I feel bad saying that, but I have to think logically.
[1229] Yeah, there are a lot of people with delusions.
[1231] I'm one of them.
[1232] Delusions of grandeur.
[1233] Yeah.
[1234] Yeah.
[1235] I thought we were going to win a Peabody for like three seconds.
[1236] That was a delusion of grandeur.
[1237] Well, yeah.
[1238] Okay, but not as extreme.
[1239] No, they never last more than like, I generally can tell I'm being grandiose within five minutes.
[1240] I can give you another one that is so embarrassing.
[1241] Oh my God, this is so embarrassing.
[1242] There was a moment.
[1243] Mind you, it was only the second movie I'd ever been in, but the second movie I was ever in was Idiocracy.
[1244] And then, that character, whether it's good or bad, it was arch.
[1245] You know, I was really swinging for the fences, regardless.
[1246] It's great.
[1247] I was in my trailer and I just, all of a sudden, I was like, my God.
[1248] Could I be the first person to get nominated from a comedy?
[1249] Wow.
[1250] Yeah.
[1251] And for about an hour, I let myself think that that was a possibility.
[1252] Yeah.
[1253] Oh, my God.
[1254] What's tricky is there's a fine line between delusion and fantasy and chasing your dreams. Yes, yes. Because it's actually, the odds of me going from Milford, Michigan, to being in a Mike Judge movie are actually much lower than me, after I'm already in a movie, getting nominated. So in some way, things start seeming really possible. Yeah. That are ridiculous. Yeah. But are they? Like, that's really the big question. Maybe I should have just kept thinking that.
[1255] Well, I believe in that and, like, willing things.
[1256] And it's not that you're willing them and then they just come to you.
[1257] It's that you're thinking about it so much that you're inadvertently taking steps to get there.
[1258] Right.
[1259] So I believe in that.
[1260] But I really, and this is not faux humility.
[1261] I don't think I have the skill set to warrant an Academy Award nomination.
[1262] Like, when I even, oh, my God, Baader-Meinhof, ding, ding, ding, ding.
[1263] Like, when you look at what Christian Bale does.
[1264] Oh, ding-ding-ding.
[1265] And then what I have done, those are like two dramatically different pursuits in a lot of ways.
[1266] It is.
[1267] But you're not trying to do what he does.
[1268] And not everyone who's won an Academy Award is of Christian Bale's transformative level.
[1269] They've just had an incredibly honest performance.
[1270] That's the only thing I've ever thought I could maybe get.
[1271] I could be 100 % true to who I am.
[1272] And if the story was compelling and moving enough, maybe it would land me there, but I, but in no way, you know... like, I would say my only touch of that would be maybe Parenthood.
[1274] It's like somehow it was this perfect part that I could do very naturally.
[1275] And I wouldn't have felt like a fraud if I had been nominated for an Emmy.
[1276] You should have won an Emmy for that.
[1277] You should have won an Emmy.
[1278] I think everyone on that show should have won an Emmy.
[1279] Oh, I know.
[1280] As someone who has been nominated.
[1281] Yes, please.
[1282] Tell me, how do I get on that stage.
[1283] I don't even know how to finish that sentence.
[1284] But when you were little, and you would practice being on Letterman.
[1285] Mm -hmm.
[1286] And then you got on Letterman.
[1287] Like, that's what I'm saying.
[1288] That was a delusion of grandeur and it happened.
[1289] I know.
[1290] You're right.
[1291] Yeah, but the beauty of being ignorant to what's possible is so helpful.
[1292] Yeah.
[1293] And then can cross over really quickly.
[1294] Exactly.
[1295] You really have to police yourself to stay in that middle ground of fantasy.
[1296] Uh -huh.
[1297] Without blowing your ego out, too.
[1298] Because I think where it jumps into something pathological is when you start getting resentful at other people, because you start thinking you did something better than that.
[1299] You deserve it or something.
[1300] Yeah.
[1301] Yeah, the deserving element is where things get bad.
[1302] Yeah.
[1303] And even the evaluating the other people who got nominated.
[1304] Like, once you're evaluating whether someone should or shouldn't have been nominated, it's probably not a great path to be on.
[1305] Yeah.
[1306] Okay, so the conspiracy theory that I fall for the most.
[1307] Oh, please.
[1308] I'm sorry.
[1309] I was rude.
[1310] I got distracted.
[1311] I would have followed up with, which one do you like?
[1312] I think I would be most likely to believe, like, a governmental one.
[1313] Mm -hmm.
[1314] That, like, the government was involved in something.
[1315] Like, it's not 9/11, because I just definitely don't think that.
[1316] Yeah.
[1317] But, like, maybe a assassination.
[1318] Sinking of the Lusitania?
[1319] I think there were conspiracy theories around that.
[1320] Yeah, yeah.
[1321] I feel like more of an assassination.
[1322] Oh, we've 100 % assassinated people.
[1323] That's not even a conspiracy theory.
[1324] There's been a lot of declassified.
[1325] If I was going to believe in.
[1326] I don't believe in any.
[1327] But if I was going to be.
[1328] I don't think the CIA would kill a president.
[1329] I really don't believe that.
[1330] I mean, I think they'd kill other people's presidents and probably have.
[1331] But I don't think they'd kill our president.
[1332] Think how fucking rogue the CIA had gone at that point.
[1333] Think of the boss of the CIA pitching that idea, guys.
[1334] Hear me out.
[1335] Someone would blow the whistle about the president.
[1336] Maybe.
[1337] Yes.
[1338] I mean, yes, that's why I don't think it's real.
[1339] But I definitely don't think the government is so pure that something like that couldn't happen.
[1340] Here's my other thing.
[1341] And this isn't Baader-Meinhof.
[1342] This is attribution error.
[1343] So maybe I'm wrong because I'm this way and I assume everyone else is this way.
[1344] But I know many times I was not supposed to tell my wife or you about things.
[1345] I do.
[1346] Right.
[1347] And I have to imagine other people are like me, they're going to tell their wives.
[1348] And then their wives are going to tell their best friend.
[1349] And then that best friend is going to tell their husband.
[1350] And then the husband is going to tell his best friend.
[1351] I don't think you can keep a lid on these things.
[1352] Secrets like that.
[1353] Yeah.
[1354] I mean, I sometimes wonder about that with my therapist, like therapists and stuff.
[1355] Like how are they not telling their husbands and wives?
[1356] They have to be, right?
[1357] They are.
[1358] But then, yeah, then that person is definitely telling them.
[1359] But I don't, I can't really believe that because I need to be able to talk to that person.
[1360] Well, what you have to do is you have to go like, that could happen, but it's worth the risk because I need this.
[1361] Yeah, that's scary.
[1362] So far, I don't believe in any.
[1363] Well, I think we have corrupted tons of elections.
[1364] I think we've assassinated leaders.
[1365] I think we've probably shot down planes at points.
[1366] I think we've done some horrendous things.
[1367] But what about, like, the Skulls?
[1368] Have you seen that movie, The Skulls?
[1369] Based on the Skull and Key.
[1370] Skull and Bones.
[1371] I think Skull and Key?
[1372] Maybe.
[1373] Oh.
[1374] Yeah, Secret Societies.
[1375] I'm so into secret societies.
[1376] But when I was in high school, I read this book, Behold a Pale Horse.
[1377] It's all of the early conspiracy theories.
[1378] And it's all about the Illuminati and the Knights Templar.
[1379] Are you going to tell us you were invited into the Illuminati?
[1380] I'm going to tell you that I'm the current president of the Illuminati.
[1381] Wow.
[1382] But there are all these weird things, right?
[1383] Like there are a very significant percentage of the presidents were Freemasons.
[1384] Oh, I know.
[1385] And Skull and Key Harvard grads.
[1386] The Knights Templar had these famous people associated with it.
[1387] So, you know, all the coincidences are there for you to really start.
[1388] Oh, the secret societies in those colleges, that is real.
[1389] They really exist, obviously, and they do produce really elite members of society.
[1390] But then what's their mission?
[1391] That's the part where I'm like.
[1392] I think that the truth is, it's just like in Hollywood, like all these people know each other.
[1393] So they say, how about this person?
[1394] How about it?
[1395] It's more like that.
[1396] Like, they all just know each other.
[1397] Agreed.
[1398] And also, it really just sums up your view of the world, right?
[1399] Because I can, even if I can see that there might be a group of, like, elite people that are trying to steer the course of history.
[1401] I don't assume they're trying to destroy history.
[1402] I think they're trying to do something great.
[1403] Yeah, yeah.
[1404] Like my assumption is even, like, I remember reading that Behold a Pale Horse.
[1405] And at the end of it, I was like, I'm kind of grateful that the 20 smartest people in the world are guiding it, if that's true.
[1406] I don't want me and my neighbors guiding it.
[1407] Well, the line can get crossed so quickly, like Social Dilemma.
[1408] Yeah, that has the best intentions.
[1409] There were no bad intentions when all that social media stuff started.
[1410] And now, oopsies.
[1411] Big oopsies.
[1412] Yeah, oopsies, we farted.
[1413] You know?
[1414] Oopsies, oopsies.
[1415] That's what they need to come out.
[1416] Tech needs to come out with a headline that says, oopsies, we farted.
[1417] Ah.
[1418] Let's open the windows and usher that air out of there.
[1419] Oh, my God.
[1420] Okay.
[1421] Dave.
[1422] So his documentary, Tickled.
[1423] We love that doc.
[1424] We talked about it.
[1425] At length, but if you haven't watched it, please watch it.
[1426] It's so good.
[1427] It is on Hulu.
[1428] Hulu.
[1429] Oh, you told him to watch the Roy Cohn documentary, but you didn't mention the name.
[1430] The name is Bully. Coward. Victim. The Story of Roy Cohn.
[1431] It's awesome.
[1432] Do you say Cohen or Cohn?
[1433] I say Cohen, and likely I say it wrong, you know.
[1434] Okay.
[1435] Because it's C-O-E-N?
[1436] No, it's C-O-H-N.
[1437] Cohen.
[1438] Cohen.
[1439] Roy Cohen.
[1440] I think it's Roy.
[1441] Oh.
[1442] Ding, ding, ding, ding.
[1443] Words you don't like.
[1444] Oh, ding, ding, ding, ding, ding, ding.
[1445] Oh, we're using words with letters.
[1446] Okay.
[1447] Okay, so you said you're afraid of bears. And the technical term for fear of bears is...
[1448] And the technical term for fear of bears is.
[1449] Oh, can I guess it?
[1450] Yes.
[1451] Ursula phobia.
[1452] Nope.
[1453] Fuck.
[1454] Did you really think that was the thing?
[1455] Because I think Ursula is, like, the... wait, kingdom, phylum, class, order, family, genus.
[1456] Maybe the genus of bears is Ursula?
[1457] Ursula Major?
[1458] That's a star, right?
[1459] Right, but isn't it a part of that bear constellation?
[1460] Maybe.
[1461] Yeah, I don't know.
[1462] We're way off.
[1463] Ding, ding, ding, ding.
[1464] Okay, it's arctophobia.
[1465] Arcotophobia.
[1466] Architophobia.
[1467] Architophobia.
[1468] Or arcotophobia.
[1469] That sounds like a fear of architects to me, which I have.
[1470] It sounds to me like a fear of maps because cartography.
[1471] Oh, I love cartography and cartology.
[1472] Oh, and you don't have fears of it?
[1473] No, I love cartology and cartography.
[1474] And I love topographic maps a lot.
[1475] When you'd go to the library and they'd have a nice topographic map.
[1476] You were interested.
[1477] Loved it.
[1478] What do you like about it?
[1479] Well, because they build up the mountains and then there's lower portions where the water is.
[1480] Well, you know, I love miniature stuff.
[1481] Oh, yeah.
[1482] And so my ultimate fantasy, oh, I wish I would have said this when we were talking about winning the Powerball.
[1483] If I had a billion dollars, here's what I would do.
[1484] Okay.
[1485] I'd move to Austin, Texas.
[1486] I'd buy a piece of property on Lake Austin.
[1487] Uh -huh.
[1488] And then I would build a lazy river around the property.
[1489] And then I would build miniature mountains, but perfectly to scale of the Rocky Mountains.
[1490] Wow.
[1491] So the lazy river would be the Colorado River.
[1492] And then after you left the Rocky Mountains, you'd go down into the desert of Arizona.
[1493] Oh, my goodness.
[1494] Yeah, it would take the real route of the Colorado River.
[1495] Wow.
[1496] That's cool.
[1497] Wouldn't it be so fun to be in a tube floating and it's like, oh, here we go.
[1498] Now we're going through Colorado.
[1499] Hi.
[1500] Wow, that is pretty cool.
[1501] Wouldn't that be cool?
[1502] That'd be something to do if you had a billion.
[1503] Yeah.
[1504] You'd need a few bill.
[1505] No. Yeah.
[1506] But that's like a ride at Disney.
[1507] Yeah, you wouldn't have enough money to do other stuff.
[1508] You'd spend it all on that.
[1509] You could get that built for probably $10 million.
[1510] No. Yeah.
[1511] The miniature mountains and the water park?
[1512] Yeah, these guys who build these train station sets.
[1513] I'd hire like 20 of them.
[1514] Oh, my God.
[1515] I'd give them each 300 grand to work for the year.
[1516] And between the 20 of them, they would probably build it.
[1517] It'd be the weirdest group of guys out in the backyard, wouldn't it?
[1518] Okay, so since we recorded, we watched the doc he told us to watch, not his, which we do need to really watch, but Feels Good Man. Feels Good Man. And it was really a good documentary.
[1519] That was so good.
[1520] I really recommend it.
[1521] We watched it the same day.
[1522] We watched The Social Dilemma, which was a lot to take in.
[1523] We were pretty pessimistic by the end of the night.
[1524] Yes.
[1525] Yes.
[1526] Man, what the Internet can do.
[1527] And yet it allows this.
[1528] Exactly.
[1529] Utopia and Dystopia.
[1530] All at the same time.
[1531] Please watch it.
[1532] It's really, really interesting.
[1533] It is.
[1534] It is.
[1535] And also just humans are so persuadable, fallible and, yeah.
[1536] Vulnerable.
[1537] Yeah, vulnerable.
[1538] Yeah.
[1539] Us, you and I, too.
[1540] Of course.
[1541] Yeah.
[1542] I know.
[1543] I considered getting off social media for a minute.
[1544] Then I persuaded you not to for business reasons.
[1545] Yeah, and then I listened.
[1546] But I do feel unethical.
[1547] I feel unethical about being a part of the addiction circle.
[1548] Yeah, but can't you just isolate what the utopia is and use the utopia and then avoid the dystopian aspects?
[1549] Of course, but I think that it still adds to when people, if someone has a notification on their phone and it dings and they're looking, like I don't like that.
[1550] Mm -hmm.
[1551] And I want to promote my show.
[1552] That's right.
[1553] What do you do?
[1554] Oof.
[1555] All right.
[1556] Well, that's all for Dave.
[1557] Oh, boy, I love Dave.
[1558] Me too. New Zealander.
[1559] Good day, aye.
[1560] Sweet as.
[1561] That's what they say a lot in New Zealand. They were teaching us how to canoe.
[1562] Like we had two Olympic rowers, Augie, who we loved.
[1563] We were obsessed with Augie.
[1564] He was such a man. He was on shore screaming at me as I was, you know, trying to do the J-stroke and then ferry across and all this shit.
[1565] And the fucking boat was sinking.
[1566] And he's screaming, you know, this is all I could hear.
[1567] Sweet-ass, sweet-ass, sweet-ass, sweet-ass, sweet-ass, sweet-ass.
[1568] And I came to shore and he's like, did you hear what I was saying?
[1569] And I was like, all I heard, man, was sweet-ass, sweet-ass.
[1570] He's like, I wasn't saying sweet-ass, but any time I would lose track of that accent, I would just hear sweet-ass, sweet-ass, sweet-ass.
[1571] And the other thing I love that they do in New Zealand is they do baby talk.
[1572] What do you mean?
[1573] Well, like the wardrobe department would say, do you have your trackies? Your track pants.
[1574] They call them trackies.
[1575] And then they'd say, you want some Inge Brekkie? And that was English breakfast tea.
[1576] So there's all these, like, baby words: Inge Brekkie, trackies.
[1577] And then the best was, Seth was at a gas station and just happened to look at an Australian tabloid.
[1578] Okay.
[1579] And they kept referring to women's breasts as blampers.
[1580] Blampers?
[1581] And some other hysterical word, blampers and something.
[1582] So Seth and I, all we talked about for three months were blampers.
[1583] Oh, blimey, look at the blampers on that one.
[1584] Who could even take that seriously?
[1585] Well, she had a gorgeous set of blampers.
[1586] Well, on that note.
[1587] On that note, good day.
[1588] We love you.
[1589] Good day.
[1590] Oh, ding, ding, ding, ding, ding, ding, ding, ding.
[1591] Follow Armchair Expert on the Wondry app, Amazon Music, or wherever you get your podcasts.
[1592] You can listen to every episode of Armchair Expert early and ad free right now by joining Wondry Plus in the Wondry app or on Apple Podcasts.
[1593] Before you go, tell us about yourself by completing a short survey at Wondry.com slash survey.