Armchair Expert with Dax Shepard
[0] Welcome, welcome, welcome to armchair expert, expert, expert.
[1] I'm Dax Shepard.
[2] I'm joined by Maximus Mouse.
[3] Hello.
[4] How are you doing, Maximus?
[5] I'm feeling Maximus today.
[6] Oh, good.
[7] I like you on your maximum.
[8] It's Saturday.
[9] Even though days are nothing, it's still the weekend, so we're still going to celebrate.
[10] We have a really interesting gentleman today by the name of Andrew Marantz.
[11] He is an American author and journalist.
[12] He wrote the 2019 book Antisocial: Online Extremists, Techno-Utopians, and the Hijacking of the American Conversation.
[14] He went undercover to find out who's writing all these crazy messages on the internet.
[15] Yeah, he's really interesting.
[16] He gets to know people that we kind of, I think, in society deem as like...
[17] Trolls.
[18] We'll write them off as trolls generally.
[19] Yeah.
[20] And then there's varying levels of that where they're now involved in semi -terrorist plots.
[21] Uh -huh.
[22] Yeah, it gets dark.
[23] Yeah, it sure does.
[24] So please enjoy Andrew Marantz.
[25] Wondery Plus subscribers can listen to Armchair Expert early and ad-free right now.
[26] Join Wondery Plus in the Wondery app or on Apple Podcasts.
[27] Or you can listen for free wherever you get your podcasts.
[28] He's an armchair expert.
[29] No, I know her from the New Yorker Festival.
[30] They do these live interviews once a year.
[31] part of just like an event.
[32] So I did, I interviewed her for that.
[33] On stage?
[34] Yeah.
[35] I'll deco.
[36] It was fun.
[37] It's a fun, it's a weird form because it's very, very different from the form you would do for a magazine interview.
[38] Right.
[39] When I'm doing print interviews, I'm not at all worried about how it sounds in the moment.
[40] Of course.
[41] Right, right.
[42] I'm literally the only person who will ever hear it.
[43] Yeah.
[44] So I'm very used to doing interviews in a way that's really like repetitive, asking the same questions six different times, circling back to things.
[45] Uh -huh.
[46] And then the live interview format is completely different.
[47] Especially, I'm also used to doing like adversarial interviews, which obviously a live interview you don't do adversarial.
[48] Very uncomfortable for people.
[49] Totally.
[50] Their mirror neurons start firing in there.
[51] Yeah, they don't enjoy it.
[52] Laughs uncomfortably.
[53] Uh -huh.
[54] But like when you're, when I'm doing it for magazines, I'm like, you know, I could be there for days, hours, weeks.
[55] Yeah.
[56] So you get your little digs in there over time.
[57] Yes.
[58] It's just a whole different form.
[59] Yeah.
[60] Yeah.
[61] Yeah, so I'm wondering, like, how often do you go in with an approach, or have you ever tried to go in with an approach where it's like, I'm not going to go after that thing, and then ended up getting it?
[62] Totally.
[63] Yeah.
[64] And it's all person to person.
[65] Right.
[66] It's all just, it's like any other relationship, except in other relationships, you're hopefully not being as weird and duplicitous as you are in a journalistic relationship.
[67] Yeah.
[68] You know, it's what Joan Didion said: writers are always selling someone out.
[69] And that's a cynical take, but it's also true.
[70] like, you know, look, there's obviously good sides to what journalists do and unearthing truths that the world needs to hear.
[71] And there are pieces that are just profiles that are celebrating some aspect, right?
[72] That you would like.
[73] Totally all the time.
[74] And like there are times where, like, for instance, I did a profile of Leslie Jones right after she got on SNL.
[75] And her publicist was really nervous and was like, please, like, she's such a nice person.
[76] Like, please don't do a hit piece on her.
[77] And I was like, imagine the universe where I go in and like want to slam this woman.
[78] who is having, like, the time of her life, this well -earned thing, and I'm just like, I'm going to take her down.
[79] Like, that's not, that is just not what I'm here to do.
[80] But depending on your ambition level, the stage of your career, if you're trying to break out and make a name for yourself, that you can see where that's on the table for some people.
[81] Oh, and there are always ways.
[82] I mean, in that piece, there were two or three times when I was like, I could do some damage here.
[83] Yes.
[84] It, like, falls into your lap and you're like, it's not what I want to do.
[85] It's not worth it.
[86] That's why, you know, journalism is a very, like, high trust endeavor because there are built -in checks and incentives.
[87] And there's obviously, I mean, with the New Yorker particularly, there is this extensive fact -checking process that happens not only with investigative pieces where there might be lawsuits, but with every piece.
[88] Essentially, the piece is being re-reported.
[89] Yeah.
[90] And everybody always has a chance to weigh in.
[91] So there's that.
[92] But ultimately, I kind of do have to say to people, like, you kind of just have to trust me that I'm not going to, that I know what I'm doing.
[93] Yeah.
[94] So I increasingly don't do print interviews.
[95] It has to pass through your filter.
[96] It has to pass through my filter.
[97] That's just how we are, right?
[98] It's just going to go in your ear and come out your fingertips.
[99] And so that process scares me because so much of what I say is informed by how lighthearted I'm saying it, my delivery, all these different things that get lost in print.
[100] I've agreed to do them for the podcast because I care a lot about the podcast and I care a lot about Monica.
[101] And I like for us to be able to do that together.
[102] But we had New York Times, we did a piece on us.
[103] It was so lovely while we were here. And then over the next three days, I convinced myself, oh my God, this is going to be a hatchet thing. Like, she was posing as someone who actually likes the show, but then we got into other issues that are sticky, and I'm like, you know, if she wants to hone in on those things, it's all right there, and I said it all. Her name is Eliza Brooke. She was lovely. She was incredibly lovely. And I was like, that was great. After she left, I was like, I think that was fun and great.
[104] And he was like, I don't know.
[105] There always are ways.
[106] There always are ways.
[107] Can I say, though?
[108] But then it was very positive.
[109] It was a beautiful, kind, lovely, nice, fair thing.
[110] And I was like, oh, I just completely spun out about it because I'm just so nervous.
[111] I mean, maybe I should just be, like, taking the journalist's side in this.
[112] And ultimately, she did right by you.
[113] Right.
[114] But I don't blame people who are worried.
[115] I really don't.
[116] Right.
[117] Whenever somebody doesn't want to talk to me, sometimes it's for dumb reasons. I hang out in a lot of spaces where people are like, you're fucking fake news and all this stuff, where I'm like, that's a bad version of that argument, but I still get where the sentiment comes from, because you're always taking a risk. It's always a cost-benefit. I mean, I remember when I was doing that piece, first of all, I had to spend a long time trying to talk my way into just being around, to be a fly on the wall with anything involving SNL, because one time Lorne got burned by someone who came in and said, I want to hang out, I want to see what you guys are doing.
[118] And the cover ended up, not of the New Yorker, the cover of New York Magazine ended up being Saturday Night Dead.
[119] And it was like, his show fucking sucks.
[120] They let them into the writer's room.
[121] And whenever you're in a writer's room, people are going to be throwing out jokes that don't land and you can make them look bad.
[122] So that happened in 1995, I think.
[123] And since then, Lorne was just like, not doing this anymore.
[124] Not happening.
[125] And so then, exactly.
[126] So then I had to essentially like work for months to just sneak around.
[127] And again, this is not like me trying to like bust some big scandal or some embezzlement scheme.
[128] It's just me trying to get the scenic material I need to do a piece that I think is worthy of that I think is good.
[129] Yes.
[130] And that I think shows in real narrative time how this person operates and why she's funny.
[131] Because it's very hard to capture why someone is a creative mind if you don't see that at work.
[132] Yeah.
[133] Sure.
[134] Anyway, it took a long, long time for me to, like, work my way onto the set.
[135] I got there the week that Louis C.K. was hosting.
[136] Okay.
[137] And it was pre-revelations.
[138] So I was like, this is weird.
[139] This guy really doesn't want to talk to me at all.
[140] I wonder why.
[141] And now I think I know why.
[142] But she was in this sketch where it was a whole day, and there was a lot of just shooting the shit in between takes.
[143] Yeah.
[144] And I really wanted to show how funny she is.
[145] So they started just riffing.
[146] It was shooting on a rooftop.
[147] and it was her, Louis, Kenan Thompson, Mikey Day.
[148] They were all these people just sort of riffing, and they were looking out at Brooklyn, and they were just doing dumb bits about, like, I hate that billboard, and I fucking hate banks because banks are like this, just whatever they saw.
[149] Yeah.
[150] And then they saw a bunch of orthodox Jews.
[151] Oh sure.
[152] Uh -huh.
[153] And just went and I fucking hate Jews because blah, blah, blah.
[154] Uh -huh.
[155] And in that moment, I'm Jewish.
[156] Yeah, yeah.
[157] There was no part of me that was like, oh my god, she hates Jews, like zero percent, right? But I was like, if I wanted to, I could destroy everyone's life in print. Yes, yes, yes. It just would not... And so I just had to very quickly do the math in my brain of, like, is there any way I can translate why this is funny in the moment? No, there isn't. I'm just going to pretend this never happened. Yes, for sure. Monica's new favorite thing is to tell Kristen when she's been canceled. So, like, we all virtually live together. Monica is living in our house right now.
[158] And yeah, Kristen will say something and we're like, that's it.
[159] You're canceled.
[160] That was being broadcast to everyone.
[161] You're out.
[162] That's it.
[163] That's a wrap on you.
[164] And even she, Kristen Bell, says things that could get her canceled.
[165] We all get canceled many, many times a day.
[166] Well, it's also hard to write things when you have the little cancellation angel and devil on your shoulder.
[167] I can't imagine it's helped your access or the openness of people.
[168] It doesn't help with access to other people.
[169] It also doesn't help me as a writer to be free enough to write things down even as a first draft.
[170] Because a lot of the process of writing is to just try stuff out.
[171] Throw shit at the wall.
[172] And then read it a week later and see if it works.
[173] And it's hard to just be like, oh, my God, I can't even type that.
[174] Yes, yes, yes.
[175] So I've been fascinated recently because I guess I didn't even realize this person existed, but both in reading Catch and Kill and listening to the Epstein podcast, I'm blown away with the audacity of some of these folks who do have a humongous secret to hide, actually reaching out and having the arrogance to think you can steer the whole thing in your direction.
[177] There's no way any journalist was fooled by him sniffing around.
[178] You know, it just is such an obvious.
[179] Yeah.
[180] Although, you know, I mean, the Epstein thing and the Weinstein thing stayed buried for quite a long time.
[181] Yeah.
[182] Yeah.
[183] So it's not like it doesn't work.
[184] Yeah.
[185] I just, I was trying to imagine myself being proactive in a crisis like that.
[186] And it just would talk about a swing for the fence.
[187] For lack of a better word, it's impressive that they have that level of gall of like, oh, I'm just going to...
[188] It's crazy.
[189] Well, and the people that I was covering for the last few years have a similar kind of gall to them.
[190] They really, I mean, I was covering a lot of people who, they knew that there was no way that I was going to be favorably disposed to them.
[191] Yeah.
[192] They just knew based on where I was coming from.
[193] You're from New York.
[194] From New York.
[195] I mean, I literally would be in reporting situations with various alt -right types or whatever they would call themselves MAGA people.
[196] They would call themselves deplorables.
[197] They had all kinds of different words for what they were: civic nationalists, white nationalists, whatever. It ranged from, like, actual Nazi-Nazi types to very, very soft people who would never, ever identify as racist and who would be very offended if you said that.
[199] Right.
[200] And there would be a few moments, if not endless moments, where they would sort of be like, what's your deal, man?
[201] Like, what are you doing here?
[202] How are you going to treat us?
[203] Are you going to lie about us?
[204] Right.
[205] So all those questions have different answers.
[206] Yeah.
[207] And I was very clear with myself, okay, I'm not going to lie, but I'm also maybe not going to say every single thing I'm thinking.
[208] Because in no situation, do you say everything you're thinking?
[209] No. But I was like, I ethically feel like I can't say something that's not true.
[210] I couldn't say to them, I love everything you do.
[211] I'm a big fan.
[212] Yeah.
[213] Yeah.
[214] Actually, let me just back up for half a second and just say that your book, Antisocial: Online Extremists, Techno-Utopians, and the Hijacking of the American Conversation, is the book that you're referencing. As I understand it, it started maybe as an exploration of trolls, and then you were observing them online, and then you wanted to get in front of them, in front of the real people that were creating all these things?
[215] Yeah.
[216] Well, so what it really started as was not even about where are the bad people or what are the bad people doing online.
[217] What it really started as was what is the internet doing to us?
[218] Oh, uh -huh.
[219] What is the internet doing to us as a society?
[220] What's it doing to us psychologically, emotionally?
[221] What's it doing to our belief structures?
[222] Which, you know, I started sort of getting at that stuff in like 2014, 2015, when it was obviously like a big important question, but it wasn't like the question.
[223] Yeah.
[224] And it felt like a little bit off to the side.
[225] I always felt a little bit kind of like, am I kind of wasting my time looking at like weird parts of the internet?
[226] Like it always felt, especially sitting in the New Yorker offices in the World Trade Center and having these people I idolize, you know, like some of the best writers in the world, walking past my office and being like, why are you looking at why misogyny is cool?
[227] You know, and I'm like, it's for a project.
[228] Your search history was probably a little embarrassing.
[229] My search history and my algorithms.
[230] I would go on YouTube and YouTube would be like, do you want to buy a diamond mine?
[231] Do you want to buy a secret gun holster?
[232] I didn't have the foresight to do like a separate work and personal account.
[233] So it would just be weird pickup artist stuff and then like LeBron highlights and Stephen Colbert.
[234] And so the algorithm was like, who are you?
[235] Yes.
[236] You're describing my Netflix account, because my kids signed into my name, and so I get recommended all these unicorn cartoons, and I just think, well, they are off base on me. I think it's kind of cool to just confuse the algorithms as much as possible. Sure, sure. But so it started out as these kind of more abstract questions, and less in a political context. It didn't seem like a politics story at the time. It seemed like maybe a tech story, a business story, a kind of culture story. And so I was exploring, like, okay, it feels like everything is getting broken into these decontextualized, chopped-up bits, where you never go to any homepage ever anymore. It's just your feed throwing things in your face. Yeah. And that goes for everything. That goes for brands, companies, people, whatever. It's all just sort of based on what the algorithm wants to send you. Yeah. Which, as we just discussed, is based on search history, but it's also based on triggering the most extreme, immediate emotions as quickly as possible. Yeah. And so those are good fuel for action. Yes. Yeah. And specifically the actions of click, scroll, share, comment. Those are literally the only things the algorithms can measure.
[238] Yes.
[239] The algorithm can't reach into your brain and go like, wow, this thing really made you think.
[240] It really changed you.
[241] It really made you a better person.
[242] It is just blind to all of that.
[243] All it knows is, did you smash button?
[244] Right.
[245] Did we capture your attention for this amount of time?
[246] Totally.
[247] And get you to do things on top of the time.
[248] Get you to comment, I fucking hate this or I love this.
[249] Those are the only things it can capture.
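To make that point concrete, here is a minimal, purely illustrative sketch (in Python, not any platform's actual code) of a feed-ranking score that can only see measurable engagement actions like clicks, viewing time, shares, and comments; the weights and post names are invented for the example.

```python
# A minimal sketch of an engagement-only ranking signal: the inputs are
# things a platform can measure, and nothing can represent "this made me
# think" or "this made me a better person".

from dataclasses import dataclass

@dataclass
class EngagementStats:
    clicks: int
    seconds_viewed: float
    shares: int
    comments: int

def engagement_score(stats: EngagementStats) -> float:
    # Hypothetical weights; the only available levers are measurable actions.
    return (
        1.0 * stats.clicks
        + 0.1 * stats.seconds_viewed
        + 3.0 * stats.shares
        + 2.0 * stats.comments
    )

# Ranking a feed by this score favors whatever provokes the fastest,
# strongest reactions, whether the reaction is "I love this" or "I hate this".
posts = {
    "measured essay": EngagementStats(clicks=40, seconds_viewed=900, shares=2, comments=3),
    "outrage bait": EngagementStats(clicks=400, seconds_viewed=300, shares=80, comments=120),
}
ranked = sorted(posts, key=lambda p: engagement_score(posts[p]), reverse=True)
print(ranked)  # ['outrage bait', 'measured essay']
```

The only point of the sketch is that when the score can only see button presses, the content that triggers the most immediate reactions wins the ranking.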
[250] So, in a way that wasn't intentional when these Silicon Valley Disruptor people set out to do this.
[251] I think they really believed, yes, they wanted to make money and all this stuff, and they wanted power.
[252] But I think they really believed, like, we are going to give people what they want.
[253] Oh, for sure.
[254] And then they just didn't fully think through.
[255] It's like how when you are like building cities and you're like, hey, we've got this great thing called a car and it can get you places faster.
[256] Why would we not do that?
[257] And then 50 years later, you're like, oh, that's going to literally bake the planet.
[258] It's kind of equivalent.
[259] They just didn't think it all through.
[260] Yeah.
[261] Well, it's moving so fast in their defense.
[262] To pretend someone could be ahead of it is a little naive.
[263] Totally.
[264] Totally.
[265] I think, I mean, we'll get into this, but I think there are ways they could have and should have adjusted over the course of that time instead of they've kind of dug in their heels.
[266] Well, we just interviewed the head of Instagram.
[267] Adam.
[268] And he was, I thought, very transparent in there are many failings.
[269] He was in charge of the Facebook news feed before he went over to Instagram.
[270] And he was being vocal and sounding alarms and all these different things.
[271] And yeah, he's just like, yeah, we didn't think of everything.
[272] Also, we learned things like real time as the media is learning it and pointing it out, quite often we're learning it at the same time.
[273] So, yeah, I'm mildly sympathetic to what they're doing.
[274] Oh, I am too.
[275] And it's not like the point of the work or the book is to be an expose of like, these are the worst people in the world or, you know, these are not like Harvey Weinstein type figures.
[276] They're more like, I would say, the people who designed cities and buildings and cars, who thought they were building this beautiful thing, you know, Le Corbusier or Robert Moses or any of these people, where you have this beautiful thing being built that also has these massive consequences. And then it's a question to me of how much can you think and self-reflect and be kind of supple and flexible enough to go, like, the thing you built your entire life on, not only your fortune and fame, but also just your entire life. If you think about Mark Zuckerberg's life. Yeah, yeah. Literally his entire life, 90% built on this one thing since he was 19 years old. Oh, and I'm sure when he closes his eyes and thinks of himself, I think Facebook is intermeshed into that.
[277] Oh, it's, how could it not be?
[278] I mean, it's like, and I try to think of it from the perspective of, like, if you suddenly told me that trying your hardest to write beautiful sentences that capture the perfect crystalline reality of the world we live in is actually, like, killing babies in Myanmar.
[279] Sure, sure.
[280] I'd be like downstream.
[281] What?
[282] Yeah, yeah.
[283] It doesn't make sense.
[284] It's hard to wrap your head around it.
[285] Yeah, it would take me a long time.
[286] Yeah.
[287] But there is this thing from the philosopher Richard Rorty, who's like a little bit of a sort of backdrop to the book.
[289] So Rorty is this massively important American pragmatist philosopher who's kind of hard to pin down simply, but one of my favorite of his books is called Contingency, Irony, and Solidarity.
[290] Contingency is essentially the end of the story isn't written yet.
[291] Everything is contingent.
[292] Everything could have happened one way or another way.
[293] Like essentially he's arguing against this idea of progress will sort of become this thing that works itself out.
[294] Whether it's through Marxism, you know, there will be this natural class struggle or whether it's through enlightenment, you know, the rationality of humans will naturally lead us.
[295] He's against all that stuff.
[296] It's not going to, there's no natural course of history.
[297] It has to be decided.
[298] People have to do it.
[299] Yeah, yeah.
[300] So, like, it may be the case that human ingenuity and rationality, like, bring us to a better place, but it's not just going to happen. It's not a given.
[301] Yeah.
[302] It's not this Hegelian sort of course of history thing.
[303] And the reason I thought that was important to bring up for this stuff is that when you are someone who just, at 19, drops out of college and starts a thing that ends up being the biggest information tool ever known in human history.
[304] You're not a philosopher.
[305] You're not a historian.
[306] You're building something cool.
[307] And you, I think you imbibe without really realizing it, just these cultural ideas of like, free speech, good.
[308] History is a march of progress.
[309] We live in a time where, you know, the more globally we get knitted together, the fewer wars we'll have, the more prosperity we'll have.
[310] Like, all these things you don't articulate them, but they're just like the water you're swimming in.
[311] Yeah.
[312] So that when you build a thing where you're like, we're going to knock down all the barriers, we're going to tear down all the walls, we're going to disrupt, we're going to innovate, I think there's basically no part of your brain going, hold on, am I going to start a world war?
[313] And you're not questioning the destination.
[314] So you have these ideals and you're like, I'm creating this tool that will help us get to this thing that I think we all agree, right, is where we're going.
[315] So let's get there faster.
[316] Let's get there faster, exactly.
[317] And so there's this thing where I call it the gleaming vehicle of progress.
[318] It's just like, look at this vehicle.
[319] It's so fast.
[320] It's so sleek.
[321] It's so beautiful.
[322] And nobody asks where it's going.
[323] Because it's like, well, it's just like a self -driving car.
[324] Like, it'll get us there.
[325] Right.
[326] And you're like, what if there's a giant wall that it's about to run into?
[327] Uh -huh.
[328] Well, don't you think an element of that is because I will sometimes get spun up about a new direction we're heading in societally, governmentally, whatever you want to call it, globally?
[329] And I'm starting to get wound up about it.
[330] Let's say it's generally, because I'm getting older, triggered by younger generations and what they prioritize and how they compute the world.
[331] and I'm like, I'm bristling at it.
[332] And then a bigger part of my brain goes, this is never stopping.
[333] It's never slowing down.
[334] It's never ceasing to evolve.
[335] There is no destination.
[336] It has a momentum that is, I'm irrelevant in the equation.
[337] So even if I don't think that we've agreed upon the destination, I do feel like the march forward is inevitable and almost not worth my trying to steer it.
[338] I guess what I would say is the march is inevitable.
[339] Mm -hmm.
[340] It's the forward part that I question.
[341] Okay.
[342] So I think one thing that I had to work hard to sort of clarify is that it doesn't mean that there are these two choices that are optimism or pessimism or phones will knit us all together in this beautiful way or throw all the phones.
[343] Yeah.
[344] I think the way out is neither optimism nor pessimism but this concept of contingency.
[345] Uh -huh.
[346] It is going to be what we make it.
[347] So rather than let's throw all the phones off a bridge or rather than let's sit back and let the phones do whatever they do because it'll probably be good.
[348] It's just we don't know.
[349] We have to make the future be what it is.
[350] Right.
[351] So, and where I was going with the contingency, irony, solidarity, the irony part is what he means by irony is this very technical definition of irony, meaning like the ability to reflect on the possibility that your deepest held convictions might be wrong.
[352] Yeah.
[353] And that is something that, where I think a lot of the Silicon Valley people really struggle.
[354] So you started by saying, well, how did you get to the trolls?
[355] all this very airy, abstract stuff that was sort of swimming around in my brain because of how I like to work.
[356] I did not want to say, okay, let me make an argument about this stuff.
[357] Let me make a polemic screed about why I think social media is good or bad.
[358] What I like to do is take all these ideas and confusions and sort of like things that I'm not sure how to feel about and go out into the world and try to like weave together people and narratives and moments like a kind of documentary film...
almost, you know, an art project almost. Right. But I guess the fact that you're interested in it, before you launched into this, you're kind of aware, obviously. You're seeing that, let's say you're on Twitter, where most of us are on Twitter, you're like, oh wow, there's a lot of angry people, there's a lot of people that are in camps, there's a lot of fighting. The place I come from is, like, this at least aspect of it seems to be the worst of our nature, or that there's this anonymity to it that allows us to act in ways we actually don't act. Like, what place were you starting from?
[360] Were you starting from frustration with the way that, say, social media and the internet's working?
[361] Yeah, frustration and kind of anxiety of like, where is this how bad will it get?
[362] Yeah.
[363] Some of it was sort of observing Twitter and Reddit and all these things.
[364] And anonymity we should get back to because Facebook is not anonymous, but Facebook is full of problems.
[365] So it's not any one of these factors.
[366] Anonymity is one factor, but there's lots of other factors too.
[367] Because I spent a lot of time at the offices of Reddit, which is anonymous, but in many ways they're more proactive about cleaning up their messes.
[368] Okay.
[369] Because Reddit is a super fascinating part of this.
[370] But it's the one thing I still don't understand.
[371] Conceptually, literally.
[372] People tell me about, like, Reddit threads.
[373] I don't know what they're talking about.
[374] I met the kid who invented it.
[375] He sold it.
[376] Yeah, Alexis.
[377] Great guy.
[378] Alexis loved him.
[379] Even asked him, I don't think I understood it after he was done talking.
[380] I'm like, I don't know.
[381] Someone starts a topic and then people just, they write indefinitely on the same topic.
[382] Is that what it is?
[383] This is another thing where being in the room helps you.
[384] Like, I just sat in the Reddit offices in San Francisco for a long time.
[386] The basic architecture of it is very simple.
[387] It used to just be links.
[388] Here's a link.
[389] Then they added comments.
[390] Here's my comment on that link.
[391] Then they added up votes and down votes.
[392] I liked your comment.
[393] I didn't like your comment.
[394] They're thinking, again, because of this kind of basic sort techno -utopian frame was like, this is democracy.
[395] It's literally voting.
[396] Yeah, yeah.
[397] Up or down.
[398] They didn't anticipate pile-ons.
[399] They didn't anticipate what they call brigading.
[400] They didn't anticipate somebody going, I'm anonymous user 67, and I'm just going to destroy her regardless of what she says.
[401] Ah, okay.
[402] I say her because it's usually her.
[403] Yeah, yeah.
[404] Who are the targets of, you know.
[405] Yeah.
[406] The disagreement immediately unravels into, I hope you get raped.
[407] Totally.
[408] Generally, right, there's always some kind of threat of physical safety to a woman.
[409] Yes, or everybody jump in here and downvote her as quickly as possible.
[410] Yeah.
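As an aside, here is a stripped-down, hypothetical sketch of the architecture he's describing, links, then comments on links, then up and down votes, just to show why a coordinated pile-on looks identical to organic voting from the system's point of view; the classes and example data are invented.

```python
# A toy model of the "it's literally voting" design: visibility is just
# net votes, highest first. A brigade moves that net score exactly the
# way organic voters do, so the system cannot tell the difference.

from dataclasses import dataclass, field

@dataclass
class Comment:
    author: str
    text: str
    ups: int = 0
    downs: int = 0

    @property
    def score(self) -> int:
        return self.ups - self.downs

@dataclass
class Link:
    title: str
    url: str
    comments: list[Comment] = field(default_factory=list)

def ranked_comments(link: Link) -> list[Comment]:
    # Sort purely by net votes, most visible first.
    return sorted(link.comments, key=lambda c: c.score, reverse=True)

post = Link("Cat rides skateboard", "https://example.com/cat")
post.comments.append(Comment("alice", "Lovely!", ups=12, downs=1))
post.comments.append(Comment("target_user", "I disagree.", ups=3, downs=40))  # pile-on
for c in ranked_comments(post):
    print(c.author, c.score)
```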
[411] And this is part of the answer to how I was able to do some of this work.
[412] It was just like they don't go after men in the same way.
[413] I met some real unsavory people.
[414] Uh -huh.
[415] And they didn't like me because I was a journalist.
They didn't like me because I was Jewish, in some cases.
[417] But I'm not a woman.
[418] So, like, I'm kind of okay.
[419] Right, right.
[420] It sucks, but it's real.
[421] My wife would sort of listen in sometimes. I would do these interviews in the house and she could hear, and she'd be like, they weirdly kind of want your respect, even though they fucking hate you.
[422] For sure.
[423] Interesting.
[424] They hear your voice.
[425] It's bizarre and it's like at a deeper level than is.
[426] Yeah, there's some primal stuff happening.
[427] Yeah, yeah.
[428] For sure.
[429] So Reddit is kind of in a way, the purest expression of some of the, these things.
[430] It's very stripped down.
[431] The design is very, very old school.
[432] It's like a message board, yeah, as I would have thought of it.
[433] It's like a message board.
[434] It's also gets more traffic than Twitter or Amazon or...
[435] Really?
[436] Netflix or...
[437] And does it have the impact downstream that say Twitter does?
[438] Because in my observation, maybe you can correct me, you would know more about it than I do.
[439] I have this theory that the farthest left and right 5 % of the spectrum have the loudest voices and they are commandeering a disproportionate percentage of the news cycle by having these loud, clickable opinions and takes on everything.
[440] Yeah, and some of it is left and right, but some of it is just attitude, temperament.
[441] Some of it's not like, I want a 39 % tax rate and you want a 41 % tax.
[442] It's just like, fuck you, I want higher taxes or fuck you, I want lower taxes.
[443] It's not really a left or right thing.
[444] It's more just to, like, who can get those emotions stirred up.
[445] Yes.
[446] And that was the thing where these guys, so I mostly spent time, with Steve, because Alexis had kind of moved away from the company.
[447] Steve and Alexis started it together in their dorm room in Charlottesville, Virginia.
[448] They went to UVA.
[449] Yeah.
[450] Then they went off, you know, they were in their 20s.
[451] They made a few million dollars.
[452] They sold the company.
[453] They went off.
[454] They sold it early.
[455] They sold it way too early.
[456] But does Reddit stuff track upwards towards policy and news cycles and all that?
[457] Oh, yeah.
[458] Everything is on Reddit.
[459] There's more than a million.
[460] Crazy stuff, right?
[461] Like, there's some like horrific stuff happening on Reddit.
[462] Oh, totally.
[463] Okay, okay.
[464] Like really bad.
[465] Well, Reddit is one of the reasons I spent so much time at Reddit specifically is because it is in many ways one of the clearest mirrors onto what social media is because it's so stripped down and it's everything is on there.
[466] The worst stuff, the best stuff, the creativity, the partnership.
[467] I mean, you know, I was focusing on the antisocial side of it, but there's a huge amount of pro -social side of, and not only on Reddit, but everywhere.
[468] People are finding their mob, they're finding their tribe, they're going to events, they're...
[469] forming a life based on all that.
[470] Yes.
[471] By the way, all the things you said, forming a mob going to events.
[472] Yeah, mob was the wrong word.
[473] No, but that's, it is exactly like the mirror.
[474] It is just both at the same time.
[475] It is the pro -social and the antisocial exactly at the same time.
[476] Right.
[477] And one can't exist without the other.
[478] Yeah.
[479] So the reason I wanted to delve into the bad stuff is not because I'm a masochist or because I like hanging out with the worst people, but because I just felt like that part had been under explored.
[480] Yeah.
[481] I felt like essentially the first 10 years of social media was them being just given a free halo effect.
[482] Everything was pro -social.
[483] Everything was Tower Square and there's going to be these revolutions.
[484] Yeah, the Arab Spring.
[485] And I was just like, there's no one really digging.
[486] Not no one.
[487] There were people, but it just the collective narrative was not being seen in this balanced way.
[488] Yeah.
[489] So the first thing I looked at in 2014 was this guy in Chicago who was building a clickbait website that was not political at all.
[490] Okay.
[491] It was cats riding skateboards.
[492] and he...
[493] Feel good stuff.
[494] Feel good stuff.
[495] Wasting time stuff.
[496] Yeah.
[497] So I met him at a dinner. I was at a tech conference, and I was there to do something else, but he was randomly seated next to me, and he started telling me what he did. And I was like, it's not intrinsically harmful what you're doing, but you are wasting people's time.
[498] You're not making them smarter.
[499] He looked at my little name tag and said, oh, so you work for the New Yorker.
[500] Let me tell you how to improve the New Yorker.
[501] You need more pictures.
[502] You need shorter sentences.
[503] You guys are going to get way more clicks if you just do, like, listicles.
[505] Yeah.
[506] And I was like, fuck you, buddy.
[507] Like, you're not wrong.
[508] Right, right, right.
[509] But the reason I'm annoyed is because you're right, but you're just, like, smug about it.
[510] Sure, sure.
[511] Or that that would be a defendable goal.
[512] That they'll be good for the world.
[513] Yeah, yeah.
[514] Yes, it's good for your business.
[515] But you've conflated, and this became an obsession with me, people who conflate what's good for them and what's good for the world, what's good for the bottom line and what's good, period. And that's a particular internet problem.
[516] So I was so intrigued and provoked by this guy that I said, I said, I'm going to do a profile of you.
[517] And he was like, okay, great.
[518] Yeah, I hate your magazine.
[519] It's fucked.
[520] Yeah, I'd love to be a subject of your profile.
[521] And I also was aware that, like, I have my biases and I'm writing about a new media thing in an old media thing.
[522] I don't want to be just like catty about it.
[523] So I tried to really understand where he was coming from.
[524] And he was a smart guy.
[525] He was a nice guy.
[526] But then our value system was so different, particularly on this point that he kept saying to me, well, our motto is, if it gets clicks, it's good.
[527] And I would say, do you mean good for the money in your pocket, or good ethically, or quality-wise? And he was like, I don't understand the difference. Right. Stay tuned for more Armchair Expert, if you dare.
[528] What's up, guys? It's your girl Keke, and my podcast is back with a new season, and let me tell you, it's too good. I'm diving into the brains of entertainment's best and brightest, okay? Every episode I bring on a friend and have a real conversation. And I don't mean just friends, I mean the likes of Amy Poehler, Kel Mitchell, Vivica Fox, the list goes on.
[529] So follow, watch, and listen to Baby, This Is Keke Palmer on the Wondery app, or wherever you get your podcasts.
[531] We've all been there.
[532] Turning to the internet to self -diagnose our inexplicable pains, debilitating body aches, sudden fevers, and strange rashes.
[533] Though our minds tend to spiral to worst -case scenarios, it's usually nothing, but for an unlucky few, these unsuspecting symptoms can start the clock ticking on a terrifying medical mystery.
[534] like the unexplainable death of a retired firefighter whose body was found at home by his son, except it looked like he had been cremated, or the time when an entire town started jumping from buildings and seeing tigers on their ceilings.
[535] Hey listeners, it's Mr. Ballin here, and I'm here to tell you about my podcast.
[536] It's called Mr. Ballin's Medical Mysteries.
[537] Each terrifying true story will be sure to keep you up at night.
[538] Follow Mr. Ballin's Medical Mysteries wherever you get your podcasts.
[539] Prime members can listen early and ad -free on Amazon Music.
[540] And, you know, I got to say, I'm going to do this a bunch during this interview, and I'm going to get in trouble at times, but I'm going to do it anyways, which is, what did your parents do for a living?
[541] Doctors.
[542] There you go.
[543] So I will say I get a little frustrated with people that will pat themselves on the back too much, because if you're born into a household of doctors, yeah, you're starting at a pretty good spot where they've already dedicated their lives to helping people.
[544] My dad sold cars, and if he could sell one with a dent in it that no one saw, that was an extra $200 we could make, you know. So my dad was, fucking take what you can get and run like hell.
[545] And so I can relate very often.
[546] Monica and I will get in these conversations and I'm like, I hate to say it, but I have that obsession and the coveting of money, and at all costs.
[547] And what's fair?
[548] Oh, because someone went to that school, that's fair?
[549] Fuck that.
[550] The whole system's not fair.
[551] I'm going to get mine.
[552] So I no longer feel that way, but I definitely understand how many people are given the worldview of, this is a fucking video game, go get some points. Totally. And so I don't think you blame people so much if that's what they were born into. I totally agree. Yeah. Well, I agree in the sense that I don't think the most productive place to go with that is to go, this is a bad person. Right, right, right. I want to condemn this individual as a bad apple. I think the thing that's important is, what are the systems that are driving people to behave the way they are? It's also, I just want to point out, and it's something I find myself, it is a luxury to be finding purpose.
[553] I'm so grateful to have the luxury, but for many years, purpose wasn't on my agenda.
[554] It was like, get some fucking money so I can stop sweating the price of gas.
[555] You know, it all becomes like what's on fire.
[556] Yes.
[557] That's the Maslow's pyramid thing.
[558] Yeah.
[559] And, you know, then the question is, do you get addicted to that?
[560] I mean, this kid, when I met him, he was a 27-year-old millionaire.
[561] Uh -huh.
[562] At what point do you actually rise up to the level of the Maslow's pyramid where you're actually trying to transcend that?
[563] Yes.
[564] And again, it's not about moral condemnation in the sense that I don't think this guy's a bad guy.
[565] I do think it's important though to flesh out our actual profound disagreements of principle, especially as we get more into the people who are actively like tearing apart the fabric of our democracy, which I ended up writing about more.
[566] Yeah.
[567] And again, I'm probably going to point out some things that'll sound like I'm a sympathizer, but and that's not the case.
[568] It's more just for like a sense of emotionally maybe what's going on with people, which I'm sure you are equally interested in.
[569] Totally.
[570] This is part of what we were talking about, the angel and demon of cancellation on your shoulder. There were times when I was like, I'm so interested in what drives these people and explaining it that I am worried that I'm going to get mistaken for excusing it.
[571] Oh, sure, sure, sure.
[572] Because I spent, I mean, in the book, I go deep into the life history of people that are doing bad things in the world, just objectively making this world a worse place.
[573] And I really, really want to tunnel into what was it with their relationship with their parents.
It's also, you have to understand it if you want to be able to create systems that will correct for it. Yeah. So, yeah, I started with this clickbait entrepreneur guy in Chicago. Then Trump comes down the escalator, and I immediately am like, okay, all the incentive structures, all the business incentives, the intentional incentives, the psychological tricks, all the things that I saw this sort of innocent, neutral kid doing, that's what this guy's going to do, and he's going to win. And I am sad to say I made a lot of bets that I won. Right. And I also was like, okay, this thing that I've been sort of trying to puzzle through, not knowing exactly what the story is, this just became the story of American politics. So then I was like, okay, now I need to broaden my scope. It's not just about how this kid is making money on Facebook. It's about how the same things that allow him to be a 27-year-old millionaire on Facebook are the things that are going to destroy everything we hold dear. I want to know how you go from, like, observing them online to then meeting them.
[575] Well, so, okay, so it was summer of 2016.
[576] I had been saying kind of internally, like making bets with my editors and people like, these forces are too strong, like Trump's going to win.
[577] And eventually it just got to the point where they're like, okay, tell that story in a way that is not just, again, not just you putting out your opinion and sort of writing an op -ed about how, but like, show me. Show me who's doing it.
[578] Yeah.
[579] And we were already hearing rumors even back then pre -election that, like, Maybe the Russians are doing it.
[580] And like, maybe the bots are doing it.
[581] And I was like, even if it's true that the Russians are doing it, there's no way that there is more of the Russians doing it.
[582] I'm doing air quotes, the Russians.
[583] Like, there's no way that these whatever 200 people who are sitting in a building being paid to do this are having more of an effect on the election than Americans who are doing this of their own free will.
[584] Yeah.
[585] There's just, like, yes, bots are a problem and yes, you know, troll farms are a problem.
[586] But most of this is real people.
[587] So I was like, I'm going to find the people who are doing this.
[588] Right.
[589] And I was trying to stay within gray territory.
[590] At first, I didn't go straight for just the outright Nazi stuff.
[591] I felt like that was a little too salacious.
[592] You want to be able to, again, not excuse, but find some way into understanding.
[593] It's very, very hard to find your way into understanding why just an outright anti -Semite, just out and out cross -burning person.
[594] I did end up getting there in the book because while I was reporting the book, Charlottesville happened.
[595] And I went, oh, shit, I have to go there.
[596] there.
[597] Right.
[598] But I started out with, I want to find someone who wants Hillary to lose for reasons that are not just pure blind hatred, but has reasons and is really effective at reverse engineering and hacking the attentional marketplace to make that a reality.
[599] I wanted to find someone who's just sitting at home in their living room doing it freelance.
[600] And in a way that would not have been possible without these algorithms.
[601] Like, you could do it as a super billionaire or Rupert Murdoch type person by like just buying up the press apparatus.
[602] But the fact that you could just do it from your laptop is new in human history.
[603] Well, let me ask you this, though, because we acknowledge the algorithms are virtually giving us exactly what we want to read already.
[604] We're already siloed.
[605] How would someone penetrate that algorithm from an opposing viewpoint?
[606] So here's how they do it.
[607] Well, so there's a lot of wiggle room within they're giving us what we want because there's different versions of us and what we want.
[608] For sure.
[609] There's the twitchy lizard brain version.
[610] There's the deeper, more considered version.
[611] So you can get people who don't want Doritos to eat Doritos by giving them Doritos.
[612] Yeah, yeah.
[613] It's not actually that deep and complex.
[614] You just have to, it's a trick you have to get good at.
[615] Yeah.
[616] You know, you put lines out to 20 people and you end up finding the one perfect person.
[617] Because a lot of what a New Yorker style thing is is finding the one perfect person who's going to encapsulate a larger set of concerns.
[618] That's like a fractal of that.
[619] Exactly.
[620] Yeah.
[621] They call it the donkey.
[622] Like the person who can carry the weight.
[623] of the story you want to tell.
[624] Oh, okay.
[625] Like Lawrence Wright, when he was writing about Scientology, his first piece about that.
[626] That's the best article I've ever read in my life.
[627] So if you remember that article, all these articles, this is like super dumb, like magazine nerd thing, but they all have a little thing above the title that's called a rubric.
[628] And it's called either profiles or a reporter at large or whatever.
[629] That one was technically a profile of Paul Haggis.
[630] Okay.
[631] Nobody read that and went, wow, I now know about Paul Haggis.
[632] Right, the career, the life and times of Paul Haggis.
[633] Right, but because Paul Haggis was the quote -unquote donkey in that story, he was the one carrying the load of the, right, because you have to go in through someone.
[634] So my Paul Haggis was this guy, I hesitate, maybe we shouldn't even like give him more attention by using his name.
[635] So the guy I used, I called him up and said, I am interested in how people are using social media to do unprecedented things in politics.
[636] Really quick.
[637] How do you even, because everyone is operating under an alias, a FoxCatcher29 or whatever, how do you even find out who the guy really is?
[638] Well, the thing is some people are not.
[639] Oh, they're...
[640] Okay.
[641] So he was living out loud.
[642] Yeah.
[643] People who have been at this for a long time, there's generally a time when they have either been doxed or they've outed themselves or something has happened where they go, this is going to be who I am.
[644] Now, this particular guy, he was a lawyer.
[645] He went to law school.
[646] He had a wife and a kid and a dog and they like to go hiking and actually...
[647] He was on the right path.
[648] Yeah.
[649] Well, and this is no, when I met him.
[650] Oh, oh, okay.
[651] This is, so every time you meet one of these people, and I'm going to tell you about other people, it's going to be the same thing every time: it is not the person you expect. Oh, wow. It is never the 15-year-old in a hoodie in a basement. This guy was about 40. This was his second marriage. Both of his wives were non-white. He was, you know, really into exercise and fitness. This was, you know, summer 2016, and I just wanted to understand what the fuck was happening on the internet. So I was like, okay, show me how you do what you do. And right away I was like, okay, this guy's not dumb.
[652] He's not living by himself, and, you know, like, he's a weirdo in many ways, but he's not the kind of weirdo you'd picture.
[653] He's not isolated.
[654] He's not Ted Kaczynski.
[655] He's obviously not a white nationalist in good standing because you get kicked out of being a white nationalist if your wife is of Iranian descent.
[656] And I actually later found out that his first wife had worked at Facebook and in the divorce, he got half of her Facebook stock, which is how he funds all of his.
[657] Oh, my God.
[658] Oh, wow.
[659] Some of this stuff is like, if it was a novel, they'd be like, come on.
[660] Too on the nose.
[661] I mean, wait till we get to the guy who's the top Nazi propagandist in the country who was married to a Jewish woman and who had a black brother and we'll get there eventually.
[662] Oh, I love that.
[663] Complicated.
[664] Oh, man. Mixed messages.
[665] So I get...
[666] We love mixed messages.
[667] We do.
[668] We do.
[669] And again, you asked me this before.
[670] Like, didn't they look at you and go, like, aren't you going to be on the other side?
[671] You're from New York, all this stuff.
[672] Sometimes I would literally meet someone at a party, like the DeploraBall. One of the big parties I went to was called the DeploraBall. It was their big inauguration party. And that's a cute pun. Isn't that nice? Isn't that fun, right, Monica? DeploraBall. This is how I spent about three and a half years, going, uh, in my head. But there were times where, because I wasn't going to lie and because I wasn't going to say the full truth, I just went, you know what, man, just take a look at me. Like, look at my glasses, look at my beard, just make assumptions, and they're probably close enough.
[674] Sure, sure.
[675] And they were.
[676] Thin slice me. Yeah.
[677] And like, it's not that far off.
[678] Now, can you just tell me what you want to tell me?
[679] And then usually it would work.
[680] Or sometimes they would be like, take my notebook and tear it up and be like, fuck you.
[681] It was like 50 -50.
[682] Yeah.
[683] So when I get to this guy's house in California, he's like, okay, I have a laptop.
[684] I have an iPad and I live stream on my iPad.
[685] And like, that's it.
[686] I have a phone because people text me stuff.
[687] And that's how I'm going to reverse engineer the news cycle and like damage Hillary's brand enough that she's going to lose.
[688] I mean, obviously in concert with other people but like I'm going to do my part to hijack the narrative.
[689] And I was like, come on man. Like how are you going to?
[690] He's like, okay, watch.
[691] So, I mean, I was with him for days and days and he does it multiple times a day.
[692] He was like, one of the things that people really respond to viscerally and emotionally is someone being sick or disgusting or like, especially a woman.
[693] If I can make people think that she's just diseased and like rotting from the inside, they won't want to vote for her.
[694] Never mind that he was like, I have all these things about her foundation and the stuff they're doing in Saudi Arabia and Qatar and I have it.
[695] Like, he had all those things on his mind, but he was like, that's not going to go viral.
[696] That's not going to hit people at these emotional flashpoints that are going to make them click and comment.
[697] And what's going to make them click and freak out is she's diseased.
[698] Yeah.
[699] Oh, my God.
[700] So he goes, okay, there's these little video clips of her where she's blinking in a weird way.
[701] And I'm going to say that that's her having a mini seizure.
[702] Uh -huh.
[703] Or she did an interview sitting down, so I'm going to say she can't stand up.
[704] Or there's like a little bulge in her pocket.
[705] I'm going to say that's her catheter.
[706] And so part of it is just throwing shit out there.
[707] But part of it is also condensing it so that it becomes, that it spikes into the national discourse.
[708] Because to your point, you don't just want it to stay in your silo.
[709] You want to pollute the entire national discourse with it.
[710] So he would go, okay, I'm going to do a live stream on Periscope.
[711] And I'm going to get, it doesn't have to be a million people.
[712] It could be 1 ,000 people or 500 people.
If they're passionate enough and hardcore enough, and we all pick the same hashtag at the same time, and we get enough of the right kind of images and words all in the right kind of combination, and it spikes quickly enough, that will become a trending hashtag on Twitter. So he would sit in this video conference and go, okay, today it's going to be Hillary's bags. Because Hillary's got bags: she's got bags under her eyes, she's got catheter bags, she's got bags. So it's like, let's all go do Hillary's bags right now. Go to Twitter. They all do it. None of this is against the rules, by the way.
[714] Twitter doesn't have a rule.
[715] Maybe they should.
[716] Right.
[717] But they don't have a rule against you and your buddies hijacking the news.
[718] Right.
[719] They all go to Twitter.
[720] They do the hashtag.
[721] If they do it right, it trends.
[722] Then once something is trending on Twitter, that is a signal to every journalist in the world, which is the core audience of Twitter, to go, okay...
[726] This is now a thing.
[727] You have now been objectively given permission to go after this as a thing.
[728] Yeah.
[729] Which.
[730] Well, now it's in the public consciousness.
[731] So it deserves reporting on.
[732] Exactly.
[733] Which is misleading in a lot of ways.
[734] right?
[735] Because most Americans are not on Twitter.
[736] Most people on Twitter are not talking about Hillary's catheter bags or whatever.
[737] It's just an engineered outrage.
[738] But because you've now seen it in the little box.
[739] And by the way, there's no...
[740] Twitter never said that the little box of trending hashtags are the things that objectively the most people are talking about.
[741] It is a proprietary algorithm that they've never shared how they come up with it.
[742] So it's not just a volume thing.
[743] It's not a volume thing.
[744] It's a bunch of factors.
[745] It's speed.
[746] It's concentration.
[747] Uh -huh.
[748] It's supposed to measure virality.
[749] Well, and so these are, you know, these are all proprietary companies that don't share how they do it.
[750] But it just so happens that if you just do your homework and learn how to reverse engineer it, you can have this effect.
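As a purely hypothetical illustration of that dynamic (Twitter's real trending algorithm is proprietary and not public), here is a sketch of a trending score based on peak posting speed rather than total volume, which is why a small group posting in a tight window can out-score a much larger, slower conversation; the scoring function and numbers are invented.

```python
# A hedged sketch, not any platform's actual code: if "trending" rewards
# velocity and concentration rather than raw volume, a coordinated burst
# beats a bigger but slower organic conversation.

def trending_score(timestamps: list[float], window: float = 600.0) -> float:
    """Score a hashtag by its peak posting rate (posts per minute)
    within any `window`-second interval, i.e. speed, not total volume."""
    ts = sorted(timestamps)
    best = 0
    start = 0
    for end in range(len(ts)):
        while ts[end] - ts[start] > window:
            start += 1
        best = max(best, end - start + 1)
    return best / (window / 60.0)

# 500 coordinated accounts all posting the same hashtag within ten minutes...
coordinated = [i * 1.2 for i in range(500)]            # 500 posts in 600 s
# ...versus 5,000 organic posts spread evenly over a whole day.
organic = [i * (86400 / 5000) for i in range(5000)]    # ~17 s apart

print(trending_score(coordinated))  # 50.0 posts/min at peak
print(trending_score(organic))      # ~3.5 posts/min at peak
```

Under this invented metric the coordinated burst scores far higher despite having a tenth of the total posts, which is the gist of what he's describing.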
[751] And then he would just sit me down and explain, okay, so watch, man, watch what's going to happen.
[752] It's going to trend.
[753] I bet you that if Chris Cillizza picks it up at CNN and then he, you know, retweets it, then Brian Stelter is going to notice it, and then he's going to maybe consider talking about it.
[754] I know it's going to end up on the Drudge Report, and Rush is probably going to talk about it.
[755] And then maybe Sean Hannity is going to talk about it.
Not because this guy's some kind of, like, prophet or genius, just because this is what he does all day. He pays attention. It's like being a stock trader. He's a great ad man. And then I would go back to my hotel and wake up the next day and scan the headlines and be like, there's some headline that's like, some people are talking about maybe Hillary might be sick. And I'm like, that's because of this guy. Right. Oh, my God. That's impressive. Like, all ethics aside, that is really impressive. Yes. And there was a part of me that was like, there's kind of, kind of like an Ocean's Eleven element of this.
[757] Yeah, of course.
[758] You're like, you don't want them to rob the thing, but you're kind of like, how are they going to pull it off?
[759] Well, as technology has increased, I have only felt increasingly less a part of everything, or that the system is so big.
[761] And to think someone else went the other direction with this and like, oh, no, I can have a huge impact.
[762] So he had the clear objective of making sure Hillary didn't get elected.
[763] Right.
[764] Yeah.
[765] But also becoming a big player in some kind of game.
[766] It's an adrenaline rush.
[767] It's addictive.
[768] You make money from it.
[769] Like, he became a brand.
[770] He became a media brand.
[771] What all the people in my book have in common, and it's like two dozen people who make up this sort of ensemble cast, from the guy in Chicago to the Nazis.
[772] What they're all actually good at is media creation.
[773] Media, propaganda, messaging.
[774] They're not, like, great political thinkers.
[775] They're great messengers.
[776] Yeah.
[777] Well, I got to tell you, the time I was impressed by it is Firefest.
[778] Oh, yeah.
[779] Did you watch any of those documentaries?
[780] I watched both of them.
[781] And the fact that that company, Fuck Jerry or whatever.
[782] They really had engineered this thing with the yellow screen, and by God, it fucking worked. And when they were describing it, I'm like, why is this gonna work? And they were just dead right.
[783] It worked like crazy.
[784] You know what else?
[785] Fuck Jerry engineered?
[786] What?
[787] One of those documentaries.
[788] Oh, right.
[789] They produced the one.
[790] Which is even like I watched the one, I think it was the Netflix one I can't remember which one.
[791] And I then looked at the credits and was like, what?
[792] Yeah, yeah.
[793] Yeah.
[794] Like that it just, the point is there's no, like, floor where you go, okay, this is where the human fingerprints aren't on it anymore. It's all the way down. I don't have this opinion, but let me just be a contrarian for one second. I can imagine somebody at home listening and going, okay, great, so yeah, the individual has this power that in the past was really held just by the powerful who owned the monopoly of newspapers. I could see where a lot of people would think, well, this has been democratized, and you're pissed that your average human being now has sway over what we're taking in. But why do you prefer it be one of these monopolists that's setting the story?
[795] Well, and to take it one step further, I'm pissed because I'm losing the power in that equation because I don't get to sit in the New Yorker and go, no, no, here's what you should think.
[796] Yeah, any story that's worth consuming should be fact-checked.
[797] Now, I am of that opinion, but I can definitely see where people go like, fuck that.
[798] No, I trust this guy more than your fact checkers.
[799] Totally.
[800] And just numerically speaking, that is true.
[801] Right, right, right.
[802] So I think you're totally right there, and I go to some length in the book, and maybe people will think I should have gone to more length to do this.
[803] To be clear, the good old days of gatekeeper media were not good old days.
[804] There were huge, massive blind spots. It's not possible to go back to that, and it's not desirable to go back to that.
[805] So it can't be, well, let's just put this genie back in the box.
[806] So it's not an argument -driven book in the sense that I'm not like every chapter ends with my checklist of things we should do.
[807] Right.
[808] It's a story.
[809] Yeah.
[810] And the way, and I think the point of getting drawn into the deep
[811] complexity of these stories is to look at all the sides of it and go, this is really complicated, there's no clean solution, we have to work our way out. I'm of the opinion, because, yeah, I don't think you legislate yourself out of this situation at all. Some legislation can help, but it's not gonna, I don't, yeah, unless you're dismantling the First Amendment, I just don't see a legislative answer to this. So I guess what I would be hoping, and the best side of myself would want, is for you to have figured out some underlying part of their human experience that led to this, to this sense. Maybe I'm wrong, but I once watched a great 60 Minutes segment on the sovereign citizens.
[812] And these people, by my definition, are insane.
[813] You know, they don't, they won't carry a license.
[814] They've killed all these law enforcement agents.
[815] And they have this very interesting view of what it means to be a sovereign citizen.
[816] But when 60 Minutes looked at it, almost without exception, almost every member of that group, within the last few years, had had a good-paying blue-collar job and they lost that. And so what was unavoidable was that that was a part of all of their stories, and you can't really ignore that. So it's like, when you're trying to talk about how to legislatively affect the sovereign citizens, I don't know that you're going to have a lot of success in that. But if we can address the scenario by which these things are born out of, I just feel like that's a lot better approach for us. You know, it's the ounce of prevention approach.
[817] Yes.
[818] There are a lot of sort of commonalities and patterns that I encountered.
[819] And I do try to be clear that it's not a one -size -fits -all prescriptive thing, that, you know, 12 % social isolation mixed with 5 % prednisone addiction equals 89 % chance of Nazi.
[820] Like, yeah.
[821] There's definitely economic factors for sure.
[822] Although a lot of people that I encountered were very, very bucolic middle class, had everything handed to them.
[823] Sometimes there's just personal, sometimes there's such an addiction to hyper -contrarianism that it's like one of the ways I think I described it was like, you guys know what the red pill is, the sort of obsession with the matrix.
[824] The matrix.
[825] Yeah, the, the, the red pill analogy from the matrix ends up being used all over the internet to be like, take the red pill and understand the truth.
[826] See the truth.
[827] See the truth that we're actually sovereign citizens that don't need to respond to law enforcement or that we're, you know, that climate change is a hoax or whatever.
[828] These are all red pills.
[829] Yeah.
[830] And I sort of said like there's a kind of personality where you're so addicted to red pills that you'll just swallow anything with a pinkish hue without being like, is this full of arsenic or is this, like, what is this?
[831] You'll just swallow it because you're like, the thrill of it is so addictive.
[832] Yes.
[833] So there's that.
[834] There's personal elements.
[835] There's, I mean, the guy who grew up, you know, poor in an Illinois junkyard, of course the system wasn't working for him.
[836] And then, you know, he's told that the problem is that you have too much white privilege.
[837] And he's like, I do not feel like I have too much white privilege.
[838] Yeah, yeah, yeah.
[839] And again, this is where you get into these tricky, I'm not excusing or saying that I think he's right about that.
[840] I actually feel like it's part of my job to bring my brain to bear and go, like, I think you're wrong about that.
[841] Yeah.
[842] But I do think it's incumbent on us to understand the argument and go, okay, well, this, you know.
[843] Well, again, I just want to point out, there is him being right or wrong about that.
[844] And then there is a human being on planet Earth that has these feelings and these motivations.
[845] Right.
[846] And then they are legally, they're able to act on that.
[847] Right.
[848] And so just the question is, do you want to have an impact on that, and, you know, what is the approach for that? Right. Yes. And some of it is broad and systemic, and some of it is personal. And sometimes there's an overreliance on everybody wanting to have a clean Rosebud moment and go, like, well, that's just the moment, and if we could just make sure that nobody ever has their sled taken away from them, yeah, they won't grow up to be assholes. But, like, so that's not scalable, right, as the tech guys say. But it's also important to look at how all the factors play against each other. So some of it is this feeling of, don't tell me what to say.
[849] I feel hemmed in.
[850] I can say whatever I want.
[851] Some of it is I often found people would have a kind of combination of a very high IQ and a very low EQ.
[852] I mean, I saw this with one of the sections of my book is actually I embedded at the White House briefing room because some of these just absolute troll performance artist type people were in this new timeline we're in getting White House press credentials.
[853] Oh, it's so spectacular.
[854] I was at the DeploraBall.
[855] I was sitting at the DeploraBall and I meet this sort of, like, you guys know Milo? Yeah, yeah, yeah, yeah. Milo's the fabulous British, I think Bill Maher had him on. Yes, he was on. Yeah, he was. Yeah, I've spent a lot of time with Milo too. Milo used to make fun of me, call me a soy boy for getting soy milk in my coffee. Oh, okay. And then he would drink hot coffee through a straw so he didn't stain his teeth. Sure, sure. You're making fun of me in this scenario. But Milo is a kind of species. There's, like, other people who do that thing, like, I'm fabulous, I'm unconcerned.
[856] Like, the trolling that they do is to just be so fabulous that no one can say anything to them.
[857] Yeah, all press is good press.
[858] It's all attention.
[859] You're talking about me. I'm not talking about you.
[860] Yeah, exactly.
[861] All the haters love me. I love my haters.
[862] You know, ugly people hate me. That's one of those shirts that Milo, you know, wears.
[863] It's just like, it's really dumb and despicable.
[864] Whatever.
[865] There's a million things to say.
[866] But by the way, if he doesn't outrage us, he doesn't have a position.
[867] So you also have to recognize everyone's, own role in it.
[868] But that is what they're trading on.
[869] It's the same way that it's frustrating that people don't understand the actual mission of terrorism, which is a low-barrier-of-entry act that will warrant an outsized reaction that will cripple a state's economy. If we understood that, we would probably not respond in the way that we do, because we would know, oh, we're doing exactly as they would want us to do.
[870] Which is very much exactly the intention of trolling.
[871] Yeah.
[872] The intention of trolling is to say one little thing and just like, I'm not mad, you're mad.
[873] And then suddenly you're the center of attention and you just get to back off. I mean, imagine if you had no experience in leadership or politics or anything except corruption, and you could just say, I think this black guy wasn't born in America and he shouldn't be president.
[874] And you got to control mainstream media for as long as you wanted, just by saying that and not having the basic human decency to not say that.
[875] Yeah.
[876] Yeah.
[877] And then maybe imagine if you could do that enough that you could literally become president because of it.
[878] Yeah.
[879] That would be a great trick you could play.
[880] Sure.
[881] Sure.
[882] And And it turns out we live in that timeline.
[883] And there's this concept of fitness that, you know, Darwinian fitness means just the fittest will survive.
[884] There's also a kind of fitness that means like it's like ethically or morally fit, like the news that's fit to print.
[885] And one of the things that I sort of started playing with is a meme like birtherism or a meme like, you know, feminists are worse than cancer or whatever.
[886] The worst thing you can think of is maximally fit in the Darwinian world of social media.
[887] It is maximally engineered to be a viral meme, and it is zero percent fit in the old gatekeeper media sense.
[888] There's no good gatekeeper media paper of record that would print an article saying, gee, I wonder if this president was born in America, because it's not true.
[889] Right.
[890] But it is perfect catnip for people to go, how dare you?
[891] How dare you, sir?
[892] And you can't not do that.
[893] There can't be no one in the world who tells Donald Trump he's wrong for saying that because then you...
[894] Yeah, exactly.
[895] Well, no, yeah, no. I disagree.
[896] I mean, I really think, look, again, we're getting dangerously political for the show, but I will say, we're in a bed we made.
[897] There's no getting around that.
[898] If no one ever would have responded to any of the outrageous shit he said, there wouldn't be one.
[899] Like, we are as responsible as anyone that voted for him.
[900] If we were perpetuating and endlessly talking about and giving attention to and getting him in the news cycle, we're playing along.
[901] What I say, I'm glad there are two of you here, because you're just acting out the two sides.
[902] I literally do think you're both right because what I say in the book is trolls set such an ingenious trap because to respond to them is to give them fuel.
[903] To not respond to them is to create the perception that you are okay with what they're saying.
[904] But that part I pushed back on.
[905] Why?
[906] Explain why that I don't engage with someone saying something absolutely ridiculous.
[907] Like I wouldn't engage with a flat earther.
[908] I'm not going to waste my time.
[909] If you think the earth is flat, good on you.
[910] I would never waste one second of my time saying it.
[911] But what if you're already an influential person like Kyrie Irving who's in front of TV cameras and then you're saying the earth is flat?
[912] What if you're being interviewed on TV and you say, yeah, I was really glad that we won this game.
[913] I really played good defense.
[914] Also, the earth is flat.
[915] And at some point somebody's got to go, wait, wait, wait, wait, don't tell people that.
[916] That's wrong.
[917] Well, I kind of disagree because on its surface, 99 % of people are going to hear that.
[918] and go, oh, my God, that guy thinks it's this.
[919] Now, I come out and I attack him.
[920] Okay?
[921] Now, I've amplified that message hugely.
[922] And now something emotional starts happening.
[923] So someone else who feels like an underdog because they didn't go to an Ivy League school or they weren't educated.
[924] Now there's an emotional bond they have with that person that someone with power, status, and stature is calling that person out and belittling them.
[925] Now they, I've created an ally for somebody who might not even be a flat earther, but the emotional component is so compelling and so relatable that now they're defending some guy, and then, by virtue of that, defending his position on flat earth. That's the risk it runs, 100%.
[926] 100%.
[927] There is no way out.
[928] I will take that one step further.
[929] When I go into the world of one of these people who I think is bad or wrong and I write an article in the New Yorker pointing out how they're bad or wrong, I run the risk of gaining them more exposure and more fans who are going, And this egghead elitist, whatever, just went to go prove that you're wrong.
[930] But like, fuck that guy.
[931] Yeah.
[932] And also, let me just add something.
[933] I'm not being critical.
[934] I cherish your role in our society and the fourth estate and all that.
[935] But you put your article in The New Yorker, guess what?
[936] It's only reaching people who already thought that.
[937] Right.
[938] It's already, you know, I'm going to read it.
[939] I'm like, yeah, I totally agree with that.
[940] Because I already fucking subscribe to the New Yorker.
[941] So, of course I do.
[942] Thank you.
[943] One of the moments that this really hit me, I mean, I was constantly thinking about this and constantly thinking about the tradeoffs.
[944] It's very, very complicated.
[945] There were a few moments where, you know, somebody would tell me, my whole worldview is based on fuck you, you smug elite motherfucker.
[946] You don't get to tell me what to think.
[947] And then I would go write that quote in The New Yorker and they would have to put a little accent over elite.
[948] Oh, uh -huh.
[949] And I'd be like, we're the problem.
[950] You're confirming exactly what he accused you of.
[951] I was like, but we just make an exception in this one case.
[952] Stay tuned for more armchair expert.
[953] if you dare. And the other thing is, the trolling scenario, it's people who have the power to enact massive cultural or political harm. Part of it depends on what kind of microphone you have to begin with.
[954] So there's a difference between some random anonymous guy on Twitter saying, I think the earth is flat or I think whatever, versus someone who is already a massive celebrity who is maybe actually having some, like, if someone goes on TV tomorrow and says, hey, nobody wash your hands in this coronavirus thing, just do a silly dance and it'll all work itself out.
[955] Then there's like material harm that might happen.
[956] Yeah, yeah.
[957] And so part of it depends on like, what are you preventing?
[958] But again, the basketball player, like when he says that no one responds, to me that files him into the, oh, he's the nut in front of 7 -Eleven.
[959] Now if Alec Baldwin engages with that guy, now he's not a nut in front of 7 -Eleven.
[960] He's like, Alec Baldwin's taking him serious.
[961] But he's not a nut in front of 7 -Eleven.
[962] He is.
[963] Anyone who says the earth is flat is a fucking nut.
[964] Okay.
[965] In your perspective, but he has a lot of fans and people who believe what he says, whether they should or shouldn't they do.
[966] I don't know if I buy that.
[967] I think they stand with him, but I don't know if it converts your opinion.
[968] I just don't know if we can, in the current year, make the argument that people with insane, just objectively wrong opinions can't hold real power.
[969] Yeah, we can't run that risk.
[970] Well, it's not even a risk.
[971] The president doesn't think climate change is happening.
[972] It's like, that's the most dangerous fact
[973] about the world we live in right now.
[974] So it can't be like, well, everybody knows climate change is real.
[975] Because of the very identity formation tangle we're in.
[976] Because people go, I don't want the eggheads telling me what to do with my car and my air conditioner or whatever, we're in the tangle right now.
[977] I agree with you that the steps we took to get into it were part of this vast, complex finger trap scenario that we all led into in our own ways and we're blinded to.
[978] And partly because of the technology stuff we were talking about, we blundered into this collectively.
[979] Yeah, yeah.
[980] Let me just ask both of you.
[981] Do you believe that they should say the names of the mass shooters?
[982] Well, this is something I think about all the time with, I don't even know whether to name the people in your story.
[983] In my story.
[984] I name them in the book because of, you know, journalistic, all kinds of reasons.
[985] There are some people I don't because they were sources who were trusting me with their anonymity.
[986] But I do think that naming them on a TV show is different.
[987] I think what kind of picture you choose to show of them and does it glorify them.
[988] I think these are all difficult questions.
[989] I think if we kept them entirely anonymous and there was no glory and there was no sense that they would finally have gotten their comeuppance, but you would have to, you would have to learn their name, you'd have to know about them.
[990] These are people who feel like they didn't exist and they want to hurt as many people as possible to be seen.
[991] I think seeing them proves that the calculus works.
[992] Well, I'll tell you, there's one woman who I did a whole section of the book about, who I've probably spent more time talking to than any of them.
[993] And actually she never wanted anyone to know her name.
[994] She was not in it for the fame.
[995] She wanted to be part of something.
[996] She wanted to have an identity.
[997] She wanted to feel like she was someone.
[998] And not even in a fame or power sense.
[999] Just like wanted to be part of a community.
[1000] And I think we all know people who don't, the way they see themselves is by seeing themselves reflected back from other people.
[1001] She was just one of those people like, who am I?
[1002] I'm the way people respond to me. And she was really good at service.
[1003] She was a good bartender for this reason, because she was very good at creating an intimacy with someone, like an immediate intimacy with a stranger.
[1004] And she grew up in New Jersey, a lot of friends of different races and whatever.
[1005] She also knew that her grandmother grew up in the 30s in Germany, but they didn't really talk about it very much.
[1006] And then she moved to a new place, didn't really know a lot of people, started dating a guy who spent a lot of time on the internet and would sort of make these jokes that she didn't really understand.
[1007] He would, like, make a distinction between Jews and white people.
[1008] And she'd be like, Jews seem like they're white to me. And he'd be like, all right, we'll talk about that later.
[1009] This is over, like, months and months and months.
[1010] And she just kept getting more and more of these clues.
[1011] And then one day he said something about the Day of the Rope.
[1012] And she was like, I don't know what that is.
[1013] It was just a joke, like, ha, ha, day of the rope.
[1014] And she Googled it on her phone.
[1015] And a Reddit thing came up explaining what it was.
[1016] And it said, The Day of the Rope is from the book that was found in Timothy McVeigh's car.
[1017] Oh, yeah.
[1018] And it's the fantasy of what's going to happen when the white nationalist rise up.
[1019] So then she drives to his house and says, what the fuck, dude?
[1020] And he says, okay, so I am a, I am a fascist.
[1021] And if we white people don't stick up for ourselves, we're going to get wiped out.
[1022] And they're going to come for us and they're going to genocide us.
[1023] And I don't believe, I'm not actually proposing that we do these violent things.
[1024] I'm just joking about that.
[1025] But I do think that, and what he actually said to her was, when I first heard these concepts, they were so shocking to me and viscerally almost, off -putting to me that I almost turned away, but then I looked more deeply and I realized that they're actually getting at a deeper truth, which is that we white people are endangered and the Jews are the ones engineering the whole thing.
[1026] And because they have this superior intelligence, they are kind of the puppet masters of this entire, you know, and then he basically started showing her all these little sort of clues of like, do you notice how this is something they never mentioned in the media?
[1027] They never talk about how the immigration laws were changed in 1965, but why did they do that?
[1028] Why don't they want you to see that?
[1029] Well, look who owns all these media companies and blah, blah, blah.
[1030] And he started taking her down these paths.
[1031] And she had the visceral reaction that you would have where she was like, no way, I'm out.
[1032] I'm never seeing this guy again.
[1033] Yeah.
[1034] Left the house, crying all the way home, got back to her place and just said, I just want to see one website that he was looking at just to see what it is.
[1035] Because I don't want to be closed -minded.
[1036] I want to just know what it is.
[1037] Sure.
[1038] Opens up the website.
[1039] Hmm, I've never read this fact about this thing.
[1040] I've never seen this perspective on this thing.
[1041] Goes from that to that, to that, to that.
[1042] Five days later, she calls him back and says, okay, I'm on your team.
[1043] Oh, wow.
[1044] Eventually, they called her the first lady of the alt-right.
[1045] And she actually, like, will be mad at me when she hears this because she's a big fan of this show.
[1046] Oh, Jesus.
[1047] Don't say that.
[1048] No, but she's...
[1049] I hate that.
[1050] No, no, no, no, no. Because I didn't tell you, I didn't tell you that.
[1051] And the reason that I'm still in touch with her is because she worked her way out.
[1052] Oh, okay, we're back.
[1053] we're better.
[1054] Okay.
[1055] She, no, no, no, no. You almost caused a huge fight between us.
[1056] No, no, no, not.
[1057] She started from the, she started from the normie world of listening to this show.
[1058] She was a big fan of Dan Savage.
[1059] She used to drive to work listening to Dan Savage.
[1060] And then she actually, because this is, I mean, the way that I got to tell her story was by just spending hundreds of hours, just being like, what were you thinking then?
[1061] What were you feeling then?
[1062] And one of the things, and these are just random, these aren't things that you would get
[1063] to unless you got really deep.
[1064] But she was like, you know, at one point, I started going, why am I allowed to listen to like people talking about fisting or whatever, just crazy Dan Savage sex stuff?
[1065] But I'm not allowed to listen to a show about how white people are okay.
[1066] Now, again, this is one of these things that is deeply, deeply wrong.
[1067] Yeah.
[1068] But from her specific perspective, where she had never been inoculated against certain ideas in the proper way, and she'd never read the kind of books that, whatever, like, wherever she was coming from, whatever she needed emotionally or personally, it hit her.
[1069] And because she was so in love with this guy and all this stuff, it hit her in a way that it made sense.
[1070] To the point where they got her to be like, you know what we're going to do?
[1071] We're going to go down to Charlottesville, Virginia with some torches.
[1072] But we're just there to just show that we don't hate anyone.
[1073] We just want to stick up for ourselves because we have a right to exist.
[1074] And she just was like, yeah, I guess that might be a good idea.
[1075] She actually went to one demonstration in Charlottesville before.
[1076] There was one before the big one.
[1077] The second one, she was at work, so she didn't go.
[1078] She saw the woman getting killed on TV, and then she was just like, okay, something in my life has gone horribly wrong.
[1079] It shattered her.
[1080] It shattered her.
[1081] I need to get out.
[1082] She called me. Oh, wow.
[1083] Because through a series of connections, I was kind of known in that world.
[1084] They didn't want to talk to me, but they knew who I was.
[1085] Right.
[1086] And she said, okay, I'm going to talk to someone who understands this world, but is not in it.
[1087] Mm -hmm.
[1088] She was so confused and turned around that she would go, I know the Holocaust happened.
[1089] But did it really happen, though?
[1090] Like, was it really 6 million?
[1091] What if it was just like 1,000 or like 10,000?
[1092] And I was like, no, no, no, it happened.
[1093] And she was like, I know, I know, I know.
[1094] But really quick, what if it was 10,000?
[1095] Right.
[1096] What if a leader decided that people should be killed for their religious beliefs?
[1097] Well, that too.
[1098] And they only kill 10,000.
[1099] That too.
[1100] I guess I don't see the numbers being as material.
[1101] They think it matters because the Jews have been using this.
[1102] You'll notice that I am, because I'm Jewish, I can ventriloquize that voice and feel like I am okay.
[1103] But like their theory is the Jews have used this victim narrative to get whatever they want from society.
[1104] So it has to be that big inflated number for them to get, you know, all this, blah, blah, blah.
[1105] Sympathy.
[1106] Yeah.
[1107] If you talk to this woman, she
[1108] is funny, she's charming, you could talk about bands with her all day long. Yeah. There just was a big enough hole inside of her. But this one is actually, she got out. So, like, this is a useful one. Not everybody in my book got out. And I still talk to her all the time. She's the only one of all the people that I spent all the time with in my book who I thank in my acknowledgments, because she's the only one who I feel like I can have a relationship with. Right. Because she got out. The other people that I talk to, they are trying to give me a propaganda line and I have to step through it.
[1109] And with her, it was a process, just like there were all these coupled things that needed to be present for it to sink in.
[1110] There were all these other factors that were needed for her to come out.
[1111] There was Charlottesville.
[1112] Her grandma died and she had this complicated set of feelings about that.
[1113] She actually rose up in the movement more quickly than he did.
[1114] Oh, you probably resented her, yeah.
[1115] Well, a lot of the, you might not be surprised to learn that a lot of the people who build their identities around racism and anti-Semitism and misogyny
[1116] rely on women to do a lot of the work in the movement for them.
[1117] Women have done all the work and all the things always.
[1118] So there were a lot of times when literally they were organizing their next Tiki Torch rally.
[1119] And she was like, I'm going to make the spreadsheet because no one else is doing it.
[1120] The guys are too busy fucking acting macho and alpha and doing nothing.
[1121] Yes, exactly.
[1122] And ranting on their Nazi podcasts.
[1123] Well, and actually one of the ways she found me is because I wrote a piece about the kind of main podcast of that movement, which is called The Daily Shoah, which is a very hilarious pun, because Shoah means Holocaust.
[1124] Oh, wow.
[1125] Yeah.
[1126] So what a funny joke.
[1127] Yeah, isn't that?
[1128] That's so they, this is what I'm saying.
[1129] But the thing is, and this is again, this is sort of dangerous territory, but like sometimes these guys are witty.
[1130] I mean, the Daily Shoa, they had one show called Nationalist Public Radio.
[1131] They had one called Fash the Nation.
[1132] Like, they're not idiots.
[1133] Right, right.
[1134] They had, like, one episode they put out in September called White After Labor Day.
[1135] I was like, that's pretty funny.
[1136] That is pretty funny.
[1137] But these are guys who, this was the show she was listening to as she was getting indoctrinated and it's not yelling.
[1138] It's not hate this person, hate that person.
[1139] It's being part of a group, having in-jokes, having all these acronyms and all these memes and all these, it's, they call it shitposting.
[1140] So what you do online, on 4chan or on Reddit or whatever, is you shitpost.
[1141] You just, whatever shit pops into your head, it's like a group chat.
[1142] That's what they would act out.
[1143] And they would be like, you guys, and as you know, podcasting is a very intimate form. You feel like you're there, you feel like you're in the room. So she just started to feel like, these are my dudes, like, I'm part of their joke. You know, the first 10 episodes I listened to, I didn't know what they were talking about half the time. Now I'm part of their community. It's like Opie and Anthony for Nazis. That's what they pattern it after. And so it's designed to inculcate you and indoctrinate you. That guy who made that show, he's the guy with the black brother and the Jewish wife. I found him because when Charlottesville happened, he had been anonymous.
[1144] He had been podcasting under a pseudonym.
[1145] He grew up in the most idyllic, liberal, his parents were professors, his dad taught Beowulf.
[1146] They adopted this biracial kid.
[1147] His sister was very accomplished in the sort of normie world.
[1148] And he was addicted to this kind of, the way I can get a certain type of reinforcement or feedback, even if it's negative, is to be able to surgically demolish anyone's argument, even if it's the right argument.
[1149] Uh -huh.
[1150] So he was the guy at Thanksgiving who'd be like, oh, well, you know, you think you're so cool for being vegan, but let me actually tell you, there's more, soy is worse for the environment. Like, he was that guy.
[1151] Sure.
[1152] About everything.
[1153] And that led him to be a communist for a while.
[1154] Okay.
[1155] Because, like, you guys think you're so cool being bourgeois liberals, but you actually don't stand up for your principles and you actually aren't redistributing wealth in the way that you claim to be.
[1156] He was really into anti-war Afghanistan stuff.
[1157] If you really care about the Afghan people, why are you supporting this Democratic Party? That's right.
[1158] So he's not wrong about that. Right, right. He just is too addicted to the thrill of demolishing people with those arguments. Yeah. Then the internet comes into his life. He's married to this Jewish woman, but they're kind of, like, recluses. They don't really go out that much. He spends all his time on the internet, on these message boards, demolishing and destroying people. Going into these message boards and going, here's why you're dumb if you believe in God, here's why you're dumb if you whatever. It doesn't even matter. It's like a video game. He gets so good at this that he builds his identity around what can get the most traction with people. They call them rage quits. How can I collect the most rage quits, where I piss people off so much that they have to leave the internet? And he gets addicted to that, kind of like collecting trophies like that. Or you're getting blocked, probably. Getting blocked, coming back with a new name. He and his buddies would all brigade into these rooms together and go, it's like liberal tears, we're going to create all these liberal tears. Byrata. So at that point he's a Trotskyist. But then, because he has been, you know, accepted by the Trotskyists, he has to find a way to piss them off.
[1159] Sure.
[1160] So then he says, what's the books that you guys are the most afraid of?
[1161] And those are like these libertarian tomes about how government shouldn't exist.
[1162] So then he goes and reads those books.
[1163] And then he becomes converted to libertarianism.
[1164] And then he has to go piss off the libertarians.
[1165] So then he goes, well...
[1166] Oh my God.
[1167] So then...
[1168] This is exhausting for him.
[1169] It's crazy.
[1170] So then you get to a point where you go, the problem with libertarianism is that you pretend that all people are equal, but they're not.
[1171] So if I really have freedom, I should be able to create a covenant of other people where we're all white and we don't let anyone else in.
[1172] That's the short version of how you get from libertarian to white nationalist.
[1173] Right.
[1174] And it's not random, again, I don't pick, I don't pick people who are random examples.
[1175] This happens to a lot of people.
[1176] The libertarian to alt -right pipeline is what they call it.
[1177] Ah.
[1178] Oh, wow.
[1179] So he goes to this place where he goes, I have taken the ultimate red pill.
[1180] The one thing in our society that you're not allowed to say is white men are the real oppressed group.
[1181] Uh -huh.
[1182] Jews are not actually white.
[1183] They're the ones who are secretly controlling us.
[1184] Why do you think we're always going into all these costly foreign wars?
[1185] Because the Jews, that's what they call it, they talk about Zionist-occupied government.
[1186] The Jews have decided that in Israel's interest, we're going to destabilize the Middle East so that Israel is the only democracy left standing.
[1187] There's like many, many layers of this, and it's all a conspiracy theory, so I don't want to give it too much credence.
[1188] But it's all mapped out, is the point.
[1189] It's not blind hatred.
[1190] It's very, very overthought hatred.
[1191] Right.
[1192] And it's systematic.
[1193] It's like we have to spend hours and hours and hours explaining it to you because it's so deep.
[1194] And when you are the kind of all the conditions are in the right place for you to get led down that path, I can tell from talking to enough of these people it is genuinely thrilling.
[1195] It is a genuine process of intellectual discovery that is wrong.
[1196] Yeah.
[1197] But that by their lights is exciting.
[1198] Like there's this moment in Huck Finn where Huck Finn realizes Jim is actually a person.
[1199] This slave
[1200] that he's on a raft with. And he has this epiphany that, like, I, no matter what my society has told me, I have this moral intuition that this guy is a person, and I'm going to build my life around defending that. They have the opposite experience, and either way you go, it's like a conversion moment. So, but this woman, she is now piecing her life back together and going, I've essentially made the worst mistake I can make, and in a weird way it's like, can she ever work her way out of that?
[1201] Because remember, she wasn't at Charlottesville where somebody got hurt.
[1202] She was at the other one.
[1203] She never physically hurt anyone.
[1204] I have people, my wife was a public defender for a long time.
[1205] She defended people who really did hurt people.
[1206] Oh, sure.
[1207] And every time we would talk about them or bring them up with friends, her clients, everyone would go, well, I hope he's okay.
[1208] I hope he works his way to rehabilitation.
[1209] I hope. They would have nothing but kind things to say about someone who was trying to pay their debt to society.
[1210] And then I would bring up one of these people who had these bad ideas and felt bad about it.
[1211] And they'd be like, that person is just fuck them forever.
[1212] Yeah.
[1213] Yeah.
[1214] It's a very weird.
[1215] Well, the problem is like, yeah, I'm not disputing the anger towards those people.
[1216] I'm not disputing the disgust.
[1217] I have it.
[1218] It's all about what's the solution.
[1219] And I don't think further alienating, further demonizing for all those things.
[1220] I just, I don't think that, unfortunately, is the solution.
[1221] Mm -hmm.
[1222] Yeah.
[1223] Like, compassion's the solution.
[1224] Preventative stuff's a solution.
[1225] With the understanding that you don't have to ever soften your...
[1226] No, by the way, justice with compassion is what I believe in.
[1227] Like, people who do shit need to pay the price for that.
[1228] But that doesn't mean you can't make an effort to understand how it's started or what went out of hand or, you know.
[1229] And in a way, the fact that I, I wasn't intending to go to the actual out-and-out white nationalist anti-Semites.
[1230] But when I did, in a way, I found it easier because I'm Jewish to go, okay, there is no part of me that is worried about getting sucked into their worldview.
[1231] So I can actually look at it with more clarity.
[1232] When I was talking to the pickup artist or the misogynist or whatever, I was like 99 % sure that I wasn't going to be persuaded by any of it.
[1233] But there's always a chance.
[1234] Like, maybe there might be some, but once you're talking to a guy who wants you to be incinerated, you're like, I don't think this is going to be persuasive to me. Like, yeah.
[1235] So, and there were these moments where they would be talking to me, that that guy, the Daily Shoah guy, we talked for hours and he was crying and we were getting into some really deep stuff.
[1236] he was telling me about the book that changed his life, the book that told him about the JQ, which is what they call it, the Jewish question.
[1237] This is the book that red-pilled him on the JQ.
[1238] And he told me, you really got to get it.
[1239] You're going to, it's going to blow your mind.
[1240] And then he goes, wait, you're not like a, you're not a Jew, are you?
[1241] Like, yeah, buddy.
[1242] Yeah, yeah, you bet.
[1243] Yeah, maybe you should have, like, done your homework, done your research, or I'm not hiding it.
[1244] I'm not, you know.
[1245] Yeah.
[1246] And he's like, well, I don't, I mean, you had red hair.
[1247] And, like, that kind of threw me off. And I'm like, yeah, I don't know what to tell you. Like, when I hear people using these terms, I literally think, oh, they must have honestly never met someone Jewish. They must not even know what they're, you know what I'm saying? It must be such a foreign thing to them that they could think this. This is, this guy, he grew up in New Jersey, totally, Jews everywhere. He went to high school with Zach Braff.
[1248] Oh my god.
[1249] I just talked to him and a half hour ago.
[1250] How weird.
[1251] Baader-Meinhof.
[1252] Did he in that moment when he's in front of you and you're saying, yeah, is he visibly embarrassed?
[1253] Well, this was over the phone.
[1254] This was over the phone.
[1255] He was audibly embarrassed.
[1256] He was definitely embarrassed.
[1257] He was embarrassed.
[1258] I had actually, we were supposed to go to a Shabbat dinner that I had canceled to spend my Friday night talking to a Nazi on the phone.
[1259] And we did end up meeting up, and he wanted to meet at a German beer hall, which we did, and I went to meet him at a German beer hall, which was his way of trying to troll me and throw me off my game.
[1260] And I was like, yeah, man, I'll meet you at a German beer hall.
[1261] That's fun.
[1262] Uh -huh.
[1263] And that's, by the way, he lived in the Upper East Side of Manhattan at the time.
[1264] Oh, really?
[1265] It is never what you think.
[1266] Yeah, it is very shocking.
[1267] You have a cartoon character, I do, of those people, and that's not the case.
[1268] It's almost never, I mean, Stephen Miller's from Santa Monica, people are never, you know, a lot of these guys, the people I've mentioned so far went to Bard, Pepperdine, it's just never, it's never what you think.
[1269] And part of it is, I mean, some of them, they're not all whatever, the identical kind of intellectual profile, but the people who become leaders in this movement, the people who are good at picking apart news narratives and inserting themselves and hijacking it, it's a skill that they have to acquire, that they have to deduce how to do.
[1270] So I do think all this stuff is an argument for actually immersing yourself narratively in people's lives.
[1271] I think the only way you can actually walk that fine line between excusing and engaging is to actually like go through the process of immersing yourself in that world and not try to get to the bottom line.
[1272] Because the bottom line is these people are bad.
[1273] The internet should probably be reorganized so it's not playing to our worst instincts.
[1274] We should try to build more complex and robust systems so that we don't drive ourselves off a cliff.
[1275] Like those are the answers that everybody I think would agree are the answers.
[1276] The challenge is how do you get there?
[1277] Yeah.
[1278] And I just don't think you can get there unless you delve into the complexity of it a little bit.
[1279] Because, again, this will be unpopular too.
[1280] But I am kind of a hardcore liberty person in that this place has got to be open for the very best ideas and the very worst ideas.
[1281] And you can't just lop off the part you don't like.
[1282] Like, you got to take, it's a whole recipe that I want those good ideas.
[1283] So unfortunately, I have to, you know, protect people's rights to have these terrible ideas.
[1284] Yes.
[1285] And I think the thing where I often, like, go to push that a little bit
[1286] is to say nobody reasonable wants to diminish free speech.
[1287] The thing that I don't like is when we seem to suggest that what follows from that is, because free speech is a value, therefore it can't be in tension with other values.
[1288] So I can hold the value of free speech to be absolutely sacrosanct, but I also have to be realistic about the fact that it has externalities.
[1289] It has other things that are concurrent with it, that can be problems.
[1290] So in the same way that I want to be able to have people have businesses and, you know, have an economy that's not centrally planned.
[1291] That doesn't mean that I don't look at that and go, oh, if you're polluting a river, you can't do that.
[1292] Or if you have carbon emissions, that that's not a problem.
[1293] So sometimes we blind ourselves because we go, no, no, no, free speech, that's the end of the conversation.
[1294] Free speech is the most and the least and the end.
[1295] That's all we can say.
[1296] Yeah.
[1297] And like what I have started sort of pushing toward is like, okay, free speech and creating a society where we're not just completely like eating ourselves from the inside because of our allegiance to speech and the absence of all other values.
[1298] Well, I was confronted with this on a personal level, which is the rights of journalists.
[1299] So that umbrella shelters and protects paparazzi.
[1300] And so when I had a kid and I started noticing, oh, these people are living in my front yard and they're like attacking kids on playgrounds, right?
[1301] I at least quickly recognized, like, I'm not even going to enter into an argument that they don't have the right to do that.
[1302] They do.
[1303] They have a protected right.
[1304] But you also have the protected right to shit on your dining room table.
[1305] There are no laws in America against shitting on your dining room table.
[1306] So you can have calls to action that don't involve legality.
[1307] So mine was like, yes, you're right.
[1308] They have a right to do that.
[1309] As they have a right to shit on their dining room table, but I'm imploring people to recognize how these photos are procured.
[1310] And maybe they don't want to participate in that.
[1311] And in the same way that if you have an economy that's built on getting clicks by sniping children's photos, that's a bad economy to build.
[1312] In the same way that if you have deconstructed all of the things we've ever been able to use to distribute information or knowledge or news or whatever, and you've decoupled all of that and rebuilt it in a way that is engineered to be a slot machine for people's quickest and worst emotions. Yeah, the reptilian midbrain.
[1313] And then you go, well, look, that's just what people want to click on, and who are we to say? It's free speech.
[1314] Yeah.
[1315] It's like, no, no, no, no. Yes, people should have the freedom to pull a slot machine, but you're also engineering it so that they're stuck there and they're wearing diapers and they're losing their homes.
[1316] Yeah.
[1317] You've built a bad economy.
[1318] So it doesn't mean that I want the government tanks to run in and seize your casino.
[1319] But like, we got to build something better.
[1320] Yeah.
[1321] That's largely how I feel about the internet.
[1322] It's not, yay, everything's great.
[1323] It's not, drown it all in a river.
[1324] It's how do we work our way out of the system we've built in the same way that we need to rebuild our cities.
[1325] We need to rebuild our food chains.
[1326] We need to rebuild our economy so it doesn't fuck up the climate.
[1327] Has it progressed?
[1328] Have we seen the nadir of this or is it continuing to get worse?
[1329] Or you tell me, where are we at in the history of it?
[1330] Yeah, it's kind of both.
[1331] It's kind of both.
[1332] Well, in the sense that, you know, the kind of like generals are always fighting the last war.
[1333] The loopholes from four years ago are largely being closed.
[1334] You can't buy an ad in an American election with rubles.
[1335] Right.
[1336] Which you could do in 2016.
[1337] Right.
[1338] So it would be crazy if they didn't close that loophole.
[1339] And they did.
[1340] Yeah.
[1341] But the bigger vulnerability of.
[1342] Rubles.
[1343] Literally.
[1344] I know.
[1345] It sounds so silly, though.
[1346] But like, you know, to your point of sympathy, it's like you didn't know that that was going to be a loophole until you saw someone exploiting it.
[1347] Yeah, yeah.
[1348] How quickly do you act when that's brought to your attention, and those are all questions.
[1349] But okay, so you close that loophole, but are you still premising all of human information sharing and dissemination around what is going to immediately viscerally excite people?
[1350] Yes, we are still doing that.
[1351] Yeah.
[1352] So the larger vulnerability hasn't been closed.
[1353] You know what, if I could use an analogy, I think we're in the, we're in the 70s right now with the
[1354] advent of all this fun food that turned out to not be food.
[1355] Yeah.
[1356] And right now we're just like, fucking, hey, this shit's tasty.
[1357] Pop -tarts.
[1358] I can have dinner in five minutes.
[1359] Pop this in here, yeah, microwave this, right?
[1360] I think we were in the exciting, like, immediate response reward of it all.
[1361] And I hope to think that we will be correcting like we did with our food in some capacity.
[1362] And, you know, there's also an element of, like, how capitalism works, where you have to reckon with the idea that once you are Kraft and you have shareholders that you feel some fiduciary responsibility to.
[1363] Mm -hmm.
[1364] you're going to probably try to bury the food science that is trying to tank your stock price.
[1365] Yes.
[1366] That's just kind of the system we've built in terms of corporate capitalism.
[1367] Yes.
[1368] And so now everything seems to be justified by, well, I just have a responsibility to the shareholders.
[1369] It's like, yeah, but don't you also have a responsibility to, like, make sure you don't touch off any more genocides or, like, those are also responsibilities.
[1370] And it sounds very sort of squishy and, you know, whatever.
[1371] But it's like, yes, you do have responsibilities other than one responsibility.
[1372] Yes.
[1373] And I think this is something that people who work in the fourth estate are sort of comfortable with because the businesses are businesses, but they're also like mission driven businesses.
[1374] It doesn't mean they're perfect and they make mistakes all the time and whatever.
[1375] But you couldn't in my world make the argument of like, well, this is just good for the business.
[1376] So I'm just going to make shit up or I'm just going to.
[1377] Right.
[1378] It doesn't work.
[1379] There's a dual kind of bottom line.
[1380] Yes.
[1381] And that's what you're striving for.
[1382] You don't always achieve it.
[1383] But the reason that I am motivated to do what I do and work hard at it is that I feel like people are trying hard to achieve something other than, I'm going to make a buck for someone somewhere else. That is a hard thing to sustain, and in fact it won't be sustainable for very long, and journalism is really, really endangered right now for that reason. But you have to figure out how to do that if you want to look yourself in the mirror. Well, listen, I am super impressed that you've dedicated the amount of time that you have to attempting to understand this stuff, because, yeah, again, I think lobbing insults from afar isn't necessarily going to yield any forward movement.
[1384] Your book, antisocial, antisocial or antisocial?
[1385] What would you guys say?
[1386] What do you prefer?
[1387] We'll find out in the fact check.
[1388] Okay.
[1389] Antisocial: Online Extremists, Techno-Utopians, and the Hijacking of the American Conversation.
[1390] I implore everyone to check this book out.
[1391] I can imagine that it would only help us as we navigate this thing we spend increasingly more and more time on.
[1392] So I'm really glad there's people like you.
[1393] I'm grateful that there's publications like The New Yorker, you know, you're supported in that.
[1394] And these things are so valuable and worth protecting.
[1395] And I think you're on the right side of everything.
[1396] So thank you so much, Andrew, for coming in and talking to us.
[1397] Thanks.
[1398] Yeah, I appreciate it.
[1399] And now my favorite part of the show, the fact check with my soulmate, Monica Padman.
[1400] Go ahead, baby boy.
[1401] Baby boy.
[1402] Did you like that part when she sings baby boy?
[1403] La da da da da da da da da da da da da da baby boy.
[1404] I did like it.
[1405] It's one very, very, very tiny part out of that song.
[1406] And I've made the whole song about it.
[1407] I even thought the name of the song was baby boy.
[1408] Yeah.
[1409] We're speaking about Alicia Keys still.
[1410] We'll probably still be talking about her in months ahead.
[1411] Dax was trying to look for her song, Baby Boy, and did a bunch of Googling, couldn't find anything.
[1412] But turns out it is a small lyric in Diary.
[1413] That's right.
[1414] That's right.
[1415] That's exactly right.
[1416] And you love it.
[1417] You just want to keep singing it over and over.
[1418] I was wondering why it wasn't having the same impact on you as it was me. And I was thinking maybe, maybe all of us guys desire to have a girl call them baby boy.
[1419] Do they?
[1420] I think maybe.
[1421] Wait, really?
[1422] I think that might be part of it, like, especially the way she says it, like, baby boy. Like, we want to be called that. Wait, wait, this is worth delving into. Okay, sorry, Andrew. Sorry, Andrew, we'll get there. We'll come, we'll come back to you. Come back. But this is important, because you want to be taken care of. Men want to be, like, a mother's love, maybe. Yeah, sounds like a mother's love. They want to be nurtured, even though they also want to be, get away from us, we want to shoot guns and take over something. But they're just baby boys. That's right. Like, is there, like, a pet name or something that you would like? If you were watching a Matt Damon movie and he was being intimate with some co-star and he called her, like, a cute name, could you imagine yourself going, like, oh, I want him to call me that? Yeah, of course. What would it be? Snickerdoodle? Sex chinchilla? No, we're done if he calls me a sex chinchilla. La da da da da da da da da da da, sex chinchilla. I'll keep your secret.
[1423] Ew.
[1424] Don't bastardize her song.
[1425] I don't even think that's...
[1426] Those words are in the song, but I don't think they are.
[1427] None of these words are important to that song.
[1428] I think it's just that you like to be nurtured.
[1429] Yeah, that's probably right.
[1430] I have a very nurturing mom.
[1431] Yeah.
[1432] And wife.
[1433] And wife.
[1434] And I would like Matt to call me...
[1435] Baby boy?
[1436] Yeah.
[1437] I wanted to call me baby boy.
[1438] But, okay, ironically...
[1439] Baby girl.
[1440] Baby girl sounds a little dismissive.
[1441] Baby girl sounds a little pervy.
[1442] Sounds pervy and like a power play.
[1443] Like I'm above you.
[1444] A power move.
[1445] Yeah, exactly.
[1446] So I wouldn't like that.
[1447] Baby woman.
[1448] Ew.
[1449] Ironically, I have the same thing.
[1450] Like, I would want it to be a nurturing, not a baby name.
[1451] Okay.
[1452] But like sweetheart or something like that.
[1453] Baby bug.
[1454] Baby bug.
[1455] That was kind of cute.
[1456] Yeah, it's kind of nice, right?
[1457] Baby bug.
[1458] I don't want to be called, well, I was about to say I don't want to be called baby, but I do think I do like that.
[1459] Of course.
[1460] I think I would like that.
[1461] Yeah.
[1462] Yeah, baby bug.
[1463] Baby bug.
[1464] Because people like being nurtured.
[1465] They do.
[1466] We all like being nurtured.
[1467] And we like feeling safe.
[1468] But then we're also selfish little pieces of shit.
[1469] So as soon as we've been nurtured enough, we're like, get away from me, Mom.
[1470] Yeah.
[1471] Leave me alone, Mom.
[1472] Yeah.
[1473] We're all pieces of shit, aren't we?
[1474] We want like max nurturing on our time.
[1475] when we're in the mood and then the second we're not, we're like, get out of my space.
[1476] Yeah.
[1477] I'm that way.
[1478] Yeah.
[1479] And then I'm mad at the other person because they're not navigating perfectly, like when I'm needy and when I need autonomy.
[1480] It's up to you to tell people what you need.
[1481] Yeah.
[1482] And there's no rhyme or reason to it either.
[1483] I'm like hot and cold.
[1484] Yeah.
[1485] Well, you all are.
[1486] Yeah.
[1487] But it is confusing for everyone if you're changing.
[1488] Not you.
[1489] I mean, in general, when people change the rules up on other people, It's not fair to the other person unless you're communicating.
[1490] Another reason we just need emotion hats.
[1491] Like you got five, six hats.
[1492] Yeah.
[1493] You pop it on.
[1494] It's like, I'm feeling needy.
[1495] I'm going to pop this on.
[1496] Clear signal, visual.
[1497] Uh -huh.
[1498] Even if it's in a loud environment.
[1499] Someone can see the hat from across the room.
[1500] I'm going to go nurture my baby bug.
[1501] Oh.
[1502] And then another hat's like red.
[1503] And it's like, says, rock out with your cock out on it or something really aggressive.
[1504] And that's like, I'm autonomous right now.
[1505] I want to feel like a big grown man. But, you know, it's a bit of a cop -out because you could just use your words.
[1506] Oh, yeah, yeah, yeah, yeah.
[1507] Part of it is I think sometimes we don't know.
[1508] We personally don't know what we need at the exact moment or we can't, you know, sort out our emotions.
[1509] Maybe we think we want autonomy, but really we are feeling needy.
[1510] Yeah, yeah.
[1511] Or it's like we want to feel safe.
[1512] And then the second we feel safe, then we feel free to go play and gallivant and not need it.
[1513] Yeah.
[1514] Baby Bird.
[1515] What if someone called you baby bird?
[1516] Too weak and defenseless.
[1517] So I like baby on its own, but I don't like baby plus another word, linking it to an animal or...
[1518] Baby kangaroo?
[1519] I think Matt could probably say any of these things to me, and I'd be happy.
[1520] I'd feel good.
[1521] Baby squirrel.
[1522] Squirrel.
[1523] Squirrel's kind of cute.
[1524] Yeah, it is.
[1525] What did your mom call you?
[1526] Daxer.
[1527] Yeah, she still calls you that.
[1528] Yeah.
[1529] And then she had a song, and she said, my daxer, my daxer.
[1530] My big strong daxer.
[1531] So she was smart enough.
[1532] She knew to be calling me a little baby, but then also saying my great big daxer.
[1533] Oh, wow.
[1534] She knew how to play me like a fiddle.
[1535] What if when you were just singing this song right now, Baby Boy was in the...
[1536] Oh, and then we discovered it.
[1537] Yeah, she'd go, my daxer, my daxer, my great big daxer.
[1538] Yeah, she was really smart.
[1539] She made me feel great big.
[1540] Uh-huh.
[1541] But I was...
[1542] Still a baby.
[1543] You were her daxer.
[1544] Baby, I've been melting into her arms.
[1545] And did she have a song for your brother and your sister?
[1546] I know that she calls my sister a baby rabbit.
[1547] She does.
[1548] Yeah, which is so sweet.
[1549] And my mom, her totem animal is a rabbit.
[1550] Mm-hmm.
[1551] Yeah.
[1552] In fact, she just sent me a picture this morning.
[1553] Her neighbors, because she moved into a new neighborhood, put a welcome gift on her porch.
[1554] And it was a baby rabbit figurine.
[1555] Oh, that's cute.
[1556] Yeah, she felt like it was a sign from the universe.
[1557] Oh, she felt like it was a sign from Bart. Well, no, that would have been a silver dollar.
[1558] She thinks she finds silver dollars everywhere and that he's leaving them because he was really into silver dollars.
[1559] And she's got some pretty outrageous claims and she's got photographic evidence.
[1560] I don't know.
[1561] You know, it's funny because my mom and I have different...
[1562] That's where we diverge.
[1563] She believes in some supernatural stuff that I don't.
[1564] But I trust her so much.
[1565] Yeah.
[1566] And so she's had a couple different ghost experiences with her father.
[1567] And I take them at face value.
[1568] Like I think she had those experiences.
[1569] I don't believe in that.
[1570] But I also believe she had those experiences.
[1571] So apparently she's finding silver dollars every couple minutes since Barton died.
[1572] That's so sweet.
[1573] And it really isn't even worth trying to figure out whether...
[1574] Yeah, what would be the win of me proving to her that's not happening?
[1575] Yeah, exactly.
[1576] But I'm tempted.
[1577] Don't do it.
[1578] I'm really tempted.
[1579] Well, because she'll go like.
[1580] She says like, well, this happened on the cruise that she took my niece on.
[1581] They looked in the change purse because it was happening.
[1582] They both agreed on how many silver dollars were in this change purse.
[1583] And there was like three or four.
[1584] And then they got back to the room after a day at shore.
[1585] And there was one on the ground.
[1586] And then they looked in the change purse.
[1587] There was like five or six.
[1588] And then they took a picture of it.
[1589] Wow.
[1590] So I go straight to, ooh, we have a fun mystery.
[1591] What I know is that Barton didn't come out of the clouds and put the money in there.
[1592] No. So I either have to accept that she's lying, full stop.
[1593] Okay.
[1594] Or I've got to explain this.
[1595] Where do these silver dollars come from?
[1596] And then I get really excited about trying to figure out a plausible explanation for this other than Barton descending from heaven.
[1597] But I got to be careful that I'm not doing that out loud.
[1598] Agreed.
[1599] You should let her have that and believe what she wants to believe.
[1600] Do you remember in Men in Black, whenever they would show up, they'd go, there was a leak of a methane gas cloud from a local swamp and it was ignited by a flash bolt of light.
[1601] Like they would make up this crazy thing that happened to explain the alien they just saw.
[1602] Oh.
[1603] I like that.
[1604] That's funny.
[1605] Great movie.
[1606] I can watch that again.
[1607] I haven't seen it in so many years that I don't remember anything about it.
[1608] The first one is a masterpiece.
[1609] We could do a Will Smith Marathon.
[1610] I'd fucking love it.
[1611] Did your dad call you any pet names?
[1612] He called me Daxer, too.
[1613] Oh, he did?
[1614] And Grunt.
[1615] Uh-huh.
[1616] You know about Grunt because I was deaf.
[1617] Didn't bother me, though.
[1618] No. I didn't feel like they were making fun of me. That seems sweet.
[1619] Yeah.
[1620] Also, you know, as a superhero most of the time, I was Super Dax, and I would show up and I would rescue her from all kinds of situations.
[1621] Oh, boy.
[1622] And she would play a role.
[1623] It's just so there.
[1624] It's all so transparent.
[1625] Right from the jump.
[1626] Yeah.
[1627] No evolution.
[1628] You're saving everyone from age five.
[1629] Uh-huh.
[1630] Yeah, I would rescue her all the time and I had a cape and everything.
[1631] You know, the whole nine.
[1632] Yeah.
[1633] Okay, Andrew.
[1634] Andrew Morantz.
[1635] Did Joan Didion say writers are always selling someone out?
[1636] Yes.
[1637] She did say that in the preface of her book Slouching Towards Bethlehem.
[1638] What if you go?
[1639] She did say that.
[1640] And she regretted it.
[1641] He said Reddit gets more traffic than Twitter or Amazon or Netflix.
[1642] Netflix.
[1643] So he then emailed me after.
[1644] Can I just say I don't believe that?
[1645] But go ahead.
[1646] Okay.
[1647] He emailed me after.
[1648] He said, I think I said that Reddit gets more traffic than Amazon or Twitter.
[1649] That was true when I wrote the book, but might not be true anymore, given how fast these things change.
[1650] And that's U.S. traffic, not world traffic.
[1651] Okay.
[1652] Okay.
[1653] So I have not distinguished world versus U.S. because I read this after I did my research.
[1654] It's consistent, too.
[1655] Like when we argue about movies, it's always about world versus domestic.
[1656] This is true.
[1657] Yeah.
[1658] It's always the big question.
[1659] But according to my research, Twitter has 330 million active monthly users.
[1660] And that sounds right from the last one I did with Adam Mosseri.
[1661] And Reddit has 430 million active monthly users.
[1662] So that is more than Twitter.
[1663] Yep.
[1664] I believe the Twitter claim.
[1665] Yeah, Amazon.
[1666] I don't believe the Amazon.
[1667] Okay, yes.
[1668] So on Wikipedia's list of most popular websites, Amazon is above Reddit.
[1669] Okay.
[1670] Yeah.
[1671] I was also shocked by that.
[1672] But he knew.
[1673] I'm on Amazon every six seconds.
[1674] Oh, my God, I know.
[1675] I mean, honestly.
[1676] And Netflix, but I guess Netflix, I didn't see Netflix on the list above Reddit, so.
[1677] And Netflix would have to be done in a different way because certainly they don't have over 300 million subscribers.
[1678] That'd be all of America.
[1679] Right.
[1680] But if you added up in the attention economy, how many hours people spend on Netflix versus Reddit, that to me would be a toss up.
[1681] Yeah.
[1682] Yeah, that's true.
[1683] Okay.
[1684] Okay, does John Oliver say business daddy?
[1685] Yes.
[1686] He takes a jab at HBO parent AT&T over terrible service.
[1687] I got you, business daddy.
[1688] Yeah, it's really funny.
[1689] He said that the podcast The Daily Shoah was like Opie and Anthony for Nazis.
[1690] But I didn't know what Opie and Anthony was, did you?
[1691] Yeah, Opie and Anthony was a very, very popular radio show for decades.
[1692] Yeah, I didn't know that.
[1693] I mostly know about these people listening to Howard.
[1694] Because Howard's, of course, aware of all of his peers slash competition.
[1695] And then so he'll talk about it.
[1696] So all of my opinions of all these people are pretty skewed by him.
[1697] But I have done the Opie and Anthony show.
[1698] Oh, you've been on it?
[1699] Yes, in New York.
[1700] Oh, cool.
[1701] Yeah, I'd never heard of it.
[1702] Okay, is it antisocial or anti-social?
[1703] According to the general pronunciation online, when you type in how to pronounce and that thing comes up, it's antisocial.
[1704] Anti-social.
[1705] Oh, like my aunt-social.
[1706] But I would have said anti.
[1707] He's super antisocial, I would say.
[1708] But I would say that's pretty antisocial if I was describing someone's behavior.
[1709] Oh.
[1710] You asked me the other day if I was feeling ant-ant-ant-social?
[1711] I would have asked you if you were feeling antisocial.
[1712] Yeah, that's what you asked.
[1713] But then I would have said you were acting a bit antisocial.
[1714] Oh, my God.
[1715] Like Caribbean, Caribbean.
[1716] Caribbean.
[1717] Yeah.
[1718] Caribbean queen.
[1719] That's all.
[1720] That's all?
[1721] Yeah.
[1722] All right.
[1723] Well, I love you.
[1724] I love you.
[1725] Baby boy.
[1726] Baby bird.
[1727] Follow armchair expert on the Wondry app, Amazon music, or wherever you get your podcasts.
[1728] You can listen to every episode of Armchair expert early and ad free right now by joining Wondry Plus in the Wondry app or on Apple Podcasts.
[1729] Before you go, tell us about yourself by completing a short survey at Wondry.com slash survey.