The Joe Rogan Experience XX
[0] So, Jamie pointed out, this, this, a congressman, is that who it is?
[1] Yeah.
[2] Jamie pointed this out that there's a congressman and he released a series of tweets and the first letter of all these tweets, if you put them all together, it says Epstein didn't kill himself, or did not kill himself?
[3] Is that what it is?
[4] Yeah, I think it's didn't.
[5] He did, uh, I'll pull it up.
[6] Yeah, how do you do the apostrophe?
[7] Yeah, so he should have gone with did not.
[8] Starting here with that evidence of a link.
[9] Rep. Paul Gosar.
[10] What are the odds that this guy did this accidentally? Really small, right? That's kind of like one of those monkeys-typing-Shakespeare things. Yeah, I don't think it could, it could work. And the thing is, he did it backwards, right? So you didn't see what the puzzle was until the last tweet, because the last tweet is an E. I got a tweet from someone about 35 minutes ago. I don't know if there's a bunch of people online paying attention to it or what, but someone alerted me and a few other people to it. What is, does he have an image of that fucking, that crazy mask?
[11] Is that in his shit too?
[12] Okay.
[13] He's a weird about.
[14] He's got the, not until I was November 1st.
[15] The V mask.
[16] Yes.
[17] What is that mask again?
[18] V for Vendetta.
[19] What was it representative of something?
[20] It's the Guy Fawkes mask.
[21] Yes, that's right.
[22] Yeah.
[23] So this guy is, uh, he's, he's thinking along alternative lines of thought, but that is really an interesting way of saying it.
[24] An acrostic, that's called.
[25] Yeah, just making a bunch of tweets.
[26] Don't ever address it.
[27] Just leave it there, walk away.
[28] Yeah.
[29] Lewis Carroll was famous for that.
[30] Was he?
[31] That was one of, he did a lot of sort of tricks with words.
[32] Did you read the book?
[33] Gödel, Escher, Bach?
[34] No. Yeah, there's a whole bunch of stuff in there about people who used, who put puzzles in text.
[35] You know, that's kind of a thing that people did, I guess, back more in the 18th century and before.
[36] Well, this Epstein case is probably the most blatant example of a public murder, of a crucial witness I've ever seen in my entire life, or anybody's ever seen, and the minimal amount of outrage about this, the minimal amount of coverage.
[37] It's fucking fascinating.
[38] I mean, what's amazing to me, just as somebody who works in the media, is that this was shaping up to be the biggest, like, news story in history.
[39] Yes.
[40] And the instant he, you know, he died or was died or however you want to call it, the story just fell off the face of the earth.
[41] It's like nobody's doing anything about it.
[42] And I don't 100% understand that.
[43] I mean, I get it why that's happening.
[44] But it's just amazing.
[45] Well, when the woman from ABC, what was her name?
[46] Amy Robach.
[47] That lady.
[48] The one who...
[49] Robach.
[50] Robach, yeah.
[51] Who had the frustrated moment, what she called a frustrating private moment.
[52] Right.
[53] When she was talking about having the scoop and having that story and them squashing it.
[54] Right.
[55] Like, this is all stuff that everybody used to think was conspiracy.
[57] Everybody thought this was stoner talk.
[58] This was you know, you know what I mean?
[59] Like this is stuff where people are just delusional.
[60] They believe all kinds of wacky conspiracies.
[61] Sure.
[62] But the reality is much less complicated.
[63] Well, this is not possible.
[64] This is one of those things that's so obvious.
[65] It's so in everyone's face.
[66] Well, there's a couple of things going on, because there are many different ways that this can play out.
[67] I mean, I mean, you could have a news director who just sort of instinctively decides, well, we can't do that story because I might want to have Will and Kate on later, or I might want to have this politician on later.
[68] And it's, it's not like anybody tells them necessarily that we can't do this, but they just decide it's too hot.
[69] If you grow up in this system and you've been in the business for a long time, you just, you have all these things that are drilled into you at almost like the cellular level about what you can and cannot get into.
[70] And I think, but there were some explicit things that happened with Epstein too. I mean, there were a lot of news agencies that killed stories about him, you know, and we're hearing about some of them in this Vanity Fair thing. So, yeah, it's bad. It's terrible. Yeah, when I found out that Clinton flew no less than 26 times on a plane with Epstein, I was like, dude, I haven't flown that many times with my mom. And how long did he know Epstein? Yeah, I don't know.
[71] But, I mean, to have that many flights, to have the Secret Service people involved, I mean, that's incredibly bold.
[72] What was he doing?
[73] Was just girls?
[74] Is Clinton that much of a hound that he would go that deep into the well that many times, 26 times?
[75] Well, that's the thing about the Epstein story that makes no sense to me. Like, I thought that the percentage of people who were out-and-out, like, perverts who had a serious problem, like with pedophilia or whatever, was pretty small, you know?
[76] Yeah.
[77] But they had a lot of people coming in and out of this compound.
[78] And it just seems like it's a very strange story.
[79] What were they really up to?
[80] I have no idea.
[81] And was it all a blackmail scheme?
[82] It's just so strange.
[83] Well, it seems like the pedophilia aspect of it might be directly connected to Epstein himself.
[84] Like he might be the one that has a problem with girls that are like 16 and he likes them very young or he did like them.
[85] But with the other guys, it could just be girls. Could be. Yeah, I mean, that's why it's so crazy. Like, how could it be that these, but maybe it's not. But they knew who he was. Yeah, but they probably didn't know the extent of it. Probably not. Yeah, up until a point, up until he was arrested. Right. And then they're like, oh, well, then that's when everybody backed off of him. Right. Yes. Yeah. I mean, I'm not a hundred percent, yeah, I haven't covered this story in depth. I've only really got into it a little bit.
[86] We need you.
[87] We need you in this one.
[88] You're the guy.
[89] This is a tough one.
[90] I mean, you know, because it mixes a lot of things that are very tough to cover.
[91] Yes.
[92] You know, the intelligence world is very tough to cover.
[93] You know, it's hard to get stories out of there that they don't want you to have.
[94] Yeah.
[95] And this is, this is like the mother of all stories, you know, in terms of that.
[96] And there are just little, little breadcrumbs here and there, that whole thing about Acosta, you know, the Vanity Fair quote from him, that when he looked at the case, he didn't pursue it because, quote, I was told he belonged to intelligence.
[97] Yes.
[98] What does that mean?
[99] You know, who's intelligence?
[100] You know what I mean?
[101] Like what agency?
[102] What for?
[103] You know, and then you pair that with things like, you know, I have friends on Wall Street.
[104] You tell me, I've never heard a single instance of this guy actually having a trade.
[105] Right.
[106] You know, so what was his hedge fund doing?
[107] You know, I mean, if you think about it, a hedge fund's a perfect way to do blackmail, you know, because you can just have people putting money in and out all the time, and it would look like investment, you know. So, very strange story.
[109] Well, Eric Weinstein had a conversation with him.
[110] You know, Eric Weinstein was with Peter Thiel Capital.
[111] Right.
[112] Yeah.
[113] He's like, this guy doesn't know what the fuck he's talking about.
[114] Oh, yeah.
[115] He's like he's a fake.
[116] Yeah.
[117] He's like he's an actor.
[118] Right.
[119] This is nonsense.
[120] Right, right.
[121] That was his initial, almost instantaneous response.
[122] Yeah.
[123] Yeah.
[124] And, and what real clients did he ever have?
[125] What did he trade in?
[126] What?
[127] How's he got a billion dollars or whatever he had?
[128] Yeah.
[129] Yeah, no, it's, half a billion.
[130] Under management, yeah, that's ridiculous.
[131] Why did the guy who owns Victoria's Secret give him a $70 million home in New York City?
[132] Like, what?
[133] I mean, these are all things that would have been really interesting to get into, you know?
[134] If he didn't, if he didn't try to kill himself twice.
[135] The suicide didn't happen to him, like, in The Wire.
[136] Poor fella.
[137] Yeah, yeah.
[138] It's just so unfortunate.
[139] Yeah.
[140] So unfortunate that the cameras died.
[141] And so unfortunate that he sustained an injury that, uh, usually you only get through strangulation.
[142] Right.
[143] And he fell on the ground and accidentally broke his hyoid bone.
[144] Yeah.
[145] Happens all the time.
[146] Whatever.
[147] No big deal.
[148] I mean, it's so bizarre.
[149] I can't stand conspiracy theories.
[150] I'm one of these people who, who doesn't like reading it.
[151] But I can't make the story work in a way that isn't, you know, conspiratorial.
[152] Well, that's the thing.
[153] It's like, it gets to a point where you're like, okay, even Michael Shermer, who runs Skeptic Magazine, he's like, wait a minute, the cameras were not working? That whole thing?
[155] Yeah.
[156] I mean, it's like a bad excuse.
[157] This seems like a conspiracy.
[158] Fucking, when Michael Shermer says it, he, that guy doesn't believe in anything.
[159] Right.
[160] I mean, he is fucking, he's down the line on virtually every single thing that's ever happened.
[161] He doesn't believe in any conspiracies.
[162] Well, how do you, what's the innocent explanation for any of this?
[163] None.
[164] It doesn't make any sense.
[165] You can't, you can't spin it in any way to make it not a crazy conspiracy.
[166] Especially when the brother hires a doctor to do an autopsy.
[167] Oh, yeah.
[168] Says, like, this guy was fucking murdered. Right. Yeah, Michael Baden, the famous guy from the HBO autopsy show. Right. Yep. Absolutely. Oh, craziness. Complete craziness. And, you know, it's an example of, um, you know, the Epstein story is interesting because it's about villains on both sides of the aisle, right? This is the classic, this is something I've written about before: the press does not like to do stories where the problem is bipartisan. Yeah. Right? So when you have an institutional problem, when Democrats and Republicans both share responsibility for it, or if it's an institution that kind of exists in perpetuity no matter what the administration is, we don't really like to do those stories.
[169] Fox likes to do stories about Democrats, MSNBC likes to do stories about Republicans, but the thing that's kind of, you know, all over the place, they don't like to do that story.
[170] Epstein is, you know, he's friends with Trump and with Clinton.
[171] I mean, it looks like he has more friends on the Clinton side, but still.
[172] And I think that's, this is one of the reasons why this story doesn't have a lot of traction in the media, because neither side really likes the idea of going too deeply on it.
[173] It feels like to me. Well, but the blatant aspect of it, I mean, the closest that we have to that is the absolute murder, the Jamal Khashoggi murder.
[174] That's the closest thing we have to it, an absolute murder.
[175] Right.
[176] This one, but it's also so insanely blatant. But now you have foreign actors that are involved in it, and they all disperse, and then you're left with this confusion of who's responsible for it. Well, Saudi Arabia, that's another example where you can't really say it's, you know, one side. Both parties have been incredibly complicit in their cooperation with the Saudi regime and in, you know, the massacres that are going on in Yemen. It's a classic example of what Noam Chomsky used to talk about with worthy and unworthy victims, right? Like, if the Soviet communists did it, that was bad. But if death squads in El Salvador killed a priest, a Catholic priest, you know, then that was something we didn't write about, because they were our client state.
[177] Yemen is a story we don't write about.
[178] Syria is a story we do write about, but they're really equivalent stories.
[179] And, you know, but you're absolutely right.
[180] The Khashoggi thing, I don't think either party or either side's media really wants to get into that all that deeply.
[181] How much is media shifting now?
[182] Like, you've obviously been a journalist for a long time.
[184] How much are things changing in the light of the internet?
[185] Well, a lot.
[186] And this is what, I mean, I have a new book out now that's really about this, right?
[187] Why the business has changed.
[188] What's it called?
[189] Hate Inc. Yeah.
[190] It's out.
[191] It's out now.
[192] And it's really about how the business model of the press has changed.
[193] I mean, it's something that you talk about a lot.
[194] I hear you on your show all the time talking about how news agencies are always trying to push narratives on people, trying to get people wound up and upset.
[195] And that is a conscious business strategy that we didn't have maybe 30 years ago.
[196] You know, you think about Walter Cronkite or what the news was like back in the day.
[197] You had the whole family sitting around the table and everybody watching sort of a unifying experience to watch the news.
[198] Now you have news for the crazy right-wing uncle, and then you have news for the kid in the Che T-shirt, and they're different channels, and they're trying to wind these people up, you know, to get them upset constantly and stay there.
[199] And a lot of that has to do with the Internet, because before the Internet, news companies had like a basically free way of making money.
[200] They dominated distribution.
[201] The newspaper was the only thing in town that had, you know, if you wanted to get a want ad, it had to be through the local newspaper.
[202] Now with the Internet, the Internet is the distribution system.
[203] Anybody has access to it, not just the local newspaper.
[204] And so the easy money is gone, and we have to chase clicks more than we ever had to before.
[205] We have to chase eyeballs more than we had to.
[206] So we've had to build new money-making strategies, and a lot of it has to do with just sort of monetizing anger and division and all these things.
[207] And we just didn't do that before.
[208] And it had a profound difference on the media.
[209] As a writer, have you personally experienced this sort of influence, where people have tried to lean you in the direction of clickbait, or perhaps alter titles in ways that make them a little bit disingenuous in order to get people excited about the story?
[210] I mean, you know, my editors at Rolling Stone are pretty good, and they give me a lot of leeway to kind of explore whatever I want to explore, but I definitely feel a lot of pressure that I didn't feel before in the business because, especially in the Trump era, and, you know, I've written a lot about the Russia story, right?
[211] But, you know, that's an example of one side's media has one take on it, and another side's media has another take on it.
[212] And if you are just a journalist and you want to just sort of report the facts, you feel a lot of pressure to fit the facts into a narrative that your audience is going to like.
[213] And I had a lot of problem with the Russia story because I thought, you know, I don't like Donald Trump, but I'm like, I don't think this guy's James Bond consorting with Russian spies.
[214] I think he's corrupt in other ways.
[215] And there was a lot of blowback on my side of the business because, you know, people in sort of liberal, quote unquote liberal media, you just have, there's a lot of pressure to have everybody fit into a certain narrative.
[216] And I think that's really unhealthy for the business.
[217] Yeah, very unhealthy, right?
[218] Because as soon as people can be manipulated to conforming to that narrative, then all sorts of stories can be shifted.
[219] Oh, yeah.
[220] Yeah, absolutely.
[221] And you, the job used to be about challenging your audience every now and then, right?
[222] Like, if you think a certain thing is true, well, it's our job to give you the bad news and say that you're wrong about that.
[223] That used to be what the job was.
[224] to be a journalist.
[225] Now it's the opposite.
[226] Now we have an audience.
[227] We're going to tell you exactly what you want to hear, and we're going to reinforce what you think.
[228] And that's very unhealthy.
[229] A great example of this was in the summer of 2016, I was covering the campaign.
[230] I started to hear reporters talking about how they didn't want to report poll numbers that showed the race was close.
[231] They thought that that was going to hurt Hillary, right?
[232] Like, in other words, we had information that the race was close, and we're not telling this to audiences because they wanted to hear that it was going to be a blowout for Hillary, right?
[233] And that didn't help Hillary.
[234] It didn't help the Democrats to not warn people about this, right?
[235] But it was just because if you turned on MSNBC or CNN and you heard that Trump was within five points or whatever it was, that was going to be a bummer for that audience.
[236] So we stayed away from it.
[237] And, you know, this is the kind of thing that it's not politically beneficial to anybody.
[238] It's just, we're just trying to keep people glued to the set by telling them what they want to hear.
[239] And that's not the news.
[240] That's not our job, you know?
[241] And it drives me crazy.
[242] Yeah, it should drive you crazy.
[243] What you said about journalism being used to be something that you're challenging your reader.
[244] You're giving them this reality that may be uncomfortable, but it's educational and expands their view of the world.
[245] Where do they get that now?
[246] They don't.
[247] That's the whole problem.
[248] You can predict exactly what each news organization, what their take is going to be on any issue.
[249] Just to take an example, when the business about the ISIS leader, al-Baghdadi, being killed hit the news.
[250] Instantaneously, you knew that the New York Times, CNN, the Washington Post, that they were going to write a whole bunch of stories about how Trump was overplaying the significance of it, that, you know, he was telling lies about it.
[252] You knew they were going to make the entire thing about Trump.
[253] And then meanwhile, Fox had a completely different spin on it, about how heroic it was. But news audiences didn't have anywhere to go to just simply hear: who was this person?
[254] Why was he important?
[255] What do the people in the region think? You know, what is this going to mean going forward?
[256] Is this actually going to have any impact? You know, are we going to have to continually, um, is there going to be a new person like this every time, right? Are we actually accomplishing anything? Like, you don't get that anywhere. All you get is Trump is a shithead on one side and Trump is a hero on the other side, and that's not the news. No. But the thing is, it's like the business aspect of it is so weird. Like, you have your guys like Hannity, where you can absolutely predict what that guy's going to say every single time. You know what side he's on, and he's blatant about it. And when you see someone like that, you go, okay, well, this is peak bullshit, right? So where do we go where I see both sides? Where's the middle ground, where someone goes, well, this is true, but you've got to say this is honest too, and this is what's going on over on this side, and the Republicans have a point here? There's no mainstream media place where you can go for that right now. No, there isn't. And that's, I mean, this is one of the things I write about.
[257] This is one of the reasons why shows like yours are so popular.
[258] I mean, I think there's a complete loss of trust that they feel like people are not being honest with them, right?
[259] And they're not being straight.
[260] And, you know, they come to people like you and a lot of other people, sort of independent folks who aren't like the quote unquote mainstream media.
[261] Because they, it's not really thought, it's not reporting, it's not anything.
[262] If you can predict 100% what a person is going to say, that's not thinking, that's not reporting, it's just marketing. For someone like me, that's so disturbing. I'm a fucking comedian and a cage-fighting commentator, and people are coming to me like this is the source where you go for unbiased representations of what's going on in the world. That's crazy. Well, I mean, let's see, I saw your interview with Bari Weiss, right? And you just, you did a simple, basic, you didn't go to journalism school, right? No, no. So she said something about how, um, you know, oh, she's an Assad toady, and you said, what does that mean?
[264] You just ask the simple, basic questions, right?
[265] What does that mean?
[266] Where is that coming from?
[267] How do you know that?
[268] Like, journalism isn't brain surgery.
[269] That's all it is.
[270] It's just asking the simple questions that sort of pop to mind when you're in a situation.
[271] Like, where did this happen?
[272] How do we know that that's true?
[274] But there's a whole generation of people in the press now who just simply do not do that, go through the process of just asking simple questions.
[276] How do I know that's true?
[277] Like, after each story you report, you're supposed to kind of, like, wipe your memory clean and start over.
[278] So just because somebody was bad the last time you covered them doesn't mean that they're necessarily going to be the bad guy this time you cover them, right?
[279] You have to continually test your assumptions and ask yourself, is this true?
[280] Is that true?
[281] Is this true?
[282] How do we know this?
[283] And we've just stopped doing that.
[284] Like, it's just the morass of, like, pre-written takes on things.
[285] And it's really, really bad.
[286] And you can see why audiences are fleeing from this stuff.
[287] They just don't have the impact they used to.
[288] Well, it's really interesting that a lot of this is this unprecedented consequence of having these open platforms like Facebook, where people are getting their news, and then the algorithm sort of directs them towards things that are going to piss them off, which I don't even think necessarily was initially the plan.
[289] I think the plan is to accelerate engagement, right?
[290] So they find out what you're engaging with, what stories you're engaging with, and then they give you more of that.
[291] Like Ari, my friend Ari Shaffir, actually tried this out.
[292] And what he did was he went on YouTube and only looked up puppy videos.
[293] And that's all he looked at for like weeks.
[294] And then YouTube only started recommending puppy videos to him.
[295] So it's not necessarily that Facebook wants you to be outraged, but that when you are outraged, whether it's over abortion or war, whatever the subject is, you're going to engage more, and their algorithm favors you engaging more.
[296] So if you're engaging more about something very positive, you know, if you're all about yoga and meditation, your algorithm would probably favor yoga and meditation because those are the things that you engage with.
[297] But it's natural for people to be pissed off and to look for things that are annoying, especially if you're done working and you're like, God, this world sucks.
[298] What's going on that sucks worse?
[299] And then you go to your Facebook and, oh, Jesus, look at these guys.
[300] Damn border crisis.
[301] Oh, Jesus.
[302] Look at this.
[303] Well, fucking, here's the problem with these goddamn liberals.
[304] They don't know.
[305] And you engage.
[306] And then that's your life.
[307] And then it's saying, oh, I know how to get Madhaw fired up.
[308] I'm going to fucking send him some abortion stories.
[309] Right.
[310] And then that's your feed.
[311] Right.
[312] Yeah, exactly.
[313] But there's so many economic incentives that go in there.
[314] Right.
[315] They know the more that you engage, the longer that you're on.
[316] Right.
[317] The more ads that you're going to see.
[318] Yes.
[319] So that same dynamic that Facebook and the social media companies figured out, which is that if you keep feeding somebody something that has been proven to spin that person up and get them wound up, that they're going to come back for more of it, and they're going to keep coming back.
[321] And actually, you can expand their desire to see that stuff by making them sort of more angry overall, and they will come back and they will spend more and more time.
[322] Well, the news companies figured out the same thing.
[323] And they're just funneling stuff at you that they know, you're just going to be in an endless cycle of sort of impotent, mute rage all the time. But it's kind of addicting, you know, and they know that. And it's sort of like the tobacco companies: they know it's a product that's bad for you, and they just keep giving it to you because, you know, it makes money for them. Yeah. And the thing about it is, all of it is about ads, totally, how many clicks they get on ads. If they just said, you can have a social media company, but you can't have ads.
[324] There's a new federal law, no more ads on Facebook, no more ads on YouTube, no more ads on Twitter, no more ads on Instagram.
[325] Good luck.
[326] Right.
[327] Yeah.
[328] Those businesses would all collapse.
[329] Yep.
[330] Yeah.
[331] But that seems to be what it is.
[332] It's like they figured out that your data is worth a tremendous amount of money.
[333] And the way they can utilize that data is to sell advertising.
[334] Yeah.
[335] No, they get it coming and going, because they're not only selling you ads, but they're also collecting the information about your habits, which they can then sell again. Yeah. So it's a dual revenue stream, you know. The media companies, they're basically just consumer businesses where they're trading attention for ad space, right? So if they can get you to watch four hours of television a day, they have that many ad slots that they can show you, and they know how much money they're going to make, you know. But the social media companies get it two ways. They get it by, you know, attracting your eyeballs and then also selling your habits to the next set of advertisers, which, you know, is very insidious.
[336] But what's interesting about this is that most people don't think about this as a consumer business, right?
[337] Like, Americans, they're very conscious of, like, what they put in their bodies, you know, they won't eat too much candy.
[338] Well, depending on who they are, right?
[339] But people at least look at what the calories are, but they don't think about the news that way or social media, what they put in their brains.
[340] And it's also a consumer product.
[341] Yeah, it really is. I've gone over that many times with people, that that's a diet. This is your diet. You have a mental diet as well as a physical, like, food diet. Absolutely. You have an information diet, and a lot of people are just eating shit with their brain. It's the worst kind of junk food. It's like a cigarette sandwich, the stuff that we eat. Yeah, it's so fucking bad, and it's getting worse. It is, it is getting worse. And what's weird is that this is a 10-year-old problem and no one saw it coming, and it's kind of overtaken politics, it's overtaken social discourse, everybody's wrapped up in social media conversations.
[342] They carry them on over to the dinner table and it gets people in arguments at work and all this stuff no one saw coming.
[343] No one saw this outrage economy from, you know, social media sites from things like Facebook.
[344] No one saw that.
[345] No one ever predicted that your data was going to be so valuable.
[346] No. Who the fuck saw that?
[347] I don't think anybody, I mean, I think some people in the tech business probably saw early on the potential for this.
[348] But, you know, in terms of other businesses like the news media and also politics, I mean, you have to think about the impact of this on politics has been enormous.
[349] And, you know, I cover Donald Trump.
[350] Trump really was just all about whatever you're pissed off about, I'm right there with you, you know?
[351] And people are just sort of pissed off about lots of things these days because they're doing this all day long, you know?
[352] And if you can take advantage of that, then you're going to have a lot of success.
[353] And I think a lot of people haven't figured that out.
[354] And some of these things are real causes.
[355] Like, people are upset about real things.
[356] But it's just, I don't know, you're absolutely right.
[357] People did not see this coming and they didn't prepare for it.
[358] It's just weird that it's one of the biggest sources of income online and people didn't see it coming.
[359] I mean, Facebook is generating billions of dollars and now potentially shifting global politics.
[360] Yeah, and, you know, the whole issue of a couple of companies like Facebook having control over what you do and do not see is an enormous problem that nobody really cares about.
[361] I've tried to write about it a few times.
[362] I've written a couple of features about it and about how, what a serious problem this is.
[363] Like, if you look in other countries, like Israel, China, there are a number of countries where you've seen this pattern of Internet platforms liaising with the government to decide what people can and cannot see.
[365] And they'll say, well, we don't want to see, you know, the Palestinian protest movements, or we don't want to see, you know, the Venezuelan channel, Telesur.
[366] Like, we want to take that off.
[367] You think about how that could end up happening in the United States, and it is already a little bit happening.
[368] It's a little bit, but it seems to be happening only in terms of, like, leaning towards the progressive side, which people are okay with.
[369] Because I think, especially in the light of Donald Trump being in office, this is acceptable censorship.
[370] Yeah, but I think they're wrong about that.
[371] I think they're wrong about that too.
[372] Yeah, and it's terribly dangerous.
[373] It's very short-sighted.
[374] Yes.
[375] And I think there's also this thing that happens with people where they think, oh, this is never going to happen to me, you know. Like, you can do that bad thing to this person that I don't like, but, you know, as long as it's never going to happen to me. Exactly.
[376] But they're wrong.
[377] I mean, history shows it always does happen to you, you know. So we're giving these companies an enormous amount of power to decide all kinds of things, what we look at, what kind of political ideas we can be exposed to, and, you know, I think it's very, very dangerous.
[378] That biased interpretation of what something is, that was what people talked about when the initial Patriot Act was enacted, when people were like, hey, this might be fine with Obama in office, right?
[379] Maybe Obama is not going to enact some of the worst clauses of this and use it on people.
[380] Or the, was the NDAA?
[381] Is that what it was?
[382] Yeah.
[383] Some of the things were just completely unconstitutional, but don't worry, we're not going to use those.
[384] But you're setting these tools aside for whatever fucking president we have.
[385] Like, what if we have a guy who out-Trumps Trump?
[386] Right.
[387] I mean, we never thought we'd have a Trump, right?
[388] What if we have a next-level guy post-Trump?
[389] What if there's some sort of catastrophe, tragedy, attack, something that really gets people fired up and they vote in someone who takes it up to another level?
[390] And then he has these tools.
[391] And then he uses these tools on his political enemies, which is entirely possible.
[392] Well, I mean, we've already seen that a little bit.
[393] I mean, people don't want to bring this up.
[394] But, you know, a lot of the stories that have come out about Trump, they're coming from leaks of classified information that are coming from those war on terror programs that were instituted after
[395] 9/11.
[396] Yes.
[397] The sort of FISA Amendments Act, the NSA programs to collect data.
[398] Like, they're unmasking people.
[399] Like, we have a lot of evidence now... there was a lawsuit that came out about a month ago that showed that the FBI was doing something like 60,000 searches a month at one point, where they were asking the NSA for the ability to unmask names and that sort of thing.
[400] So we're, I mean, these tools are incredibly powerful.
[401] They're incredibly dangerous. But people were scared after 9/11, so, you know, we want to protect ourselves, so that's okay for now; we'll pull it back later. But you never do pull it back, you know what I mean? It always ends up being used by somebody in the wrong way, and I think we're starting to see that that's going to be a problem. Yeah, I'm real concerned about places like Google and Facebook altering the path of free speech, leaning people in certain directions, and silencing people that have opposing viewpoints.
[402] And the fact that they think that they're doing this for good because this is how they see the world.
[403] And they don't understand that you have to let these ideas play out in the marketplace of free speech and free ideas.
[404] If you don't do that... if you don't let people debate the merits, the pros, the cons, what's wrong, what's right...
[405] If you don't do that, then you don't get real discourse.
[406] If you don't get real discourse, you've essentially got some sort of intellectual dictatorship going on. And because it's a progressive dictatorship, you think it's okay, because it's people who want everybody to be inclusive. And, you know, I mean, this is a weird time for that. It's a really weird time for that, because, as you said, people are so short-sighted. They don't understand that the First Amendment's in place for a very good reason, set up a long fucking time ago, because they did the math. They saw where it was going, and they're like, look, we have to have the ability to express ourselves. We have to have the ability to freely express thoughts and ideas and challenge people that are in a position of power, because if we don't, we wind up exactly where we came from.
[407] Yeah, no. And courts continually reaffirmed that idea that the way to deal with bad speech was with more speech.
[408] And they did it over and over and over again.
[409] You know, the legal standard for speech, I think, still remains that unless it's directly inciting violence, it's protected. You can have speech that incites violence generally, and even the Supreme Court upheld that.
[410] You can have speech that comes from, you know, material that was stolen illegally.
[411] That's okay.
[412] But we had a very, very high bar for prohibiting speech always.
[413] And, you know, the libel cases, the cases for defamation, you know, that also established a very, very high standard for punishing speech.
[414] But now all of a sudden people have a completely different idea about it.
[415] It's like, you know, forget about the fact that this
[416] was a fundamental concept in American society for, you know, 230 years or whatever; they just want to change it, you know, without thinking about the consequences.
[417] Well, that's where a guy like Trump could be almost like, it's like almost like a Trojan horse in a way.
[418] Like, if you wanted to play 3D chess, what you would do, you'd get a guy who's just so egregious and so outrageous and then so many people oppose him.
[419] Get that guy.
[420] Let him get into a position of power and then sit back.
[421] Watch the outrage bubble.
[422] and then take advantage of that and funnel people in certain directions.
[423] I mean, I don't think that's what's happening.
[424] But if I was super fucking tinfoil hatty, that's how I would go about it.
[425] I would say, this is what you want.
[426] If you really want to change things for your direction, put someone that opposes it that's disgusting.
[427] And that way, people just, a rational, intelligent person is never going to side with him.
[428] So they're going to side with the people that oppose him.
[429] And then you could sneak a lot of shit in that maybe they wouldn't agree with any other circumstance.
[430] Yeah, Trump's election is sort of like another 9/11, right?
[431] Like, you know, 9/11 happened.
[432] All of a sudden, people who weren't in favor of the government being able to go through your library records or listen to your phone calls, all of a sudden they were like, oh, Jesus, I'm so freaked out.
[433] Like, yeah, fine.
[434] When Trump got elected, all of a sudden, people suddenly had very different ideas about speech, right?
[435] Like, they, you know, hey, that guy's so bad, you know, that maybe we should consider banning X, Y, and Z, you know.
[436] And, uh, yeah, if he was conceived as a way to discredit the First Amendment and some other ideas, that would be a brilliant 3D chess move.
[437] Yeah, super sneaky.
[438] Yeah.
[439] That's like China level, many steps ahead.
[440] Right, yeah, exactly.
[441] What do you... I mean, where do you think all this goes?
[442] It seems like this is, I mean, obviously you just wrote a book about it, but it seems like this is accelerating.
[443] And it doesn't seem like anyone's taking a step back and hitting the brakes or opting out.
[444] It seems like people are just ramping up the rhetoric.
[445] Yeah, I mean, I think the divisiveness problem is going to get worse before it gets better.
[446] The business model of the media now is so entrenched that until some of these companies start going out of business, because they're, you know, losing audience because people don't
[447] trust them anymore.
[448] The news is going to keep doing what it's doing.
[449] The Hannity model is going to become normal for news companies.
[450] I think it already basically is, you know, on both the left and the right.
[451] And in terms of, you know, the internet companies, they're consolidating.
[452] They're getting more and more power all the time.
[453] And I think we've already seen that people have too much tolerance for letting them make decisions about what we can and cannot see, and I think it's going to get worse before it gets better.
[454] I don't know.
[455] What do you think?
[456] That's what I think.
[457] I mean, Facebook, Twitter, all these places.
[458] Twitter has some of the most ridiculous reasons for banning people.
[459] One of them is dead naming.
[460] Oh, yeah.
[461] So if you call Caitlyn Jenner Bruce, like, hey, I like you better when you were Bruce.
[462] Banned for life.
[463] Right.
[464] You can't even say, I like you better when you were Bruce.
[465] Banned for life.
[466] Right.
[467] Yeah.
[468] And actually, that's a core concept
[469] that we've changed completely.
[470] Like, all the different ways in the past that we punished speech, we punished the speech, not the person.
[471] Yes.
[472] So, you know, libel, defamation, all those things, first of all, they were all done through the courts.
[473] So you had a way to fight back if you thought you were unjustly accused of having defamed somebody or libeled somebody.
[474] But if they found against you, the person who got something out of it was the person who was directly harmed, right?
[475] And the courts judge that.
[476] And they, you know, it
[477] wasn't like you were banned for life from ever speaking again, right? They just gave a bunch of money to a person who might have suffered some kind of career injury or whatever it was because of that. And usually there was a retraction, or it was removed from the press, or whatever it was. But it wasn't like we were saying, we're never going to allow you to be heard or seen from again. We were sort of encouraging, optimistically, people to get better, right? And to be different, you know? And now we're not doing that at all.
[478] Now we're just saying, you know, one, one strike or two strikes, whatever, you're gone.
[479] And it's not like it's a public thing, so you can't sue over it, you know.
[480] Right.
[481] Yeah.
[482] Well, that's what's crazy about it, because it is a public utility in a way.
[483] Yes, it is.
[484] It should be.
[485] And even Jack Dorsey from Twitter admitted as much on the podcast, and he wishes that we would view it that way.
[486] He's actually proposed two versions of Twitter, a Twitter with their standard censorship in place, and then a Wild West Twitter.
[487] Mm -hmm.
[488] And I'm like, sign me up.
[489] Right.
[490] How do I get on that Wild West Twitter?
[491] Right.
[492] Because the problem with things like Gab, and I've gone there a few times and watched it, and, I mean, even Milo Yiannopoulos has criticized it for this, is that it's just so hate-filled, because it's the place where you can go and fucking say anything.
[493] Right.
[494] So the only people that it's attracting are people that just want to go there and just fucking shoot off cannons of N-bombs and call everybody a kike.
[495] It's crazy.
[496] I mean, it's, and there's real communication there as well.
[497] There's plenty of that.
[498] too, but the sheer number of people that go there just to blow off steam because they can't say those things on Twitter or Facebook or any other social media platform without being banned, because of that, it becomes a channel for it.
[499] And it's like it doesn't get a chance.
[500] It doesn't get a chance to, the concept is great.
[501] The concept is if you're not doing anything illegal, we're not going to stop you.
[502] You're not doxing anybody.
[503] You're not threatening anybody's life.
[504] We're not going to stop you.
[505] Go ahead.
[506] But if you do that and you're the only one that does that, unfortunately, everyone who wants to just say fucked-up shit goes right there, and you get a disproportionate amount of fucked-up shit. Yeah. And it's directly because of the fact that these places like Twitter or Facebook have censored, and they make it so you are scared to say whatever you want to say, and so you can't. So even if you have controversial ideas that maybe some people would agree with and some won't, you can get banned for life just for controversial ideas. Even controversial ideas that are scientifically and biologically factual, like the gender issue.
[507] Like if you say... there's a woman, I brought her up a million times, but Meghan Murphy. Murphy, yes.
[508] A man is never a woman, she says.
[509] They tell her to take it down.
[510] She takes a screenshot of it, puts that up, takes it down, but takes a screenshot of the initial tweet.
[511] It says, ha -ha, look at that.
[512] Banned for life.
[513] Right.
[514] A man is never a woman is a fact.
[515] That is a fact.
[516] It's a biological fact.
[517] Now, if you decide to become a woman, and we recognize you as a woman in society, well, that's just common courtesy in my eyes. Like, you have a person who has this issue; they feel like they're born in the wrong body. Okay, I get that. I'm cool with that. But to make it so that you're banned forever? You can call someone a dumb fuck, an idiot, a piece of shit, your mother should have swallowed you, and everybody's like, yeah, terms of service seem fine here, everything's good. Say a man is never a woman? Gone for life. Right. Yeah. Call Caitlyn Jenner... I like you better when you were Bruce. Done. That's it. Yeah. No, and it's crazy, and obviously people see that, and they just get madder, and it seems to legitimize, you know... it makes people very, very resentful in ways that they wouldn't be otherwise.
[518] And it makes, there's no pathway.
[519] There's no, there's no other thing, right?
[520] There's no free speech platform that's universally accepted.
[521] Like the ones I said, like Gab, or there's a couple other ones out there... no one's using them.
[522] It's a very small percentage of the people in comparison to something like Twitter, which is enormous.
[523] Right.
[524] And so because people don't want to be kicked off the platform, they're radically changing their behavior.
[525] Yes, yes, self-censoring.
[526] And we're seeing this a lot also with political ideas too.
[527] Like, you know, I have a podcast.
[528] Useful Idiots, it's called, right?
[529] We try to talk to people who are kind of excluded from mainstream media because that's happening a lot now, right?
[530] Like if you have the wrong idea about anything, whether it's Russiagate or the Israel-Palestine conflict or Syria,
[531] or whatever it is, you will suddenly be sort of labeled... I think with Tulsi Gabbard, for instance, they call her an Assadist, right?
[532] Like, once you get stuck with the term Assadist on Twitter, nobody wants to associate with you.
[533] No one wants to defend you, right?
[534] They all kind of, and you're like suddenly like the kid with lice, and people don't want that to happen to them.
[535] So they stop saying X, Y, and Z, right?
[536] And they just sort of go with the flow, go with the crowd.
[537] And it causes this sort of, you know, uniform conformist discourse that isn't really about anything, right?
[538] Because people are just afraid to talk, which is crazy.
[539] Yeah.
[540] Right.
[541] Well, you're not supposed to talk to someone.
[542] I experience this all the time, this idea of giving someone a platform.
[543] Like if I have someone on like a Ben Shapiro or something like that, you shouldn't give that guy a platform.
[544] Well, he's already got a platform.
[545] Wouldn't it be better if I just talk to him and find out what his ideas are?
[546] and ask him about those ideas.
[547] Like, we had a very bizarre conversation about gay people, where, I mean, he's basically full-on biblical, religious interpretation of gay people, which to me is always strange.
[548] Like, okay, how do you stand on shellfish, you know?
[549] Are you just as strong on shrimp as you are on gay guys?
[550] Right.
[551] Like, why is it gay guys?
[552] It's that, like, the Bible's pretty clear on a bunch of different things that don't seem to fire people up the way homosexuality does.
[553] Like, why?
[554] Why do you care?
[555] If you had a friend that was eating shrimp, would you go to his house?
[556] If he had shrimp cocktail?
[557] No, but you wouldn't go to a friend's house if he was having a gay marriage.
[558] So you won't celebrate gay marriage, but you don't mind a guy who's got a fucking shellfish platter.
[559] Right.
[560] Out at a party.
[561] Like, that's in the Bible, man. Right.
[562] You're not supposed to wear two different kinds of cloth.
[563] You know, there's a bunch of shit in the Bible that you're like, well, God was wrong about that. Like, how confident are you? Right. How confident are you that you can interpret God's word so perfectly that you're like, you let the lobster slide, but fucking, we got to stop that? You know, like, it's really weird. But that's the whole point, is you challenge the idea. Yes, yes. But the prevailing view now is that even having the discussion... Yes. Because you have a platform. I mean, I read that thing in The Atlantic, you know, where they're like, you give people to... I forget what the phrase was, they were saying something like, you had...
[564] I give people too many chances.
[565] Too many chances, people who had already forfeited their right to have them or something along those lines, right?
[566] That guy was silly.
[567] That guy gave up his hand when he said about me that I'm inexhaustible, but that he likes naps.
[568] Right.
[569] Oh, it's about you and your naps.
[570] That's what it is.
[571] You like naps.
[572] Okay.
[573] So you don't like people that have energy.
[574] I'm super sorry.
[575] But I mean, I thought that piece was really interesting because that whole idea that there are people who have forfeited the right to communicate forever.
[576] To communicate forever.
[577] Well, who decides that?
[578] I mean, again, there's this intellectual snobbery that goes on in, you know, frankly, my side of the media aisle, where, well, let's decide what an appropriate thought is, what's right thinking, what's wrong thinking, you know, who gets to have a platform, who doesn't get to have a
[579] platform, who we're going to call a monster, who we're not going to call one.
[580] I mean, I just don't understand the arrogance, where that comes from, to decide that some people... you know, and I totally disagree with people like, you know, Alex Jones or Shapiro on, you know, most things.
[581] And, uh, but I don't think that they should be wiped off the face of the earth.
[582] I mean, I don't know.
[583] Well, it's interesting to challenge people on these weird ideas and find out how they come to them, and you will get a lot of fence-sitters that will recognize the flaws in their thinking if you let them talk.
[584] Because there's a lot of people that are unsure either way.
[585] Maybe they haven't invested a lot of time investigating it.
[586] Maybe they really don't know what this guy stands for.
[587] Maybe they just read a cartoonish version of who he is.
[588] And then you get to hear him talk and you go, oh, well, I see the flaw in his thinking.
[589] Or, oh, well, he's right about some things and a lot of people are right about some things.
[590] They're wrong about things and they're right about things.
[591] And the only way you can discern that is you communicate with them.
[592] But as soon as you de-platform people, like, forever, you're just going to make a bunch of angry people.
[593] You're just going to make a bunch of people that are completely distrusting, and you're going to absolutely empower the opponents of your ideas.
[594] But, like, people that do get de-platformed... when do they get a chance to have their voice?
[595] Well, when they vote.
[596] So the more you do this shit, the more you censor conservatives, the more they're going to vote against liberals.
[597] This is just a fact.
[598] There's no getting around that.
[599] This is human nature.
[600] Yeah, I mean, I lived in the former Soviet
[601] Union, you know, for 11 years.
[602] And 100%, if you lived in Soviet Russia and something was published by an official publisher, people thought it was basically full of shit, right?
[603] But if it was in the Samizdat, if it was in the privately circulated stuff that had been repressed and censored, people thought that was the coolest thing in the world.
[604] Like that was the hot ticket, right?
[605] And you're automatically giving something cachet and added weight by censoring it.
[606] I mean, this is just the way it works.
[607] It's human nature.
[608] If people think that you don't want them to see something, they're going to run to it twice as hard.
[609] So I just don't understand a lot of that instinct.
[610] I think people have this idea that it works, that, you know, that de-platforming works.
[611] But you can't de -platform an idea, you know?
[612] You may be able to do it to a person or two.
[613] But eventually you have to confront the idea. You can do it to a few people, and it has been successful, which is one of the reasons why people are so emboldened. Like, they have successfully de-platformed Milo. I mean, they really have. It's very hard to hear him talk anymore. He's not in the public conversation the way he used to be, right? Because they kicked him off of all these different platforms. And if you go into why they kicked him off these different platforms... even if you don't agree with him, and I don't on a lot of things, like, boy, I don't agree with kicking him off those platforms. If you listen to what he got kicked off for, it's like, man, I don't know.
[614] This doesn't seem like this makes a lot of sense.
[615] Yeah, no, I mean, same thing with Alex Jones.
[616] Yeah.
[617] Alex Jones has said, you know, he's gone after me a couple of times in ways that were pretty funny, actually.
[618] But when he was, you know, kicked off all these platforms, I wrote a piece saying, I think people are kind of doing an end zone dance a little early on this one, you know, because Jones is a classic example of the way the system used to work: they would have punished him for being libelous about the Sandy Hook thing, right?
[619] Because that sort of fit the classic definition of what prohibited speech was before.
[620] But we wouldn't, and he would have lost probably a lot, and he still might in those court cases.
[621] But to remove him forever, I think, you know, it just... it creates a new way of dealing with speech that I think is very dangerous, you know. Right. Because the goalposts keep getting moved, right? If you can ban him for that, then why don't you ban me for repeating the things that I said about Meghan Murphy, right? Or ban me for what I said about Bruce Jenner. Ban this for that. I mean, you get further and further down the line. You keep moving these goalposts, and the next thing you know, you're in a very rigid, tightly controlled area where you can communicate, and you're suppressed.
[622] And that just accelerates your desire to step out of that boundary.
[623] And it makes you want to say things that maybe you wouldn't even have thought of before.
[624] And also, logistically, it's an insane thing to even think about asking platforms to rationally go through all this content.
[625] I talked to somebody who was a pretty high -ranking Facebook executive after the Alex Jones thing.
[626] And he said, think about what we used to do just to keep porn off Facebook.
[627] And we're dealing with, what, a couple of billion items of content every single day.
[628] We had these really high-tech algorithms that we designed to look for flesh tones.
[629] And that's how the Vietnamese running girl photo got taken off Facebook because they, like, automatically spotted a naked girl, you know, and they took that down.
[630] He's like, the Facebook Algo doesn't know that's an icon of fucking journalism, right?
[631] Like, it just knows it's a naked girl.
[632] So you take that, and now you're going to ask Facebook to make decisions about ideas, right?
[633] Like if it's that hard and that expensive for us to go through and just to keep child porn off of Facebook, think about how crazy it's going to be when we start having entry-level people deciding what is and is not appropriate political content.
[634] It's not only going to be impossible to enforce, they're going to make a mess of it, and they will.
[635] And they already are, you know, and I think that's what we're
[636] seeing.
[637] Well, that's why Twitter is so weird, because you can get away with shit on Facebook.
[638] You can say things on Facebook.
[639] Like, Facebook doesn't have a policy about dead naming or Facebook doesn't have a policy about misgendering people, but they do have a porn policy.
[640] Well, now, Twitter... you have to be very careful.
[641] I have to be very careful.
[642] When I give my phone to my kids, I make sure they don't open up the fucking Twitter app.
[643] Yeah.
[644] Because I follow a lot of dirty girls.
[645] And some of them, I mean, it's just right there.
[646] There's no warning. Bang, right in your face. I mean, it's kind of crazy, right? They have such an open policy when it comes to sex, which I'm happy they do. I'm happy... not even that I want to see porn, but I'm happy that their attitude is just, fine, it's legal, do it. You don't have to follow those people if you don't like it. It seems like it's in the American spirit to be... I don't know. But that's what it all comes down to for me. But, um, yeah, no, the policies are completely inconsistent, too, with Twitter.
[647] I've seen, I mean, I've talked to people who've been removed from Twitter for saying pretty, you know, pretty borderline things, right?
[648] Like they're, you know, basically pretty mild insults or something that would be threatening only if you really squinted hard, you know?
[649] Right.
[650] There was a guy from the Ron Paul Institute who got, who got taken down, for instance, because he was having a fight with some, you know, guy who was, I think, a Clinton fan.
[651] I forget what it was exactly.
[652] But you'll see behavior that's much worse from
[653] people of another political ilk, and they will not be removed; or they might be a smaller-profile person, and they won't be removed.
[654] So then what is that all about, right?
[655] Like if it's only a person who has 20,000 followers or higher, we're going to... I mean, it's just... you just can't do it.
[656] There are just too many layers.
[657] I mean, I'm against it just generally, but just in terms of the logistics, it doesn't make any sense.
[658] I'm against it generally, too.
[659] And when I talked to Jack, and he was explaining to me the problems with trying to manage things at scale, you really kind of get a sense of it.
[660] Like, oh, you guys are dealing with billions and billions of humans using these things.
[661] Right.
[662] Yeah.
[663] Yeah.
[664] But they're already... you know, in many countries around the world, they have armies of thousands of people who go through content to try to flag this or that kind of political content.
[665] Yeah.
[666] You know, and punish people.
[667] Yeah.
[668] They have, you know... Germany has, like... God, I forget what the term was.
[669] They have some really scary sort of authoritarian word for, like, filtration center
[670] or something like that.
[671] You know, the Chinese have armies of people.
[672] I mean, I did a story about Facebook and how it was, you know, teaming up with groups like the Atlantic Council here in the United States.
[673] Remember, a couple of years ago, the Senate called in Twitter, Facebook, and Google to Washington and asked them to devise strategies for preventing the sowing of discord.
[674] You know, so basically it was asking them to come up with strategies for filtering out fake news, and then also certain kinds of offensive content.
[675] But, you know, that is a stepping stone to what we've seen in other countries, I think.
[676] You know, and I think it's really worrisome, but nobody seems to care on our side of the aisle, which is very strange.
[677] My side of the aisle.
[678] It's my side of the aisle as well.
[679] It's a censorship issue, you know, and it's a short-sighted thing, as you said before.
[680] People... and it's not even... there's people that do pretty egregious things from the left, like the Covington school thing, when people were saying, we got to dox these kids, give me their names, release their names. These people are still on Twitter to this day, right? Talking about kids that just happen to have these Make America Great Again hats. And I have a friend who used to live in that area who said, like, no, you don't get it. These kids are on a high school field trip. There's these stands, you could buy these hats everywhere. These kids bought the hats there.
[681] They think they're being funny.
[682] These guys play the music and then get in their face.
[683] You take a photo of it.
[684] It looks like this guy standing in this Native American guy's face.
[685] But then you see the whole video.
[686] And so, no, no, no, no, the Native American guy was playing his drum walking towards him.
[687] And then everybody starts piling in.
[688] Yeah, everybody just loses their minds.
[689] You know what I mean?
[690] It's this outrage cycle.
[691] It's just so exhausting now, you know?
[692] And signaling.
[693] Everyone's signaling how virtuous they are.
[694] Everyone's signaling that they're on the right side.
[695] Everyone's signaling, you know, I want names, take these guys down. Like, you're talking about 16-year-old kids, right? It's so fucking crazy. And what is he... he's, uh, he's guilty of smiling, right? Is that it? He's guilty of... yeah, no, he's got a MAGA hat on. I mean, yeah, it's crazy. And the signaling thing is crazy. And, you know, for me, in the news business, a lot of people that I know went into journalism precisely because we didn't want to talk about our political views. Like, the whole point of the job is, like, you know, we're just going to tell you what the facts are. Like, I'm not going to tell you what I'm all about.
[696] You can't do that anymore.
[697] Everything's editorialized.
[698] Everything is about editorializing and signaling.
[699] It's just like what you're saying.
[700] You're telling people what your stance is on things.
[701] And that's the opposite of what the job used to be.
[702] And this is, again, one of the things I've been trying to focus on is that, you know, what's exactly what you're talking about.
[703] People used to go to the news because they wanted to find out what happened in the world.
[704] And they can't do
[705] it anymore, because everything that you turn on, every kind of content, is just editorialized content where people are sort of telling you where they stand on things.
[706] And, you know, I don't want to know that.
[707] I want to know what the information is.
[708] Yeah, it's so hard.
[709] How does this get resolved?
[710] Because we're dealing with essentially a two-decade-old problem, right?
[711] I mean, give or take.
[712] Before that, before social media and before the internet and websites, this just wasn't what it was.
[713] You could count on the New York Times to
[714] give you an unbiased version of what's going on in the world.
[715] I don't necessarily know that's true anymore.
[716] No. No, the Times has kind of gone over to this model as well.
[717] I mean, they've, they've struggled with it.
[718] There was an editorial, and I wrote about this in the book: in the summer of 2016, this guy Jim Rutenberg wrote this piece that said Trump is testing the norms of objectivity.
[719] That was the name of the piece.
[720] And basically what he said is, Trump is so bad that we have to, like, rethink what objectivity means.
[721] We have to not only be true, but true to history's judgment, he said.
[722] And we have to have copious coverage and aggressive coverage.
[723] So we're going to cover Trump a lot.
[724] We're going to cover him aggressively.
[725] And we're going to show you, we're going to take a stand on this issue rather than just tell you what happened.
[726] Right?
[727] So rather than doing the traditional New York Times thing of just the facts, we'll tell you, you sort it out, right?
[728] You figure it out. Now we're going to tell you, you know, kind of what your stance should be.
[729] And, you know, I think where does, where do we go from here?
[730] How does it get resolved?
[731] I don't know because, you know, unless the financial incentives change, they're not going to change, you know.
[732] The business used to be, back in the era you were talking about, the New York Times and then there were three networks, and they were all trying to get the whole audience, right?
[733] So they were doing that kind of neutral fact-finding mission, and it was working for them financially.
[734] Now they can't do that because of the internet. You're hunting for audiences in little groups.
[736] Yeah.
[737] And they're just giving you hyper politicized stuff because that's the only way they can make money.
[738] I don't know how we change it.
[739] I don't know how we, you know, reverse it.
[740] It's a problem.
[741] It's so interesting though, because, I mean, if you looked at human interactions, and you looked at, you know, dispensing news and information, and you followed trends from like the 30s to the 40s to the 50s to the 60s to the 70s, you'd be like, oh, well, people are getting better at this, people are getting better, and then, whoa, whoa, what the fuck is going on now? Everything's off the rails. Yeah. There's two camps barking at each other. There's blatant misinformation on both sides, blatant distortions of the truth, blatant editorializing of facts, and you're like, well, hey, what happened, guys? Yeah, no, it's crazy. And not that the news didn't have distortions before. Like, you think about, you know, we covered up all sorts of things, you know, massacres in Cambodia, secret bombing, you know, use of Agent Orange, like stuff that just didn't appear in the news to the degree it should.
[742] Now, though, you turn on either MSNBC or Fox, and you're right, you'll find something that's just totally full of shit within five minutes, usually.
[743] And that did not used to be the case.
[744] You know, I think individual reporters used to take a lot of pride in their work, you know, and it's different now.
[745] Now, now when you make mistakes in the business, you don't get bounced out of the business in the way you used to, and that's really strange.
[746] Only plagiarism, right?
[747] Plagiarism still bounces you.
[748] Plagiarism is pretty, yeah, that's usually fatal, right?
[749] You're not going to usually recover from that.
[750] I mean, some people have had kind of near problems with that, and, you know, I'm not going to name names.
[751] But no, but you think about people who got stories like the WMD thing wrong.
[752] Right.
[753] Not only do they not get bounced out of the business, they all got promoted.
[754] You know, they're like they're editors of major magazines now.
[755] And, you know, and so what does that tell people in the business?
[756] Well, it tells you, you know, if you screw up, as long as you screw up with a whole bunch of other people, it's okay, you know, which is not good.
[757] And we used to have a lot of pride about that stuff in this business, and now we don't anymore.
[758] You know, there isn't the shame connected with screwing something up that there used to be. I think there's a real danger, in terms of social media especially, in not complying with the Constitution, not complying with the First Amendment. I think there's a real danger in that, and I don't think we recognized that danger, because I don't think we saw what social media was until it was too late. And by the time it was too late, we already had these sort of standards in place, and the people that run it were already getting away with enforcing their own personal bias, their ideological bias. And this is when you're at this position where you go, well, how does that ever get resolved? They're not going to resolve it on their own. They're still making assloads of money. What do you do? Does the government resolve it? Well, if Trump steps in and resolves it, it looks like he's trying to resolve it to save his own political career, or to, you know, help his supporters. Yeah, no. And no matter what, if Trump does anything about it, automatically everyone's going to be against it. Right, right. You know, even if there's some sense in there somewhere, people won't get behind it. But if they do anything about it, there's going to be a correction time. There's going to be a Gab time, where it's just going to flood with people that are, like, with this newfound freedom, they're just going to go p-p-p-p-pwap. Wow. You shoot up the town, you know. But, I mean, how would you fix it now? That's the thing, because it's not only about rules, it's also about culture.
[759] Like, people have already, they're in this pattern of, you know, not saying the wrong thing.
[760] Right.
[761] And I think we're in a culture that doesn't even really know how to deal with free speech if we actually had it in the same way we used to, you know.
[762] No one seems to have a forecast.
[763] Like, no one's like, well, the storm is going to last about four years.
[764] And then it's like, there's no, there's no forecast.
[765] No. Everyone's like, well, it's fucking uncharted waters.
[766] Right.
[767] Right.
[768] But historically, the tendency is, once you have a tool that can be used to keep people in line and enforce compliance of ideas, it always ends up worsening and becoming more and more dictatorial and authoritarian.
[769] Yes.
[770] And again, you go back to the Soviet example.
[771] Like once they started, you know, really exercising a lot of control over the press and literature and things like that, it didn't get better, you know.
[772] It just continued becoming more of an entrenched thing until – so that's what I worry about.
[773] I think we're headed more in that direction.
[774] Yeah, I think so, too.
[775] I'm just really concerned with – on both sides, when people dig their heels in ideologically, the other side just gets even more convinced they're correct.
[776] Oh, yeah.
[777] Yeah, and there's no cross-dialogue of any kind anymore.
[778] And even now – I mean, it's interesting.
[779] You had Bernie Sanders on your show, and Sanders is one of the few politicians left who has this idea that we should talk to everybody, like there are no illegitimate audiences out there. Like, you know, that's my job as a politician, is to try to convince you of things. But that's not normal in the Democratic Party anymore. I mean, Elizabeth Warren, you know, has made a big thing about not going on Fox and about having certain people taken off Twitter. And I think that's increasingly the line of thought in mainstream Democratic Party thinking now, is that we're just going to rule out whatever that is, 47 percent of the electorate, we're just not going to talk to them anymore.
[780] Right, right.
[781] I mean, I don't know how that can possibly be a successful political strategy and what the point is, you know?
[782] Yeah.
[783] No, it doesn't make any sense.
[784] I was reading something where people were going after Tulsi Gabbard for being on Tucker Carlson.
[785] She's like, I'll talk to everybody.
[786] And I'm glad she does.
[787] And by the way, it's like, it's hard for her because she's kind of an outside candidate.
[788] It's hard for her to get time on these other networks.
[789] And so they want to punish her for being on Tucker Carlson's show.
[790] And then they have this, you know, reductionist view of who he is.
[791] He's a white supremacist.
[792] Like, oh, well, she supports white supremacists.
[793] She goes on a white supremacist show.
[794] Okay, is that what he is?
[795] Is that really what he is?
[796] And it's a lot more than that.
[797] There's a lot going on there.
[798] Right.
[799] You guys are fucking with life.
[800] Yeah.
[801] You know, you're fucking with the reality of life and you're saying it in these sentences.
[802] You're printing it out in these paragraphs as fact and you're sending it out there irresponsibly.
[803] And it's just really strange that people don't understand the repercussions of that.
[804] Yeah, this is something we talk about on our podcast, Useful Idiots, all the time, is that it's a catch-22, right?
[805] Like, you don't invite somebody like Tulsi Gabbard onto CNN or MSNBC, so they're kind of excluded from the same platforms the other politicians get.
[806] So they go to other platforms, right?
[807] And then you say, oh, you went on that platform, so you're illegitimate.
[808] Yes.
[809] You know, well, what do you want them to do?
[810] Like, you know, they do the same thing with people who go on RT, for instance, right?
[811] Oh, well, you're helping the Russians because you went on RT.
[812] Well, that's because you didn't invite them on anything.
[813] I mean, yeah.
[814] People are going to try to talk to anybody they can to spread their ideas.
[815] And that, that kind of propaganda thing is, is pretty constant.
[816] Now, the use of terms like white supremacist with Tucker Carlson, I mean, there are a million terms now that people just kind of throw at people.
[817] And what they're trying to do is create this ick factor around people, right?
[818] Like, once you get, someone gets a label associated with them, then nobody wants to be associated with that person, right?
[819] And then they quickly kind of die out of the public scene.
[820] And that's, I think that's really bad, too.
[821] You know, it's just an anti-intellectual way of dealing with things, and I think it's not good.
[822] It's weird that it's so prevalent.
[823] It's weird that there are so few proponents of a more, you know, open-minded way of thinking.
[824] Right, yeah.
[825] And just to take the Gabbard example, we had Tulsi Gabbard on our show, too, and immediately we got accused, what, do you love Assad, right?
[826] Do you want to bomb Syria? You want to murder Syrian children?
[827] No, you know, she's a presidential candidate.
[828] And we wanted to talk to her and hear what she has to say.
[829] But they immediately go to the maximalist interpretation of everything.
[830] And then what they're basically saying when they ask you those questions are, do you want to wear that label too?
[831] Because she's got it already.
[832] So if you have her on, again, you're going to have that label.
[833] And people, they see that, you know.
[834] And so, you know, people who don't have a big following and who are worried about their careers, and about, you know, the money and advertisers and stuff like that, they think twice about, you know, interviewing that person the next time.
[835] Yeah, exactly.
[836] And that's another way to get at speech.
[837] Exactly.
[838] And, again, I don't know how you get out of it, you know.
[839] And, I mean, I've experienced some blowback, I guess, but it hasn't worked yet.
[840] Right.
[841] You know what I mean?
[842] It's not real.
[843] It's just words.
[844] Like, okay.
[845] Well, but yeah.
[846] But you're handling it the right way, man.
[847] I think your audience is rewarding you for not bowing to it, you know.
[848] And I think that if more people took that example and said, I'm not going to listen to what the pack says about this.
[849] I'm not going to be afraid of being called a name.
[850] You know, fuck that.
[851] I'm going to talk to who I want to talk to.
[852] And I'm going to, you know, explore whatever ideas I want to explore.
[853] Then this kind of stuff wouldn't be as effective.
[854] So it's so easy to do to people, and it's so easy for them to de-platform people. It's so easy. And shadow banning and all that other weird shit that's going on. They're channeling people and pushing people into these areas of their platforms that make them less accessible. And I know where it comes from. I mean, I was young and politically active once. You know, you want to change the world, you want to make it a better place. So you're in college. You don't have any power.
[855] You don't have any way to make something into legislation.
[856] You know what I mean?
[857] So what do you do?
[858] You know, social media gives you the illusion that you're having an impact in the world by, you know, maybe getting somebody de-platformed or taken off Twitter or something like that.
[859] It feels like it's political action to people.
[860] But it's not, you know what I mean?
[861] It's something that is open to people to do, but it's not the same as, you know, getting 60 members of the Senate to raise taxes on a corporation that's been evading them for 20 years. You know what I mean? Like, that's real action. This, you know, getting some random person taken off the internet, is just not change, you know. But people feel like it is, and they want to do the right thing, so I get it. But no, it's not, you know, real political action, I don't think. No, it's fucking gross. Yeah. And it just leads, there's so much of it, and there's so little logic. Also, and this must be a personal thing for you, but isn't this the unfunniest time in American history? Like, yes and no, because you're rewarded for stepping outside of the box. That's true. In a big way. Like, yeah, I mean, Dave Chappelle gets attacked, but guess what? He also gets rewarded in a huge way. Right. He goes on stage now, people go apeshit. That's true. And part of the reason why they go fucking bonkers is because they know that this guy doesn't give a fuck, and he's one of the rare ones who doesn't give a fuck. So when he goes up there, you know, if he thinks something crazy about whatever it is, whatever protected group or whatever idea that he's not supposed to explore, that's not going to stop him at all. He's going to tell you exactly what he thinks about those things, regardless of all this woke blowback. He doesn't care. So because of that, he's rewarded even more.
[862] And same thing with Bill Burr.
[863] Same thing with a lot of comics.
[864] I experienced it with my own jokes.
[865] Sure.
[866] More controversial bits get people more fired up now.
[867] They love it because everyone's smothered.
[868] They're smothered by human resources and smothered by office politics and you're smothered by social discourse restrictions and you just don't feel like you can express yourself anymore.
[869] That's true.
[870] And also, people feel like they're being watched all the time.
[871] That's another thing, so they feel like they kind of can't let it all hang out anywhere, right?
[872] And so that's, yeah, they do feel incredibly, like, repressed and under the gun.
[873] Yeah.
[874] I think that's true.
[875] Yeah.
[876] I just, I feel like, I mean, I'm not a comic, but I just imagine it must be a more challenging environment.
[877] It's more challenging, but more rewarding, too.
[878] My friend Ari said it best.
[879] He said, this is a great time for comedy because comedy's dangerous again.
[880] Right, that's true.
[881] Yeah.
[882] That's true.
[883] Yeah.
[884] It kind of goes back to like the Lenny Bruce era, right?
[885] Yeah, when, you know, you could kind of completely freak people out by saying a couple of things.
[886] Sure.
[887] For good or bad.
[888] Richard Pryor, yeah.
[889] Well, you, like, you saw it with, like, Louis C.K., right?
[890] Louis C.K. is under the microscope now.
[891] That joke that he made about Parkland is absolutely a Louis C.K. joke.
[892] If you've followed him throughout his career.
[893] What was the joke again?
[894] I'm sorry.
[895] The joke was, why am I listening to these Parkland survivors?
[896] Why are you interesting?
[897] Because you pushed some fat kid in the way? Like, see, you're laughing, right? Like, that is a Louis C.K. joke. He's saying something fucked up that you're not supposed to say. Throughout his goddamn career, he's done that. That's what he's always done. But after the, you know, jerking off in front of women, all that stuff, and him coming out and admitting it, and then taking a bunch of time off, now he's a target, right? Now he does something like that, and they're like, oh, he's alright now? Like, no, this is what he's always done. Right. He's always taken this sort of contrarian, outside-the-box, fucked up, but hilarious take on things.
[898] And that bit, unfortunately, because it was released by someone who made a YouTube video of it, he didn't get a chance to, he was gone for 10 months, and he had only done a couple sets when he was fleshing these ideas out.
[899] I guarantee you he would have turned that idea into a brilliant bit, but he never got the chance.
[900] Because it was just, it was set out there in the wild when it was a baby, and it was mauled down by wolves.
[901] It needed to be, it needed to grow.
[902] Yeah.
[903] I mean, that's what happens with these bits, they grow and they develop.
[904] And that was a controversial idea that we're supposed to think that someone's interesting just because they survived a tragedy.
[905] And his take is like, no, no, no, no, you're not interesting.
[906] You're fucking boring.
[907] You're annoying.
[908] Get off my, get off my TV.
[909] And a lot of us have felt that way.
[910] Sure.
[911] He just, the way he said it, was easy to take out of context, put it in quotes, and turn him into an asshole.
[912] Well, yeah, but that's what comedy is, right?
[913] It's taking the thoughts that everybody has and vocalizing that thing, that forbidden thing, in a way that people can kind of, you know, come together over, right?
[914] I mean, I think that was a lot of what Richard Pryor's humor was about.
[915] Like, he took a lot of the sort of uncomfortable race problems, right?
[916] And he just kind of put them out there, and both white people and black people laughed at it.
[917] Yeah.
[918] Like, together, you know?
[919] And that was what was good about it.
[920] Yes.
[921] But if you can't, if people are afraid to vocalize those things, that they think it's going to ruin their career, I mean, I guess, you know, that makes it more interesting, right?
[922] It does.
[923] It's more high stakes.
[924] But if you can navigate those waters and get to the promised land of the punchline, it's even more rewarding.
[925] Right.
[926] But you just have to explain yourself better.
[927] You have to have better points.
[928] You have to have a better structure to your material, where, for the people who may find your idea objectionable, you coax them.
[929] Like, hold my hand.
[930] I'm going to take you through the woods.
[931] We're going to be okay.
[932] Follow me. And boom, isn't that funny?
[933] Right, right, right.
[934] But you have to navigate it skillfully.
[935] And you have to navigate it thoughtfully.
[936] And you have to really have a point.
[937] You can't have a half -ass point.
[938] But you can't have a situation where it's fatal to be off by a little bit.
[939] You know, like, there was a writer that I loved growing up, a Soviet writer named Isaac Babel. Stalin ended up shooting him.
[940] But he gave a speech, I think it was in 1936, you know, to a Soviet writers' collective.
[941] And he said, you know, people say that we don't have as much freedom as we used to.
[942] But actually, all that, you know, the Communist Party has done is prevented us from writing badly.
[943] The only thing that's outlawed now is writing badly, right?
[944] And everybody laughed, but he was actually saying something pretty serious, which is that you can't write well unless you can, you know, screw up, too.
[946] You know what I mean?
[947] Like on the way to being creative in a good way, you have to miss. Yes.
[948] You know?
[949] And if missing is not allowed and there's high punishment for missing, you're not going to get art. Yes.
[950] You're not going to get revelation.
[951] You're not going to get all these things.
[952] Well, in comedy, it's particularly important because you have to work it out in front of people.
[953] Absolutely.
[954] Yeah.
[955] No, I used to sit at a comedy club in Manhattan when I was in college, and, you know, they would try out their material, like, on a Wednesday, right? You know, early. And that was always the most interesting time for me, like, when they're trying stuff out. And a lot of it wasn't so good, but, you know, it was interesting, right? And you just can't have a situation where people feel like, you know, one wrong word is going to ruin their careers. Yeah. You know. Yeah. I don't know. But there's also people that are wolves, and they're trying to take out that little baby joke wandering through the woods. They want that feeling of being able to take someone down. Right. And that's, you know, you're getting that now, too. And so now, because of that, there's, like, Yondr bags at the Improv where I'm performing tonight. They use Yondr bags. You have to put your cell phone in a bag when you go in there, so you can't record things. Yondr bags? Yes, it's a company called Yondr. It's just so strange. Like, all the shows I did with Chappelle, he uses Yondr bags, and the idea is to prevent people from filming and recording and, you know, eventually putting your stuff out there. Uh-huh.
[957] Well, you know, look, I'm kind of all for that.
[958] I mean, I've seen this with politicians on the campaign trail.
[959] Like, they are so tight now in ways that they used to not be.
[960] Well, you saw the Donald Trump Jr. thing, where they wanted him to do a Q&A, and he didn't want to do it, so they booed him.
[961] The right-wing people were booing him.
[962] They were yelling out, Q&A, Q&A, because they wanted to be able to talk.
[963] Oh, I see.
[964] To be able to say something to him. And these are people that were, like, far-right people. They just didn't think he was being right enough, or he was playing the game wrong, or he wasn't letting them complain to him. Right. Yeah, yeah. No, that's bad. And politicians are aware of that now. They're constantly aware that they're on film everywhere, and so they're, you know, a thousand percent less interesting. Because, yeah, I mean, I remember covering the campaign in 2004, and I saw Dennis Kucinich give a speech somewhere, and he was going from, I think, Maine to New Hampshire.
[965] And I said, well, can I get a ride back to New Hampshire?
[966] He's like, yeah, sure.
[967] So, you know, takes me on the van.
[968] He, like, takes his shoes off.
[969] He's, like, cracking jokes and everything and, like, eating udon noodles or something.
[970] Political candidates would not do that now, right?
[971] They'd be afraid to be off the record with you, you know?
[972] Right, right, right.
[973] And they're afraid to be around people and just behave like people, you know, which is not good.
[974] I don't think.
[975] It's the weirdest time ever to be a politician because it's basically you've got this one guy who made it through being hugely flawed and just going, ah, fucking locker room talk.
[976] And everyone's like, well, yeah, it is locker room talk, I guess.
[977] And then it works.
[978] And he gets through and he wins.
[979] And so you've got him who seems like he's so greasy, like nothing sticks to him.
[980] And then you have everyone else who's terrified of any slight misstep.
[981] Yeah, totally.
[982] And you can't replicate the way Trump does this.
[983] You know, Trump, Trump is, he was born this way.
[984] There's like a thing going on in his head.
[985] Like, he is, you know, pathologically driven to behave in a certain way.
[986] And he's not going to be cowed by social pressure the way, you know, other people are, because he just doesn't think that way.
[987] No. He's, and, but that's, no one else is going to behave like that.
[988] What do you think about him and speed?
[989] What do you think about all that?
[990] Does he take speed, do you mean?
[991] Yeah.
[992] So did you ever see his speech after Super Tuesday? Yeah, that's the one when he was slurry. He was ramped up. He was very... I'd just say, watch that speech. You know, we're not supposed to draw conclusions about, you know, what might be going on pharmaceutically with somebody, but I would say, just watch Donald Trump's performance after the results of Super Tuesday rolled in in 2016. Let's hear some of that. First of all, Chris Christie is hilarious. ...Hillary's speech, and she's talking about wages have been poor and everything's poor and everything's doing badly, but we're going to make it.
[994] She's been there for so long.
[995] I mean, if she hasn't straightened it out by now, she's not going to straighten it out in the next four years.
[996] It's just going to become worse and worse.
[997] She wants to make America whole again.
[998] And I'm trying to figure out what is that all about.
[999] Is this it?
[1000] Yeah, I mean, I have to go back and look.
[1001] But yeah, but he went on and on.
[1002] Also, the Christie factor was really funny with that, because he was there.
[1003] He's just sitting back there going, What am I doing?
[1004] What am I doing with my life?
[1005] Look at his face.
[1006] Literally, you could see his brain wander.
[1007] Well, how the fuck did this happen?
[1008] I was going to be the man. Like, I was the goddamn president.
[1009] It was going to happen for me. I could see it happening.
[1010] I saw him in Ames, Iowa, basically standing alone in the park, waiting for people to try to shake his hand.
[1011] Yeah, it was pretty bad.
[1012] Like, you see that.
[1013] But yeah, do you have a theory about Trump and Speed?
[1014] Yeah.
[1015] Yeah.
[1016] Yeah, I think he's on some stuff.
[1017] I think, first of all, I know so many journalists that are on speed.
[1018] I know so many people that are on Adderall, and it's very effective.
[1019] It gives you confidence, it gives you a delusional perspective.
[1020] You get a delusional state of confidence.
[1021] It makes people think they can do anything.
[1022] It's basically low-level meth.
[1023] It's very similar to methamphetamine chemically.
[1024] Sure.
[1025] And people on it, tell me what it's like, because I haven't done it.
[1026] Yeah, I mean, I have done speed, too.
[1027] I mean, you know, all those drugs are, yeah, they're like baby speed, basically, you know, and you're absolutely right.
[1028] I think, for a writer, it's not good, because writing is one of these things where one of the most important things is being able to step back and ask, am I full of shit here?
[1029] You know, are my jokes as funny as I think they are?
[1030] Like, if once that mechanism starts to go wrong, you know, you're really lost as a writer, right?
[1031] Because you're not in front of an audience.
[1032] You're with yourself in front of a computer.
[1033] So I don't think speed is a great drug.
[1034] I mean, you get a lot of stuff done.
[1035] So that's good.
[1036] But yeah, no, I think there's a lot of people who are on it now.
[1037] And also a lot of this is because kids come up through school and they're on it too.
[1038] You know, and they get used to it.
[1039] So I have kids, I wouldn't dream of giving them any of those drugs.
[1040] You know, I think it's crazy.
[1041] Yeah.
[1042] I do, too.
[1043] You saw the, I'm sure you saw the Sudafed picture, too, right?
[1044] No, what was that?
[1045] Trump was sitting in his office eating a, it was that famous photo where he's like, I love Hispanics, where he's eating a taco bowl in Trump Tower.
[1046] And behind him there's an open drawer.
[1047] And in that open drawer is boxes of Sudafed.
[1048] And Sudafed.
[1050] Yeah, I mean, it gives you a low-level buzz.
[1051] And the, I mean, this is why you used to have to go to CVS, yes, to buy this stuff. You used to have to give your driver's, I guess you still do, they have to see your driver's license, because they want to make sure you're not cooking meth. Right, buying, like, 10 boxes of it at a time and cooking up a batch. Yeah, if you're, like, in a holler in Kentucky and you go in and get 20 boxes of Sudafed, I think pretty much people know what you're doing there. Yeah, that's really funny. So he had a bunch of Sudafed behind him? Yeah, in his box. And, you know, there was that one reporter, what was that guy's name again, who wrote a series of tweets, which he eventually wound up taking down, by the way, Jamie.
[1053] I can't find those fucking tweets.
[1054] He wrote a series of tweets that said there was a very specific Duane Reade pharmacy where Trump got amphetamines for something that was, in quotes, called metabolic disorder.
[1055] Kurt Eichenwald.
[1056] Fun fact.
[1057] 1982, Trump started taking amphetamine derivatives, abused them.
[1058] Only supposed to take two for 25 days, stayed on it for eight years.
[1059] Really?
[1060] Now, is he full of shit?
[1061] So, yeah, Kurt Eichenwald is an interesting one, because he's written some really good books about finance.
[1062] He wrote a book about Enron.
[1063] He wrote a book about Prudential.
[1064] It was really good.
[1065] And when I was starting out, reading about Wall Street, I was like, wow, these books are really incredibly well researched.
[1066] But he had some stuff in 2016 where, like, that's an example of something where, as a reporter, I see that and I'm like, well, where's that coming from? You know? Because in journalism, you can't really accuse somebody of certain things unless it's backed up to the nth degree. So he had a couple of things that I, you know, would be concerned about. He took a leap? I don't know. I mean, look. But that's what I'm saying, he stepped outside the journalistic boundaries of what you can absolutely prove and not prove, and took a leap, and that's why I think he took down the Duane Reade pharmacy tweet.
[1067] He didn't take it down?
[1068] Oh, it's still there as well?
[1069] Oh, okay.
[1070] There it is.
[1071] There was another thing about a...
[1072] Oh, he's got the milligrams per day.
[1073] Wow.
[1074] Where is this from?
[1075] I don't know.
[1076] He doesn't show it or anything, but I believe he got a copy of it from someone, or he talked to the doctor.
[1077] The drug was diethylpropion, 75 milligrams a day, prescription filled at Duane Reade on 57th Street, Manhattan.
[1078] Not that I know things.
[1079] So, you know...
[1080] He's got the doctor's name, too.
[1081] Dr. Joseph Greenberg, I countered with medical records.
[1082] The White House admitted to me he took it only a short time, for diet, that he took it when he was not overweight.
[1083] Well, okay, then that's fine.
[1084] He says, I countered with medical records.
[1085] They cut me off.
[1086] Wow.
[1087] Yeah, I mean, you know, one thing I will say is that when you're covering stories, sometimes you hear things and you know they're pretty solid, but it's not quite reportable because the person won't put their name on it or, you know, you're not a hundred percent sure that the document is a real document.
[1088] Maybe it's a photocopy.
[1089] And that can be very, very tough for reporters because they know something's true, but they can't.
[1090] Right.
[1091] They can't.
[1092] And social media has eliminated a barrier that we used to have.
[1093] We used to have to go through editors and fact checkers.
[1094] And now, you know, you're on Twitter.
[1095] You can just kind of, you know.
[1096] Right, right.
[1097] Or you can hint at something, you know.
[1098] And I think that's something you don't want to get into as a reporter too much, you know.
[1099] Yeah, that's a weird use of social media, right?
[1101] It's like sort of a slippery escape from journalistic rules.
[1102] Yeah, exactly.
[1103] Yeah.
[1104] You know, or you can insinuate that somebody did X, Y, and Z, or you can use terms that are a little bit sloppy, like, you know, again, like... But it seems like they did admit that he took that stuff for diet.
[1105] Yeah, so if you have the White House, you know, spokesperson saying that he took it for a short time for a diet, then you find that's a reportable story.
[1106] Right.
[1107] Yeah.
[1108] Yeah.
[1109] Well, I think when people get into that shit.
[1110] It's very hard for them to get out of that shit.
[1111] That's the speed train, and I've seen many people hop on it.
[1112] It's got a lot of stops.
[1113] Nobody seems to get off.
[1114] Yeah, not with their teeth intact, right?
[1115] Yeah, no, that's not a good way to end.
[1116] Also, he's so old.
[1117] He's so old, he doesn't exercise, he eats fast food, and he's got so much fucking energy.
[1118] I know.
[1119] I mean, people want to think he's this super person, you know?
[1120] But maybe he's on speed.
[1121] Maybe, yeah.
[1122] Maybe he's just going to turn over and collapse one day.
[1123] Or not.
[1124] Maybe you can go a lot longer on speed than people think.
[1125] Maybe if you just do it the right way.
[1126] But isn't that kind of the way history always works?
[1127] It's like, again, not to go back to the Russia thing, but all the various terrible leaders of Russia, like they all died of natural causes when they were 85, right?
[1128] Whereas, you know, in a country where people get murdered and die of industrial accidents and bad health when they're, you know, 30 all the time.
[1129] Right.
[1130] But the worst people in the country make it to very old age, and, you know, they're alcoholics. And maybe that's a thing, right? Maybe, you know, he has the worst diet in the world, and maybe he's on the speed, and maybe it's also your perception of how you interface with the world. Maybe because he's not this introspective guy that's really worried about how people see him and feel about him, maybe he doesn't feel, you know, whether it's sociopathy or whatever it is, he doesn't feel the bad feelings. They don't get in there. Yeah. And so he doesn't have the stress impact. Right, right. And that's the thing about speed, apparently, because of the fact that it makes you feel delusional.
[1131] And it makes you feel like you're the fucking man. Like, you don't worry about what other people think.
[1132] These fucking losers.
[1133] Who cares?
[1134] Right, right.
[1135] Yeah, exactly.
[1136] Let's buy Greenland.
[1137] Yeah.
[1138] You know, that was, why not buy Greenland?
[1139] Why not buy Greenland?
[1140] Yeah.
[1141] And then when that came out of, what's wrong with that?
[1142] We bought Alaska.
[1143] Well, we leased Alaska, sort of.
[1144] Yeah.
[1145] Yeah, we were supposed to give it back, but we didn't.
[1146] It seems like Greenland would be a good place to scoop up, especially as things get warmer.
[1147] Right?
[1148] Yeah, exactly.
[1149] The fucking tweet that he made when he put the Trump Tower, I promise not to do this, and I have a giant Trump Tower in the middle of Greenland.
[1150] I was laughing my ass off.
[1151] I'm like, love or hate, that is hilarious.
[1152] His trolling skills are very good.
[1153] They're fantastic.
[1154] Oh, he knows how to fuck with people.
[1155] When he starts calling people crazy or gives them a nickname, like, it's so good because, like, it sticks.
[1156] Oh, yeah.
[1157] I mean, part of me wants to see a Trump Biden race next year just for that reason.
[1158] is just because the abuse will be unbelievable.
[1159] I mean, not that I'm encouraging that necessarily, but just as a spectacle, it's going to be unbelievable.
[1160] You can tell that he is salivating at the idea of Biden as an opponent.
[1161] Biden, to me, is like having a flashlight with a dying battery and going for a long hike in the woods.
[1162] It is not going to work out.
[1163] It's not going to make it.
[1164] Yeah, no, he's...
[1165] He's so faded.
[1166] He, you know, he has these moments on the campaign trail where he'll be speaking.
[1167] And, you know, these guys do the same speech over and over again so they can kind of do it on cruise control.
[1168] But every now and then, he'll stop in the middle of it.
[1169] And this look of terror comes over, like, where am I?
[1170] Yeah.
[1171] You know, what town am I in?
[1172] You know, like, he gets confused.
[1173] He thought he was in Vermont when he was in New Hampshire.
[1174] I'm sorry.
[1175] Yeah, he got those states confused.
[1176] He was like, what's not to love about Vermont?
[1177] He was in New Hampshire.
[1178] You know, that can happen, obviously, but it happens to him a lot.
[1179] But he's clearly old.
[1180] Yeah.
[1181] You know, I mean, he's not much older than Trump.
[1182] Right.
[1183] But he needs to get on the same pills.
[1184] Yeah, yeah.
[1185] Actually, that would be interesting.
[1186] We should get a go fund me to buy speed.
[1187] You imagine?
[1188] Yeah.
[1189] If they just filled him up with steroids and just jacked him up with amphetamines and had him going after Trump.
[1190] Because I really think he needs something like that.
[1192] Whatever he's doing on the natch, it's not working.
[1193] Right.
[1194] Yeah, yeah.
[1195] He's too tired.
[1196] Needs a little bit of enhancement.
[1197] It's not going to work.
[1198] If he, if he gets the nomination, the Democrats are fucked.
[1199] I just, I don't see, I don't see him, I don't see him withstanding the barrage that Trump is going to throw at him.
[1200] Trump's going to take him out, like Tyson took out Marvis Frazier.
[1201] He's just going to bomb on him.
[1202] Yeah.
[1203] That was a bad fight.
[1204] Yeah.
[1205] It was a bomb, that was a bad fight.
[1206] But it's going to be that kind of fight.
[1207] He's just going to bomb on him.
[1208] Yeah.
[1209] Doesn't have a chance.
[1210] He can't stand with that guy.
[1211] He doesn't have a chance.
[1212] He doesn't... He's also too impressed with himself.
[1213] Yes, he's too used to people deferring to him.
[1214] Yes, like he thinks like the things he says make sense and are cool and are profound.
[1215] When they're just bland.
[1216] Right.
[1217] He's just serving bad meatloaf.
[1218] And he's like, ta -da!
[1219] And you're like, no, this is bad meatloaf.
[1220] Yeah, that's how he got to be vice president by being just bland enough.
[1221] Yes.
[1222] Right?
[1223] To get whatever constituency Obama was trying to get.
[1224] But you saw that exchange when he called Trump an existential threat earlier this year.
[1226] And Trump basically, he just went off on him.
[1227] Joe's a dummy.
[1228] He's not the guy he used to be.
[1229] Like, you know, that's going to be every day.
[1230] Yep.
[1231] You know, every minute of every day.
[1232] And then other people are going to chime in because they love it.
[1233] People love piling on.
[1234] Oh, yeah.
[1235] And his fans, oh, my God.
[1236] He's the asshole king where people never had a representative before.
[1237] There's a lot of assholes out there like, like, where's my guy?
[1238] Right.
[1239] And then finally, bam, look at this.
[1240] There he is.
[1241] Asshole made it to the White House. Holy shit, I can be an asshole now. The president's an asshole. He wants me to be an asshole. Lock her up, lock her up. Yeah, lock her up. Yeah, totally. Like, all that's... I mean, that's gonna wear on a guy. I mean, have you been to one of Trump's rallies? No chance. Yeah, I can't. I'd have to wear a rubber nose and fucking... I've covered them, and they're, like, they're unbelievable. First of all, the t-shirts are amazing. You know, like, Trump 2020, Fuck Your Feelings, you know what I mean? Like, Trump as the Punisher, you know, it's like the Punisher skull. What the... Like, it's amazing. And the crowds, it's like totally out of Idiocracy. Is there a fucking Punisher skull with a Trump wig on it?
[1242] Yeah, yeah. Oh my goodness, I'm gonna have to get one of those. I mean, he's... there's... oh, the t-shirts. The army. Do we have one?
[1243] Jamie, that was such a loud laugh. Oh my god, what a...
[1244] It's a red, white, and blue American flagged skull Punisher style with a Trump wig on it.
[1245] So I saw that...
[1246] I need that shirt.
[1247] I saw...
[1248] It wasn't the red, white, and blue one.
[1249] It was the one with the black.
[1250] And I saw that on, like, an eight-year-old kid.
[1251] Oh, my God.
[1252] It was like a mother with her little kids and the Trump Punisher Skull, but...
[1253] Do they sell that shirt on Amazon?
[1254] Can you find out? Show it.
[1255] I'm sure it's being sold everywhere.
[1257] It is now.
[1258] These are stickers and these are being sold.
[1259] Oh, God.
[1260] These fucking people.
[1261] I mean, the merch is... he's the most t-shirtable president in history.
[1262] I mean, Trump 2020 grabbing by the pussy again.
[1263] Oh, boy.
[1264] I mean, they, they like embrace that shit.
[1265] The trolling aspect of all of it is like the fun part for his crowds.
[1266] Sure.
[1267] What they get off on.
[1268] is how freaked out, you know, quote-unquote liberal audiences are by their appearance, their attitude and everything, and they lean into it, you know what I mean?
[1269] Which is interesting because, you know, that kind of like group camaraderie thing, you don't really find that on the campaign trail on the Democratic side.
[1270] It's different.
[1271] I mean, it's a different vibe entirely, but yeah, it's crazy.
[1272] Well, it's dumb, and that's the thing that he's sort of like captured is this place where you can be dumb.
[1273] Like, it's fun to be dumb and say grab her by the pussy. Like, everybody knows that's kind of a dumb thing to say publicly, of course, but you can say it there because he said it. Yay, you know. Build that wall, build that wall, yay. Right. Like, it's this chance to, like, shut off any possibility of getting over, like, 70 RPM. Like, we're going to cut this bitch off at 70. There's no high function here. We're going to cut it off at 70 and just let it rip. Right, yeah, no, totally, totally. And it's funny, the way you say that, they all, everybody knows it's a dumb thing to say, right?
[1274] So, like, I would talk to people at the crowds and, you know, I'll talk to, like, a 65-year-old grandmother.
[1275] And you say, do you agree with everything that Trump says?
[1276] And, like, almost to the last, they all say, well, I wish he hadn't said this particular thing.
[1277] Right.
[1278] But they're all there chanting, you know what I mean?
[1279] Like, they're all into it.
[1280] And the crowds are, they're so huge.
[1281] Like, I was in Cincinnati, and I was late.
[1282] to one of his events, and I made the mistake of driving; I couldn't drive in because they blocked off all the bridges, if you've ever been there, right?
[1283] I was in the Kentucky side.
[1284] So I had to walk like three miles away and like walk over a bridge and I thought I was going to be the only person there.
[1285] And it was like something out of a sci -fi movie.
[1286] It was just like a line of MAGA hats like extending over a bridge all the way into Kentucky like a mile down a road.
[1287] I mean, there had to be thousands of people waiting to get into this event.
[1288] It was incredible.
[1289] How many people did it seat?
[1290] It was like 17 or 18,000.
[1291] and it was the, you know, the, I forget what arena that is.
[1292] It's the, it's the indoor one.
[1293] Look at the size of those places.
[1294] He's the only one that can pull those kind of crowds, period.
[1295] Oh, yeah.
[1296] There's no one, no one can do that.
[1297] You know, Bernie and Warren have had big crowds.
[1298] Bernie had a, he had a 25,000-person crowd in Queens a couple of weeks ago.
[1299] You'll see crowds that big, but Trump's crowds are just dating back to 2016.
[1300] They're just consistently huge.
[1301] everywhere.
[1302] And again, this gets back to what I was saying before.
[1303] All the reporters saw this and they all saw that Hillary was having real trouble getting four and five thousand people into her events.
[1304] And so we all, you know, we were all talking to each other.
[1305] Like, that's got to be a thing that's going to, you know, play a role in the election eventually.
[1306] But nobody kind of brought it up or they explained it away.
[1307] Well, I think they felt like if you discussed it and brought it up that somehow or another you were contributing to Trump being, to Trump winning.
[1308] Right, but that's a, that's a fallacious way to look at it.
[1309] Because covering up the reality of the situation, I think, created a false sense of security for Democrats.
[1310] Sure.
[1311] And they thought they were going to win by a landslide, right?
[1312] That's what everybody was saying, but it wasn't true.
[1313] I mean, there were serious red flags throughout the campaign for Hillary.
[1314] And people, I think, were too afraid to bring up a lot of this stuff.
[1315] because they didn't want to be seen as helping Trump.
[1316] But that's not what the business is about.
[1317] We're not supposed to be, you know.
[1318] Helping people.
[1319] Facts don't have, you know, political affiliations.
[1320] We're just supposed to tell you what we see.
[1321] How do you get journalism back on track?
[1322] Is it possible at this point?
[1323] I mean, is it a lost art?
[1324] Is it going to be like calligraphy?
[1325] I mean, I think, yeah.
[1326] Yeah, right.
[1327] Yeah, like, yeah, exactly.
[1328] The Japanese calligraphy, right?
[1329] You have to pass it down through masters.
[1330] Yeah, and maybe that's going to be what journalism is like.
[1331] I mean, there's, there's two things that could happen.
[1332] One is that, like, if you created something like neither side news right now, right?
[1333] And just like a...
[1334] That's a great name.
[1335] Yeah, like a network where it was a bunch of people who just kind of did the job without the editorializing, I think it would have, it would probably have a lot of followers right away.
[1336] It would make money.
[1337] And nobody has clued into that yet.
[1338] Like, if some canny entrepreneur were to do that, that could bring back the business. That, or, you know, journalism has always been kind of quasi-subsidized in this country.
[1339] You know, going back to the Pony Express, newspapers were carried free across to the West, right?
[1340] The U.S. Postal Service did that.
[1341] The original Communications Act in '34, the idea was, you know, you could lease the public airwaves, but you had to do something in the public interest.
[1342] So you could make money doing sports and entertainment, but you could take a loss on news.
[1343] And so it was kind of quasi-subsidized in that way.
[1344] But that doesn't exist anymore.
[1345] There's no subsidy, really, for news anymore.
[1346] I'm not necessarily sure I agree with that being the way to go, but there has to be something, because right now the financial pressure to be bad is just too great. You know, like, there's no way to... sorry to go on about this, but when I came up in the business, when the money started getting tighter, the first thing they got rid of were the long-form investigative reporters.
[1348] Like, you couldn't just hire somebody to work on a story for three months anymore because you needed them to do content all the time.
[1349] Then they got rid of the fact checkers, you know, which created another serious problem, and so now the money's so tight that they just have these people doing clickbait all the time and they're not doing real reporting.
[1350] And so they have to fix the money problem.
[1351] I don't know how they would do that, though.
[1352] How much has it changed recently?
[1353] Because like when that piece that you, the stuff that you wrote about the banking crisis was my favorite coverage of it and the most relatable and understandable and the way you spelled everything out, could you do that today?
[1354] Yeah, but I think it would be harder because that's not that long ago.
[1355] It's, it really isn't.
[1356] It's only, you know, that was, I really stopped doing that in like 2014 or so, but, so we're five years out.
[1357] But the, the, the big difference is social media's had a huge impact on attention span.
[1358] So, you know, I was writing like 7,000-word articles about credit default swaps and stuff like that.
[1359] And I was trying really hard to make it interesting for people, you know, you use jokes and humor and stuff like that.
[1361] But now people would not have the energy to really fight through that.
[1362] You'd have to make it shorter.
[1363] Even TV, you know, they, people, you don't see that kind of reporting, that in -depth, you know, kind of process reporting where you're teaching people something, because people just tune out right away.
[1364] They need just a quick hit, a headline, and a couple of facts.
[1365] So, yeah, there's a big problem with audience, right?
[1366] We've trained audiences to consume the news differently and all they really want to get is a take now you know it's like the everything's like an ESPN hot take right things you know so that's the counter to that though is this what we're doing right now like these are always these long -ass conversations they're hours and hours long and there's a bunch of them out there now it's not like mine is an isolated one and there's so many podcasts that cover and some of them cover them like in a serial form like the dropout was that Was it they called it?
[1367] Yes.
[1368] The Dropout was the one about that woman who created that fake blood-testing company.
[1369] Oh, yes, right.
[1370] Susan, what was her name?
[1371] Elizabeth Holmes.
[1372] Elizabeth Holmes.
[1373] Elizabeth Holmes.
[1374] That's right.
[1375] That's right.
[1376] Theranos, yeah.
[1377] The completely fraudulent company.
[1378] That was an amazing podcast series that if I read it, I probably, you're right.
[1379] I probably would have like, oh, boring.
[1380] Right.
[1381] I probably would abandon it earlier, but listening to it in podcast form, listening to actual conversations from these people, listening to people's interpretations of these conversations, listening to people that were there at the time, telling, you know, telling stories about when they knew things were weird and when they started noticing the, there's, like, tests that were incorrect, that they were covering up, that kind of shit.
[1382] Like, you can do that now with something like this.
[1383] And I think that one of the good things about podcast, too, is you don't need anybody to tell you that you could publish this.
[1384] Yeah, no, absolutely.
[1385] I think you're right.
[1386] And I think formats like this reveal that the news companies are wrong about some things, about audiences.
[1387] Like, they think that people can't handle an in -depth discussion about things.
[1388] They think that audiences only want to watch 30 seconds of something.
[1389] They don't.
[1390] They're interested.
[1391] They do have curiosity about things.
[1392] It's just, it's very difficult to convince people in the news business especially to take chances on that kind of content.
[1393] You know, they'll do it for a podcast.
[1394] They'll do it for a documentary.
[1395] But for the news, they're making things shorter and shorter and shorter.
[1396] You know, I was really lucky to have an editor who, you know, understood the idea that we have to get into this in depth, or else it's going to be meaningless to people, right?
[1397] That's pretty rare.
[1398] You know, for the most part, you don't see them taking that kind of bet anymore.
[1399] But maybe podcasts will help people puncture that.
[1400] But the flip side of that is that they're not investing in stuff like international news in the way they used to.
[1401] Like when I came up in the business, every bureau, every big network had bureaus in every major city around the world, you know, Rome, Berlin, Moscow, whatever it is, right?
[1402] And they had newsrooms full of people who are, you know, out there gathering news.
[1404] Now there's none of that.
[1405] Because they figured out they can make the money just as easily by having somebody sit in an office in Washington or New York and just, you know, link to something and have a take on something, you know.
[1406] So I think the news is getting worse.
[1407] Podcasts are getting more interesting.
[1408] Maybe there's a happy medium they can find in between.
[1409] Well, documentaries as well.
[1410] Documentaries are commercially viable.
[1411] If it's a great subject. Like, a good example is that Wild Wild Country one, where, you know, I didn't even know that that cult existed. I had no idea what happened up there. And then this documentary sheds light on it, does it over, I think it was, like, six episodes or something like that. It's fucking amazing, and it made a shit ton of money. Yeah, or Making a Murderer was another one that was really good. Like, because that's something that happens all over the place, you have these criminal justice cases, and, you know, a terrible injustice has happened. Um, and, you know, if you really tell the whole story and make characters out of people and invest the time and energy to tell it well.
[1412] People still like really good storytelling.
[1413] But I think within the news business, they just have this belief, their hard -headed belief that people can't handle difficult material.
[1414] And I don't know why that is.
[1415] Yeah, I don't know why it is either.
[1416] It's, I mean, I think there's a large number of people that aren't satisfied intellectually by a lot of the stuff they're being spoon-fed. And they think that, because the vast majority of things that are commercially viable are short-attention-span things... I think it's like this real sloppy way of thinking, a non-risk-taking way of thinking. They're like, this is how people consume things, you've got to give them, like, music-video-style editing or they just tune out. But there's always been a thirst for actual long-form conversations.
[1417] You know, an actual real in-depth exploration of something in a very digestible way.
[1418] Like, one of the good things about doing your podcast or this podcast, any podcast really, is that you could listen to it while you're commuting.
[1419] You listen to it, and it'll actually give you something that occupies your mind and interests you during what would normally be dead time.
[1420] Right.
[1421] Yeah.
[1422] And you're absolutely right about the thirst for something else.
[1423] Yeah.
[1424] And again, I think when people turn on most news products, they're getting this predictable set of things, and that doesn't quench that thirst for them.
[1425] They're not being challenged in any way.
[1426] They're not seeing different sides of a topic.
[1427] You're not approaching covering a subject honestly by genuinely, you know, exploring the idea that people you may have thought were bad are right, or people you may have thought were good are wrong.
[1428] It's just all predictable.
[1429] I think people are fleeing to other things now, right?
[1430] They just, they want, they want to just get the story.
[1431] They don't, they don't want to have a whole lot of, you know, editorializing on top of it.
[1432] And, yeah, I think also, there's a lot of underestimating of audiences going on out there.
[1433] And, like, we, we just think that they can't handle stuff and they can.
[1434] Yeah.
[1435] They're, they're interested, but we, we just take it for granted that they can't do it.
[1436] Maybe I'm guilty of that, too, you know, because I've been doing this for so long.
[1437] But, yeah, it does happen.
[1438] I don't think people have changed that much.
[1439] Yeah, no, probably not.
[1440] It's just difficult.
[1441] You know, maybe it's also, we don't have the stamina to stick with a story in the same way that we used to.
[1442] Like now, if a story doesn't get a million hits right away, we don't return to the subject.
[1443] You know, you think about stories like Watergate.
[1444] Like, when Woodward and Bernstein first did those stories, they were complete duds.
[1445] Like, everybody thought they were on the wrong path.
[1447] They were the only people who were covering it.
[1448] And a lot of those stories kind of flailed around.
[1449] You know what I mean?
[1450] They didn't get the big response.
[1451] And it wasn't until much later that it became this hot thing that everybody was watching.
[1452] And you wouldn't, so that wouldn't happen now, right?
[1453] Like if reporters were on a story, if it didn't catch fire within the first couple of passes, your editor's probably going to take you off it now.
[1454] What was that story that the New York Times worked on about Trump? And they worked on it for a long time, and it was released and went in and out of the news cycle in a matter of days, and nobody gave a fuck. Yeah, the one about his finances. Yes. And it was like a 36,000-word story. It was, like, unbelievable. It was like six times as big as the biggest story I've ever written in my life. They thought it was a giant takedown, right? Yeah, and it was. It was like a 36-hour thing, if that, right? And maybe, maybe, yeah. And people kind of said, oh, this is amazing.
[1455] It's got all this information and it just fell flat, you know.
[1456] And that's, and the important thing about that is that news companies see this.
[1457] And they say, wow, we invested all this time and money.
[1458] We put our, you know, really good reporters on this.
[1459] We gave them six months to work on something.
[1460] And it got the same amount of hits as, you know, some story about, you know, a carp with a human face that was filmed in China.
[1461] You know what I mean?
[1462] Like something that we, you know, we picked off the wires and we stuck it in page 11, whatever it was.
[1464] So then what that tells them, the incentive now is, let's not bother.
[1465] Let's not do six months investigations of anything anymore because what's the point?
[1466] We're going to get as many hits doing something dumb.
[1467] So they just don't take their risk anymore.
[1468] God, it's so crazy that that's the incentive now that it's all clicks.
[1469] Totally.
[1470] It's such a strange trap to fall into.
[1471] And there's also the other thing, which is the litigation problem, you know, and this is another thing I wrote about in the book: there was a series of cases in the 80s and 90s where reporters kind of took on big companies.
[1472] Remember the Chiquita Banana thing that the Cincinnati Enquirer did?
[1473] Remember the movie The Insider, about Brown & Williamson, the tobacco company, and CBS, right?
[1474] There was another one with Monsanto in Florida where some Fox reporters went after Monsanto.
[1475] So they all got sued, and it cost their companies a ton of money and reputational risk.
[1477] And so after that, what news companies said is, why take on a big company that can fight back and throw a lawsuit at us?
[1478] And what do we win by that?
[1479] We're not going to get more audience from that, you know?
[1480] So now if you watch consumer reporting, you know, at like a small TV station, usually they're going to bang on some little Chinese restaurant that has roaches or something like that. They're not going to go after Monsanto or, you know, Chiquita Banana, because there's no point.
[1481] It's too much of a risk, so they just don't do it.
[1482] And that's another thing that's gone wrong with reporting.
[1483] You know, the economic benefit of going after a powerful adversary isn't there anymore, so they don't do it.
[1484] And that's a problem.
[1485] Now, clearly you've seen a giant change in journalism from when you first started to where we are now. Do you have any fears or concerns about the future of it?
[1486] I mean, this is what you do for a living?
[1487] What are your thoughts on it?
[1488] Where do you think it's going?
[1489] I mean, I'm really worried about it, because you need the journalists to kind of exist apart from politics and to be a check on everything.
[1490] The whole idea of having a fourth estate is that it's separate from the political parties, right?
[1492] I mean, I don't work for the DNC.
[1493] It's not my job to write bad news about Donald Trump, right?
[1494] That's the DNC's job, you know?
[1495] They put up press releases about them.
[1496] And if people see us as being indistinguishable from political parties or being all editorial, then we don't have any power anymore.
[1497] Like, that's the first thing.
[1498] Like, the press doesn't have any ability to influence people if people don't see us as independent and truthful and all those things.
[1499] And so that's what I really worry about right now: like, people will stop listening to the media.
[1500] They'll still tune us out.
[1501] They don't trust us anymore.
[1502] And like Walter Cronkite, you know, in 1972, the Gallup poll agency found that he was the most trusted man in America.
[1503] And that was true also in 1985.
[1504] Like for 13 consecutive years, he was the most trusted.
[1505] There's no reporter in America who's trusted.
[1506] The most trusted man in America?
[1507] Right.
[1508] It doesn't exist.
[1509] Yeah.
[1510] It doesn't exist.
[1511] So people think of us as clowns and, you know, entertainment figures.
[1512] And so how are you going to impact the world if people think you're a joke, you know?
[1513] And so that's what I really worry about.
[1514] We don't have any institutional self -respect anymore.
[1515] You know, we don't feel like we have to, you know, challenge audiences, challenge powerful people.
[1516] You know, it's just a bunch of talking points.
[1517] And that's not what the business is about.
[1518] So I worry about it.
[1519] And, you know, I think there are a lot of journalists who kind of say the same thing.
[1520] We all kind of talk amongst ourselves, which is, you know, the job as we knew it is kind of being phased out and changed into something else.
[1521] And that's not a good thing, you know, because people do need, in tough times, people need the press, you know, as ridiculous as that sounds now.
[1522] But it's true.
[1523] And I don't know where we go from here.
[1524] Legitimate journalism is so important. It's so important. It's the only way you really find out what's going on. Right, the only way. Right. You're not going to find out through the depictions of the people that were actually involved in it, that want you to see it a certain way. You're not going to find that from people that have financial incentives in giving you a specific narrative. You need real journalism. Yeah. It's so hard to find, and I think it's one of the reasons why we're so lost. And it's one of the more insidious aspects of the term fake news, because, goddamn, that's so easy to throw around. It's like, it's so easy to call someone a bigot, it's so easy to call someone a racist, and it's so easy to say fake news. And they all have the same sort of effect: they just diminish anything that you have to say almost instantaneously. Totally. And when you can cast the entire news as being fake, people can tune it out. But a lot of that has to do with who's doing the news now, right? Like, in the 60s and 70s, maybe before, reporters, a lot of them came from the middle and lower classes. Like, you know, the job was originally kind of like being a plumber, right? It was more of a trade than a profession. And so you had a lot of people who went into the job, and they had this kind of attitude of just wanting to stick it to the man. You know, like, they didn't want to be close to power, they wanted to take it on. People like Seymour Hersh, right? Like, if you see that kind of personality who just wants to take the truth and rub it in somebody's face.
But then after All the President's Men, it became this sexy thing to be a journalist.
[1526] And you saw a lot of people from my generation who went into journalism because they wanted to be close to politicians and hang out with them.
It's kind of like the Primary Colors thing, right, where you see people who just want to, like, have a beer with the presidential candidate.
[1528] And that's totally different from what it used to be.
Like, now we're on the wrong side of the rope line. You see what I'm saying? We used to be outside of power, taking it on, and now we're more upper class in the press, and we're kind of in bed with the same people we're supposed to be covering. And that's not a good thing. When people see that, you know, that's one of the reasons why they call us fake news: because they see us as doing PR for, you know, rich people. One of my favorite books ever about politics is Fear and Loathing on the Campaign Trail. Yeah, I wrote the introduction to that. Did you? Yeah, the last edition of that. Oh, the greatest book. Yeah, it's a fantastic book. And it's a great example of someone who knew that they weren't a part of that system, so they could talk about it as an outsider. He knew he was only going to be covering it for a year, so he just went in guns blazing, got everybody fucked up, drinking on the bus, making everybody do it, burned all of them, you know. Yeah. And he says that in the book. He's like, look, this isn't my beat. I don't have any friends I have to keep, you know? Yeah. So I'm going to tell you everything that I see, and fuck it. And that's a real problem in reporting. When you're in a beat for too long, you end up developing unhealthy relationships with sources, and you end up in a position where you're not going to burn the people who you're dependent on to get your information. And when that happens to reporters...
[1530] I think that's one of the reasons it's good to kind of cycle through different topics over the course of your career.
[1531] If you get stuck in the same beat too long, eventually you fall into that trap.
[1532] And Thompson, of course, never did that.
[1533] Like, you know, every story that he covered was, you know, he let it all hang out and just said whatever the hell he thought and, you know, he let the chips fall where they may. And that's kind of the way, I mean, you can't do that all the time, probably, but I think that's the thing.
[1534] That was great.
[1535] It was amazing.
[1536] And there's no other examples of it.
No, nothing kind of like that. Yeah, yeah, yeah. I mean, that book was so great on so many levels. I always thought of it as being also kind of like a novel, because it's this story about this person who's obsessed with finding meaning and truth, but he goes to the most fake place on earth, which is the campaign trail, to look for it. And so all these depictions of all these terrible, lying people, they're just so hilarious. And so it's almost like a Franz Kafka novel.
[1538] It's amazing.
[1539] And then it's great journalism at the same time.
[1540] Like, he's telling you how the system works and how elections work.
[1541] And it's really valuable for that.
[1542] So, yeah, that was brilliant.
[1543] He also changed a lot.
[1544] I mean, he actually affected politicians.
[1545] Like the shit that he did with Ed Muskie.
[1546] Oh, my God.
[1547] That was fantastic.
[1548] Well, he was on the Dick Cavett show.
[1549] And Dick Cavett asked him about it.
[1550] He goes, well, there's a rumor.
That, uh, he was on Ibogaine, and, uh, I started that rumor.
[1552] I mean, it's just he, like, literally that he got in that guy's head.
[1553] Oh, yeah.
And I remember that he put that picture of Muskie.
He just found a picture of Muskie, and he's basically doing, like, that.
[1556] Yeah.
The caption is Muskie in the throes of an Ibogaine frenzy, right?
And you couldn't really get away with that now.
Like, he just, you know... Well, it's a crazy drug to choose, too, because it's a drug that gets you off addictions.
[1560] Right, yeah, exactly.
[1561] It's one of the more hilarious aspects of his choice.
[1562] But it sounded great.
[1563] Yeah, and with the witch doctor and all that stuff.
[1564] Brazilian witch doctor?
[1565] Yeah, yeah, yeah.
[1566] It was fantastic.
[1567] Oh, so good.
[1568] Yeah, but, you know, that kind of stuff probably wouldn't go over all that well right now.
[1569] No, you could get sued.
Yeah, but also he had this very, very, um, sort of aggressively characterizing way of looking at politics and politicians, and that wouldn't go over that well now either. Like, people don't want you to rip on the process as much as he did in that book. So it was great. It was just a fantastic book. Yeah, I mean, he had a bunch of them that were great, but that one particularly, you can sort of redo it. You could reread it every time we get to an election cycle and sort of go, oh, you know, these are repeating cycles.
[1572] This is just like the same shit that he was dealing with in, you know, various different forms.
[1573] But you can see it all today.
[1574] And it's funny, the reporters, everybody's read that book, everybody who covers campaigns.
[1575] You know, I'm on my fifth right now for Rolling Stone.
[1576] Like, I have his old job.
[1577] And everybody has read that book.
[1578] And so they unconsciously try to make the same characters in each election cycle.
So there's always, like, a Christ-like McGovern figure.
There's, you know, a turncoat, quisling, spineless Muskie figure.
[1581] There's the villain Nixon, you know, Trump kind of fills that role for a lot of reporters now.
[1582] And then they all, a lot of them try to behave in the same way that their characters, um, behaved in that book.
So, you remember Frank Mankiewicz was, uh, McGovern's sort of handler.
And he was having beers with Thompson after the events and kind of, you know, strategizing with him.
[1585] Well, reporters try to do that.
[1586] They all try to do that with the candidates and their handlers.
[1587] Now they try to develop those same relationships.
[1588] It's just interesting.
[1589] It's like they're reliving the book, you know?
[1590] That's a problem with someone that's really good, you know.
[1591] They take on so many imitators or so many imitators take on their demeanor or their thought process.
Like, Hunter was just such an iconic version of a writer that it's so difficult as a fan of his to not want to be like that guy.
[1594] Oh, totally.
[1595] I mean, I, you know, I know that, uh, you know, especially because I'm writing for the same magazine and covering a lot of the same topics.
[1596] You have to immediately realize that you can't do what he did.
[1597] Like he, Thompson's writing was incredibly ambitious and unique.
[1598] He, he was using a lot of the same techniques that the great fiction writers use.
Like, he was creating almost like this four-dimensional, you know, story, but at the same time it was also journalism.
[1600] Like, you can't really, most people couldn't get away with that.
[1601] You have to be a great, great writer.
[1602] I'm talking like a rare Mark Twain level talent to really do what he did, which is to kind of mix the, you know, the ambition of great fiction with journalism.
[1603] So if you try to do that stuff, it's going to be terrible.
And I've certainly done it. If you go back and look at my writing, you'll find a lot of, like, shitty Thompson imitations.
[1605] And, uh, and so I learned to not do that pretty early.
[1606] Um, but yeah, no, it's one of those don't try this at home things, uh, for young writers.
[1607] If you can, if you can avoid that for sure.
Do you have any hope?
Is there anything that you look to, where you go, maybe this is going to be where this turns around, in terms of journalism?
[1610] Yes, I mean, I think, I mean, oddly enough, I think shows like yours and the kind of proliferation of what you're talking about with podcasts.
[1611] The great thing about the internet, there are lots of bad things, but the great thing about it is that it's given, it's provided a way for people to just have an audience if they're good, right?
And if there's a demand for it, you can exist.
[1613] You can have a platform.
[1614] And so that's what I think is going to happen, is that people are going to crack the code of what kind of journalism people want.
[1615] And they're going to create something that people are going to flock to.
[1616] And I don't have a lot of faith that CBS, MSNBC, ABC, CNN, that they're going to figure it out.
[1617] Like, I think it's going to be some independent kind of voice that is going to come up.
with something, a new formula, and that is going to rise up, you know? I mean, you've seen it a little bit with things like The Young Turks, you know. Although they've changed a little bit, they figured out that if you provide something that's an alternative to the usual thing, you can succeed. You can get a viable, functioning business a lot faster than you used to be able to. What do you mean by they changed? Um, you know, I think they've kind of moved a little bit more in the direction of a traditional news organization than they were originally, maybe.
[1619] I don't know.
[1620] I don't watch it as much as I used to, so maybe I shouldn't say that.
[1621] But, you know, again, the ability to do that is a lot different than it used to be.
Like, in order to have an independent journalism outlet, you used to have to, for instance, put out your own newspaper: do your own distribution, do your own printing, do your own design.
[1623] All that stuff cost a ton of money.
[1624] And it was very, very hard to do it without big corporate sponsors.
[1625] Now, you know, now anybody with a good idea can pretty much, you know, do something.
[1626] And so I have a lot of hope that somebody's going to figure it out.
[1627] It just, it just, we're not there yet.
[1628] I agree with you.
[1629] I'm optimistic.
[1630] I have a lot of hope, too, but I'm always like, rock, let's hurry up already.
[1631] Yeah, I know.
[1632] I know.
And until we get there, the remnants of the old system of media, they're just, you know, it's just so tough to watch them flailing.
[1634] You know, they're flailing.
[1635] They don't really know what to do.
They're kind of caught between just purely chasing the money and trying to adhere to what they thought the news looked like in the past.
So it's, like, not entertaining, you know? If they had just come up organically today, just chasing the money, they would have had a different product entirely.
[1638] But they're trying to sound like legitimate news.
[1639] but they're also completely selling out at the same time, and it's just not working, you know?
[1640] Yeah.
And so, yeah, we'll see where all that goes, but we're not there. You're right, they're flailing right now.
Well, Matt Taibbi, I appreciate you, man. Thanks a lot, Joe.
[1643] I really do, it's always an honor to talk to you.
[1644] No, likewise.
Your book, tell people. Hate Inc.
It's called Hate Inc. It's by OR Books.
[1647] It's out now.
You can buy it on Amazon, and my podcast is called Useful Idiots, with Katie Halper, at RollingStone.com.
[1649] So you can watch, check that out once a week.
[1650] Thank you.
[1651] All right.
[1652] Thanks, Joe.
[1653] Appreciate it.
[1654] Bye, everybody.
[1655] Sounds great.
[1656] Awesome.