#1242 - Tim Pool

The Joe Rogan Experience

Full Transcription:

[0] Three, two, one.

[1] Hello, Tim.

[2] How's it going?

[3] Thanks for finally being here.

[4] Long story, right?

[5] I definitely drank too much coffee before we got here.

[6] So if I appear cracked out, I swear to God, I'm not on pills.

[7] Glad to hear it.

[8] So we had a nice conversation on the phone about de-platforming and social media.

[9] And what was very obvious to me in talking to you was that you're way more schooled

[10] on this than I am.

[11] So that's why I wanted to have this conversation with you.

[12] Right.

[13] Yeah.

[14] Part of what was, like, I re-listened to my podcast with Jack and you had a good criticism of it.

[15] I agree with a lot of what you said.

[16] First of all, I agree that it was kind of boring.

[17] Yeah.

[18] And it was, I think in many, for many reasons, it was my fault.

[19] I don't think I prepared enough for it and I also don't think I understood the magnitude of how other people felt about de-platforming on Twitter and in all social media, YouTube and all these different things, and what the ramifications are and how much this means to people to have very clear and obvious free speech outside of very egregious examples of like threats and doxing and things like that, which I think we can all agree, right?

[20] I think this problem might be one of the, like, one of the worst problems we're facing right now politically.

[21] Yes.

[22] You know, the Twitter is where public discourse is happening.

[23] It's where journalists are, and this is a problem, sourcing a lot of their stories.

[24] Yes.

[25] So if you have somebody who's completely removed from public discourse, that's exile.

[26] You know, I can imagine why some people kind of lose their minds when that happens.

[27] And I think going into that conversation with him, well, that's what I wanted it to be.

[28] That's why I don't really interview people.

[29] I kind of have conversations with him.

[30] Yeah.

[31] Occasionally we have disagreements and we, you know, we talk about things, but it's not, I don't have, like, a mandate. The only thing I wanted to get out of the conversation is I wanted to find out what it was like to start that organization and to have no idea, when you were doing it, that it was going to be essentially one of the most important distribution avenues for information. An activist buddy of mine asked me if I knew why people smash windows, smash Starbucks. It's not because they think they're going to cause damage.

[32] It's because they want to strike a symbol down of something they view oppresses them.

[33] Jack Dorsey is that symbol to a lot of people.

[34] And, you know, to see, you know, what I was saying earlier is I think a lot of people look at you.

[35] You're like a real dude.

[36] You know, your conversations are real.

[37] You're not one of these fake news journalists that people are very critical of, that push their bias or an agenda.

[38] So when you sit down with Jack Dorsey and it doesn't go anywhere, people then feel like the last person who's not supposed to let us down, let us down.

[39] You know what I mean?

[40] Yeah.

[41] No, look, I felt it.

[42] They hate you twice as much.

[43] I felt it, and I noticed that I got, you know, more hate for that one than probably anything that I've ever done.

[44] And, you know, I'm not a guy who shies away from criticism.

[45] I try to figure out what I did wrong and try to regroup and figure out how to approach it again.

[46] And in Jack's defense, you know, I think he's very open to talking about anything.

[47] And he's also very open to self-criticism.

[48] I mean, he was openly discussing what they're doing wrong, where they need to be clear.

[50] We need to get better.

[51] I don't believe any of it.

[52] You don't?

[53] I don't trust that guy.

[54] Not at all.

[55] Why don't you trust him?

[56] I mean, first of all, there's the obvious thing that he's running these, a bunch of companies.

[57] I could be wrong, but I believe he actually left Twitter.

[58] He wasn't the CEO for a while.

[59] They brought him back in or something.

[60] But it either sounds like...

[61] We should probably check on that.

[62] Yeah, yeah.

[63] I try to avoid asserting things that I'm not 100% sure on, but...

[64] Me too, but I do it all the time anyway.

[65] Right, right.

[66] Jack says things like, he said to you.

[67] He said to you.

[68] He said to Congress.

[69] I believe he said to Congress.

[70] We don't ban people based on the content.

[71] We ban people based on their conduct.

[72] Okay, you literally have a terms of service that bans specific content.

[73] Like, what do you mean you don't ban people based on content?

[74] There's a, you know, I'll just get into naming some people, right?

[75] Megan Murphy, for example, is a feminist.

[76] Who's Megan Murphy?

[77] Is she, okay, she's that woman that, the whole issue with a man, right, right.

[78] So, so this is what's important.

[79] She was responding to somebody.

[80] Please explain that for, so this could be standalone.

[81] Because we talked about it yesterday with Sam Harris.

[82] So I don't know too much about Megan Murphy, but she's a feminist.

[83] She's a, what they call a trans-exclusionary radical feminist, but I think that might be offensive.

[84] Says let's remember why Jack Dorsey was fired as Twitter CEO.

[85] He was fired?

[86] Okay, let's see what it says.

[87] This is in Fortune?

[88] 2008.

[89] That was, there wasn't even, was there a Twitter?

[90] Yeah, yeah.

[91] In 2008?

[92] Jesus Christ.

[93] Dorsey's management was so problematic.

[94] Twitter's board, this is an opinion piece.

[95] Fired him in 2008, offering him a passive chairman role.

and silent board seat. 2010, he was founding Square. Square, he went rogue. Okay, so something happened. You know what's funny is you called it an opinion piece, but do they, I don't know, that's where we're at in journalism today. Well, when someone says, well, it has to be an opinion piece. When someone says was so problematic, right, I mean, that's an opinion. I mean, the real facts are he was fired. You know, you could state the specific reason that was stated by the company, and that would be a non-opinion piece, but as soon as you flavor it, it's all opinion. Right. It is. Right. Yeah.

[97] That is a, that is an issue, right, with information, the distribution of information that's flavored by opinion and ideology.

[98] All of it.

[99] Yeah.

[100] All of it.

[101] Well, we can talk about that, but I don't want to derail the Megan Murphy thing.

[102] Yeah.

[103] Okay.

[104] So Megan Murphy, please explain it.

[105] She was a feminist.

[106] She said.

[107] She's a, so they've called her a trans-exclusionary.

[108] Look, I understand this is offensive.

[109] You know, I guess calling someone a trans exclusionary radical feminist, I'm assuming.

[110] Why is that offensive?

[111] It's just used in an offensive way, I suppose.

to her, offensive against people like her. Yeah, so there's intersectional feminists, right? They tend to be trans-inclusive, meaning that they believe that someone who's born biologically male can compete with those biologically female if they transition, if they take hormones and things like that. Can compete, right, like powerlifting, racing, biking and stuff. That's where I step in. Yeah, you know, I've seen some of the stuff you've talked about. The trans-exclusionary group think they shouldn't, and they've said things that are considered to be, I say considered to be, offensive.

[113] I'm not trying to assert who's offended by it, but there's one recent story where a trans-exclusionary radical feminist said that the trans rights movement is a men's rights movement, right?

[114] They say things like that.

[115] In the case of Megan Murphy, she responded to someone.

[116] She said, men aren't women, though.

[117] That's not harassment.

[118] Right.

[119] That was a conversation with somebody else.

[120] Well, it's also a fact.

[121] She was permanently banned.

[122] Well, that's crazy.

[123] Right.

[124] Just saying men aren't women.

[125] Okay, out of context, just saying it right there: men aren't women.

[127] Who the fuck's going to argue with that?

[128] But then when you say trans people, okay, now you're into gray area.

[129] But the statement, men aren't women.

[130] I mean, you don't have to take it in context, right?

[131] Right, right.

[132] But this is where we start getting into the nitty gritty of, I guess, left-wing ideologies.

[133] The culture war.

[134] If you go on Wikipedia and you look up man, it will tell you a man is an adult human male.

[135] Right.

[136] But if you look up trans man, it will say a trans man is a man. And so the trans section of Wikipedia is at odds factually with the man section.

[137] So the reason I bring this up is that when it comes to Twitter then, we can clearly see the bias.

[138] Twitter says you can't misgender someone.

[139] And presumably that's why Megan Murphy was banned.

[140] Okay, that's a left-wing ideology.

[141] Right.

[142] But that's not, is she, was she talking about a specific human?

[143] I think she was, I think they were having a conversation about somebody.

[144] I don't know the full details.

[145] But I got to say, look, right now people are being banned or suspended.

for saying learn to code. Okay, that's, yeah, what is that about? You explain that to me too, and I saw a few people getting, but should we start one at a time? Should we, yeah, let's stick with Megan. Well, I mean, I think we've reached the point of Megan, right? She was banned for having a conversation and saying men aren't women though. That was the quote, men aren't women though, right? And they're saying that they would never ban someone for content, they banned them for behavior. Right, so what is that behavior? I've no idea. Right, like, listen, if you are using, you choosing Twitter the way Twitter was designed to engage and respond to people, is that bad conduct? It can't be conduct. It literally can't be conduct. Well, not only that, aren't you allowed to have opinions that are in fact based in biology? Yes, you should be. And I should point out, before we go any further, before you get called alt-right, you're very left. Right, I, you know, I typically say center-left. Center-left, like, oh God, someone commented, how many times will Tim mention he's a social liberal? Like, I'm center-left.

[147] I was a big fan of Bernie, you know, Bernie Sanders.

[148] He's still one of my favorite politicians.

[149] People then call me a socialist.

[150] You know, conservatives call me left.

[151] The left calls me right, whatever.

[152] You know, these labels are so fucking toxic.

[153] It's so confusing to people, and it causes so much, so much division between two sides that might not even differ that much.

[154] You know, the funny thing about it is I got my start during Occupy Wall Street.

[155] And conservatives called me far left, because I was reporting on the protests, what they were doing, police brutality, the arrests.

[156] They said, this is a far left activist.

[157] Now that I'm, I've always been critical of the more extreme factions.

[158] Like I've got interviews from six, seven years ago where I'm critical of these people.

[159] Now all of a sudden, they're accusing me of being alt-right for being critical of extremists in masks, you know, starting fires and things like that.

[160] Or alt-right adjacent.

[161] That's my new favorite.

[162] Yeah, boot-licker, alt-right adjacent.

[163] Oh, boot-licker.

Bootlicker, that's, you know, there's a lot of phrases that people use that mean literally nothing. Boot licking. One of my favorites, to keep in the context of Twitter, they say freedom of speech, freedom of speech does not mean freedom from consequence. That literally doesn't mean anything. It literally means nothing. Yeah, well, that's just trying to skirt around freedom of speech, that's what that is. So what ends up happening, but, but you agree with that, like, you should be able to speak your mind, but there's certain consequences of certain things that you say.

[165] If I throw this bottle at the wall, there's a consequence.

[166] Well, that's a physical motion.

[167] If I drink this water, there's a consequence.

[168] I'll go to the bathroom.

[169] Right.

[170] So to point out that actions have consequence is not actually addressing any of the issues.

[171] It's just literally saying nothing.

[172] But it's almost, it's just like you can predict when someone will say it.

[173] And it's usually when a specific person is banned.

[174] They'll say freedom of speech, freedom from consequence, all that stuff.

They don't say it when it's their people. They say, you know, but, but this is true too, in talking about Twitter censorship, there are people on the left who have been banned unjustly, and this is where it gets actually scarier, in my opinion. You'll have, I could name so many people. Jesse Kelly was banned for no reason. C.J. Pearson was banned, I think, twice. You say no reason? What, now they have no recourse? Twitter said it was an accident. Oops, they said it was an accident? Yeah. Banned for life, or banned, he was, he was a short amount of time? No, I, so I could be wrong, my understanding is Jesse Kelly, who's a conservative, had his account just banned. And there was a huge stink in the media, like, whoa, whoa, this guy didn't do anything.

[176] Who is Jesse Kelly?

[177] He's a conservative guy.

[178] You know, he posts snarky tweets.

[179] He doesn't harass people or anything.

[180] He's a verified Twitter user.

[181] So a bunch of stories came up saying, what is this?

[182] Twitter then reinstates it and said it was a mistake.

[183] Now, is it possible that they're just dealing with blunt tools and that there was a mistake?

[184] Well, yeah, absolutely.

[185] Absolutely.

[186] But then I'd have to, the reason why I don't think it was a mistake.

[187] Very simply is, for one, we can see the ideological bent to their rules, but then you look at someone like Milo Yiannopoulos.

[189] Like, I'm not a fan of Milo.

[190] I have to make sure, like, everybody knows that.

[191] But just because I'm critical of the actions taken against him, it doesn't mean I support him.

[192] But why was he banned?

[193] Because he tweeted at Leslie Jones.

[194] Right.

[195] And the idea was that his tweet caused his fans to attack her, which I think is, that's a stretch.

[196] That's, that's just ridiculous.

[197] He didn't say go get her.

[198] He didn't say attack her.

[199] He was just tweeting at her.

[200] Yeah.

[201] And, you know, what did he call her?

[202] Ugly.

[203] Did he say something like that?

[204] He was insulting or I think he called her ugly.

[205] He was mocking this feminist version of Ghostbusters, right? That's what he was doing. He was talking, it was like a critique of the movie.

[206] Look, I've had Milo on the podcast way back in the day. I had him on twice. I enjoy talking to him. He's hilarious, he's very smart, he's very witty, he's a character. He's very much a provocateur, right? But he's also, you know, he's pushing buttons on purpose, like he's trying to get reactions from people. And, yeah, trolling. Yeah.

[207] I mean, I almost think like he married a black guy just to let people know he's not gay.

[208] To let people know he's not racist.

[209] It's hilarious in that way.

[210] I would never say something like that.

[211] I wouldn't say that either, but I almost think it.

[212] Sure.

[213] You can understand why Milo would do that.

[214] He's calculated.

[215] Right, right.

[216] Yeah.

[217] But I don't think he did that.

[218] I don't think he did that.

[219] It should be clear.

[220] But I mean, that's how much of an act a lot of what's going on is.

[221] But if you talk to him off camera, he's a very nice guy, very reasonable, very polite.

[222] I don't trust him.

[223] I'm not a, I'm not a big fan.

[224] You know, he fat shamed a dude at the gym, right?

[225] There's literally a picture of Milo making fun of a guy.

[226] He took a photo of a guy at the gym.

[227] At the gym.

[228] Yeah.

[229] And I actually argued with him about it at one point.

[230] I was like, you won.

[231] Like, you won.

[232] You're skinny, he's fat.

[233] But no, you could, you literally shamed him to the point where he decided to go to the gym to better himself and you're still making fun of him.

[234] So did he, wait a minute, he shamed him before the guy went to the gym?

[235] No, I mean, I don't mean literally.

[236] I mean like, Milo's rhetoric of shaming people and getting them, you know, saying they're nasty and stuff.

[237] And then it.

[238] Right.

[239] Right, but I doubt the guy listened to him, and that's why he went to the gym.

[240] No, no, no, I didn't mean literally.

[241] You know, I just mean, like, if Milo plays it up like, I'm going to shame people until they go work out, why would you?

[242] But I'll say this.

[243] Look, that's fine.

[244] Milo can say the nasty things and be the kind of person he is.

[245] Right.

[246] He shouldn't have been banned from Twitter.

[247] That's ridiculous.

[248] Why was his verification badge removed?

[249] That was another, it's plain as day.

[250] Well, the verification one was weird.

[251] It's like, we're going to keep you here, but we're going to take away the verification that lets people know you're you.

[252] Right.

[253] So it opens the door to fraud.

[254] Right.

[255] It opens the door to fake Milos, and you don't know who's who because there's no blue check mark.

[256] That doesn't necessarily make any sense.

[257] And then we get to Julian Assange, who couldn't get verified.

[258] What?

[259] Yeah, they wouldn't have.

[260] And so a bunch of fake accounts started popping up.

[261] Wait a minute.

[262] He's not verified?

[263] I believe WikiLeaks is.

[264] And then I think Julian Assange's account was changed into the Defend Assange account, but it's not verified.

[265] At least, I'm pretty sure.

[266] What was the reasoning behind not verifying Julian Assange?

[267] I don't think there was one.

[268] Okay, let's take a little sidebar here.

[269] Can you tell me what's wrong with Julian Assange?

[270] Like, what is the idea that this guy is some super villain, some bad person that did something terrible because he exposed some information?

[271] Like, what is it, what am I missing?

[272] I guess it depends on how conspiratorial you want to get.

[273] But Julian Assange, they've labeled him, like the intelligence agencies, on camera, I think it may have been James Clapper, said that he is acting as a private intelligence adversary of the U.S. or something to that effect.

[274] Of Russia, you mean?

[275] No, that Julian Assange is acting independently against the U.S. Oh, against it.

[276] Right, right, right.

[277] And so the leaks Assange put out were very damaging to the U.S. Right.

[278] I think that's fair to say.

[279] Sure.

[280] They don't like him.

[281] So, you know, then he ends up getting accused of, I believe, molestation.

[282] No, I think it was secret.

[283] What was it?

[284] It's pretty complicated.

[285] He was having consensual sex with a woman.

[286] And then, no, no, no, I don't think so.

[287] I think while they're in bed, then the accusation was that without a condom on, he had sex with her again.

[288] I don't think that's, you know, I'll be honest, it's been years, but I don't think that's the case.

[289] I thought that's what was.

[290] I'll say this, it's been years, I think, you know, but there was reference to a condom breaking.

[291] And I think what happened was he said it was fine or something.

[292] But, but, you know, outside of that, you know, the guy's been locked up.

[293] for how long, you know, I think, I think it was the U.N. said something like, it's a violation of his human rights or whatever.

[294] Yeah, we went over.

[295] It was like, what is it, six years?

[296] I think he's been locked up for more than six years.

[297] Yeah, that's crazy.

[298] And it's because there's an, man, this is getting a little bit out of my wheelhouse, but I think the U.S. is preparing a grand jury indictment against him, so, but again, you know.

[299] And Chelsea Manning, who gave him the information, is now free and out of jail.

[300] I don't, I think it's presumed that Chelsea did.

[301] I thought it was the whole reason why she went to jail.

[302] It's been years since I've tracked a lot of this stuff.

[303] Okay.

[304] But back to the main point, Assange wasn't verified.

[305] WikiLeaks was, I'm pretty sure.

[306] Okay, so verification is not just, hey, this is Tim Pool, there's Jamie Vernon.

[307] Oh, that's the real Jamie.

[308] Give him a blue checkmark.

[309] It's, uh, we don't like you.

[310] So we're going to take away your checkmark, even though we know you're the real you.

[311] It's like, uh, it's a class.

[312] Yeah.

[313] It's an elite class of people.

[314] A removal of approval.

[315] Yeah.

[316] There are some people who, I think, have removed their own verification badges.

[317] How do you do that?

[318] You can just change your name and it erases.

[319] I think PewDiePie did it once and then immediately got it back.

[320] Oh, that's funny.

[321] Yeah, yeah, yeah.

[322] I could, yeah.

[323] But, you know, why remove someone's verification?

[324] Right.

[325] What does that do?

[326] No, I'm 100% in agreement.

[327] So another interesting thing we can sort of segue into is why was Laura Loomer banned?

[328] You don't have to be a fan of Laura Loomer.

[329] you can question her politics.

[330] Was she banned before or after she jumped Nancy Pelosi's fence?

[331] Before.

[332] But she's, you know, I'll say this, man. Like I tweeted about it if, oh God, talking about Twitter.

[333] If you, even if you don't like her, you got to admit she knows how to get press.

[334] She knows how to generate that buzz.

[335] And she's really good at it.

[336] So, but here's a thing.

[337] She got banned permanently because she tweeted to, I believe it was Ilhan Omar, criticism about Sharia law.

[338] She accused, you know, you could pull up the tweet, but I think she accused Ilhan of promoting Sharia, which results in like all these horrific things, and they banned her for it.

[339] Okay, disagree with her all you want, but that was her criticizing a politician.

[340] You can't have a lawsuit against Donald Trump claiming you can't, you know, Trump can't block somebody because it's a public forum.

[341] But then when it comes to a congressperson, you know, just permanently ban someone for saying something critical of their ideology.

[342] And I think what's really critical here is that there has to be some sort of clarification for what policies were violated and how they were violated.

[343] That seems to be especially for public figures, because it's one thing if we don't know a person or the background, but when you know a person, whether it's a Laura Loomer or a Milo Yiannopoulos and it's a public case, and then you get this feeling that to say, no, because we decide, and this is it.

Well, Joe, don't worry, because no matter what Twitter does, they're going to be defended by the New York digital, you know, journalist elites, who will misrepresent what's going on in an effort to obfuscate, or sometimes outright lie about what's going on. And this brings me to learn to code. Yeah, okay. Yeah, right, so learn to code. So I asked you about this the other day, right? People are getting banned for learn to code. I'm like, what the fuck is that? Like, what is that? So when coal miners were getting laid off, a bunch of articles emerged saying, teaching miners to code, can we teach miners how to code, and they were showing videos about it. I don't believe it was, it wasn't intended to be derogatory or insulting, but to a lot of people it came off as this bourgeois let-them-eat-cake. Oh, your career has been destroyed? You're a 50-year-old man with a family, go to Silicon Valley and do something you've never even thought about. Right, so it came off to a lot of people as just elitist. Right. So when these journalists are getting laid off, this meme spreads, I don't know exactly where it started, where they say learn to code to the journalists.

[345] Well, an interesting thing happens.

[346] John Levine, I think is his name, from The Wrap, tweets.

[347] Someone from Twitter told me, you can be banned for tweeting, learn to code at a laid off journalist.

[348] Conservatives start tweeting it far and wide.

[349] Like, here we go.

[350] This is a reporter from The Wrap who's confirmed this.

[351] All of a sudden, then other journalists come out and say, this is a lie.

[352] This is not true.

[353] This is fake news.

[354] Conservatives are spreading fake news again.

[355] And they say, we have a new statement from Twitter that said, we're only banning, we're only banning people who are engaging in a harassment campaign.

[356] Well, now you've got a few problems.

[357] Is tweeting a meme at somebody critical of them, a harassment campaign?

[358] Is that a meme?

[359] Yeah, right.

[360] It's like it condenses an idea.

[361] So here's a thing.

[362] I got sent a bunch of screenshots from people.

[363] Now, people can fake screenshots.

[364] I understand that.

[365] But I checked some people's Twitter accounts.

[366] I saw that they were tweeting this and I believe, for the most part, this is what happened.

[367] Someone tweeted something to a BuzzFeed journalist.

[368] You know, oh, you guys believed X, Y, and Z, yeah, whatever, hashtag Learn to Code, criticizing them, suspension.

[369] So then these journalists come out and say, this is not true, it's just people engaging in a harassment campaign.

[370] So I said, look at this guy's account.

[371] He's got one tweet that says Learn to Code.

[372] Is that him harassing somebody?

[373] And they said, oh, but you're taking out of context.

[374] Then John Levine from The Wrap says, update, Twitter spokesperson, who was my source, is now saying, clarifying it is about the harassment campaign.

[375] And then another journalist comes out and says, his quote's fake, Twitter is denying ever saying it.

[376] But here's a thing.

[377] The editor -in -chief of the Daily Caller, just a couple of, I think a couple days ago, took a tweet from the Daily Show.

[378] And it was from the State of the Union, and he tweeted, learn to code, and quote-tweeted a video. Suspended.

[379] So it's very clearly not about a harassment campaign.

[380] But why then were all of these journalists so ready to jump up and defend Twitter when Twitter, you know what I said?

[381] Okay, if Twitter is claiming they're banning people who are engaging in a harassment campaign, you mean they've confirmed they're banning people for tweeting learn to code. They just consider it harassment.

[383] How is it that learn to code is harassment, but Kathy Griffin saying to all of her millions of fans, I want these kids' names, several times, or another verified account, I'm not going to name because it's not as famous, literally calling for the death of these kids and instructing people to kill them, is not a bannable offense?

[384] It's not a harassment campaign.

[385] Is that true?

[386] I don't want to mention the guy's name.

[387] You don't have to mention the name.

[388] But yes, absolutely.

[389] He said something to the effect of, put them in a school, lock it and burn it down, and when you see them, fire on them.

This guy's still active on Twitter, you know, right now. There's so much here, dude. We've got the Proud Boys, all of them, purged from Twitter. Okay, say whatever you want about the Proud Boys, if they deserve to be banned, fine. Why wasn't Antifa banned?

[391] A lot of people respond to me and say, but Tim, Antifa's random people who wear masks, you don't know. That's not true. There are branded cells of Antifa that have their own merchandise, still active. Some of these groups have published the private information of law enforcement officers, still active, no action taken against them.

[392] So, you know, I don't.

[393] So this, this indicates a heavy left-wing bias.

[394] I wouldn't necessarily say left-wing.

[395] I would say intersectional, identitarian, ideological bias, right?

[396] It's hard to pinpoint what the tribes are in the culture war.

[397] But Twitter is clearly acting in defense of intersectional activism.

[398] Now, do you think that this is a mandate?

[399] Do you think this is written somewhere?

[400] Do you think there is people who are in the company that have power that are acting independently?

[401] It's grains of sand that make a heap, right?

[402] You're in Silicon Valley.

[403] You're in a very blue area.

[404] The people who get hired tend to hold certain views.

[405] And because they all live in their own bubble, they believe they're the majority.

[406] And thus they think they're acting justly to ban those who are at odds with them.

[407] Right.

[408] Social engineering.

[409] And this brings back into journalism, the big problem.

[410] It's, you know, for decades, I don't know how long, journalism has been dominated by self-identified liberals.

[412] There's a ton of polls.

[413] I think there's a 2015 poll showing Republicans are like 7% of journalists or some ridiculously small number.

[414] And there's a really simple reason for it.

[415] News organizations are headquartered in big cities, the big ones.

[416] You know, even Fox News is in New York.

[417] So a lot of people who work at Fox News are actually liberal.

[418] People don't seem to know that. You live in New York?

[419] You're probably not a staunch conservative.

[420] So what happens then?

[421] News breaks.

[422] You've got all these journalists because I've worked with them.

[423] You know, I worked for Vice.

[424] I worked for Fusion.

and they sit around at tables, they meet up after work from different offices, and they talk about things, and they all tell each other the exact same thing. And so this is why you see Covington happen. These people all follow each other on Twitter, so when someone tweets, this MAGA kid got in the face of Nathan Phillips, they only see each other's tweets and they just write it. They don't do any journalism, and it goes, man, I can't believe, for days. And that was even in the New York Times, correct?

[426] Yeah, yep, the New York Times. Harris talked about that yesterday, and it's mind-blowing to me because in the second video that came out from Covington, you literally watch Nathan Phillips walk up to the kid and get in his face.

[427] Bill Maher, you know, what four or five days later, says the kid got in his face.

[428] And I'm like, how are you?

[429] You know, shame on Bill Maher for saying that.

[430] That's not true.

[431] But at the same time, we have a serious journalism problem.

[432] And this links back to Twitter.

[433] And that story in particular really almost like condensed all the problems into one event.

[434] Yeah.

[435] And what's what's fascinating is following the story, an op -ed, I believe it was in the New York Times, said, stop tweeting.

[436] Or it said, never tweet.

[437] Brian Stelter from CNN then got a statement, and I always say I believe because I don't have the sources pulled up, but someone from Twitter said journalists are the lifeblood of our platform.

[438] And so that's why I think you've got these predominantly New York -based progressive writers.

[439] They're fresh out of college.

[440] They get hired for, you know, moderate salaries to work in a newsroom, sit around each other all day, sharing the same ideas, not exploring anything outside their bubble and Twitter supports them because they're the ones who drive traffic to Twitter.

[441] They keep the conversation going.

[442] And I think that's where Twitter's bias partly comes from.

[443] The other is that clearly you're in San Francisco, you're going to have, you know, your staff, the people who are, you know, running content, curation and banning people, they lean left.

[444] So why wasn't Kathy Griffin banned?

[445] Probably because she's very famous, but then I have to wonder why Alex Jones was.

[446] So the only real differentiator there, I guess, is either mainstream notoriety or ideological tribe.

[447] Well, Jamie, you pulled up why Alex was banned too, which is, you know, it's not very clear.

[448] Like, when you think about the fact that they were saying that he had never done anything on their platform that was bannable.

[449] And then what was the one final thing?

[450] Like, and Jack didn't know what it was.

[451] He got, he confronted Oliver Darcy of CNN in D .C. and for several minutes was yelling at him while they filmed.

[452] And apparently, that's my understanding, was the justification for banning him: that he was harassing a journalist, or something to that effect, which is, in my opinion, absurd.

[453] Was he doing it on Twitter?

[454] I guess they posted it to Twitter.

[455] Live on Periscope, which is a Twitter-owned platform.

[456] So if you do something on Periscope that could get you banned from Twitter?

[457] Well, that's the same thing.

[458] Right.

[459] Yeah, the same thing.

[460] Because they're connected.

[461] Yeah.

[462] I don't know at what point last year.

[463] I think it was last year.

[464] There was like an announcement.

[465] I saw it on Twitch, but I think it also happened on YouTube.

[466] They like collectively said if you do something on our platform, I'm sorry, if you do something on another platform and we see that, you could lose your status on our platform too.

[467] Or if you're like that, that means public also.

[468] And we see that with with Patreon, but I don't want to deviate into Patreon.

[469] Yeah, we can, we could get to that later.

[470] But so.

[471] So in my opinion.

[472] So it makes a good point.

[473] How does Alex Jones get banned for giving that guy a hard time?

[474] but Kathy Griffin doesn't get banned for literally calling for the names of these children, leading a harassment campaign against kids.

[475] Someone with millions of followers led a harassment campaign.

[476] I'm going to use their language.

[477] If you're calling on your followers to do something, you're engaging in a campaign.

[478] But Alex Jones confronting the journalist who advocated for his banning is a bannable offense.

[479] Here's the important thing about Jones.

[480] Oliver Darcy said on CNN, it wasn't that Jones broke the rules that got him banned because what Darcy said is he's been breaking the rules in the past.

[481] They never cared.

[482] It was only because of media pressure they took action against him.

[483] Okay.

[484] Well, we know many other people break the rules.

[485] We know far-left accounts have doxed law enforcement.

[486] We know Kathy Griffin led a harassment campaign.

[487] There's no media pressure.

[488] That's one of the big problems.

[489] Twitter knows conservatives aren't going to be able to level any kind of campaign against their platform.

[490] They're not scared of it.

[491] But, you know, I often wonder why is it that as prominent and powerful as conservative groups can be, why they often lose these cultural battles.

[492] And I'm not going to say this is the primary reason, but I will point out, does Twitter believe that, you know, I often use Sargon of Akkad as an example, the liberalist anti-SJW character?

[493] Do they believe he'll lead a group of liberalists and individualists to Twitter headquarters with crowbars and Molotov cocktails?

[494] Of course not.

[495] So they, what did he get banned from?

[496] My understanding, he, uh, because I know what happened with Patreon, but what happened with?

[497] His original Twitter thing was that he posted an image of interracial gay porn at white nationalists.

[498] So, but I don't, I think that was the first time.

[499] And then, well, he came back to the platform and then got... I don't know what happened the second time.

[500] I think it was ban evasion or something.

[501] Well, what are the porn rules?

[502] Because sometimes I'll be scrolling through my feed and you'll just see porn.

[503] I understand it is not allowed.

[504] Porn's just not allowed.

[505] I mean, what about porn stars?

[506] But I've heard it is.

[507] I've heard it is allowed and I've heard it isn't.

[508] Well, it's definitely there.

[509] Yeah, yeah, yeah.

[510] It's like porn stars have porn.

[511] If you go to a porn star's page, you'll see porn on it.

[512] A lot of it.

[513] Yeah, like real penetration porn.

[514] They don't care.

[515] Hmm.

[516] Or maybe there's the truce between...

[517] It has to get marked by someone saying this is inappropriate.

[518] And if enough people that follow a porn star don't think it's inappropriate, it doesn't then get flagged in the system.

[519] Well, that's hilarious.

[520] This is good news.

[521] I think we may have found the Switzerland of the culture war.

[522] Porn.

[523] Yeah, no one who wants to ban porn.

[524] The left and the right.

[525] We're like, whoa, whoa, hold on, hold on.

[526] It's okay, it's okay.

[527] It's okay.

[528] We can ban them for their ideas.

[529] Yeah.

[530] But just leave the porn alone.

[531] But the point I was making is you see Antifa at Berkeley.

[532] Yeah.

[533] $100,000 worth of damage, throwing Molotov cocktails, threatening people.

[534] In Portland, you had a Bernie voter carrying an American flag.

[535] These anti-fascists, the Antifa, tried stealing the flag from him, clubbed him over the head, gave him a concussion, put him in the hospital.

[536] So when I see the ramifications of ire from the left or the right, what do conservatives do?

[537] I mean, the GOP couldn't even find a yearbook in the Virginia governor race.

[538] I don't think they're considered to be that big of a cultural threat.

[539] They react to things.

[540] They get upset about things that are unfair against them, but they don't go through the streets with clubs and bricks and smash windows like Antifa and other far leftists do.

[541] Well, if they do, they're considered racist.

[542] It's always like some sort of a racist mob.

[543] That's like the label they get put on them, right?

[544] Yeah.

[545] And then, you know.

[546] Like that was the label that was put on the proud boys almost immediately, right?

[547] That they're white supremacists, even though there were people of color

[548] that were amongst the ranks.

[549] And, well, they changed the definition of racist.

[550] Do you know what the whole origin of the proud boys is?

[551] I do, yes.

[552] Because it's kind of fucking hilarious.

[553] Yeah.

[554] I'm sorry.

[555] Go ahead.

[556] Anthony Cumia told the whole story on this show because it happened with him and Gavin McInnes.

[557] That Gavin McInnes came up with it because of a guy that worked there, and they were doing it as a goof.

[558] Yeah.

[559] And then it became a movement.

[560] And then it became like anybody can join.

[561] And the people that joined, they took it into a radical way.

[562] And it became looking to beat up Antifa, and it's just like, fuck.

[563] Well, Gavin's crossed the line.

[564] That's, you know, and I'll point this out: you mentioned Media Matters recently, when you were talking about Alex Jones.

[565] I wouldn't use them as a source for anything.

[566] Well, that's a good point.

[567] But it was just clips of Alex talking.

[568] That doesn't matter.

[569] They, there's a clip going around of Gavin McInnes where you can hear him saying these crazy things, and you're like, well, he said it.

[570] Right.

[571] But it turns out, in some of the clips he was talking about dogs. They, you know, you

[572] really can take the context out of things in these clips.

[573] I understand that.

[574] So what happens if, I'm even afraid in my videos, I don't quote people anymore, because people have taken me reading a quote from the newspaper and attributed it to me simply for reading someone else's quote.

[575] And they say, oh, but he said it.

[576] Right.

[577] I mean, you look at the really, the really funny instance of Count Dankula, the guy with the Nazi pug.

[578] That is a hilarious story.

[579] And they told him context didn't matter.

[580] So when he leaves the courtroom, the reporter says, you said this phrase, which I'm not going to say.

[581] And he says, so did you.

[582] If context doesn't matter, you should be arrested too.

[583] But that's, you know, so what ends up happening is these activist groups, they take these quotes out of context.

[584] But admittedly, I think it's fair to point out.

[585] A lot of people recognize Gavin, whether you assume they were jokes or not, doesn't matter.

[586] He said things that were over the line.

[587] Well, Gavin's the main problem, the indefensible problem was the call to violence.

[588] Right.

[589] The calling for violence.

[590] And I don't think, Gavin's like, he's another one.

[591] He's like a prankster.

[592] A rock-style prankster, and he likes to burn it to the ground.

[593] Yep.

[594] And look, I'm a fan of Gavin's interviews on YouTube, where he hoodwinks people into sitting down and talking to him. I don't even know if they're still up anymore, but my understanding is he ruined some chick's life.

[595] How so?

[596] I mean this like somewhat figuratively.

[597] She was a left wing individual.

[598] She didn't know who he was.

[599] He asked her to come on and her friends basically just disavowed her immediately.

[600] Because she was talking to him?

[601] Because she sounded unintelligent.

[602] Oh, yeah.

[603] I got to tell you, man. Yeah, he cornered her.

[604] I don't know if you have this issue, but for the longest time, it's, it's substantially harder to interview someone on the ideological left than anyone else, right?

[605] So I recently reached out to, you know, I regularly reach out to people.

[606] I'm not going to name drop because I don't want to drag people.

[607] But, man, it takes weeks trying to organize a meeting with some, you know, these personalities who are progressives and on the left, weeks.

[608] Really?

[609] Yep.

[610] Even people who, like, I've gotten messages from people saying, yeah, man, I watch your stuff all the time, but hold on, let me think about it and talk to some people first.

[611] And I'm not saying they're doing it because they're, you know, skittish or it's just harder.

[612] It's a lot harder.

[613] Well, they're probably more cautious, especially if, you know, where you stand ideologically is ambiguous.

[614] It's if they're trying to figure it out.

[615] Oh, yeah.

[616] If you're, you know, man, even David Pakman and Jimmy Dore get dragged through this sometimes.

[617] Yeah.

[618] I see people on Twitter calling them alt -right, weirdly enough or intellectual dark web.

[619] Jimmy Dore is an interesting character.

[620] I really like that guy.

[621] Yeah, I think I...

[622] He's such an interesting guy.

[623] He's very smart, but he's like a...

[624] He's an angry lefty, but, you know, like...

[625] But he defends free speech.

[626] Yeah, that's what I was gonna say.

[627] He's, a hundred percent.

[628] Here's my thing, man. I was in Berkeley.

[629] It was this big protest against Ben Shapiro.

[630] And there's a guy wearing a mask with a, you know, a communist flag, full, like, red gear.

[631] So I go up to interview him and I'm like, you mind if I interview you?

[632] He's like, yeah, yeah, of course.

[633] And I was like, really?

[634] oh, that's surprising.

[635] And then we start talking and I said, how do you feel about these people who dress in all black and, you know, are fighting people and causing problems?

[636] Oh, that's terrible.

[637] And I was like, you think so? Really?

[638] It's actually, I'm surprised because often when I see people wearing, you know, fully masked up with communist stuff, they're typically in favor of the, by any means necessary strategies.

[639] And he was like, no way, man, that's wrong.

[640] And so I'm like, you know what, man, I don't care if you're a communist.

[641] I don't care if you're whatever, as long as you're not an authoritarian who thinks you have the right to beat other people.

[642] to instill your ideology on them.

[643] Yeah.

[644] Or use manipulative force or coercion or extortion.

[645] So let me talk about why I think what we're seeing with Twitter might be one of the biggest problems ever.

[646] Twitter, YouTube, Facebook, these platforms are where we exist.

[647] Socially, politically, it's where our ideas are exchanged.

[648] It's where we learn about who we're going to vote for or why we won't vote for somebody.

[649] When you ban somebody, you exile them.

[650] They're no longer part of that conversation.

[651] So they're very much so told you are outside.

[652] the city walls, right?

[653] You can't come in, you can't talk to us, and there's nothing you can do about it.

[654] But then when you realize the rules are actually bent, you know, they're slanted in a certain direction, you can then predict where things are going.

[655] Did you see the green, the Green New Deal, Alexandria Ocasio -Cortez?

[656] No. So she publishes, it's a non-binding resolution, which means even if it passes, they can't enforce anything.

[657] But my God, the fact sheet they released alongside it literally said they want to provide economic security for people who are unwilling to work, right?

[658] What?

[659] Unwilling.

[660] Really?

[661] I swear to God.

[662] Yeah, yeah.

[663] CNBC, reported this.

[664] They covered this.

[665] Apparently, this got removed from the site.

[666] The reason I bring this up is there's a chart from The Economist.

[667] I frequently show this in the content I make where you can see that conservatives are coalescing around common ideologies.

[668] For a while, there was some upset in the party because people didn't like Trump.

[669] But now they've pretty much, you know, they say it's the party of Trump, and they pretty much

[670] agree with him.

[671] Ted Cruz even stands and gives him a standing ovation at the State of the Union.

[672] But the left has been spreading out.

[673] And again, this is from a chart put together by The Economist.

[674] The Democrats are very clearly being spread from far left to center.

[675] And it's kind of making it very difficult for the Democrats to, you know, put forward something that makes sense.

[676] If Alexandria Ocasio Cortez, you know, she puts out the Green New Deal, but in the bill, it talks about equity, racial justice, the gender pay gap, things that have nothing to do with the environment.

[677] And then Nancy Pelosi says it's green dreams, you know, and she derides this.

[678] You can see that there's a new faction of the Democrats that have, you know, wholly ideological drive.

[679] And I think one of the reasons for this is what we see on social media, right?

[680] The ideological bent of the platforms then leads to the mass followings of specific individuals, who then use specific tactics to get elected.

[681] And it's, you know, when these platforms only allow certain ideas to form, those ideas will naturally rise to the top of our political space.

[682] And then you get crazy stuff, like if you're unwilling to work, we'll provide you economic security, which, you know, I don't know necessarily what that means other than some people who choose not to work will get paid, I guess, from taxpayer money, but.

[683] Well, that seems completely insane.

[684] And not only that, where's that money coming from?

[685] Right.

[686] From you.

[687] This is actually another funny thing that, no, I mean, I know what you mean.

[688] Literally you.

[689] Yeah.

[690] Literally.

[691] Andrew Cuomo said, God forbid, if the rich leave New York, because I believe the top 1% of New York pay for

[692] 46 % of their taxes, of the revenue they use.

[693] And so they just had a big budget shortfall.

[694] I believe it was something Trump did that caused a shortfall.

[695] And they were asked if they would tax the rich.

[696] And he was like, no, God forbid, they'll leave and they're already leaving.

[697] So, you know, incoming: a million people saying I'm conservative for bringing that up.

[698] But, you know, facts are facts.

[699] And that's what's really important about this.

[700] And when you suppress any ideology, if you are on the left and you suppress the right, it is just going to shore up their defenses and they're just going to harden their line.

[701] That's just how it goes.

[702] That's how human nature is.

[703] You can't tell people what to do.

[704] You just can't.

[705] You're right.

[706] And a lot of people might say I'm a little alarmist when I mentioned a potential civil war.

[707] But let me clarify.

[708] Like I'm not saying, because I've brought this up before, I'm not saying it's going to be like, you know, 1800s, two big battlefields.

[709] But at the same time, what people don't seem to realize when it comes to history is that when you read about World War II, we've condensed all the highlights into a very short paragraph or a series of paragraphs.

[710] And you don't realize the war was several years.

[711] There were periods where nothing happened, right?

[712] I was in Egypt during the Second Revolution.

[713] You could look down and you could see Tahrir Square.

[714] People screaming, laser pointers, helicopters, Apaches, and they announce in the news, we've deposed the president two blocks away.

[715] A dude's eating a cheeseburger at McDonald's, watching a football match, as if nothing's happening.

[716] So when you look at these street battles, the political violence, when you look at the biased bannings, you look at the dude, there was a guy who fired a couple rounds at a police officer in Eugene, Oregon, and some bombs got planted at the police department or somebody planted bombs at a statue in Houston.

[717] It starts to feel like there's some kind of political violence that is bubbling up that can't be mended at this point.

[718] And a lot of this comes from this suppression that we're talking about, where people don't feel like they have a voice or that voice is being suppressed by an opposing ideology.

[719] You know, yes, but it is really complicated, and it's, I can't claim to know how everything happens, but what I will say is, I believe social media is responsible for the political violence.

[720] I believe it's, and it's not just about suppression.

[721] It's, you look at the systems that were built, Facebook, right?

[722] What content can make it to the front page of your Facebook profile when you're looking at your news feed?

[723] Well, Facebook has to build an algorithm to determine what matters most.

[724] Companies then figure out how to manipulate that algorithm.

[725] to get that content in front of you because, you know, at most you can see, what, three posts on Facebook?

[726] So what happens is early on, companies quickly found out that anger drives the most shares of any emotion.

[727] All of a sudden, we see a wave of police brutality videos.

[728] Yeah.

[729] There was one website that posted almost exclusively police brutality content, and it was like Alexa 400 in the world.

[730] Some ridiculously high number.

[731] It blew my mind.

[732] I knew someone who claimed they were making six figures writing police brutality articles because it was pure rage bait, right?

[733] Yeah.

[734] Content that just shares really easily.

[735] But that content constantly being put in front of somebody breeds an ideology.

[736] You then tell someone, did you know that, you know, white supremacy is on the rise and there are 11 million white supremacists in the U .S. and they go, I can believe it.

[737] But that's nonsense.

[738] It's just not the case.

[739] You know, the Anti-Defamation League and the SPLC say that rough estimates are maybe like 10 or 12,000.

[740] But people really believe that there is like that the president is secretly a Nazi and that he's being propped up by the secret.

[741] cabal, or there's an alternative influence network on YouTube where you and me are somehow trying to convince people to, you know... it's just ridiculous. Well, that's the AIN thing. Yeah, what was it called? Data & Society, right? Right, yeah. And that's, that was that nonsense. Yeah. What did we get connected to? Are we alt-right adjacent? Are we bootlickers? That one, we're part of the, it's a network that feeds into extremist ideologies. They connected me with people, but it's so schizophrenic the way it's drawn out, the little map where one person's connected to another person.

[742] And what I said to her, I said, Barbara Walters interviewed Fidel Castro.

[743] Did that make her a communist?

[744] That's what I tweeted at her.

[745] I'm like, you're crazy.

[746] This is a crazy way to look at things.

[747] But what happened with that story?

[748] Media reported uncritically.

[749] I reached out to a bunch of journalists.

[750] I know a ton of journalists.

[751] I'm a member of the Online News Association.

[752] Like, I've been a speaker at their events.

[753] And I'm reaching out to these journalists like, hey, why did you guys write that?

[754] That's just completely fake.

[755] It's got my name, like my name in the middle.

[756] Right.

[757] You know me. Right.

[758] You can call me to quote you.

[759] They don't do it.

[760] They just uncritically report it.

[761] And there's a couple of reasons for it.

[762] Facebook recently changed their algorithm.

[763] I don't know.

[764] This was a while ago.

[765] They may have changed it again.

[766] But it was a huge hit to the incomes of a lot of these companies when all of a sudden news articles stopped appearing as much.

[767] Because Facebook wanted friends and family to be more connected and less so news organizations.

[768] So these news organizations who write this viral clickbait and rage content weren't getting as much traffic.

[769] So they have to go crazy.

[770] They have to, you know, and so it's a downward spiral of where these journalists all follow each other.

[771] They start producing, I don't think it's a conspiracy they produce this stuff.

[772] I think they're hired specifically because the content they produce is viral.

[773] And it's viral for a reason, right?

[774] And so the more they produce it, the more they eat their own, you know, excrement, essentially.

[775] And then it's a game of telephone where they're sitting in a circle constantly telling each other the craziest things and it gets crazier and crazier.

[776] But another aspect of it is when they write an article saying, you know, Trump is racist.

[777] It goes viral.

[778] The next day they can't write the same article.

[779] So they write Trump is the most racist.

[780] The next day, they have to keep one -upping it.

[781] And it gets...

[782] And we talked about this with Forbes articles.

[783] The term nasty surprise.

[784] They use it with tech.

[785] Like they'll say, the new Galaxy S -10 has a nasty surprise.

[786] The new iPhone 10 has a nasty surprise.

[787] And they keep saying, it's hilarious.

[788] It's almost like there's a form letter.

[789] And they just take whatever Xbox, stick it in there.

[790] Nasty surprise.

[791] And it's 100 % click.

[792] and it's Forbes. And were you telling me that Forbes is essentially, like, user contributions? Yeah, yeah, I could probably submit an article. They have, like, a network of people. I don't know how you get approved, but there's a lot of articles that just get written about, like, the new video game today. So it's like a clickbait title, yeah, just get some ads: has a nasty surprise. But it's almost like they have a pattern that they've just accepted is going to work. But it's not a conspiracy. It's just like-minded people who are only ever around each other, sharing the same things among each other, believing all the same things. And so you'll notice that certain words emerge specifically among certain groups, you know. Like, the left will use certain words. And then, like, learn to code doesn't appear that much in left-wing rhetoric, but the conservatives and the anti-identitarian types understand what it is. And so the justification for banning someone for saying learn to code, regardless of the context, seems insane. Yeah, that seems insane. It seems like that one in particular is almost indefensible. I mean, not almost, that's indefensible. Absolutely. Like, there have been people who... but let me be fair. There are people on the left who have been banned, absolutely. There were a lot of Venezuelan accounts that were banned, and a lot of people were very critical. I saw Abby Martin was criticizing this, because they accused them of being government actors, because they were pro-Venezuelan-government. But the one thing: there's some Occupy Wall Street activists who absolutely detest me. They lie about me. I do not like them for doing this. They were banned abruptly for literally no reason.

[793] And this is what's more worrisome to me is that no one defended them.

[794] No one defended them because conservatives certainly won't, but neither will the mainstream, you know, ideological left.

[795] These are activists for class issues, for international issues.

[796] They're on the left squarely.

[797] And they were accused, I guess, of being bots or something.

[798] It was just an abrupt purge of like 50 accounts.

[799] And some of them were like independent citizen journalists, just wiped out.

[800] And with no recourse. No recourse. None. So, I mean, at some point you have to realize how important Twitter is when the president is on it. Could you imagine if there was a physical space where everyone was talking, and the president shows up, and everyone keeps yelling at him, and they're all talking? Because you had that lawsuit where they said it was a public forum. Imagine that happens, and then a private individual bars you from hearing what the president has to say. Right. It's a complicated issue. You know, you get a lot of people on the left saying private businesses can do whatever they want.

[801] That blew my mind because the left was usually about not letting massive multinational billion -dollar corporations get away with suppressing speech.

[802] Well, that was another thing that people got pissed at me about Jack Dorsey, rightly so, that he said that it's a human right, to be able to communicate online as a human.

[803] But the fact that he said it, but yet all these people are banned.

[804] So like how, like to take away someone's human right, there should be an egregious example.

[805] I mean, it should be something, like doxing someone, like calling for violence, like trying.

[806] But even then?

[807] But clearly that's not the case if Kathy Griffin's still online.

[808] But hold on.

[809] You can kill a human being and get 25 years.

[810] Right.

[811] Good point.

[812] So you can literally strip someone of their everything and still not be purged permanently.

[813] This was one of the things that Jack and I discussed post podcast.

[814] I said, you know, when we were going back and forth about doing this again, you know, I told him I would really like to see if there's some sort of a path to redemption.

[815] Like, you know, for example, for Milo.

[816] I mean, who's just like, we talked about yesterday about Christian Picciolini, who was a white supremacist, who realized the error of his ways and then became this activist against racism.

[817] And now he gives these TED speeches and he's, you know, accepted by everyone as being this guy who's achieved redemption and really understands the error of his ways.

[818] If Milo's banned for life... Milo's only like 34 years old, right? How old is Milo? Around there? I honestly don't know. I hope I didn't make him older than he is. He'd probably be mad, but whatever it is. Like, who's to say that Milo, you know, three years from now, won't have a change of heart, or, you know, have a fucking acid trip or something that makes him a different person? But if you're banned for life, are we throwing people away? Like, you ever see that tweet from, I'm going to say, I think it was Tyler, the Creator, where he said, how is cyberbullying real?

[819] Just, you know, like, close your eyes.

[820] Go outside, close your eyes.

[821] That's, you know, I'm sorry, man. If you want to ban hate speech, I can understand.

[822] I am no fan of hate speech.

[823] I think it's wrong.

[824] I think you shouldn't, you know, target people for specific characteristics.

[825] We should respect one another.

[826] At the same time, I'm also a human adult who understands.

[827] Sometimes people are mean.

[828] You ever walk, you ever go to, you know, the subway in Los Angeles and some guy starts calling you all the names in the book?

[829] What are you going to do about it?

[830] Nothing.

[831] That's just life.

[832] People are mean sometimes.

[833] If they punch you, they cross the line.

[834] Right.

[835] But on Twitter, you know, Milo wants to say mean things, block.

[836] Mute.

[837] You know what I do?

[838] I press mute.

[839] Yes.

[840] Yeah, mute or block.

[841] I don't even block people.

[842] I block some people.

[843] Yeah, the Milo one was very, very weird.

[844] They were looking for a reason, and here's the other part of it.

[845] You know, what he ultimately got in trouble, air quotes for, was him talking about the positive experiences that he had as a young man being molested.

[846] I think that was after he was already banned, though.

[847] I don't think so.

[848] Well, he might have been.

banned from Twitter already.

[850] But then he got kicked off of YouTube and he left.

[851] I don't think he's kicked off YouTube.

[852] He's not.

[853] I think he's still on YouTube.

[854] Was it Breitbart that he left?

[855] He got fired.

[856] He quit Breitbart.

[857] This is another thing too, though.

[858] You know, I see all these journalists writing all these articles saying like Milo is gone, Milo is whining and he's no more.

[859] I'm like, dude's got like six million followers across his Instagram, his YouTube, and his Facebook.

[860] He posts all the time.

[861] He's not in the public conversation as much as he was before.

[862] Because.

[863] They've censored him.

[864] When you're not on Twitter.

[865] the journalists, who make up a huge core of their verified users, and who apparently, according to a CNN article, are the lifeblood of their platform, aren't talking about you. But here's my point. He's not saying that men should go have sex with younger boys. He was basically saying that it could be a positive experience, because it was for him. Well, I don't know anything about what happened in that capacity. My point was, if I said when I was 13 a 21-year-old girl fondled me, do you think I'd get in trouble if I said it was awesome? No, I bet I wouldn't.

[866] Yeah.

[867] That's weird.

[868] My brother was pointing out, because Law & Order: SVU is basically on 24/7.

[869] It's like 98% of the episodes are only ever about women, never about men being victims.

[870] Sometimes they are.

[871] But he was like, oh, I just realized that.

[872] And I was like, let's decide you.

[873] Well, that's Special Victims Unit.

[874] That's the show.

[875] They have so many versions of Law & Order, but that one in particular, right?

[876] Oh, no, but I mean, it deals with sex crimes.

[877] Yes.

[878] And almost every episode, it rarely ever talks about male...

[879] ...victims.

[880] Right.

[881] Which exist.

[882] You know what I mean?

[883] And so I don't want to get like into a men's rights thing.

[884] But no, I think it's, you know, fairly obvious to a lot of people, like you mentioned.

[885] Yeah.

[886] If you said it, nobody would have cared.

[887] But Milo's gay.

[888] Right.

[889] And so it becomes, you know, it becomes a thing for him.

[890] Yes.

[891] Yeah.

[892] Yeah.

[893] It's some, what, there's, we need some kind of clear guidelines, right, where you can operate inside these guidelines and all's fair.

[894] It's just, I mean, to an extent it is tough. Should comedians be allowed to operate dancing on the line? You know what I mean? Like, obviously, I think so, right? Right. I'll tell you, so I can't tell you this shit, I'll tell you something after this is over that's gonna, you're gonna think it's hilarious, and we'll find out about it in the future. Oh, now everyone's gonna be like, they're keeping secrets. No, no, it's not a secret. It's a comedy secret. It's about something, but, you know, but I'll take this up, you two remind me. But I'll take this opportunity to segue into another point, when it comes to the bias, right? How is it that you can have Jimmy Kimmel, Jimmy Fallon, dress in blackface on their...

[895] When?

[896] CNN made a big list.

[897] I don't know exactly when it happened.

[898] I think Kimmel was on The Man Show.

[899] He dressed like a basketball player.

[900] Oh, Jesus.

[901] And Jimmy Fallon dressed like Chris Rock.

[902] Oh, Jesus.

[903] Sarah Silverman did it.

[904] I think she addressed it, though.

[905] Yes.

[906] Nobody loses their minds.

[907] Nobody loses their minds over that.

[908] Because they're on the left, do you think?

[909] I don't know.

[910] I don't know.

[911] I honestly, I would say to an extent, there's probably some kind of...

[912] of tribal bias.

[913] Well, I think when you're going back to high school yearbooks looking for outrage from 55-year-old people.

[914] You've lost the plot.

[915] You've lost the plot.

[916] Isn't that crazy?

[917] It's fucking insane.

[918] Well, what's crazier is when Kathy Griffin tweets out that the three-pointer hand sign at a Covington basketball game was a Nazi hand gesture.

[919] Like the three-pointer, you know what the three-point sign is?

[920] Oh, this thing?

[921] Yep.

[922] See, I'm not going to do it because the photos are going to fly.

[923] It's just okay.

[924] People send me death threats.

[925] I put a series of them on my Instagram.

[926] When I found out about that, I put Bill Cosby doing it.

[927] Someone found one of me from NewsRadio. Isn't it, aren't they using it, though, for that symbol? I know it's a universal symbol that means a lot of things, but aren't they using it as that symbol? Yes. They're not. They're not. They're not. What are they? A hundred percent it's not. Okay, what were those cops using it for? That's the, that's the, what is it called? It's called the okay game, or the don't-look game, where you put the symbol under your waist, and if someone looks at it, you get to punch them. What? There's a game kids play? No, no, no, no. Those SWAT cops that had it on their legs? No, they were all doing it. There was like four of them doing it in a photograph. And were they holding it up, or were they holding it on their legs? Oh, I don't know. Yeah, no, but I don't, so let's, let's, let's break this down. I can explain this to you. Okay, please do. Donald Trump, when he talks, he makes the okay hand sign. Well, he's, he's pointing, he's making, I mean, okay, if he does it this way, is that okay? Oh, no, I don't know if he does this. No, I don't care what Trump does when he talks. I mean, if he flicks people off, it's probably a bad thing. But so what happens is he starts doing the okay sign.

[928] So a bunch of Trump supporters start doing it too to be like, hey, I'm like Trump, right?

[929] Right.

[930] A 4chan campaign gets started saying, convince everyone this actually means white power.

[931] Right.

[932] It was fake.

[933] The Anti -Defamation League said it was fake.

[934] Yes.

[935] A bunch of journalists said it was real.

[936] That's what I put on my Instagram.

[937] I put all this on my Instagram, including the article where it showed the original thing came from 4chan.

[938] 4chan's fucking hilarious.

[939] Powerful.

[940] It's hilarious how much shit they start.

[941] They started the Flat Earth movement.

[942] Okay, there's the guys.

[943] That shot, I don't, I'd never heard that.

[944] Oh, yeah, yeah, yeah, they started to fuck with people.

[945] Let's look at this photo and see if I can give you, uh, it's gone.

[946] A little, it popped up.

[947] It's a conspiracy.

[948] How do we get it back?

[949] It's that.

[950] What's going on?

[951] The connection to the TV's just, I don't, I have no idea.

[952] No. It's a Hollywood conspiracy.

[953] The button I have is making it go there and it's not going there, so I don't know.

[954] Jesus, Jamie, there's gremlin's in this fucking room.

[955] I would, I would figure it out.

[956] It's hard, allegedly.

[957] I would say this.

[958] We need to see that, though.

[959] 90.

[960] I can show you my laptop.

[961] Yeah, show me in the laptop.

[962] I would be willing to... Okay, there's the image. Take a good look at it, Tim. So these guys, you see how it's on their leg? Yes. That's specifically a game where, when you look at it, they get to punch you. What?

[963] They're not holding the hand sign up. They're not flashing it like you see conservatives do, right? I'm not denying that is a game, but to say that that's what those guys are doing is a bit of a stretch, I believe. Well, what do you think they're doing? What is it?

[964] If 4chan did that to, to mix it up, at some point people would think that that is true, though, and they might start doing that. That's like, no way, no way that's happened. Look, I'll say this. Is it, is it, well, free bleeding. Free bleeding came from 4chan, right? If you don't know what free, let's explain that to people. It's all you. Okay, free bleeding was, 4chan thought it would be hilarious. By the way, Shia LaBeouf was at the fucking Comedy Store last night. I wonder why, because we're always ragging on him. Free bleeding came from 4chan, where they said that, they were promoting this idea that, for women's rights, they would get away from this whole idea of, you have to control your menstrual cycle.

[965] You know, it's empowering to just bleed all over your crotch.

[966] And so women actually started doing it because it actually, if you can fucking, if you can get those ideas out there, a certain number of knuckleheads are going to take it and run with it and think it's real.

[967] Of course.

[968] So you don't think that's possible with the white power, though?

[969] I think it's extremely unlikely.

[970] I think it's possible.

[971] So here's the thing.

[972] Nothing's absolute, right?

[973] Are there some white supremacists who are doing it?

[974] For sure.

[975] But don't you think you're looking at badasses with fucking guns?

[976] They're playing this little silly game.

[977] Yeah.

[978] Really?

[979] I think they're a bunch of bros who are, you know, you ever hang out with like some frat dudes at a college?

[980] They punch each other.

[981] There's a game.

[982] So how does it go again?

[983] You make the okay hand sign.

[984] Okay.

[985] And you hold it under a table or on your legs.

[986] These are adults.

[987] These are guys that are not in college.

[988] They're definitely not below 25 years old.

[989] But what does that mean?

[990] I know 40 -year -old guys who play, you know, Pokemon.

[991] Pokémon Go, you know, Pokémon Go.

[992] Can we fix this fucking thing?

[993] Well, I don't.

[994] Listen, listen.

[995] Okay, but what is their job?

[996] What were they?

[997] They're SWAT team guys?

[998] Is that what they were, arresting a drug dealer?

[999] And what do you think they're doing?

[1000] They're trying to make sure everybody knows that they're flashing an overt white power hand gesture because everyone knows that's what it means.

[1001] Maybe they didn't think everyone knew.

[1002] Well, no, that's just not, it's just not the case.

[1003] The thing, the point is, holding the okay sign up next to you is what's, you know, people say it's the W and the P. Putting it on your leg has always been the punch-em game or whatever.

[1004] I don't know that punch-em game.

[1005] You know that punch-em game?

[1006] You know that punch-em game?

[1007] 100%.

[1008] That's why.

[1009] You put it out and put the okay symbol on your leg or under a table and you say, hey look, and if they look down and see it, they go ah, and then you punch them in the arm.

[1010] Yeah, but I know it's so well that that's why I don't think that's what they're doing.

[1011] Like, me and my friends still play it.

[1012] So then, so the question is, yeah, so it's childish.

[1013] It's immature, look, it's immature.

[1014] Immature, right.

[1015] So, but what are they doing?

[1016] Right?

[1017] Look, listen, man, if you want to, if you want to make assumptions about what you think their intentions were, that's all you.

[1018] I don't have any facts to support that.

[1019] And the only thing I know of is there is a game where you put the okay sign on your leg and then you punch somebody.

[1020] And here are some guys putting the okay sign on their leg.

[1021] What evidence do we have?

[1022] It's anything other than that?

[1023] Nothing.

[1024] So that's, I'm not going to go any further than that.

[1025] I'm going to say, was it poor judgment?

[1026] Oh, hell yeah.

[1027] Maybe, but listen, do you know about what happened in Philly with these Marines who got beat up by Antifa?

[1028] No, I do not.

[1029] So there was a rally put on by some constitutional libertarians.

[1030] I don't know exactly what it was all about.

[1031] Antifa shows up in protests.

[1032] Some Marines apparently are just walking by because there was a Marine event.

[1033] Antifa sees them and yells, are you proud?

[1034] He says, I'm a Marine.

[1035] They said, are you a Proud Boy?

[1036] And he said, you know, I don't know.

[1037] They beat him up.

[1038] They arrested several people, charged with multiple felonies.

[1039] Marines got beat up.

[1040] They didn't know what Proud Boy was.

[1041] So to assume that these guys know anything about what's going on in cultural politics, it's, you know, when you're in the know, and you're on Twitter, when you're reading the news all day, you look at that and say, they knew what they were doing.

[1042] What, these are small, like, what city are these guys from even?

[1043] Do they watch the news all day?

[1044] Do they go on 4chan?

[1045] Do they go on Vox.com and read and know what this is about?

[1046] Okay.

[1047] I appreciate you're looking at this with a broad perspective, but it is entirely possible that they did.

[1048] Sure.

[1049] It's also entirely possible that within their friend group it means you're buying lunch. It could mean a million things. It could mean, right, in the cultural context of 2018, when this happened, yep, that, oh, the okay symbol doesn't even mean white power. It is, it is a tribal sign among anti-intersectionals and Trump supporters. But don't you remember, when, was there, there was a woman that got in trouble for, she was in court and she had it on her arm, she was just standing there like that. That's so insane. So insane. So insane. Because she was basically...

[1050] She had her finger and her thumb like that.

[1051] Yeah.

[1052] And they went wild with it.

[1053] Yeah, she's making a white power symbol.

[1054] So you have to understand when I see that and I see that, you cut me out.

[1055] But there's a difference between someone just moving their hands around and doing this and, you know, making a weird thing on their arm.

[1056] She did, she did full on do it the next day, though.

[1057] Probably on purpose.

[1058] Fuck you.

[1059] Or maybe that's what she does when she puts her arm there.

[1060] Yeah, it's just the okay sign.

[1061] You have to assume she's watching the news then.

[1062] I would.

[1063] She's sitting at the Kavanaugh hearing, but it's entirely possible, though, albeit unlikely, that she was just telling somebody okay.

[1064] Or that she was, she's used to doing that with her arm and she doesn't even think about it.

[1065] You're allowed to make assumptions, right?

[1066] Yeah.

[1067] And operate off assumptions.

[1068] But eventually you start getting off so crazy and, you know, how many assumptions are you going to believe until you're believing the moon landing was fake?

[1069] Let me ask you this, though.

[1070] Don't you think that some people do that and they do it because they're making the symbol for white power?

[1071] Some as in what, 10, 15, 20?

[1072] I don't know.

[1073] I think.

[1074] I don't know about a number.

[1075] I said, listen, nothing's absolute.

[1076] I'm pretty sure there are probably some white supremacists who do that.

[1077] Splinter News, which used to be called Fusion.

[1078] I worked there, full disclosure, claimed it was a white power hand gesture.

[1079] And Cassandra sued over it.

[1080] She ultimately lost because it's like, you know, slander is hard to sue for.

[1081] But she did it because it was a Trump sign, not because it's white power.

[1082] So the people who, or even white supremacists, aren't signifying white power, they're signaling to other Trump supporters too, right?

[1083] It doesn't mean white power.

[1084] Right.

[1085] So just because someone on the left says it means white power, that does not mean it means white power within their group.

[1086] Is that what you're saying?

[1087] Yeah.

[1088] So, like, we all just decide that this means something else.

[1089] Like, my friend...

[1090] It means peace among nations?

[1091] Peace among worlds.

[1092] My friend Steve Rinella talked about this on the podcast, said he got beat up once by his friend, where he grew up in Michigan.

[1093] And in Michigan, as almost like for fun, like, like if I said, hey fucker, like, if I called you, hey fucker, like, as friends, you would laugh and like, what's up, dude?

[1094] You know, it'd be cool.

[1095] So he would give the bird and they would call it the Michigan hello.

[1096] Yep.

[1097] And so the Michigan wave or something like that.

[1098] So as he was driving by, he saw his friend, he went like that.

[1099] Like, if I saw you do that, I'd be like, ah, what's up, Tim?

[1100] There was a...

[1101] But you know what I'm saying, but his friend didn't know this.

[1102] So his friend, he grabbed him, threw him to the ground.

[1103] He goes, you want to fight motherfucker?

[1104] He's like, what are you talking about?

[1105] Like, what's going on?

[1106] He's like, you gave me the bird.

[1107] He's like, oh, Jesus, bro.

[1108] I'm from Michigan.

[1109] Yeah.

[1110] Like, that's, we're having fun.

[1111] Like, that's just, I'm your friend.

[1112] And he was over the guy's house helping him build a greenhouse or something.

[1113] Like, he was doing some work with the guy.

[1114] He still threw him to the ground because he thought that this was... Oh, you got it fixed? No. There's naturally, there's going to be a ton of people, you know, saying, oh, Tim's a bootlicker and all that stuff, you're a white supremacist. Listen, man, like, I'm not. I do not, I'm not a big fan of conspiracy theories. I'm not a big fan of making assumptions about the intentions of other people. If you can prove it, I'm willing to hear it. But people in this country are innocent until proven guilty. What do we have? We have a photo of some cops doing something dumb, right? Do I think it was ill-advised, it was wrong, they shouldn't have done it? Of course. Absolutely shouldn't have done it. Do I think it means they're white supremacists?

[1115] No. They're cops.

[1116] Yeah.

[1117] I'm not good.

[1118] Look, I've, I, I am no fan of police.

[1119] I, you know, I grew up as like a left, far left anarchist skateboarder.

[1120] Cops screwed with me all the time.

[1121] I've had cops kick my door in, guns drawn.

[1122] I was in Chicago and cops pulled me over, me and my buddies, this is all on video, at gunpoint, screaming at us.

[1123] It was the craziest experience I ever had.

[1124] I am no fan.

[1125] But if you don't have evidence, I'm not just going to, so this is the thing about how, how these biases function.

[1126] You get people who, will see all these videos, they'll have these experiences, and they'll immediately assume the worst about these guys.

[1127] I don't know anything about these guys.

[1128] I know they did something dumb, but I don't know why.

[1129] So I can't really go beyond that.

[1130] Other than I believe their official statement was they were playing the game.

[1131] But I could be wrong.

[1132] But to make assumptions about their character or what they believe simply because they made an okay sign on their leg, it's like, you can't convict somebody in a court.

[1133] You know what I mean?

[1134] And I'm a big fan of the presumption of innocence and Blackstone's formulation and how we side on the, we err on the side of protecting the innocent.

[1135] I think you got a good point also in the fact that this is an extremely recent hand gesture that's being associated with white supremacy and clearly came from pranksters.

[1136] And you have to assume these guys are on 4chan or read these websites?

[1137] Like, come on, man. These dudes, they probably go to work all day.

[1138] They talk about football.

[1139] They go home.

[1140] They sit in their lounge chair and have a, you know, have a beer and a slice of pizza or whatever it is they do.

[1141] I don't think these guys, man, you know, people don't know how to separate their own personal bubble from reality.

[1142] They assume, if I know it, you must know it, right? It's actually something Shane Smith told me. He said he doesn't understand, why is it that if he can do it, you can't? Right. And that's, it's like an interesting point that people don't seem to realize. What do you mean by that? Like, he said to me, I can speak French, why can't you? Like, people live in this mindset, right, where they assume, if I know it, everyone knows it. Right. So, so they're going to be like, no, every, I saw an article about that, everyone must know what it is. It's like, no, no, no, dude, there are some people who don't watch TV.

[1143] There are some people who play video games all day.

[1144] There are some people who don't do any of that.

[1145] For all you know, these guys, every day after work, they go to a children's shelter and provide soup, and they don't watch the news at all.

[1146] Like, I don't believe that they actually do.

[1147] I'm just saying, you have no idea what's going on in their lives.

[1148] You're making assumptions about what they know, who they are.

[1149] And I think, you know, I'm a firm believer that we have problems of racism in this country.

[1150] I believe institutional and systemic racism, real problems need to be solved, all that stuff.

[1151] That still doesn't mean you get to just label someone and make assumptions about what they believe, who they are, because of one thing.

[1152] You know, if you made a, if you made a joke, what if, what if they did it because they were ironically doing it?

[1153] Right.

[1154] If you made a joke 10 years ago, am I going to assume you actually believed it?

[1155] Maybe said something silly.

[1156] You know, we had this newscaster in New York who accidentally said Martin Luther.

[1157] Yeah.

[1158] You know, and it happened again, you know.

[1159] Another guy did it.

[1160] And you also, and another thing happened that no one cared about was a guy on CNN said a racial slur for Jewish people in the same way, no one cared about that one.

[1161] You know?

[1162] So there was a CNN anchor.

[1163] What did he say?

[1164] I don't want to say it.

[1165] Okay.

[1166] But he was, he was, he was, he was, the K word.

[1167] Right.

[1168] I think you'd still say it as long as we're not calling anybody.

[1169] And it was the same thing, he quickly, fixed it himself.

[1170] Right.

[1171] And that didn't come up as an issue.

[1172] But the point is, this dude, you know, why, why is he being fired?

[1173] Right.

[1174] Even people came to his defense.

[1175] Are you going to assume nasty things about him?

[1176] Like, are we really getting to the point where we're going to look at a photo?

[1177] We don't know the context.

[1178] We don't know who these people are.

[1179] We're going to be like, right up at the stake.

[1180] Well, there's another issue, too.

[1181] And this is, this is, oh, you're doing it.

[1182] Oh, you're doing it.

[1183] Oh, Jesus.

[1184] That's so crazy.

[1185] I didn't even, I meant to do this.

[1186] But there is another issue that people do accidentally, like, because they're worried about saying something.

[1187] They will say something.

[1188] Right.

[1189] I had a friend, and he was a warm up guy for a television show.

[1190] Do you know how warm up works?

[1191] Like, there's a, like, say, if a sitcom's being filmed, there's a guy who, like, keeps the crowd laughing.

[1192] And he walks around, keeps the crowd going.

[1193] And he had an anxiety attack.

[1194] He had a panic attack.

[1195] And the panic attack was this.

[1196] He was doing the Cosby show.

[1197] And he was, for whatever reason, it got into his head.

[1198] Don't say the N word.

[1199] Don't say it.

[1200] Don't say it.

[1201] And he started sweating.

[1202] And he said he started stammering.

[1203] And he literally couldn't talk.

[1204] He had some anxiety issues.

[1205] And he locked up and literally could, could, barely talk.

[1206] He didn't say it, did he?

[1207] No, he didn't say it, but he was terrified that he was going to say it.

[1208] That's weird.

[1209] But that could be what's going on with someone saying Martin Luther Coon, like, especially the second guy.

[1210] Obviously, with context, obviously the second guy heard the first guy do it.

[1211] And it was, it's a Freudian slip.

[1212] It's in his mind and then, don't say it.

[1213] Don't say it.

[1214] I'll let you in on a funny, funny little secret, though.

[1215] I did it in one of my videos.

[1216] You said that?

[1217] And then I started laughing.

[1218] It was about it.

[1219] It was about it.

[1220] It was about the guy.

[1221] It was about the story.

[1222] And so you said it.

[1223] And I said it and laughed.

[1224] And I just, I edit it.

[1225] I edit my videos.

[1226] I don't do, I don't do live shots.

[1227] You know, I do laugh sometimes.

[1228] But I was just like, I started laughing.

[1229] I just literally just, I started laughing.

[1230] And I'm like, I can't believe it, dude.

[1231] Because the story was about him saying it.

[1232] I purposefully don't say these words on YouTube because I don't want to get the ban hammer.

[1233] Sure.

[1234] So I'm like, this guy accidentally said Martin Luther.

[1235] And then he used a slur.

[1236] And then later when I was reading it, I said it.

[1237] And I started laughing.

[1238] I'm just like, but I understand what, you know, it's not even about saying the slur.

[1239] It's about when you have two words in your head and you accidentally put them together.

[1240] Yes.

[1241] You know, that's what happened to CNN guy.

[1242] Well, sometimes I call someone the wrong name and don't realize I've done it.

[1243] Right.

[1244] Like, Jamie said, you said Jack.

[1245] Oh, I thought I said John.

[1246] Like, we've had that conversation before.

[1247] I literally in my head think I'm saying the right thing, but I'm not.

[1248] I did, you know, I made a huge, huge mistake on one of my videos.

[1249] I deleted the video because of it.

[1250] Because instead of saying Harvey Weinstein, I said Brett.

[1251] Oh, no. And Brett's awesome. Brett's such a cool dude. I'm a big fan. And I published the video, and someone messaged me, and they were like, just want to let you know, you made this big mistake. And I said some pretty awful things, and I felt so bad. I think I've done similar. I was like, oh my God, dude. I was like, that guy is so cool. Like, I'm a big fan of his work. I just was like, I tried editing it, and I'm like, I'm deleting it, hands down. I would, I would rather just remove the video outright than to say a disparaging word against that man. Yeah. And it was an accident. I didn't even know I said it. Yeah, I think I did the exact same thing.

[1252] I did the exact same thing, I believe, on stage once, because I had a bit about Harvey Weinstein.

[1253] And you said Brett?

[1254] And I think I said Brett.

[1255] Oh, dude.

[1256] Yeah, I think I said it on stage.

[1257] It's very unfortunate.

[1258] Yeah.

[1259] Because they're polar opposites.

[1260] Yeah.

[1261] Brett's fantastic.

[1262] No, he's, but it's actually Weinstein.

[1263] Right.

[1264] So that will help.

[1265] Yeah.

[1266] Say Stein instead of Steen.

[1267] For Harvey?

[1268] For Brett.

[1269] For Brett and Eric.

[1270] Yeah.

[1271] Weinstein.

[1272] Yeah.

[1273] Harvey Weinstein.

[1274] Yeah.

[1275] Harvey Weinstein.

[1276] I didn't even know I did it.

[1277] Spelled the same.

[1278] I, I did a light edit, I published it, and then I started getting messages from people, and I was like, no. And then I watched it, and I was like, oh my God, dude.

[1279] Well, it's a problem, you know, like when you're thinking and talking at the same time, there's a bunch of words bouncing around in your head and you're just trying to, and you think you're saying the right thing.

[1280] You know, it's, that's why intent is so critical.

[1281] George Carlin.

[1282] Yeah, and magic words are so fucking dangerous, you know, and that is what I'm going to tell you about later.

[1283] Oh, right on.

[1284] Yeah.

[1285] About a specific comedian that we know and love.

[1286] Well, George Carlin was absolutely amazing.

[1287] Well, not only was he absolutely amazing, but people don't have to, you almost have to have lived during the time where he was getting arrested.

[1288] Oh, yeah, not me. Much like Lenny Bruce before him, but to understand how significant he was, when he was doing that seven dirty words you can't say on television, like back then, people were like, what the fuck is this guy doing?

[1289] You know, it was groundbreaking.

[1290] When he did, I think it was in, what, like '92, he did that bit, well, he probably did it a lot, where he basically ran off 50 racial slurs and then actually called Eddie Murphy and Richard Pryor the N-word. Do you remember that? Oh, yes. And everyone laughed. Yeah, everyone laughed, because they understood his intent. They understood that he didn't actually disrespect them. They understood that he didn't mean anything bad about it. He was making a point about the, the racist asshole behind the words. Yes. And it's funny to me, this is the thing, man. When I was younger, I was, I was far left, man, skin tight black, you know, shirts, The Virus, you know.

[1291] The virus?

[1292] Punk rock bands, Anti-Flag, things like that.

[1293] I was trying to think of, it's been so long.

[1294] I was trying to think of some of the band shirts we used to have.

[1295] And, man, we were angry and pissed off all the time.

[1296] I grew up like that.

[1297] And then over time, I went through a ton of really important life lessons.

[1298] One of the first and most important was I was a young skateboarder in Chicago, really looked up to some of these older guys who were really good.

[1299] I went to Catholic school when I was younger.

[1300] I ended up becoming this punk rocker, guitar playing, far left, skin tight, you know, skateboarding, angry, yeah, and no flags, you know, fuck the government.

[1301] And then I go to this dude's house, who's this really great skateboarder, and he's got a picture of Jesus on the wall.

[1302] And I immediately scoff.

[1303] Like, I'm, you know, high and mighty.

[1304] I was like, what, are you like a Christian or something?

[1305] And he goes, no. And I was like, then why do you have a picture of Jesus?

[1306] And he goes, oh, I just thought like a story about a dude traveling around helping people was kind of cool.

[1307] And I went, oh, that's a good point.

[1308] Wow.

[1309] I was like, I was like, wow, maybe I don't really understand.

[1310] Maybe, maybe this means something different to other people.

[1311] That's what it means to me. Well, if Jesus was an Indian man that, you know, had wooden beads and, you know, and was a Hindu god, we would love him.

[1312] You know, it would be like Shiva or Vishnu.

[1313] We would think he's the most amazing thing ever.

[1314] It's the fact that if you look at what Jesus preached and what he was all about, I mean, it seems pretty all right.

[1315] I think it's about as spiritual...

[1316] ...and loving, and it's about, I mean, his whole ideology, the Jesus of the Bible.

[1317] Yeah.

[1318] And he was essentially about loving your brother and treating people as if they're you.

[1319] But this for me, like I bring this up because it was kind of a formative moment where all of a sudden I realized was my ideology predicated on assumptions?

[1320] Like was I holding these views because other people told me to hold them?

[1321] Right.

[1322] Did I actually understand that there were some positive things on the other side?

[1323] and then I slowly moved over to more of a center -left position.

[1324] And, you know, now what, the reason I bring this up is I looked, I watched that video.

[1325] I tweeted this, the video of George Carlin, because, man, George Carlin was a, I used to watch his videos.

[1326] My mom would put him on.

[1327] And my mom's been a hippie, liberal, far left, all that stuff.

[1328] And you'd probably consider her a conservative by today's, you know, measures, the way things have been going.

[1329] Now they look at, you know, Kevin Hart.

[1330] He said a bad joke 10 years ago.

[1331] Get him out.

[1332] Like, could you imagine, I, God forbid...

[1333] ...what would happen with George Carlin's routines today.

[1334] They would be running all of his old routines, saying, no, you have to ban him from the show.

[1335] He literally called these people the N -word.

[1336] Why was it that George Carlin could go on stage and talk about how Republicans were dumb and how religion is crazy?

[1337] He was clearly on the left his whole life.

[1338] And he said these things that by today's standards would be considered conservative.

[1339] Right.

[1340] And so for me, it's a weird thing to go from being on the far left as a young person.

[1341] It was around like 19 or 20.

[1342] I started to become more moderate, and then to see them today being extremely offended, like people used to be in the 50s and 60s, like, that's regressive. Yeah. That's trying to bring things back to the way they used to be with being offended, you know, Puritanism and all these things. And now I feel like, I guess the, the cliche is the modern, the modern left, whatever people call it, like the, you know, capital-L tribal left, seems to be being indoctrinated not by left-wing policy ideas. It's not about, necessarily, socialism. It's about identitarianism. It's about policy based on your immutable characteristics. And how, you know, like, going back to the Green New Deal, like, in the bill, it talks about racial equity.

[1343] What does that have to do with the environment?

[1344] But what does that mean?

[1345] Well, equality would be, like, equal opportunity.

[1346] Two people are allowed to try and if one succeeds, congratulations.

[1347] Equity would be, well, let's determine whether or not you are advantaged or privileged and then hold you back or push you forward based on these certain metrics, I suppose.

[1348] It's, the problem I have with it is that it's not quantifiable. So this was actually something that was really shocking to me. I was sitting with my niece, uh, and my, my sister, my niece and my mom, and I showed this image that people like to share. It's three people standing by a fence at a baseball game. There's a really short person who can't see, there's a medium-sized person who can see a little bit, and a very tall person who just stands right up above the fence. It says, this is equality: each of them gets one crate. Well, one crate isn't enough for the short guy, and the other two guys can already see.

[1349] It says, this is equity, and it shows the short guy getting, you know, three crates so they can all see now.

[1350] And I said the problem is when it comes to someone's height, sure, we can understand, let's give the crates to the short guy so he can see along with us.

[1351] But how do you determine equity based on the color of someone's skin or their, you know, like characteristics that can't necessarily be quantified, right?

[1352] So when Alexandria Ocasio-Cortez pushes a bill forward that's purporting to be about the environment, but it includes racial equity clauses, are we to assume that her ideology states that if you are not white, you are poor by default, like, it's a guarantee?

[1353] Or do we have to assume that each individual has different advantages, different cards to play, and some are born wealthy, some aren't?

[1354] And yes, there's historic racism, but we can't make those assumptions, right?

[1355] Right.

[1356] So this is, this is to me one of the biggest problems I've been having as, like, a lifelong left-leaning individual.

[1357] Who do I vote for?

[1358] Right.

[1359] I was a big fan of Bernie Sanders for a while.

[1360] But then Bernie Sanders gets up on stage at the debates and says white people don't know what it's like to be poor.

[1361] Well, that's hilarious.

[1362] Go to fucking West Virginia and visit the coal miners.

[1363] Right.

[1364] And, you know, what's really weird, I saw one of Bernie's tweets that I looked at and I said, oh, come on, man, you know this is fucked up.

[1365] He was talking about how much more money white men make than black women, than Latino men, than all these different things.

[1366] And what he didn't include was Asian men.

[1367] Because Asian men make more money.

[1368] But here's what I think a lot of people on the right miss. He said pay equity, not pay equality.

[1369] Okay?

[1370] I think perhaps we should stop assuming they don't know what they're saying.

[1371] Because a lot of people assume what they're saying is, you know, the gender earnings gap is real, but the gender pay gap, you know, it's not.

[1372] If a man and a woman are both offered the exact same job, exact same experience and education, women tend to get, I think, it's like three to five percent less.

[1373] And many people believe that's because they're less likely to negotiate, which is why you have, like, Lean In, telling women to be more, you know, trying to be more assertive.

[1374] But it's not this 77% number. But there is an earnings gap, right?

[1375] The median salaries of men and women are different.

[1376] So when Bernie says white men make X more than these other demographics, he said in his tweet, pay equity, not equality.

[1377] He doesn't want fair pay for everyone based on job.

[1378] He's actually saying it doesn't matter what job you have, everyone should be paid the same. Yeah, that's nonsense. But then you see what Cortez releases on her website: if you're unwilling to work, they'll provide economic security. They actually, I believe they took it off the site, right? But I think, when they include it in the bill that they want equity, not equality, when they include on their website that if you're unwilling, they'll pay you, and when Bernie says equity as well, I think they're not talking about equality. Like, I don't think, you know, the average American understands what they're actually saying is you should be paid a flat rate, period. And when you're talking about the pay gap being different for men and women, we should clarify that what you're saying essentially is that men choose different jobs and they work more hours.

[1379] And that's the reason why they make that much more money where it's 77 cents on the dollar.

[1380] Yeah, there was another study that came out recently that said that hours worked was almost 100% of the reason why men and women earn differently at the median salary.

[1381] In some areas, women actually earn more than men.

[1382] And there's, I think, seven cities, I think this was on Pew.

[1383] Again, you fact-check me if, you know, everybody thinks I'm wrong or whatever.

[1384] But I believe it was seven cities where women out-earned men by, like, seriously high numbers, like up to 20%.

[1385] So there's a lot of issues when it comes to the pay gap and equality.

[1386] But without going on a tangent in that area, I think what ends up happening is, you know, I saw Bernie's tweet.

[1387] And I responded to it by saying, good news, Bernie, pay equality is enshrined in law.

[1388] And I cited three examples of where it's illegal to discriminate based on gender.

[1389] And then someone points out to me, Tim, he didn't say equality.

[1390] He said equity.

[1391] That would imply it doesn't matter what job you have.

[1392] And a petroleum engineer should earn the same as a store clerk at H&M.

[1393] That's equity.

[1394] That just because you have an advantage, because of your education, doesn't mean you should earn more than somebody else.

[1395] You see what I'm saying?

[1396] So the fact that there's people that actually believe that, that don't believe that, first of all, that's going to absolutely discourage

[1397] people from trying to succeed.

[1398] Why would you?

[1399] If you could get the same amount as a CEO of Exxon as you can working at Forever 21 or whatever, Abercrombie and Fitch, why would you, why would you try hard?

[1400] Why would you exceed?

[1401] Why would you succeed?

[1402] Why would you excel?

[1403] I think it's fair to point out some people would.

[1404] I don't know if there's no, if there's no real, I mean, other than social status, if there's no real positive consequences.

[1405] I think social status might be one of the prime driving factors of most people.

[1406] I don't know, man. Not enough.

[1407] Not enough to really encourage innovation and progress.

[1408] I can agree with that for sure.

[1409] But I also think there's a saying that I was told a lot that people don't get rich because they want money.

[1410] They get rich because they're passionate about something and the money comes after.

[1411] They always say that the money comes after or something like that.

[1412] And I'd point out every day, I take no days off.

[1413] It's been a couple of years with me not taking any days off.

[1414] I work literally every day full time.

[1415] Right now I'm producing six YouTube videos every day.

[1416] Only one of them is a real...

[1417] Seven days a week?

[1418] Seven days a week.

[1419] Non -stop for two years straight.

[1420] What the fuck are you doing to yourself?

[1421] I like it.

[1422] It's fun.

[1423] And that's a thing.

[1424] I'm not doing it because there's a light at the end of the tunnel.

[1425] I do it because, you know, I see things on, uh, so for the most part, on my main YouTube channel, I do one video every day at 4 p.m., which tends to be just like a news analysis piece, but I'm not perfect.

[1426] Sometimes I, you know, get all hyperbolic and stuff.

[1427] My second channel is me just ranting and, you know, not really swearing, but still just like heavy opinion stuff.

[1428] I do it because it's fun.

[1429] You know, I see something.

[1430] I want to explore it.

[1431] I used to travel all the time.

[1432] You know, when I worked at Vice, I was in all these different countries and all these dangerous places, not because I wanted to have a name for myself, not because I wanted to make money.

[1433] I wanted to watch a revolution.

[1434] I wanted to know why it was happening and I want to talk to the people who are experiencing it.

[1435] So I can relate to people who say money isn't a motivator for sure.

[1436] But I've also talked to people in Scandinavia who have told me they sort of give up at a certain point because I can't remember which country it was.

[1437] it may have been Sweden or Norway, but these two women told me that after like $77,000 per year, they tax like 80% of your income, so people just stop.

[1438] They literally just stop.

[1439] Well, that's another proposition, right?

[1440] This is something else that's been discussed in terms of anyone who makes more than X amount per year taxing them over 70%.

[1441] I agree with the progressive tax wholeheartedly.

[1442] I disagree with a number that high.

[1443] What do you think it should be?

[1444] I don't know, but I will say we need more progressive tax

[1445] brackets.

[1446] We need to keep going.

[1447] And, you know, I got to say maybe, maybe at $10 million, 70% does make sense, but I kind of lean towards not really, because it seems like, man, that's a lot of money.

[1448] That's a lot.

[1449] That's, that's, that's more, that's, that's ridiculous amount.

[1450] You know, I think Steve Bannon said something like a five in front of it or something, but I don't know.

[1451] I'm not an economist, but I do believe a progressive tax makes the most sense.

[1452] And I can explain it to you if you want to hear it.

[1453] Sure.

[1454] So there was a study, I, I believe it was from Harvard.

[1455] You need $77,000 per year.

[1456] This may have been 10 years ago. That's in order to be middle class, the median, in the United States.

[1457] That means if you make $77,000, you'll have vacation, you'll have insurance, you'll have a car, you can raise a family, you can send them to school, all that stuff.

[1458] But you have nothing left over for savings, you have nothing left over for investments.

[1459] If you make $100,000 a year, you're going to have $23,000 left over for investing.

[1460] Eventually, at a certain point, if you only need $77,000, if you're making $10 million, you've got over $9 million that you can invest and just be independently wealthy and be rich forever.

[1461] Now, I have no problem with being wealthy.

[1462] I have no problem with other people being wealthy and living off of their investments and all that.

[1463] But there is a point where you have to realize the coalescing of power, the monopolizing of power is a really dangerous thing for any society.

[1464] Too few individuals holding too much power can destabilize an economy, can destabilize a country.

[1465] The problem with communism, you snap your fingers and you put a centralized authority in place, at least that's how it's been every single time.

[1466] And then they hold all the cards and they can oppress whoever they want.

[1467] The problem with laissez-faire capitalism is over time, which is why it's better in a lot of ways.

[1468] Over time, it eventually becomes a centralized oligopoly of a few corporations controlling everything, which we're kind of seeing now.

[1469] So all the progressive tax can really do is slow that process down, which I think is a good thing.

[1470] But ultimately, I think just looking at the system, eventually you end up kind of where we are, where six media companies control everything.

[1471] And then, you know, some companies are the biggest funders of certain politicians and corporations, they just have too much power.

[1472] I mean, there's a, I remember reading a report or a story about how wealthy people have like three or four times more ability to influence a politician than like the majority of the people in the country simply because paying for expensive dinners and lobbying earns you favors.

[1473] You know, super PACs paying, you know, guaranteeing funding for a politician earns your favors.

[1474] So it's, you know, look, if a million people tell me they want, you know, X, but the people who are paying me, like funding my campaign are paying me more, well, who am I going to provide favors to?

[1475] And then once I'm done with my campaign run, I can go to a job at their company.

[1476] Yeah, these are problems.

[1477] So without going on too big a rant, I think ultimately a progressive tax can help slow the process down of special interests acquiring too much power.

[1478] Eventually it happens anyway.

[1479] But with a flat tax, you're basically saying, at a certain point, you can just keep dumping more and more money into different investments, making more and more money and increasing your power exponentially and other people can't catch up to you.

[1480] And then power becomes concentrated too

[1481] quick, right?

[1482] Yeah, I think in this country, we try to look at success and achievement as something that everyone's striving for, and we don't want to put any restrictions on that.

[1483] We look at capitalism as the reason why everything's going so great over here.

[1484] This is America, land of the free, home of the brave, go out there and kick ass.

[1485] We're not going to saddle you down with debt.

[1486] But it makes sense that after a while, as we're seeing today. But what I mean is, what is the best way to do it?

[1487] I mean, socialism is not going to work.

[1488] What does work?

[1489] I think a mixed economy.

[1490] A mixed economy.

[1491] Where we are right now, right?

[1492] A portion of income is paid in taxes for government programs for the, for defense and things like that.

[1493] I just think we have a big problem with corruption.

[1494] I think we've got bloat.

[1495] I think we've got government agencies that, instead of reforming and breaking them down, we just slap more band-aids on top.

[1496] We got a festering wound and we're putting bandages over bandages.

[1497] You know, it's like a certain point.

[1498] You got to redo it.

[1499] We also have systems that are in place that, I mean, in terms of like the way communities have always existed, in certain communities there's just poverty and crime, and no one does anything to fix it, right? And it seems to be that we're more than willing to go to other countries and nation-build. We're more than willing to pump money into different countries, especially if they have natural resources. But in our own country, we're not. I mean, the greatest resource, of course, is human beings, and the best way to make America greater or stronger would be to have less losers.

[1500] Well, what's the best way to have less losers to have more people succeed?

[1501] What's the best way to have more people succeed?

[1502] Give them more opportunity and chance to not be stuck in a quagmire, not be trapped in a ghetto.

[1503] This is, yeah, so I believe we should allocate excess from other areas to improve the, you know, certain areas.

[1504] In that sense, I believe in socialism to a certain extent.

[1505] Like, I believe in it with fire departments.

[1506] I believe in it with the police department.

[1507] I believe we should spend money that comes out of, you know, the public pool to fix things.

[1508] Right.

[1509] I look at New York.

[1510] There are some neighborhoods that are really bad, some neighborhoods that are really good.

[1511] Well, if we take excess from the really great neighborhoods and use that to fix roads and pay for schools in poor neighborhoods. One of the biggest correlations for crime is poverty.

[1512] Yes.

[1513] So if we can get better schools, we need to reform the school system straight up.

[1514] If we can get better hospitals, if we can fix the roads, then we're doing a lot to reduce crime and reduce poverty and a rising tide lifts all ships.

[1515] Yes.

[1516] So that's why, you know, that's why I like Bernie Sanders, although I will say, I make sure I tell people, like, he is a little too left for me. He is.

[1517] But when we were looking at who we had in 2016, I was like, yeah, Bernie's my guy, you know.

[1518] I like some parts of him.

[1519] I liked him culturally.

[1520] I like that, you know, for the same reason in some ways that I liked Obama culturally.

[1521] I mean, I don't like the drone attacks.

[1522] I don't like the attacks on whistleblowers.

[1523] There's a lot of aspects of it that I found very disturbing and distasteful and against the narrative of what we think of who he is.

[1524] We think of him as this extremely articulate, very well read, educated guy and, you know, a good figurehead in terms of like who the president is.

[1525] Right, right.

[1526] And he was drone bombing foreign countries.

[1527] His administration was claiming that military-aged men were enemy combatants, you know, so one of the, one of the biggest.

[1528] Military aged men, period.

[1529] Right, right.

[1530] And so it's like a dude carrying water pails gets blown up and they're like, he's a combatant, you know.

[1531] Right.

[1532] He's 18.

[1533] And so, you know, when it comes to Trump, Trump, what did he do, missile strike in Syria, weapons deal with Saudi Arabia, commando raids in Yemen, a little girl got killed.

[1534] And I've been, for the most part, very, very critical of him and any other administration who engages in regime change, foreign wars.

[1535] And look, I got to say, you know, when it comes to domestic issues, I'm not, that's not my wheelhouse.

[1536] When it comes to foreign policy stuff, only a little bit.

[1537] Like I've been to some countries, I've experienced this stuff, but really, you know, on-the-ground cultural stuff between, you know, people is more my thing.

[1538] But I know more about foreign policy than domestic.

[1539] And when I see Trump's foreign policy, I was very critical of it, but I will point out withdrawing from Syria.

[1540] I'm a fan.

[1541] I understand a lot more than probably the average person does about some of the issues surrounding Syria, Russia, the Qatar-Turkey pipeline and things.

[1542] But typically, I think it's usually a bad idea when the U.S. involves itself in foreign interests and tries engaging in these regime-change strategies to build allies.

[1543] But one of the things that really blew my mind is I saw a survey, at least it was going around on social.

[1544] media, so who knows if it's true, that claimed the Democrats have a favorable opinion of George W. Bush, you know, something to that effect.

[1545] And they're, in retrospect?

[1546] Like, yeah, like today.

[1547] They polled him.

[1548] They're like, oh, he was good.

[1549] And there's a video of him, like, giving a piece of candy to Michelle Obama.

[1550] And everyone was like, it was going viral and people were laughing about him.

[1551] I'm like, are you nuts?

[1552] They hold hands.

[1553] That's crazy.

[1554] That guy's awful.

[1555] I don't like that guy.

[1556] But what, what happened?

[1557] And then when Trump announces he's going to pull troops out of Syria, everybody opposes it, like the media saying it's wrong.

[1558] And you've got a lot of, like, mainstream people on the left saying it's wrong, and I'm like, it is? What do you mean? Like, that's crazy to me. Like, you know, all of my activist friends, we've never been in favor of that stuff. We've always opposed it. That's always been the left's position. And now I'm seeing people who claim to be on the left support multinational businesses, as private businesses, doing whatever they want, meaning like social media corporations. Yeah. Like, like, dude, you know, Saudi, I think it's a Saudi prince, is one of the biggest investors in Twitter. Really? Yeah, yeah. And so it's like, there have been a series of people who have gotten letters that they violated Pakistani law.

[1559] I mean, they send screenshots, you know, screenshots can be faked, but there have been a couple people who have been like, for some reason, Twitter decided to inform me of this.

[1560] And then it's a multinational, let's talk about the algorithmic apocalypse.

[1561] Let's segue into that.

[1562] You've got a platform where our public discourse is happening, where the left has repeatedly said that Russians used it to manipulate our elections, where one of its biggest investors is a Saudi prince, or something to that effect.

[1563] And they're banning people of a certain, who oppose a certain ideological bent.

[1564] Like that sounds like a democratic crisis, right?

[1565] If, if this is where the public sphere is, if you, you know, you said Milo's no longer in the conversation, you know, he's banned from Twitter, even though he's got millions of followers.

[1566] Right.

[1567] They don't talk about him anymore.

[1568] He's off Twitter.

[1569] Twitter really is important.

[1570] The president is there.

[1571] So if you start removing people, you've got foreign interests who have a stake in what Twitter is doing, yeah, they can seriously influence our elections.

[1572] Yeah.

[1573] And, and they, they are.

[1574] But I'll move into what's really crazy.

[1575] New York Times reported there's a group that false flagged the Republicans in Alabama with fake Twitter accounts they made to convince the media Russians were propping up the campaign of Roy Moore.

[1576] So, basically, according to the New York Times, is all fact.

[1577] They've seen the documents.

[1578] They've reported it that Democratic operatives smeared, engaged in a false flag campaign to make it appear like the Russians were

[1579] propping up Republicans, and the national media in the U.S. ran with it.

[1580] How that's not a crime is beyond me. That's interfering in elections, and we know it.

[1581] And this group is still being cited.

[1582] They're smearing Tulsi Gabbard, right?

[1583] Yeah.

[1584] An NBC News article came out saying that Russians, you know, have taken notice of her campaign, are promoting it.

[1585] Same group.

[1586] Still running the story.

[1587] Damn.

[1588] How that's not a crime is mind -blowing to me. But the New York Times reported it.

[1589] So it's at this point, it's like, I mean, that's - Well, I think we have to be aware that there's so much manipulation going on right now from almost every angle.

[1590] Oh, yeah.

[1591] So, I mean, and what's hilarious is that people look at what's going on with Russian troll factories and, you know, the way they're trying to influence our elections, that it's particularly egregious.

[1592] Yeah.

[1593] Where we do that shit.

[1594] Oh, totally.

[1595] Plus.

[1596] And I'm pretty sure we learned about the U .S. doing it well before we learned Russia was doing it.

[1597] And I'd be willing to bet the U .S. started it.

[1598] It's just the whole thing is so, you can't pay attention to all of it.

[1599] That's part of the problem.

[1600] I mean, maybe you can if this is what you're doing seven days a week.

[1601] And non-stop.

[1602] And even you will probably struggle to keep up.

[1603] But like me, I can't.

[1604] And it's one of the reasons why I brought you in here.

[1605] When you and I had that conversation, one of the first things I realized right away, I was like, okay, I could have you coach me. Like, we talked about this.

[1606] But I'm not going to, I'm like, I'm not going to get it at all.

[1607] I'm going to have to, like, study it for a long time.

[1608] And I don't have the resources to do that.

[1609] This is part of the problem that I had with that one podcast with Jack.

[1610] I just don't have the resources.

[1611] Well, yes.

[1612] So I think it'd be fair to point out, you know, like, YouTube criticism too, because in talking about censorship, I think a lot of people immediately assumed I'd come on here and start waving my arms and screaming they're biased against conservatives, which I think, to an extent, I kind of did.

[1613] But YouTube is a bit different.

[1614] YouTube does.

[1615] It has demonetized LGBT content.

[1616] And YouTube has said that these topics are not suitable for all advertisers because it deals with sexuality.

[1617] They have targeted many left-wing channels.

[1618] there are a lot of non-mainstream left-wing outlets.

[1619] And when you say target, what you mean is demonetize.

[1620] Demonetize or sometimes outright ban.

[1621] And I think for demonetizing, I think they've made a statement.

[1622] What was the policy?

[1623] It's essentially things that are political, correct?

[1624] If they made that statement, I don't know. I was going to interject that they've also banned, or not banned, but demonetized people, extensively people that smoke weed on their channels.

[1625] Oh, totally.

[1626] There's certain video game channels that have, for unknown reasons, just stopped having monetization on their channel, too.

[1627] Right.

[1628] And that is the problem of ultimately them having this incredible power, where they really, there's not real open competition in terms of like another, like a parallel competitor.

[1629] I don't think there will be.

[1630] Really?

[1631] But here's the thing.

[1632] I think YouTube's done a bunch of really bad things.

[1633] I'm going to give a very important shout out to Mumkey Jones, who was wrongly

terminated from YouTube for highly dubious reasons.

[1635] He is a dark comic.

[1636] He had hundreds of thousands of followers.

[1637] He made jokes about things like school shootings.

[1638] Very dark stuff.

[1639] But it was clearly mocking some of these people.

[1640] He was mocking Elliot Rodger.

[1641] He's making jokes about it.

[1642] And in fact, some of his videos were approved manually for monetization.

[1643] But for some reason, YouTube just wiped them out.

[1644] One day, gone.

[1645] So he set up a new channel and said, okay, you know, we're not going to do that anymore.

[1646] It doesn't matter.

[1647] They got rid of him.

[1648] He's effectively off of YouTube.

[1649] And he was like, he's a well -known funny guy.

[1650] He wasn't breaking the rules.

[1651] He wasn't, but they still deleted his channel.

[1652] So I bring him up because I think it's worrisome that, yes, without an alternative, your career is wiped out in a second with no recourse and no reason why.

[1653] And the response they give you is, it's our platform.

[1654] Yeah.

[1655] And you'll hear people say, oh, but they're a private business.

[1656] They can do what they want.

[1657] Sure, but they're a monopoly.

[1658] We've got to have restrictions on that.

[1659] Yeah.

[1660] Yeah.

[1661] Sorry, I have a question to ask.

[1662] here because I've had this conversation with some friends of mine and this came up and like this is I guess the devil's advocate to this question because this happened with MySpace.

[1663] MySpace still exists and there are people that had millions of followers on that platform.

[1664] Are they owed something by MySpace because MySpace failed and their accounts no longer have the clout that they once had?

[1665] No. You know, like, so if, for instance, like, if YouTube failed tomorrow, could PewDiePie sue them because they've made bad business decisions and now his business is...

[1666] That's a different issue.

[1667] Totally different.

[1668] I know, but like, in the, I see what you're saying.

[1669] In the Twitter account, like, they're not banning someone's IP address from using Twitter.com and going to slash realDonaldTrump and seeing what he's saying.

[1670] Right.

[1671] But listen, MySpace failed because of alternatives.

[1672] Facebook became more prominent.

[1673] There was an option.

[1674] And the other issue is, Mumkey Jones followed the rules.

[1675] He was, he was told by YouTube, this is what you can and can't do.

[1676] And he said, you got it.

[1677] And then one day they just wiped him out.

[1678] And so.

[1679] And they never gave any explanation.

[1680] I don't believe so, and it was really weird, because, and it's been a little while since I went over the story, but there was a video he posted that was like a music video making fun of Elliot Rodger, he's that, that mass shooter guy. It existed on different channels, on other, like, other prominent channels it existed, and they banned him. And then he brought up, like, why was it banned on mine, but this one was approved? And then, like, a day later, they banned the other one, they copyright struck it. The issue here is, if there was an alternative to YouTube where you can operate, I'd be on it, right? So one of the things, I'll give a shout out.

[1681] I've been using Minds, M-I-N-D-S dot com.

[1682] Yeah, I'm actually having Bill, the guy who owns it.

[1683] How do you say his last name?

[1684] Ottman.

[1685] Ottman.

[1686] Bill Ottman's coming on soon.

[1687] I made a concerted effort after that podcast to reach out to a bunch of different people and try to expand this conversation.

[1688] I want to have the guy from Gab too.

[1689] Gab's in murky territory.

[1690] You know, a gab has been...

[1691] Murky in terms of content, but not really in terms of ideology.

[1692] I don't think so.

[1693] Really?

[1694] So I did it.

[1695] I researched

this a little bit.

[1697] If you look at their Wikipedia page, it's simple sourcing.

[1698] There was a study done on Gab that found, I think, only like 5% of the posts are considered to be hate speech, whereas Twitter is like 2.3% or something.

[1699] So GAB is predominantly not.

[1700] Hate speech, not much more than Twitter.

[1701] And when you consider that we're looking at percentages, you can actually see that Twitter's hate speech is in the millions, and GAB is in the tens of thousands.

[1702] But Gab is run through the mud by the media nonstop, you know.

[1703] Yes.

[1704] Man, I think when it comes to YouTube, I actually trust them.

[1705] I do.

[1706] I really do.

[1707] I think the reason they took down Mumkey is because the potential for a PR backlash over his kind of content was so great.

[1708] They said, we don't care.

[1709] And that's unfair and it's wrong.

[1710] I think they were wrong to do it.

[1711] But you understand the motivation.

[1712] I understand the motivation.

[1713] And I actually think YouTube does more to protect free speech on the Internet than a lot of these other companies do.

[1714] And I'll give you a few examples.

[1715] One good example is, first of all, when you're demonetized, you still earn YouTube Red revenue.

[1716] So YouTube still will, like, they're trying to pay you.

[1717] They want the money.

[1718] But with Sargon of Akkad getting banned from Patreon, Patreon banned him because they said, you used the naughty word on YouTube eight months ago.

[1719] YouTube doesn't care, he said the word, right?

[1720] That pod where he called the alt-right the white N-word.

[1721] YouTube didn't care.

[1722] YouTube said, it's fine on our platform.

[1723] That's fine.

[1724] Patreon.

[1725] Especially the context.

[1726] Right.

[1727] And the context of which it was used.

[1728] It just takes 30 seconds listening to it.

[1729] Oh, I see what he's doing.

[1730] Even if it's kind of a clunky use of the word, he's not using it in a racist way.

[1731] He was trying to, you know, actually this is a...

[1732] He was trying to show how stupid they are.

[1733] But can we talk about the bias on Twitter with Sarah Jeong, who for three years was posting anti-white racist, like, mean-spirited, awful things?

[1734] And the excuse was she was using the language of her oppressors.

[1735] Yeah.

[1736] Well, what do you think Sargon was doing?

[1737] Yes.

[1738] He was using their language against them.

[1739] Exactly.

[1740] They hired at the New York Times.

[1741] Yes.

[1742] That's terrifying.

[1743] And there was a huge backlash.

[1744] You know, and I want to take this opportunity, too, to point out media matters.

[1745] There was, when that AIN thing went around, the Alternative Influence Network, it was lying.

[1746] Like, Media Matters wrote about it.

[1747] I politely reached out to the person who did it.

[1748] And their response was, I hope a bird poops on your head and it gets in your eyebrow and you smell like farts.

[1749] That was, what?

[1750] That's the response they gave me. What?

[1751] Yeah.

[1752] I said, just want to give you a heads up.

[1753] There's a lot that's wrong in this report.

[1754] It's not true.

[1755] I don't hold these beliefs.

[1756] I've never read this individual, and that was the response.

[1757] The literal response.

[1758] Was, I hope, a bird poops on you, and it falls into your eyebrow, and you smell like dirty farts or something.

[1759] Then when I, when I, I, so yeah, I mean, I got a hold of their fucking typewriter, or keyboard, rather.

[1760] And, you know, the guy who started it was at Politicon, and he said up on stage that fake news is predominantly, or he said it's only, it's only a phenomenon of

[1761] the right.

[1762] The left doesn't engage in conspiracies in fake news, which is nonsense.

[1763] Of course they do.

[1764] And I, afterwards, I said, let me ask you then.

[1765] Because one of your writers responded to me in this way.

[1766] And he's like, I don't know, I don't know anything about it.

[1767] And I can't respond.

[1768] And then he's like, you know, walked off.

[1769] And I was like, you know, I want to circle back to Gab.

[1770] Because I get what you're saying that the volume is lower and that there's less.

[1771] Well, it's, I read.

[1772] Right.

[1773] But what do you, what is your opinion of it, though?

[1774] You were, you kind, we kind of got sidetracked.

[1775] Oh, it's a, it's, it's.

[1776] It's, it's.

[1777] It's.

[1778] I. It's.

[1779] It's.

[1780] I don't know.

[1781] I can agree with the general notion, the ideology of if it's legal, it's allowed.

[1782] Right.

[1783] And I would argue that, you know, what Jack is trying to do with Twitter, or, I shouldn't say him personally, but what Twitter is trying to do is, what, create like a comfy, padded neon room for the kids to hang out in?

[1784] No, the real world is harsh.

[1785] Yeah, I think what they're trying to do, and I don't want to speak for him, but I think they're trying to engineer the conversation to be more polite and civil.

[1786] Says who?

[1787] Exactly.

[1788] You know, whose definition?

[1789] That's a good point.

[1790] And that's, and that's, I see so many of these people who just, they wield power.

[1791] and they're unaccountable.

[1792] Well, it's also, you don't recognize the consequences of telling people what they can and can't do.

[1793] And this is a very slippery slope.

[1794] You're running up a greased hill.

[1795] Yeah.

[1796] And people don't like it.

[1797] They don't like it.

[1798] And, well, the thing is, like, when you see something like, if everything was just open, what would the conversation be like?

[1799] If there was no banning, if there was nothing, it was just everything, real free speech, you would have, I mean, if there was.

[1800] if it was impossible, let's put it this way, if it was impossible to ban someone from any social media platform, whether it's YouTube or Twitter or Instagram, what would the conversations look like?

[1801] How much different would they be?

[1802] And would we maybe have a healthier way of adjusting?

[1803] I think it would be worse.

[1804] Worse.

[1805] Because I could be, but temporary or in the long run?

[1806] Maybe temporary, but I believe it's called the online disinhibition effect, basically.

[1807] The opportunity to be anonymous online, and the distance, means people have no problem being their worst self. Yeah. And so I can understand what Twitter wants to do. They say, hey, we need to figure out how we stop people from being mean all the time, right? Because Twitter is a hellscape of just mean people saying nothing but mean things. But it's also, again, because they're anonymous and because it's possible. But then, look, Alex Jones can say a bunch of really awful things. It's his right to do so. Should he be banned? He shouldn't. If so, does Alex Jones, I'm not, I don't want to directly accuse him because I don't watch his show.

[1808] But if he goes on Twitter and he says something that's deemed to be false, should he be banned for that?

[1809] No. If he, if he challenges a journalist, should he be banned for that?

[1810] No. Well, let me tell you, because he says things about me that aren't true.

[1811] I don't think he should be banned.

[1812] And he's saying the things about me and I don't think he should be banned.

[1813] I think, Alex, I think we should talk.

[1814] We should have a conversation.

[1815] I watched those videos he's made.

[1816] He's upset.

[1817] He's upset, and I get it in certain ways.

[1818] I get it, and he thinks I took a shot at him, and I get it, and I probably did, and I shouldn't have.

[1819] So here's the thing.

[1820] But my point is, I don't think he should be banned.

[1821] Right.

[1822] And it's, he's doing it at me. Yeah, yeah.

[1823] Like I mentioned, those people from Occupy who lie about me all the time, they've posted the most ridiculous lies about me. I still don't think they should be banned.

[1824] Right.

[1825] The thing is, if you ban them and then someone opposes them, but then someone opposes the people who oppose them and they want them banned, And then you have this fucking war back and forth.

[1826] And instead of fighting bad ideas or incorrect ideas with correct or good ideas, now you just have people pressing ban hammers left and right.

[1827] And you're just trying to figure out who the majority is so you can side with the biggest group.

[1828] And you're trying to virtue signal and you're trying to get something that supports your ideology, whether it's right or left.

[1829] So here's the thing with Jones, though.

[1830] If the people on the left want to argue that he is making the platform worse and horrible, I understand that.

[1831] and I recognize, well, that's unfortunate, right?

[1832] This is the real world, and sometimes people say things you don't like.

[1833] But more importantly, a lot of people argued that when he said, you know, he said something about Sandy Hook, which, again, I haven't seen the videos, but they've, they've, he's been sued.

[1834] They said that he said Sandy Hook never happened.

[1835] So, so what?

[1836] Is fake news to be banned?

[1837] Well, many people were saying, yes, Facebook needs to ban fake news, but think about what that means.

[1838] It means you're not allowed to be wrong, okay?

[1839] Because fake news doesn't mean you did it on purpose.

[1840] More importantly, you're not allowed to be stupid.

[1841] Well, here's another thing.

[1842] in defense of Alex, one thing that Alex did do is, later on, after he was done saying the things that he said about Sandy Hook, he then said it definitely happened.

[1843] So he corrected his course.

[1844] And so is there a path for redemption when you correct your course?

[1845] And if there's not, what are we doing?

[1846] Because we're not treating people like human beings then, whether it's Milo or whether it's anybody.

[1847] I mean, I'm sorry to cut you off again.

[1848] But the path to redemption is so fucking critical.

[1849] Yes.

[1850] So let me tell you, I went to Sweden.

[1851] People were claiming things were happening in Sweden a couple years ago.

[1852] I decided to go check it out.

[1853] For some reason, a bunch of people believe that I pushed some far-right-wing conspiracy when I actually...

[1854] Physically pushed him?

[1855] What was it?

[1856] You pushed a conspiracy theory or a person?

[1857] Theory.

[1858] So basically, you had Donald Trump say last night in Sweden.

[1859] As soon as he says this, the media goes wild.

[1860] My friend and I, my friend Emily, who works with me on and off set, we were like, we should just go to Sweden and just like walk around film stuff and make a video about what we experience.

[1861] We decided to go.

[1862] Almost immediately, it's reported, we found nothing.

[1863] We found the neighborhoods were actually very nice, substantially nicer than Chicago, but we did find there was an increase in crime.

[1864] It was very tepid.

[1865] For some reason, now people are claiming that, like, I'm pushing conspiracies or something.

[1866] I was going to go somewhere with it, but I kind of lost my train of thought.

[1867] Oh, okay, here we go.

[1868] When I was there, Joey Salads, who's a big YouTuber, reached out to me on Twitter, saying, hey, I'm here too.

[1869] Let's meet up.

[1870] I got really angry.

[1871] Because Joey Salads made a video, I think a couple years ago, where he staged a bunch of black guys destroying a car to make it seem like these neighborhoods, you know, black people were going to destroy a Trump car and it was like, it was just very racist.

[1872] It was awful, horrible.

[1873] Yeah, he's a big YouTuber.

[1874] And so he reached out to me and I got really pissed like, dude, you're fake news.

[1875] You're a racist.

[1876] And I was like, and I started cussing at him.

[1877] I was like, and that's not normal for me. But afterwards, kind of realized something.

[1878] Joey made a mistake.

[1879] he reached out to me. It was polite.

[1880] He was honest.

[1881] I think he did something really, really bad.

[1882] If I tell him, if I cuss him out, if I just be mean to him, what's he going to do?

[1883] If the only people who will accept him, if the only people who are going to be nice to him, are the actual racists, well, then he's going to go to the racists.

[1884] He needs a path to redemption.

[1885] So I apologized.

[1886] And I said, that was wrong.

[1887] I shouldn't have done that.

[1888] I should be willing to hear you out.

[1889] And I should give you an opportunity to better yourself.

[1890] Otherwise, you won't.

[1891] And then I met up with him and talked to him.

[1892] And I think he made a huge mistake.

[1893] I think what he did was.

[1894] wrong.

[1895] I think it was self -motivated.

[1896] I think it was money -motivated.

[1897] But I think none of that matters.

[1898] All that matters is you tell them, you do this one more time, you're out, you're out with the wolves.

[1899] But if you agree to do the right thing and you're willing and you're sincere, then, okay.

[1900] Good for you.

[1901] Good for you.

[1902] I think that's got to be a part of the conversation.

[1903] Yeah.

[1904] I mean, we, I think about this idea of just banning people for life, but we're letting people out of jail after they commit murder, and they can reenter society.

[1905] I mean, it still is hard for them for a long time, like an ex-con.

[1906] Sure, it's not easy, but we're saying there's a path to redemption.

[1907] But what ends up happening is they create parallel economies.

[1908] They create parallel networks, and that causes more division, more anxiety.

[1909] Like you see these alternative social networks emerge because people get banned, so they all move to one place.

[1910] Which is back to Gab.

[1911] Yeah.

[1912] Yeah.

[1913] So, but, you know, I think Gab is mostly, I got to be honest, I don't surf Gab.

[1914] I've read what the people have said about it.

[1915] But at this point, I don't trust a lot of the media.

[1916] I don't.

[1917] And that's not unique for me. Most Americans have unfavorable views of the media.

[1918] You mean Gab or you mean everything?

[1919] What they've said about Gab and the - Oh, everything, everything.

[1920] Right.

[1921] But your opinion of Gab, I mean, so you don't have like a clear opinion of it.

[1922] My opinion of Gab is that they do allow, there's a lot more people with extreme views simply because it's legal.

[1923] Right.

[1924] So they go there naturally.

[1925] Right.

[1926] But based on what I've read about it from like actual reports, it's not substantially worse than Twitter, but it does allow them.

[1927] That's a big difference, right?

[1928] Yes.

[1929] You can go on Twitter.

[1930] You can get away with it for a while.

[1931] Gab, you just go on Gab and say a bunch of crazy stuff.

[1932] Right.

[1933] Unless you, you know, break the law.

[1934] There have been accusations leveled against Gab that they've actually dragged their feet on getting rid of illegal things, like calls for violence, you know?

[1935] Mm -hmm.

[1936] But for the most part.

[1937] There are some things that you can get kicked off for, like, doxing and things on those lines.

[1938] Yeah.

[1939] So it's not the Wild West, but it's wilder.

[1940] But you know what the media, these people, and when I say the media, I specifically mean these digital New York-based outlets for the most part.

[1941] They like to say things like, you know, that guy who shot up a synagogue.

[1942] Yes.

[1943] Horrible.

[1944] Just disgusting.

[1945] But they say Gab user.

[1946] So -and -so, a Gab user.

[1947] He was a Twitter user.

[1948] He was on Twitter.

[1949] He drank water.

[1950] Yeah.

[1951] You know who else drank water?

[1952] Hitler.

[1953] Hitler.

[1954] Yep.

[1955] Boom.

[1956] Yeah.

[1957] There it is.

[1958] But it's crazy.

[1959] It's like, come on man. You know, but I will recognize right now that at this point, Gab is, it's, yeah, if you're going to go on there, the media is going to accuse you of every single name in the book.

[1960] Oh, sure.

[1961] You got to, you know.

[1962] I've had conversations with people on here, well, they'll just immediately start throwing out these descriptions of what Gab is and what Gab stands for. And like, okay, I mean, I had it with Bari Weiss.

[1963] I was like, I was gonna say, and some politicians Yeah.

[1964] That was, uh, oof, that's all I'll just say, oof.

[1965] Which part?

[1966] Oof, the Tulsi Gabbard toady thing.

[1967] I, I, I, you know, I, did you see the, did you see what, um, uh, yeah, Jimmy did about it?

[1968] I was laughing the whole time.

[1969] Yeah, Jimmy Dore.

[1970] Jimmy Dore, I like Jimmy Dore.

[1971] He's fucking.

[1972] No, he's fucking great.

[1973] He's great.

[1974] And the way he did it in front of a live crowd.

[1975] Yeah, yeah.

[1976] And, but here's the thing.

[1977] I got to say, like, I'm sorry to Bari.

[1978] It really did sound like you didn't even know what the word meant.

[1979] Yeah.

[1980] Because if you say, what does that word mean?

[1981] I'll tell you what I think it means.

[1982] I won't ask someone to check the feed first.

[1983] Right.

[1984] You know, so it's like, and she also, she's also been dragged because she did a story where she used a satire tweet from Antifa or something.

[1985] I don't, I don't know necessarily.

[1986] So she's had her share.

[1987] But, you know.

[1988] Well, I've been busted doing that, too.

[1989] She's...

[1990] I got caught with an Antifa, a fake Antifa post on a wall that was like...

[1991] Oh, really?

[1992] It was a troll.

[1993] Somebody trolled it.

[1994] But, you know, that's what's so crazy is that things are so fucking blurry.

[1995] It's hard to see the troll sometimes.

[1996] Oh, yeah.

[1997] And so therein lies the big problem.

[1998] I mean, my brother might be one of the most notorious.

[1999] Notorious is the wrong word because he, like, kind of hides in the shadows.

[2000] but he's done some... My brother cloned, uh, well, I can't remember which website he cloned. He cloned a website, right? Actually, the Yes Men recently did this with the Washington Post, copied it perfectly. And he wrote an article about how a new strain of marijuana was discovered, Cannabis australius. He said that he was the dean at the University of Sydney or something like that, who was dating pop star Meghan Trainor, just the most ridiculous thing. And he said, we need to find a female plant, so far is what we have, it's groundbreaking.

[2001] So he did this very clever thing.

[2002] He bought a domain name that was something like com dash guest.

[2003] Info.

[2004] That's the URL.

[2005] He then created a subdomain.

[2006] So it's like, you know, just hypothetically, CNN.com dash guest, the average person just sees CNN.com.

[2007] Right.

[2008] They assume it's real.
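A minimal sketch of why the trick works, using Python's standard library. The URL here is hypothetical, modeled on the "CNN.com dash guest" example above, and the two-label split at the end is a simplification of real registrable-domain rules:

```python
from urllib.parse import urlparse

# Hypothetical illustration of the subdomain trick described above: the
# owner registers "com-guest.info" and serves the clone from a subdomain
# named after the real site, so the address *starts with* "cnn.com".
spoof = "https://cnn.com-guest.info/breaking-story"

host = urlparse(spoof).hostname
print(host)                        # cnn.com-guest.info

# A skimming reader sees the familiar prefix...
print(host.startswith("cnn.com"))  # True

# ...but the domain actually registered (and controlling the page) is made
# of the LAST labels, not the first ones. (Real registrable-domain logic
# uses the Public Suffix List; the two-label split here is a simplification.)
labels = host.split(".")
registered = ".".join(labels[-2:])
print(registered)                  # com-guest.info
```

The whole effect rests on readers parsing hostnames left to right, while DNS ownership runs right to left.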

[2009] I don't know how he shared it, but he makes this whole, it's so ridiculous, man. I got to say, if you thought it was real, I got a bridge to sell you. High Times picked it up.

[2010] It got 50,000.

[2011] shares. Breakthrough new strain of marijuana discovered. My brother is just laughing the whole time, and I was like, what did you do? Like, my brother's like my opposite, you know. I try to be honest to the best of my ability. I mean, you know, people accuse me of being a liar. That's not fair. I could be wrong, for sure, but I try my best to be rational. My brother, on the other hand, is, like, editing videos, making them as ridiculous as possible. He made another video where, and it's crazy because he tries to make sure people... it's over the top. He made a video where it's... It's a van getting pulled over.

[2012] It looks like a police dash cam.

[2013] And then the cop walks up to the car, checks the guy's license, walks back, and then the driver releases pot on a balloon.

[2014] And then an arrow points to it saying stash.

[2015] Like, he's disposing of his drugs with a helium balloon, so they fly away.

[2016] And then the cop runs up firing guns at it.

[2017] This thing got hundreds of millions of views.

[2018] It was on Facebook every other day.

[2019] It was like, he made it four years ago or something.

[2020] And I saw it on Instagram, like, two weeks ago.

[2021] And I was like, dude, and I showed him, people think it's real.

[2022] Like, it's the craziest... But, you know... Yeah, I don't know. So much fucked-up stuff that is real, it's so hard to differentiate. But, I mean, hold on, let's throw some shit at High Times. Like, when they did this, when they corrected the article saying it was a hoax, they said, we wanted to call for a verification, but we thought the story was too hot to pass up. That's a really good example. But that's a really good example of Covington. It's a really good example of what these news organizations do. But it's High Times. They were probably barbecued out of their fucking minds when they wrote that.

[2023] And I say, like, they're not, they're not the bastions of great journalism.

[2024] We don't hold High Times on a pedestal next to the New York Times.

[2025] Not to be disrespectful.

[2026] Right.

[2027] You know, I respect their publication.

[2028] But the New York Times is the paper of record.

[2029] Yes.

[2030] At the same time, if it's your job and you're writing stories on a specialty, like, come on, man. Yeah.

[2031] But look, this is, this is a, there's two big things that affect media that I think are worrisome.

[2032] Covington exemplifies really well that people just wrote.

[2033] They didn't do any research at all.

[2034] Right.

[2035] And it was all wrong.

[2036] Then even watch the videos, because you can see in the first video.

[2037] Well, not only that, but what's even more egregious is they did this several days afterwards.

[2038] And they were still, like the Bill Maher thing.

[2039] Yeah, yeah, yeah.

[2040] It's very unfortunate.

[2041] But then the other.

[2042] It's all avoidable.

[2043] If you're going to discuss something and you're going to do it in a public forum like that, and you know about it in advance, this is not like you're on a podcast ad -libbing and you say something and you misspeak.

[2044] This is something that's planned.

[2045] The other thing that worries me greatly is what we see with.

[2046] Learn to Code, right?

[2047] So this NBC reporter is, he goes on this big Twitter thread about how Twitter needs to take action against these harassment campaigns and they refuse to do it.

[2048] The next day, he writes an article citing an activist about how a far-right campaign is sending death threats to journalists and Twitter isn't doing anything about it.

[2049] And after that, he starts posting about how he's getting death threats.

[2050] A day after that, Twitter announces they'll take action.

[2051] So what do we see here?

[2052] This guy called for action, couldn't get it done, wrote an article slanting it as a far-right campaign against journalists, Twitter decides to take action, now people are getting banned for tweeting Learn to Code.

[2053] How many people do you think are making these critical decisions?

[2054] I think it's a handful.

[2055] I mean, I can't imagine Twitter's got a massive staff.

[2056] What?

[2057] What?

[2058] You think they have a huge staff?

[2059] What, a couple hundred?

[2060] There's a hundred offices and roughly 3,500 employees at those.

[2061] Oh, okay, so way more than I thought.

[2062] Yeah, that's what I was 30.

[2063] But how many of those people are operating in the bubble?

[2064] They're in that ideological bubble.

[2065] But it's not just that.

[2066] You know, you ban Kathy Griffin.

[2067] She's a celebrity with millions of followers.

[2068] They don't care about Alex Jones.

[2069] Alex Jones sells supplements, right?

[2070] It's complicated.

[2071] I mean, I can't really understand why.

[2072] He's, well, he's more polarizing, but he's also very famous.

[2073] Oh, come on.

[2074] Kathy Griffin showed a photo of Donald Trump with his, you know.

[2075] With his head cut off, which is even more insane.

[2076] You want to know something really crazy.

[2077] The guy who wrote all the articles demanding, I don't want to say demanding.

[2078] That's maybe unfair.

[2079] Oliver Darcy goes on CNN and says it was media pressure that got Jones banned. He's the guy who repeatedly wrote about it. He was the guy Jones confronted about it. Oliver Darcy interviewed me in 2016 about a video I made where I said it's worrisome that Twitter is banning people for their political ideologies. Oliver Darcy, I don't know if he interviewed me, but he wrote an article about my video, and this is a journalist at Business Insider who says, oh, it's a really interesting thing, Tim, let me write about your take on what's happening. And then two years later he's talking about how his media pressure got Jones banned.

[2080] It's like a weird 180, you know?

[2081] Wow.

[2082] Yeah.

[2083] So I'm not trying to be disrespectful to him because, you know, I know him somewhat in passing, but I will say there's a certain point where I think it's unfair to accuse a journalist of advocating for something simply for covering it.

[2084] But then he went on CNN and said it was media pressure that got him banned.

[2085] And I'm kind of like, okay, you have to realize at that point, you were the one who led that charge, you know?

[2086] Right.

[2087] And, you know, I don't care if you like him or not, banning him for For dubious reasons, just creates huge problems.

[2088] It shows your bias.

[2089] But just saying it that way, too, it's like when Hunter S. Thompson spread the rumor about Ed Muskie being on Ibogaine, and then he went on the Dick Cavett show.

[2090] It's a hilarious clip.

[2091] And he goes, well, actually, there was a rumor.

[2092] It was a rumor about him being on Ibogaine.

[2093] And I know, because I started the rumor.

[2094] Oh, right on.

[2095] You know, I mean, that's exactly.

[2096] That's kind of the same thing.

[2097] Yeah.

[2098] It's like, yeah, you.

[2099] You caused it.

[2100] Yeah.

[2101] Yeah, yeah, yeah.

[2102] You spread the rumor.

[2103] But, you know, there's another thing, too, is Patreon banned Lauren Southern.

[2104] I think it was like a year and a half, two years ago.

[2105] And when she got banned, I don't know if you know what happened, but she went in the boat in the Mediterranean.

[2106] Yeah, what was that?

[2107] She did something with the migrants.

[2108] She did a protest action with, I believe it was Generation Identity, which are, I don't know how to describe them because, you know, people like to throw labels around, but they're, like, European nationalists, and people have called them white nationalists. But again, I don't know enough about their group, so I think it's fair to say that that might be the case. Forgive me for being ignorant, for the most part, of their ideology. I know people are going to tear me apart. She gets in a boat. They go up to one of these migrant vessels that, they say it's a search and rescue vehicle, but that's been a point of debate, and she, like, waves a flare in the air. And then she says on the stream, like, get in front of them, get in front of them, but I believe she never did. Jack Conte banned her, and they were like, what you were doing may have caused loss of life, you're banned. A lot of people then started to point out that there's a website called It's Going Down, and this is, you know, considered to be a far-left extremist site. One of the articles was teaching people, or advocating for, pouring concrete on train tracks to disrupt, derail these trains, and can cause loss of life. So, so I saw this and I thought, this is really fascinating. I'd like to know why Patreon banned Southern and why they don't take action in this regard. I reached out to

[2109] I tweeted at him.

[2110] He said he'd call me on the phone.

[2111] And I said, you know, I'd like to understand your decision making how this came to be.

[2112] What brought you to the attention of Lauren?

[2113] What about this?

[2114] He ended up banning it's going down.

[2115] And then they wrote an article titled Tim Pool and the Alt-Right Get It's Going Down Banned From Patreon, or something.

[2116] And that's been cited. Jimmy Dore had it on his show.

[2117] And a bunch of people were like, just, well, you know, Tim Pool's not alt-right.

[2118] And I said, listen, man, I didn't advocate for them to be banned.

[2119] I don't want anyone banned.

[2120] Right.

[2121] I just wanted to know what their decision -making process was, and this was the thing that was going viral among people on Twitter who were, you know, asking about this.

[2122] And so now I get accused of campaigning to get them banned, just like, you know, Oliver Darcy was with Jones.

[2123] But, uh, well, and then immediately they slapped that distinction on you.

[2124] As soon as, I mean, just calling someone alt-right today, it's so strange how, you know, I don't know if you know, but alt in the world of stand-up comedy used to be progressive, liberal, like weird coffee shop type rooms.

[2125] It was alt comedy.

[2126] They don't even use that term anymore because it's so toxic.

[2127] And I try to avoid saying, we used to say alternative media all the time.

[2128] And this has happened over the course of just a couple of years.

[2129] I mean, the thing is shifted, and it's moving and evolving and morphing so quickly.

[2130] You want to know what's really crazy?

[2131] For years on my Wikipedia page, it claimed that I invented a zeppelin.

[2132] A zeppelin?

[2133] Like a blimp?

[2134] Yes.

[2135] Yeah, it's funny, right?

[2136] Well, Wikipedia said I was Brian Callan's brother for a decade.

[2137] Here's the thing.

[2138] When I went to Sweden, I specifically stated, let me, let me back up.

[2139] Paul Joseph Watson of Infowars put out a call saying, I challenge any journalist to spend a weekend in Malmo and I'll cover your costs.

[2140] Everybody's bombarding him, saying, oh, pay me, pay me. And he's kind of just ignoring it.

[2141] People are threatening to sue him.

[2142] You better pay, you promised.

[2143] I had already set up a GoFundMe for the project before, I believe it was before he announced.

[2144] I made a video about it saying Donald Trump said X, Y, and Z, we're going to go do the story.

[2145] When I saw he made this call, I think it was actually Emily who noticed it, I said, hey, I'll do it.

[2146] And he was like, to be honest, I think he said something like I was just taking the piss.

[2147] But yeah, sure, I'll send you a donation.

[2148] And I laughed.

[2149] And I was like, hey, man, I'll take it, you know, if he wants to throw money at my GoFundMe. Here's what ends up happening is people then claim I was, I went there because Paul Joseph Watson challenged me. Not true.

[2150] He donated about 9% of our total fund that we raised, and I was already planning on going there.

[2151] Wikipedia, there was a challenge on my Wikipedia page where someone said, you wrote, Tim Pool went there because Paul Joseph Watson challenged him to.

[2152] That's not true.

[2153] What's your proof?

[2154] This YouTube video from Tim Pool where he says, we've already arranged this, we are not going here because of Paul Joseph Watson.

[2155] The response, that's not a reliable source.

[2156] Someone came back with a reliable source.

[2157] You know what it was?

[2158] a Huffington Post article that quoted my YouTube video.

[2159] How is that?

[2160] I don't understand.

[2161] Why couldn't you just take my word for it?

[2162] Why did you have to get Huffington Post to just quote me?

[2163] That was apparently...

[2164] So, you know, that's enough, I guess.

[2165] So the reason I bring this up is because what happens then if you're a conservative and a bunch of friends who work for various news organizations all at the same time write 10 articles saying Joe Rogan is alt-right.

[2166] Now on Wikipedia, 10 articles pop up immediately saying, this is a fact.

[2167] Ten different organizations have written it.

[2168] And there it is.

[2169] In your page.

[2170] And the crazy thing is, the UK does this all the time.

[2171] You know, they call various personalities alt-right.

[2172] They call Sargon.

[2173] They call Dankula.

[2174] Just go to Wikipedia and look up the phrase.

[2175] It means white nationalist, neo-Confederate.

[2176] It's like literally about a white ethnostate.

[2177] These people have denounced this.

[2178] And, you know, it's like with Sargon of Akkad. It was a really fascinating phenomenon on Patreon, where all of a sudden these left-wing outlets said Sargon was banned for going after the alt-right, and I'm like, but hold on, you've written in the past that he was alt-right. Well, when you say, you, like, the media is not one giant... These various organizations, same organizations, but different authors. Well, I don't want to, I don't want to, okay, but not the same journalists. I will lean towards, I believe, yes, because it's, it's... But so, so whenever it's convenient, they just throw, throw that... There are some people who write... The problem is, I mean, if I name these people, they're going to point their pens in my direction, and then all these things are going to swing at me. And the other thing, too, is I don't want to, I don't want to brigade them. It's not important anyway. What's really important is the actual reality of how it was done. There are... I really want to name this organization. It's one of the prominent, well-known ones, but they repeatedly write stories that are just so over the top, I roll my eyes. And I'm like... I will say the Huffington Post, you know, is another organization that wrote, like, oh God, what was it, some, this guy wrote about Nazis on Steam.

[2179] Oh, Steam.

[2180] Like, that Steam as a Nazi problem.

[2181] And very clearly, like, anybody was like, the video game?

[2182] Right, right.

[2183] This is what we use.

[2184] Because apparently he was referencing a very small, like, here's what happens.

[2185] Okay, I don't know too much about this.

[2186] I know there was criticism in the journalism space around what he did.

[2187] But it's like you find three guys on Twitter who are saying something.

[2188] They quote tweet them and say, boom, fact.

[2189] Like you find one guy who says, you know, I just plain don't like this group.

[2190] And they'll be like, oh, Nazis don't like this group.

[2191] a big story about it because I found one tweet.

[2192] You know what I mean?

[2193] Yeah.

[2194] So apparently, you know, this guy in the Huffington Post wrote a story about how people on Gab were presenting really disgusting recipes for food.

[2195] And I'm just, I'm wondering, what's newsworthy about a random Gab user's mostaccioli recipe?

[2196] Oh, I saw that.

[2197] It was hilarious.

[2198] But then again.

[2199] Is that a comedy article?

[2200] Yeah, it's an opinion piece.

[2201] But it's just calling people assholes.

[2202] But the guy who wrote that is like their senior editor, a senior writer who travels and covers news on the ground.

[2203] Right.

[2204] But hold on, isn't that funny or isn't it interesting?

[2205] I mean, it's if it's just some shitty recipes.

[2206] Like, the only thing that's odd about it is that it's a gab shitty recipe.

[2207] Because it was just a shitty recipe online.

[2208] But that's my question then.

[2209] If it was on Medium, which, by the way, we can't leave without talking about Jeff Bezos.

[2210] Oh, all right.

[2211] Well, we can, I don't know.

[2212] We have to get into that.

[2213] Let's talk about it.

[2214] But if it was on Medium, if someone said, like, look at these dummies with their terrible fucking recipes, it would still be almost as interesting.

[2215] but it's flavored more by you're allowed to mock them because it's gab.

[2216] Sure.

[2217] But it's almost like...

[2218] They're uneducated.

[2219] You know, Vox ran an article claiming that people who hold alt-right views are like 11 million or some huge number.

[2220] And that's just like, it's absurd.

[2221] It's not 11 million American...

[2222] Well, you defined alt-right as being some sort of white supremacist.

[2223] I will quote the Associated Press.

[2224] The AP guidelines are that alt-right means white nationalist, a designation.

[2225] Can I ask you this: who defined that? How does it... why are they... Because Richard Spencer is the man who popularized the term. Oh, he did? Yeah. So he's not the one who coined it, but he popularized it, and he is a white nationalist, right? So, I mean, you know, if you want to be a part of his movement, there's certain things that are attached to it, and other alt-right people have written huge things about what it is. So the AP said, these are our guidelines, and I'll defer to the Associated Press. I have a lot of respect for them. There you go. You know, so if these news organizations... there was a, will limit weekly, this is... God, man, I just really... I worked for Vice. I'm actually one of the key reasons Vice News exists, and I look back on it and it makes me kind of sad how they've written some of the most just, like, ridiculous articles. I'm really proud of a lot of stuff they've done, but I quit. I quit when I got an offer from Fusion, and Fusion was an ABC-Univision joint venture. When I started there, they said, we won't be partisan. For some reason, they decided to go far left and start pushing a lot of things that I thought were wrong.

[2226] They told me to, in effect, lie.

[2227] Right.

[2228] So the thing I bring up is I just have such disdain for these news organizations and how they use definitions that suit their needs to get the clicks they want.

[2229] You're alt-right today.

[2230] You're not alt-right tomorrow.

[2231] Right.

[2232] The click thing is an issue, right?

[2233] How much of an issue is that journalists are essentially fighting for their lives because newspaper is almost dead.

[2234] It's online publications.

[2235] They're trying to get subscriptions.

[2236] Like I subscribe to several different news online publications that used to be newspapers.

[2237] But the last time I picked up an actual newspaper... it's been so long that I felt like I had a joke about reading something in the paper and, like, turning the page.

[2238] I almost felt like I'm a liar for doing the joke about turning the page of a paper.

[2239] Right.

[2240] I don't remember the last time I fucking did that.

[2241] Everything I read is either on a tablet or on a laptop or...

[2242] Do you know what the Gell-Mann amnesia effect is?

[2243] No. When you're an expert in MMA, you're like one of the foremost experts.

[2244] Have you ever read a news article about MMA that was so wrong?

[2245] Okay, so that same newspaper, you're reading it.

[2246] You see that story and you laugh how wrong it could be.

[2247] You turn the page and there's a story about Syria and you go, I didn't know that.

[2248] Oh, right, right, right.

[2249] Why would you forget how wrong they were?

[2250] But the reason I bring this up is the analogy is that you turn the page.

[2251] Yes.

[2252] Nobody does.

[2253] You click the link.

[2254] Right, right, right.

[2255] But let's talk about clicks.

[2256] One of the things we talked about the other day is traffic assignment.

[2257] So I know you asked me they're fighting for their lives.

[2258] It's a serious issue, but I'd just like to point out their lives never existed in the media space.

[2259] Yeah, please explain that because that was one of the most illuminating aspects of our conversation on the phone.

[2260] Yeah.

[2261] It's publicly known, but not talked about a whole lot, that these media organizations, mostly these digital news startups, don't actually get a lot of views.

[2262] So what they do is, it's called Traffic Assignment.

[2263] There's a company called ComScore that tracks the viewership, the unique views these sites have.

[2264] If you're trying to attract investment and you say, we get 20 million views per month, they're going to say, that's cool, but this site gets 60.

[2265] What do they do?

[2266] Well, there are some sites, this is according to Variety, modernfarmer.com.

[2267] What is that?

[2268] I have no idea.

[2269] I've never heard of it.

[2270] But there are many sites, which you've probably seen, where it's like, the top 25 celebrities who, you know, mess up their makeup. Yes. You click the page and it'll show you a photo, and in order to see the next photo, you've got to click the next page. That way they turn you, one person, into 25 unique views, or 25 views, I don't want to say unique. Then a company like Vice, for instance, will buy the assignment of your traffic and attribute it to themselves. So when the ComScore numbers come out, it will say all of those views from those clickbait sites are actually Vice, right?
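The inflation math described above, one visitor turned into 25 pageviews and partner traffic lumped into one ComScore number, can be sketched roughly like this. The figures are the hypothetical ones from the conversation, and the function names are purely illustrative:

```python
# Rough sketch of the pageview inflation described above.
# All numbers are hypothetical, taken from the conversation.

def slideshow_views(visitors, pages_per_slideshow):
    """A 25-page slideshow turns each visitor into 25 pageviews:
    one view is counted every time the reader clicks 'next'."""
    return visitors * pages_per_slideshow

def reported_network_views(own_views, assigned_views):
    """Traffic assignment: an outlet reports its own views plus the
    views of partner clickbait sites as a single network number."""
    return own_views + sum(assigned_views)

# One real visitor registers as 25 views.
print(slideshow_views(1, 25))  # 25

# A 20M-view outlet "becomes" a 60M-view network by buying the
# assignment of two 20M-view partners.
print(reported_network_views(20_000_000, [20_000_000, 20_000_000]))  # 60000000
```

The point of the sketch is that nothing in the reported total distinguishes a real reader from a slideshow click or a partner site's traffic.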

[2271] That's fucking crazy.

[2272] And so there was a controversy a bit.

[2273] Again, I'm quoting Variety here.

[2274] I don't want to get sued, but Variety said that their traffic went down 17 percent because someone they were buying traffic assignment from was like going through turmoil and being shaken up.

[2275] And another one of their traffic assignment partners switched to, I think, like, you know, got sold to NBC or something.

[2276] So what ends up happening?

[2277] Well, I can say a little bit.

[2278] There was a company that was a prominent digital news outlet.

[2279] I knew someone there who was decently high up who told me our company is contemplating whether or not we should engage in traffic assignments to inflate our numbers.

[2280] And I said, don't do it.

[2281] Like, that's wrong.

[2282] And they said, but we need investment.

[2283] So I wonder, is that fraud?

[2284] Yes.

[2285] But if ComScore is just lumping the numbers together.

[2286] And I go to you and say, according to ComScore, our network brings in 60 million.

[2287] I didn't lie.

[2288] That's all true.

[2289] Yes, that's true.

[2290] So here's what happens.

[2291] These companies get massive investment.

[2292] They don't actually generate enough clicks or enough money.

[2293] Then once the investment runs out, those jobs never existed.

[2294] Those were padded by investors.

[2295] So, God, that's squirrely.

[2296] Everything collapses.

[2297] It seems like fraud.

[2298] I mean, It does to me, yeah.

[2299] That seems like if that was, if you were doing that with some tech stock.

[2300] Yep.

[2301] Yeah, right?

[2302] Yep.

[2303] If you were, like, Theranos or something like that, you know?

[2304] I was told by another individual who was at one of these digital companies that he felt like what we were seeing was akin to the securities problem.

[2305] The mortgage -backed securities from 2008.

[2306] That's a better analogy.

[2307] Right.

[2308] That's what he said.

[2309] He said, think about this.

[2310] You've got all of these big companies, these big investment, hundreds of millions of dollars, $200 million invested into these digital media outlets because they're seeing these numbers, but underneath there's nothing there.

[2311] That is crazy.

[2312] So if you're investing money, say if you've got some cash, you've worked your ass off and you've generated a lot of money, and you're like, look, we're going to get into the digital space.

[2313] We have a website that has 90 million clicks, and we're going to take that, and then you find out you just got fucking hosed.

[2314] How is that not fraud?

[2315] Because buyer beware.

[2316] Well, because there's laws that legalize fraud.

[2317] You know, here's a thing, right?

[2318] But is it a law?

[2319] Is it the correct way?

[2320] It's an interesting loophole.

[2321] It's not fraud.

[2322] Right.

[2323] Because they didn't lie to you.

[2324] They didn't deceive you.

[2325] You had every opportunity to look at those numbers and see where they came from to understand what their network was.

[2326] Okay, well, that may be the case, but the act of doing it, and the fact that you can do it, and really you're getting modern...

[2327] Seems shady.

[2328] It's fucking crazy shady.

[2329] But I'll tell you what, something else too.

[2330] And I say this with the utmost respect for Shane Smith, who, you know, I worked at Vice.

[2331] We've gone out for drinks.

[2332] They flew me out.

[2333] I love Shane.

[2334] He's cool dude.

[2335] But he's brilliant, absolutely brilliant.

[2336] He is a master of, I don't know what you'd call it, but it's a form.

[2337] You know, so I grew up with a bunch of hacker buddies in a small little hacker community, and social engineering is something that I've been, you know, relatively well-versed in.

[2338] And Shane, whether he knows it or not, really, really understands how people think and how to get them to do things.

[2339] So I'll give you a fascinating example.

[2340] I left Vice in 2014.

[2341] And after I left, some of the people I had brought on through recommendation were still there.

[2342] This buddy of mine says, dude, good news.

[2343] I'm going to be helping produce the news program for the cable channel.

[2344] I was like, wow, congratulations.

[2345] Does that mean you're moving to Toronto?

[2346] He goes, why would I move to Toronto?

[2347] And I was like, to work on the cable channel.

[2348] This was back in 2014.

[2349] He goes, what are you talking about?

[2350] We're getting a cable channel.

[2352] I'm going to work on the news show.

[2353] And I was like, dude, it was a Rogers deal.

[2354] The cable channel's in Canada.

[2355] It wasn't until a couple years later that they got the US-based channel.

[2356] But what happened was a bunch of my friends who worked at Vice didn't know the cable channel they got was based in Canada, but they believed they were going to be working on a cable channel in the US.

[2357] That's important because you need people to really want to work there and be passionate.

[2358] And Shane was a master of giving you just enough information so that you believed in what you were doing without realizing it's actually not that great.

[2359] And again, I'm not trying to be a dick.

[2360] I think Shane's fantastic.

[2361] I don't blame him for this, but it's clever.

[2362] You get a bunch of employees who believe they're going to be on this big new American cable channel.

[2363] Well, Shane never said American.

[2364] He just said cable channel.

[2365] It's your fault for assuming it was going to be in America.

[2366] But that meant a lot to those employees.

[2367] So you're able to boost morale, get everybody really excited, until they found out it was in Canada.

[2368] And then they were like, wait, what?

[2369] But eventually, they did get their U .S. channel.

[2370] But that's, you know, Shane's, he's brilliant.

[2371] He really is.

[2372] He's a, he's... you ever play Civilization, the video game?

[2373] No. Fantastic.

[2374] I love this game.

[2375] I'm playing Civ 6 right now.

[2376] How do you?

[2377] And you can get, they're called great people.

[2379] That if you earn enough points, you'll get like, you'll get like Galileo.

[2380] He'll appear in your civilization.

[2381] I firmly believe that in 100 years, the next, you know, Civilization 50, you'll be able to earn a great merchant, Shane Smith, because of how, like, you know, he was able to build this empire.

[2382] He did it through very clever ways of getting investment.

[2383] And admittedly, I really like the stuff they used to do back in the day.

[2384] I think the guys, wow.

[2385] Wow.

[2386] He knows how to do it.

[2387] And the reason I bring him up is because the big story about traffic assignment was Vice losing like 17% because of that practice they were doing.

[2388] So he really knew how to do the smoke and mirrors, you know.

[2389] But when you look at how it pans out for all the other news outlets...

[2390] You don't have to wonder why a thousand jobs just got lost in the past week.

[2391] It's just investment money.

[2392] And once they reached their threshold, it all, you know, came crashing down.

[2393] God, the modern farmer example.

[2394] Modern farmer, what is that?

[2395] But just the clicks, just the fact that you can actually buy those clicks and attribute them to something different.

[2396] And then you can tell people in the way, oh, how many views I got?

[2397] Yeah.

[2398] So I could go out and do that and send them over to joerogan.com.

[2399] Oh, yeah.

[2400] And there's other really clever things too.

[2401] Like, I don't think this one is on par, but say you have one YouTube channel with a thousand subs.

[2402] Make 10 more, ask all your subs to subscribe.

[2403] Now you've got 10,000 subs.

[2404] Get it?

[2405] Right.

[2406] Same thousand people 10 times.

[2407] And you go around telling people, I've got 10,000 subscribers.

[2408] When in reality, it's just 1,000 people on 10 channels.

[2409] So there's really clever ways to inflate your numbers, and this attracts investment.

[2410] It also, but more importantly, it allows leverage in dealing with ad buyers and ad networks.
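The ten-channel trick described above is just double counting. A minimal sketch, using the conversation's hypothetical numbers (1,000 real people, 10 channels):

```python
# Sketch of the duplicate-subscriber trick described above.
# Hypothetical numbers: the same 1,000 people subscribed to 10 channels.

people = set(range(1_000))                   # 1,000 real subscribers
channels = [set(people) for _ in range(10)]  # each person subscribes to all 10

# The headline figure: per-channel subscriber counts summed together.
claimed_subs = sum(len(c) for c in channels)

# The honest figure: distinct people across the whole network.
unique_subs = len(set().union(*channels))

print(claimed_subs)  # 10000
print(unique_subs)   # 1000
```

An ad buyer who only sees the headline sum has no way to tell the two apart, which is why deduplicated ("unique") audience metrics matter.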

[2411] What are people doing when they're boosting up their Instagram numbers?

[2412] I don't know a lot about Instagram.

[2413] I'm trying to pull it up.

[2414] I can't show you right now.

[2415] I just went to a website.

[2416] buyyoutubeviews.com: for $2,800, you can have a million views.

[2417] There's... if you want to be a YouTuber.

[2418] If you want to be successful on YouTube, buying views ensures you will never make it.

[2419] But look at it this way.

[2420] No one do this.

[2421] No one, I am not advocating for what I'm about to explain.

[2422] Film 10 videos, a full season of, uh, the Joe Rogan travel adventure.

[2423] Buy a million views on each, padded out over a month. Go to A&E: look at this show we launched, a million views per episode, you want to buy it? Wow, got a bunch of fans, a million people watch the episodes, easy. Fantastic. You see what I mean? Yes. Well, if you're full of shit. Yeah, but hey, man, there are people who... It's snake oil. There's people who do this. Yeah, but here's the thing: you get access to a network with real views. Yeah, you'll get traffic. Right. You know, if you can trick your way into getting on TV, fake it till you make it. It's dirty.

[2424] It's illegal, right?

[2425] Is it illegal?

[2426] That's fraud.

[2427] Well, actually, I don't know.

[2428] I don't think it is fraud.

[2429] If buying the clicks like that... I think New York. Someone sent an article saying the attorney general of one state, it may have been New York, said misrepresenting yourself online through fake views, clicks, and likes is illegal.

[2430] I think what they were saying is that using other people's images to create fake accounts is like invasion of privacy or something, but we're getting to that point.

[2431] But, you know, There's people who play that game, I guess.

[2432] You can really pull it off.

[2433] I've met a lot of people who, there's a bunch of tricks, man. You know what team followback is from Twitter?

[2434] I've seen it.

[2435] I've seen the hashtag.

[2436] What does that mean?

[2437] It may have changed, but essentially, I follow you, you follow me. That way you'll see some people, they'll be following 100,000 people, and they'll be followed by, you know, 90,000 people.

[2438] And then they walk around bragging about how they got 90,000 followers.

[2439] And it's like, well, hold on.

[2440] Like, you just have an agreement with them. You're not influential. It was a trick that people would do to inflate their numbers. Right. But those numbers are legitimate; if you do post something on Twitter, it will reach 90,000 people. So there's definitely merit. Oh yeah, yeah, yeah, I've seen it work. I've seen it translate. But it's just a... it's a trick. It's a trick of gaming the system. Yeah, there's a lot of sneakiness going on. Well, I have fake Twitter followers. I didn't pay for them, but I found them through one of those "how many of your Twitter followers are fake" sites. Those aren't so good.

[2441] Yeah, what does that mean, that so many of the people don't engage?

[2442] I think that's what it's based on, right?

[2443] And that doesn't make sense.

[2444] Well, it doesn't make sense also because what if they just are logged in and they just read?

[2445] Exactly, and that's most Twitter users.

[2446] Right, which is still engaging.

[2447] You're just not going two ways.

[2448] So that's one of the big problems with tracking fake accounts is that it's just someone's opinion.

[2449] So there's two things.

[2450] Most high-profile accounts will read as having a ton of fake followers, because people will sign up just to follow you and read what you have to say, because they want your feed. They want you, they want Joe, they want, you know, Bill Clinton or whoever, and they want to have that feed of people. They don't interact, and so then they're labeled fake. The other thing is, when people make fake bot farms, they purposefully will follow people like you so they look real. Right. So you do get fake ones, sure, and it's hard to know. You know, people point to politicians and always like to claim they have fake followers, but I'll tell you what, you could easily go online and buy followers for someone else. It happened, I believe this happened, to the Daily Dot. I say "I believe" a lot, because I don't want to get sued, just so you know.

[2451] The Daily Dot.

[2452] One day they jumped like 30,000 Twitter followers and had to put out a message saying, someone bought followers for us.

[2453] We didn't do this.

[2454] We are actively trying to remove them right now.

[2455] It's difficult because you don't know who's real and who's fake after it happens.

[2456] Right.

[2457] Man, there's... how could you possibly remove them?

[2458] You'd have to find out, like, what the timeline was and how quickly they came in and go to each individual account.

[2459] Yep.

[2460] Well, there's a thing that, I think there's an app that allows you to purge what they view as fake.

[2461] You can block and then unblock them and they're not following you anymore.

[2462] But it's tough, man. There are terrifying ways to... you know, I mentioned buying views will destroy your YouTube channel, because YouTube knows they're fake. Right. Yeah. So think about what you can do to other people if you have three grand you want to drop to damage them. Yep. Yeah, buy views for them, attributed to them, and then get them in trouble. But I will say, for people like you, and to an extent people like me, because I have contacts at Google, that wouldn't work on us. Right. I just make a phone call and say, I just want to let you know, and they go, no problem.

[2463] Yeah.

[2464] But let's say you're somebody with 100,000 subscribers.

[2465] You know, you're making a living.

[2466] You got a career on YouTube, but you don't have a manager with YouTube.

[2467] You don't have contacts with Google.

[2468] If someone attacks your account, you're out.

[2469] Yeah.

[2470] You know, they can just do that.

[2471] It's scary.

[2472] Yeah.

[2473] Yeah, man. Jeff Bezos.

[2474] Oh, yeah, Jeff Bezos.

[2475] How crazy is the National Enquirer, allegedly?

[2476] We should say allegedly.

[2477] Allegedly.

[2478] I believe.

[2479] Allegedly.

[2480] I don't want to get sued.

[2481] Tried to extort him, and it's about the Jamal Khashoggi investigation from the Washington Post. It's all connected to that. I read a bit about it. I started reading more when Ronan Farrow came out and claimed that they went to him too. So I'm not as versed, like, I haven't read... They, meaning the Enquirer? Yeah. Ronan Farrow, it was a story, I think it was from The Week, where Ronan Farrow says that they approached him and, you know, put pressure on him as well. But, you know, the interesting thing is, when I read that story, I kind of laughed, because I was like, remember Gawker?

[2482] Yeah.

[2483] Remember when they outed Peter Thiel?

[2484] And then, you know, Hulk Hogan, like, you know, these people are really brave to go up against billionaires like this and try and drag them.

[2485] Yeah.

[2486] I'll also point out it's kind of scary that you live in a world where a billionaire can destroy a company because they're angry at you.

[2487] But it's also kind of scary that they can use this to extort him so that he doesn't, he takes the Washington Post or they're attempting.

[2488] to get him to take the Washington Post and remove a legitimate news story about an actual murder.

[2489] I don't trust the Washington Post, but, you know, that's an aside.

[2490] That is an aside.

[2491] Yeah.

[2492] Like, if, I get it.

[2493] You know, when I look at what the National Enquirer did, it just reminds me of what the media does.

[2494] They know what they can do, and they know how to do it.

[2495] The media is influence, its power, you know, brands, they're scared.

[2496] They're scared.

[2497] So you look at what happens with some of these Twitter accounts that will lead campaigns, where they encourage all their followers to send emails.

[2498] It's not the same as blackmail by no means.

[2499] But when you know there's an attack vector, like, you know, what is that, wild sardines company, they don't want to deal with a brigade from activists.

[2500] You tweet at them, your fans tweet at them, and they immediately cancel on your show and they disavow you unless you do something, unless you say something, unless you disavow something.

[2501] You know, so granted, it's leaps and bounds worse when a National Enquirer, allegedly, you know, tries to extort Jeff Bezos.

[2502] But the craziest thing about it is, again, allegedly, that the investigator, I think his name is Becker, was entertaining the possibility that a government entity intercepted the texts, the nude selfies from Bezos.

[2503] But I did see another journalist tweet that they're not entertaining that.

[2504] They're not pursuing it.

[2505] It was just a thought.

[2506] So maybe it's not real.

[2507] Well, part of the other thought was that his girlfriend's brother, who's a Trump supporter, might have somehow or another got screen grabs of her phone.

[2508] It was really interesting when Bezos said, because he owns the post, people presume he's their enemy.

[2509] Yeah.

[2510] And that's another point I would bring up to when it comes to, like, banning Alex Jones.

[2511] Just because someone's reporting something doesn't mean they're advocating for it.

[2512] But sometimes they are.

[2513] It's like, you've got to understand the nuance in that.

[2514] Yeah.

[2515] But, yeah, I mean, Bezos is probably not the person you'd want to target.

[2516] Could you imagine what would happen if Bezos showed up at the Washington Post and said, guys, kill the story?

[2517] Every journalist would tweet it.

[2518] Yeah.

[2519] Bezos showed up, it just killed our story.

[2520] Yeah.

[2521] impossible.

[2522] It's so unrealistic.

[2523] But maybe there's more to it.

[2524] I mean, yeah, the real, the, and this is the big conspiracy theory, was that, you know, Trump always calls him Jeff Bozo.

[2525] Right, right, right.

[2526] Someone from that side is involved in this.

[2527] Yeah.

[2528] And, you know, because Trump's always had this relationship with the Inquirer.

[2529] That's the big conspiracy.

[2530] Yeah, right, right.

[2531] Yeah, so I don't, I don't know, I guess I said, you know, this is a relatively new story, and I've been, you know, here, so I don't know as much about it as I probably should.

[2532] Which just shows you how crazy digital media is. With digital things, like sending things through the air, it's just that they can get intercepted.

[2533] That's why, you know, when I mentioned earlier the potential for civil war, like, we don't know what it could look like.

[2534] This could be it.

[2535] It could be special interests using information.

[2536] It's the information war, you know, things that people have talked about.

[2537] I was thinking about this a while ago.

[2538] It's like, man, why did people shoot each other 100 years ago?

[2539] I mean, they still do.

[2540] But, like, you know, World War II, why were they fighting it?

[2541] Because they wanted to gain control.

[2542] They wanted to centralize power or they wanted to push an ideology or a government or expand their power.

[2543] You don't need to shoot somebody to do that.

[2544] You just need to convince them you're right or you need to get them to fight each other.

[2545] So I think it's fair to say, yeah, the Russians are absolutely screwing with us.

[2546] But we've learned, if you trust the reporting, and it's hard to know what's real or not, that the Russian campaigns were not only promoting Trump supporters, but they were promoting Black Lives Matter.

[2547] Why?

[2548] They wanted to get them to fight.

[2549] Well, they weren't just doing that.

[2550] There's a fantastic podcast that Sam Harris released recently with this woman.

[2551] Let me get her name so I can see if you can find it, Jamie.

[2552] It's called The Information War.

[2553] That's the name of the podcast.

[2554] But they were doing all sorts of different things, like not just trying to.

[2555] Renee DiResta.

[2556] And I'm working on getting her down here soon.

[2557] They were... they also had, like, uh, Texas culture. They had trans rights. They even organized Facebook campaigns where they had a pro-Texas group and a pro-Muslim group meet across the street from each other. I mean, they're sowing seeds of dissent, like, organizing it. And I think I'd be willing to entertain the possibility that what we call the culture war today was seeded specifically by special interests, potentially Russia. Nothing you can do about it. It's done.

[2558] You know, when people adopt an ideology, you can't easily break that.

[2559] And some people refuse to cross that divide, you know.

[2560] Yeah, but it's just so funny how many different ways they were attacking this.

[2561] They had Blue Lives Matter groups and Black Lives Matter groups, and they put people at odds with each other.

[2562] And one of the big ones that they did was they targeted African Americans and were trying to get them to vote for anyone other than Hillary.

[2563] And this is like an engineered campaign: Jill Stein's our vote.

[2564] And like, we can't vote for Hillary.

[2565] Like, Hillary does not support us.

[2566] We can't vote for Hillary.

[2567] And they made it very tribal.

[2568] The first thing we have to assume is that it was effective.

[2569] And that what we view in the culture war was exacerbated by these campaigns.

[2570] We don't know, we don't know to what extent they had an influence over the U .S. But I will say, I think it's fair to point out they play a role.

[2571] And then we can see what happens, Charlottesville.

[2572] Yeah.

[2573] You know, where we can see the dramatic escalation, where you end up with some crazy guy associated with, you know, white nationalism, ramming a car into a bunch of protesters.

[2574] Yeah.

[2575] It's, you know, people get riled up to a point.

[2576] There's a really great video called This Video Will Make You Angry by CGP Grey, where he talks about how these groups, they argue amongst each other, not against each other.

[2577] They make each other angry by posting images of the other.

[2578] You know, there's certain subreddits where I don't want to, you know, start a brigade, but they'll post memes nonstop attacking a particular politician.

[2579] They're not arguing with the left or the right.

[2580] They're arguing to themselves about what's wrong with the other.

[2581] And so these groups grow and get angrier and angrier.

[2582] And then when they finally meet in the real world, you get extreme violence.

[2583] So it's very possible to seed those communities and rile people up, push these things.

[2584] Yeah.

[2585] And well, it's also, it seems pretty straightforward how to manipulate them and how to appeal to their tribal nature.

[2586] It's terrifying how easy it is.

[2587] Yeah.

[2588] And also terrifying how few people are aware that, I mean, there's a lot of these.

really toxic pages that you'll find that are commenting on things, whether it's, uh, Instagram or Twitter or whatever. They're not really who you think they are. And what ends up happening, the average person, say on Twitter, will look at their mentions and see 10 tweets where they say, you shouldn't talk about this anymore, how dare you, and they'll assume it's everyone, when in reality it could be one person. Right. But it works. Well, this is what gets companies to run away from sponsoring, you know, shows and stuff. They think everyone's attacking them. Could be one person. Where do you think this goes? If you're looking at this, like... this is fairly new, right? All of this stuff is fairly new, and really kind of, uh, this is uncharted territory in terms of how to navigate this stuff, and they're sort of figuring out what the influences are as we go along. And again, 10 years ago this didn't even exist, so this is all new. Where do you think it's going? U.S. destabilization, really, at a dangerous, dangerous level. Uh, so Patreon, right? It's the independent economy.

[2590] If you're a YouTuber, if you're a podcaster, if you're an artist, if you're a cosplayer or whatever, well, they decide to ban some people for reasons that don't make sense.

[2591] Like Sargon didn't violate the TOS, but they banned him anyway.

[2592] So Sargon goes to SubscribeStar.

[2593] Activists then decide Sargon shouldn't be allowed to make money, period.

[2594] So they start, you know, campaigning against SubscribeStar.

[2595] SubscribeStar loses access to PayPal and Stripe.

[2596] So it can no longer process payments.

[2597] Who made that decision?

[2598] To remove it?

[2599] Yes.

[2600] Most likely PayPal, I believe.

[2601] I believe they said it on the website, PayPal, you know. Because what happens is PayPal gets scared.

[2602] Oh, we don't want to be involved in this.

[2603] Well, something else happened.

[2604] SubscribeStar reactivated.

[2605] They've got a new payment processor, which means we've seen the budding off of a mirror economy, which is dangerous.

[2606] The fact that Americans in general can't share the same platform and had to create an alternative that had to be supported by separate means, if this continues in that direction, we're going to end up with tons of systems that operate only for

[2607] certain political factions.

[2608] Jack Conti, the CEO of Patreon, said to, I believe, CNBC, "You can't say anything you want in the world."

[2609] What does what I say in the world have to do with what service you provide?

[2610] Now, by all means, if you want to ban them, you can, but you can then see the adopting of ideology.

[2611] Someone posted a funny comment: a company that refuses to sell water to a dehydrated man in the desert because he thinks the wrong thing.

[2612] But then what happens?

[2613] Different companies emerge and you get tribes that are divided not only by their ideology, but literally they're unable to communicate with each other.

[2614] That can only lead to one thing.

[2615] The tribes getting physical, Charlottesville, Portland, Boston, Berkeley, San Bernardino, these various instances where they've clashed and bashed each other, people have been killed.

[2616] Some people show up with guns.

[2617] I think that's, I don't, I got to be honest, I don't think there's a way to fix it.

[2618] I don't.

[2619] Really?

[2620] There's a hilarious comment.

[2621] I'm sorry, a comic, on ProgrammerHumor on Reddit.

[2622] And they said, when you talk to an airplane engineer or mechanic, they say, oh, yeah, these things are engineered safely.

[2623] You're more likely to get in a car accident than die in an airplane.

[2624] You know, when you talk to an electrical engineer: oh, God, we've got so many redundancies that the system won't fail. But when you talk to a programmer, everything's bad.

[2625] Voting machines are corrupt.

[2626] The system is failing.

[2627] We can't secure it.

[2628] Like there was a kid at DefCon, the hacker convention, who hacked a voting machine like that.

[2629] Yeah.

[2630] Like the whole system is... I got to say, you know that story I told you about High Times?

[2631] Yes.

[2632] Imagine if someone did that, but it was a political story.

[2633] How easy would it be to rile people up and get them violent?

[2634] It's dangerously easy.

[2635] It's terrifying.

[2636] Yeah, it's very easy to get people riled up with fake stories.

[2637] And, you know, and once they go viral.

[2638] There's no stopping it.

[2639] And I think we're seeing a really even more terrifying phenomenon in that even when we have evidence in front of our eyes, Covington.

[2640] People refuse.

[2641] Reza Aslan, for instance, he won't back down.

[2642] It's like, it was the first video that just shows the one kid standing next to Phillips.

[2643] That goes viral.

[2644] I can understand why people saw that and were like, whoa.

[2645] But the second video that went viral, almost at the exact same time, showed Phillips walk up to the kid, immediately disproving the original narrative that the kid approached him.

[2646] But even Bill Maher still got it wrong.

[2647] You know, so even when people can see exactly what happened, it doesn't fit their narrative.

[2648] Don't believe it.

[2649] And there was an article that I think it may have been in Gizmodo or Deadspin or something that said, don't listen to them, we all know what we saw.

[2650] And it's like, dude, if we're getting to the point where kids at a blackout basketball game, in black body paint and throwing up three-pointer signs, are Nazis, how do you bring those people back from the brink?

[2651] And I'll say this too.

[2652] Like, I obviously, you know, haven't mentioned alt-right violence all that often.

[2653] But then you have to realize the alt -right is tiny, tiny, tiny, tiny.

They're rare. They've admitted defeat. You know, Richard Spencer, I think he said antifa won or something. No one showed up. He had an event in Florida, 11 people showed up. I'm not worried about that guy. I'm worried about these fringe ideologies that are racist, intolerant, and violent slowly seeping into our culture. Like when you see, uh, politicians openly embrace, like, race-based government policy, it really does worry me. You know, I think I have this perspective growing up in a mixed-race family, where I've been insulted by the left for being white and I've been insulted by far-right racists for being, you know, a mutt.

[2655] And so I don't like either of it.

[2656] I really don't.

[2657] But the white supremacist types are falling apart and they don't threaten me anymore.

[2658] They just don't.

[2659] But the left -wing racism and these ideas of racial equity and determining what you're worth based on the color of your skin are becoming more and more pervasive.

[2660] The lawsuit with Harvard, that Asians have a harder standard, a tougher standard for getting in, even though Asians are a smaller minority than white people.

[2661] Yeah.

[2662] Why does that make sense?

[2663] Why should I have to approach someone and justify my race to them?

[2664] That terrifies me. It really does.

[2665] And when we see Kirsten Gillibrand tweet that the future is intersectional, well, intersectionality is that ideology of race-based policy.

[2666] Ocasio-Cortez puts forth the Green New Deal that says racial equity.

[2667] I prefer to judge someone on the content of their character, not the color of their skin.

[2668] Well, it's also that's the kind of talk that gets Trump reelected.

[2669] Oh, God.

[2670] And that's one of my biggest pet peeves.

[2671] Because, you know, I've talked about progressive tax.

[2672] I think we can do a lot for a public option, for expanding Medicaid.

[2673] The idea of a Green New Deal at its core to me is fascinating.

[2674] Can the government, you know, can we allocate tax money to invest in new technologies, fusion, nuclear and reduce carbon emissions and do great things?

[2675] Can we make high -speed rails?

[2676] But then when you come out and say pay the unwilling and these other equity things and intersectionality, I'm like, that's not what I'm talking about.

[2677] I can't support that.

[2678] The unwilling is the most preposterous one.

[2679] The idea that someone's unwilling to work and we need to provide them with a living.

[2680] That's insanity.

[2681] I mean, that's the ultimate progressive bend.

[2682] I mean, that's where it keeps going, where they're trying to out-progressive the next concept.

[2683] But hold on.

[2684] In a socialist or communist society, you still have to work.

[2685] Yeah.

[2686] I don't know what society exists where you expect people to undertake the greatest construction project in the history of humanity, a massive train network that makes all planes obsolete.

[2687] but at the same time tell people they don't have to work if they want money.

[2688] Well, this is what we're saying.

[2689] It's like, as these ideologies get more and more ridiculous, they try to out-progressive the next step.

[2690] Like it's a fundamental right for you to earn a living.

[2691] And Pew Research put out a poll in the last couple of weeks: in the Democratic Party, 54% want more moderate policies.

[2692] So I fall in that bracket.

[2693] But still, you still have about 43% or so that want more left-leaning policies.

[2694] But what that means is the party is split.

[2695] And so here's the problem I see.

[2696] If you're going to put me up, if you're going to say, Tim, you've got to vote.

[2697] You have to vote.

[2698] You have to make a choice.

[2699] You've got a moderate conservative who believes things I really don't agree with, but he doesn't want to give money to people who don't work.

[2700] And he doesn't believe in identitarian politics and race equity or whatever.

[2701] And then you've got the Democrats who are so far left to me. I can't even see them anymore.

[2702] Who do you think, you know, social liberals and liberals are going to vote for?

[2703] The closest person to them politically will be a conservative.

[2704] Yeah.

[2705] That's what scares me. You know, we had this great future with a potential for a public option, for expanding Medicaid, for, I mean, look, I really do believe social programs are important.

[2706] We can do more.

[2707] I like a lot of what Bernie has to say.

[2708] I think we need to reform education, but I think education could be expanded.

[2709] Again, I'm interested in the ideas.

[2710] I want to advocate for them, but we need to figure out how to do it.

[2711] But where I fit politically, I'm politically homeless.

[2712] I don't agree with someone judging me based on my race.

[2713] I've been through that.

[2714] Hell no, never again.

[2715] People vandalizing my home because they didn't like that.

[2716] I had a brown mom and a white dad.

[2717] I don't want to live in that world.

[2718] I don't want you to look me in the eyes and say it. I tell people to look at it this way when it comes to Harvard.

[2719] I want you to look into the eyes of that little Asian boy and say, honey, you can't go to Harvard.

[2720] You're Asian.

[2721] You look too much like those people.

[2722] How does that make sense?

[2723] Why is it that just because this kid looks similar to this kid, you're going to tell him he has a harder standard for the SATs to get into the school?

[2724] I just, I refuse.

[2725] I absolutely do.

[2726] It's disgusting.

[2727] And the fact that it's coming from Harvard is so confusing. I mean, that is the school, right, when you think about higher learning. Like, where'd you graduate? Oh, my dad graduated from Harvard. Whoa, Harvard. That's the school. And the fact that they're practicing this way. They say it's because if they didn't do that, the student body would be overwhelmingly Asian, disproportionate to their share compared to white people. I don't care. I don't think you get to look at someone and say you look too much like they do. No, it's crazy. It's not a meritocracy if you do that.

[2728] I want to talk to you about universal basic income.

[2729] What do you think about artificial intelligence and automation and the removal of all these jobs, which is a real concern that a lot of people have.

[2730] And then the way to mitigate it that's being bandied about is universal basic income, that they would give you a certain amount of money.

[2731] And I think the idea is that everyone would get it, even wealthy people would get it.

[2732] Yeah.

[2733] And that would be the only way to make it fair.

[2734] But where the fuck is

that money coming from? And then, what do you think about it? I'm not an economist, man, right, you know, but I don't think it's feasible, at least right now. I do believe that on a technological level we will eventually reach, like, a Star Trek kind of future, where it's not about communism, it's just literally, like, scarcity's gone. Right, like they have replicators. But when it comes to universal basic income, people need to understand some basic economic principles. If everybody gets a thousand dollars, okay, that's 300 million times a thousand dollars. Right, right.
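For context, the back-of-envelope arithmetic gestured at here can be made explicit. This is a rough sketch, not an economic analysis; the 300 million figure is the speaker's approximation of the U.S. population, and the $1,000-per-month amount is the number floated in the conversation:

```python
# Back-of-envelope cost of a $1,000/month UBI for roughly 300 million people.
# Both figures are approximations from the conversation, not official data.
ubi_per_person_per_month = 1_000        # dollars
recipients = 300_000_000                # rough U.S. population estimate

monthly_cost = ubi_per_person_per_month * recipients
annual_cost = monthly_cost * 12

print(f"${monthly_cost:,} per month")   # $300,000,000,000 per month
print(f"${annual_cost:,} per year")     # $3,600,000,000,000 per year
```

So "300 million times a thousand dollars" is $300 billion per month, or about $3.6 trillion per year before any offsets, which is the scale the "where is that money coming from" question is pointing at.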

[2736] But it's not everybody.

[2737] It's everybody of working age, right?

[2738] Yes.

[2739] So, so let's, let's do this.

[2740] I own, I own a burger shop.

[2741] I need to hire someone to flip burgers.

[2742] So I say, we pay 10 bucks an hour.

[2743] It's, it's not a lot, but we're a small business.

[2744] We can't really afford to pay more.

[2745] Is that acceptable?

[2746] They say, I get a thousand bucks a month.

[2747] Why would I, why would I spend, you know, if, you know, that not really?

[2748] Well, the idea is that the thousand bucks a month that you get, you get to keep, and then the flip-burger money is on top of that.

[2749] My time's worth more than $10 an hour.

[2750] I don't care.

[2751] When I was 17, 18, I worked for American Eagle Airlines.

[2752] I was lifting like 50 ,000 pounds per day, and I was getting $10 an hour.

[2753] You must have been jacked.

[2754] No, well, I've been skateboarding my whole life, so I was certainly, you know, in shape.

[2755] But if you told me, hey, you know how you're making less than $1 ,000 a month working full -time after taxes?

[2756] How about we just give you a thousand bucks?

[2757] I bet I'll be at the skatepark.

[2758] Yeah.

[2759] You know, so there's some positives that people will pursue their passions.

[2760] But hold on.

[2761] How many people do you know want to be comedians and they're not funny?

Quite a few. Quite a few. And imagine if you said, we're going to subsidize your endeavor into a thing you're not good at. Yeah. How many people are really good at being carpenters but wish they could be pro football players? How many people are really good at being teachers but want to be a famous actor? So what do you think would be some sort of an appropriate response to automation and artificial intelligence? I mean, you can't just have millions and millions of people with nowhere to go. And that's, uh, it's a quagmire. This is why I, uh, you know, I've never been a pot smoker.

[2763] I think I've smoked like once in my life.

[2764] Want to try now?

[2765] No. It's just not my thing.

[2766] But everyone always would think I was stoned because I would talk about this kind of stuff with my friends while they were stoned.

[2767] The philosophical consequences of technological innovation.

[2768] It is not the postmaster's fault that, after he spent 30 years becoming the best of the best at working at the post office, technology emerged that is going to displace him and put him in the poorhouse.

[2769] When I was about 19 years old, I was skateboarding in downtown Chicago and I saw an old black homeless man. And I had some leftover food, and I was like, hey, what's up, dude?

[2770] You want some food?

[2771] And he was like, oh, thanks, man. And I was just like, I got to know.

[2772] Can I ask you a question?

[2773] How did you become homeless?

[2774] And he said, you know what, man?

[2775] He's like, I think he was like 60 something.

[2776] He said, I used to have a job.

[2777] I worked all day, every day.

[2778] I had a family.

[2779] Eventually, you know, my... I didn't have kids.

[2780] My friends started to get old and move on.

[2781] I lost touch with a lot of them.

[2782] Some of them died.

[2783] And one day I got told that my job wasn't needed anymore.

And so I couldn't do anything. I went on unemployment for a little bit, but, you know, the job I was good at didn't exist. Right. I can't remember exactly what he said, this was 14 years ago, but he was like, so everywhere I went, I said, I'll do anything, I'll do anything. But even the small jobs that paid a little bit, to flip burgers, weren't enough to cover my rent. After a few months I got evicted. Then, because I didn't have a place to live, I couldn't go to the job I did have. I started sleeping outside, and I've been here ever since. I'm like, that's sad, you know. And that's a sad reality. What do you do? I don't know.

[2785] Go to AOC.

[2786] She hooks you up.

[2787] But that's the thing.

[2788] She goes too far, but this is why I believe in some kind of social policy and saying that social security, something to help these people.

[2789] Well, for sure.

[2790] You know, if we have a real community, you would help out the people that are in your community.

[2791] We're too big.

[2792] Yes, that's the problem.

[2793] You know, communism works really, really well when you have like five people.

[2794] Right.

[2795] Yeah, I try explaining to people.

[2796] They always make this political, you know, the political compass, authoritarian, libertarian, left, right?

And people like to claim that anarchists, like the violent smashy ones in antifa, are libertarian left. And I'm like, no, no, no, no. The libertarian left quadrant are pot-smoking hippies who live on farms. And anarcho-communism makes a ton of sense when it's you and your buddies working together on a farm, sharing responsibilities. It doesn't make sense for a community of, you know, 300 million people, where you have to trade extremely specific resources to make a computer happen. Right. At that point you need to be able to quantify the value of specific objects, and that's why communism doesn't work on a massive scale.

[2798] But I will say artificial intelligence is a different conversation. You know, technological advancement is going to result in, like, Luddite riots.

[2799] You know, the opioid crisis.

[2800] Yeah.

[2801] I could be wrong.

[2802] I read that there was a connection between unemployment from these factories getting shut down and depressed dudes popping pills.

[2803] There was a masculinity report published by Harry's, the shaving company.

[2804] And they said the overwhelming majority of what contributes to a man's happiness is gainful employment.

[2805] Like 80%.

[2806] So what happens when a factory shuts down?

[2807] You got a bunch of young dudes who want to do something.

[2808] They want to matter.

[2809] But they can't.

[2810] There's nowhere to go anymore.

[2811] Especially in a small town.

[2812] But I'll tell you what, man, Percocets feel real good.

[2813] You know, those drugs, they feel fantastic.

[2814] But they'll kill you.

[2815] So that's, you know, I think this contributes to the popularity of Bernie Sanders and Trump.

[2816] They talked about the working class.

[2817] They talked about these free trade agreements hurting people.

[2818] They talked about getting the factories back together.

[2819] And that means a lot to people who have been popping pills, who are depressed, who are sad and scared.

[2820] And it's a lot of people in this country.

[2821] So to wrap this up, the only other thing that I think I would like to at least make an attempt at is what would be the path for a person who's been banned from these social media sites?

[2822] What do you think would be a reasonable

[2823] way to bring people back into the conversation?

[2824] Whether it's YouTube or whatever, what would be a reasonable way?

[2825] I mean, don't you think there should be...

[2826] Just be...

[2827] Just anybody.

[2828] Anybody, no matter what they've done...

[2829] Well, if you commit a crime, you've got to, you know, you get caught, you go to jail.

[2830] Right.

[2831] Then when you get out, you're free to engage in normal, you know, civic behavior.

[2832] Do you think that the concept of a permanent ban is in...

some way, almost an un-American concept?

[2834] Oh, man, I don't know, to get that specific, like...

[2835] Well, we're into freedom of speech, right?

[2836] I believe that as long as these companies are monopolies, and they are the public sphere, it is wrong to permanently exile someone for saying a bad word, for holding the wrong opinion.

[2837] Well, a lot of these, like, the learn-to-code bans, they're fucking preposterous.

[2838] I mean, to ban someone permanently for something that...

[2839] Well, I'm saying a man is never a woman.

[2840] Right, that was a permanent ban.

Men aren't women, though. That's absurd and terrifying. Yeah. But even, um, like Milo being banned. Why? Because he was tweeting at a celebrity? You're supposed to tweet at celebrities. Well, not only that, I mean, the criticism of that movie was no different than the criticism of any movie that they thought sucked, but it happened to be about a feminist idea, or a woman's movie. What was interesting is that, I mean, it's been so long, so forgive me if I get this wrong, but I believe Leslie Jones was tweeting her followers to go after Milo as well, or something to that effect.

[2842] I believe so.

[2843] Yeah.

[2844] Well, you know, she's, she's, she's done some some questionable tweets herself.

[2845] Hey, look, people block me for no reason.

[2846] I have no idea why some people have blocked me. Yeah, I've, I've gone to people's pages and found out that I'm blocked.

[2847] Because it's block lists.

[2848] Yeah.

[2849] Well, there's a solution.

[2850] If you don't, if you don't, you know, so what they say is, um, it's actually really interesting precedent.

[2851] The Daily Stormer encouraged, this is, I'm going off of some news reports that could be wrong, encouraged people to send, you know, racist images and mean things to this woman, who I believe was Asian or Black.

[2852] And a court ruled that the First Amendment will not protect you if you encourage others to engage in harassment.

[2853] It's really interesting then when we consider what's going to happen with the lawsuits towards all of the people who smeared and defamed and called for action against Covington kids.

[2854] Right.

[2855] That precedent that was used against the Daily Stormer is now going to be used against these high-profile celebrities and personalities.

[2856] And are there lawsuits that are being formed right now for that?

[2857] Yeah, the big news, there are, like, five lawyers, I think.

[2859] I can't remember the guy's name.

[2860] Huge list.

[2861] A lot of, like, you know, a lot of people did delete their tweets, but a lot of people didn't retract.

[2862] Like, Bill Maher, four days later, it's like, after people were already offering to represent these families and threatening lawsuits, Bill Maher then comes out with his bad information.

[2863] But anyway, ultimately, to the point of redemption, we shouldn't permanently exile people for saying things that we think are wrong.

[2864] What really scares me about the Alex Jones thing.

[2865] I'm not, what I'm about to say is not to claim that Alex Jones is mentally deficient or anything like that.

[2866] That's not my point, you can have whatever opinion you want. The point is, if your justification for banning him is that he said Sandy Hook wasn't real, does that mean people who don't have a grasp on reality aren't allowed to use social media?

[2867] Is it, does that mean that people who are stupid aren't allowed to use social media?

[2868] People who are mentally ill. Right.

[2869] You have to have an intelligence test to use social media.

[2870] That's insane.

[2871] Yeah.

[2872] You know, people are allowed to say what they want to say.

[2873] And I think so long as Twitter is a monopoly, they should probably, we should probably have some protections on the ability to engage in public discourse.

[2874] I'll give you a really important point.

[2875] Occupy Wall Street took place in Zuccotti Park in New York on what's called a privately owned public space.

[2876] A POPS.

[2877] This space is owned by a private entity.

[2878] However, they had no legal grounds for evicting the protesters from the park because it was encouraged, the public was encouraged to come.

[2879] So I would argue if Twitter is actively encouraging public participation, they lose the protections to ban whoever they want.

[2880] I think it's rather terrifying that you would cede political power in this capacity to foreign interests, stockholders, and private individuals at a massive corporation, a monopoly that's not even beholden to the U.S. to a certain degree.

[2881] You know, forgive me for being a little bit of a liberal who wants regulation on massive corporations, but I'm just, I'm surprised I don't see it, you know, from other people on the left.

[2882] It's a very compelling argument, and I think the more it's fleshed out, the more it seems clear we definitely have an issue.

[2883] It's a giant issue, and it doesn't go away when you just ban people.

[2884] Oh, no. You create parallel economies, worlds, and you end up with backlash.

[2885] Yeah.

[2886] Well, listen, man, I'm really happy you came.

[2887] I'm really happy we did this, and just thank you.

[2888] Thank you for educating me on this and giving your perspective, and it's a very articulate and very intelligent perspective, and I really appreciate it.

[2889] And I think for everyone, this helps us to sort of get an understanding of, you know, this, just the whole spectrum of what's going on with all this stuff.

[2890] Yeah, and I'll just add by saying, I don't think I'm the smartest person in the world.

[2891] I probably got a lot, I probably get a lot wrong.

[2892] I do my best to try and, you know, have my facts straight.

[2893] If I don't, forgive me, fact check me all the time.

[2894] You know, do your due diligence, but, you know, it is what it is.

[2895] I appreciate your perspective.

[2896] Thank you, brother.

[2897] Yeah, thanks for having me. Thank you.

[2898] We'll do this again.

[2899] Tim Pool, ladies and gentlemen.

[2900] Bye.

[2901] Cool.