The Joe Rogan Experience XX
[0] Hello, David.
[1] Here we go.
[2] We're live.
[3] We're doing it.
[4] Yes, we're doing it.
[5] What's going on, man?
[6] I'm nervous.
[7] Don't be.
[8] I enjoy your show.
[9] I really do.
[10] Thank you.
[11] It's a pleasure to watch.
[12] You're a very smart guy, man. And like I said, we were just talking about it.
[13] You're very reasonable.
[14] In this world, I think there's so much of this, the YouTube political world, the YouTube commentary world where people are so fucking toxic.
[15] You know, there's so much negativity.
[16] There's so much what they call dunking on people.
[17] There's so much dunking on people.
[18] You do a little dunking.
[19] Some of it's warranted.
[20] It is warranted, yes.
[21] But I don't know if it's beneficial.
[22] To the people doing the dunking?
[23] Yes.
[24] Or even to the cause.
[25] I think it is temporarily, well, sometimes it's good because it shows, it mocks people's positions and it makes people realize, yeah, that is a ridiculous position.
[26] So if you're on the fence or if you're not really quite sure how you feel about things and you see someone get mocked for a ridiculous position that maybe you've even shared for a little bit, maybe you haven't explored it deeply, and you see someone who
[27] has explored it deeply, sort of expose all the flaws in this line of thinking.
[28] It's good.
[29] But my thing, what I'm... and I interview a lot of people on the right and a lot of people on the left.
[30] And I just hate all this conflict that I see, the unnecessary conflict, I think, when you watch television today and you see Antifa fighting with, you know, Trump supporters, and all this weird conflict. I don't necessarily think that most of it is necessary. Well, I think the devil's in the details. So, like, as an example, if you want to bring together, I don't know, people who are on opposite sides of the climate debate, for example, good luck. Sure. Right. Well, why? Part of that, you could argue, is if one side just does not accept science.
[31] How can you really bring those people together?
[32] It doesn't mean you need physical conflict to resolve it.
[33] In fact, I completely agree with you.
[34] The physical conflict is totally counterproductive.
[35] But at a certain point on some issues, I understand why there's like an intractability to the debate where it seems completely impossible to move forward because whichever side you're on, I would argue that I'm on the right side of these issues and others would disagree.
[36] When you're far apart in a way that you can't even agree as to, like, what the starting-point facts are about the conversation, how do you even... how do you start? I have some ideas as to how I try to do it, but it's very tough. It is very tough. I just don't think dunking on people, always, like, constantly shitting on people, is necessarily the way to do it. Yeah, and I think it's important to distinguish between just straight-up ad hominems, where someone is wrong and bad because I think they're a bad person or they're an idiot or whatever, to recognizing when somebody is a participant in bad faith in a conversation, to when someone has maybe fallen prey to audience capture or whatever else might be kind of influencing what and how they're doing. I think that those criticisms are legitimate, but you've got to stay away from just ad hominem. Yes, yes, I agree. And I think that it's just so common today. It's also extremely attractive. The YouTube algorithm, you know, as far as comments go.
[37] I mean, you know, it actually kind of encourages it.
[38] And so does Facebook's.
[39] So does, you know, anytime there's a social media platform that is ad-dependent, one of the best ways to get people to engage is to have something they disagree with so they can get angry.
[40] Yes, until it becomes no longer brand safe according to whoever's running the platform.
[41] Right.
[42] Right.
[43] I mean, if you go back to April 2017, where I woke up and saw that my YouTube channel made 19 cents the previous day, and I texted Kyle Kulinski and I said, I think there's like a glitch.
[44] It says I made 19 cents. And he says, it says I made 35 cents, or something like that.
[45] Something's going on. And it was the beginning of, like, Adpocalypse 1.0, and that was a rough three-week period. And so it's, you know, encourage the debate and the battle of ideas, so to speak, and all of that stuff, until advertisers get worried, and they say, oh, you know, our ads are showing up on stuff that's a little bit touch and go for us.
[46] That's a weird one to me because I, YouTube has always been a secondary thought for me. The first thought was the audio version of the podcast.
[47] And in fact, when we were uploading it to YouTube at first, I was like, why are we even doing this?
[48] I guess why not?
[49] Some people probably want to watch it.
[50] And then somewhere along the line, it became at least close to as big as the audio version, and then maybe even more significant, because one of the things that the YouTube version has is the comment section, which is often a fucking dumpster fire, but at least there is some sort of, like, a community engagement aspect to it that doesn't really exist in iTunes. Like, in iTunes, it's sort of... it's in a vacuum, right?
[51] Sure.
[52] But when the Adpocalypse thing happened, I was like, hmm, what's going on here? Like, it wasn't my primary focus, so it wasn't terrifying. But people that only did YouTube and people that relied on that for their living...
[53] I mean, it's a huge blow.
[54] It was huge.
[55] And at the time, I'm trying to think back, I think maybe like around 30% of my entire show's revenue was coming from YouTube at the time.
[56] So it was not everything, but it was still significant, right?
[57] I mean, I have staff and overhead and all of that stuff.
[58] So just overnight, 30% going away is huge.
[59] And that's why I've tried to move to the model of telling my audience:
[60] You can skip all of this stuff, you know, even some of these other, you know, super chats and all of this other stuff.
[61] Like, we run a membership program on my website.
[62] I control 100% of it.
[63] So it's not a Patreon deal or anything?
[64] We're on Patreon, but it's not big for us.
[65] The way I think about it is as long as, I mean, listen, yeah, there's, you know, marijuana companies that are having trouble even processing payments.
[66] But assuming, like, Stripe and PayPal don't say, you can't even accept payments anymore, David Pakman, I control the entire
[67] process on my website.
[68] So when people pay their six bucks, all but 2.9% gets to me. And when the Adpocalypse happened, I saw it as maybe a blessing in disguise, in that I could now explain to the audience:
[69] Here's the problem with these algorithms.
[70] Here's the problem when it goes from I am fighting white supremacist content to an algorithm can't distinguish between that and white supremacist content.
[71] That's bad for me. When I interview Richard Spencer, I obviously don't agree with Richard Spencer.
[72] But can an algorithm figure out that there's a difference between an interview I do with Richard Spencer and white nationalist propaganda?
[73] I don't know, but we can kind of get around all of that if you just go directly to me. And that's why my focus has been on growing those direct members.
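[Editor's note: to make the fee arithmetic above concrete, here is a minimal sketch in Python, assuming only the $6 membership and flat 2.9% processing fee David cites. The 45% platform revenue share used for comparison is an illustrative, commonly cited figure for ad-revenue splits, not something stated in the conversation.]

```python
# Illustrative fee math for a direct membership, per the ~2.9% processing
# fee and $6/month price mentioned above. Real processors (e.g. Stripe)
# typically also charge a small fixed per-transaction fee, omitted here
# because the conversation only cites the percentage.
membership_price = 6.00   # dollars per month
processing_fee = 0.029    # 2.9% card-processing fee

net_per_member = membership_price * (1 - processing_fee)
print(f"Net per member per month: ${net_per_member:.2f}")  # $5.83

# For comparison, an assumed 45% platform cut (a commonly cited ad-revenue
# split; illustrative only, not a figure from the conversation):
net_via_platform = membership_price * (1 - 0.45)
print(f"Net through a hypothetical 45% platform cut: ${net_via_platform:.2f}")  # $3.30
```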
[74] Did you interview Richard Spencer?
[75] Yeah.
[76] Did you get shit for that?
[77] Yes.
[78] Yeah.
[79] That's a weird one, right?
[80] I'm sure you're aware that, what is it called, the Data & Society thing, where there was a woman who made a bunch of connections, like, Joe Rogan knows David Pakman, oh, and Joe Rogan also knows Alex Jones, so Alex Jones must be friends with David Pakman. Like the map. Yeah, it's like one of those mind maps, you know? And it's really weird. It's like guilt by association. I saw a couple of them. There was like an initial one, which may be what you're thinking of, and then there was a map of, like, the YouTube sphere specifically, left, middle, and right, or something like that. Yeah, yeah. And this idea that everyone's, like, a part of a grand conspiracy to help each other out and push right ideology.
[81] Even though, you know, a lot of people that were labeled as right aren't right.
[82] Like who?
[83] Like me. Oh.
[84] I'm not right at all.
[85] My sense is your politics are pretty left on most stuff.
[86] Although I don't, I mean, I don't know you personally beyond just seeing your shows.
[87] But maybe the critique is based on... because I think that those maps were based on what the YouTube algorithm is suggesting.
[88] And so that may not be in line with your personal politics.
[89] Right.
[90] It's just maybe what we're talking about, like if you're interested in conflict, if you're trying to get engagement, that's the way to do it.
[91] Like if a YouTube algorithm is constantly suggesting people like Ben Shapiro or Gavin McInnes or whatever, and those videos come up over and over again.
[92] Sure.
[93] And I mean, so a lot of those people's channels do really well on YouTube.
[94] So if you interview someone who has a channel themselves, there's a very good chance that the algorithm, if they're watching your interview with that person, will say, well, here's a lot of their stuff.
[95] And then once you click there, the algorithm very quickly starts to build a picture of every individual user.
[96] If you watch your interview with Ben Shapiro, and then it takes you to a Daily Wire video, then it takes you to, like, the Daily Wire second-stringer guy, and then you're off who knows where, right?
[97] It's all machine learning, right?
[98] I mean, that's all that.
[99] For the most part.
[100] Yeah.
[101] Yeah.
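[Editor's note: the exchange above describes how a watch-history-driven recommender narrows in on each user. YouTube's actual system is proprietary and deep-learning based, so as a rough illustration only, here is a toy co-occurrence recommender in Python. The channel names and the co-occurrence approach are stand-ins, not a claim about YouTube's implementation.]

```python
from collections import Counter, defaultdict

# Toy illustration of the dynamic described above: recommend channels that
# co-occur in other users' watch histories with what this user just watched.
# Hypothetical watch histories; channel names are placeholders.
histories = [
    ["interview_show", "pundit_a", "pundit_a_clips"],
    ["interview_show", "pundit_a", "pundit_b"],
    ["interview_show", "news_recap"],
]

# Count how often each pair of channels is watched by the same user.
co_occurrence = defaultdict(Counter)
for history in histories:
    for channel in history:
        for other in history:
            if other != channel:
                co_occurrence[channel][other] += 1

def recommend(just_watched, k=2):
    """Return the k channels most often co-watched with `just_watched`."""
    return [name for name, _ in co_occurrence[just_watched].most_common(k)]

# Watching one interview pulls the viewer toward the guest's own channel and
# its neighbors, and the signal compounds with every subsequent click.
print(recommend("interview_show"))  # ['pundit_a', 'pundit_a_clips']
```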
[102] That's a... it is... it's a troubling aspect of that thing that they do where they suggest the next videos, which didn't use to be a thing.
[103] It used to be you would go to YouTube, you would watch a video, and then you would go find another video.
[104] Right.
[105] They didn't suggest anything.
[106] And then somewhere along the line, I don't remember what year it was, but this started happening.
[107] And then they started auto playing the next video.
[108] Auto play, yeah.
[109] Yeah, I think there was some kind of recommendation thing very early on, but initially it might have been restricted to just other videos from the same channel you're watching.
[110] Probably.
[111] And at a certain point, it started to recommend other things.
[112] And I don't know if you look at your analytics and see what percentage of your views are coming from that recommendations feed from other stuff.
[113] But it's significant for a lot of YouTube channels. The tagging your videos and getting the right metadata on them in order to bring an audience is an important thing. So it's a double-edged sword in some sense, it sounds like. But to get back to what you were saying about... so it labeled you... oh, it's Richard Spencer, okay. Yeah, I was going to say, they labeled you as right, but you're not right. Well, it's disingenuous. I mean, I've said it over and over and over again. I've never voted for a Republican in my life. I voted independent for Gary Johnson just because he did my podcast.
[114] And I wasn't happy with Clinton and I wasn't happy with Trump.
[115] I was like, this is just gross.
[116] I'm just going to vote for Gary Johnson.
[117] I mean, I didn't think he was going to win.
[118] He had almost no chance after he didn't know what Aleppo was.
[119] I was like, that was his, uh... that was his scream.
[120] You know, like, what's his face from New Hampshire?
[121] Howard Dean.
[122] Howard Dean.
[123] Yeah.
[124] Well, voting in California also, I assume you vote in California.
[125] It wasn't going to matter.
[126] It's a joke.
[127] Yeah.
[128] Um, but people conveniently will just... or they'll say that, like, you're a Trojan horse.
[129] They're like, you're a pretend left-wing person who's really just pushing right-wing ideologies.
[130] I'm like, well, which one?
[131] Which right-wing ideology?
[132] Is it gay marriage?
[133] Is it?
[134] What is it?
[135] I'm on the left on everything except maybe the Second Amendment.
[136] Right.
[137] I think the criticism that could be levied, if one wanted to make it into a criticism, would be if you engage with right-wing ideas that you don't agree with, right? Like, I take you at your face... you know, face value, that you don't agree with a lot of the stuff that your right-wing guests say. One could make the argument that by not challenging those ideas, it's implicitly lending them more credibility than maybe you think they should have. That's interesting, because what I try to do with people, um, unless someone is saying something egregious, I try to let them talk. I want to know how they feel. I want to know what their thought process is.
[138] And so instead of just challenging them on everything, I want them to elaborate.
[139] Right.
[140] And I feel like by doing that, I get a sense of how they've come to that conclusion and whether it's logical.
[141] Right.
[142] Whether it's, uh, whether they've actually used their thoughts and they've really calculated and thought, this is the position I take and this is why.
[143] And I, a lot of people don't.
[144] And a lot of the times, when you challenge people on their positions, you find out, like, they don't really know
[145] what the fuck they're talking about, and the best way to find that out is to let them talk. Like Candace Owens on climate change. Right. I mean, there's the Socratic method of questioning, you know, which is, why do you think that, and how do you know that that's true, et cetera, et cetera, and sort of some other questions that come from it, which I do as well. I mean, I don't know, to tie it to the Richard Spencer interview that I did, some of the criticism I received after was from people on the left. I mean, most of it was people on the left, for doing the interview. Yeah. Or for what you said in the interview? For doing it. Okay. Yeah, for doing the interview at all, the criticism was more from the left. Right. For what I said in the interview, the criticism was more from the right, from people who just agreed with Richard Spencer. Like, what things did they agree with? Uh, that it is inevitable that people with different ethnic or religious backgrounds simply will not be able to coexist together peacefully, and we're better off trying to figure out how we can separate people based on their membership in ethnic or religious groups.
[146] God, that's sad.
[147] I mean, literally separatism.
[148] That's sad.
[149] That's a sad thought, that you just can't get along with people that do other things, that are interested in other things, that come from other places, that have different religions, that have different points of view.
[150] Like, why?
[151] Yeah, well, they have a series of, you know, decades of what they call scholarship supporting their view. But for the context of my interview, I made it abundantly clear that I didn't agree with that stuff.
[152] Right.
[153] And my view is, and everybody can have a different view about how they do interviews, my view is, if I just allow what I consider to be disgusting views to be spread out, right?
[154] You know, like a spray bottle, just spray them everywhere, and not do anything else,
[155] I can't say that I'm doing something that I think is valuable.
[156] I don't feel like it's valuable.
[157] So my approach is: are the ideas known enough to be worth refuting?
[158] That's number one.
[159] If it's some weird conspiracy theory that doesn't have any following whatsoever, I'm probably not going to choose to even entertain it, because it's irrelevant in sort of all ways.
[160] So my first question is, was Richard Spencer relevant at the time?
[161] The alt-right was rising.
[162] This guy was considered by many the sort of creator of the alt-right.
[163] He was growing a following in the context of the Trump candidacy at the time, or maybe administration.
[164] I don't remember when exactly it was.
[165] It was, I think, 2016.
[166] I'm trying to remember when it was.
[167] I don't remember when I first heard his name.
[168] Yeah.
[169] How did he come to prominence?
[170] I don't know the sequence, but I think he had an alt-right website that had articles of some kind.
[171] And then he, that website became more known.
[172] That fucking term is so toxic.
[173] Alt-right, you know, alt-left.
[174] The centrist... like, all these different labels are so... I'd rather talk about issues. I agree with you there. It's so clunky. But, so, you know, the first thing was, I did want to interview him. But if I had felt that I wouldn't be prepared to make it abundantly clear that I don't agree with the guy and I think his ideas are terrible, I wouldn't have done the interview. Right, right. So, the problem I had with the critiques from the left of me doing that, some who said the last thing we need to be doing is giving this guy a voice, that's often how they say it, or a platform, my response was, this guy's getting interviewed in lots of other places that aren't even challenging him.
[175] Right.
[176] I'm at least making an attempt here to get something in the record that there are arguments against these ideas.
[177] These are bad ideas.
[178] And I don't want to be part of the diffusion of just the ideas themselves.
[179] I'm going to have to watch that.
[180] I'm going to have to watch that now.
[181] Now, when you did do that, like, what was his response?
[182] During the interview?
[183] What was his response to your pushback?
[184] I mean, he had answers.
[185] He was well prepared.
[186] I don't know if they were unique or new arguments that I was making, but there was no argument to be made that I was letting him just parrot white nationalist talking points unopposed, which I just wouldn't feel good about. Right. It's not how I do interviews. Yeah. And then the left was upset that you were giving him, air quotes, a platform. A very small portion of the left, I want to be super clear. That's the left... I mean, my audience is very left. Almost everybody understood what I was doing. Ten years ago, I was interviewing the Westboro Baptist Church. Most people understood what I was doing. They were more prominent at the time, but there was this sliver of the left that just didn't want the conversation to take place.
[187] And I always struggle with this because, as you can see, I have no problem criticizing that sliver of the left.
[188] My concern is getting, like, overly wrapped up in criticisms of the left that are only held by these, like, niche slices.
[189] Yes.
[190] And that's why I try to avoid going further than necessary into those criticisms.
[191] Like, I think there are more serious critiques of the left to be made beyond being anti-speech or wanting to limit speech or whatever.
[192] That's a pretty big issue, though.
[193] Well, I don't actually agree that it exists on a significant portion of the left.
[194] Like, I think a bigger issue, for example, like if you said, what is like a serious issue that the left needs to contend with right now?
[195] I would say a more serious issue is, if you look at the progressive accomplishments of the early 20th century, for example, like 1905 to 19...
[196] and the New Deal accomplishments that the left had in the time of FDR.
[197] What was different, I think, then, than the left now is that you didn't have to be completely in line with a specific set of policies or ideas.
[198] And I worry that now there's a little bit of the left maybe having this idea that if you're not in line on all of these issues, whatever the checklist is, so to speak, you're not really worthy of being a participant in what is clearly a leftward move in sort of the average American's political orientation.
[199] I don't want to see that prevent progress.
[200] Yeah, that's the hard tribalism, right?
[201] That's where the line gets drawn.
[202] You're with us or against us.
[203] There's one way to think.
[204] There is a lot of that.
[205] I mean, I saw it with health care recently.
[206] With health care, I don't think that you can make any serious case from the left that health care is fine.
[207] and the for-profit, employer-connected system that we have is working.
[208] Like, I don't think there's any progressive case to be made for that.
[209] Where people will differ is, what about Medicare for all versus some other system, a system that looks more like Canada's or the UK's or Germany's or whatever.
[210] And I've already started to see, like, when I say on my show, I'm kind of agnostic on this.
[211] Like, the system we have is a disaster.
[212] We need a system that will get coverage to everybody.
[213] The numbers can be made to work
[214] any number of different ways.
[215] We've looked at it.
[216] But 80% of people on Medicare, I believe it is, have some additional coverage.
[217] They either are still working part time or full time and get coverage that way, or they're poor enough to be on Medicaid.
[218] The point is Medicare for all doesn't solve every issue.
[219] It's way better than what we have.
[220] But here's like a dozen other possibilities looking at other countries.
[221] There is a portion of the left that doesn't like that because I'm saying I'm against Medicare for all.
[222] I'm not saying that.
[223] What I'm saying is there are a number of different ways to improve upon the system we have, all of which sever this relationship between, usually, your employer and these for-profit insurance companies.
[224] Why can't we be open to that?
[225] I really don't understand private citizens that don't want easy access to quality health care for everybody.
[226] That confuses the shit out of me. Like, have you ever been hurt?
[227] Have you ever been sick?
[228] Have you ever been broke?
[229] Right.
[230] Do you want to be broke and have no access to health care?
[231] No one does.
[233] No one wants anybody they care about to not have access to health care.
[234] Of all the things that we concentrate on in this country, there's two things that drive me fucking crazy that people just dismiss: education and health care.
[235] The idea that you have to, like my buddy Greg, Greg Fitzsimmons, he's sending his kid off to school.
[236] How much did you say that it was?
[237] Sixty-five thousand dollars a year, for each of his kids.
[238] So, you know, he's got two kids.
[239] That, that's, that hurts my head.
[240] You even think
[241] about spending a hundred and thirty thousand dollars a year, just on that. If you're a regular person with a regular job, how the fuck do you do that? Impossible. It's impossible. It's so much fucking money. And then that's not even paying for housing and food and transportation and books and everything else that you're going to need, too. And to make it more difficult for young people to succeed is one of the worst ways to make a stronger country. If you want a strong country, you want educated people that get to pursue their dreams.
[242] And the idea that we are so willing to spend so much money on these costly regime change wars and flying troops overseas to these places that they don't want to go.
[243] No one wants it to happen.
[244] And it's trillions of dollars.
[245] And people are fine with that.
[246] But you talk to them about some sort of socialized education system and people freak out and think you want to turn us into communists.
[247] Well, I think what is really important to understand is that the facts you just laid out don't matter to people who see this as an issue of, what do people deserve? What do they deserve? So if you say to a fiscal conservative, you know, if you consider the amount that the employee pays for premiums, plus the employer's share, plus your co-pays, plus co-insurance, and you put it all together into some amount, and you explain to them, there's lots of great analyses that have been done which tell us that with roughly the same amount of money, maybe a small payroll tax in addition, with roughly that same amount,
[248] it all could be done with a single-payer system that covers everybody.
[249] It's the same.
[250] You're taking all of these individual risk pools, where you have different for-profit insurers, and then you have systems for people that don't have enough money, Medicaid.
[251] You have systems for people that are over 65, Medicare.
[252] You put it all together.
[253] You spread the risk far wider.
[254] The employer no longer has to pay their part of the premium.
[255] The employee no longer pays part of their premium to the for-profit insurance company.
[256] The numbers work. They're still not going to say, you know what, that sounds great, it's actually pretty fiscally conservative, let's do it. Because at some point, there is a portion of the right that just doesn't think people have earned health care. They just haven't earned it. They haven't. Or education. Or education, that's right. And it's very hard to, uh, change people's minds when that's their view. I think it might be George Lakoff who, I believe, calls it strict father morality, which is, like, how would a really strict father treat a child who comes to them and says, hey, you know what, I figured out a way that we can all have health care?
[257] The strict father, even if the numbers make sense, would say, I'm going to teach you a lesson.
[258] You haven't earned that health care, either because you don't work or you don't make enough money or you're on disability, whatever the case may be.
[259] How do you convince someone to change their mind when that's their worldview?
[260] Yeah, how do you, when it's an ideologically based decision and you're on Team R or Team L, you know, which group of ideas do you adopt?
[261] Right.
[262] The UK system sucks.
[263] You talk to people that get health care over in the UK, it sucks.
[264] But at least they have a system.
[265] It's just not the same quality health care that you get in America.
[266] Same with my friends in Canada.
[267] I have friends in Canada that have come down here to get surgery because they find better doctors over here because of...
[268] Brandtsoil was going to Canada to get his operation.
[269] Yeah, why was that?
[270] Why did he do that?
[271] The best place is in Canada.
[272] The best place for hernias?
[273] For that type of hernia, I believe, yeah.
[274] Hmm.
[275] I mean, here's the thing.
[276] Even in saying the UK system and the Canadian system, neither one is that good.
[277] Right.
[278] Those two systems are totally different.
[279] Right.
[280] So I feel like.
[281] But they're both socialized medicine.
[282] They are both.
[283] Well, yes, in some sense.
[284] I mean, the Canadian system is administered at the province level.
[285] So the province is sort of like the market.
[286] Instead of having all these sub-markets attached to individual for-profit insurers, at the provincial level, that's how it's organized.
[287] The U.K. has the National Health Service, where they don't actually run the healthcare facilities, but they're the ones who are contracting them.
[288] So it's sort of like the health care facility still is its own entity.
[289] It's not that you're going and the government is the employer of the doctors, so to speak, but they're contracting with the health care facilities.
[290] But the point I want to make is that there are criticisms of all of these systems, but they're different ones.
[291] when we say the British and Canadian systems aren't that good.
[292] Right.
[293] Let's figure out in what ways each is not that good, because they're different ways, whether you're talking about health outcomes, early detection, cost per treatment, whatever.
[294] You really have to drill down and figure out in what way are we saying it's not as good.
[295] Yeah.
[296] What I'm saying is that there's no perfect system.
[297] There is no perfect system.
[298] Those systems aren't perfect, right.
[299] But I believe that most of the best doctors in terms of like North America at least are in the United States.
[300] I'm sure there's probably some very good doctors in Canada that do specialized medicine.
[301] But I think really good doctors are incentivized by profit.
[302] I really do.
[303] I think there is some motivation for if you spend so much money for medical school and you bust your ass, you want to make a lot of money.
[304] And some of the best doctors earn a really good living.
[305] That may be.
[306] I think limiting their ability to earn that money won't incentivize people to be excellent.
[307] So a couple different things.
[308] I mean, number one, to be clear, we're now starting to get into a little bit of broader economic philosophy.
[309] Like, I'm a capitalist.
[310] I'm for social democracy, which is a mixed system.
[311] That's a capitalist system that says we're going to invest tax revenue in a particular way to make sure that no one falls below a certain level.
[312] So just to contextualize that my point of view, is not from one of becoming a socialist country.
[313] And I think we share those views.
[314] I think so.
[315] A lot of doctors will say that even though on sort of on paper in a socialized medicine system, they might make less for a particular procedure, for example, or something like that, a lot of them are still in favor of those systems because it would drastically reduce their overhead.
[316] So there's all of this apparatus that includes medical billing and coding, both on the insurer end and at the healthcare provider end.
[317] The hospital and the insurance company both are battling over what is it that was done?
[318] What are the codes that apply here and what are reimbursement rates?
[319] There's fraud when it comes to that, and that requires an apparatus for investigating and adjudicating it.
[320] That adds more and more cost.
[321] So I don't think it's as easy... I don't think it's as obvious that under those systems, at the end of the day, a doctor that owns a PCP group, for example, or an orthopedic clinic, or whatever the case may be, I don't know that it's that clear that they end up taking home less money. Hmm, that's interesting. I wonder if in practice it would play out that way. Maybe. And maybe I'm talking about just, like, high-end orthopedic surgeons that do knee surgeries for athletes and things along those lines. They often would be outside of whatever insurance apparatus we're talking about anyway.
[322] A lot of those folks are often being paid out of pocket anyway.
[323] Yes.
[324] So it's less.
[325] At least some.
[326] At least some.
[327] So for the average person's experience, I think it's less relevant.
[328] Yeah.
[329] Then there's also liability insurance, which is extremely expensive.
[330] That's a giant issue with doctors.
[331] It's a huge expense.
[332] It is.
[333] Yeah.
[334] I mean, I think it is necessary.
[335] There's a question as to whether it's organized in the best way.
[336] I know less about that component than some of the other ones.
[337] Yeah.
[338] Um, the education and healthcare, those are the two things that I think we can both agree
[339] we need to invest money on, and we need to figure out some way to make that more accessible to people.
[340] Yes.
[341] Yes.
[342] And I don't understand people that don't think that.
[343] And if that's what that is, the strict father mentality... the only thing that makes sense to me is that you don't want people who are kind of half-assing it in college, that can just get in.
[344] I think that that comes up a lot when you hear about so-called free college, which isn't free.
[345] It's that we're saying we're paying for it through taxation. Really important to point that out.
[346] It's just not for everybody, and that's okay.
[347] I mean, I think that that sometimes gets lost.
[348] And yes, there are more and more jobs that require college degrees, even though you could make the case.
[349] Maybe the college degree is not actually necessary, but it's a way to sort of thin the herd of applicants in order to just make hiring, you know, more practical.
[350] But I do think that it's okay to say that college isn't for everybody. And the same ideas that apply to so-called free college, meaning college paid for through taxation, could apply to trade school. They could apply to retraining programs.
[351] There's a whole bunch of other ways that it could be done.
[352] Yeah, no, I agree.
[353] Yeah, I just... the college-is-not-for-everyone thing is more true now than ever before, particularly with certain technology studies.
[354] You're learning things during your four years at a university that are just going to be completely outdated by the time you graduate.
[355] In what kind of program, for example?
[356] Well, Jamie, what he did with audio engineering.
[357] Oh, I see.
[358] He went to school for audio engineering.
[359] By the time he got out, it was all useless.
[360] But that was not a four -year bachelor's program, right?
[361] It is now.
[362] Oh, it is now.
[363] Okay.
[364] When I went there, it wasn't available for that, but since then, they have made that available, and that's also in the time that YouTube has made basically education free for a lot of people.
[365] Sure.
[366] I mean, I think with that, the issue is, in my mind, that when you consider the cost relative to the earnings potential, as you pointed out when you talk about $68,000 a year... you know, I taught at Boston College, and I think it was like $64,000, something like that. Depending on what field you're going into, it's almost impossible to pay that off, ever.
[367] Yeah.
[368] So at some point, something needs to change.
[369] And this kind of gets us into the technological automation and unemployment stuff of what happens as computers and technology start to replace jobs.
[370] And that's where I think there's a pretty clear line between a free market capitalist, a social democrat like myself, and actual socialism, like what should happen with the gains that come from those technological advancements.
[371] But as far as the education piece is concerned, it's completely unsustainable.
[372] the way it is now.
[373] I knew about you before this happened, but then I really kind of got on board with you when someone was trying to get you fired from Boston University.
[374] And I remember Boston College.
[375] Boston College, sorry.
[376] And I tweeted it.
[377] And I was like, like, what?
[378] What is this?
[379] This is craziness.
[380] So it's a woman named Amy Siskind, who I don't know other than that incident where.
[381] And you had a disagreement about something.
[382] It wasn't, and it wasn't toxic.
[383] It wasn't hostile.
[384] I didn't think so.
[385] I didn't think so.
[386] Explain what you
[387] said and what she said that you disagreed with.
[388] So we may be able to even find the tweet, but she tweeted something, the gist being that she would not be supporting any candidate in 2020 who's white or male.
[389] I think that that was the gist of it.
[390] And I responded, I'm going from memory here, the gist was something like, isn't that the definition of racism, you're sort of preemptively excluding someone from consideration on the basis of race, and in that case, gender, if it was white and male?
[391] There it is right there.
[392] Yeah, there is.
[393] I will not support white male candidates in the Dem primary.
[394] Unless you slept through the midterms, women were our most successful candidates.
[395] Biggest Dem vote getters in history.
[396] Obama 08, Hillary 16.
[397] White male is not where our party is at.
[398] And it is our least safe option in 2020.
[399] Right.
[400] So I said, isn't there something not progressive about preemptively dismissing a candidate based on their race and gender?
[401] I feel like there's a word to describe that.
[402] As a progressive, I won't be jumping on board.
[403] Yeah, so it exploded.
[404] You basically didn't even say it's racist.
[405] Right.
[406] You said there's a word to describe that.
[407] And that's a very polite way of disagreeing with someone.
[408] I thought it was polite.
[409] And she tried to get you fired.
[410] Yeah.
[411] She contacted as far as I know.
[412] So, okay, I don't, I'm going by what she said.
[413] She said she contacted Boston College and told them not to allow me to teach there.
[414] That's insane.
[415] And Boston College, since I'm just an adjunct.
[416] I'm not on staff when I'm not teaching.
[417] Like, during the three months of the semester, I'm employed there, and the other nine months I'm not.
[418] So I think Boston College said he's not currently employed here.
[419] And I think that that's basically as far as it went.
[420] But I did talk to some other faculty there who were aware of the thing that was going on.
[421] That's a crazy thing to do.
[422] I think so.
[423] It's a nasty, mean thing to do, too.
[424] Someone can't disagree with you.
[425] and it's a very good point.
[426] I think she probably got upset because you made a very good point and tweets started coming her way.
[427] And a lot of people, they read those fucking comments and people get toxic in those comments.
[428] Yeah.
[429] Random, strange people that you don't know and then you're forced to, you know, look at their opinions and their criticisms and their insults.
[430] That incident started me down the path of drastically limiting my social media use.
[431] Good for you.
[432] Yeah.
[433] I mean, that was the beginning.
[434] And then it became... I mean, you know this way more than I do, because I think on all platforms you have roughly 10 times the following that I do.
[435] No matter what you post or whatever, if you look at what the feedback is, it's extraordinarily toxic and horrible negative stuff that is only a distraction from what I'm trying to do.
[436] And most of it probably isn't, but I had Naval on yesterday, Naval Ravikant.
[437] And one of the things that he brought up that's so huge, it's so true, is that you can have 10 positive things
[438] and that one negative will outweigh the 10 positives.
[439] In your mind, you mean.
[440] In your mind.
[441] Especially if you're a person who's self-critical or self-objective.
[442] You're analyzing your behavior.
[443] Was that good?
[444] Was that bad?
[445] Right.
[446] And then you read that one bad comment: fuck, are they right?
[447] Yeah.
[448] You don't read all the people that say you're great.
[449] Oh, brilliant.
[450] Loved it.
[451] Yeah.
[452] Fuck you, loser.
[453] Oh, a loser?
[454] Absolutely.
[455] I mean, listen, when I announced I was going to be doing your show,
[456] if you look at what the comments were, almost all of them were, this is awesome, great,
[457] left-wing voice talking to Joe Rogan, go get him, David, this is such a great opportunity. Can't wait to watch you faceplant. And that's the one where I'm like, man, are they right? Like, am I going to faceplant? So, I don't know, I'm just trying to limit the amount that I'm on social media. One thing I am doing, though, because, I mean, my show is in part as successful as it is because of social media, so I can't ignore that. Right, but you don't have to engage. I don't have to engage. And I also can just say, I will check our networks in the morning.
[458] Then I'll spend the whole day.
[459] I'll do my show.
[460] I'll do what I need to do.
[461] And then before I sign off for the evening, I'll check it.
[462] And that's a terrible idea, because what if you read the worst shit right before you eat dinner or go to bed?
[463] Well, no, it'll be like 5 p.m. So you'll have five hours to cool off.
[464] Yeah.
[465] And it's way better.
[466] And weekends, I'm almost... I mean, people are like, David, you're still tweeting on the weekends.
[467] A lot of those are, like, pre-scheduled tweets, where I'll just sit and schedule some stuff.
[468] And I am trying to stay off it.
[469] And it's been really great.
[470] I mean, it's been a fantastic experience.
[471] We looked at our phones yesterday.
[472] We did a Sober October podcast with my four friends.
[473] We looked at our phones to look at phone usage, and I use mine four hours a day.
[474] I'm on my phone four hours a day.
[475] I'm like, fuck, that's a lot.
[476] What app did you use to measure it?
[477] It's something on your iPhone.
[478] Oh, okay.
[479] Yeah, you have an Android.
[480] I'm sure they have a similar thing.
[481] Yeah, yeah.
[482] Four hours of screen time.
[483] I'm like, ooh, that's not good.
[484] They're adding that to the computer, too, so it's going to combine.
[485] You'll know how much you're looking at all screens soon.
[486] Yeah, but what if you're writing?
[487] Well, is it going to count?
[488] That count?
[489] Yeah, sure.
[490] I mean, you're staring at the screen.
[491] That's how I work, bitch.
[492] Hey, that's the argument you guys were making for the phone yesterday.
[493] That's Bert's argument.
[494] Yeah, well, Bert's argument's not good because he doesn't even write.
[495] That's ridiculous.
[496] One thing I did that actually is useful is I used to have my social apps on the home screen.
[497] And Cal Newport and some others have said, you got to get rid of those.
[498] He actually advocates getting rid of the apps altogether, so that you have to go on a computer and choose to go to Facebook.com
[499] or Twitter.
[500] I haven't gotten there yet.
[501] But even just removing them from the home screen makes me significantly less likely to even pull them up.
[502] It's two clicks to swipe up and scroll over to the app.
[503] Right.
[504] But even just getting them off the home screen keeps me off of them significantly.
[505] That's smart.
[506] That makes sense.
[507] Yeah, I need a certain amount of access to those things with my business.
[508] Right.
[509] Scheduling shows and things along those lines.
[510] But yeah, it's not good for... Yeah.
[511] I mean, I think one of the biggest realizations is that people don't really miss you that much.
[512] If they don't hear from you for a couple days.
[513] Like, that's one of the things where the idea of needing constant engagement comes from sort of, like, a slightly narcissistic point of view, where, like, people are going to notice if I don't tweet from Thursday night until Monday morning, or do anything like that.
[514] Right, right, right, right.
[515] And they really don't.
[516] They don't.
[517] They don't.
[518] They don't.
[519] There's a lot of other people to pay attention to.
[520] Yes, there are.
[521] There are a few people that tweet at you that are kind
[522] of crazy and that want to hear from you all day long. Yeah, yeah. But they'll get used to it. They'll get used to you vanishing. And, I don't know... I mean, Cal Newport... have you had him on? No. Okay, he wrote Deep Work, and then more recently he wrote Digital Minimalism, and he goes into detail about just the effect of, uh... I was going to write it down on my phone, but it feels sacrilegious to pull that out as a note right now. Yeah. I mean, he goes into detail about this stuff, and just about, you know, we need more uninterrupted periods of concentration.
[523] What is the book called again?
[524] Deep Work.
[525] Deep Work.
[526] And then Digital Minimalism.
[527] And they're both, I interviewed him recently, really just solid, very solid stuff.
[528] Awesome.
[529] Yeah.
[530] What were we just about to get into?
[531] Oh, so we were into this woman, Amy Siskind.
[532] Did you reach out to her when she did that?
[533] Privately?
[534] Yeah.
[535] No. No?
[536] Did you reach out publicly?
[537] Well.
[538] You did publicly
[539] declare that she tried to get you fired, right?
[540] Yes, I did.
[541] Did she respond?
[542] I don't know because she blocked me. Oh, God damn it.
[543] She blocked you over that?
[544] Yeah.
[545] Jesus.
[546] That is so sensitive.
[547] What is Twitter for?
[548] Right.
[549] Is it just to fall in line?
[550] Is it just to agree with everything someone says with no questioning whatsoever?
[551] What's extra interesting about it is, she blocked me on Twitter, but then... I treat my Facebook profile basically as public, so I post stuff on there, and it's the same whether you're friends with me or not. And I had posted something totally innocuous about, I was at a restaurant, or drinking an espresso, I don't even know what it was. She showed up there and commented that she had called Boston College and told them not to, you know, not to hire me, or to fire me, or whatever. On a post about you having an espresso? It was just a personal post, right. But the point is, she determined that the exchange was worthy of blocking me on Twitter, but then she came to my personal Facebook page and said, I'm calling Boston College and telling them to fire you.
[552] There's a word for that.
[553] There might be a few.
[554] There's a gang of them, but there's a big one.
[555] There's a four-letter one.
[556] I just don't understand why someone would want to do that to someone.
[557] Why can't you disagree?
[558] Why can't you?
[559] And she's just upset that you pointed out a glaring problem with what she was saying.
[560] Yeah, apparently.
[561] And, you know, the thing is... the way I operate, I don't even necessarily think that she's a bad person.
[562] I just assume that she has, you know, some emotional thing going on.
[563] She could have had a terrible day.
[564] As far as I know, someone near and dear to her died that day.
[565] I mean...
[566] If someone near and dear to me died, I wouldn't go to your Facebook.
[567] You wouldn't necessarily be on Twitter.
[568] I wouldn't stalk your Facebook or post about you drinking an espresso.
[569] Fair, fair.
[570] Tell people I tried to get you fired.
[571] But my approach is, I just, I really do assume most people are pretty good people.
[572] And even when we have disagreements, I tend to give the benefit of the doubt that if we could only talk the way we're doing, we could figure out 95% of the disagreement.
[573] I agree with you.
[574] Maybe not all of it.
[575] Right.
[576] But most of it.
[577] So I don't begrudge her.
[578] I mean, yeah, I don't, she behaved in a way I wouldn't behave, but who knows what she had going on, you know?
[579] I mean, it's.
[580] Yeah.
[581] Well, it didn't get you fired,
[582] so it's not that bad.
[583] But if it did, that would have been horrible.
[584] It would have been a different situation.
[585] Would have been probably good publicity.
[586] Yeah, it probably would help you.
[587] I think so.
[588] You get excited about that.
[589] Yeah, you got a little twinkle in your eye.
[590] The reason... I'm thinking back, actually, to a conversation I had at the time, where someone said to me, if you do get fired, it's the best possible thing that'll happen.
[591] Right.
[592] It would just be fantastic. And it didn't happen.
[593] Because I wasn't actually employed there at the time.
[594] That's the irony of it.
[595] Well, this is the thing: the falling in line,
[596] the no room for deviation from the ideology.
[597] This is a giant issue that I have with both parties.
[598] And I think it's one of the reasons why people are in these parties to begin with.
[599] I don't necessarily think that people have clearly thought out every single aspect of whatever party they align with.
[600] I think they fall in line and they adopt a predetermined pattern of behavior that seems to be attractive at the time.
[601] And then they fall in line with whatever that party is saying.
[602] I think that is a giant percentage of people. When someone deviates from that, like you did, someone who is also clearly a progressive, and clearly a left-wing person, and you're criticizing something very, very politely, and she just goes haywire over that. That's the thing, right? It's like, are you responsible for the people who also comment on your post? And this is where we're getting to this, like, Vox thing that's happening with Stephen Crowder right now. Right.
[603] Are you responsible for the reaction to what you post?
[604] Because if you look at what Stephen Crowder said to that... if people don't know the story, Stephen Crowder got into it with this guy who is a writer for Vox, who is... he's gay, his Twitter handle is gaywonk.
[605] Carlos Maza.
[606] Yeah, so it's not that he's hiding, that he's gay, talks about it all the time.
[607] He's kind of effeminate.
[608] And Stephen Crowder mocked that
[611] in these videos where he was criticizing Carlos' position on Antifa, specifically, from what I saw.
[612] And in doing that, he called him this queer Mexican, this... like, and he's doing it in a ribbing way.
[613] He's doing it in a joking way.
[614] And then Carlos Maza posts all these horrible tweets that came his way, and apparently he got doxxed, so people got his phone number,
[615] and they were saying, debate Stephen Crowder. He was getting all these text messages and all this hateful stuff that was coming his way. So the question is, who is responsible for that hateful stuff? If Stephen Crowder calls him queer... is... what is... is queer okay? It's in LGBTQ. What do we do there? What do we do with the Q? Is it okay to call someone gay who identifies as gay? If he calls him the gay little Mexican, is that... is that bad? Like, how bad is that? Like, what is that? You know what I'm saying? So, do you feel what I'm saying here? It's like... I do. I know where you're getting at. Let's zoom out a little bit, right, and then we'll get into this. Man, where to even start with this, because there's a lot to unpack here, right? We'll analyze the specifics in a second, maybe, but first, if you look at the policy, the terms of service of YouTube... there's a Verge article from, yesterday... no, a few days ago, earlier this week, before YouTube had made the decision to demonetize Stephen Crowder.
[616] Well, they made the decision to not act.
[617] Initially.
[618] They just said that it didn't violate the terms of service.
[619] And then today, as I got in here, Jamie informed me that they made a decision to demonetize.
[620] That's right.
[621] So in the article where they made the decision not to act, they actually put what YouTube's terms of service are with regard to bullying and harassment.
[622] My reading of it... and we could go through them, if we could pull them up, we could go through the terms and conditions.
[623] That was my view as I looked at what it was that was done by Stephen Crowder and what the terms of service are.
[624] Just matching it up, not looking at the comments from either person.
[625] What was it specifically?
[626] It was specifically targeting an individual on the basis of sexual orientation.
[627] But he wasn't targeting him on the basis of it.
[628] He was mentioning that with his bad ideas.
[629] He was targeting his bad ideas in regards to Antifa.
[630] A lot of the...
[631] Dismissing Antifa... but if you look at Crowder's video, and I can't believe I spent so much time doing this, but I spent like a whole hour on this two days ago.
[632] Yeah.
[633] He was talking about how Carlos just dismisses Antifa as being not that big a deal and that there's bias in the media, whenever there's anything negative that happens.
[634] Sure.
[635] But if you look at the overall picture, and then Crowder goes on to talk about all the assaults, all the murders, that there were sexual assaults, there was rapes.
[636] There was all these things that happened with Antifa.
[637] He was talking about all these different people that
[638] got maced in the face, all these people that got hurt.
[639] And he's highlighting all of it, like, this is not something to easily dismiss.
[640] And that the FBI had labeled Antifa a terrorist organization.
[641] So so far it's just politics.
[642] It's just what does he think?
[643] What do I think?
[644] So far it's just that part of it.
[645] And along the way, he's like, yeah, but the queer little Latino says this.
[646] Yeah.
[647] And when he does that, that's where it's like, okay, what is he doing?
[648] Is he mocking?
[649] He's kind of mocking him.
[650] Right.
[651] And he's mocking him by saying he's queer, but he says he's queer,
[652] or he says he's gay. Yeah, but that's like saying... I mean, listen, just because the N-word is in rap songs doesn't mean that it's fine to go... Right, but the N-word is not, like... it's not in LGBTQ, you know what I'm saying? It's not, like, a part of their... I think the principle, though, is you're suggesting that because a certain word is sometimes used self-referentially by members of a group, that any use of it from the outside is by definition not problematic. And I'm just saying it's more complicated, and you've got to look at the specifics.
[653] It's certainly more complicated.
[654] You do have to look at specifics.
[655] I'm going from memory, but wasn't Stephen Crowder also wearing a shirt that said fags with the A with an asterisk?
[656] It said, wink wink, nudge nudge, it said socialism is for figs.
[657] Okay.
[658] While he's calling a gay guy.
[659] The A is a fig instead of an I. As he's calling a gay guy a queer Mexican.
[660] Yes.
[661] I mean, in total, it's not crazy.
[662] No, there's certainly an argument there, but I don't necessarily think the T-shirt is for Carlos Maza.
[663] I think that's a T -shirt that he just has because he thinks it's funny.
[664] And because Che Guevara, who is on the shirt... that is one of the weirdest things, that people worship that guy.
[665] He was a horrific human being, a mass murderer, a terrible sociopath, a psychopath, and because he looks good.
[666] Involved in the Cuban Revolution.
[667] Looks good with the beret on.
[668] You know, he became, for a long time... I mean, it's kind of died off, but he became like the woke poster boy. I'm from Argentina, I know. Are you really? Yeah. Were you born there? Yeah. No kidding. Yeah. So, I mean, listen... He's from my country, bro. I... I made it. Um, so listen, I think that I do appreciate what you're saying, and I agree with you to a certain extent. I believe that when YouTube yesterday said, we looked at the content in total and we don't think it violates our terms and conditions,
[669] I disagreed with them.
[670] I thought it very clearly violated their terms and conditions.
[671] Where I am thinking about it now is the application of those terms and conditions violations because a similar thing happened with Alex Jones as well, which was there's lots of way smaller players that are violating these same terms and conditions, but nobody knows about them.
[672] YouTube doesn't know about them.
[673] They don't get any attention because they have no audience. So I think there's the question of the application of these terms and conditions in a way that's sort of fair, and is not ultimately going by the public blowback or reaction to situations, because that's how Adpocalypse 1.0 happened. Again, I think it was a Coke ad appeared on an obviously racist video on a channel with like 800 or a thousand subscribers. The Wall Street Journal, I think it was, did an article saying, look at these screenshots of these advertisers on these crazy racist videos. That led to blowback, because YouTube didn't want to lose money.
[674] And ultimately, that's what this is about.
[675] I know that there are people who say YouTube has an inherently left-wing bias.
[676] Others say YouTube has a right-wing, whatever.
[677] YouTube's bias is towards corporatism and profit.
[678] That's fundamentally what it is.
[679] But as a company, they have a left-wing bias.
[680] I don't know that.
[681] In what sense?
[682] Well, in the sense that the woman who's the CEO of YouTube has talked about it pretty openly. Like the fact that she doesn't... what was it that she had gotten into... Oh, well, first of all, there was the James Damore thing. You know, she was talking about the Google memo, and she was talking about how it was incredibly damaging, the damaging, uh, damaging stereotypes against women, which it just wasn't. It's not accurate. Is Home Depot a right-wing company because the CEO supports Trump?
[683] That's a good question.
[684] I'm basing it on the fact that they're a part of Facebook, and Facebook is pretty clearly left-wing.
[685] Who's a part of Facebook?
[686] Google.
[687] Oh, sorry, Google.
[688] They're part of Google.
[689] I meant Google.
[690] Google is a very, very left-wing group, and it's all Silicon Valley, which is almost entirely left-wing biased.
[691] So I think we have to distinguish between the personal political biases of Silicon Valley entrepreneurs and the broader place that Google has in the world,
[692] in the sort of corporate sphere.
[693] Google is part of the group of huge multinational corporations that lobbies for particular tax policy to avoid paying taxes legally.
[694] That is not a particularly left -wing thing to do.
[695] Google is part of the large tech companies that, in order to avoid serious regulation of their businesses, have come up with this idea of regulating themselves.
[696] Okay, which I know is a topic, self-regulation, that's come up before on your program in a variety of ways.
[697] So those are not left -wing things.
[698] And if you want to make the case that as a company, it has a left-wing politic in the outward-facing world, you have to have something more than just a lot of their engineers live in Palo Alto and are hipsters who go to coffee shops.
[699] What do you think?
[700] I think that in terms of the place that it occupies within the economic, system we have, they are not very different from all of the large corporations that are pushing against regulation, pushing for ways to avoid taxes, period.
[701] So in terms of economic decisions?
[702] Yeah.
[703] I mean, listen, if we want to talk about how the personal politics of the employees translate to policy, we can do that, but we need to be able to make some specific claims about how it does.
[704] What I'm saying is, we know the way in which the structure that Google is a part of leads to it advocating for things that are center-right, corporatist, capitalist, the status quo of tax shelters and havens and not paying taxes, regulating ourselves, et cetera, et cetera.
[705] Well, that's what's interesting about this Crowder thing, is there ultimately the decision was to allow him to have his freedom to post videos on there.
[706] But the punitive aspect of it is they're going to reduce his ability, or eliminate his ability, to make money from it.
[707] Well, I should say reduce, right?
[708] Because he could, couldn't he put videos, put ads up in his video?
[709] Yeah.
[710] I mean, you can, sure.
[711] So like we, one thing that I do is we, we kind of split off the ad sales for my show into an ad agency.
[712] And we're doing ad sales, not just for my show, but for other shows as well.
[713] And those include ad placements that are not like the pre-roll ads on YouTube. It's the host actually talking about a product or whatever. It's a read, you know, a live read sort of thing. Unbox Therapy does a lot of that. Yeah, that's where I first saw those. He does some pretty extensive ones. Okay, so of course you can do that. I mean, yeah, there's nothing, so he can do that, but he can't just collect revenue like I assume he's been doing before. He has a significant number of followers. I think his YouTube subscribers are more than three and a half million. It's very high. Yeah. More than me. And they've just eliminated his income.
[714] Right.
[715] His income comes out of YouTube.
[716] And this was their decision based on his way of talking about Carlos Maza.
[717] That's what happened.
[718] Yeah.
[719] I mean, so what are the concerns to me?
[720] It's not that he didn't violate terms and conditions.
[721] Like I said, I think he pretty clearly did.
[722] The concerns to me are, is YouTube only going to even look into
[723] these circumstances or instances when there is a public outcry.
[724] The answer is probably yes, because why would they look into stuff nobody's paying attention to?
[725] It's not a good use of their time.
[726] Well, it seems like they changed their decision based on public outcry, based on Carlos Maza's reaction to their initial decision.
[727] I happen to think their initial decision was the wrong one, but I have a sort of broader concern here, which is about the fairness of the application and also the distinguishing between content that is promoting whatever falls under any of our definitions of hateful or whatever content and those who are fighting against it.
[728] So is it because he mentioned his sexual orientation and that he called him a lispy little queer or whatever he called him or a queer Mexican?
[729] And if he just called him a fucking idiot and he received the exact same amount of hate, would you still think that that was a good move?
[730] No, I mean, I think that it would not fall under what they are now claiming is the justification for the demonetization.
[731] It wouldn't be different, because it would have the same result.
[732] The only difference would be they wouldn't be attacking his sexual orientation specifically because of Crowder.
[733] That's the only difference.
[734] It's policing speech.
[735] You know, it really is.
[736] Here's the thing: who gets to decide, if not the private businesses, what their rules are?
[737] That's where the real question comes up, right?
[738] Tulsi Gabbard believes that it's a First Amendment issue, and she believes that everyone should have the freedom of expression.
[739] And that as long as you're not doing anything illegal, you're not putting anyone into danger by giving up their address or doxing them or something along those lines or making overt physical threats.
[740] Right.
[741] That you should be allowed to do that, because that's what freedom of speech is all about.
[742] And freedom of speech, when you eliminate social media in this country, your freedom is basically just yelling out in public. I mean, you're in this weird place as a culture. As a culture, it's unprecedented, really, in terms of the waters we're navigating right now. There's a couple different things, too. So I like the principle. Like, my principle is we do almost no moderation on any of the platforms that my program is on. My only thing that I tell my team is, if you see something that really seems to be illegal, it's calling for violence, whatever, we have a very, very high bar before we will remove anything. And quite frankly, we're just too busy. What do you mean by that? I'm just, I'm confused. Like, who's, not your videos, whose videos? Yeah. So if we find out that on our videos someone is posting endless comments. Oh, in the comments. Yeah, in the comments. My personal view is, if it's not illegal, I just let it all be there and sit.
[743] That's my personal view and that's a great principle to have.
[744] Well, we don't touch them.
[745] We leave them alone, even though we get accused of it.
[746] But the question is, YouTube at one point in time had thrown out there that they were going to make people responsible for the things that were in their comments.
[747] I vaguely remember that, but it didn't ultimately happen.
[748] I think they backed out of it very quickly when they realized that places like yours, which like your average video gets how many thousands of comments?
[749] A lot, and many of them anti-Semitic.
[750] And how would you, yeah, and how would you even be able to look at all those?
[751] I mean, you would have to have 24/7 moderators,
[752] because you've also got people that are watching your videos from overseas at all times of the night.
[753] Yep.
[754] So I think that the principle of only illegal content will be removed is great.
[755] That's my personal principle.
[756] However, I think that there is no serious case to be made that a private company can't say these are our terms of service.
[757] And if you want to, I mean, it's sort of almost a conservative principle, right?
[758] The idea that unless illegal things are going on,
[759] We are not going to tell a business how it is that it should be run.
[760] And that's where I think a lot of right-wingers start to stumble on this issue, because they're calling for a very invasive form of government regulation.
[761] They're calling for the government to step in and even break up these organizations because they've gotten too large.
[762] But you're hearing that from the left as well.
[763] Yeah.
[764] Well, I think there's a difference, though, between Elizabeth Warren saying we should separate the social platform Facebook from the ad sales revenue generating piece of it.
[765] That's one thing that falls under antitrust.
[766] That's different than saying the government should come in and it should tell anybody who runs a social network that you can't even, you can't do anything unless the content is illegal because there are financial considerations, right?
[767] I mean, there's lots of content that would not be illegal, but it would make a platform, a video platform like YouTube, not financially viable, because advertisers would see it and they'd say, oh, we're not going near that. Right. So I have a very hard time taking what is a very authoritarian perspective, that the government should come in and say this is how social networks should be run. Now, if you want to change the law, here's the way it could be done. If you want to change the law and argue that these platforms have gotten so big that they represent more of a town square, so to speak, then okay, maybe you could pass a law that changes how they would be regulated.
[768] But that's typically the type of stuff the right is against because it is more regulation.
[769] It is more regulation, but it's regulation to keep a private company from regulating against free speech.
[770] You see what I'm saying?
[771] It's a sneaky kind of regulation.
[772] It's a regulation that's enforcing the First Amendment and the people's ability to freely express themselves.
[773] If we're admitting or if we're agreeing that we are entering into this new world.
[774] Yes.
[775] Where this is, that's my position, it is a town square.
[776] And I feel like everybody should be able to communicate.
[777] The really unfortunate, unsavory aspect of it is when someone gets harassed, like Carlos Maza was because of this, where people are sending him all these homophobic tweets and all the, I mean, he's getting text messages and all this shit.
[778] That's, that's the unsavory and unfortunate aspect of it.
[779] How do you stop that?
[780] I don't know how you stop it.
[781] I don't know exactly how you stop it.
[782] But I think it would be useful for, I mean, one thing is, when does a platform get big enough in your mind that it would qualify for this, like, town square designation? Well, for sure YouTube. Let's talk about that one, because that's the one we're on. I mean, fucking, God damn, it's huge. Okay, it's gigantic. So are there other types of businesses through which communication happens that you think should be regulated in the same way? That's not really clear, so I'll give an example. Okay. If you start regularly sending people via UPS similar things to some of the content that exists on YouTube.
[783] And UPS says, we're getting reports that you're sending people harassing stuff.
[784] We don't want you as a customer anymore.
[785] Here's a question, though.
[786] Isn't there a difference between someone sending something to a physical address and someone sending something, let's say to you, when your social media apps are on the third page of your phone and you have to swipe all the way over to get them and open it up?
[787] and you have to find them if you want to read them.
[788] Well, you don't necessarily have to read them.
[789] There's a difference in a practical sense.
[790] But I guess the question is, would we want the government, would you similarly want the government to enforce this for telephone companies, if you are getting harassing texts and you report it?
[791] That's a different thing, I think.
[792] I think when it's coming to your phone and the phone is ringing, I think that's another step.
[793] That's another step towards invasive.
[794] It's a big gray area.
[795] Yes.
[796] Is the phone ringing?
[797] is it a phone call?
[798] Is it a text?
[799] Is it a WhatsApp message?
[800] You don't have to read that text.
[801] Right.
[802] Yeah.
[803] No, I feel you.
[804] I guess where I hesitate, and again, speaking of someone from the left who believes regulation of businesses is an important thing, I would want to be really sure about how exactly it is that the government would step in and mandate essentially that their view has to be listened to over the terms of service that a private company would wish to have.
[805] Yes.
[806] I feel like when you give people a gun, they start looking for targets.
[807] And that is a very common thing.
[808] If you give people the ability to censor, and if you give people the ability to censor based on their political ideology or based on what they feel is offensive where other people don't, it's a slippery slope.
[809] And I think that that can lead to all sorts.
[810] Look, that woman, what is her name again?
[811] The one who tried to get you fired.
[812] Amy Siskind.
[813] Imagine her being in charge of a social media platform.
[814] She tried to get you fired from Boston College for something that was incredibly polite.
[815] Right.
[816] That is what I'm talking about.
[817] It's that very action, that very same type of thinking that she tried to impose on you.
[818] That's what I'm worried about.
[819] And I'm worried about people that are really strictly trying to promote their ideologies and what they think is okay and not okay.
[820] And it's very slippery because there's a lot of weird people out there that believe a lot of weird things and want other people to conform to those weird things.
[821] And we sort of have to decide, that's why I'm bringing up this Crowder thing. Like, do I think that what he said was good? No. It's not nice to call someone a little lispy queer. It's not nice. It's kind of mean, you know, and especially when that guy wasn't even engaging with him. But he's making fun of him. He's a comedy show. He's mocking them. So the question becomes, like, when is that mocking considered homophobic and when is it just ribbing? Right. And that's his position. His position is that it's just ribbing. This is the problem with a discussion that is only about the principles.
[822] So, like, a lot of our conversation for the last 15 minutes has been, what is our principle about what types of business regulation is okay for the government to do and is not okay?
[823] Right.
[824] Or, when we talk about free speech, what should, do we have a principle of anything short of illegal content versus something that is more strict?
[825] The reality is that there is more of a gray area.
[826] Yeah, we're trying to sort of regulate the way people communicate with each other.
[827] So it's not, it's like, if someone said that to someone in a bar, a cop would not arrest them.
[828] Like, yeah, you lispy little queer, you know, that would be like, oh, that guy's an asshole.
[829] But the bar would be perfectly within their legal right to say, we don't, we don't want you in here.
[830] You're making our customers uncomfortable.
[831] And nobody would say that it would be against the law for the bar to say, you got to go.
[832] That's a good point.
[833] If they were doing it to their face.
[834] But what if he was in a corner talking about this guy that wasn't there?
[835] And he was saying, yeah, so he's talking about Antifa.
[836] This lispy little Mexican queer. Would you come along and decide to kick the guy out of the bar then?
[837] It would, I mean, listen, at some bars, if you go into the corner and you yell about a lispy Mexican queer, they're going to ask you to leave.
[838] And it still would not be illegal, and the bar would still not be doing anything wrong.
[839] Right.
[840] But that's a bar, right?
[841] That's a private business where people are physically there.
[842] Yes.
[843] Isn't there a difference between that and something like YouTube, which again falls more in line with like a town square?
[844] Maybe that's what we need to revisit.
[845] Because so much human communication is now happening across these platforms.
[846] I would imagine most of it.
[847] Or most of it.
[848] We need to maybe stop drawing this arbitrary distinction that in person is a completely different thing than over the Internet.
[849] I mean, maybe, increasingly, it's not.
[850] Maybe it's more the same.
[851] Yeah.
[852] So I'm torn here, right?
[853] On one side, I say, well, it seems like they still allow him to have his freedom of expression because he's still on YouTube.
[854] he still is able to upload his show on YouTube, he will have to find other ways to make money.
[855] Sure.
[856] So one part of me looks at it that way.
[857] And no one has a right to monetize on YouTube.
[858] Right, right.
[859] So in a sense, they haven't violated his First Amendment rights because he's still able to express himself.
[860] But then you go, as a company, they've made a punitive decision to eliminate his ability, or radically reduce his ability, to make an income off of their platform.
[861] That seems, like, and I'm not supporting that they did it, but that seems more reasonable as a decision, right? To say, we're going to demonetize you, that seems more reasonable. But the problem is, there's no alternatives. There's nothing remotely like YouTube. There's no alternative to YouTube for him to regain that same level of monetization, or for people that share his viewpoint and share his ideas and share his positions. There's no right-wing YouTube, is my point.
[862] And there's... I would challenge the idea that YouTube is left-wing.
[863] I mean, in terms of its enforcing its policies.
[864] How so?
[865] I mean, they have - Just this, just this particular issue.
[866] But this isn't a left-wing, how is this a left-wing enforcement?
[867] I mean, they have a -oh.
[868] Well, I think it is, because Carlos Maza is progressive and because the argument that he was making is a very left-wing progressive argument, and this is what Crowder was going after.
[869] He was going after the argument, and in the process of going after the argument, he mocked his sexuality and his appearance. I can assure you, if it was focused merely on how much of a problem Antifa is, this would not have happened. I mean, I think we both agree to that, right? Yeah, yeah, yeah. No, it's all about mocking the guy's sexual orientation and looks. That's where, if Carlos Maza were a gay Republican and the exact same thing happened, do you think the outcome would have been different? Yes. Why? I just don't think people would be interested. Oh, well. But so that gets to the real crux of it, which is, my real concern with this is YouTube only getting involved in even publicly saying what they're doing about a channel when it becomes very public and it starts to have the possibility of impacting their bottom line, and brands saying, this is too hot and we're getting out.
[870] Well, in that sense, what Carlos did once it was revealed that YouTube was not going to take action was very effective.
[871] Absolutely.
[872] I mean, he started tweeting like crazy and people just
[873] jumped on board.
[874] He connected it to the LGBT movement and then it became this thing.
[875] I mean, the other side of this is, I mean, I don't know if we even want to go into identity politics, so to speak, but there has, I've read some comments on some of the few articles that have been written about this that are saying that this is effectively YouTube enforcing a defense of identity politics, so to speak.
[876] And I think that that's just, again, opening up the door to the incredibly broad application of that term identity politics.
[877] I don't even really fully understand that, and I don't even know if that's a path we want to go down to talk about like the identity politics component of what's going on with a lot of this regulation.
[878] Define what you mean by the identity politics component of it.
[879] I mean, listen, so I guess in order to define it, it would be good to point out that I have been critical of, quote, identity politics
[880] on the left in a very limited way that I think is actually damaging, while at the same time recognizing that identity is a really important thing to consider when we think about sort of how the world should be organized.
[881] So like for your audience who may not know, when identity politics is used, you know, like a knife to enforce that because of someone's identity, their opinion supersedes and is the opinion that is the valid one over everybody else because of membership in some kind of group, I'm against that.
[882] I think that's extremely destructive.
[883] It would be very incorrect to believe, though, that identity doesn't play a role and that we shouldn't understand how one's identity might make us think differently about certain issues.
[884] I mean, any example would make that pretty clear.
[885] You know, I as an immigrant to the United States, do I get some privileged position to decide what policy should be?
[886] over all native-born Americans because I immigrated here?
[887] No, that would be me using identity politics as like a mallet or a cudgel or whatever.
[888] But as someone who did immigrate here, we should recognize that I may have things to say about it which would be valuable and worthy and important to sort of think about.
[889] That's my view on identity politics.
[890] But you're just not interested in the hierarchy of oppressed people.
[891] I'm not interested in the Oppression Olympics.
[892] And I'm not interested in using identity to silence ideas that could be perfectly good.
[893] coming from someone who is not a member or checking a certain box.
[894] Exactly, nor am I. I strongly believe in the individual, and I think it's one of the most important parts of a collective group of human beings like a country.
[895] We recognize that we're all different, and there's a lot of weirdness amongst us, but we're individuals.
[896] And I like to treat people based on who they are, not what classification they fall under.
[897] Now, do you think that that bad version of identity politics that I mentioned is a big problem on the left, or not a big problem.
[898] I'm curious.
[899] I think it's certainly a problem, but I think it's a vocal minority problem.
[900] That's what I think.
[901] I think if you take just regular people that are on the left, that are working jobs and having families and doing their hobbies, and they just have left-wing ideas, I don't think the vast majority of them hold those positions.
[902] I think those positions are things that people use as revenue.
[903] I mean, not as revenue, but it's like they get points from it.
[904] You know, they get points from certain types of behavior that they support, certain types of thinking that they support.
[905] And it lets you, you know, you've got woke social justice points.
[906] Well, then we agree.
[907] Yeah, I mean, I've asked because I genuinely didn't know.
[908] I mean, I've heard you talk about identity politics.
[909] It's a dangerous number, though, in terms of college campuses, when you look at, like, what happened at Evergreen State, with Bret Weinstein.
[910] It's very disruptive.
[911] Yes and no. I mean, I do think that it's disproportionately a problem like you're saying.
[912] I think a lot of the problem exists in the college campus setting.
[913] But, I mean, even at Boston College, you know, I had sort of maybe been incorrectly indoctrinated into the idea that this was really a problem everywhere on college campuses.
[914] And I had an incident, the details of which wouldn't be appropriate to talk about, but with a student when I taught at Boston College, that because of the circumstances and the identities involved, I was ready for it to go into, this is going to be resolved the wrong way on the basis of the toxic identity politics I'm hearing exists on college campuses. And it was not. It was the exact opposite. So I think, the same way that when you look at Yelp reviews, people who had a bad experience are way more likely to go and write about it. Yes. These individual stories get way more attention than the percentage of the problem that they represent. I believe you're probably correct about that, but when you see videos like Nicholas Christakis getting just shouted down at Yale by a group of students, with the support of the students, and that kind of shit, you say, well, it is real.
[915] It does exist.
[916] It's real.
[917] It exists.
[918] I think that sensible people on the left like me call it out.
[919] But I want to be careful.
[920] Imagine that you had someone from Cato on the show, which is sort of like a traditional conservative think tank, or American Enterprise Institute.
[921] Maybe it's like a better example.
[923] And a lot of the conversation was about getting them to talk about or denounce the alt-right, for example.
[924] I'm sure they would do it.
[925] But how much should AEI denounce the alt-right when that's like a different thing that's not... That's a very good example.
[926] Yeah, it's a very good analogy.
[927] Yeah, I think we oftentimes are responding to this very vocal minority.
[928] Yes.
[929] And those are the people that are most invested in getting these ideas pushed through.
[930] And there are, it's also people that, you know, for lack of a better term, they're probably mentally ill. And I don't mean mentally ill in terms of, like, having legitimate diseases, but in terms of their thought patterns, they're probably obsessive. I mean, I've had friends that were, especially friends that were heavily involved in this kind of stuff before, and it was very damaging to their mental health. This type of stuff being politics, being woke, left-wing, shout at people, attack people politics. Okay. But, I mean, they realized somewhere along the line, and then one of them, my friend Jamie Kilstein, they turned on him, and then, you know, devastated his life. And he realized along the way, like, oh, Jesus Christ, what was I doing? Like, I was checking my Twitter every five seconds and insulting people left and right and attacking people just to get everybody to say, yeah, go get them, and, you know, showing everybody how woke I am and how progressive I am. And it becomes a weird sort of a point system, like you're trying to score points, you're trying to gain favor with your party. There's a lot of that. I think it's really important, though. So there's people on the left and right who get pulled into political wokeness, whether it's, I'm now Tea Party, in 2010 people got sucked into the Tea Party on the right, Antifa, whatever. These are all groups with different sorts of followings. They're not all the same, whatever. Yeah. I do think that there is a difference between getting extremely passionate about the idea that everybody should have access to just basic health care, and getting extremely passionate about the idea that we need to go out of our way to shut down every abortion clinic in the country.
[931] I think that there's just, there's a difference.
[932] And so I don't want to participate in a false equivalency between, well, you got very far left and very far right people, and they're the same.
[933] And you've got center left and center right, and they're the same.
[934] It's just two sides of the same coin.
[935] Like, obviously, I have a perspective that is based on my politics, I'm glad to debate any of these issues with anybody who wants to on the merits, but I don't want to make the false equivalency.
[936] I mean, listen, when you look at Anti-Defamation League numbers, for example, the vast majority of hate incidents in the United States are coming from the right.
[937] We could talk about other ways that the left is active.
[938] We could talk about what it means or how things should be categorized, but that's the reality. And so I want to make sure I don't play a false equivalency game. My audience would crush me if I did that, number one, but I think it's just wrong. I think it's wrong to do that. I don't think the facts bear it out. I think you're right there. And I also think that these false equivalency kind of conversations are ridiculous, because each individual conversation about each individual issue deserves its own discussion. Yes. And to say, what about this or what about that, those whataboutisms, those are the death of any real rational discussion, because they go on forever. They go on forever. There's no, I mean, it's like, this is why scrolling Twitter endlessly is a problem, because there's really no end. You could always scroll a little more to get to the end, right? Yeah. I mean, the new tweets are coming fast. It's the same with a lot of those conversations. I wonder if anyone has ever done that, just scrolled until their phone died, charged it, woken up in the morning, and just scrolled all day. How long does it take? I think you wouldn't, because the new content appears faster, right? Because of the algorithm.
[939] But you would still never run out.
[940] You just keep going.
[941] No. There's a few things.
[942] I don't know if this would be interesting to go into, but there's a few things that I've found have been somewhat successful in conversations with people who really disagree with me. And at least like lowering the temperature a little bit and getting people to maybe engage in a good faith way.
[943] One of them is how do you think I came to my position?
[944] So you might be for
[945] total free-market, for-profit health care.
[946] I am for a system where the government is more involved and even if you can't pay, you get care.
[947] Before we even start, if I say, how do you think I arrived at my position?
[948] Yeah.
[949] That has been pretty useful.
[950] Another example is, I think this came from Peter Boghossian, who I think you've had on.
[951] Yes.
[952] The defeasibility question, which is, what evidence if I presented it to you would bring you over to my side?
[953] I'm not saying I have that evidence or that it exists, but give me a framework as to what is keeping you from seeing this my way.
[954] Because sometimes that exists, the person just doesn't know about it.
[955] Those are two tools that I have found super useful in trying to make some headway with people who are hyperpartisan and very escalated with a lot of these issues.
[956] Yeah, that makes sense.
[957] Yeah, it's very difficult to have good faith conversations with people when you disagree with them, you know, and you have to have discipline and you have to have some sort of a sense of self.
[958] Yeah.
[959] And you have to know how to be calm and kind.
[960] You know, the descent into insults and dunking on people is one of the reasons why, at the beginning of the conversation, I was saying that one of the things I enjoy about your YouTube videos is you're a very reasonable, rational person, and you don't get crazy and animated and insulting.
[961] And I think we need more of that.
[962] Because I think even though you're not going to convert some people, there's just a certain section of the population that disagrees with you that's just going to.
[963] Right.
[964] But there's a significant number that are going to go, hey, this David Pakman guy, he's reasonable.
[965] He's making a lot of sense.
[966] He's intelligent and articulate.
[967] Well, my goal is, and I think that it's sort of working in that we have a lot of Trump supporters who are paid subscribers to my show.
[968] A lot.
[969] We have some.
[970] Are they taking notes?
[971] Maybe they are.
[972] Right to the boss?
[973] But my goal is, my goal is, yeah, that would, I mean, that would be an interesting day if I wake up and Trump has responded to one of my videos about him.
[974] What would you do?
[975] I think it'd be a good day.
[976] Well, that's what Colbert did.
[977] How funny was that when Colbert was on TV, he's like, Donald?
[978] Yeah.
[979] How did you not know that you shouldn't respond to me?
[980] Right.
[981] Yeah, yeah, yeah.
[982] That's rule number one.
[983] It's an important one unless you want to create a shitstorm of a very certain kind.
[984] My goal is I don't pretend to be neutral.
[985] I think neutrality is almost always false because on most issues, people are not indifferent.
[986] I mean, neutral is another way of saying indifferent.
[987] You could be conflicted and neutral.
[988] You could be conflicted and neutral, but I try to at least be objective and transparent in how I arrived at what I believe.
[989] So you can disagree with my conclusion.
[990] You can even come to me and tell me the facts I've used to reach the conclusion are incomplete or wrong, but I'm completely genuine in how I arrive there.
[991] And I think that that is why we have some, I mean, yeah, there's obviously, if you look at YouTube comments, there are right-wingers that watch my show. But choosing to support it financially is a different thing, and I get emails from conservatives who say, I don't agree with your conclusions, but I do find that you're at least reasoning through the issues in a way that resonates with me, and I want to support the fact that you're doing that.
[992] That's outstanding. That's a huge victory. There really is.
[993] In a sense, in my eyes, in this day and age, I think this is the most polarized time I can remember, as a 51-year-old man, looking back at, you know, my history of paying attention to social issues and the way we communicate with each other, and just the partisan attitudes that people seem to have.
[994] I don't, I mean, I think it's probably because of Trump.
[995] That's a giant part of it.
[996] But it's also just a sign of the times of social media.
[997] I think it's in part engineered by the algorithms that Facebook and Twitter and all these other social media companies utilize.
[998] And it's also been engineered by bad faith participants and people that are actually manipulating it.
[1000] I don't know if you've paid any attention, Sam Harris had a fantastic podcast with, and we had one with her as well, Renée DiResta.
[1001] Renée DiResta analyzed all of the various accounts that the IRA, the Internet Research Agency, had created, the organization that was responsible for all of these fake accounts that people thought were Black Lives Matter accounts or pro-southern-secession accounts, all these different accounts that were very polarizing and arguing with other people, when these were just Russians working for this organization that was specifically trying to start chaos.
[1002] They were specifically trying to start arguments.
[1003] And when you see that and then, I mean, that's a factor.
[1004] That's a giant factor. That kind of shit is a factor, and that has sort of become part of the sport of social media, the arguing. Absolutely. I don't do it. I don't engage. But I do go on Facebook sometimes, and someone makes an abortion post, and I just watch the chaos, like, oh my God. Yeah. Or anything having anything to do with Trump, or anything having anything to do with the Second Amendment, or anything that has anything to do with the wall or immigration. So I don't know that people are actually in larger disagreement than they were previously.
[1005] I think that, yes, Trump has coarsened the language and the way in which it's now acceptable to talk about a lot of these things.
[1006] That's number one.
[1007] I think the social media algorithms, like you're pointing out, reward the most extreme and polarizing comments and reactions in a never-ending feedback loop, where the most polarizing initial tweet generates more responses than less polarizing tweets.
[1008] And then the sub-responses that are most polarizing
[1009] and aggressive do the exact same thing, in this never-ending feedback loop. I think it's all those things, but I don't know that people are having bigger disagreements than in times past. I just think that they're public in a different way. Well, there's more disagreements because people have more opportunity to disagree, so they have more opportunity to engage, particularly when you're talking about people that are addicted to their phones. And this is coming from a guy who uses his fucking phone four hours a day, right? I'd like to think that one hour of that is productive, but I know that three hours of it is me staring at butts on Instagram.
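What that loop looks like mechanically can be sketched in a few lines of code. This is a purely illustrative toy model of an engagement-ranked feed, my own construction and not any platform's actual ranking system; the engagement function, the reply counts, and all the numbers are assumptions made up for the illustration.

```python
import random

# Toy model of the feedback loop described above: if engagement scales with
# how polarizing a post is, and the feed ranks posts by engagement, then the
# most polarizing posts collect the most attention, which raises their rank,
# which collects more attention, and so on.

def engagement(polarization: float, replies: int) -> float:
    # Assumption: engagement grows with polarization and with reply count.
    return polarization * (1 + replies)

# Ten posts with random "polarization" scores and no replies yet.
posts = [{"polarization": random.random(), "replies": 0} for _ in range(10)]

for _ in range(5):
    # Rank the feed by current engagement, most engaging first.
    posts.sort(key=lambda p: engagement(p["polarization"], p["replies"]),
               reverse=True)
    # Attention concentrates at the top of the feed: higher-ranked posts
    # attract more new replies each round, feeding the loop.
    for rank, post in enumerate(posts):
        post["replies"] += max(0, 5 - rank)

posts.sort(key=lambda p: engagement(p["polarization"], p["replies"]),
           reverse=True)
print("Most-amplified post polarization:", round(posts[0]["polarization"], 2))
```

Run repeatedly, the toy feed converges on whichever posts started out most polarizing, which is the amplification dynamic being described in the conversation.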
[1010] Looking at muscle cars and watching crazy videos.
[1011] And then how much are you on like a computer?
[1012] I don't know.
[1013] I didn't, I don't have that data.
[1014] Right.
[1015] But it's not as much.
[1016] And the good thing about it is most of my bullshit I'm doing on the phone.
[1017] Most of my computer use is work, unless I'm laying in bed, and then, embarrassingly enough, I watch YouTube videos on pool.
[1018] That's what I watch before I go to bed.
[1019] Yeah.
[1020] When I play pool.
[1021] Yeah.
[1022] So I watch like professional pool matches before I go to bed because it's constantly.
[1023] Yeah.
[1024] It's relaxing and I analyze positions.
[1025] That's interesting.
[1026] I do the same thing with chess.
[1027] Ah, there's.
[1028] There's like chess streamers that I watch and it's like similar.
[1029] Yeah.
[1030] It's cool.
[1031] You could kick back and sort of it's you're engaged, but it's nothing crazy and it's also kind of stimulating in an intellectual way.
[1032] Right.
[1033] Yeah.
[1034] And it's different than politics.
[1035] Yes.
[1036] Yes.
[1037] Yes.
[1038] Like on weekends when people, you know, like my mom will, you know, want to talk to me about politics.
[1039] Oh.
[1040] And I'm just like, I did this all week, Mom.
[1041] Does your mom watch your show?
[1042] She does and she watches other shows and my family's super political.
[1043] Oh, really?
[1044] Yeah, so on the weekend, if it's Saturday, I'm right in the middle of my break period.
[1045] Are they left wing?
[1046] Oh, yeah.
[1047] Thank God.
[1048] I mean, can you imagine?
[1049] Could you imagine?
[1050] Some mean dad calling you up.
[1051] What the fuck is wrong with you, David?
[1052] I might have had to deFOO.
[1053] Ah, deFOO.
[1054] Yeah.
[1055] Anyway.
[1056] Anyway, indeed.
[1057] I think that there's more opportunity, as we're saying, to disagree with people, more opportunity to argue.
[1058] And in those more opportunities, you're seeing more conflict and I think more polarization.
[1059] And I think, again, the social media algorithms and all the other nonsense that gets in there, I really do believe that's the feeling that I get. But it also might be because a big part of my job is being on the internet.
[1060] So maybe I'm more engaged with it.
[1061] It might skew my view a little bit.
[1062] But I think, so in practice, let's imagine that the disagreements are equal to what they've always been, but there's more opportunities to disagree, and the algorithm favors more escalated disagreement than rational conversation.
[1063] The effect is that you might meet someone with whom you have 80% in common in terms of your political views, but the circumstances in which you engage with that person are going to be on the 20% that you don't.
[1064] So it makes it seem as though you just have very little common ground with anybody.
[1065] Right.
[1066] Because the 80% agreement becomes background.
[1067] and the social media platforms, the debates happening on YouTube and elsewhere, are focused only on, like, the most divisive fraction of one's entire political views.
[1068] And that's, I think, what the problem is.
[1069] But it makes sense because most people agree that, I don't know, gas stations, I mean, just to pick something innocuous, most people agree that it's good to have a regulatory system that makes sure that when you think you've pumped five gallons of gas, you've gotten five gallons of gas.
[1070] Yes.
[1071] It's so uncontroversial that nobody's going to talk about it.
[1072] Like, it makes sense that the focus is going to be on the disagreements.
[1073] Where it's damaging is then when you meet people in real life and it's hard to relate or even be in the same room because only those differences are sort of like played up or relevant.
[1074] Yes, yes, yes, yes.
[1075] Yeah.
[1076] Yeah, that conflict gets highlighted.
[1077] You have conflict bias.
[1078] Yeah.
[1079] I don't know where I see this going.
[1080] That's one of the more interesting things about, particularly with social media and like things when you come to this Crowder situation.
[1081] I don't know where this is going because I didn't know this was ever going to be a thing.
[1082] I had never really considered that there was going to be some digital town square that we're all going to be enjoying, whether it's Twitter or YouTube or whatever it is.
[1083] That might even need regulation.
[1084] Yeah, that might even need regulation.
[1085] But getting back to the Crowder thing, the issue that you, so you agree with it in the sense that he was mocking this person's sexual orientation and appearance.
[1086] I agree with the assertion that YouTube's terms of service as written were violated on the basis that he was singling out an individual, and the characteristics that that individual was
[1087] being targeted with or spoken about were sexual orientation in terms that the terms of service say is not allowed.
[1088] Is that different, in your opinion, than someone singling someone out for what you believe is their mental incompetency?
[1089] Well, mental incompetence.
[1090] Do you mean that they're ignorant or that they're mentally ill or cognitively limited?
[1091] Cognitively limited.
[1092] mocking their ability to think, mocking their intelligence, mocking their decisions, mocking their, the way they talk, and then encouraging other people to do the same thing.
[1093] And then that person gets harassed based on their intelligence, based on their performance on particular YouTube videos and conversations, and there's active harassers.
[1094] There's people that do that.
[1095] Is there a difference between, say, what Sam Seder does to Dave Rubin?
[1096] What does Sam do to Dave Rubin?
[1097] He has.
[1098] I don't know that I've seen that video.
[1099] dozens of videos.
[1100] Don't say "that video."
[1101] He has dozens of videos where he's just dunking on Dave Rubin.
[1102] So, I mean, I have some as well.
[1103] I believe that they are substantive.
[1104] That's my, my view is that my videos about Dave Rubin are substantive.
[1105] I don't really watch any left-wing stuff, because I want to try to isolate myself enough to make sure that what I'm saying are my ideas and that I'm not taking them.
[1106] So Sam's a friend of mine.
[1107] Sounds like a comic.
[1108] Oh, that's interesting.
[1109] Comics do that.
[1110] I don't know, but if there's some specific examples, we could comment about them. But, um, I think that, to your first question, there is a difference between going after someone for sexual orientation, right, versus going after them for the fact that they say things that are wrong or don't know stuff. Now, if you're making fun of someone who has an actual handicap of some kind, some kind of, you know, cognitive limitation that would be a disability of some kind, then you are mocking someone for a disability. But the resulting effect of the harassment.
[1111] See, this is what I was getting at before with Crowder.
[1112] Like, what Crowder said was one thing, but one of the things that Carlos Maza was discussing was what the people that had watched Crowder, what they were doing, how they were going after him.
[1113] See, that, that is a real discussion.
[1114] Like, what happens when you say something about someone and then your fans agree and then they take action, which?
[1115] I didn't see, in the
[1116] Steven Crowder decision, that the reaction was part of YouTube's evaluation.
[1117] Now, I may just have missed that, but I didn't see YouTube say that part of the calculation had to do with what other people were doing.
[1118] I don't think they did say that.
[1119] I don't think they would.
[1120] But I think Carlos Maza did say that.
[1121] It was one of the things that he was talking about, this endless assault that he's experienced.
[1122] Well, he's right to call it deplorable.
[1123] I think we would agree with that.
[1124] I think your question is more about who's responsible for it, right? Who's responsible for, yeah, these anonymous people that can just lash out at someone and insult them out of nowhere. Ultimately, they are responsible. Those people are. Those people are responsible. However, um, so there's this term, stochastic terrorism. I don't know if you're familiar with it. No, I'm not. Stochastic terrorism is the idea that if you have a big enough audience, and you go and every day you're talking about, someone should really do something about a particular politician.
[1125] You're doing it every day.
[1126] You're doing it every day.
[1127] At a certain point, given a large enough audience and enough repetition of that and the fact that there's like a distribution of people's emotional states, cognitive capacity, et cetera, it is statistically probable that someone from that audience is going to go and try to do something about whoever it is that you're targeting.
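The statistical intuition behind that claim can be written down directly. As a hedged sketch, assume each audience member independently has some small probability $p$ of acting on the rhetoric; then with an audience of $N$ people:

$$P(\text{at least one person acts}) = 1 - (1 - p)^{N}$$

Plugging in purely illustrative numbers, say $p = 10^{-6}$ and $N = 3{,}500{,}000$ (roughly the subscriber count mentioned earlier in the conversation), gives $1 - (1 - 10^{-6})^{3.5 \times 10^{6}} \approx 1 - e^{-3.5} \approx 0.97$, near certainty. That is the sense in which enough repetition to a large enough audience makes some reaction statistically probable; the specific numbers here are assumptions for illustration, not data from the conversation.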
[1128] That individual who has the show and is hammering on this person, and day after day after day, they're not going to be legally responsible for that person from their audience who went and did something.
[1129] There's no way that you're going to hold them legally responsible under the current legal system that we have.
[1130] But you could argue that it is irresponsible in some way not to understand that your actions have consequences.
[1131] Of course, the person who goes and does the violent act is the primary person who is responsible.
[1132] Right.
[1133] But as long as you're not calling out for that act, how do we make this distinction that someone is encouraging that act, or someone is at least inspiring that act? They're judgment calls. I mean, listen, I can go on my show and I can speak in vague terminology or specific terminology. You know, imagine that there's a local business that I don't like. I could go on my show and I could say, this business did this, and I need everybody in my audience to show up there and to make it impossible to get in and patronize that business. That's very clearly on one side of the gray area.
[1134] I could instead say, you know, there's a business, and I could say the type of business, but not name it.
[1135] If it's a small enough town, people would know exactly what business I'm talking about.
[1136] And I really don't like the way I was treated there.
[1137] And if only there was some way that someone could do something about it, the effect could be the exact same one.
[1138] Right, right, right.
[1139] I don't know how you measure when it's on one side.
[1140] Yeah, it's like, right, you could somehow or another remove your, you could somehow or another make it so that it's, yeah, I'm agreeing with you. Yeah, you could remove your responsibility for the action in some sort of, whoa, this just got interesting. I was just trying to find a tweet from Maza about him sending, or asking people to flag Crowder's videos. Did you get the one where he asked people to go assault people with milkshakes and humiliate them at every turn? YouTube tweeted an hour ago, or, yeah, 12:30, that, to clarify, this is responding to Carlos Maza, to clarify, in order to reinstate monetization on his channel, he will need to remove the link to his T-shirts.
[1141] Oh, so it's the figs T-shirt.
[1142] Yeah.
[1143] Oh.
[1144] Well, that's all he has to do?
[1145] He specifically asked about that, and then they responded.
[1146] That's all he has to do.
[1147] Wow.
[1148] That's pretty easy.
[1149] That's pretty straightforward.
[1150] That's pretty straightforward.
[1151] Yeah, this shirt's stupid.
[1152] But, you know, he's a comedian.
[1153] I mean, that's what Crowder's doing.
[1154] And he's doing the thing about Maza, he's mocking him for his appearance,
[1155] but Carlos Maza specifically encouraged people to throw milkshakes at people that disagree with him and to harass them publicly and humiliate them.
[1156] So one thing doesn't justify the other.
[1157] No, it doesn't, but that is more egregious.
[1158] How far you - Asking people to assault people and asking people to physically humiliate people in person, in my opinion, is more egregious.
[1159] I don't agree with mocking his
[1160] physical appearance.
[1161] That's just what it is.
[1162] You know, but the sexual orientation aspect of it, it's like, yeah, I get it.
[1163] I get it.
[1164] It's not nice.
[1165] How far do you think this "he's just a comedian" thing goes?
[1166] Because I hear that a lot in just excusing things that are said.
[1167] Well, he's trying to do comedy, right?
[1168] So he's trying to make fun.
[1169] Is he?
[1170] Well, hold on, but make comedy and making fun of someone are two different things.
[1171] Like, I don't do comedy, but I will sometimes make fun of things people say.
[1172] Right, but he's doing it to be funny.
[1173] He's making fun of things specifically to be funny.
[1174] And sometimes, you know, when you do that, you go too far.
[1175] You cross lines.
[1176] I genuinely did not realize that Crowder does a comedy show.
[1177] Oh, yeah.
[1178] Yeah.
[1179] His show is a comedy show.
[1180] Wow.
[1181] Yeah, a lot of it is funny.
[1182] He does some funny shit.
[1183] He really does.
[1184] Whether you agree with him or disagree with him.
[1185] He's done some hilarious bits.
[1186] I genuinely, I'm reacting in real time because I had no idea.
[1187] He has this bit he does about this French socialist.
[1188] He puts on a wig and pretends to be this different person.
[1189] He's pretended to be a transgender person.
[1190] He's pretended to, he's done a bunch of these infiltration videos where he'll go into these ridiculous organizations and ask them questions.
[1191] But it's very much a comedy show.
[1192] Dressing up like a trans person's funny?
[1193] It is, if you're funny at it, if you're good.
[1194] I mean, Mrs. Doubtfire, wasn't that funny?
[1195] And that's what he was doing.
[1196] He was dressing up as a woman.
[1197] He was dressing up as a woman.
[1198] Right.
[1199] He was dressing up as a woman.
[1200] And that's a great movie.
[1201] I agree with you there.
[1202] Look, how many times on In Living Color did they dress up like women?
[1203] There's some humor to someone who is a man who's dressing up like a woman.
[1204] Sure.
[1205] That can be, and I shouldn't comment specifically on Crowder doing it if I haven't seen it.
[1206] No, he's got some funny shit.
[1207] He does.
[1208] You know, and I'll take shit for that, for saying that.
[1209] He's funny.
[1210] Makes me laugh.
[1211] That's why I'm staying quiet.
[1212] Yeah, I know.
[1213] I understand.
[1214] I get it.
[1215] I don't agree with him, you
[1216] know, constantly going on and on about this guy being queer, or calling him a lispy little queer, but he's doing it to try to be funny.
[1217] So the question is, when can you do that to be funny?
[1218] And apparently with YouTube, you can do that and be funny.
[1219] As long as you remove the T-shirt.
[1220] Yeah.
[1221] Which is interesting.
[1222] That's actually even, that's even weirder now.
[1223] It's weird.
[1224] That they just, if he removes a T-shirt.
[1225] A link to the T-shirt.
[1226] Oh, he can still sell it.
[1227] He just can't link to it from YouTube.
[1228] I think that's what they're saying.
[1229] That's so minor that it's hard to, I mean, it's mind-blowing.
[1230] Because it's sort of encouraging people to buy it, and then YouTube would say, well, if you have an ad on that, then you're encouraging homophobic behavior.
[1231] And we can't allow that with our monetization policy.
[1232] I mean, it's minor in the context of everything else that's wrapped up in this.
[1233] It might be an important, you know, revenue-generating T-shirt for a lot of them.
[1234] I mean, I think so much of this, again, these disagreements on issues, it comes down to what you and I were talking about before, that if two people are in a room together, 95 percent
[1235] of what they're talking about, you're going to agree on.
[1236] When someone's making a video on someone, if they just say, like, fucking David Pakman, man, here's my deal with that guy, and then you do this ranting thing, and, I hate his fucking neck.
[1237] I don't like his shirt and his face is stupid.
[1238] When people do stuff like that, it's just, it's a terrible way to communicate because it's, first of all, you'd have to be a real asshole to say most of the things that people say about things when they're dunking on them in person.
[1239] You have to be a bad person.
[1240] Yeah.
[1241] So you know the person's going to see it.
[1242] So you're just deciding, I'm going to be a bad person, but I'm going to pretend I'm not a bad person because I'm going to do it in a way where they're not in the room.
[1243] So I'm just going to shit all over and give them my real opinion.
[1244] Sure.
[1245] But it's not like a real, it's not like you and I are at dinner.
[1246] And you're like, you know, fuck this guy.
[1247] And that's how people talk.
[1248] Yeah.
[1249] But when you're doing that, but you're doing it, you're broadcasting it.
[1250] I think we're all learning in this process of doing podcasts.
[1251] and video blogs and all the stuff.
[1252] We're all learning that you're not alone.
[1253] You are doing this and you're saying it in a way that that person's going to see.
[1254] And the same could be applied to Dave Rubin and Sam Seder dunking on him all the time.
[1255] It's kind of the same thing.
[1256] And Michael Roberts as well.
[1257] It's the same sort of thing.
[1258] That's what they're doing.
[1259] Yeah.
[1260] And they're saying things that they wouldn't say if he was there.
[1261] In person.
[1262] Right.
[1263] But they would say.
[1264] If they were sitting around having lunch together, talking shit about some stupid thing that he said the night before.
[1265] Yeah, that's fine.
[1266] And I don't think, I mean, whether or not you would say something in person doesn't tell us whether it's a fair or unfair critique.
[1267] True.
[1268] I think it's fair to say.
[1269] I try to avoid strict ad hominems.
[1270] I will, I mean, listen, we're all trying to build an audience, right?
[1271] So at a certain point, yes, like, I will pick titles that I think are the most interesting titles to get people to watch the thing or whatever.
[1272] Or I'll use vocabulary that I might not use
[1273] in person.
[1274] But at least what I'm trying to do is make it as substantive as possible and to sort of like justify how I came to my conclusions.
[1275] That part of it in in -person conversations usually will not lend itself to like screaming or violence or whatever if that part is the focus.
[1276] I completely agree with you on that.
[1277] Yeah.
[1278] I think we'd be better off if we did try to communicate like that, but when you're doing comedy, that goes out the window.
[1279] It does.
[1280] But even comedy aside, I agree with the principle.
[1281] Communicate.
[1282] Battle of ideas.
[1283] Marketplace of ideas.
[1284] Very, very big ideas we all want to hear about and what are the best ideas and let's rank the ideas.
[1285] There are people whose views are so extreme that you can't really bring them to the table as reasonable negotiating partners for figuring something out.
[1286] Right.
[1287] Like Richard Spencer.
[1288] Sure.
[1289] Or even, I mean, okay.
[1290] imagine Louis Farrakhan.
[1291] Louis Farrakhan, who I've spoken out against many times. Imagine that we want to figure out what the tax rate should be, something that politicians have to do all the time.
[1292] If you have a group of people who believe that we need a 25% flat tax and a group of people who want, you know, like an escalating progressive tax that gets as high as 70% on income over 10 million, whatever, right?
[1293] Like fill it all in.
[1294] All those people are going to be able to have a conversation.
[1295] If someone comes in who says, any taxes the government collects are a form of slavery.
[1296] How do you integrate that into the conversation about how to set tax rates?
[1297] Hmm.
[1298] You can't.
[1299] Right.
[1300] Yeah.
[1301] So all of this stuff, you know, there's this new movement now, which I think is great about long -form conversations, going in depth, figuring out what our disagreements are.
[1302] Like, I'm for all of it.
[1303] I'm absolutely for all of it.
[1304] Well, you do it.
[1305] I do it.
[1306] I do it.
[1307] I do it.
[1308] And I'm sure.
[1309] Okay.
[1310] But where I do think that there's like a lack of pragmatic reality to it is, some people's ideas are so extreme that they can't in any sensible way be incorporated into an actual good faith discussion of how society should be organized.
[1311] That is the problem with having conversations at scale, right?
[1312] And that's the problem with Twitter and with YouTube that you're dealing with millions and millions and millions of human beings.
[1313] And when you have that broad spectrum of humans, you're going to have people on the far ends of both sides.
[1314] And at a certain point, a decision has to be made about who actually gets to participate in the decision-making conversations.
[1315] It's great for everybody to have a voice on taxation on Twitter, but imagine if there was a significant portion of our elected officials who straight up think taxes are slavery.
[1316] I just don't know how that becomes integrated into a decision about tax policy.
[1317] Right.
[1318] I think the argument would be that bad ideas should be combated with good ideas, not with silencing someone, and that when you do silence someone, you just sort of create this blockade where the idea builds up behind it, and then the opposition to your perspective builds up, and then people start picking teams and picking sides.
[1319] And I honestly think that that's something that's going on right now with this whole Crowder-Vox thing.
[1320] I think people are going to pick sides and they fucking love it.
[1321] People love a good conflict to get into.
[1322] There's a lot of people in the cubicles right now that are weighing in and firing up, and there's people that want to dox him again, and there's people that want to infiltrate his Facebook and his Twitter.
[1323] That's what people do.
[1324] And you're dealing with, I mean, what does Crowder have, 3.5, 3.8 million, something like that?
[1325] I mean, I think what you have to also remember is it's not just the reactions that are sort of like tailored to continue the escalation.
[1326] I mean, in the end, maybe Crowder personally in his personal life does refer to people he perceives to be gay or who are gay as queers.
[1327] I don't know.
[1328] I don't, or he uses the word fags.
[1329] I have no idea.
[1330] But he didn't use that word.
[1331] The T-shirt has the asterisk.
[1332] I get it.
[1333] It's a goof.
[1334] It actually has a fig. It doesn't have an asterisk.
[1335] Oh, okay.
[1336] I thought it was a, no. The A is a fig. It's for figs.
[1337] It's the idea that it's an I. Got it.
[1338] Okay.
[1339] It's a goofy joke.
[1340] Fair.
[1341] My point is, I don't know, you know, any sensible person who lives in the West and has access to media, like Stephen Crowder or whoever, knows that the use of that language has a very specific set of reactions that it's going to trigger. So specifically, that shirt, and referring to Carlos Maza as a, you know, queer Mexican or whatever the phrase is. See, that's a weird one. The queer one is a weird one, because it's in LGBTQ. Like, here's a good one, right? National Association for the Advancement of Colored People, the NAACP. Okay, you can't call people colored people. Sure. But that organization was named a long time ago, and it's more an acronym than anything else at this point. I understand, but we both know what the individual words in that acronym are. I think the word queer is not a derogatory word, but it can be or it cannot be, depending on how it's used. You go, you fucking queer. Yeah. Right, sure. I mean, listen, it's the same way with Jew, right? Right, right, right. If I'm in a family thing and it's a bunch of Jews or whatever.
[1342] That's a word that can be used in a way that if someone shows up, if Richard Spencer shows up or one of his followers and goes to a bar mitzvah and talks about this room full of Jews, the word is the same word.
[1343] Yes.
[1344] We're talking about two very different things.
[1345] That's a good point.
[1346] But should he be allowed to say this room full of Jews?
[1347] Allowed?
[1348] I mean, it's not illegal.
[1349] No, it's not illegal.
[1350] He is allowed.
[1351] Right.
[1352] He is allowed.
[1353] But where does it like this room full of Jews?
[1354] Where does it get toxic?
[1355] Well, if Richard Spencer shows up at a bar mitzvah and yells about this room full of Jews, I think it's gotten toxic.
[1356] That's a good subject to break this stalemate.
[1357] Not stalemate, but just, you know, sort of end this.
[1358] Anti-Semitism seems to be ridiculously on the rise.
[1359] And that's stunning to me. That shocked me. Why?
[1360] Because the Internet sort of exposed anti-Semitism that I didn't necessarily know existed at the levels that it existed at.
[1362] I knew there were anti-Semites, but I didn't even know they were so brazen and overt.
[1363] Well, they've gotten brazen since January of 2017.
[1364] Oh, okay.
[1365] I don't know that Donald Trump has created anti-Semites.
[1366] In fact, he probably hasn't.
[1367] Well, his son-in-law's Jewish.
[1368] His son-in-law's Jewish, his daughter converted to Judaism, et cetera.
[1369] But I think that Richard Spencer told me, we know that Trump is not literally a white nationalist who is going to talk about let's take control back from the Jews.
[1370] But we see him as the closest thing to what we would like.
[1371] He talks about people from Mexico.
[1372] He talks about shithole countries, etc. So it's just emboldened the movement.
[1373] It doesn't necessarily create.
[1374] Right.
[1375] But people from Mexico and shithole countries, that doesn't necessarily really equate with Israel.
[1376] No, well, anti-Semitism and Israel also are two totally separate things.
[1377] I mean, you could be against the current Israeli administration, as I am, like Benjamin Netanyahu, and still call out anti-Semitism against Jews in the United States, for example, or whatever.
[1378] I see what you're saying.
[1379] It doesn't, one is not directly linked to the other, but if you're a group that already has these views, and then you see a guy who opens his campaign talking about, they're sending rapists and criminals, but some, I'm sure, are good people.
[1380] And I don't want people coming here from shithole countries.
[1381] What about Norwegians?
[1382] Whatever.
[1383] It's a signal.
[1384] It's a signal.
[1385] And I've spoken to former KKK people, some of whom are really interesting people to talk to.
[1386] And they know exactly why it's appealing because they see the signals and the vocabulary and the dog whistling.
[1387] So I think it's just brought it out into the forefront.
[1388] I don't know that new anti-Semitism has necessarily been generated.
[1389] Although it being in the forefront probably does start to get some people kind of curious.
[1390] Like, oh, maybe all the problems are because of the Jews.
[1391] I don't know.
[1392] Hmm.
[1393] It's just, I guess they find groups of like-minded folks and they join along, right?
[1394] Is that?
[1395] The anti -Semites?
[1396] Yeah, they find them online and then you can stumble into it where you ordinarily wouldn't be around people that are having those discussions.
[1397] That can happen.
[1398] And a lot of the people that I've talked to that got into those beliefs and then out of them said that they got in, usually on a community level.
[1399] There was something about the community that was appealing to them.
[1400] Like gangs, or in the case of, you know, KKK and white supremacy people who had a bad home situation, and they found a group that would accept them. Partially, they would accept them because they were white, right? And then they got pulled into the beliefs, and eventually they sort of got out of them. Yeah. So do you think, with the rise of it in 2017, there's more anti-Semitism, or do you think it's more overt? I believe it's more overt because Trump is president. Yeah. And, you know, groups that track these incidents, like the Anti-Defamation League and others, they have the data, and there have been increases. Yeah, it's stunning to me. You know, you see it online in so many different places now, and I just don't remember seeing it before, not like that, not so you'd run into it so often. Or people calling people Zionist shills. Yeah, I mean, that's an important thing to talk about. People call me that all the time. Shill, to me, suggests that you're saying one thing but with some kind of other agenda that you're trying to push, in other words, you are being in some way deceptive about your actual intentions and what you say. So I think when people call me a Zionist shill, what they mean is I'm talking about one thing with the secret, below-the-surface goal of actually promoting some action by the state of Israel.
[1401] I think that's the idea of a shill.
[1402] But, you know, I mean, I'm opposed to the current prime minister in Israel.
[1403] I've made clear that...
[1404] Isn't he in trouble right now?
[1405] Yeah, I mean, he's been in tentative trouble for a long time.
[1406] His wife is in trouble as well, I believe.
[1407] But, I mean, the problem is, and I know that there are people on the left and right that, when I say this, will crush me for what I'm about to say.
[1408] Sometimes when someone says Zionist shill, it's related to your view on the Israeli-Palestinian conflict.
[1409] Sometimes when someone says Zionist shill, it's cover for just wanting to insult someone for being Jewish, or anti-Semitism.
[1410] You've got to look at every instance one by one.
[1411] Yeah, and people just like saying things too.
[1412] Yeah, yeah, it's a popular thing to say.
[1413] Especially if they find out that you're Jewish.
[1414] Absolutely.
[1415] It's like a thing.
[1416] Yeah.
[1417] It's a thing to say.
[1418] It's a little weapon to use.
[1419] It absolutely is.
[1420] Do you find, and this is sort of an abstract question, but overall, like, doing this show and having this ever-increasing exposure, do you enjoy it?
[1421] Are you weirded out by the interactions with all the people?
[1422] Do you feel pressure by all the comments?
[1423] And do you feel, like, a little bit of anxiety from all the social media aspects of it?
[1424] I do.
[1425] So I enjoy the idea that people are listening to my ideas and either agreeing or disagreeing, but they're considering them and then integrating it into how they figure out what they think about the world around them.
[1426] That's awesome.
[1427] I do get weirded out by sort of, like, the safety and security stuff that sometimes comes up, which I try not to even put too much attention on, because I feel like it just feeds it and gives people ideas. And people who come up to me, and, I mean, I'm more curious to actually hear your thoughts about this, people who come up to me who may not necessarily see the world the way I see it, and I'm unsure, sort of, what their intentions are type of thing.
[1428] I mean, it gives me anxiety and it gives people that are close to me anxiety for sure.
[1429] Yeah, because your profile is just, if you keep doing this, you're very good at it.
[1430] Thank you.
[1431] You're going to continue to get more and more popular.
[1432] Yeah.
[1433] And, I mean, I guess it's a double-edged sword.
[1434] I mean, I don't know.
[1435] Like, when you do a comedy show, afterwards, is it kind of like a free-for-all where people can come up and chat with you?
[1436] Sometimes.
[1437] Yeah.
[1438] Yeah.
[1439] And do you get skittish?
[1440] No. You don't.
[1441] Nah.
[1442] Most people are nice.
[1443] I agree.
[1444] The vast majority of people are nice.
[1445] I agree.
[1446] They come to see you.
[1447] They're usually fans.
[1448] And they just want to take pictures, say what's up.
[1450] I guess it's a little different when what you do is like overtly political versus other areas.
[1451] Like if you're an actor, comedian, doing other things, race car driver.
[1452] Right.
[1453] You are in a much more conflict-driven profession.
[1454] Yeah.
[1455] In a sense.
[1456] Yeah.
[1457] I mean, I have political people on like you, but I'm not entirely engaged in politics like you guys are.
[1458] Right.
[1459] Yeah.
[1460] I don't know.
[1461] I mean, I do worry that no matter what happens in the next few elections, I don't know how we reverse the radicalization and polarization effects of the social media echo chambers that we've been talking about.
[1462] And I only see that going further.
[1463] I mean, we could still accomplish good things while that's going on.
[1464] Like I think if we elect the right people, maybe we can get good things done.
[1465] But in parallel, there is this hyper-radicalized, polarized narrative that's going on.
[1466] And I don't see any way that that's going to turn around.
[1467] I wonder myself.
[1468] I do.
[1469] And I'm very confused by it, because I don't see any long-term solution for this other than some radical change in the way human beings communicate with each other. And I've contemplated that and hypothesized and theorized, and I really think that what has changed the way we communicate is technology, and this immersive aspect of social media technology, the fact that we carry these devices with us all the time that allow us to communicate and allow us to read other people's communications or watch other people's communications.
[1470] And I have a concern that this is going to escalate with each expansion and each innovation in terms of like what, and I don't know what it would be because no one saw the internet coming.
[1471] If you go back 30 years ago, no one ever thought anything was going to be anything like it is now.
[1472] Well, Al Gore did.
[1473] Ha ha.
[1474] I bet he did.
[1475] But if you go 30 years from now, I mean, what are we really looking at?
[1476] What is this world going to be?
[1477] I don't think anybody has an idea.
[1478] I think we have no idea.
[1479] And I think it's going to be, if you look at the trend, the trend is not towards calming people down and giving people space and allowing people to meditate more.
[1480] No, the trend is to get more and more immersed.
[1481] The trend is for us to get closer and closer to each other, to remove boundaries, remove boundaries for information and ideas.
[1482] And even in long-term contemplations of this, I've often thought that everything, right, all of our communication is basically ones and zeros.
[1483] It's all information.
[1484] It's all words and thoughts and videos.
[1485] And now you're getting into cryptocurrency.
[1486] Now, cryptocurrency is essentially ones and zeros.
[1487] It's all digital.
[1488] Everything's digital.
[1489] And the bottlenecks, if any bottlenecks are there at all, are slowly but surely getting removed, the blockades and the walls.
[1490] I think we're going to probably experience some sort of a level of immersive technology in our lifetimes that's going to change the way human beings communicate, period, and that we're going to look back at this time, like, ha, remember when we thought that, like, social media arguments were, like, the big deal?
[1491] Yeah.
[1492] I used to have more of, like, a techno-utopian view, and it started to sort of change recently, partially because of some of the sci-fi I read. So, like, 15 years ago, I read the Richard K. Morgan book Altered Carbon, and at the time I was like, this has to be made into something. That's the one that's on Netflix now, right? And then, like a year ago, Joel Kinnaman was in the series, and it was just awesome. And the series is good? The series is quite good. The series is quite good, yeah.
[1493] And I really like Joel Kinnaman. And Richard K. Morgan, who wrote the book, I interviewed years ago.
[1494] But that genre started to move me away from techno-utopianism, from this idea that technology is just going to solve so many problems, because it also is going to create new problems that we don't even yet know about.
[1495] So as an example, I went all the way back to the beginning, when humans went from hunter-gatherers and figured out we can domesticate some crops, we can start agriculture, and settle and be in one place. That was the acceleration of what we know of as wealth and ownership, like the start of it, right? So much of what we have came from that. I mean, agriculture allowed people to live and do stuff other than find food, which developed specialists, who created technology, which created everything we have. It all came from agriculture in that way. But tons of bad stuff came from it as well, right? The beginning of the concept of a sedentary lifestyle came from agriculture, and diseases that we got from animals and then brought other places, and they killed tons of people.
[1496] So I've kind of adopted that view to technology now, which is, yeah, all the cool stuff we can imagine and improvements, I'm sure we'll be there.
[1497] But problems we aren't even aware of yet are also going to be there.
[1498] Yeah, I agree with you 100%.
[1499] That's what I meant by looking back at this social media problem.
[1500] I think we're going to have a far more invasive problem.
[1501] I think we're going to probably have some sort of a wearable thing that allows us to communicate through thoughts.
[1502] Sure.
[1503] Well, thoughts would be a next step, but at minimum, I mean, replacing, you don't need the screen on your phone.
[1504] Right.
[1505] You have contacts that are connected to something and everything is just displayed.
[1506] I mean, there will be steps.
[1507] Jamie, whatever happened with that Microsoft thing that we were looking at, remember when they had like the little mouse that was dancing in your hand or the elephant that was dancing in your hand?
[1508] It was an augmented reality thing.
[1509] Yes, that was available.
[1510] Microsoft has HoloLens, and they're on HoloLens 2 now, but they've moved more towards, like, commercial applications for it as opposed to, like, consumer availability.
[1511] There are consumer AR things coming out right now.
[1512] What Apple just showed at their WWDC event this month, or actually on Monday, is really cool.
[1513] It's still just like watching through that phone, though.
[1514] I don't think anyone's made the device, like a glasses-type AR thing, yet, because the field of view isn't right.
[1515] They haven't mastered that.
[1516] Either projecting light into your eye, which is what Magic Leap does, or projecting onto the glass that you're then looking at, which is what I think HoloLens and the other thing do.
[1517] They haven't figured it out yet.
[1518] It's a Betamax versus VHS race to see who figures it out.
[1519] I think so, yeah.
[1520] But like that Oculus Quest, which is different, also just came out, is really cool.
[1521] And they're so much closer.
[1522] It could be within a year or two, or something could come out at the end of this year that hasn't been announced.
[1523] We're very close.
[1524] The question is like, how much is that going to affect daily life, right?
[1525] With augmented reality.
[1526] Yeah.
[1527] Yeah.
[1528] Yeah.
[1529] The other one that relates to that is, right now, you at least can put your phone away.
[1530] Right.
[1531] What happens when the line between the technology and the body disappears?
[1532] Yeah, I have a bit about it.
[1533] I'm very concerned.
[1534] I really am.
[1535] I think we're giving up agency to something that has no feelings for us at all.
[1536] Well, there's no, I mean, I think the problems people have in practice often are different than the ones.
[1537] I mean, there's no transparency with a lot of the companies that are developing these technologies and setting up the algorithms and whatever.
[1538] There's really no transparency about what it is that's going on, what the end goals are, what the broader effects on society are going to be.
[1539] I know you've had Jonathan Haidt on who has talked a lot about the disproportionate effect of social media on suicidality, particularly in young girls relative to boys.
[1540] It's been years now that this stuff has been around, and we're now kind of figuring that out.
[1541] So inevitably, we're always behind in figuring out what the effects are, because you need time to measure it.
[1542] And as things advance more and more quickly, whatever damage is potentially going to be done will happen even faster.
[1543] Yes.
[1544] Yeah, that's what the concern is that we are always behind and that it's sneaking into our lives before we have any idea of how dangerous it is.
[1545] Sure.
[1546] I mean, this happened with, you know, the canned and processed food revolution of the 50s and 60s.
[1548] It was slower, but it was the same type of thing, with all of these advancements in being able to make food last longer via how it was processed and stored.
[1549] It all sounded awesome in a time when food would just go bad.
[1550] Then we started learning about all the bad things that came with it.
[1551] Anything more before we wrap it up?
[1552] I think that's it.
[1553] Oh, so two things I wanted to mention.
[1554] One, when I announced that I was going to be on the show, companies started contacting me saying, we will give you money if you work our name, our product, into the conversation. What's the product? I'm not going to say, but I do want to talk about car insurance briefly. Have you heard that that's happened to other guests? No? You haven't? Interesting. That's interesting. Yeah. Wow, that's a weird, sneaky thing. Yeah. And no one's ever paid me to do that. No one's paid you? No, no, no one's ever paid me to have, like, a conversation on a podcast.
[1555] Oh.
[1556] But one company did want to advertise and they wanted their CEO to come on the podcast and discuss their product.
[1557] And I was like, that's like an infomercial.
[1558] I was like, no. And they're like, well, you've talked about people before that have had products before.
[1559] I go, yeah, because I like their product.
[1560] Right.
[1561] And I think what they're doing is cool.
[1562] Right.
[1563] I have zero financial investment in their product.
[1564] Right.
[1565] I only did it because I like it.
[1566] Sure.
[1567] Well, I mean, my audience knows that we do sponsored stuff like we have ads and whatever.
[1568] I disclose it.
[1569] I'm clear.
[1570] And my approach is I'm super up front with my audience, which is listen.
[1571] Me too.
[1572] Only like half a percent of you are paying for a membership.
[1573] The memberships are six bucks a month.
[1574] I know like 80 percent of you can afford it.
[1575] Only like half a percent are doing it.
[1576] That's fine.
[1577] I'm going to keep doing the show.
[1578] But I'm going to put some sponsored content up.
[1579] You don't have to watch it.
[1580] I'm in a market as such, period.
[1581] And I feel like for the most part, we have kind of an understanding.
[1582] of how it all works.
[1583] There's nothing wrong with it as long as it's products that you actually enjoy and, again, that you maintain that transparency and that honesty.
[1584] Right, there's nothing wrong with that.
[1585] If I don't have that with the audience, I don't have anything.
[1586] Right.
[1587] And I've been asked to compromise it.
[1588] You have?
[1589] Yeah, for sure.
[1590] Yeah.
[1591] It's just like it would be worth so much, you know?
[1592] Yeah.
[1593] I mean, there's this moral hazard sort of situation that exists with insurance where the people who don't really need the insurance are the ones that the insurance companies want to insure.
[1594] And the people that are more likely to use the insurance, the insurance companies are like, we're going to have to charge you six times as much type of thing.
[1595] It's easier to get the sponsorship money from stuff that's less interesting or less aligned or whatever.
[1596] And I don't know.
[1597] I mean, it's an ongoing battle.
[1598] I don't talk to any of our advertisers.
[1599] Like, we have a team that handles all of that, and that is great.
[1600] But there are, you know, there are still calls to make about like what is on this side of the line what's on that side of the line yeah i try to make the right calls no i think you're doing a great job i appreciate your show i appreciate your time thanks for coming down here tell everybody where they could find you d packman on p a k m an on i'm on twitter at d packman i'm on where am i i'm on instagram at david dot packman and uh my website david pacman dot com all right thank you david appreciate man thank you for having me