The Joe Rogan Experience XX
[0] Joe Rogan podcast, check it out.
[1] The Joe Rogan Experience.
[2] Train by day, Joe Rogan podcast by night, all day.
[3] And we're up, gentlemen.
[4] What's happening?
[5] Good to see you.
[6] Hey, you're happening, man. Good to see you again.
[7] Thank you for having me back.
[8] Beautiful purple shirt.
[9] I love it.
[10] Thanks for having us.
[11] And thank you.
[12] And Bill, first of all, tell me what's going on with Minds.
[13] Minds is one of the first alternative social media networks I was aware of, like, that was committed to free speech.
[14] Yeah.
[15] How's it going?
[16] It's going.
[17] I mean, there's sort of a whole landscape of alternative networks emerging.
[18] And so you've got this spectrum of apps where you've, like, I think of, I put everything through a litmus test when I'm thinking of an alternative network.
[19] Basically, is it transparent?
[20] Does it publish their source code?
[21] Most of these alternative apps, I don't need to name names, but I could.
[22] They don't publish their source code.
[23] So you can't look at the algorithms to see what's happening.
[24] You can't see if there's spyware in there, if they have Google Analytics, little nasty stuff that...
[25] So you're talking about Gettr?
[26] Gettr.
[27] Yeah.
[28] Because I've found out the...
[29] Parler, Rumble...
[30] All of them?
[31] I'm not trying to trash these people.
[32] I think that the free speech stuff is good.
[33] Like, the more...
[34] but some of their terms aren't even free speech.
[35] So, you know, free speech policy is essential.
[36] So I absolutely respect any network that is putting forward a free speech policy.
[37] But you can't have a free speech policy with sketchy algorithms and closed source code, because then we don't know if you're soft censoring, shadow banning.
[38] We don't know what's happening in the news feed behind the scenes.
[39] Right, which we definitely know Facebook does, Instagram does, Twitter does.
[40] That's all real.
[41] Right.
[42] So then you've got, are they privacy focused, end-to-end encrypted?
[43] Do they have access to the content of your messages?
[44] So we use an end-to-end encrypted messenger protocol called Matrix, so that we don't even have access to people's conversations. Like, I don't want access. Right. And then you've also got, you know, do they pay creators fairly? So you've got these checkmarks that you go through with each one, but open source is key. The future, there is nothing without open source. Any app, if they're claiming to be an alternative and they're not open source, they're not in the same conversation. It's a completely different animal, and they should not be taken seriously, because they're not being transparent with the world.
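[Editor's note: to make the Matrix reference concrete, here is a minimal sketch of sending a message through a Matrix homeserver using the open source matrix-nio Python client. The homeserver URL, user, password, and room ID are placeholders, and full end-to-end encryption additionally requires the client's olm key store to be set up; this illustrates the basic protocol flow, not Minds' actual integration.]

```python
# Minimal sketch: posting a message over the Matrix protocol with matrix-nio.
# Placeholder homeserver, user, password, and room ID; not Minds' real setup.
# Full end-to-end encryption needs matrix-nio[e2e] plus an olm key store.
import asyncio
from nio import AsyncClient

async def main():
    client = AsyncClient("https://matrix.example.org", "@alice:example.org")
    await client.login("placeholder-password")            # password login
    await client.room_send(
        room_id="!someRoomId:example.org",                 # placeholder room
        message_type="m.room.message",
        content={"msgtype": "m.text", "body": "Hello over Matrix"},
    )
    await client.close()

asyncio.run(main())
```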
[45] And then you get into decentralization and actually building an app that – so Google says don't be evil.
[46] But it should really be, can't be evil.
[47] We want to make it impossible for us to even take down our network at all.
[48] And that's why immutable distributed systems like blockchains and, you know, Tor and IPFS...
[49] All of these different decentralized systems are emerging, and we're interacting with them. We're not fully decentralized yet, but there's, like, a progression that a lot of apps in the web3 slash decentralized web space are moving towards. Okay. And so, Darrell, to fill people in on you: you've been on the podcast before, and you have an incredible history. You're a brilliant musician, and you have personally converted, what's the number now, more than 200 Ku Klux Klan members, neo-Nazis.
[50] I mean, we talked about these guys giving you their Klan outfits and retiring because they met you.
[51] And just having reasonable conversations with you made them realize how stupid these ideologies that they had somehow or another been captivated by really are.
[52] I mean, at the end of the day, you know, a missed opportunity for dialogue is a missed opportunity for conflict resolution.
[53] It's as simple as that.
[54] But it's not just having a dialogue or a conversation or debate.
[55] It's the way that we have it, how we communicate, you know, that makes it effective.
[56] Like, for example, you know, I've been to 61 countries on six continents.
[57] I've played in all 50 states.
[58] So all that is to say that I've been exposed to a multitude of skin colors, ethnicities, religions, cultures, ideologies, et cetera, and all of that has shaped who I've become.
[59] Now, all that travel does not make me a better human being than somebody else, it just gives me a better perspective of mass humanity.
[60] And what I've learned is that no matter how far I've gone from our own country, right next door to Canada or Mexico or halfway around the globe, no matter how different the people I encounter may be, they don't look like me, they don't speak my language, they don't worship as I do, or whatever, I always conclude at the end of the day that we all are human beings.
[61] And as such, we all want these same five core values in our lives.
[62] Everybody wants to be loved.
[63] Everybody wants to be respected.
[64] Everybody wants to be heard.
[65] We all want to be treated fairly.
[66] And we all basically want the same things for our family as anybody else wants for their family.
[67] And if we learn to apply those five core values when we find ourselves in an adversarial situation, or a culture or society with which we're unfamiliar, I can guarantee that the navigation will be a lot smoother.
[68] And essentially, that's what's happening here at Minds.
[69] We're allowing people to be heard.
[70] We're showing them that kind of respect.
[71] We don't have to respect what they're saying, but respect their right to say it.
[72] And we provide that platform because, you know, when you don't do that, you're driving people to a platform that will embrace them.
[73] And then it becomes an echo chamber.
[74] And essentially, it could become a breeding ground for a cesspool of nefarious activities, whether it's extremism or violence or conspiracy theories or what have you.
[75] So it seems like there's an issue with many social media companies where they want to censor bad ideas.
[76] And it seems to me that part of that is because the work involved in taking a person who's a neo-Nazi or Ku Klux Klan member and showing them the error of their ways, allowing them to spread their nonsense and then slowly but surely introducing them to better ideas, it's exhausting.
[77] And they're not willing to do the work. Exactly. So what Twitter does is, like, fuck you, get out of here. Instagram does the same thing with all these people. But the problem with that is then it goes further and further and further down, where you're getting rid of people for just not agreeing with you. So this is empirical now. Darrell and I just wrote this paper called The Censorship Effect, along with Jesse Morton, Justin Lane, Ron Schultz, and my brother Jack. Multiple PhDs, like, serious research has gone into this.
[78] Even left-leaning outlets like Vox are now admitting that deplatforming causes more severe radicalization.
[79] This is being admitted across the board.
[80] So the fact that big tech apps are not looking at this data and applying it to their policy, it makes you almost have to speculate that they're intentionally causing it.
[81] I mean, because these are very smart people that work at big tech sites.
[82] They know about data science.
[83] They know the spread of information.
[84] I don't think they're intentionally causing it.
[85] I think, first of all, there's an ideology that is attached to all the big tech companies, whether it's Google or Facebook or Twitter.
[86] You have to be what they think is woke, right?
[87] You have to subscribe to a certain line of thinking.
[88] And anybody that deviates from that line of thinking should be suppressed, minimized, or banned.
[89] So how is that not intentional?
[90] But it's not intentional, meaning they're not trying to radicalize people.
[91] That's not what they're trying to do.
[92] No, but they, I don't.
[93] They're just foolish in their approach.
[94] I think some of their data science researchers do know.
[95] Yeah, but they're not getting to the people that are the CEOs.
[96] No, they're not.
[97] The CEOs have to virtue signal.
[98] All the people that are executives have to virtue signal.
[99] And they have to say, we're doing our best to stop harmful talk.
[100] But what they call harmful, like a lot of it is like disagreeing with pharmaceutical companies, which is just fucking crazy.
[101] Like, these are the lyingest liars that ever lied.
[102] Did you see Zuck on Lex's show?
[103] Yes, I did.
[104] What did you think?
[105] You know, it's hard.
[106] It's hard because, like, that guy has an enormous responsibility.
[107] He's the head of this insanely huge platform that covers the entire planet Earth, and everything he says has to be measured.
[109] It's like, you ever see him drink water?
[110] He drinks water like this?
[111] Like a weird way of drinking water.
[112] He doesn't fucking drink the water.
[113] He like sips it, it touches his lips, and then he's done.
[114] He's like, everything is like measured, measured.
[115] Like, I can't imagine trying to speak freely when you're the CEO of Facebook.
[116] I think it's almost like pointless to talk to him in that sort of circumstance.
[117] Well, you know, to your point about, you know, people doing this and defending it and so forth and so on.
[119] I mean, I think the quote by Upton Sinclair comes into play. I think he said something to the effect of: it's difficult for a man to understand something when his salary depends upon his not understanding it.
[120] Yes, yes, yes, yeah.
[121] And if you live in that world, if you live in that tech world, and I have many friends who have, you know, they're executives at these places, that is just the fucking doctrine.
[122] You have to, like... you have so many employees, and they have these, like, radical ideas about what you're supposed to do and not supposed to do, and what you're supposed to platform and not platform. And this idea of platforming people, you know... like, I have people on this podcast all the time that I don't agree with at all, and I have them on, or I agree with them very little, and I want to see what's going on in their head. And I'll get that, like, you're platforming these people, you're platforming a bad person. Like, I don't think they're bad people. I just don't agree with them, and they have a right-wing ideology that I don't agree with, but I don't think it should be suppressed.
[123] I think you should try to pick it apart.
[124] You cannot change someone's mind if you do not platform them.
[125] It is impossible for someone with horrible ideology to change.
[126] But I should say, not just a right-wing idea.
[127] There's a lot of people with left-wing ideologies that I think are ridiculous.
[128] And I want to pick those apart too.
[129] I want to have conversations with people.
[130] And this idea that you're only supposed to have conversations with people that you absolutely agree with, and that what you're doing is just broadcasting these ideas to better humanity... like, if you want a better humanity, have fucking conversations with people.
[131] Look, you know, this goes all the way back, I mean, centuries, even back to BC, as in Before Christ, right?
[132] I mean, we can go back as far as, let's just say, Copernicus, the astronomer, who passed away in 1543, okay? Up until then, the belief was that we are a geocentric model universe.
[133] Disinfo.
[134] Sorry, no, I was saying that it would have been called disinfo.
[135] Okay, yeah, exactly.
[136] So even the Catholic Church endorsed the geocentric model of the universe, meaning that the Earth is the center of the universe and everything revolves around us, right?
[137] And Copernicus said, no, the Earth is just another planet.
[138] The sun is the center of the universe, and everything revolves around the sun, which makes it a heliocentric model.
[139] and everybody scorned him, ridiculed him.
[140] A hundred years later, Galileo came along and built upon Copernicus's theory and developed it even further and said, yes, we are a heliocentric model.
[141] And he got arrested, arrested for heresy against the Catholic Church.
[142] All right.
[143] But guess what?
[144] He was right.
[145] He was right.
[146] So, you know, sometimes we have to stand up to the masses, not just join in because everybody else thinks this way.
[148] And it's also the problem of the walled garden, right?
[149] There's a lot of people that get booted from the social media platforms, whether it's Twitter or Facebook, and then they look at that and they look at those people with further and further disdain, and it separates them from whoever's there.
[150] And we're not even just talking about radical people.
[151] One of the things that really alerted me to how crazy the censorship shit was, was when Bret Weinstein had a group that he put together called Unity 2020.
[152] And the idea was to bring people that were from the left, that were really reasonable, and from the right that were really reasonable, that weren't captured by corporate greed, and to have them as an alternative candidate.
[153] Like, instead of saying, like, you have to be a Republican or you have to be a Democrat, let's get reasonable left -wing and right -wing people that can agree on a lot of stuff and have them work together.
[154] And maybe you have a candidate that's, like, a vice president and a president, one left-wing and one right-wing.
[155] Yeah, like it would be a great way to sort of like come together in the middle.
[156] Twitter banned the account.
[157] Twitter banned an alternative account.
[158] Like the, and there was nothing unreasonable about what they were saying.
[159] It was all just conversations with people that are brilliant that happen to be left-wing, and people that are brilliant that happen to be right-wing.
[160] Let's get them together and see if we could lead this country in a better direction than having this polarization of right versus left where people get super tribal about it.
[161] Like this would be a great way to meet in the middle.
[162] And Twitter was like, fuck you.
[163] And they banned the account.
[164] They had such good intentions.
[165] Yes.
[166] But the idea that you can get banned for trying to come up with another political party.
[167] Are you saying that this system is infallible?
[168] This right versus left system of blue and red is infallible?
[169] That's so crazy.
[170] We are here because someone didn't like what was going on in Europe in the 1700s.
[171] And they took a chance on starting a new system, a system of self-government. It was a complete experiment, it had never been done before in the world, and that created the United States.
[172] And the idea that you, the fucking tech dorks, are going to step in and say, no, this is dangerous thinking.
[173] Yeah, oh, the battle-tested First Amendment, hundreds of years of precedent, legal precedent.
[174] Like, talk about a good content policy: the First Amendment.
[175] But it doesn't apply.
[176] They say it doesn't apply because this is a private company.
[177] They think that their lawyers are better at drafting healthy conversations.
[178] than the First Amendment.
[179] And that's just not, that's not true.
[180] I think, you know, there was a real concern in the early days of Twitter and of social media where a lot of these people that were like outrageous right wing people were starting to get a lot of attention.
[181] Like, Milo Yiannopoulos was a big one, Gavin McInnes.
[182] And a lot of these guys, they were getting a lot of attention.
[183] And the response from the left was like, no, no, no, no, silence them.
[184] Like, I heard this one woman talking about her kid listening to Ben Shapiro.
[186] And I would love to get Ben Shapiro removed from all platforms.
[187] Oh, Kara Swisher.
[188] I think that's who it was. She's a Vox reporter.
[189] And Vox is interesting because they're like smart people.
[190] But they're also, you know, they sort of embody this.
[191] But their recent article, I don't know, Jamie, if you could find it.
[192] It's like does de -platforming work out of Vox?
[193] And they - They don't need to find that.
[194] Well, I'm just saying that.
[195] No, no, no. No, but the reason I was so happy was because they're referencing similar studies that we reference in our paper, they're starting to be forced to acknowledge that the censorship is having serious negative consequences.
[196] It polarizes this country.
[197] And so what you were saying before about, you know, people, their beliefs being reinforced after they get banned.
[198] You know, they're victims.
[199] Now they believe the thing that they were ranting about.
[200] It's called, so in the, in the literature, it's called certainty, level of certainty.
[201] That's what's measured.
[202] And there's, it's clearly shown that certainty accelerates with deplatforming based on whatever you were thinking before.
[203] So isolation and certainty have an overlap.
[204] Yeah.
[205] So if you have an idea, like, especially with something as innocuous, or beneficial, as Unity 2020... the idea of unity, I mean, come on, it's like literally in the title.
[206] That's what we're all hoping for.
[207] We're united as a community, the United States of America, all these different ideas.
[208] Let's work together.
[209] No, fuck you.
[210] You're not right wing, you're not left wing.
[212] You can't be a part of the process because you're going to draw votes away from the people that we think it's imperative that they win.
[213] So it changes the whole idea of what democracy is because they're kind of admitting that they have an influence on the way our elections go.
[214] You know, I mean, and speaking of unity, then you got those people who are out protesting every day, you know, to help change and bring people together, but a lot of them are the very ones who will not sit down and talk with the person that they're protesting against.
[216] Yes.
[217] You know?
[218] So how badly do they really want unity?
[219] Well, this happened to us personally.
[220] Precisely.
[221] Well, it's happened in universities.
[222] That's where it happened first.
[223] I started seeing it, I mean, I guess it was like a couple of decades ago.
[224] You started to see when someone was a controversial speaker, they would come to a university.
[225] And instead of someone debating that person, or someone, you know, listening to that person's ideas and picking them apart, instead they were, like, pulling fire alarms and shouting people down and screaming at the top of their lungs in the middle of the auditorium. They're silencing people's ideas because they feel that their ideas are better, which is exactly the opposite of what the founding fathers were trying to sort of manage when they came up with the First Amendment.
[227] I mean, we're really trying to make this less of an emotional debate because I think the censorship and speech stuff is obviously very emotional.
[228] You know, people, we're talking about hate speech.
[229] We're talking about a lot of horrible stuff that hurts people personally.
[230] And so, you know, the big tech strategy is, oh, you know, we, we care about people's feelings and we want to hide this information because it's offensive.
[231] But we need to remove the emotion and look at this empirically in terms of what is actually making society more healthy and what is actually preventing radicalization and violent extremism.
[232] So if we can prove to big tech that deplatforming backfires... we want them to adopt a free speech policy.
[233] I think that's the goal here.
[234] We don't expect that Facebook and Google are going away.
[235] It's not going to happen.
[236] There's not going to be a MySpace moment for Facebook and Google.
[237] They are embedded in the infrastructure of the planet.
[238] So they need to change their policy.
[239] They need to start open sourcing more code.
[240] And they need to start adopting more open policies, because when they ban, it's all network topology and whack-a-mole.
[241] You know, you ban it from Facebook, and then it pops up over here.
[242] And it's just this whole little interconnected matrix.
[243] Let me ask you this. Like, for Minds, say if someone starts, like, a neo-Nazi group, and they start posting on Minds, and they start talking about the master race and eliminating Jews and crazy Nazi-type shit, what do you do?
[244] Oh, I mean, as long as it's not calling for violence or making true threats of violence, then it will go under an NSFW filter.
[245] So it will go under sort of, you know, it'll have a sensitive kind of click through.
[246] So you'll be warned before you're seeing that.
[247] So it's like one of those Instagram videos, like if you see a car accident or something like that, there's Instagram videos, you have to like say that you're willing to see this offensive thing.
[248] Exactly.
[249] So, you know, we have tags so that people, we don't want anyone to see stuff they don't want to see.
[250] So how, but what if someone doesn't use the tags?
[251] Then it'll get reported and get tagged.
[252] Okay.
[253] So it's like if someone starts posting Nazi propaganda, they just immediately, like someone reports it?
[254] Yep.
[255] Yeah.
[256] And we also have a jury system.
[258] So we're rolling out this system where the community can sort of help create consensus around tags on different content.
[259] And if we make a mistake, it can get appealed.
[260] And the community actually votes, not us.
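[Editor's note: a toy sketch of the report, tag, appeal, and community-jury flow described above. The names, jury size, and threshold are invented for illustration; this is not Minds' actual implementation.]

```python
# Toy sketch of a report -> tag -> appeal -> community-jury moderation flow.
# Names, jury size, and majority threshold are invented for illustration.
import random

def report(post, tag):
    """A user report applies a sensitive-content tag (a click-through filter)."""
    post.setdefault("nsfw_tags", set()).add(tag)

def appeal(post, tag, community, jury_size=12):
    """On appeal, a random community jury votes; a majority can remove the tag."""
    jury = random.sample(community, jury_size)
    keep_votes = sum(vote(post, tag) for vote in jury)  # each juror returns 0 or 1
    if keep_votes <= jury_size // 2:
        post["nsfw_tags"].discard(tag)                  # jury sided with the poster
    return keep_votes

# Usage: posts as dicts, jurors as voting functions.
post = {"id": 1, "text": "..."}
report(post, "sensitive")
community = [lambda p, t: random.randint(0, 1) for _ in range(100)]
appeal(post, "sensitive", community)
print(post["nsfw_tags"])  # either {'sensitive'} or empty, depending on the vote
```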
[261] And Darrell, your take on this is like, how do you think that a social media company like Twitter, something that's really huge, can pivot from the model that they have now where they just ban people.
[262] Because, you know, that points to them assuming that the majority of people out here are stupid and that these companies need to tell you what to believe, okay, which to me is offensive.
[263] It is offensive, you know.
[264] So I believe, you know, yes, there's a lot of bad information out there.
[265] And, you know, the more liberal you make your platform allowing anybody, everybody to come in, yeah, you're going to have some bad actors, sure.
[266] But the way you address it is you combat bad information by providing more good information.
[267] Yeah, well, that's the age -old idea.
[268] So Supreme Court Justice Clarence Thomas came out, and he said that he thinks networks above a certain size should be considered common carriers.
[269] Yeah.
[270] Now, common carriers, so there's this whole debate about Section 230 and, you know, whether networks have a right to take things down.
[271] It's pretty definitive that big social networks, private companies, do have the right to moderate.
[272] That's a fact.
[273] Section 230 doesn't say you have to keep up everything, but the common carrier, like a phone company, can't ban you for your views.
[274] And so they're common carriers.
[275] And that's, that's an important distinction.
[276] I think that's a rational suggestion from Thomas that, you know, once you reach a certain size, you cannot just be going and playing favorites.
[277] Yeah.
[278] Yeah, I know Jack Dorsey had an idea of two versions of Twitter, a curated, moderated version of Twitter and then a Wild West version.
[279] And he was trying to pitch that, and I think they shot him down. But his idea was, like, we should have some Twitter that's got some sort of decentralized control, where it's not up to the moderators to decide what's on it, and people can just put on whatever they want. Yeah, he launched a project called Bluesky, which is sort of a research initiative into decentralized social media, kind of very much in our space, before he left. And then he left, and, like, two days after he left, there was this huge censorship issue where they said, oh, if it's a private image, it can get taken down on Twitter.
[280] So like any private image of anybody.
[281] And he... Oh, after he left, they ramped up censorship in a big way.
[282] Yeah.
[283] And it seems like, I mean, it's a hard position to be in because, you know, it's like your baby.
[284] It's his company, he's been working on it forever, and he doesn't want to badmouth it.
[285] But I would not be at all surprised if there were some internal wars happening. I mean, there's a huge Wired piece about internal free speech wars in Twitter management.
[286] So it's a fact that it's not, you know, one single ideology in these companies.
[287] There's definitely overwhelming ideology, but I think that there is starting to be pushback.
[288] So that's positive.
[289] Yeah, there's some intelligent people that realize the error of their ways.
[290] Yeah.
[291] This whole thing is going in a negative direction.
[292] And Darrell, how did you get involved with, with Bill and minds and, like, what was your idea going into this?
[293] Well, Bill had contacted me after seeing me on some interview or reading about me or something to participate in an event he was originally going to have in New Jersey, then it got moved to Philadelphia.
[294] How long ago was this?
[295] Oh, what, five, six years ago?
[296] I think, no, like 2019.
[297] That was it wrong?
[298] Yeah.
[299] Before the pandemic?
[300] Before the pandemic, yeah.
[301] Yeah, pre-pandemic.
[302] So 2019.
[303] And I liked what he was talking about, all different people from different political backgrounds, you know, stations in life, whatever, coming together.
[304] And so I said, yeah, you know, count me in.
[305] And I went and did it.
[306] And he had everybody there from all different walks of life.
[307] We all got along.
[308] We had different views.
[309] We talked together.
[310] We listened to each other's presentations.
[311] And then we had an after party together where everybody just kind of let their hair down, all that kind of stuff.
[312] The only people who were not supportive were the protesters across the street, some of whom called me a white supremacist.
[313] Yeah, I think Melissa Chen talked about this on your show.
[314] a while back, but basically Antifa was like protesting the event.
[315] You know, we had all these big YouTubers, Tim, and, you know, people on the left, right.
[316] Andy was there.
[317] Andy was there.
[318] Yeah, and there were some progressives.
[319] Tim and Andy, and Andy is... what's the last name?
[320] Sorry, Andy Ngo. And Tim Pool.
[321] But we also had some, there were some leftists there as well.
[322] And we really did our best to make it as balanced as possible.
[323] And, you know, communists and capitalists... and the protesters, like, they don't care that they're communicating with each other. But the Antifa protest, I mean, it's like, are they even real people? It's like, I guess they are, and I'm joking, but it's like, what are we doing? They're so crazy, and we're allowing them to dictate what is and isn't said based on threats of violence and lighting buildings on fire and shit. They got us deplatformed from the original theater that we were going to have it in.
[324] So we had to move to Philly.
[325] How is that possible?
[326] We sold out. It was, you know, fear.
[327] Fear.
[328] Yeah.
[329] Fear.
[330] I do need to say one thing. So I mentioned Jesse Morton, who was one of the co-authors of the paper. He actually, a good friend of Darrell's and of mine, recently passed away.
[331] He was a former extremist, and he was one of the, you know, leading people in the deradicalization space.
[332] He actually was an al-Qaeda propaganda lead. He ran a propaganda site for al-Qaeda. He went to Columbia. He was doing this in New York City, so he's from the US. And, you know, Darrell, maybe you can explain. Oh, I know who that gentleman is. You do? I was looking to have him on my podcast, and he passed away. Yeah, this was one of his last big projects. Darrell's actually out in Portland right now. I'm sure you guys know about Maajid, right? Oh yeah. Who's one of the best examples of that. Like, someone who was radicalized. It's even the name of his podcast, Radical. His book is Radical. That's... you know, he was a guy who was deeply embedded in this sort of Islamic group, and then went to jail, and realized why he was in jail, started reading and sort of examining this thought process, and came out of it with this, like, sort of brilliant mind to analyze: what is the process where people get radicalized? How does it happen? And he could say it... first of all, he's incredibly articulate, so he can say it coming from this place of, I was this guy. Instead of, like, I know what's wrong with these people. Like, I was these people. I am evidence. Yes, yes, exactly. And that's what, you know, that's what Minds has in terms of doing the research. You know, we've done, like, a polymath, a 360, digging from all different genres: psychologists, former extremists, trolls, all kinds of people, people like myself with boots on the ground dealing with current extremists and things like that.
[333] So all of that comes, you know, into the conclusion of this paper.
[334] Unlike, you know, a lot of other papers, they talk about, you know, why people do this.
[335] Others talk about the effect of what they've done.
[336] Some talk about the cause and the effect.
[337] But we have the cause, the effect, and the solution.
[338] It's just hard to get people to jump onto a new social media network.
[339] That seems to be a real issue because...
[340] Human beings are creatures of habit, not change.
[341] Yes, and you get, if you're used to checking Facebook, oh, let's see what grandma posted.
[342] You're used to doing that, and this is your go -to thing, and you only have so much time in the world, it's hard to get someone to, like, deviate from that, right?
[343] There's no rush.
[344] Like, we're not, we're seeing huge growth, just naturally.
[345] How many people do you have?
[346] More than 5 million.
[347] Five million?
[348] And Maajid's on there, actually. Minds.com slash Maajid.
[349] Of course he is.
[350] Yeah, he just signed up.
[351] And so, you know, it's all just long-term thinking. Where are we actually headed? Where are we going to be in 10, 20 years? And also, it makes it harder to grow, for what you said, people are just stuck in their ways. But also, Facebook and Google used the dirtiest tricks in the book to grow. I mean, they literally latched their tentacles into everybody's phones, grabbed all their contacts, like, you know, followed you in your browser, like, every surveillance tactic they could to grow. Explain that.
[353] What do you mean?
[354] So there are these sort of dark growth hacking tricks that a lot of apps will use to increase their user base.
[355] And it's basically, like, manipulative growth techniques to get people to give them more information than they otherwise would.
[356] So say, let's just say I'm a person who's never used Facebook before and I just got a new phone and I said, you know what?
[357] I'm going to download Facebook.
[358] What happens?
[359] Oh, you know, they take you through their nice onboarding flow, super slick UX because they have brilliant designers.
[360] What's UX?
[361] User experience.
[362] Okay.
[363] So, you know, you just keep pressing that big blue button.
[364] Yep, yep, yep, yep.
[365] Oh, yeah, agree to terms, yep.
[366] And so they just put it, you know, they make it very subtle what you're doing.
[367] And there are benefits.
[368] What is happening?
[369] They're grabbing all of your contact book.
[370] They're grabbing.
[371] So they grab all your phone numbers?
[372] All your phone numbers.
[373] So when you sign up for Facebook, it has access to all of the phone numbers?
[374] Yes.
[375] If you give it to them.
[376] If you give it to them.
[377] You can say no. Twitter is doing this.
[378] They won't, you know, stop it.
[379] Like, you say no, and then, you know, another prompt to do it shows up in your feed.
[380] So are they getting the full contact with the names and everything?
[381] It really depends.
[382] It depends on the specific app.
[383] And they all kind of have different, you know, kind of levels of invasiveness.
[384] And how many people read the entire policy agreement?
[385] How about zero, right?
[386] Who the fuck is reading that?
[387] No. So like if I have you on my phone and I sign up for Facebook, does it get Bill Ottman plus your phone number and then they could target you?
[388] Yeah.
[389] So they could just send you a text message or...
[390] They could.
[391] Or they can sync up to your Facebook that you have.
[392] Yeah.
[393] Because of your app, they're aware that we're communicating with each other, because we both have each other's phone number.
[394] Well, it's like, you know, you go on Facebook and there's some sponsored ad there, and it has a list of your friends that like this ad.
[396] Right.
[397] He's like, I didn't know she liked that.
[398] Yes.
[399] Well, how do they know she liked it?
[400] And how do they know that you might like it because so-and-so liked it?
[401] Yes.
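[Editor's note: as a rough illustration of why uploaded contact books are so valuable, here is a toy sketch of the kind of matching they enable. Once two people each upload an address book containing the other's number, a service can infer the relationship without either of them declaring it. All names and numbers below are made up; this is not any platform's actual pipeline.]

```python
# Toy sketch of contact-book matching: if two users upload address books
# containing each other's numbers, a service can infer that they know each
# other. All names and numbers are invented; not any platform's real code.
from itertools import combinations

def normalize(number: str) -> str:
    """Strip formatting so '+1 (512) 555-0100' and '512-555-0100' match."""
    digits = "".join(ch for ch in number if ch.isdigit())
    return digits[1:] if len(digits) == 11 and digits.startswith("1") else digits

# user -> (own number, uploaded contact list)
uploads = {
    "joe":   ("512-555-0100", ["512-555-0101", "212-555-0199"]),
    "bill":  ("512-555-0101", ["+1 512 555 0100"]),
    "daryl": ("212-555-0199", ["(512) 555-0100"]),
}

contacts = {user: {normalize(c) for c in cs} for user, (_, cs) in uploads.items()}

# An edge is inferred when each user's number appears in the other's upload.
edges = []
for a, b in combinations(uploads, 2):
    a_num, b_num = normalize(uploads[a][0]), normalize(uploads[b][0])
    if b_num in contacts[a] and a_num in contacts[b]:
        edges.append((a, b))

print(edges)  # [('joe', 'bill'), ('joe', 'daryl')]
```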
[402] Let me ask you this, because this is a big one that everybody always wants to know.
[403] Sometimes we're talking about stuff, and then I'll hear an ad.
[404] Boom.
[405] See an ad for the thing we're talking about.
[406] Like, Jamie and I have talked about this many times, where it's like, there is no way this random ad would have popped up just on its own.
[407] It seems like it has to, they have to be listening.
[408] So I wish Lex had asked this question, because this is a key question that, honestly, I'm not going to claim to know the answer to.
[409] I mean, we don't have access to their source code.
[410] So we do not know.
[411] And they've denied it repeatedly.
[412] I think that the geolocation can trick you a lot of the time into thinking that they're listening.
[413] And, you know, there are different associations they're able to make on the back end.
[415] But I don't know the answer, but I know thousands of stories like what you're saying.
[416] And I feel like they're just skirting around it.
[417] And I would not at all be surprised.
[418] But again, if we can't see the source code, we do not know.
[419] So no one has definitively proven that they're actually listening to you.
[420] Well, they are listening because you can say, okay, Google.
[421] How does it know that you're, that you said, okay, Google?
[422] Right.
[423] Oh, hey, Siri.
[424] Yeah, so it's definitely listening for certain cues, and we don't know the breadth of that.
[425] So it could be that it picks up certain words that would indicate products that maybe you'd be interested in buying, and they show you those ads.
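[Editor's note: the usual claim from these companies is that wake-word detection happens on-device: the microphone feeds a small local model, and audio only leaves the phone after the trigger fires. Below is a stubbed sketch of that gating structure; the detector is a fake stand-in for the real acoustic model, which, as discussed here, outsiders cannot audit without the source code.]

```python
# Stubbed sketch of on-device wake-word gating: audio stays in a local ring
# buffer and is only "uploaded" after a trigger is detected locally.
# detect_wake_word() is a fake stand-in for a real on-device acoustic model.
from collections import deque

def detect_wake_word(chunk: str) -> bool:
    # Stand-in: a real implementation runs a small neural net on raw audio.
    return "okay google" in chunk.lower()

def send_to_server(audio) -> None:
    print("uploading:", list(audio))

ring = deque(maxlen=3)  # rolling window of the most recent audio chunks
stream = ["podcast chatter", "more chatter", "Okay Google", "what's the weather"]

listening = False
for chunk in stream:
    ring.append(chunk)
    if not listening and detect_wake_word(chunk):
        listening = True         # trigger fired: open the upload gate
        send_to_server(ring)     # include the buffered context around the trigger
    elif listening:
        send_to_server([chunk])  # forward the follow-up query to the server
```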
[426] I was looking in my Google data history like a couple weeks ago, and I had, you know, I have a bunch of different phones, actually, which I want to show you later, some, like, new open source stuff.
[427] Show me now.
[428] Okay, just in a sec. So the, in my Google data history, it showed when I said certain words, like, that triggered it.
[429] Like, there was all these different words that are sort of commands for assistant, Google Assistant, I think it's called.
[430] And I had turned Google Assistant off.
[431] And yet it still had, it was like on June or on June 18th, you said, you know, hello or whatever it was.
[432] And it was just this whole history.
[433] And I just deleted it all and like turned it all off.
[434] And it's... they're definitely listening for cues. So even if you say no, you opt out.
[435] Yeah, I turn it off.
[436] Now, as he points out, you know, a lot of them are closed source.
[437] Minds is open source.
[438] You can get on, you can see it, and take our code and use it.
[439] And we want you to.
[440] It's available.
[441] Yeah, and Minds.
[442] Transparency.
[443] The way Minds... I've never actually used Minds.
[444] I know I have an account over there, right?
[445] Yeah, you got it.
[446] We'll get you set up again.
[447] We'll make it easy, so you don't have to do anything.
[448] We'll listen in.
[449] It's all about making it easy, because everyone's so busy. But I don't go on any social media anymore. I post, and then I get the fuck out of there. You're an Instagram boy. Yeah. Well, I use Twitter and Facebook too, but I don't use them very much. I use Twitter to see how crazy people are. Like, how crazy are people today? Let me just look through my feed and see who's fucking screaming at everybody, what's going on in Russia, and who believes this is all a conspiracy, and, you know, where's the tinfoil hat brigade on this one. And just to get a kind of a finger on the pulse.
[450] Yeah.
[451] And then with Instagram, I just post.
[452] And then when on Facebook, like, Facebook is nonsense for me. I just, I go there for just nonsense.
[453] I go there for like videos and stuff.
[454] And I'm not even remotely looking to engage with people.
[455] So I don't, I feel like the engagement with people for, for a person with a profile as big as mine, it's too much work.
[456] It's like, it's too, like the interaction with people, it's too toxic.
[457] So many people are mad for whatever fucking reason, and it's a lot of bad faith conversations. Like, I just, for my own mental health, I opt out. I think of it as, like, extra distribution outlets. So, like, someone like you, you know, you got a million things going on. It's just literally impossible to post to multiple places. You don't have time. Is there an app that allows you to post something to, like, a shitload of places? We have... well, we have auto-importing from YouTube and Twitter, so you can pull your stuff in and just post on your own. So we can maybe get that YouTube as well, so you have a video aspect? Yeah, we do.
[458] Yeah, we support video.
[459] We support all multimedia, blogs, messenger.
[460] Do you host that video?
[461] Yeah.
[462] So what are the implications?
[463] Like, what if someone post something illegal?
[464] Yeah, that gets taken down.
[465] Okay.
[466] Yeah.
[467] So we definitely are, like, U.S. law, First Amendment based.
[468] Right, right.
[469] So just quickly to show you.
[470] So these are... this is called a Librem.
[471] This is made in the U.S. So this is, like, trying to get rid of conflict minerals.
[472] And it's very heavy.
[473] It's a tank.
[474] Does it suck?
[475] It kind of sucks. They're a great team. Honestly, it's not an easy project. It's amazing for how hard of a project it is. It does not suck. It's a legitimate effort. This is called a PinePhone. What are all these things on the back, these, like, switches? Yeah, don't switch those, because I don't know what they do yet. I just got it, like, two days ago. Does it charge?
[476] Is it charged up?
[477] Yeah, I'll have to charge it later. Okay, that's the Librem 5. Yeah, yeah, a security and privacy focused phone.
[478] You gotta go to the USA one.
[479] Because I think, importantly, made in the USA does matter, because you talk a lot about the conflict mineral situation with phones.
[480] And I've seen you bring up other phones.
[481] There was like the fair phone.
[482] There was like some other attempts at it.
[483] And I think...
[484] What happened, Jimmy?
[485] So this one...
[486] What is this one?
[487] That one's called the PinePhone.
[488] Yeah, I've heard of this one as well.
[489] Yeah, so that's much cheaper.
[490] This one's like 2K.
[491] Really?
[492] Yeah.
[493] Why is it so expensive?
[494] Because it's not...
[495] It's a beastly machine.
[496] This is a computer.
[497] Right.
[498] So you can run this on, you can hook this up to a monitor.
[499] This is a Linux OS.
[500] Yeah, it's got kill switches.
[501] That's what those buttons are.
[502] Physically disconnect the components.
[503] And the CIA is like, yeah, yeah, go ahead.
[504] That works.
[505] Yeah.
[506] Use our unique hardware kill switches to physically disconnect Wi-Fi, Bluetooth, cellular signal, microphone, and camera.
[507] Yeah, because a lot of it has to do with the chain of custody of these products.
[508] because proprietary surveillance chips will get added to the phone in its life cycle throughout the factories globally.
[509] So they're saying, look, we need to make sure to be able to commit to our customers that there's no sketchy chips on this thing that's feeding data to some place we don't know about.
[510] Right.
[511] And that was the big thing with Huawei, right?
[512] One of the things about banning Huawei in the United States was they had proven that some of their routers were allowing third parties to access the information as it's distributed between the two parties.
[514] So a third party could come in, scoop up all the, you know, intellectual property and just use it.
[515] And, you know, sometimes some of these companies, they work both sides.
[516] You know, so the ones that create the device to prevent something is the same company that creates the device to take something.
[517] Like in the Washington, D.C. area, for example, a few years back, D.C. was being sued over the cameras, the red light cameras. You know, you run the red light, you get a ticket.
[518] Well, Lockheed Martin had created those cameras, and they were shortening the length of the yellow light.
[519] So you got a bigger chance of running the red light, all right?
[520] So for every ticket that was written, Lockheed Martin was getting a dollar, and the rest of it would go to the D.C. Police Department, right?
[521] So dirty.
[522] But Hewlett-Packard, you know, the same ones who make the radar gun that you get caught by are the same ones who make the radar detector that we use.
[523] So they get money from both ends, you know?
[524] Well, it's good business.
[525] This one, is this as good?
[526] The pine phone?
[527] I honestly, it's all very early.
[528] Yeah, sorry that it's not charged.
[529] Have you used any of these?
[530] I just, so, I just picked up these from the...
[531] Because if you just use an iPhone, we got a fucking problem.
[532] No, I do.
[533] I do.
[534] I'm using...
[535] I got this because the cool thing about the Librem is that you can plug this into a monitor with a keyboard.
[536] And this is a computer.
[537] Do you have a USB-C charger?
[538] You could maybe...
[539] I have one.
[540] You do?
[541] If Jamie wants to put that in, he can plug this in.
[542] I'd like to see what that thing's all about.
[543] Something like this?
[544] One, yes, perfect.
[545] The other thing is, you know Adam Curry, right?
[546] I've seen...
[547] Yeah, I've seen it.
[548] Adam Curry, who is the original podfather.
[549] He's like, he is literally the man who created the original podcast.
[550] His, under your leg, there's...
[551] You're wrapped up, though.
[552] You're wrapped up.
[553] Yeah, um, there's a... see, it's under the table. Oh yeah, yes, it's connected, actually connected to the table. Adam has this No Agenda podcast, and they have a No Agenda phone, and it's essentially a de-Googled Android phone that removes all of the tracking stuff, all the stuff where, you know... But you can't use navigation on it. There's a lot of shit you can't use. Is it based on Graphene? I do not... no, I think it is. I'm not sure. Jamie, check out the No Agenda phone. Okay, sorry. Um, this thing... this is not really usable, like, it's not really a replacement for your standard phone. This is not Android-based, and neither is the PinePhone, actually. Um, so this is it: No Agenda phone, small batch, artisanal, secure, private. So is it open source? Go to the footer, go all the way down to the footer. So typically... scroll up. It looks like it's not. Are they publishing their code?
[554] Keep going up.
[555] It is graphene.
[556] It's graphene.
[557] That's good.
[558] Good.
[559] All right, great.
[560] So this OS is essentially raw AOSP, Android Open Source Project, with some custom bits.
[561] If you choose a phone that is supported directly by Lineage and not some random developer on XDA, it is as secure as the G variant.
[562] If you choose to build, I don't know what they're saying here.
[563] Do you know what they're saying here?
[564] All right.
[565] That's legit.
[566] So Graphene OS is the most secure option endorsed by Edward Snowden, entirely funded by donations, and a guy.
[567] What does that mean?
[568] What does that mean?
[569] I don't know.
[570] How fucking random is that?
[571] A guy.
[572] The OS is updated and patched more often than G does with every conceivable method of hardening possible.
[573] The only downside for casual adopters is a relatively limited compatibility layer for apps and access to services similar to those provided by G. G must be Google.
[575] So would it be less effective if it was entirely funded by donations and a gal?
[576] Yeah, or they?
[577] It says, so Edward Snowden speaks on it at the bottom.
[578] It says that software is equally important.
[579] The iOS and Android operating systems that run on nearly every smartphone conceal unaccountable numbers of programming flaws, known as security vulnerabilities.
[580] That means common apps like iMessage or web browsers become dangerous.
[581] You can be hacked.
[582] So he uses Graphene OS.
[583] This is his base operating system.
[584] Yeah, I think that's a legit project, for sure.
[585] Graphene seems like a stripped-down version of Android, and Google created Android.
[586] But there's a lot of stuff that you can't use, right?
[587] Like navigation.
[588] That's a big one for me. Yeah, you can, no, I think you can get probably like open street maps and have some very simple navigation.
[589] You're going to be lost as fuck.
[590] You got a friend in the country.
[591] You're not going to find them.
[592] Yeah, I mean, look, it comes down to, like, are people willing, what sacrifices are people willing to make?
[593] And can we have a reasonable conversation with these companies to find a middle ground?
[594] Well, the key would be, then, to have that Graphene phone, right, with kill switches.
[595] So you use the navigation when you need it, then kill it.
[596] Yeah, I found out that, like, Find My iPhone...
[597] Doesn't that work even if your phone is off?
[598] How the fuck does it do that?
[599] Android don't do that.
[600] No?
[601] I don't think so.
[602] Is it you think Android's better?
[603] Well, I'll tell you, I'll tell you, a quick experience, right?
[604] Okay.
[605] I was flying with, won't name the airlines, and I got off the plane, went to turn it on my phone, it wasn't there.
[606] So I tried to get back on the plane, because I figured it fell down the seat.
[607] Oh, you can't get back on the plane, sir.
[608] What seat were you in?
[609] We'll go look for your phone.
[610] Give them the seat number.
[611] They come back five minutes later.
[612] We check between the seats under the seat in the pocket.
[613] No phone.
[614] You know, call customer service or email customer service.
[615] give them a description, they'll look for it for 30 days.
[616] So I went through all that, and every couple of days, they would contact me. Now, I called Verizon: you know, can you track my phone?
[617] Well, we can't track your phone if it's turned off.
[618] That's what they told me. I have an Android.
[619] Okay, so I kept calling to see if anybody used it, right?
[620] And I kept calling my phone and stuff, nothing.
[621] So every couple days, they will let me know, we're looking for your phone.
[622] We're looking at you.
[623] We haven't found anything yet.
[624] And then in 30 days, they would stop the search, but if anybody turns it in, whatever, they'll let me know.
[626] So they gave me an email on the 30th day saying, you know, we're sorry we had not located your lost item.
[627] However, if anybody turns it in, we'll let you know.
[628] On the 40th day, I got an email from them saying, your lost item has been found.
[629] It's in the lost and found.
[630] Now, I left it in the seat in D.C. when I was flying somewhere.
[631] And they found the phone, and it's in the lost and found at the Houston Airport.
[632] Now, that plane flies around the country 10 times a day, gets cleaned 10 times a day, every time people deboard, right?
[633] And so people are cleaning that plane for 40 days, and nobody found that phone.
[634] It was found deep in between the seats.
[635] It had fallen off my hip, right?
[636] So the moral of the story is, you know, the best clean planes leave out of Houston.
[637] So you couldn't find it the way you find an iPhone. So iPhones, apparently there's, like, some signal being sent with Find My. Yeah. So that alone is a little bit of a red flag, right? Yeah. I mean, Apple, they try to have this privacy argument, and it's so shocking to me that they try to push that. Like, oh, you know, we're not going to let the FBI in, like, trust us. And look, Apple makes beautiful products.
[638] Everybody knows that they're the best designers in the world.
[639] What do you use?
[640] I use stripped Android and I'm going to start using this because this can be my computer as well.
[641] So I'll, but I'll, you know, I'll use both but I'm playing with all the options right now.
[642] But like I like to use Linux as my desktop computer.
[643] Can I see what you use?
[644] That's your phone?
[645] This is my phone.
[646] This is a, well, this is just Android.
[647] That's just regular Android.
[648] Yeah, this is just regular Android.
[649] It's not even stripped?
[650] This is a non -stripped version.
[651] So what, a privacy guy having a phone that's tracking him everywhere he goes?
[652] I have, dude, we're all in the midst of this world.
[653] I have five, look at me. I mean, come on, you got to give me credit for having five different phones.
[654] Do you have different phones because you have to check how Mines is on different operating systems?
[655] I'm not doing, I mean, I'm not doing all of our QA, but like I have Linux devices, Windows devices, Apple devices.
[656] I use it all.
[657] So you use an Android as your main phone.
[658] Yes.
[659] Why do you do that?
[660] I do that because Android is at least open source in its base function.
[661] So I will, over Apple, I will choose Android, because, like we see with Graphene, you can fork Android and create a stripped-down version.
[662] Now, it is absolutely imperative.
[663] I need to get a graphene, pure graphene version.
[664] That's on my list of things to do.
[665] I've got, you know, there's a clear OS here, which is, What's that?
[666] What is Clear OS?
[667] That's another open source Android.
[668] Does this, ooh, that's pretty.
[669] Does this have some sort of GPS system?
[670] I think it does, yeah.
[671] You want to open that so I can check it out?
[672] I mean.
[673] So what people are concerned with is obviously someone being able to access their information, someone tracking them I think people are concerned and like all right so look at me for an example like I'm sort of in this privacy world but I'm also not like a privacy I'm a privacy maximalist for what I want to be private I'm not saying like I'm never going to use any big tech app ever it's just an irrational impossible mission I've driven myself crazy like thinking that I should do that you have to go off the grid there You got to go off the, like, I'm going to be a human, okay, and I'm going to explore all the different options and hopefully transition.
[674] So, like, I have gotten rid of most... I don't use big tech nearly as much as I used to, probably, like, 10% of what I used to.
[675] I deleted most of my accounts.
[676] I'll check in sometimes because I like to see what's going on and to understand the market.
[677] But, you know, I'm not going to... I need to get around, too, with maps.
[678] And so I'm going to, as soon as we have an alternative, I will do it.
[679] I'll be the first one in line when someone can put something in, I'm trying to get all these options in front of me. But it seems like operating systems and applications, the trend is for them to get more intrusive, right?
[680] Like, TikTok supposedly, people reverse engineered it and said it is the absolute worst software that they've ever examined in terms of, like, violating your privacy.
[681] Yeah, let me just go through.
[682] Also, on my actual phone, I'll just name a few apps which I think are a huge part of, like, the privacy future. Because, like, it's not all about... you know, Minds is a part of a bigger network. Like, the ultimate place where things are going, there's not going to be some new centralized replacement for Google. It's going to be protocols that apps are all interoperating on. So, like, Briar is this amazing app that's currently going viral in Ukraine, and Julian Assange actually posted about this app; from prison, he was able to communicate to his people.
[683] So Briar is fully decentralized.
[684] It runs over Tor, and it can even run offline.
[685] So you can, we could chat over Bluetooth.
[686] I could be in a burning building and you're across the street in Ukraine.
[687] We're getting bombed.
[688] Internet is down.
[689] And we're chatting.
[690] Like, unbelievable mesh networking technology.
[691] It's Briar, like, B-R-I-A-R.
[692] Yeah.
[693] So it's been a long time coming for them.
[694] We're looking at integrating with the Bramble protocol, which is kind of the base protocol of Briar.
[695] But, you know, there are a handful of fully decentralized options, also Secure Scuttlebutt and some others.
[696] But it's really cool.
[697] I recommend checking it out.
[698] And I think that off -grid technology that's not reliant on Internet service providers is just, I mean, that's crazy.
[699] The fact that that's even possible to chat with no internet.
[700] It is crazy.
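[Editor's note: a toy simulation of the store-and-forward idea behind that kind of mesh messaging. Each peer keeps a local message store, and whenever two peers come into range over Bluetooth or Wi-Fi they swap whatever the other is missing, so a message can hop across devices with no server or internet. This sketches the concept only; it is not Briar's actual Bramble protocol.]

```python
# Toy simulation of store-and-forward mesh messaging: peers sync stores when
# "in range", so a message hops from Alice to Carol via Bob with no server.
# Conceptual sketch only; not the real Bramble/Briar protocol.

class Peer:
    def __init__(self, name: str):
        self.name = name
        self.store = {}  # message id -> text

    def post(self, msg_id: str, text: str) -> None:
        self.store[msg_id] = text

    def sync(self, other: "Peer") -> None:
        """Bidirectional exchange, as if the two radios just met."""
        for msg_id, text in other.store.items():
            self.store.setdefault(msg_id, text)
        for msg_id, text in self.store.items():
            other.store.setdefault(msg_id, text)

alice, bob, carol = Peer("alice"), Peer("bob"), Peer("carol")
alice.post("m1", "internet is down, meet at the shelter")

alice.sync(bob)     # Alice and Bob pass each other on the street
bob.sync(carol)     # later, Bob walks past Carol

print(carol.store)  # {'m1': '...'} delivered with no infrastructure at all
```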
[701] And then, and sometimes, you know, when you go to a different country with your phone, you know, you have to be compliant with that country's internet laws.
[702] I mean, they can get into your phone where maybe the U .S. can't.
[703] Right.
[704] Do you think, like, in terms of privacy, would you recommend Google or Apple?
[705] Because like, that's a, that question is not a question.
[706] Really?
[707] No. But isn't... doesn't Apple at least give you the option to block advertisers from getting access to your information, block cross-platform or cross-application sharing of data?
[709] They've been locking down their app store, which has taken, like, billions of dollars away from Google and Facebook advertising because they don't allow apps to do what they used to be able to do.
[710] Right.
[711] Isn't that good?
[712] That is good.
[713] That is good.
[714] So Apple would be a better choice?
[715] I don't know.
[716] I think that, yes, that is a good thing in sort of cost-benefit terms.
[717] So, but Apple is the mother of closed systems.
[718] I mean, Steve Jobs literally said, proprietary.
[719] It's like, we have a closed walled garden.
[720] Yes.
[721] And that was his whole thing.
[722] Like, we do not want anyone seeing what we're doing, hyper competitive.
[723] Apple does very, you know, relatively little open source compared to a lot of other companies.
[724] Well, I remember the days of clones where you could buy a fake Apple machine that runs Mac OS.
[725] And they shut them all down, because you could buy, like, a bomber machine that has, like, crazy power and gigantic hard drives, and, like, multiple hard drives, way more potent than anything that Apple was selling, like, in the 90s. And they banned all that stuff. Is that for gaming and stuff? Yeah, for gaming, and just for people who do, like, video editing, and just people that wanted, like, some crazy, ultra hyped-up machine. And it would still run the Mac OS, and this was the early Mac OS, you know. This was before OS X, which was the Unix-based operating system.
[726] That was back when Apple's operating system was a little janky.
[727] It was kind of sketchy.
[728] Crash a lot.
[729] You know, no preemptive multitasking.
[730] It was like no memory protection.
[731] It would crash.
[732] People were, like, really devoted to it, but that shit would crash a lot until OS X came along.
[733] Do you feel like you're at all willing to sacrifice any convenience in your technology?
[734] Yes.
[735] Yeah, I'm willing to sacrifice some.
[736] What, like, what would be something that you would be willing?
[737] That's a good question.
[738] I ask myself that all the time.
[739] Because I'm fucking sure the government's paying attention to my phone.
[740] You know, so it's like, what, you know, what am I willing to sacrifice?
[741] You know, right now I just, I treat all interactions as if the government's watching.
[742] That's what I do.
[743] Yeah, like there's an Android app store called F-Droid.
[744] Yeah.
[745] Which is, you know, like a non -Google play app store.
[746] so you can actually get apps outside of Google Play.
[747] On iOS, you can only get apps on the App Store.
[748] But how do you know if you go to this F store?
[749] Is that what it's called?
[750] F -Droid.
[751] F -Droid.
[752] How do you know if you go to this F -Droid whether or not this is spyware?
[753] Is it vetted?
[754] Well, how do you know anywhere? There's spyware in, half the apps on, 90% of the apps on Google Play are spyware.
[755] Really?
[756] Yes.
[757] I mean, every app you install is, like, infecting your system.
[758] Most of them, because most apps.
[759] are proprietary.
[760] But Google is worse for that.
[761] I mean, every app is different.
[762] Every app has different permissions that they're giving and, you know, different security implications.
[763] So I don't think that there's, I think that the people at F-Droid and in the open source community, as a general trend, just care about these things more.
[764] So they're not going to, you know, but there still can be malicious stuff in the open source realm.
[765] But, you know, you got to kind of understand the scene. And that's a lot of work.
[766] It is a lot of work.
[767] But do you research the food that you eat?
[768] Well, it's pretty simple.
[769] Is it?
[770] Yeah.
[771] It's a pretty big education.
[772] I think there's a lot of people who don't know.
[773] Yes, but I've already had that education.
[774] Right.
[775] It's the same way with tech.
[776] Yeah.
[777] Not sure.
[778] There's a learning curve.
[779] But it seems like people are making new things.
[780] Like, they're not really making new food.
[781] Real food.
[782] Like, real food is kind of been established.
[783] Impossible burgers.
[784] Yeah, that's not food.
[785] That should say.
[786] Terrible.
[787] Have you had one?
[788] Darrell, have you had one?
[789] I've not.
[790] I've seen them, but I've not had one.
[791] Have you?
[792] No. Really?
[793] Yeah, no, actually, I told a lie.
[794] I did.
[795] It was a lie, rather.
[796] I did.
[797] We did a show once at Stubbs, and my friend C.K. brought a bunch of burgers from a bunch of different places, and some of them were plant-based.
[798] So I took a bite.
[799] And just, like, it's like a bland burger.
[800] Yeah.
[801] I actually switched from vegan.
[802] To what?
[803] To what?
[804] I was vegan for, like, four years.
[805] Switch to what?
[806] Switched back.
[807] Switched back.
[808] Yeah, switch back.
[809] I went back to eating meat, yeah.
[810] Well, my wife actually, so has an autoimmune issue, not to overshare.
[811] But so she, when we were vegan together, have you heard of Weston Price?
[812] No. He's a really famous nutritionist and has this diet, like heavy into organ meats and...
[813] Nose to tail, that kind of thing.
[814] Yeah, fermented foods and probiotics and stuff.
[815] and so she was being told by her doctor that you have to go on this drug called Remicade, every six weeks, IV, for the rest of your life.
[816] What is it for?
[817] It's for Crohn's.
[818] Okay.
[819] And she was like, what?
[820] Life?
[821] Like, every six weeks.
[822] You're kidding me. And so she was just like, no, I'm not, I'm...
[823] Okay, so she switched her diet.
[824] So she switched her diet and is in remission.
[825] Really?
[826] And she, like, they're...
[827] And her diet consists of what now?
[828] Um, it's pretty much, uh, I mean, if you look up Weston Price, but, you know, meats, uh, fermented foods, um, just avoiding bread, sugar.
[829] Salmon, yeah, yeah.
[830] And she, so there's been studies done on cabbage juice.
[831] This is a big thing for people with ulcers.
[832] There have actually been studies that, uh, hardcore cabbage juice for like six weeks can reverse ulcers.
[833] And there have been studies on this.
[834] And her, her regular gastro doctor didn't even know about that.
[835] And that, she credits a huge.
[836] huge transformation from the cabbage juice regimen.
[837] Anyway, not to go off on a tangent.
[838] We are on a tangent, but your own personal experience, what was the difference between going from vegan to eating meat again?
[839] I mean, I respect vegans.
[840] I really do, especially the ones who aren't annoying.
[841] There's like five of them.
[842] There are five of them.
[843] There's this guy, Ed, Earthling Ed, he's very honest and not preachy and has good information.
[844] But anyway, like, I feel better.
[845] I love eggs and meat and all that stuff.
[846] And I do feel like healthier, but I think I was healthy vegan.
[847] And you can be healthy vegan.
[848] And I wouldn't be surprised if in a thousand years humans are not eating nearly as much meat.
[849] Well, why did you decide to go back?
[850] Because I wanted to, you know, I didn't need that much of a reason, you know, doing it with my family.
[851] And also, it's just.
[852] You know, I'm not ideological about stuff. I don't want to get stuck in ideology about food or whatever. This is why I wanted you to talk about this, because this is exactly the kind of conversation that some people would like to suppress. Because there are people that say that eating meat is bad for the environment, and I've had a bunch of people on to try to discuss that, pro and con. The latest is, what is Diana's last name? It's Robb Wolf and Diana Rodgers. They wrote a book called Sacred Cow. We were talking about regenerative farming with them. But there are people that think that those conversations should be suppressed, and that when you have these kinds of conversations, they should be flagged, you should be shadow banned.
[853] There's a lot of people that promote the carnivore diet on Instagram that find themselves shadow banned.
[854] And they have like real issues.
[855] Paul Saladino, Carnivore MD.
[856] I think they took his account down.
[857] As misinfo?
[858] I don't know what the fucking excuse was.
[859] I think some wacky vegan activist who works with the company, can just decide that they're going to take your account down.
[860] I think there's a certain amount of control that the people that work there have where it's very subjective.
[861] So, imagine, so Facebook spends tens of billions on moderation.
[862] And they have.
[863] And so our vision, imagine if rather than tens of thousands of censors who are just going, like, down, down, take it down, hate speech, misinformation, conspiracy theory, what if you had tens of thousands of mental health professionals and positive intervention people, and just, like, people engaging in dialogue, who can provide mental health resources to users who need it, to share information. Like, I'm not saying you need no moderation. You definitely do need a certain level. But that's so much money and human energy. I mean, you've seen the PTSD studies of these content moderators at Facebook. These people get depressed, they're suicidal, because they're just seeing Al-Qaeda videos all day.
[864] Yeah, they're just watching crazy stuff, and that's a real thing, but that is unavoidable to a certain degree.
[865] But, I mean, to bring in experts in dialogue to engage, imagine if Facebook spent billions of dollars on that, mental health resources for the community.
[866] Would that be effective?
[867] Well, yeah, because, I mean, look at it this way.
[868] say 25, 30 years ago, insurance companies were not paying for acupuncture.
[869] Oh, that's nonsense.
[870] It was, what, called a placebo or something.
[871] Now they do.
[872] Now they see value in it.
[873] Well, Chinese people have been using that for 2,000 years.
[874] Would they still be using it 2,000 years later if it wasn't working?
[875] So now we're accepting, you know, some Eastern culture.
[876] Now we're, you know, when our doctor does not give us what we hope will cure us for our cancer, or diabetes or whatever, we go to the holistic route.
[877] And we found some pretty amazing results.
[878] And that's what, you know, Minds is doing, the holistic approach, by giving everybody a platform to share their information.
[879] Like you just shared about the cabbage juice.
[880] You know, somebody hears this podcast and goes out and tries cabbage juice and it clears up their wife's ailment or something like that.
[881] And this is a good subject to talk about now because we just got through the pandemic.
[882] and that was one of the things that was suppressed was information about methods of treating COVID.
[883] I mean, it was a giant issue where if you talked about, whether it was hydroxychloroquine or ivermectin or whatever you would talk about, even vitamins, talking about like the difference between the COVID results of people that were vitamin D insufficient versus people that had sufficient levels.
[884] It's a giant difference.
[885] But if you talked about that, you would get in trouble for disinformation or misinformation, and you would get either shadow banned or outrightly banned.
[886] I mean, there are people that were banned from social networks for suggesting that people who are vaccinated can still spread COVID.
[887] That turns out to be an absolute fact now.
[888] But if you said that eight months ago, nine months ago... Instead of having this conversation and having medical experts debate it, and people that understand it and don't understand it ask questions, and people who are following the standard narrative express themselves.
[889] And then people that have alternative ideas express themselves.
[890] And we find out what's right and what's wrong.
[891] Somebody expressed that it could be treated with bleach, right?
[892] Wasn't that Trump?
[893] Yeah.
[894] He said like, yeah, like an infusion of bleach.
[895] There should be warnings.
[896] But imagine if rather than a fact check warning like, you know, these three think tanks said that this is false.
[897] What if you could actually see a visualization of the debate that showed both sides, and gave you, like, a probability score or something on the piece of content, as opposed to saying, like, black or white? Well, who checks the fact checkers? Yeah, there's a lot of fact checkers that are just full of shit. Like, there's a lot of things that are rated mostly true or mostly false, and you look into it and you're like, fuck you, this is not mostly false or true. It can't be mostly. It's like somebody being sort of pregnant.
[898] I mean, either you're pregnant or you're not, you know?
[899] But if there's multiple statements about an issue and some of them are correct and some of them are not, then it would be like mostly true.
[900] Well, take a piece of COVID, you know, content like you were talking about and, you know, there's going to be studies on one side and another.
[901] What do you do on minds for that stuff?
[902] Well, we're building out a sort of citation tool to kind of show the citations on both sides of various arguments and, you know, have it more crowdsourced.
[903] This really gets into the realm of decentralized identity, and where we're moving in terms of, like, reputation and credibility on the internet. Like, right now you've got all these different logins. What we're talking about, where things are going with crypto and with, like, the web standards, is we're moving towards a place where you have these credentials associated with your core identity, which can be generated from, like, a crypto wallet or something like that. And you'll have all these badges that you're earning everywhere you go.
[904] And you can decide to disclose those or not disclose those, like NFTs.
[905] I mean, right now at - I'm confused.
[906] What are you earning badges for?
[907] Can we see the interface?
[908] Would you pull up Minds so we can see the interface?
[909] Yeah.
[910] So ultimately, credibility on the internet.
[911] It's like how do you measure that?
[912] How do you trust users?
[913] So it's like if I say, oh, Bill is a very good guy.
[914] He says a lot of true things.
[915] He's very reasonable.
[916] So you get a badge for that?
[917] there could be any infinite number of, you know, badges that you could potentially earn.
[918] But, like, you could be trusted by, say someone in martial arts trust you, and they give you a signal of trust.
[919] Then that would add to your credibility in martial arts in your decentralized identity on the Internet, which would be interoperable between social networks.
[920] So that there's sort of this web of...
[921] Oh, look, I got a page.
[922] You got a page.
[923] Who put my picture up there?
[924] It's your YouTube.
[925] You took over the account.
[926] You asked me for the creds.
[927] I know.
[928] I mean, I'm just saying who took my picture up there?
[929] Some fan or something.
[930] I don't know, put my picture there.
[931] Some fan can just put my picture up there?
[932] I don't know.
[933] I mean, people create fan pages or not.
[934] Okay, so like in 2020, I posted something, it says.
[935] And it got 31,000 views.
[936] Wow, look at that.
[937] Huberman, that episode's down.
[938] Okay, so what is making these things post?
[939] I don't know.
[940] They must have just, someone must have just posted them. Well, how can they post it under my name? It might have been just linked from YouTube, because these are just YouTube posts, right? But how are they linked in my name? I didn't do that. Um, I don't know. Maybe. I have a hundred and eighteen thousand subscribers, dude. You probably have a bunch of tokens, too. Oh, I have tokens? So you should... We'll figure it out. But so this is two years ago. This is August 13. So this is during the pandemic.
[941] I 100% didn't post that.
[942] Okay.
[943] So someone is posting that in my name.
[944] Did my account get hacked?
[945] Imagine my account at mine's got hacked and some dude is just posting.
[946] I mean, I sent you the password.
[947] Do you change it?
[948] I don't know.
[949] Well, we'll figure it out.
[950] Okay, we'll figure it out.
[951] But someone's posting as me and I have 118,000 subscribers.
[952] I should probably get on that.
[953] Yeah.
[954] So you said it was, I think it was connected.
[955] I think it was connected.
[956] It was pulling in your YouTube and, you know, something might have gotten.
[957] I'll fix it.
[958] Fuckery.
[959] Just doesn't seem good, Jamie.
[960] There's nothing on it.
[961] It's just a link to your YouTube link.
[962] It's just a YouTube stuff.
[963] But some of them are missing.
[964] Some of the YouTube videos are missing.
[965] Well, that's because they're missing from YouTube.
[966] We don't have everything up on YouTube.
[967] Right.
[968] Oh, that's right.
[969] When we changed over to Spotify, we removed some of the stuff where, you know, Spotify only allows us to keep 100 full-length episodes on YouTube at a time.
[970] Most of it has to be, you know, they're trying to channel.
[971] Here, go to minds.com slash change.
[972] So this is the link to...
[973] So this is the stuff that you guys are doing.
[974] This is Change Minds.
[975] Yeah, this is the link to the paper.
[976] This is the censorship effect.
[977] Yeah.
[978] And so Vox, who are, like, very strongly left-leaning, when they have a piece that they write saying that there are harmful effects of censorship, that it actually pushes people towards more radical ideas, like, what are they suggesting?
[979] Are they suggesting that places like social media, sites like Twitter, back off of censorship and maybe choose an alternative?
[980] They're not going that far, but I think it's a step in the right direction.
[981] They also talk about the reach.
[982] A lot of their question is, what is the reach of content?
[983] Like Alex Jones, for instance. We did an empirical analysis of his reach after he got banned, and it actually went up globally.
[984] So in terms of all of the views on - Let me stop you right there.
[985] Wouldn't it keep going up if he wasn't banned?
[986] Like, because everybody goes up.
[987] Like, my shit goes up every month, right?
[988] So when you say his went up, does that mean it went up at the same proportionate level it would have if he stayed on Twitter?
[989] or did it go up just based on the baseline of him being banned?
[990] Yeah, I mean, that's sort of an impossible thing to know because you can't really know what the world would be.
[991] You could follow the trend line.
[992] Yeah, you could track the trend.
[993] I mean, I think that what Vox is saying is, it depends if, like, you know, Alex has a platform.
[994] So he was going to grow a huge kind of either, in either direction, I would imagine.
[995] But small people, when they get banned, you know, that kind of gets buried.
[996] You know, no one's complaining when some random person posting COVID posts gets banned from Twitter. They're just lost.
[997] Right.
[998] There's millions of people who just get lost from that.
[999] And so, you know, anyway, in the analysis, we saw the total views of Alex's content went up significantly. But, you know, it's called the Streisand effect, and there's variation on that. And I think censorship also definitely works, like, in an isolated system.
[1000] So if you're on Google or Facebook, like, or Twitter, like, yeah, you can silence.
[1001] certain words or topics. But when you're thinking of the internet as a whole, then, um, you know, the total reach is not necessarily going down. And we need to start thinking about the internet as a whole, not just isolated networks. Like, you can't claim that censorship of COVID misinfo worked when you just banned it from Google and it just went up. Like, what about the global numbers? That's what we need to be looking at. So when you guys got together, how long have you guys been working together?
[1002] Four years?
[1003] Yeah.
[1004] And when I first joined on, it was just approaching 2 million members, and now it's over 5 million.
[1005] So it is growing.
[1006] Oh, yeah.
[1007] Minds is growing.
[1008] It's like, well, hopefully you get a lot more after this one.
[1009] But it's the difference between that and Facebook.
[1010] Like, what is Facebook?
[1011] Oh, God, billions.
[1012] And Twitter?
[1013] Hundreds of millions.
[1014] If not billions.
[1015] I don't, right?
[1016] Yeah, probably close.
[1017] Yeah.
[1018] I don't know.
[1019] So there's a giant difference in terms of the user experience.
[1020] Yeah, yeah.
[1021] But here's the crazy thing.
[1022] Like, you can actually get more reach on Minds than Facebook or Twitter if you're a small creator.
[1023] Because small creators, like getting out of the void on social is so hard.
[1024] And we have this reward mechanism where you can earn tokens and boost your content.
[1025] And, you know, we also
[1026] just rolled out this build-your-algorithm feature where you can actually opt in to see people who are different from you or similar to you. Or you can opt in to increase the level of tolerance that you have to ideas that you disagree with. How do you adjust that? Um, there's these, uh, toggles. So, say if you're a vegan and you're, like, maybe, you know, starting to feel a little sick, like, maybe we should pay attention to some of these carnivore people. Yeah, you could let a little of that in? Yes. Yes, exactly.
[1027] Open up your recommendations to not just stuff that's going to bring you down your own echo chamber, but expand it.
[1028] Now, Darrell, I want to talk to you about your personal experience on minds with what you do, what you're known for.
[1029] Have you had interactions with people on minds that have been favorable, that you've kind of pushed people into a...
[1030] Yeah, I've had a few, and I've had my share of detractors.
[1031] Some people think, you know, what I'm doing is totally wrong and don't get me or whatever.
[1032] But yeah, I've had interactions for some people.
[1033] When you say, like, people have said it's totally wrong.
[1034] Like, what kind of criticisms do they have for that?
[1035] It depends upon where they're coming from.
[1036] Some people think it's not my job to teach white people how to treat us.
[1037] Us mean black people.
[1038] Others think it's ridiculous to sit down with a white supremacist.
[1039] Why would you waste your time?
[1040] You know, those people can't change.
[1041] Do you point to your success ratio?
[1042] Because it's pretty amazing.
[1043] Oh, yeah.
[1044] I point to that.
[1045] But a lot of people, you know, they don't see that, because they would not tolerate the time to sit down and have somebody tell them some nonsense, that Jews are the children of the devil or, you know, some crazy things like that.
[1046] I will sit and listen to that and I will put up with it because in order for me to speak my mind, I have to listen to somebody else's.
[1047] Yes.
[1048] Right?
[1049] So they're not willing to put in that time.
[1050] I am.
[1051] Yeah, and so when, how do you have the time to do this?
[1052] This is what I do, in between my music gigs.
[1053] Yeah, but I mean, like, that's, what kind of commitment are you talking about?
[1054] Like, how much time do you spend doing this?
[1055] A lot.
[1056] I mean, it's my life now.
[1057] How many, like, email, dialogue?
[1058] Oh, God, I get emails all the time.
[1059] I get emails from people I don't even know.
[1060] I even get emails from people who've seen me on podcasts or on TV shows.
[1061] These are white supremacists, clansmen, whatever.
[1062] and say, you know, you made some sense in it.
[1063] Would you like my robe?
[1064] I've even gotten robes in the mail from people I don't even know.
[1065] Yes.
[1066] Yeah.
[1067] I think there's a lot of sad people that just need a group of people to belong to.
[1068] And they'll decide that what these people are saying makes sense because at least they'll be a part of it now.
[1069] Let me explain something to you.
[1070] As you already know, one's perception is one's reality.
[1071] Okay.
[1072] You cannot change anybody's reality.
[1073] If you try to change their reality, you're going to get pushed back.
[1074] Because they only know what they know.
[1075] Whether it's real or not, it's their reality.
[1076] So what you want to do is you want to offer them a better alternative perception.
[1077] And if they resonate with your perception, then they will change their own reality because their perception becomes their reality.
[1078] Just just a quick example.
[1079] Let's say you got a seven or eight year old brother, right?
[1080] And he goes to a magic show with his buddies.
[1081] And he comes back and tells you, Joe, you know, this magician, he asked for a female volunteer, and 50 women raised their hands.
[1082] He picked this one to come up on stage.
[1083] He told her to climb into this long box and stick her feet out that hole and put her head out this hole.
[1084] Then he closed the lid, told her to wiggle her feet, and she kicked her legs.
[1085] And they took a chainsaw and cut that box in half.
[1086] He cut that woman in half.
[1087] And you're like, ah, it didn't really happen like that.
[1088] Yes, it did.
[1089] I was there.
[1090] You weren't even there.
[1091] I saw it with my own eyes.
[1092] You are challenging his reality.
[1093] He knows what he saw.
and that magician cut that woman in half. And then, to make it even more obvious to you, he tells you that the magician, after he cut the box in half, took the half with the legs sticking out and moved it over here to stage right, and the half with the head over here to stage left. And then he went over there and talked to the head of the woman, and she talked back to him. And then he brought the two halves back together, opened the box, and out popped the woman, full form, no blood.
[1095] He cut her in half and he put it back together.
[1096] You're saying, it was just an illusion.
[1097] No, it wasn't.
[1098] I saw, with my own eyes, I was there.
[1099] You weren't even there.
[1100] So, again, you're attacking his reality.
[1101] He's going to resist.
[1102] He's going to fight you.
[1103] All right?
[1104] So what you do is you offer him a better perception.
[1105] You say, hey, listen, I hear what you're saying.
[1106] But could it be possible that just maybe, out of those 50 women that raised their hands and he picked one, maybe she works for him?
[1107] Maybe he planted her in the audience.
[1108] She knows the trick.
[1109] She travels to every show around the country with him.
[1110] and when she gets in the box, there's a pair of mannequin legs laying on the floor of the box that are wearing the same stockings and same shoes that she has on.
[1111] She picks them up, shoves them out the hole.
[1112] When he says move your feet, she shakes those things, and then she brings her own legs up under her chest.
[1113] So her whole body is on that half of the box.
[1114] So the saw doesn't even touch her.
[1115] And obviously, when he separates the two halves, the feet are over there, now she can't move them.
[1116] So he has to distract your attention by going over here.
[1117] so you're not looking at those feet.
[1118] And he's talking to the head and she's talking back.
[1119] Of course, when he brings them back together, she pulls the dummy legs in, leaves them on the floor of the box, and she climbs out.
[1120] And then your brother says, hmm, you know, I guess that would be the only way that would work.
[1121] You've offered him a better perception.
[1122] And that perception then becomes his reality.
[1123] So don't attack somebody's reality, regardless of what it is, even if you know it to be false.
[1124] Give them a better perception and allow them to resonate with it. Because it's always better when somebody comes to the conclusion, I've been wrong.
[1125] Maybe this is something I need to think about.
[1126] Yeah, this will work.
[1127] It's a perfect example of not silencing people's ideas, but giving them better ideas.
[1128] And this is what the answer to censorship has been.
[1129] Exactly.
[1130] And, you know, so Darrell always talks about how much he listens when he starts the dialogue and doesn't even try to, you know, push ideas at the people that he's engaging with, different extremists or whatnot.
[1131] Would you agree with that statement?
[1132] Absolutely.
[1133] And let me just give you an example of that.
[1134] Okay, so I'm interviewing a clan leader, white supremacist, right?
[1135] And I ask, you know, how can you hate me?
[1136] You don't even know me. You know, all you see is this.
[1137] You come in my room five minutes ago, and you've already determined, you know, whatever you determine.
[1138] Well, Mr. Davis, you know, black people are prone to crime, and that is evidenced by the fact that there are more blacks in prison than white people.
[1139] Now, I'm just sitting here listening to this guy.
[1140] He's calling me a criminal.
[1141] And, but he's right.
[1142] He's 100% right.
[1143] and the statistics show that there are more blacks in prison than white people.
[1144] So that feeds what he already thinks he knows, the data, right?
[1145] But he does not go to find out why does that data show that?
[1146] He doesn't realize there may be an imbalance in our judicial system that sends black people to prison for longer periods of time than white people who've committed the same crime.
[1147] So I just listen to him, right?
[1148] Because when he walks in that room and he sees me, I'm the enemy, His wall goes up.
[1149] His ears are like this.
[1150] He's ready to defend whatever his stance is.
[1151] So I'm just listening.
[1152] And then he goes on to say, you know, black people are inherently lazy.
[1153] They always have their handout for a freebie.
[1154] They're always trying to scam the government welfare programs and all that kind of stuff.
[1155] So now he's called me a criminal.
[1156] Now he's calling me lazy.
[1157] And I'm just sitting here listening.
[1158] I'm not pushing back.
[1159] And then he says, and black people are born with smaller brains.
[1160] And the larger the brain, the more capacity for intelligence.
[1161] The smaller the brain, the lower the IQ.
[1162] So now I'm being called stupid.
[1163] Now, he says that this is evidenced by the fact that every year the data shows that black high school students consistently score lower on the SATs than white kids do.
[1164] Again, he's 100% correct.
[1165] That does show that.
[1166] But he doesn't realize why.
[1167] All right.
Where do most black kids in this country go to school? In the inner city.
[1169] Where do most white kids go to school? In the suburbs.
[1170] It is a fact.
[1171] Suburban schools are better funded.
[1172] They have better facilities, better teachers, et cetera.
[1173] I will guarantee you, white kids who go to school in the inner city can score just as low as those black kids, if not lower.
[1174] Black kids who go to school in the suburbs can score just as high as the white kids, if not higher.
[1175] It has absolutely nothing to do with the color of the student's skin or the size of the student's brain.
[1176] but it has everything to do with the educational system in which that child is enrolled.
[1177] But, of course, he won't go to research that because the data already supports what he already believes that I'm inferior.
[1178] So now he's called me all these things.
[1179] I've already done my research on him.
[1180] I know this guy sitting across from me just barely made it out of high school.
[1181] I have a college degree.
[1182] So do I throw that in his face?
[1183] No. But because I sat there and listened to him, that wall is coming down.
[1184] Because you cannot impart information to somebody when the wall is up.
[1185] It's like hitting a brick wall.
[1186] We want that wall to come down and then the ears open up.
[1187] So now he's exhausted all his vitriol.
[1188] And now he's wondering like, how come this black person isn't pushing up against me like most of them do?
[1189] And he's curious as to what I think about what he just said.
[1190] So now the wall is down and he feels compelled to reciprocate because I sat there and listened to him insult me. So now it's my turn.
[1191] I could go on the offense and say, no, you are the one who's a criminal.
[1192] You're the one hanging black men from trees and dragging them behind pickup trucks and bombing their churches.
[1193] And I would be 100% correct because the Klan has over a hundred-year history of doing that.
[1194] But if I did that, that wall will go right back up.
[1195] So I don't want that to happen.
[1196] I want to keep the wall down and let him hear what I'm saying.
[1197] So rather than go on the offense, I go on the defense.
[1198] And I say, listen, I hear what you're saying.
[1199] However, I don't have a criminal record.
[1200] And I'm as black as anybody you've ever seen.
[1201] So, as far as my brain size goes, I've never measured the size of my brain, but I'm sure it's the same size as anybody else's.
[1202] And as far as my SAT scores go, they got me into college.
[1203] Now, I already know that he doesn't have a college degree.
[1204] I do.
[1205] Does it make me a better person than him?
[1206] No, but it gives me a better experience, right?
[1207] So I let him know this.
[1208] He goes home and he thinks, just like we all do at the end of the day, we reflect on what we did during the day, he thinks, man, I just had a three -hour conversation with a black guy, you know, when we didn't come to blows.
[1209] And what that Darrell guy said, it makes sense.
[1210] Oh, but he's black.
[1211] But what he said was true.
[1212] But he's black.
[1213] So they're having a cognitive dissonance.
[1214] Right?
[1215] And they struggle with that for a while.
[1216] And then they have that dilemma.
[1217] I got to make up my mind, what am I going to do?
[1218] So the dilemma is, do I disregard whatever color he is and believe the truth because I know it to be true and change my ideological direction?
[1219] Or do I consider the color of his skin and continue living a lie?
[1220] In most cases, they will follow the truth.
[1221] But then there will be those who don't want to give it the power or the notoriety or whatever, and they will follow the lie.
[1222] Well, the way you're doing it is brilliant because you're doing it so patiently and contrary to the way most people handle arguments.
[1223] Most people handle arguments by trying to shut down the other person's argument and shit all over them instead of trying to what you're saying offer an alternative perspective, which is really probably the only way to get people to think about things in a different light.
And, Joe, that comes from the fact that I've done a lot
[1225] of travel.
[1226] Okay.
[1227] I've been exposed to people from all over the world.
[1228] And we all got along.
[1229] We all got along.
[1230] You told the story on the podcast the first time you were here about not even understanding racism until you were a child, because you grew up overseas.
[1231] Right.
[1232] Exactly.
[1233] And we got, so I saw that.
[1234] So I saw something that they have not seen.
[1235] Right.
[1236] And that's why I want to share that with them vicariously to let them know.
[1237] No, it's, you know, the whole, every white person in the world is not like every white person in this country.
[1238] Every black person in the world is not like every black person in this country.
[1239] You know, there are white people over in France.
[1240] Like in the 1940s and 50s, a lot of black Americans moved to France to live.
[1241] Some even gave up their U.S. citizenship because the French people were treating them as equals.
[1242] They didn't see color, you know.
[1243] And those French people were a lot more white than the white people here in this country, who might be mixed with something else.
[1244] So, you know, people need to see.
[1245] In fact, my favorite quote of all time is by Mark Twain, or otherwise known as Samuel Clemens.
[1246] It's called the travel quote.
[1247] Mark Twain said, quote, unquote, travel is fatal to prejudice, bigotry, and narrow -mindedness.
[1248] And many of our people need it sorely on these accounts.
[1249] Broad, wholesome, charitable views of men and things cannot be acquired by vegetating in one little corner of the earth all one's lifetime.
[1250] That guy was so good.
[1251] Wasn't he?
[1252] He had so many great quotes.
[1253] Exactly.
[1254] Isolation.
[1255] And so Sam Harris actually did a study that we talk about in the paper.
[1256] He did a neuroimaging study of people being exposed to political beliefs different from their own and actually looked at people's brains when they were going through this experience.
[1257] And they actually talked about this thing called the backfire effect, which is sort of what you're talking about when the wall's up.
[1258] And so they sort of detected that, interestingly.
[1259] And I forget the exact name of the study, but it's in the footnotes.
[1260] So I think the patience is that it's long-term. You're not changing someone's mind in five minutes of, you know, chattering on in comment sections, or, you know, yelling at someone at the dinner table that you barely know. Like, Darrell knows how to create long-term relationships and not be, like, thirsty for them to change their mind. It's just, like, look, we're here, we're hanging out. Whether it's a network or, you know, offline or online, doesn't really matter.
[1261] And so I think the backfire effect that Sam found and that we're sort of talking about with walls going up is very real.
[1262] And that's why it's just, it has to be long term.
[1263] You know, Darrell, I'm just thinking while I'm listening here, like these conversations that you've had with these white supremacists and neo -Nazis, how amazing would it be if that was a podcast?
[1264] It is.
[1265] No, what I'm saying.
[1266] If you sat down with those people from the beginning, from first meeting them, and see that conversation play out.
[1267] That would be very relatable.
[1268] I've got some of that.
[1269] Do you?
[1270] Where I've sat down with some of these people while they're still in, and now I'm sitting down with them now that they're out.
[1271] Some of them even come on my lecture tours with me and stand on stage with me and speak out against their former organization.
[1272] Do you have videos of these conversations?
[1273] Yeah, some of them, yeah.
[1274] God, are they online?
[1275] Some of them I think are, but if not, I can send you some.
[1276] I think those videos would be a great tool for someone that's maybe trapped, but, like, at least partially open-minded, where they have this view of things like, maybe I'm incorrect about this, maybe I need to reevaluate. Yeah, you know, like that. But as a podcast, that would be brilliant. I mean, that's a great idea, to have someone from the jump, like, walk in a KKK member and have this conversation, when they sit down with you over hours and hours and present all these articles about crime, brain size, all this shit, and have you just tell them your perspective and see the wheels start turning.
[1277] Because I think sometimes a lot of these people, they're only interacting with people that think like them.
[1278] Right, exactly.
[1279] Now, I'll give you a crazy -ass example of something, unbelievable, right?
[1280] So this exalted cyclops, which means a district leader in the Klan.
[1281] Okay.
[1282] Yeah.
[1283] Okay.
[1284] So he's in my car with me, right?
[1285] Dragons, wizards, exalted cyclops?
[1286] That's hilarious.
[1287] He's in the passenger seat.
[1288] I'm driving.
[1289] And we got on the topic of crime and stuff.
[1290] And he was talking about, you know, black on black crime and how violent we were and all that kind of stuff.
[1291] And he said, you know, black people have a gene within them that makes them violent.
[1292] Now, I'm driving.
[1293] He's over here.
[1294] And I said, you know, what are you talking about?
[1295] And he says, well, look at all the carjackings.
and drive-bys in Southeast.
[1297] He was referring to Southeast Washington, D.C., which is predominantly black.
[1298] There's some whites that live there, but it's predominantly black, very high crime -ridden.
[1299] I said, okay.
[1300] I said, but, you know, you're not considering the demographics.
[1301] That's what lives there.
[1302] I said, what about all the crime in Bangor, Maine?
[1303] White people, because that's what lives there, right?
[1304] I said, you know, he goes, no, no, no, that has nothing to do with it.
[1305] You know, you all were born with that gene.
And I said, look at me. I said, I'm as black as anybody you know.
[1307] I have never committed a drive-by
[1308] or a carjacking.
[1309] How do you explain that?
[1310] This man didn't even think about it.
[1311] He didn't hesitate one second.
[1312] He goes, your gene is latent, hasn't come out yet.
[1313] It almost came out then, but, you know, but, I mean, he had an answer for everything.
[1314] And I was, you know, stupefied.
[1315] Like, he's over here all smug.
[1316] You know, you got nothing to say.
[1317] And so I thought about it.
[1318] Well, if I gave him some, you know, PhD knowledge or whatever, it wouldn't faze him.
[1319] So I had to go where he was.
[1320] I said, well, you know, we all know that every white person has a gene in them that could make them a serial killer.
[1321] And he says, how do you figure?
[1322] I said, well, name me three black serial killers.
[1323] He couldn't do it.
[1324] I said, I'm going to name you one.
[1325] I named one for him.
[1326] I said, here's one.
[1327] Just give me two.
[1328] He couldn't do it.
[1329] I said, Charles Manson, Jeffrey Dahmer, Henry Lee Lucas, John Wayne Gacy, Ted Bundy, Albert DeSalvo, the Boston Strangler, David Berkowitz, Son of Sam, on and on.
[1330] I said, they're all white.
[1331] I said, son, you are a serial killer.
[1332] He goes, he goes, Daryl, I never killed anybody.
I said, your gene is latent.
[1334] It hasn't come out yet.
[1335] He goes, well, that's stupid.
[1336] And I said, well, duh.
[1337] I said, yeah, it is stupid for me to say that about you.
[1338] But it's no more stupid for me to say that about you than what you said about me. And he got very, very quiet.
[1339] But you could see his wheels were going, and then he changed the subject.
[1340] And within five months, he quit the Klan, and his robe was the first robe I got.
[1341] Yeah, I remember that stupid conversation.
[1342] I remember you relaying that conversation on the podcast.
[1343] Yeah.
[1344] I had a conversation on a podcast many years ago where a guy actually did bring up that gene thing with black people.
[1345] Oh, it's common.
[1346] Yeah.
[1347] And he said it, and I didn't know the guy before I had him on.
[1348] And while I was having him on, I was realizing, like, a lot of the shit that this guy's saying is, like, I probably shouldn't have had him on.
[1349] No, you should have.
[1350] But yeah, but back in those days, like, I would have people on.
[1351] I would just read something.
[1352] I'd say, well, this is probably a conversation that's controversial.
[1353] I'll talk to this guy.
[1354] But some of the things he was saying, that was one of them, was that black people had this gene for violence.
[1355] And I go, well, how the fuck do you explain war?
[1356] I'm like, my take was, like, most wars are started by white people.
[1357] Like, if you looked at the amount of war that goes on in the world worldwide, like how much of it is instigated and initiated by white people?
[1358] and is there a thing more violent than war?
[1359] Nothing.
[1360] It's like literally you're telling people that don't even know people, that it's their obligation to kill someone based on what land they're from or what part of the world.
[1361] That's the most violent shit we know and it's all by white people.
[1362] Black on black crime is a myth.
[1363] No such thing.
[1364] It's a crime of proximity.
[1365] Okay?
[1366] Because they need something immediately.
[1367] They're not going to go all the way across town to the white neighborhood and attack some white guys.
[1368] Somebody right here might have it. Go into his house, break in, take his stuff, beat him up, whatever, all right?
[1369] So we hear about black -on -black crime.
[1370] So do we call Russia invading Ukraine and killing all these people white-on-white crime?
[1371] That's exactly what it is.
[1372] Yes.
[1373] Yeah, I mean, and some people are actually using that as an argument for how racist the way we look at war is.
[1374] Because during the time where all this is happening in Ukraine, how many people are bombed in Yemen?
[1375] How many people are bombed in all sorts of parts of the world where there are these military actions that we're ignoring?
[1376] There's actually, like, a chart that someone put up, it's like a graphic that shows the bombings and the people that died in Ukraine versus the people that are dying right now simultaneously due to U.S. drone strikes and all sorts of other shit that's happening all over the world at the same time.
[1377] It's like we're concentrating on this one thing and it's in the news and that's part of the reason why people are concentrating on it so much.
[1378] I learned a long time ago when I was living overseas.
[1379] If you want to learn about your own country, read a foreign newspaper.
[1380] Yeah.
[1381] Like the Herald Tribune, the French paper. They tell their perspective on what's going on in the U.S., because we don't tell our own people.
[1382] Just the same way, Russians don't tell their own people everything.
[1383] I'm interested, you know, that you had that feeling that, you know, maybe you shouldn't have had that person on.
[1384] This is early in the podcast.
[1385] I know, no, I know.
But I'm just, I'm saying that I think that,
[1387] I don't know who you're talking about, but I'm sure
[1388] that was a productive conversation in certain ways.
[1389] And I feel like there's this chilling effect that is happening where we're afraid to have a conversation, like, with a murderer. Or maybe not a murderer, but that's kind of the funny thing.
[1390] Like you could interview probably a serial killer on this show and that would be fascinating.
[1391] And no one would be like, oh, dude, Joe's like going to become a serial killer.
[1392] He just had a serial killer on his show.
[1393] And like people are obsessed with, true crime and, you know, obsessed with interviews with some of the worst humans that have ever existed.
[1394] And those are considered to be extremely valuable interviews.
[1395] And I think that you should, I hope that you, you know, own your ability to do that in a way where people aren't assuming that you think or you endorse the views of people that you're talking to.
[1396] That is a sickness.
[1397] This is an argument that's always going to take place where you're platforming those people.
[1398] This is the dialogue that the left likes to use today, that you're platforming these people.
[1399] And it's the dumbest...
[1400] That's what I hear if we're sitting down with those people.
[1401] It's so dumb.
[1402] It's such a dumb argument.
[1403] I mean, especially in your case, like, look at the results.
[1404] What other human being has a documented result of literally hundreds of KKK and neo-Nazi people abandoning their ideology because they've had a conversation with you, and literally had a change of heart, an actual change of heart?
[1405] Yeah, no, no journalist whining about, you know, intense content on the internet has ever de-radicalized anybody.
[1406] They have no track record.
[1407] They have no data.
[1408] So it's just all emotional.
[1409] In fact, it polarizes some people that disagree with them.
[1410] Yeah.
[1411] And especially when those people get banned.
[1412] If they get banned from the social media platforms for having different perspectives or different views.
[1413] Well, for instance, sorry, Vice did a piece.
[1413] about us, and they said Minds has no idea what to do with all the neo-Nazis. And, like, I talked to these reporters for hours and explained to them what we were working on with Darrell, and we were sort of in the beginning phases of writing this paper, and they so disingenuously characterized what we were trying to do. It's a lot of bad faith conversations over there. It's toxic. And, you know, I'm just hoping, honestly, no offense to them, I feel like they're in their world, hopefully, you know, we can all get on the same page somehow about what's actually going on here.
[1415] Like, I'm not trying to have, you know, combative tone with any of these media outlets or with big tech even.
[1416] Like, I don't want to polarize it between like alternative tech and big tech.
[1417] It's like we need tech to adopt certain principles that have to do with digital rights and freedom.
[1418] That's just a reality.
[1419] It has to happen.
[1420] And the ones that do... What would be smarter, whether it's Google, Facebook, Twitter, whoever, than to actually start doing some of this stuff and start to be more transparent?
[1421] I think the amount of moderation that they would require would be extraordinary.
[1422] You can achieve it with community-centric moderation.
[1423] Pay the users to help.
[1424] Yeah, but they're not going to do that.
[1425] They will.
[1426] They are.
[1427] They do it.
[1428] You think so?
[1429] Who's going to do that?
[1430] Facebook?
[1431] Anyone, yeah, Twitter actually rolled out.
[1432] They're going to pay users to moderate.
[1433] They should.
[1434] They don't.
[1435] You're not.
[1436] You're saying they're going to.
[1437] No, no. I'm saying they're not.
[1438] Well, I'm saying that Twitter rolled out a product called Birdwatch, which was, and I don't know if it's still going on, but this was like last year, a community-centric moderation tool to get the community involved.
[1439] I'm not, so let's separate payments from actually getting the community involved in the moderation.
[1440] So community is already heavily involved in the moderation.
[1441] They're doing the reporting, they're flagging stuff, and then it's getting escalated through.
[1442] Yeah, but they flag things that aren't even really offensive.
[1443] They do it to fuck with people.
[1444] Right.
[1445] And so you have to be careful of that.
[1446] But that's why juries are, I think that juries are a big part of the future of moderation on social media.
[1447] And Daryl, you were about to say something.
[1448] So, you know, a lot of hypocrisy, you know, about who to put on a platform, who not to put it on a platform.
[1449] I do a lot of speaking to a lot of colleges across the country, universities.
[1450] And I would say two or three times a year, you know, some student activities board.
[1451] or student council has booked me. And then two weeks before the event, the administration will shut it down.
[1452] Oh, no, no, we can't have him on campus.
[1453] He's too controversial.
[1454] Yeah.
[1455] You know, stir stuff up.
[1456] Which is not true, you know.
[1457] At all.
[1458] You know, they don't want to deal with it.
[1459] And this is unfortunate because they are an institution of higher learning.
[1460] While on the campus, perhaps everybody is being treated equally.
[1461] Gay people, LGBTQ, black, white, Muslim, Jewish, whatever, in the confines of the campus.
[1462] But the objective of higher education is to teach people how to navigate society beyond the campus and be a productive citizen, right?
[1463] So you've got to let people learn that, hey, you're a woman.
[1464] Here you're treated equally.
[1465] But when you graduate and you go out there and work in the real world, you might be sexually harassed by your boss.
[1466] You might not get paid as much as your male counterpart who knows less than you or whatever.
[1467] Or you may not get the job because you're black or because you're whatever.
[1468] This is, you know, in addition to the academic education, they need this empirical education.
[1469] And those institutions that are shutting me down are not providing it.
[1470] But what I was going to say also was, today, speaking of cancel culture, you've got people banning books and banning history classes under the guise of CRT, critical race theory, things like that.
[1471] You've seen the pictures of a black girl walking in towards a white school building for the first time.
[1472] people behind her yelling at her and all that kind of stuff, or the four black guys sitting at the Woolworth lunch counter in Greensboro, and people pouring stuff over their heads.
[1473] 1960s.
[1474] All right.
[1475] Those white people that did this made history back then, and now it's those same people that are saying, we don't want that taught in the schools.
[1476] So make up your mind.
[1477] You know, that is history.
[1478] It's part of American history, whether it's good, bad, ugly, or shameful.
[1479] all those cards need to be turned face up, and it'll be transparent, and then we address them, and then we move on together.
[1480] Okay, but history is history.
[1481] So don't create history and then tell me you don't want that history being taught that you created that you were so proud of.
[1482] You know, I'm going to stand in the doorway and not let these black kids come in.
[1483] I think it's about, is there a neutral lens that we can look at those events through? And I think that some of the criticism of CRT is that it's not approaching those events from a neutral lens. It's not, and it's not that, you know... It's not neutral. Police dogs attacking peaceful black marchers at the courthouse to register to vote, there's nothing neutral about it. But I think that there's definitely some ideology that is attached to critical race theory that is rooted in critical theory, which is, you know, left-leaning. There are multiple definitions of critical race theory, and nobody has really explained it satisfactorily.
[1484] So people who are against it will explain it this way.
[1485] You're trying to victimize white people as the oppressors and victimized black people as the oppressed and that's how you are and you will never change.
[1486] You know, that's how the people who are opposed to it define it.
[1487] But, you know, that's not necessarily how some of the people who participated in the creation of it, like Kimberlé Crenshaw, and I can't speak to all of them, define it.
[1488] You know, so it needs to be, all history needs to be taught.
[1489] Absolutely.
[1490] You know, and through the lens of what happened and then move forward.
[1491] But you can't create history and say, you know, we don't want to talk about it until 50 years later.
[1492] Like when I was in high school, I'll be 64 this month.
[1493] When I was in high school, we did not learn.
[1494] And I went to high school in Montgomery County, Maryland, which has one of the top school districts in the whole country.
[1495] Montgomery County, Maryland and Fairfax County, Virginia.
[1496] We tie neck and neck each year.
[1497] Anyway, we were not taught that we had Japanese internment camps in this country.
[1498] I did not learn that until I was in college.
[1499] I'm like, what?
[1500] Are you kidding me?
[1501] No way.
[1502] I asked my parents.
[1503] They said, yeah.
[1504] I could not believe I didn't learn that in high school.
[1505] Now, I knew about the Tulsa race riots 30 years ago.
[1506] People today are just now learning about that.
[1507] So, you know, that's what I'm saying.
[1508] We need to educate.
[1509] Education exposure is the key to advancement.
Well, what we need to do is your take on the way you've had these conversations
[1511] with these KKK people and these neo-Nazi people. That has to be across the board with everything.
[1512] Let a person explain their position, and then you come up with either a better argument or you agree with part of what they're saying, or the only way is to not silence them, to let them talk.
[1513] So if people are against critical race theory for any particular reason, they should listen to the entire argument of what critical race theory entails, at least from that person's perspective, and then say, this is what I agree with, this is what I don't agree with, and have a conversation that's rational, where they're not having ad hominems, they're not attacking the human, they're not attacking the person with insults. They're just talking about what is correct and incorrect about everything from economics to health care to everything.
[1514] These kinds of conversations are how people find out what's right and what's wrong, and how people find out what resonates
[1515] with them.
[1516] And as soon as you shut people down, those conversations stop.
[1517] And then these people go off into their own corner of the world where they are accepted and they get in an echo chamber and they just reinforce whatever stupid idea they had in the first place.
[1518] Yeah, what you were saying about watching people change their minds, like interviews, like that is so powerful.
[1519] And we're actually launching this Change Minds sort of challenge, where we're going to be trying, like, as a campaign on the site, to have people make videos and tell stories of, like, a meaningful time that they changed their mind, because everybody should be doing that more. Like, what's a recent time you've changed your mind about something sort of meaningful? Oh, I don't know. It happens all the time, right? It happens all the time. It's not only a woman's prerogative to change her mind. I change my mind all the time. I'll change my mind in the middle of a conversation. I'll go, wait a minute, I don't think so. Let me change. I'm going to change my mind right now. I do that. But can you think of something, like, in your life, from when you were younger, that you were really locked into? Like, what's just a big one? Oh, I don't know, man. I've had so many of them. This conversation would take 15 minutes for me to sit down and think about it. All right, Darrell, you got one? Yeah, I can give you one. Okay, so as a kid, um, I learned that a tiger does not change his stripes, a leopard does not change his spots, right? Okay. And so when I first went in to interview white supremacists and KKK people or whatever, I was not going there to convert them.
[1520] Never, okay?
[1521] All I want to know is how can you hate me?
[1522] You don't even know me. That's all I want to know.
[1523] And then I'm out of here.
[1524] I never see you again.
[1525] Right.
[1526] Because if a leopard cannot change its spots and a tiger cannot change his stripes, why would I think that a Klansman could change his robe and hood?
[1527] It's who he is, right?
[1528] But I changed my mind because those conversations did change that person.
[1529] And you're right, a leopard cannot change its spots.
[1530] And a tiger cannot change its stripes because those two animals were born with those spots and stripes.
[1531] That Klansman or Klanswoman was not born with that robe and hood.
[1532] That was a learned thing.
[1533] And what can be learned can be unlearned.
[1534] So that's why I changed my mind and why I continue to do this today to sit down with those people.
[1535] And the only way that works is with open dialogue.
[1536] Exactly.
[1537] I mean, it's funny that you answered it like that, because for you, it's just second nature to constantly be changing your mind.
[1538] I have a philosophy about that.
[1539] I don't think you should ever be your ideas.
[1540] You should never be connected to your ideas.
[1541] Your ideas should be something that's independent of you that you either think this is a good one or this is a bad one.
[1542] But if someone comes along and says that's a bad one, you shouldn't be defensive.
[1543] You shouldn't, like, hold on to it and cling to it.
[1544] Maybe, like, try to defend it because you think it's correct.
[1545] Like, oh, I thought that was right.
[1546] But then once it isn't, there's some people that, for whatever reason, never want to admit they're wrong.
[1547] Because they think that being wrong makes them less.
[1548] Yeah, to play devil's advocate with ourselves, I mean, I'm not even ideological about our model.
[1549] I actually think that I'm open to seeing, you know, over the course of ten years... Like, let's actually come back in a few years and look at the data and the information that we've gathered about the rate of deradicalization and what really works.
[1550] What is the most balanced moderation policy for a social network? The First Amendment, I think, is a great starting point, and obviously there are edge cases: spam, weird stuff... There's doxing. Yeah, doxing. That stuff we don't deal with, and that's not covered in the First Amendment, right? But I think that we're flexible. You know, this isn't a dogmatic piece of policy.
[1551] But we need to A/B test it at least.
[1552] I mean, for God's sake, big tech is just hemorrhaging censorship.
[1553] Millions of people are getting banned today.
[1554] And we don't have something to test it against.
[1555] Like, where's the major network with a free speech policy, a responsible free speech policy?
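To make the A/B idea concrete, here's a minimal sketch of how a network could test two moderation policies against each other, assuming a hypothetical setup where each user is deterministically assigned to one policy arm by hashing their user ID. The arm names and the assignment scheme are illustrative, not anything Minds has published.

```python
import hashlib

# Hypothetical policy arms: a conventional-moderation baseline versus
# a First Amendment-style policy with narrow carve-outs (spam, doxing).
POLICY_ARMS = ["baseline_moderation", "free_speech_with_carveouts"]

def assign_policy_arm(user_id: str) -> str:
    """Deterministically assign a user to a policy arm.

    Hashing the user ID (rather than randomizing per session) keeps each
    user in the same arm for the life of the experiment, which is what
    lets you compare long-run outcomes -- deradicalization rates, ban
    counts, retention -- between the two policies.
    """
    digest = hashlib.sha256(user_id.encode("utf-8")).digest()
    return POLICY_ARMS[digest[0] % len(POLICY_ARMS)]

if __name__ == "__main__":
    for uid in ("alice", "bob", "carol"):
        print(uid, "->", assign_policy_arm(uid))
```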
[1556] Let me ask you this.
[1557] What do you guys do?
[1558] about bad actors, like troll farms, like Russian troll farms, that kind of thing.
[1559] Yeah, I mean, we do detect different types of spam and trolling. Harassment is not okay.
[1560] Harassment, you know, you'll get banned.
[1561] And harassment is legally not allowed.
[1562] But there are all different types of spam.
[1563] And, you know, misinformation and whatnot, I think that...
[1564] Yeah, but I'm talking about bad actors.
[1565] I'm talking about...
[1566] Like, when you have these Russian troll farms, these are people that are hired to disseminate propaganda.
[1567] They're hired to muddy the waters of conversations by having fake arguments or bad faith arguments.
[1568] Mercenaries.
[1569] Yeah, they are.
[1570] I mean, there's...
[1571] You've seen those, right?
[1572] And so, yeah, so we have the distinction between misinformation and disinformation.
[1573] The difference is that disinformation is intentional...
[1574] Yes.
[1575] manipulation. You know, I think that it really depends on the context of the specific posts that we're talking about. So I don't want to make a generalization about some troll farm. You know, there are troll farms in the U.S. that are doing all kinds of inauthentic content engineering for different political purposes.
Doesn't matter what country.
Doesn't matter. So what do you do about it?
So what I'm saying is, you look at it on a case-by-case basis and evaluate, you know, is it breaking the law? Because at the end of the day, information is information.
[1576] If someone is trying to put something out... Everything is propaganda.
[1577] Propaganda is coming from, you know, every single angle.
[1578] So it depends on the specific nature of the content and the troll that you're talking about.
[1579] I don't think you can have a blanket solution, programming an AI to say, hey, every time you detect X, Y, and Z, just ban them.
[1580] Wouldn't an alternative be that everyone has to have a user ID, like a driver's license, to register or something?
[1581] So you have one account, because you are Bill Ottman.
[1582] Well, I think that's where the decentralized reputation is starting to come into play.
[1583] And there's this project Verite that's coming out.
[1584] There's the DID spec, which is starting to build this like interoperable identity that you carry between social networks.
[1585] And then, so basically, you're bringing your credibility, your identity, whatever you want to share, whether it's, you know, art or content.
[1586] It's all tied to you, and you're sort of moving around freely in a sovereign situation.
[1587] I think that's where we want to go long term, so that you're not locked in.
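For context on the DID spec mentioned above: a DID document is a small, portable record that binds an identifier you control to public keys and service endpoints, which is what lets identity and reputation travel between networks. Here's a minimal sketch in Python using the W3C DID Core data model; the did:example method, the key value, and the endpoint URL are all placeholders, not a real Minds or Verite integration.

```python
import json

# "did:example" is the placeholder method from the W3C spec itself;
# the key material and service endpoint below are hypothetical.
did = "did:example:123456789abcdefghi"

did_document = {
    "@context": "https://www.w3.org/ns/did/v1",
    "id": did,
    # Public key the controller can prove ownership of.
    "verificationMethod": [{
        "id": f"{did}#keys-1",
        "type": "Ed25519VerificationKey2020",
        "controller": did,
        "publicKeyMultibase": "zH3C2AVvLMv6gmMNam3uVAjZpfkcJCwDwnZn6z3wXmqPV",
    }],
    # Which key proves "this is me" when signing in anywhere.
    "authentication": [f"{did}#keys-1"],
    # Portable pointers to profiles or content on different networks.
    "service": [{
        "id": f"{did}#social",
        "type": "LinkedDomains",
        "serviceEndpoint": "https://example.social/profile/alice",
    }],
}

print(json.dumps(did_document, indent=2))
```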
[1588] As technology evolves, so should ideology.
[1589] Yes.
[1590] Yeah, ideology should also evolve... like, it needs technology.
[1591] Because your ideology should be tested.
[1592] And the best way to test your ideology is to have it encounter other ideologies and see if it stands up to scrutiny.
[1593] Exactly.
[1594] And the thing is, when people don't want that, and they want people censored, what you're saying is your ideas won't hold up. Because if we could all have debates in real time, good ideas versus bad ideas, and everyone gets a chance to watch, it's going to be messy.
[1595] But at the end, you're going to at least know where you stand on something, because you've had both arguments played out in front of you, whether it's left versus right or whatever it is when you're talking about ideologies.
[1596] You got to watch these people have these conversations.
[1597] And if you can do that, you can kind of agree with one or disagree with the other and find out where you fit in this.
[1598] Or take something good from this person, something good from that person, and put them together.
[1599] Yes.
[1600] I think the focus on long form is key.
[1601] And that's why, you know, we do support video.
[1602] It's not necessarily...
[1603] Do you host video?
[1604] Yeah, we do, yeah.
[1605] So someone can do, like, an hour -long video and upload it.
[1606] Your bandwidth cost must be extraordinary.
[1607] Oh, God, it's bad, yeah.
[1608] But there are distributed systems, like IPFS that I mentioned, and Arweave, and some of these systems where it's decentralized.
[1609] And, you know, you don't have to pay for all of the storage.
[1610] But the bandwidth is still an issue.
[1611] And, you know, it's a spectrum with the decentralized stuff.
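As a concrete sketch of how hosting on IPFS works: assuming you're running a local Kubo (go-ipfs) node, whose RPC API listens on port 5001 by default, you can add a video file and get back a content ID (CID) that any node or public gateway holding the data can serve. The file name is hypothetical; this illustrates the offloaded-storage idea, though, as noted above, someone still pays for bandwidth.

```python
import requests

# Assumes a local Kubo (go-ipfs) node with its RPC API on the default
# port. The /api/v0/add endpoint stores a file and returns its CID.
IPFS_API = "http://127.0.0.1:5001/api/v0/add"

def add_to_ipfs(path: str) -> str:
    with open(path, "rb") as f:
        resp = requests.post(IPFS_API, files={"file": f})
    resp.raise_for_status()
    return resp.json()["Hash"]  # the content ID (CID)

if __name__ == "__main__":
    cid = add_to_ipfs("episode.mp4")  # hypothetical file
    # Content-addressed: the same bytes always hash to the same CID,
    # so any gateway or peer that has the data can serve this URL.
    print(f"https://ipfs.io/ipfs/{cid}")
```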
[1612] But yeah. So, dude, I have this cool thing.
Darrell is out of time. Darrell's out of time. I've just got to wrap up.
So this is called an OpenDime Bitcoin wallet. This has a Bitcoin on it, a full Bitcoin on it, okay? And it has this hole that you can puncture. This is basically the cash equivalent; it's a bearer instrument for Bitcoin. So I can hand this to you.
Okay.
I'm not giving it to you, but just so we can see it up there.
So it works like a USB drive?
[1613] So yeah, you plug it into your computer and you can send Bitcoin to it. And then you've got to puncture that hole, and that's what unlocks the private key. So I can give it to you, and you cannot access the Bitcoin on this until that hole is punctured. Then you plug it back in, and you can actually take control of the Bitcoin.
What does that have to do with censorship?
[1614] So, well, this is the ultimate censorship-resistant crowdfunding mechanism. This is totally uncensorable money that anyone could send.
Crypto, right? But we're talking about discussions, conversations.
Well, yeah, and the reason I'm bringing it up is because we are putting a full Bitcoin towards, you know, our work with Darrell. We're going to have this basically sit, we're going to watch it over the years, and we're going to use the funds. The address for this wallet is published on minds.com/change. And so what we want to do... You know, you see all the censorship, the financial censorship happening, which is correlated to censorship of speech.
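Because the wallet's address is public, anyone can audit how much Bitcoin has accumulated on it without trusting anybody. A minimal sketch using Blockstream's public Esplora block explorer API; the address below is a placeholder, not the real one published at minds.com/change.

```python
import requests

# Placeholder -- the real address is published at minds.com/change.
ADDRESS = "bc1q-placeholder"  # hypothetical

def btc_balance(address: str) -> float:
    """Return the confirmed balance of an address in BTC.

    Blockstream's Esplora API reports funded and spent output totals
    in satoshis (1 BTC = 100,000,000 sats).
    """
    resp = requests.get(f"https://blockstream.info/api/address/{address}")
    resp.raise_for_status()
    stats = resp.json()["chain_stats"]
    sats = stats["funded_txo_sum"] - stats["spent_txo_sum"]
    return sats / 100_000_000

if __name__ == "__main__":
    print(f"{btc_balance(ADDRESS):.8f} BTC")
```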
[1615] Google has now suspended monetization on YouTube for all users in Russia.
[1616] Applies to other services as well.
[1617] That's a whole creator industry up in smoke.
[1618] No way these guys can make up that revenue on Victorica.
[1619] It's unbelievable.
[1620] Yeah.
[1621] So I just wanted to bring this up because we're going to be doing more events.
[1623] We believe in offline events, too.
[1624] It's not only an online social network.
[1625] And so, if anyone is interested in supporting the conversations, the long-form conversations we're having with Darrell, please, you know, send Bitcoin to this address.
[1626] And we're going to put it towards that.
[1627] Thank you.
[1628] Thank you, guys, for coming here.
[1629] And thank you, Daryl, for all of your time and effort that you put into this.
[1630] It's extraordinary.
[1631] I mean, your patience is unbelievable.
[1632] So is yours, my friend, all the stuff you got to put up with.
[1633] Yeah, well, man, that's what I do, though.
[1634] I guess that's what you do as well.
[1635] And Bill, thanks for what you're doing with Minds.
[1636] Let's figure out what fuckery was going on and fix that on your account, yeah.
[1637] Let's figure out.
[1638] Okay.
[1639] All right.
[1640] Thank you, everybody.
[1641] Oh, tell everybody one more time the Minds address slash change.
[1642] Yeah, minds.com/change.
[1643] You can also get the app at minds.com/mobile, or find me at @ottman.
[1644] And Darrell, what social media do you use, other than Minds?
[1646] I use daryldavis.com.
[1647] Daryl at daryldavis.com.
[1648] I'm on Twitter, Instagram, but I have somebody handling that for me. But mostly I use Minds now.
[1649] Yeah.
[1650] Okay.
[1651] I use minds.com.
[1652] Also FAIR, the Foundation Against Intolerance and Racism.
[1653] Okay.
[1654] Fairforall.com.
[1655] Beautiful.
[1656] Well, let's do this again in the future when we have more time.
[1657] Thank you, Joe.
[1658] All right.
[1659] Thanks, guys.