The Joe Rogan Experience XX
[0] five, four, three, two, one.
[1] Legit.
[2] Hello, Bill.
[3] Hey, man. What's going on?
[4] Here.
[5] You are here.
[6] Yes.
[7] With a book, you got a book of shit.
[8] I got a book.
[9] You come prepared.
[10] I mean, yeah, I'm trying to write.
[11] I'm trying to get back into handwriting.
[12] For people who don't know, Bill is the CEO and co-founder of minds.com, and we've been going back and forth through email, and you got hoaxed by some dude who said he was Joey Diaz.
[13] It did happen.
[14] He really believed.
[15] You're like, Joey's been on my network.
[16] Like, he was messaging me in Joey's voice. Yeah, like basically cloning it. There's weird people out there. Yeah, well, that's not hard to do. You know, watch enough Joey, basically just cloning his tweets. Yeah. Every Monday morning or so there's a tweet about someone needs to suck your dick. They need to suck your dick, you need to let them know. That's on the regular. Um, what's the notes, man? Just some rambling from this morning. Yeah, important stuff. It's actually the first thing that I've written in this notebook. I've not been doing handwriting much at all in the last years, probably mostly digital, which is not good, because I actually majored in English. Yeah, you definitely lose your ability to write words. It's funny, I tried writing in... for whatever reason, I write mostly in all caps, you know, because I mostly just write notes. But I tried writing with, like, lowercase letters, and then I tried writing in cursive.
[17] And my cursive is like, it's almost like I have to relearn it.
[18] Yeah, I was finding just like trailing off at the end of certain words.
[19] But I blend it all together.
[20] You what?
[21] I blend it all together with capital and lowercase.
[22] Oh, why do you do that?
[23] I mean, well, just as a normal person would, with proper grammar.
[24] I thought you were just mixing them up randomly.
[25] No, no. Though I did write my college thesis in all lowercase.
[26] Why?
[27] Typed.
[28] We protesting? Yeah, kind of. It was stupid. It's like a cool move, right? I'm not going to use any uppercase, who cares, man. There's weird, like, postmodern theory about capitalization, and that's kind of what I was talking about. I got a little bit indoctrinated at UVM to be really... Yeah, yeah, they're like... This one class, it was called Critical Theory. Which one, what is UVM? Vermont. Yeah. Oh, Vermont is, like, yeah, super social justicey, right? And paved with good intentions.
[29] Yeah, yeah, yeah.
[30] They have great ice cream up there, too.
[31] Mm-hmm.
[32] Nice folks.
[33] But, like, this one class was called Critical Theory, and we had to watch Buffy the Vampire Slayer and apply, like, Marxist theory to it to show how, like, it's the rise-up of the lower class.
[34] It's like you're forced to write these papers in, like, a certain way.
[35] Yeah?
[36] Yeah.
[37] What are they trying to prove?
[38] Class division.
[39] Class division.
[40] In Buffy the Vampire Slayer? It's there. There's, like, books and books written about Buffy the Vampire Slayer and Marxism. Yeah, come on. Not kidding. Really?
[41] Yep. What do they have to say?
[42] I don't remember. I don't want to go into it. It's amazing. I mean, I'm sure you're aware of James Lindsay and Peter Boghossian. And what is it, the other woman's name?
[43] Helen... shit, I didn't meet her, unfortunately. The Grievance Studies hoaxes: they submitted a bunch of fake studies to these journals and not only got reviewed but got lauded and praised for their academic scholarship.
[44] Yeah, I think that stemmed from this guy, Sokal, who first trolled a lot of the postmodern journals, and he was the first.
[45] And so it's called a Sokal hoax to do that kind of trolling.
[46] It's hard to figure out who's who, right?
[47] It's hard to figure out what's the hoax.
[48] I got tricked by that.
[49] There's this one thing called the Postmodernism Generator online.
[51] It's a computer that writes articles that puts all of this fancy language together.
[52] And I, someone sent it to me and I showed it to my teacher and I was like, oh, this is saying something pretty interesting.
[53] But it was, it was nothing.
[54] Yeah.
[55] What do you, you know, what do you make of all this hoaxing?
[56] You've been hoaxed twice then that you've just admitted in the first minute of the show.
[57] I mean, I think that you sort of have to have the right to...
[58] To hoax?
[59] Yeah.
[60] To be wrong.
[61] Mm-hmm.
[62] To mess up.
[63] Well, that's not messing up.
[64] That's deceiving people.
[65] True.
[66] But you kind of have the right to do that, too.
[67] You kind of have the right to troll.
[68] Yeah, you don't have the right to impersonate, but I have the right to get trolled and be wrong.
[69] Well, you don't have the right to impersonate it.
[70] Like, you can't have a verified account and pretend you're Joey Diaz.
[71] Oh, no, of course not.
[72] But harmless trolling like that guy did to you?
[73] Isn't that kind of like part of freedom?
[74] Yeah.
[75] Yeah, like taking down disinfo from social networks because it's wrong, that doesn't make sense.
[76] It also depends on the intent.
[77] I mean, like if someone is purposely... okay, let's say that you find some Chinese bot that's purposely disseminating incorrect and negative information about maybe a potential presidential candidate.
[78] Let's pick one.
[79] Tulsi Gabbard.
[80] They're disseminating fake news about her.
[81] You know for sure that it's fake.
[82] You know for sure who the source of it is.
[83] I don't know how you know, but you know for sure who the source of it is.
[84] You don't think that should be taken down?
[85] I think that if it's illegal, it should be taken down.
[86] If it's illegal.
[87] Yeah.
[88] What if it's just lies?
[89] Then that could be illegal.
[90] I'm not a lawyer.
[91] So I'm trying to position the network, or just, like, advocate for other networks, to take more of a neutral stance.
[92] There's this cool thing called the Manila Principles, which the Electronic Frontier Foundation wrote with a bunch of other internet freedom groups, which is talking about how digital intermediaries shouldn't be making these subjective decisions about what's getting taken down and should require a court order.
[93] Now, with a DNS provider or, you know, something less content-focused...
[94] Explain DNS to people who don't know what you're talking about.
[95] Like a domain name, yeah.
[96] So explain what that is. It's like where you buy a domain, you know. But for a social network, we're hosting tons of content, so it's harder for us, because we see illegal content, and we should proactively take some of it down if we know it's illegal, right? But at the same time, you know, it's just slippery. Well, I follow quite a few people who have, uh, I want to say they have hoax accounts, but they have parody accounts, and, you know, a lot of people read into it wrong and think they're being honest and argue with them.
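For anyone who wants the slightly more technical version: DNS is the system that translates a human-readable domain name (the thing you buy from a registrar) into the IP address your computer actually connects to. A minimal Python sketch of a lookup, using the standard library and resolving "localhost" so it works without network access; a real domain goes through the same call:

```python
import socket

# DNS resolution: ask the system resolver to map a
# human-readable name to an IP address.
# "localhost" resolves locally; a registered domain
# would be looked up through the DNS system the same way.
ip = socket.gethostbyname("localhost")
print(ip)  # typically "127.0.0.1"
```

The point of the distinction in the conversation: a DNS provider only maps names to addresses, while a social network hosts the content itself, which is why the moderation questions land differently on each.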
[97] Like, there's this progressive dad guy on Instagram.
[98] You ever follow him?
[99] I know exactly what you're talking about.
[100] He's hilarious.
[101] The one... I follow it too.
[102] That's pretty funny.
[103] It's like, I follow quite a few of them.
[104] There was a guy, the wrong-skin guy, who was saying he was born in the wrong color skin, that he's transracial, and he would make these ridiculous arguments about it.
[105] And Elfwick, what was his name?
[106] Apparently, he's a comic from the...
[107] I was just thinking, this sort of started a couple...
[108] I don't know the moment they had to do it, but let's say four years ago, a lot of those troll accounts had to sort of say, we're not this person.
[110] For instance, the main one I follow is now known as Not Bill Walton.
[111] He tweets funny sports jokes all night long, like what's on TV, as though he's Bill Walton, like in Bill Walton's voice.
[112] But, like, at some point he had to put in the name, like, this is not Bill Walton. But he still has quite a few followers and he still gets the jokes out.
[113] But that's funny.
[114] Well, what's your stance on people who make accounts of your stuff and put it out there? Well, I have a lot of them, you know. Um, there's a ton of them. Some of them actually do good stuff, like they make little clips and they put those clips up, and it's actually good. It's good for people who enjoy the show; they get a little one-minute snippet of things. And then some of them just will pretend to be me and contact people and try to book them on the show, which is really weird. Oh yeah, we've had that. Um, but, you know, I mean, who is that? Is that a 16-year-old kid in Indiana?
[115] I mean, who is that?
[116] It's, uh, it's odd.
[117] But overall, like, getting past my own personal feelings, uh, because it's about me... it's interesting.
[118] This is, you know, this strange new ground that we're covering. I mean, we've been discussing this ad nauseam on the podcast lately, that essentially we've been dealing with 20 years of this.
[119] And in those 20 years, it's changed radically.
[120] I mean, what it is, it's become something completely different.
[121] It's become something that changes public opinion on things overnight.
[122] It's become something where you can distribute information from, you know, person to person about some, you know, a huge international news event.
[123] You could get all of your information from Twitter, whether it's what happened in Venezuela or what, you know, anywhere there's something in the world.
[124] People are turning to social media before they turn anywhere else. When I hear about something, almost always, before I even Google it, I go to Twitter and check Twitter, you know, and see what's going on. DuckDuckGo, have you heard of that one? No, what's that? It's like a privacy-focused search engine. It's pretty much the only privacy alternative to Google. It's like this idea that we say, oh, just Google it, right? Why do we... I mean, our whole process has been to, like, purge proprietary surveillance tools from our company, and I've been trying to do it myself, like getting off Facebook, getting off Twitter, getting off Instagram. It's just like they're so abusive to everybody. And it's like, there's brilliant people who work there. I mean, Instagram is such a well-designed app, are you kidding me? Beautiful. But so what do you think is abusive about it particularly? Let's start with Twitter. They're all the same.
[125] They're all the same.
[126] Yeah.
[127] Do you think they're all the same because they're all gigantic businesses?
[128] Yeah.
[129] And they're all the same because none of them share their source code.
[130] And they all spy on everybody.
[131] And they don't show you what is happening behind the scenes.
[132] They don't show you what the code's doing.
[133] So, like, in that note I wrote you the other day, I compare it to, like, food transparency.
[134] You know, 50 years ago, nobody thought about that.
[135] And then 20 years ago, everyone's like, I want to know what's in my food.
[136] But why wouldn't you want to know what's in your apps?
[137] Yeah.
[138] I mean, it's super sketchy what they're doing.
[139] But how so?
[140] Like, what's super sketchy?
[141] We don't know, but we know that they're spying on everyone and tracking you everywhere you go.
[142] They're targeting things at you based on physical location, browser history.
[143] Even when you're not on those websites, they're following you around where you're going on the internet.
[144] Right.
[145] And so some people accept that for this free search engine with free e-mail and things along those lines.
[147] They accept the fact that a certain percentage of what they're doing is not going to be private.
[148] Or at least their searches are not going to be private.
[149] Like say if you search, like you're thinking about buying a Jeep and you search Jeeps, you look at, you know, 2019 Jeep.
[150] And then all of a sudden all your Google ads are about Jeeps.
[151] Right.
[152] They're like, we know.
[153] We know you're thinking about a Jeep, Bill.
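The ad-targeting loop being described (search history feeding later ad selection) can be sketched in a few lines. The keywords, queries, and ad names here are all made up for illustration, not any real ad system's logic:

```python
# Toy sketch of interest-based ad targeting from search history.
# All queries, keywords, and ad names are hypothetical.
search_history = [
    "2019 jeep wrangler",
    "jeep towing capacity",
    "best hiking boots",
]

ad_inventory = {
    "jeep": "Jeep dealership ad",
    "hiking": "Outdoor gear ad",
    "laptop": "Computer ad",
}

def pick_ads(history):
    """Return ads whose keyword appears in any past search query."""
    return [ad for keyword, ad in ad_inventory.items()
            if any(keyword in query for query in history)]

print(pick_ads(search_history))  # ['Jeep dealership ad', 'Outdoor gear ad']
```

Real systems build far richer interest profiles (location, browsing history, cross-site trackers), but the basic mechanism is this kind of matching between a stored profile and an ad inventory.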
[154] And I don't think that that makes people want to spend more time on Google and Facebook.
[155] What do you think it does?
[156] Do you think it freaks them out?
[157] I think that we're just numb to it, and so we accept it.
[158] Yeah, I think it's more than that.
[159] Yeah, and so there's all different layers of, like, what we use with your browsers, your apps, your operating system, your food, your, you know, government, your energy.
[160] Like, all of this technology has code that's associated with it.
[161] So, and when you open up your computer, when you sign into a browser, when you open up an app, you are empowering that app.
[162] That's how the apps of the world become huge, monstrous corporations: because we all use them every day.
[163] Right.
[164] So if you switch from, you know, macOS to, like, GNU/Linux, like Debian or Ubuntu, if you use Brave or Firefox... if you, you know... DuckDuckGo is actually proprietary, which is annoying, but they are very privacy-focused.
[165] And then there's apps, there's Minds, there's all sorts... there's other open source decentralized social networks out there that we can potentially federate with.
[166] There's really cool, interesting new protocols like Dat and IPFS that are, like, more torrent-style on the backend.
[167] So there's actually no servers in a giant warehouse like Facebook and Google.
[168] It's more, it's fully peer to peer.
[169] And we're trying to balance it, because it's not like decentralization equals good and centralization equals bad. But, like, you know, in order to get a sweet, Instagram-style app you need servers to, like, process video, and so the tech is still sort of immature in the fully peer-to-peer, like, you know, Bitcoin-style internet. But we're definitely getting there, and I just think it's important for people to use things that are transparent to them and respecting of our freedom. Yeah, I think one of the problems with these giant companies is that once they become big, you kind of use them as a default, and it's very difficult to get people to communicate with you off of them.
[171] You know, it's hard to say, hey, man, I'm launching this new social media app.
[172] I would imagine you could speak to this.
[173] I'm launching this new social media app, and I want you to join it.
[174] People are like, but I'm already on fucking Facebook.
[175] I'm already on Google.
[176] I'm already on Instagram.
[177] I don't want to do that, man. It's too much, too much extra.
[178] And we make it a million times harder for ourselves because we're not scooping into people's contacts and, you know, taking all their information.
[179] That you're not.
[180] We're Minds.
[181] No. So, like, when you give your address book to an app, who does that?
[182] Every, most apps.
[183] You've got to be an asshole.
[184] You know, no, but when you say, oh, I want to find my friends who are on this app and you share your contacts.
[185] But you're not supposed to do that.
[186] You're not supposed to do that, yeah.
[187] But most people do, and, you know, your friends didn't give you permission to give Facebook their phone number.
[188] Do you do that?
[189] I probably used to, like, seven, eight years ago, but I don't do it anymore.
[190] I always say the same thing when it pops up, get the fuck out of here.
[191] That's always what I say. Would you like to share your contacts? Get the fuck out of here. No, you can't have my contacts, you asshole. I know what you're doing. Yeah. Facebook is a weird one, man. It's such a sneaky one, you know. Facebook and, you know, like, all this, uh, the congressional hearings and the inner workings of it all, and the fact that it profits off of outrage, so it wants people to argue.
[192] Like, the AI, the machine learning, specifically wants people to have, like, contentious debates about things, because that keeps their eyes focused on the website.
[193] And if your eyes are focused on Facebook, you know, then those Facebook ads are very valuable.
[194] It's really fascinating, man. I think the outrage is unavoidable on any network.
[195] It's more, you know, are you going to take down...
[196] They're taking down outrage.
[197] Something? Yeah, sure. And it just seems so inconsistent and subjective how they're applying it. I mean, even just yesterday, I think some, uh, journalists got banned from Facebook. Yeah, you aware of the story? Yeah, yeah. Let me... I'm going to send this to you, Jamie, because it's a really crazy one, because they wanted her to show who her funding sources were, and I didn't even know that there was an area where you could show that. So it's almost like they're making this up as they go along. Yeah, Kyle Kulinski sent me this today.
[198] I'm going to send this to you right now, Jamie.
[199] Hang on one second.
[200] Hold on.
[201] I'm very quiet.
[202] Unfortunately, this is an audio show.
[203] This is live air.
[204] Yes.
[205] Not dead air.
[206] There you go.
[207] Okay, buddy, I just sent it to you.
[208] Okay: Facebook suspended In the NOW's page at the behest of CNN and U.S. government-funded think tanks.
[209] It says we had almost four million subscribers, did not violate Facebook rules.
[210] We're given no warning, and Facebook isn't responding to us.
[211] So, yeah, what was it that actually started this off?
[212] I mean, who knows?
[213] They don't communicate with anyone.
[214] They've been banning legit accounts for years.
[215] You cannot even send a minds.com link through Facebook Messenger right now.
[216] It's blocked.
[217] What?
[218] Yeah.
[219] What?
[220] If you post it in the news feed, it says, careful, this could be an unsecure website. Oh, actually, I just clicked on a link from TMZ yesterday and got the same thing from Twitter. Twitter said this might be malicious, there's spam... Could it be from Minds? No, it's from a TMZ link. It was clicking this, like, this story is on TMZ, here, do I see the rest of the story. So they're trying to keep you from going to TMZ? Yeah, I don't know why. It's the first time I've ever seen that. Huh. It's probably caught up in some algorithm. I sent an actual written letter to Facebook about it. Obviously they don't get back, and there's no human activity.
[222] A written letter?
[223] You wrote it with a piece of paper?
[224] No, no. That would have been cool.
[225] I signed it with ink.
[226] Really?
[227] Well, yeah, no, because our lawyer said to, you know, that actually proves that you sent them something, some sort of diligence.
[228] But there's just no recourse.
[229] Right.
[230] It's, they're lost.
[231] So explain.
[232] So if someone is trying to say on Facebook Messenger, hey, you should go check out minds.com.
[233] It won't let you post that link.
[234] No. And what is their excuse?
[235] They don't tell you.
[236] No, they don't tell you.
[237] So is it because you're a competing social media network?
[238] I don't know.
[239] I don't want to get into it.
[240] I don't know.
[241] You don't know.
[242] I'm not going to say that.
[243] But you just know that it does.
[244] Yeah.
[245] You don't know why it does, but you know it does.
[246] And they're calling us unsecure, and I'm pretty sure that Facebook got hacked.
[247] You know, they compromised everybody's data.
[248] Like, you want to talk about unsecure.
[249] There's no more unsecure site that exists.
[250] It is kind of funny, right? I mean, after those hearings and after all the, uh, the Russia stuff. Yeah, it is kind of funny calling somebody else insecure. Yeah, they're insecure. Mark Zuckerberg is very insecure. Well, he's also stupid rich. He seems like he's too rich, like he fucked up, like he's there sipping water like a robot trying to figure out what the fuck he's doing with his life. I think that they're scared, because they know they've betrayed everybody, and so it's hard to get them to speak. You know, it's interesting with Dorsey here, because I, you know, give him credit for speaking, but the fact is that he's not answering the questions. Well, he's bringing somebody else in to answer the questions the next go-round, and so that should be very interesting. You think he actually didn't know the answer to those questions? I think he probably doesn't know all the specifics, because he's a CEO of not one but two different corporations. He's busy. Shit, and also rich as fuck. True, but I think that when we look at the policy that exists on these networks, like, he is in control of the policy to a large degree. There's a board, there's a decision-making process, but he has a large voice. Okay, I don't know how large his voice is. I assume that's probably true, but one of the things we did detail on the last podcast with Tim Pool was how he wasn't the CEO for quite a long time. Fired and then rehired. Yeah, yeah. So, um, obviously there's some contention, there's some issues, and, you know, there's a lot of money involved in these things.
[251] And I think that plays a giant part in how they decide to make decisions.
[252] But do you think that an advertiser, in reality, doesn't, like, say you're an advertiser and you want to advertise your computer.
[253] Okay.
[254] And there's a video on YouTube that is about something controversial.
[255] Does it actually make sense for that advertiser to not show their product on that controversial video? Don't they want to sell computers?
[256] Well, it depends.
[257] I mean, if the controversial videos are about how Jews are evil, and you have this video about Jews being evil, and then, you know, you're like, buy Razor computers.
[258] Come on, da-da-da-da-da.
[259] Right, but do you think that people actually, I can understand not wanting to support certain types of content, and maybe advertisers feel like they're supporting that content by advertising next to it, but I also don't think that people when they're watching a controversial video on the internet say, oh my gosh, you know, this advertiser is completely out of line for being next to this controversial thing.
[260] I don't think that's a healthy direction to move.
[261] Well, okay, that's one way to look at it.
[262] Another way to look at it is if you are a giant company that sells things. Let's say you're Toyota and you're selling Tundras. You don't want your Tundras to be associated in any way with something that you might think is negative.
[263] That's their prerogative.
[264] They're paying for advertising.
[265] They can kind of decide.
[266] This is one of the things that's leading YouTube in specific, and I've had a ton of conversations about this.
[267] It's leading them specifically to try to demonetize things that could be considered distasteful or insensitive or controversial, and it's very frustrating to content creators.
[268] When you talk to them, they're essentially saying that they need to do better and that their tools are very blunt, that they don't really have the correct machine learning tools to figure out what is offensive and why. And then there's a human review system, which is very weird, and we've run into that many times, where, like, we'll have a podcast with, say, Tom Papa, who's an uncontroversial, fantastic stand-up comedian, and it's, like, demonetized. And then we're like, why, what happened? And then we go, what the fuck can we talk about?
[269] We didn't talk about anything crazy.
[270] And it's really damaging for brands when it gets demonetized right away because it's that initial time period that generates the most revenue.
[271] So when you have to go back and do it, I mean, so I agree with that.
[272] So we built a tool that's like a peer -to -peer advertising tool.
[273] So it's not, there's two options.
[274] You can, so you earn crypto for your contributions.
[275] And then...
[276] Which cryptos do you support?
[277] We have an Ethereum-based token.
[278] But we're going to support all of them.
[279] So what is an Ethereum-based token?
[280] So it's an ERC -20 token.
[281] What does that mean?
[282] It means that we basically reward people for all of their activities.
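For context on the term that just went by: ERC-20 is a standard interface on Ethereum, so any token contract that exposes functions like totalSupply, balanceOf, and transfer works with the same wallets and exchanges. A rough Python sketch of that bookkeeping (illustrative only; real ERC-20 contracts are written in Solidity and revert failed transfers rather than returning False):

```python
# Illustrative sketch of the core ERC-20 bookkeeping.
# Not Solidity and not Minds' actual token code.
class ERC20Sketch:
    def __init__(self, supply: int, owner: str):
        # All tokens start in the owner's balance.
        self._balances = {owner: supply}
        self._supply = supply

    def total_supply(self) -> int:
        return self._supply

    def balance_of(self, account: str) -> int:
        return self._balances.get(account, 0)

    def transfer(self, sender: str, to: str, amount: int) -> bool:
        # A real contract would revert the transaction on insufficient funds.
        if self.balance_of(sender) < amount:
            return False
        self._balances[sender] -= amount
        self._balances[to] = self.balance_of(to) + amount
        return True
```

The value of the standard is exactly that sameness: tooling only has to understand one interface, not every token's custom logic.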
[283] Okay.
[284] So, like, say, if Jamie's posting on Minds and people love his posts, he gets rewarded in some...
[285] Yeah.
[286] How much?
[287] How much you get?
[288] Well, can I go buy a house?
[289] One token will give you a thousand impressions.
[290] Oh.
[291] So we're not focused on, like, oh, you're going to make money from this.
[292] That's not what we're saying.
[293] One token will give you a thousand impressions.
[294] Or you get a thousand impressions from... You get a token from one thousand impressions. When you use a token to advertise on Minds, you get a thousand impressions when you boost your posts with it. So wait a minute, if you use the crypto, you use a token, you guarantee views? Yeah. That's weird, isn't it? Why? Well, you're guaranteeing people see something. Well, when you boost it, it gets fed to people's news feeds chronologically. Right, I see, so there's just a backlog. So sort of like when Instagram has those sponsored posts, except we're not spying on people when we send them. Instagram spies on people too? Yeah. I don't know, man, I'm stupid, help me out. Yes, they do, and the thing is, we just don't know. So this is where free and open source software is just essential. Like, the big networks, there's no excuse for them not to be sharing their software.
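The exchange rate described here (earn a token per thousand impressions, spend a token for a thousand boosted impressions) is simple enough to sketch. The function names are hypothetical, not Minds' actual API:

```python
# Sketch of the token/impression exchange described above:
# 1,000 impressions earned -> 1 token; 1 token spent -> 1,000 boosted impressions.
# Names are illustrative, not a real API.
IMPRESSIONS_PER_TOKEN = 1000

def tokens_earned(impressions: int) -> int:
    """Whole tokens earned from organic impressions."""
    return impressions // IMPRESSIONS_PER_TOKEN

def impressions_bought(tokens: int) -> int:
    """Guaranteed impressions when boosting posts with tokens."""
    return tokens * IMPRESSIONS_PER_TOKEN

print(tokens_earned(4500))    # 4
print(impressions_bought(3))  # 3000
```

The symmetry is the point: the unit of account is reach itself, not a promise of revenue.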
[295] It's like when you're a public forum on that scale, the community just has a right to know what the algorithms are doing.
[296] So you think that they're not sharing their software because their software is coded and designed to spy on you and extract information and sell that information.
[297] Partially.
[298] Like when Jamie gives up your contacts, when he signs up for an app and he says, yes, you can get access to all my contacts.
[299] There's a lot of reasons, and they don't want people to compete with them. Like, anyone could actually take all of our code and make their own social network and compete with us. They could set it up on their own servers, and we encourage that. That's what, like, the Fediverse is called. That's what Elon Musk does with Tesla, all of his patents for electric cars. I think that he opened up the patents. I don't think he open-sourced all the code of the car, right? But he's definitely moving in the right direction of, like, he wants to build the market. Yes, and he also wants to save the world. I mean, he legitimately has this... and he also has a shitload of money. He's got enough money, yeah. I think that's a big factor with those guys. But don't you think that it's almost like it's going to help? Whatever network does that, is more transparent, stops spying on people, is more community-run and involved, wouldn't that be the network that you would think humanity would want to stick with in the long term? Like, wouldn't that be a good move for them? Yes. And for the average person, what are they losing when they get on Facebook or Google?
[300] What's bad?
[301] Well, now their likes are going down.
[302] Everybody's likes are going down, and that makes everyone very sad.
[303] What do you mean?
[304] Well, the algorithms... you're only reaching 5% of your own followers organically on Facebook now.
[305] And they're starting to change the chronological feed on Instagram, too.
[306] And they know that this causes depression, and they're still doing it, because they think they're better at showing you what you want to see than you are, and they want to make money from it. What do you mean by, they know that this causes depression? They've done studies about mental health in relation to... Actually, Facebook got exposed, like, five years ago for doing a secret study on, like, a few million users where they were injecting both positive and negative content into the news feed, and they proved that they could affect people's moods. This was with Princeton.
[307] There's a huge backlash and they're like, oh, sorry.
[308] Whoops.
[309] Right, but this isn't injecting negative or positive content.
[310] This is just moving these images or these posts around so that less people see them.
[311] There's two different topics there.
[312] The basic news feed on Facebook is now a mysterious conglomeration of thousands of variables, which we don't know.
[313] But additionally, like a few years ago, they were exposed for having been experimenting with people's brains.
[315] That's right.
[316] I remember that now.
[317] I remember that now.
[318] That's right.
[319] Yeah, I remember thinking like, wow, that's kind of creepy.
[320] They can do it.
[321] They're experimenting on the people that are on their site and they're not telling these people they're experimenting on them.
[322] Yeah.
[323] But do you, I mean, if they're trying to make it better, do you think that they're really... I mean, how does it cause depression if your images or your posts are just not being seen by as many people? Have you talked to kids posting on social media, and their reactions to how many likes they're getting? They get very, very, um, concerned. Well, that seems like more of a problem with... It is both. It is on both sides, being addicted to likes as some sort of, you know, weird dopamine hit, right?
[325] It's not healthy.
[326] And we need to learn to not care about that.
[327] But I think that the core purpose of a social network is to subscribe to someone and see their stuff.
[328] And when people subscribe to you, they see your stuff.
[329] Right.
[330] So when you spend years building up a following on social media, and say you earn 100,000 followers or something, and then suddenly the network says, no, no, your friends can't see that anymore.
[331] That's not cool.
[332] And even like Twitter's default newsfeed is no longer chronological.
[333] You have to click it to go chronological and then it defaults back to their weird algorithm thing.
[334] So we're saying, look: 100% organic, chronological, raw, forever, as the default.
[335] And then if you want to curate algorithms or have recommended stuff come in as an alternative, fine.
[336] But that is the core purpose of social media: to connect with people that follow you, and the other way around.
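The distinction being argued for here, a raw chronological feed as the default versus an engagement-ranked one as an opt-in alternative, comes down to the sort key. A sketch with made-up post data (no real network's ranking looks this simple):

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int      # e.g. seconds since some epoch
    engagement: float   # hypothetical predicted-engagement score

posts = [
    Post("alice", 100, 0.2),
    Post("bob",   200, 0.9),
    Post("carol", 300, 0.1),
]

# Chronological feed: newest first, nothing hidden or reweighted.
chronological = sorted(posts, key=lambda p: p.timestamp, reverse=True)

# Algorithmic feed: ranked by predicted engagement instead of time.
algorithmic = sorted(posts, key=lambda p: p.engagement, reverse=True)

print([p.author for p in chronological])  # ['carol', 'bob', 'alice']
print([p.author for p in algorithmic])    # ['bob', 'alice', 'carol']
```

In production the "engagement" key is a model over thousands of signals, which is exactly the opacity being objected to; the chronological sort has no tunable knobs at all.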
[337] What do you think the purpose is?
[338] Like, why do you think Facebook would decide to have things not in chronological order and only be seen by 5% of your followers?
[339] Like, what would be the benefit in that for them?
[340] Revenue.
[341] Revenue, how so?
[342] How does that generate revenue?
[343] They just know that they can keep you on the app better.
[344] If you get less likes?
[345] No. If your stuff is seen by less people, it doesn't make sense.
[346] That's a good point.
[347] It sort of works both ways.
[348] I think that they think they know the people that you're going to react to the most. So as a consumer, when you're getting that content, you know, the algorithms are showing you what you typically like. Have you noticed that? I'm really not paying much attention, but I believe you. So, yeah, for creators, it's hurting creators. People who post are getting hurt. People who are sitting there just scrolling, they're the ones who are really getting, you know, addicted, more so, with the algorithms. So how are the people that are posting getting hurt? They're getting hurt because their stuff is being seen by less people. Yeah, because it's not chronological and it's not organic, because it's curated. Huh. But aren't they doing it because they think it's going to be a better experience, one that's more conducive to your likes? That's what they say. What do you think they're doing it for, then? They're doing it because they have studied, through looking at the data, how to keep people on the app more.
[349] Right.
[350] And that way is to give them, like say if I Google or if I look at muscle cars on Instagram.
[351] Now, if I go to my search, it's all muscle car stuff.
[352] So that's what it is.
[353] They say, oh, he likes that.
[354] So we're just going to give him a lot of that.
[355] And I think that's okay as an alternative feed or to put that somewhere.
[356] I just think the core feed always needs to stay pure.
[357] Because otherwise you're just down the slippery slope again.
[358] And it's just feeding, they're, they're injecting things into your head that you didn't ask for. Right. And they're doing it because they want to keep you around. Yeah, that makes sense. Um, how many different companies are subscribing to that? It seems like all the big ones, we're saying, are curating and moving things around. And all the big ones have an algorithm that's designed to keep you on board, right? And that's okay to pursue. I think there are really cool things you can do with AI and machine learning and algorithms that are really beneficial, but when it's just taking away people's reach, when they have worked years and years to achieve it, it's not okay.
[359] Do you think that this is this marriage between something that is this social media network that's designed to allow people to communicate with each other and then commerce, like this business, like how do we maximize this business, how do we get more profit out of this business, how do we get these people to engage more?
[360] And then they start monkeying with the code and screwing with, what you see and what you don't see?
[361] You think that's what's happening?
[362] Yeah.
[363] But in the short term, it's probably working.
[364] But in the long term, they're betraying everybody's trust.
[365] It has to be more of a consent -based system.
[366] So, you know, at least give people, well, it should be opt-out by default.
[367] And fine, send me messages asking me to opt in so that you can show me certain things.
[368] But this whole forcing people into surveillance, it just has to stop.
[369] It's, it's super scary. How's it super scary to you? It's just too much power. Yeah, it's too much power for something that's supposed to be silly, right? Like, what was Facebook supposed to be? Supposed to be some silly thing that you just can communicate with friends. It was, but from the beginning, all of these, none of these networks have ever really been about the people of the networks. It's always been closed source since the inception. So, but then look at open networks out there.
[370] You have Wikipedia.
[371] Totally open source, community run.
[372] Granted, they have their issues with moderation.
[373] Fine.
[374] But it's a top 10 website in the world.
[375] It's totally open source.
[376] Creative Commons content.
[377] Incredible human achievement.
[378] Bitcoin, open source money.
[379] WordPress, even, is an open-source CMS that is, like, powering 25% of the internet.
[380] So why wouldn't that happen with social media?
[381] It should.
[382] I mean, this is where everyone's hanging out.
[383] So we should all sort of collectively even own it.
[384] We did an equity crowdfunding round, so, like, thousands of members of our community actually own the site.
[385] Now, how many people are on Minds?
[386] We have, like, a million and a half registered, like a quarter million active.
[387] We're small.
[388] But the weird thing is that even though we're a fraction of the size, smaller creators especially who come over get better reach on Minds than they do on Facebook and Twitter, because we have this reward and incentive system, sort of gamified, where you earn reach and you earn more of a voice for contributing.
[389] So like you could have an account on Twitter for 10 years and post thousands and thousands of tweets and you never hit that viral nerve and you just never really get much exposure.
[390] So we're trying to help people be heard.
[391] And so you'll find a small creator who has no followers on other networks with thousands and thousands of followers on Minds.
[392] And what do you think you would like to do with Minds in the future that you haven't been able to do yet?
[393] Engineer the control out of ourselves so that we aren't even in a position to really, you know, take people's stuff down or...
[394] What if someone posts your house and your information, where your kids go to school?
[395] I think that on the central servers, obviously, yes, we're always going to moderate, and if it's legal, it can stay.
[396] If it's illegal, it can't.
[397] But a decentralized social network is definitely where we have to go because, and yeah, okay, it's scary.
[398] And, you know, you've talked about this, like, things are getting more transparent.
[399] Our lives getting more transparent, it's sort of like the inevitable evolution of technology.
[400] I mean, how many hours a day do you stream?
[401] A couple, you know.
[402] Twenty-five years ago, would you have thought you'd be sharing, you know, 20% of your life, live streaming to, you know, millions of people? Like, our lives are becoming more transparent, just inevitably.
[403] It's just pulling us.
[404] Yeah, I agree.
[405] So, you know, Bitcoin, crypto, Dat, torrent-type architecture.
[406] That is just where we're going because it's more resilient.
[407] It's less censorship prone.
[408] There's just benefits of it.
[409] I think that we can balance it too.
[410] Like maybe when you post, you have a decision.
[411] Do you want to be able to delete this at any point?
[412] All right, fine.
[413] Then you can post to the central server.
[414] Do you want this to get unleashed?
[415] Yeah, it's scary because, you know, there's scary stuff on the internet.
[416] It's already like that.
[417] But, you know, getting into censorship more, does censorship even solve the problem?
[418] Or does it make it worse?
[419] What problem?
[420] The problem of crazy content, illegal content.
[421] How could it make it worse?
[422] Well, I mean, it seems like it can often amplify radicalization.
[423] It definitely can, right?
[424] Yeah.
[425] Yeah.
[426] And it definitely, when you censor people, it just makes them aware that there's a plot against them too, right?
[427] It's like, a lot of conservatives on Twitter find that.
[428] Somebody, Sam Harris actually, just sent me an article that was detailing the bias against conservatives on Twitter, where they've actually done, you know, like, some real study of it.
[429] It's pretty demonstrable.
[430] Demonstrable?
[431] It affects both the left and the right.
[432] Demonstrable?
[433] Yeah, maybe I'm saying it wrong.
[434] But it affects the left and the right for sure.
[435] That's what Kyle was saying.
[436] I watched that video that he did.
[437] And it's anti -establishment that seems to be getting targeted.
[438] And so, you know, Abby's been censored on Facebook.
[439] Abby Martin.
[440] Yeah.
[441] And, yeah, this person today, I mean, most of the stuff coming out of RT is progressive, which is weird.
[442] And who knows what kind of games are getting played behind the scenes with the Russians.
[443] I mean, who knows?
[444] But the point is they have a right to be there.
[445] And I mean, look at, and this is not YouTube's fault.
[446] But remember the YouTube shooter?
[447] Mm -hmm.
[448] I mean, she thought she was getting censored on YouTube and she went and brought a gun to the YouTube headquarters.
[449] Like, people get pissed when they get censored.
[450] It affects you.
[451] Right, but in her case, you're talking about a crazy person that wasn't really.
[452] Of course, but there's crazy people out there.
[453] No, she thought she wasn't being censored.
[454] Well, she was getting censored just like everybody else is getting soft censored on these networks.
[455] Well, she just thought this wasn't, she wasn't getting promoted the way she wanted to.
[456] I don't think anybody was actively doing anything to her because her stuff was terrible.
[457] I'm saying that the soft censorship of the algorithms, people getting demonetized, this has an impact on psychology.
[458] Okay.
[459] I see what you're saying.
[460] So I'm not saying they were deliberately targeting her.
[461] And it's horrible what happened.
[462] But yeah, so what you're saying is that these algorithms that they use in order to maximize their revenue and give people things that they like, but actually take away from things being posted chronologically, keep certain things from being seen by as many people, so it keeps them from being as viral, so it keeps the whole thing from being organic. Yeah. Yeah, makes sense. Yeah, it's, uh, it gets to that point where we're realizing that all of these things, all these social media things, are really recent.
[463] We've only had them for a few years, and we don't necessarily know what the rules should or shouldn't be.
[464] So it's good, I mean, it's one of the reasons why I wanted to have you on.
[465] I wanted to find out where these upstarts, where these new people, as they're coming into the game, like Minds, like where you're coming into the game from, and what is your position on what's wrong with the current state of affairs?
[466] Yeah, and look, there is messed up stuff on social media.
[467] Like, I'm not, we'll get pigeonholed, to be like, oh, you support all of this, this crazy stuff.
[468] First of all, most of the users on Minds are, like, artists, musicians, filmmakers, activists, journalists, just trying to get their content out there.
[469] There's a very tiny minority of like actually, you know, crazy content.
[470] When you say crazy content, what do you mean, alt-right?
[471] Yeah.
[472] Or, I mean, I'm not even going to make decisions on what is and isn't crazy.
[473] That's not, that's not my place.
[474] But the, it's been proven that censorship is not the answer. I mean, look at the history of prohibition. Yeah. It's, it's, you have digital content, it's substances, it's, it's anything. People want information, and they want the ability to make the decision for themselves. They certainly do. And then the argument on the other side is, when people are distributing, and I'm going to use the big air quotes, hate speech, that's when it gets slippery to me, because who's to decide what's hate speech and what's not hate speech? I mean, I've seen people make some ridiculous fucking statements about all sorts of people that are inaccurate, and they do that in order to categorize them and pigeonhole them in an easily definable and dismissable characterization. You know, you just decide, hey, that Bill Ottman guy, that guy's a this, oh, he's a radical that, and he believes in this, so fuck him. And they're like, okay, fuck him, sweep more, hey. And then cancel culture comes in, like, we're going to cancel Bill Ottman, we're not listening to him anymore, you know, he lied to us about his source or whatever the fuck you're doing. Have you heard of Daryl Davis? No, I have not, unless I forgot. Daryl Davis is your boy. He's my boy. He, he'll, you'll, you'll, I haven't met him, but he's my boy. I want him to be my boy. So he is a Black man who befriended hundreds of members of the KKK, and he got them all to leave. He got them to leave the KKK, all 200 members, after he was like, yeah, I'm just going to talk to you.
[475] Really?
[476] Did you ever see W. Kamau Bell's show when he visited with those white supremacists?
[477] Not that specific one.
[478] No, it's really good because he's such a nice guy.
[479] He's so easy to get along with that they were like sort of, they let their guard down around him.
[480] And you get to see these people kind of confused that they like this guy.
[481] You know?
[482] That's why I think
[483] initiating human contact via the social networks, like, that's really important. But to play devil's advocate, it's one of the worst ways for people to express themselves in a way where you consider other human beings' experiences and feelings and the way they're going to receive what you're saying, because there's no social cues. You're not interacting with them, you're not looking at them in the eyes. It's one of the weirder forms of communication between human beings, and one that I would argue we have not really necessarily successfully navigated yet. I agree. I, I was actually saying that I think we should use social media more to get people to get together in real life. Do you know who, uh, Megan Phelps is? No. She, uh, was, uh, with the, um, Westboro Baptist Church, uh, you know, the, the, the famous one that protests those soldiers' funerals, and, you know, and anything gay. And they're, like, ruthlessly, viciously fundamental Christians, where, you know, they do a lot of protesting at funerals and doing a lot of stuff to try to get, she was with them for, for the longest time, and then got on Twitter. And through communicating on Twitter, and when you meet her, you would never believe it in a million years that she was ever this fundamentalist, and that she was ever some mean person sending hateful messages to people because their son was gay or whatever it was. Now she's completely cured of it.
[484] She has no contact with the church anymore.
[485] She's married.
[486] She has a kid.
[487] She's completely outside of it.
[488] She does podcasts now and gives TED talks and speaks about radicalization and about how she was kind of indoctrinated and grew up in this family.
[489] And her grandfather, Fred Phelps, was this, you know, it's like, a fucking mean guy, like a really mean person.
[490] He's the God hates fags guy.
[491] You know, they would have those signs that they would hold up at, at soldiers' funerals. I mean, it's like really inflammatory stuff. But through Twitter, through her communicating with people on Twitter, specifically her now-husband, like, he, he cured her, like, just with rational discourse and communication, and she was open to it. Yeah, they will change. Yeah. And so that's why banning them, I mean, and you, I, I saw in a recent podcast you've been talking about redemption. Yeah. So, but I'm curious, what do you, do you think people, what is, how does that look like?
[492] Well, 10 minutes, look, in the case of like Megan Phelps, that's a real thing, right?
[493] She really did change.
[494] Another example is Christian Picciolini.
[495] Do you know who he is?
[496] He was a white supremacist KKK member guy who he's been on Sam Harris's podcast.
[497] He's also done some TED talks, who now speaks out against it and talks about how he was indoctrinated and talks about how lost he was, and then he was brought into this ideology. Um, there are, there's many people like that all over the world. Um, uh, Maajid Nawaz was another perfect example. He was an Islamist. I mean, he was, you know, trying to form a caliphate, was literally thinking about radical Islamic terrorism as being some sort of a solution. Now he's the opposite. Now he's trying to get people to leave, and he's trying to get people to be more reasonable. And did you see what happened to him?
[498] Yeah, he got punched in the street.
[499] Yeah, some guy called him a fucking Paki, I guess, and punched him in the head and fucked his head up.
[500] And he's got this giant cut on his head from a ring and his face is swollen up.
[501] But apparently they have the guy on video and, you know, they think they're going to be able to arrest the guy.
[502] I've had Maajid on the show.
[503] He's a super nice guy.
[504] The hard thing is that, all right, yes, we see these transformations take place and makes us feel warm inside.
[505] And, yes, people can change.
[506] But at the same time, what are people, have, should people have to go apologize to Twitter? Oh, I'm sorry, like, can I come back? Right. I mean, that's not, like, sometimes people are going to think completely differently than you, and you just have to deal with it. Right. And it, that should be okay. We shouldn't force people to come in to our way of thinking in order to have discourse. No, that's a good point. That's a very good point. And, like, who is to decide what this path of redemption is and whether or not you've completed it, right? Who has to decide? Like, maybe you are, like, a hyper-radical lefty, and maybe Jamie's points of view and yours are just never going to line up.
[507] So you're like, fuck him.
[508] He's banned for life, which a lot of people have been banned for life.
[509] And when you look at some of the infractions they've been banned for, they're like, boy, I don't know about that one.
[510] That doesn't really make sense.
[511] Almost none of the high -profile banning cases make much sense.
[512] No. It's like a short-term solution that's creating a long-term problem.
[513] Yeah, that's really what it is. So I just think that we have to talk about it more. I don't know. It's like, why can't we just get everyone to talk about it? Yeah. Like, at the same time, I mean, it's like we're just wasting time here. Well, sort of, but I also think we're figuring it out as we go along, with a bunch of different competing ideologies. Um, you know, you have yours, which, like, you, dude, you look like a hacker, like on House of Cards.
[514] You look like a guy you'd call in to break into the mainframe server.
[515] I'm not that, honestly.
[516] I believe you're not.
[517] I hang out on GitLab and check out code, but I cannot code.
[518] Listen, man. I'm not claiming to be a developer.
[519] No, these people are another level.
[520] It is incredible.
[521] I understand.
[522] Right.
[523] Yeah.
[524] I get it.
[525] Well, that's like if someone says to me, like, you're an MMA fighter.
[526] I'm like, I'm definitely not.
[527] and they are on another fucking level.
[528] Like there's a different, I know a little martial arts, but just settle the fuck down, right?
[529] Same kind of thing.
[530] I think, though, that your ideology is going to be, your point of view and perspective is going to be very different than maybe someone who's like a radical Marxist.
[531] You know, should they maybe not be allowed to post on the site, too?
[532] Someone who's like an extreme socialist, someone like AOC, you know, someone who thinks that we should give money to people who are unwilling to work, someone who thinks that we should try to engineer society and tax the top X percent, you know, 70 something percent of their income.
[533] There's a lot of those different people and we have to figure out how to make it so that, well, we have to figure out a way to make it so all the ideas can compete in the marketplace of ideas, right?
[534] All these different ideas can compete, and we can find out which one is better. Yeah. You don't always find out which one is better, though, right? You find out which one is most, I mean, that's what happened with Hitler, right? You don't really find out what's better, you find out what, what's got more juice behind it. It's just, it's too risky. Even being in the position that I'm in, you know, I see these edge cases. Like, we say, look, if it's legal, it can be there, but we still see edge cases where we have to make decisions. Okay, what's like an edge case? I mean, let's see. I mean, there is, I don't even want to go here, but I will. There is a type of animation. Uh-oh, porn anime? Yeah, that is very sketchy. That is, you know, like child porn, animated child porn. And we've taken the stance that, look, it could fall under obscenity laws, so we don't, we're not cool with that. But, you know, that is a huge debate, right? That, that, that has not been decided by the Supreme Court, if animated, you know, kids, like, they will do the weirdest stuff, and I, I just don't want to be telling people what is and what is not art. Right, so, like, some of that Japanese stuff with tentacles, like, some of that stuff is just like, what is happening here? Got, like, octopuses banging chicks in every hole, and they're choking on it, and they've got one in their ass and one in their vagina, and it's all, like, very, very liquidy. You know, there's a lot of splattering going on. You're like, what the fuck is this? And is that okay, because it's just art, right? I mean, if it was a person getting fucked left, right, and center by an octopus, you'd be like, yeah, I think we've crossed some lines here. That's bestiality. But if it's an image, and then, then the image is a girl with a schoolgirl costume on, she's dressed like a Catholic schoolgirl with a little skirt, and she's getting banged by an octopus.
[535] You're like, what do you do with that?
[536] Right?
[537] Yeah.
[538] Yeah.
[539] What would you do?
[540] It's a good question.
[541] I'm glad I don't have a social media site where I have to make that decision.
[542] Well, the real concern would be, is this something that is actually illegal?
[543] That's the thing.
[544] Right.
[545] And we've tried to look at the case law, and we've seen that this type of stuff has been, you know, called obscenity before, and so we're just not going to risk it. But I, I still, you know, in a, all right, nipples. Nipples, look, right. Did you know that Free the Nipple started out with 4chan? Well, everywhere. It's a whole, it's a whole movement, to be honest, right? Time magazine just did a really interesting piece about a statue that got banned from Facebook, and it was a naked ancient statue. That's a nipple. Like, I'm sorry, that's not, that's not realistic. That, that's not helping society, taking down a naked statue. What we were talking about the other day, uh, during the Super Bowl, that, uh, Adam Levine had his shirt off, and Brian Redban was like, hey, wasn't that, like, what Janet Jackson got in trouble for? Yeah. Like, yeah, why does, why is it okay if Adam Levine shows his nipples and Janet Jackson's nipples are offensive? Because they're sexualized, because she's a woman. This is the weird fact: men had to gain the right to have their nipples shown in public. Back in the day. When's the day? If you, if you go on the Free the Nipple site, there's this, go on their Instagram or something, I think that's maybe where I saw it, back when I used Instagram. But, you know, society is evolving. We're going to get there. We're going to be able to handle it, I think, or give people the controls so that they can only see the types of things that they want to see. That's ultimately what it's about. So, like, you should have, like, a filter: do I want 18-plus, do I want, um, PG-13, like, what, what kind of distinction do I want? Yeah, yeah. And then when, when things come up, like, one of the things that Instagram's been doing is, like, they say, I follow a lot of hunters, and Instagram has things where they say, warning, this is sensitive content. Uh, Nature Is Metal gets popped on that a lot, too, because Nature Is Metal is an Instagram site that's all, like, these crazy images and videos of animals eating other animals and attacking other animals.
[546] And sometimes some of them, they just decide this one's too fucked up.
[547] You know, like just decide.
[548] Right.
[549] You know, like this one where a lion is looking out of a wildebeest's asshole, like, from the inside. Like, there's this giant hole they've eaten through its stomach, and it's looking out its asshole.
[550] And they're like, yeah, this one, you're going to have to click on your own.
[551] You have to double click.
[552] What do you got, Jamie?
[553] Basically, from what I just looked up, Tarzan is the catalyst for why guys wanted to take their shirts off.
[554] They had like in the 1920s, 1910s, they had to wear in pools.
[555] They had to wear a top.
[556] But look, this only covers one nipple.
[557] They're probably tired or sweaty.
[558] That's the rebellion right there.
[559] That's how they started doing it.
[560] They're pulling down the strap.
[561] Look what it says here, saucy lifeguards flash rebellious nipples.
[562] That is hilarious.
[563] That's hilarious.
[564] That's hilarious.
[565] So is Tarzan, 1937, New York State's male shirtless ban, that's when they overturned it. The incident attracted press attention, as The Atlantic said, and at other waterfronts similarly mandated against man-nips. With that, legal dominoes tipped, along with the help of Hollywood hunks. And you were talking about how Twitter has porn. Yes. Yo, we got banned from Google Play for that. Twitter has it. They're up on Google Play. Yeah, Twitter has a substantial amount of porn. You know, you follow, like, some of them gals, and they just want to see, look, here's one in my pussy, right there, take a look.
[566] Like, full -blown.
[567] It's not offensive.
[568] Well, it's not offensive if you follow them. You follow certain porn stars.
[569] I think it's against their own terms.
[570] Oh, really?
[571] But they're just allowing it because they know they want that traffic.
[572] Oh, is that what it is?
[573] You know they want that traffic.
[574] Oh, you know they want that traffic, bro.
[575] Yeah, um, it's a problem if you hand your phone to your kid, you know. They accidentally click on that link, and they're like, Mommy, what's happening with her? But Jonathan Haidt, hate or height? Height, height, height. He was talking about an interesting thing, where, you know, should there be an age where we really get into social media? I don't know. I mean, people should be free, they're free to do what they want to do. But, you know, the internet is the wilderness. Well, his book, The Coddling of the American Mind, I'm in that right now.
[576] I just finished his other one, and I'm working on that one, and a lot of it has to do with social media, and a lot of it has to do with the impact that it has on young people.
[577] You know, people are not really designed for this, and you might be able to handle it if you're a 32 -year -old man or a 35 -year -old woman or whatever you are, but if you're a 15 -year -old girl, it might be overwhelming.
[578] I mean, and the angst and the anxiety and, you know, wanting...
[579] That's what I was saying about the depression.
[580] You know, they see, if they're not at a party, or their stuff's not getting liked, that has an impact on them.
[581] And ultimately, I think the networks need to be helping educate people how to, you know, whether it's disinfo, educate people how to research.
[582] I did see that YouTube is starting to do like, you've been on this for too long type thing.
[583] Really?
[584] Yeah.
[585] Like, get a life, you fuck.
[586] Yeah.
[587] They tell you that?
[588] I want to build stuff like that.
[589] That's really important.
[590] Yeah.
[591] Helping people get off.
[592] Yeah.
[593] I haven't seen that.
[594] I haven't done enough time on YouTube where they're kicking me off.
[595] I have.
[596] It's easy.
[597] You know, I sent Eddie Bravo this thing from The Guardian, about the upsurge in people that believe in the flat earth, and all of it because of YouTube videos.
[598] And that apparently now YouTube is, they want to censor those.
[599] They want to, uh, they, they feel like flat earth videos and, uh, I think another one, check this, check me if I'm wrong about this, but I think they also want to lean on those anti-vaccination videos.
[600] I think there's a concern with those.
[601] I think they're worried about a bunch of different things along those lines, you know, like they feel like there's disinformation and outright lies that are being spread.
[602] How do we combat it? We own this platform, what do we do?
[603] They feel like they have a responsibility.
[604] I think there is responsibility.
[605] Okay, but what is the responsibility if there's a debate?
[606] I think it's more to educate people how to research, as opposed to saying this is or is not true.
[607] Right.
[608] Because who's deciding that?
[609] Well, I believe the earth is round, but I also believe it's such a stupid conspiracy that you should have it,
[610] you should be allowed, and it should be something you should show your friends. Like, dude, I need you to go look at this. This has 37,000 thumbs up, and they really believe that the fucking earth is flat. They really believe there's an ice wall outside Antarctica. They really believe that the sky doesn't move, that it's, that the, you know, that we're in some sort of a, I think it's like projected images or something. Like, there's a bunch of, like, really, really wacky theories. Like, I think those are okay.
[611] Yeah, of course.
[612] But I think freedom of information sort of transcends a lot of these little debates.
[613] So if there was more freedom of information, so we actually knew everything the government knew about all of the different conspiracies and black projects, the black budget, more information is going to give both sides the ability to understand what is happening.
[614] That's true.
[615] The reality is that we don't know what's happening.
[616] And there is lots of secret stuff.
[617] The problem with that, though, is that then you're dealing with foreign governments that are way better at keeping secrets than we are, and they have access to our secrets.
[618] Like, one of the things that has been kind of disturbing is seeing the actual influence these Russian troll farms have had on not just our political process, but sowing seeds of dissent amongst people and starting conflict amongst people, and how people are buying into it.
[619] You know, like this podcast I've been talking about a lot, with Sam Harris and Renee DiResta, that's her name, right? Where they talked about how these Russian troll farms set up a conflict by having a pro-Muslim rally across the street from a pro-Texas-pride rally.
[620] And they just set it all up and had it there.
[621] And then a skirmish broke out because these people were across the street from each other.
[622] And that they do this with, they were having these African-American groups that were saying anyone but Hillary, and they were really trying
[623] to get people to vote for Jill Stein, really trying to get people to even consider Trump, but anyone but Hillary.
[624] And then they were also having like ones that were against them.
[625] They're trying to like make debate.
[626] They're trying to make anger.
[627] I don't think you can stop that.
[628] But it's a fascinating thing, isn't it?
[629] That this is like a concerted effort?
[630] Yeah.
[631] Like how do you feel about that?
[632] When you are, you know, you're in a position where you have a fairly small network, but it's influential, right?
[633] And then so you're watching Zuckerberg
[634] and the Facebook shit on TV, and they're talking to these congresspeople and senators, and they're talking to all these politicians about what's going on and how to stop it and what they're trying to do, and, and you feel like, oh God, like this is kind of a, this is an arena that I'm getting into.
[635] What would you do?
[636] I mean, I think more conversation needs to happen, not less.
[637] Yeah.
[638] So I think you're right.
[639] I just, I want more information.
[640] I want from the government, from the corporations.
[641] From the trolls?
[642] From the trolls.
[643] I mean, you know, I feel like I have a pretty good ability to discern what is and is not troll behavior.
[644] I think help people understand how to absorb information.
[645] Like, just banning an account that is trying, that has an agenda? Everyone has an agenda.
[646] It's a propaganda back and forth between everybody.
[647] I don't, I mean, just because somebody posts a Jill Stein meme, okay, what's your point?
[648] Like, their intention, okay, I'm not saying that, you know, regime change behavior is positive or negative.
[649] I don't know how we, we sort of switch gears.
[650] No, we did, but let me step in here.
[651] When you say in a Jill Stein meme, there's absolutely nothing wrong with you posting a Jill Stein meme.
[652] Like, say, if you have a joke about Jill Stein, you wanted to post it in a meme.
[653] There's nothing wrong with that.
[654] What's weird for people is that people are being hired to make these memes.
[655] And these memes may not have anything to do with their own personal ideology.
[656] They might just decide, hey, I'm going to collect this check.
[657] And they make, apparently, according to Renee in this podcast she did with Sam Harris, really hilarious memes.
[658] Like, some of them are really funny.
[659] I listened to that podcast, yeah.
[660] It was great, right?
[661] She said that she started laughing at times.
[662] Yeah, yeah.
[663] Yeah.
[664] And she would, you know, she had to go through thousands and thousands of them.
[665] That's weird, right?
[666] There's this idea of a web of trust, which is interesting, sort of like a peer-to-peer.
[667] It's not like a Chinese social credit score, but if the people that you're connected with show a certain account to be untrustworthy, then, you know, because you trust your little network.
[668] So it's sort of like a peer-to-peer score.
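The web-of-trust idea sketched here can be illustrated with a toy score: your view of an account's trustworthiness is averaged from the ratings given by the accounts you already trust. This is a purely hypothetical sketch with invented names and numbers, not Minds' actual algorithm.

```python
# Toy "web of trust" peer-to-peer score: my view of an account is the
# average rating from the peers I follow. Hypothetical, for illustration.
from statistics import mean
from typing import Optional

def peer_trust_score(me: str,
                     follows: dict,
                     ratings: dict,
                     target: str) -> Optional[float]:
    """Average `target`'s rating across the accounts `me` follows."""
    votes = [ratings[peer][target]
             for peer in follows.get(me, set())
             if target in ratings.get(peer, {})]
    return mean(votes) if votes else None  # None = no signal from my network

follows = {"alice": {"bob", "carol"}}                            # who trusts whom
ratings = {"bob": {"troll42": 0.1}, "carol": {"troll42": 0.3}}   # 0 = untrustworthy
print(round(peer_trust_score("alice", follows, ratings, "troll42"), 2))  # 0.2
```

Because the score is computed from each user's own trust graph rather than a single global ledger, two users can legitimately see different scores for the same account, which is what distinguishes this from a centralized social credit score.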
[669] We're looking at different ideas.
[670] I think that transparency and understanding what's going on with different accounts and if it's the real person, that's all important stuff.
[671] We don't want frauds.
[672] We don't want disinfo.
[673] But, you know, we just have to really step back and think about how we're doing it rather than letting AI and algorithms run the show.
[674] Right.
[675] I see what you're saying.
[676] Um, do you think that there's a, I don't want to say there's a market.
[677] Is there a demand for this?
[678] Like are a lot of people responding in a positive way to the way you guys are approaching the game?
[679] Yeah, for sure.
[680] Every time there's a big scandal, every time, whether it's, you know, data manipulation or, you know, our first big growth spurt was during the Snowden days when he released all the information.
[681] Right.
[682] People are really upset with what's happening.
[683] It's just, you know, what are they supposed to do?
[684] Like, this is what they're using for their communication.
[685] It's not easy to just achieve, you know, a multi -billion person network overnight so that everybody's there.
[686] And so we're stuck.
[687] But again, I think that supplementing, just installing these alternative apps, not just us.
[688] Like the whole open source market, I'm not even here trying to just talk about what we're doing.
[689] Right.
[690] It's like, if you don't have those apps on your phone and you don't use those browsers,
[691] I'm sorry,
[692] you're just not helping.
[693] And people just want to vote with their energy, I think, and vote with their time.
[694] So it's more of an education thing.
[695] People just don't know that this matters and that this can help change the whole internet simply by logging into an app once in a while.
[696] It's like, it's like organic food.
[697] I mean, we want to put things into it.
[698] We want to support things that have integrity.
[699] So when you click something, you are supporting that thing.
[700] When you're sitting on an app all day, you are feeding that app.
[701] That's how the apps get all the money.
[702] That's where they get all their funding.
[703] That's where it's all based is in user retention and energy.
[704] And do you think that most people are even aware of this?
[705] Do you think they're just using it because it's convenient?
[706] That's the biggest charade going on right now.
[707] Most people don't know that Facebook owns Instagram.
[708] They think Instagram is like, cool because, you know, it's not Facebook.
[709] Right.
[710] It's one giant umbrella.
[711] Yeah.
[712] What is the difference between the two?
[713] Obviously, with Instagram, it's just pictures mostly.
[714] And then whatever the post is, below the pictures.
[715] But with Facebook, it's a lot more commentary and long, verbose statements on shit.
[716] And then people arguing in the comments about it.
[717] Yeah.
[718] All the Instagram founders abandoned ship, the WhatsApp founders abandoned ship, the Oculus founder abandoned ship, all because of the privacy stuff.
[719] They're like, you took this good thing. Well, it was proprietary, so I would question whether it was ever actually a fully good thing.
[720] But at least it wasn't completely corrupted by Facebook, but all of the founders of those companies left because they hate what's going on.
[721] The WhatsApp guy joined Signal, which is a really cool
[722] open source, end-to-end encrypted messaging app.
[723] So, you know, these people know...
[724] So what happened with WhatsApp?
[725] It's not the same anymore?
[726] WhatsApp is owned by Facebook.
[727] I don't know this.
[728] Oh, yeah, yeah, sorry.
[729] Come on, bro.
[730] Get in the program.
[731] Dude, get in the jungle, man. Well, you're deep into this, man. That's why I want to talk to you about it.
[732] Yeah, it's...
[733] And they're all buying up companies and using these same sort of ideas.
[734] Yeah, now they're talking about integrating the messages between WhatsApp, Instagram, and Facebook, so it's all one system.
[735] Oh yeah, centralization. It hasn't been turned on yet, but I believe it's happening. Like your DMs on Instagram, you're going to have to download Facebook Messenger. So what are the challenges for something like Minds when you're trying to take off? Like, there was a social media, Instagram-type thing that was around a little while ago. Remember, I used it like once and I posted about it. What was that called? Vero?
[736] Is that what it was?
[737] But then a lot of people were saying it was bullshit, and it's proprietary.
[738] You don't know, it's closed source.
[739] No idea of what's going on behind the scenes.
[740] Same as the other ones.
[741] A lot of apps try to say that they're alternatives, that they support privacy or free speech or whatnot, but I don't think it's any new paradigm if they're not showing their source code so that people can see the algorithms. Obviously most people aren't going to go and inspect the code, but the principle is that the experts could, because they will. There's all kinds of think tanks and whatnot that would love to dive into the source code to understand how these companies were actually behaving. So you can't just wave the privacy flag without being open source. This is getting a little bit into the weeds, but a lot of this comes down to licensing of content or code.
[742] So the license that we use for our code is the GNU Affero General Public License, AGPLv3, which means that anyone can take our code and do whatever they want with it.
[743] They can sell it.
[744] They can do anything.
[745] But if they make changes, they have to share them with everybody else.
[746] So it's sort of like the Creative Commons ShareAlike license, which essentially says the same thing:
[747] take my video or photo, remix it, do whatever you want, but you have to share the result with everybody else. Open source basically means you can do whatever you want with it: you can take it, make it your own, keep your own little secret sauce if it makes you feel good. They get conflated because free software sounds like free as in free beer, not free as in freedom. So licensing is really what this all coalesces into.
[748] But it's been proven that you can make a lot of money with free and open source software.
[749] I mean, look at WordPress.
[750] It's a hugely successful technology company, worth billions of dollars.
[751] People share the code.
[752] It created a network effect because they did that.
[753] It's like the Grateful Dead would let everybody record their music and that's how it spread.
[754] So it's actually a good marketing tactic.
[755] And it also gives transparency so people can see what the hell is going on.
[756] Right.
[757] Now, when you started this, what was your objective?
[758] And were you thinking about it as a potential large scale source of revenue?
[759] Or were you just thinking this is something that I would like to do and do correctly because I don't think anybody's doing it this way, open source, you know, pro -censorship or pro -freedom of speech, anti -censorship.
[760] and to, you know, to just do the bare minimum amount of managing content.
[761] I think everyone should be able to make money.
[762] I don't think it should have to be mutually exclusive, like you do something for free for everybody and you also can't make money.
[763] That's like a big misconception.
[764] We're trying to give people the tools to make money.
[765] I mean, we have like a monthly recurring subscription system, sort of like a crypto Patreon-type tool,
[766] so you can subscribe to people.
[767] We had the ability for creators to accept fiat dollars, but we took it out, because it's Stripe.
[768] And Stripe is a closed source system, which we just didn't have long -term faith in.
[769] So Stripe was some sort of an extension to your site?
[770] Yeah, we were using their API to facilitate peer-to-peer payments.
[771] This is why Patreon most likely banned Carl.
[772] Carl Benjamin, Sargon of Akkad.
[773] Sargon, yeah, because the payment processors went to them, are like, look, you know, Stripe has very strict terms.
[774] And we didn't want to be, you know, talking to, like, we don't want to be subject to overlords.
[775] Right.
[776] In our company decisions.
[777] Of course.
[778] So, you know.
[779] Do you think that's what happened with Carl?
[780] That is most likely what happened.
[781] So they stepped in and said, hey, we don't want this guy to be a part of the site.
[782] Yeah.
[783] I think that Stripe probably stepped in with a lot of explicit content, controversial content.
[784] I mean, it's in their policy that you can't facilitate payments dealing with that type of content.
[785] And now we're seeing banks actually go after people.
[786] Well, the thing about Carl, though, was that his content that was questionable wasn't even related to Patreon.
[787] It had nothing to do with it.
[788] It was on another person's podcast.
[789] It was from six months prior, and that other person's podcast was on YouTube.
[790] It had nothing to do with Patreon.
[791] And they had specifically said that they were not going to
[792] act on content that was outside of their network; they were only going to react to things that were on Patreon.
[793] Right.
[794] Because you can make little blogs and stuff on Patreon, right?
[795] They do have some content.
[796] Yeah.
[797] So, but that's not what Patreon said.
[798] Obviously, the processors, I don't want to, who knows?
[799] Yeah.
[800] I don't want to act like I know.
[801] I also don't want it to seem like I have an ideology that I'm trying to push right now.
[802] Like, I'm very open to moving in the direction that makes the most sense.
[803] for the community.
[804] I'm not attached to what I'm thinking.
[805] Good.
[806] I like that.
[807] I wish more people would do that.
[808] I mean, I try to do that.
[809] I'm really getting way better at it.
[810] But that's something I actively work on.
[811] Like these ideas that I have, I'm not fucking married to them.
[812] Don't argue them.
[813] Look at them.
[814] If someone says something different, go, huh, all right. Don't go, no, man, that ain't right, bro. Because that natural instinct to argue, and to claim some sort of personal identity with your ideas, that's part of the problem that we have. Yeah, I think it's a main conflict issue with social media. One of the things that I see a lot of, look, I used to do it way back in the day. If you go back to the early days of Twitter, I used to argue with people. I used to argue with people on social media. Then I realized a long time ago, there is no good that comes out of that.
[815] I might correct someone if someone said something that's incorrect, but I'm not going to argue and I'm not going to insult.
[816] I'm just not just not.
[817] It doesn't even work.
[818] It doesn't work.
[819] It just makes people argue back and insult you back and nothing ever gets accomplished.
[820] Occasionally you dunk on people and it's fun.
[821] But in reality, especially me, I mean, I kind of dunk on people for a living.
[822] So I'm just going to, I'm not going to engage.
[823] And I don't, this is going to sound corny as fuck.
[824] I don't want to hurt anybody's feelings.
[825] I really don't.
[826] I don't want to be in some argument where someone is looking at their phone like, fucking fuck you, fuck you.
[827] I don't want that.
[828] I don't want that.
[829] I get that.
[830] I know what it is.
[831] I know what it is.
[832] But it's in this flat medium, okay?
[833] This two -dimensional medium of typing text and then sending text and you type text and send text, the conflict that arises through that is never beneficial, in my opinion.
[834] I don't get anything out of it. So, you know, if I'm expressing something, almost always I try to express something about shit I like. Oh, I love this new show. Oh, this movie was great. Oh, this is amazing, check out this picture. You might want to check this out. It's tone. Yeah, honestly, same with me. I was much more trying to convince people about what I thought was right. You know, coming out of college you think you're all high and mighty. It doesn't work. No, people are allergic to it. I'm allergic to it, I cannot handle it. No one wants to talk like that. It's one thing if you're having a good time and trying to just show someone up. You can have fun with it, it's more comedic. But when you're actually taking yourself seriously, it's not going to work. No, it's not going to work, and it actually has the exact opposite effect. That's like the expression, how does it go, jealousy is like a poison that you take yourself, because you don't like what someone else is accomplishing. I forget it. Terrible job paraphrasing, my worst paraphrasing of all time, mumble-mouth motherfucker that I am. But the idea is that it has the exact opposite effect. If you're jealous about someone, it actually makes you feel bad instead of them feel bad. It also makes them not want to hang out with you. Well, they probably don't want to hang out with you anyway, let's be honest. But what you're doing by going back and forth, and I know people who do engage in it, and sometimes they have these anxiety moments where they don't sleep for days, because they're involved in these Twitter feuds. I mean, I know people that have done this, where they've gotten involved in Twitter feuds and they'll wake up at three o'clock in the morning and check their Twitter feed. And like, oh Christ, man, you got to go on a yoga retreat or something. You can't do this, you can't live your life like this.
I think there may be some value to the debate, it should be there. And it's like, okay, I'm not going to spend my time doing it that way. Some people want to spend their time doing it that way. Yeah. And if there's cool mechanisms for, you know, the most voted content to be seen, that's interesting to check out sometimes, to look at feedback. But it's not nearly as effective a way of communicating your ideas as making something more personal. Right. I mean, even video is more effective than that, because people actually have a chance to look at you. Or obviously in person. Yeah, even better. Well, in person is obviously the best. And I think my concern really about the future is, I'm holding back a sneeze right now, sorry, trying to keep it together. Do it. I don't think I can, it's one of those borderline ones. What are you supposed to do, you're supposed to stare at the light? Are you trying to resist? No, I'm trying to, okay, we're good, we're out of the woods. I lost my train of thought. What was it, what were we just saying? AI something? No, the jealousy quote. I know, we've been asleep for days. I was looking for it. No, I lost it, I lost it in my holding back a sneeze. Oh, that's what I was worried about. AI.
[836] That's what I'm really worried about, not artificial, but augmented.
[837] And my concern is that what we're experiencing right now in this flat form of two -dimensional text is something that is very overwhelming to a lot of people's time.
[838] I mean, you're looking at some kids that are online, social media, eight, ten hours a day, just staring at their phones.
[839] I'm extremely concerned, and I have some jokes about it in my act, about the next wave.
[840] because I think that we're overwhelmed by this incredibly attractive medium where we're attracted to our phones, we're attracted to this style of engaging in information, receiving information, and passing information, and then online arguments and debates and looking at pictures and this constant stream, which, you know, just looking at your phone, it's not that thrilling.
[841] It's just like, hmm, it's not that thrilling.
[842] It's like, okay, yeah, but it's still getting you all day long.
[843] Like, there's nothing really crazy happening.
[844] What my concern is if something really crazy does start to happen, when you really can have experiences that are hypernormal, like, that are more powerful than anything you can experience in this regular carbon -based physical touch and feel world.
[845] And once we start experiencing augmented reality, the integration between humans and technology, and then the ability to share augmented reality. Like, if you were at work and you have these fucking goggles on, and your girlfriend is at work on the other side of town, and you guys both have these similar video pets that are with you, dancing around and providing you with fucking advertisements and giving you things. There's next levels to this stuff. I'm trying to see the future, but I'm too fucking stupid and I don't really know anything about technology. But I know that they're going to get deeper into our lives.
[846] I know that these technologies, not they like the government, but these technologies, they're going to get deeper into your life and that they got you by the balls and the clit with a fucking phone.
[847] And it doesn't even do much.
[848] Take some pictures, look at some pictures, look at some text, watch some videos.
[849] That's all it does.
[850] And access to most human knowledge.
[851] That's true.
[852] But how many people are using that?
[853] Right.
[854] Well, you know, they are for sure.
[855] They definitely are.
[856] There's a lot of Googling going on.
[857] I'm sorry.
[858] What is the other one?
[859] Duck, Duck, Go.
[860] Duck Duck Go-ing going on.
[861] I'm holding out for another one.
[862] We might start working on search more.
[863] Bing is a goddamn ghost town, isn't it?
[864] I bet if you go to Bing, you got to blow fucking dust off your keyboard as soon as you open it up.
[865] Like, no one's in there.
[866] Who's in Bing?
[867] Bing is just Microsoft's.
[868] I know.
[869] But who's using that?
[870] Oh, ladies?
[871] I think that YouTube is the number two search engine.
[872] on the web.
[873] YouTube.
[874] That's like people can make so many videos about so many weird topics and just like, it'll just pop up and you get.
[875] I don't think we can stop it.
[876] Oh, but, you know, because look, it would be fun, with the frequency that you go to like an arcade, like I would go and do some crazy AR, VR stuff.
[877] I mean, it would be fun as like a rare entertainment thing to do.
[878] I just want to make sure that, Like, see, even with the robots that we're carrying around now, do, is, is it respecting my freedom?
[879] Is this thing on my side?
[880] It's not.
[881] I don't think it is right now, because I'm using Android, which is open source.
[882] Are you an Android guy?
[883] Of course you are.
[884] But all these crypto guys, they're all Android people.
[885] You have to, it's, it's just more freedom.
[886] But now, Google's version of Android is just as bad as iOS.
[887] So whose version of Android do you use?
[888] I am, yeah, I'm not perfect, man. No, I'm not perfect. I'm on the Google Android right now. But there's a version called Replicant, which is a fully free version of Android, that I'm probably going to switch to, because I just cracked my phone like a day ago, so I might have to get a new one. What do you use? What phone do you use?
[889] S8. Oh, look at you. Yeah, kind of retro. Is it?
[890] I don't know, I got it a year ago. There's this one called the Blackphone, which I'm looking into. It's like a hyper-encrypted phone, but I don't know if it's fully free.
[891] Wasn't there a blockchain -based phone that they were coming out with, an Ethereum -based phone, isn't that?
[892] I don't know.
[893] Wasn't that something, Jamie?
[894] Sure, it definitely got announced, but I don't know that it's still in development.
[895] You can't run everything on a blockchain.
[896] No?
[897] Blockchains are pretty slow.
[898] We use it.
[899] Even to publish to the Ethereum blockchain, like when you send each other payments on Minds, it costs, you know, there's a gas fee.
[900] So the way that the network is powered is that the miners get paid with gas, with a little bit of ether. So it costs like a buck to do a post. Like, there's fully decentralized social networks. There's one called Peepeth, which you have to pay for everything you do right on it. And so this is why it's a cool experiment, but it's really not scalable. So, you know, it's gonna be a combination of decentralized technology, not just blockchain. People like to say the blockchain is going to solve all the problems, and it's going to solve a lot of problems.
[901] It's an incredible tool.
[902] But, yeah.
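The gas-fee arithmetic Bill describes, where a single on-chain post can cost about a dollar, works roughly like this: the fee is the gas a transaction consumes times the price per gas unit (quoted in gwei, billionths of an ether), converted at the ether price. The numbers below are illustrative assumptions, not Minds' or Ethereum's actual figures at any given time.

```python
# Rough sketch of Ethereum transaction cost: fee = gas used * gas price.
# All numbers are illustrative assumptions, not real network figures.

GWEI_PER_ETH = 1_000_000_000  # 1 ETH = 10^9 gwei

def tx_cost_usd(gas_used: int, gas_price_gwei: float, eth_price_usd: float) -> float:
    """Fee in USD for a transaction, given gas consumed and prices."""
    fee_eth = gas_used * gas_price_gwei / GWEI_PER_ETH
    return fee_eth * eth_price_usd

# A contract call that writes a post on-chain might burn ~100k gas;
# at 20 gwei and $500/ETH that already reaches about a dollar.
cost = tx_cost_usd(gas_used=100_000, gas_price_gwei=20, eth_price_usd=500)
print(round(cost, 2))  # 1.0
```

Because every write pays this fee, a network where each post, like, or follow is an on-chain transaction gets expensive fast, which is the scalability problem being described.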
[903] I'm, what is this, Jamie?
[904] Is this it?
[905] The Finney?
[906] It's one that's out now.
[907] Yeah, just went on sale like a month ago or something.
[908] It's pretty.
[909] Sirin OS, which I'm not exactly sure about.
[910] What is that, Jazz?
[911] Good luck getting a fucking app with that.
[912] And some of those Android apps, they're sneaky, right?
[913] Don't they steal Bitcoin?
[914] There was an Android app that got in trouble for stealing
[915] cryptocurrency. It was stealing it in the background while you had your app open, and it was on the Google Play Store. See if that's true; I might have made something up. I could get sued. I don't think I did, though. I think it might have been mining, and I think it was stealing. Oh yes, it steals. Yeah, pull that up so we can see it. I have an Android phone as well. I have a Note 9. I really like it. It's giant, huge screen, great battery life, beautiful. So you use both? Yes. Bitcoin scam warning over fake Android app that steals cryptocurrency from your phone. Yeah, I use both. Android's very good now, it's very good. I was an early adopter, and it was like clunky and shitty, and then I would go to my iPhone and I was like, oh my God, this is so much better. What iPhone is great with is integration, with like Apple TV, integration with a laptop. But I also have a Windows laptop that I use a lot. I really like, I have a Lenovo ThinkPad. As for writing, the keyboard's better. In fact, I actually bought an older MacBook just for the keyboard, because as a writer, you want tactile feedback as you're writing.
[916] It just helps.
[917] It makes it easier for you to recognize where the keys are.
[918] And Apple has decided to go so far towards design and just for aesthetic beauty that they've ruined the tactile feedback.
[919] Do you remember that though when the old smartphones, they still had the keyboard?
[920] I thought I would never leave that because it was tactile, but then I ultimately left.
[921] That's true, but that's a different experience.
[922] That's just thumbs.
[923] I can do that with my thumbs, and I kind of know where everything is, and I'm not writing a novel.
[924] You know, when I'm writing material or essays or something like that, I need a fucking keyboard.
[925] You don't think that the holographic screen that's just here, you don't think if it just like autocorrects everything you do and you could just like...
[926] Maybe, but there's a...
[927] I like the tactile, too.
[928] Yeah, there's a feeling.
[929] I like mechanical keyboards.
[930] there's a feeling of knowing. Did you test out that FaceTime bug?
[931] Did you hear about that?
[932] FaceTime bug?
[933] Yeah, there was a FaceTime bug where What was it?
[934] You didn't see that?
[935] I heard about it, but I didn't look into it at all.
[936] Yeah, we didn't test it out and play it with this.
[937] Basically, you could call someone and hear them without them picking up.
[938] Oh!
[939] Yeah.
[940] Without them picking up.
[941] Without them picking up, yeah.
[942] So the thing is ringing on FaceTime and they don't even have to pick up and then you're on the other end talking shit about them.
[943] Yeah, you can basically surveil anybody.
[944] Fuck Bill and fuck minds.
[945] That guy's full of shit.
[946] As soon as the big companies come to him, he's going to stick his ass in the air, just like all of them.
[947] It said the camera could be turned on too.
[948] Makes sense.
[949] I was thinking about that when I'm beating off.
[950] Don't you?
[951] You should.
[952] Apple acts like it cares about privacy; maybe it doesn't turn certain things over to the FBI.
[953] I don't know exactly what's going on, but we don't know what the Apple phones are doing.
[954] Right. Apple is all locked down, closed source. And additionally, there was a creepy speech that Tim Cook just gave. Creepy? Did you see it? No. Let's listen to it. Yeah, let's listen to it. Should we play spooky music? The ADL speech. Do you have any spooky music you could play in the background while he's doing the speech? We might get in trouble for that. So again, it's good intentions, like people who want less hate speech. We all want less hate speech, really. Of course.
[955] We want people to get along better.
[956] Yeah.
[957] But this idea, I don't want to give too much away, but he's acting as if they are going to be the moral authority about the types of content that can exist on the App Store.
[958] Yeah.
[959] So I just don't know how that's scalable.
[960] Yeah, what does that mean?
[961] Like, let's hear what you said.
[962] I was just hoping this is the right one.
[963] Is this it?
[964] It says CEO Tim Cook: banning hate,
[965] division is the right thing to do, 12-3-2018.
[966] Is that it?
[967] December?
[968] That's it.
[969] Okay, let's hear it.
[970] Hello, Tim.
[971] Volume, please.
[972] Our devices connect us to the humanity that makes us, us.
[973] We do that in many ways.
[974] One of the most important is how we honor a teaching that can be found in Judaism, but is shared across all faiths and traditions.
[975] It's a lesson that was carried forward by the late Elie Wiesel.
[976] May his memory be a blessing.
[977] It's a lesson put into practice by America's Muslim community who raised thousands for the victims of the tree of life killings.
[978] Lo ta'amod al dam re'echa.
[979] Do not be indifferent to the bloodshed of your fellow man. do not be indifferent.
[980] This mandate moves us to speak up for immigrants and for those who seek opportunity in the United States.
[981] We do it not only because their individual dignity, creativity, and ingenuity have the power to make this country an even better place, but because our own humanity commands us to welcome those who need welcome.
[982] It moves us to speak up for the LGBTQ community, for those whose differences can make them a target for violence and scorn.
[983] We do so not only because these unique and uncommon perspectives can open our eyes to new ways of thinking, but because our own dignity moves us to see the dignity in others.
[984] Perhaps most importantly, it drives us not to be bystanders as hate tries to make its headquarters in the digital world.
[985] At Apple, we believe that technology needs to have a clear point of view on this challenge.
[986] There is no time to get tied up in knots.
[987] That's why we only have one message for those who seek to push hate, division, and violence.
[988] You have no place on our platforms. You have no home here. From the earliest days of iTunes to Apple Music today, we have always prohibited music with a message of white supremacy. Hold on a second. What do you think they're signaling here? Like, are they signaling that they're about to start censoring things? They already are. Okay, I agree you probably shouldn't put white supremacy music on, but there's a lot of really violent stuff that you can get on iTunes, right? I mean, if you go back to the old NWA albums, there's some that's available, right? Sure, I'm assuming. Yeah, I don't know that it is for sure, but like Straight Outta Compton, that's some violent shit. And then how about the films that they have, the films that you can get on the iTunes Store? There's a lot of very, very violent films, like extremely violent. Is it that they're making the distinction between something that's fiction, that although it may be disturbing, you understand this is a movie, this is something someone wrote, art, versus someone with commentary? And then here's the other thing: when he was saying hate and division, they won't promote division, but that's a weird one.
[989] Yeah, that means...
[990] Like, what does that mean?
[991] People who disagree with you.
[992] He has good intentions.
[993] You can sort of feel it.
[994] That's the problem with this.
[995] Right.
[996] That it's...
[997] He's not allowing the conversation to take place.
[998] So this is in direct conflict with the Daryl Davises, with confronting these issues.
[999] Right.
[1000] So...
[1001] But we can kill it.
[1002] But I think when he's saying, you have no place.
[1003] on our platform, they probably feel like you can go somewhere else.
[1004] He's building a wall.
[1005] Yeah.
[1006] I mean, but this is what I'm saying.
[1007] Like, everybody kind of feels like you can go somewhere else.
[1008] That's what happens, though, and that's how things get more radicalized.
[1009] And everybody goes to gab.
[1010] So, I don't know.
[1011] Look, the conversation needs to take place.
[1012] People on the left.
[1013] He's acting like he's speaking for all LGBTQ people.
[1014] Yeah.
[1015] He's not.
[1016] There's lots of people on the left.
[1017] and LGBTQ people aren't always on the left.
[1018] And not all of them want that.
[1019] Well, not only that.
[1020] There's division in LGBTQ. Like, there's a big issue right now with Martina Navratilova, about her discussing the reality of trans women competing against biological women, and that she opposes it.
[1021] And she thinks there's some fundamental advantages, which is leading to a lot of weightlifting world records being
[1022] broken by trans women, and she's like, this is fucking preposterous, including trans women with penises. Now they're attacking her for being transphobic. So there's not even a united opinion in the LGBTQ community. For sure. And that's why, that Meghan Murphy, I think. Yes. I go to this restaurant in Bridgeport, Connecticut called Bloodroot, which is like sort of an old-school feminist, vegetarian-vegan spot. In Bridgeport? In Bridgeport, really? Yeah. Yeah, yeah, yeah.
[1023] Bridgeport, it's kind of, no offense.
[1024] I know.
[1025] It's kind of a dump.
[1026] It's pretty wild place, yeah.
[1027] We would have, uh, Gathering of the Vibes, the music festival.
[1028] It's cool.
[1029] I helped to organize that.
[1030] I used to do stand -up in Bridgeport.
[1031] This is a place called the Joker's Wild.
[1032] Hmm.
[1033] It was a comedy club.
[1034] So, they...
[1035] I saw the owner beat a guy with a shoe there.
[1036] Yeah, beat a guy in the face with a shoe. Pulled his shoe off and smacked him in the face.
[1037] You didn't intervene?
[1038] I was 24.
[1039] I didn't know what the fuck was going on.
[1040] So, anyway, though, that restaurant, you know, they get called,
[1041] what is it, TERFs? TERFs, yeah. Trans-exclusionary radical feminists, yeah.
[1042] And so again, you know, they're the old school ones and they're saying, look, you know, we're not against your battle.
[1043] Right.
[1044] You know, we're not against trans, right?
[1045] Who would be against trans?
[1046] Yeah.
[1047] But they're just saying that's not our thing.
[1048] So again, there's diversity and they're trying to clump everyone together in the whole intersectional world.
[1049] And look, people want to band together.
[1050] The oppressed groups want to band together. They should. Yes, but, like, it's not that simple. Well, there's always going to be differing opinions, and especially when you have something like, you know, trans women competing against biological women. And, you know, you have someone like Martina Navratilova that made her life's work and her career competing as a biological woman. Um, she's going to have some opposition to that. And then the idea that everyone's supposed to be lumped in together with some mandate that no one has really openly discussed, and you're supposed to agree, and it fluctuates and moves like the tide. You know, like, what is and isn't, it moves like the tide. It just changes. It's like this court of public opinion. It's constantly rendering new verdicts, and you have to keep up and catch up. Things that were acceptable just a few years ago are totally unacceptable. I mean, comedy is the key area, too. Yeah. It is not... What's happening on social media now is not sustainable for comedy. Because it's fine. Oh, really? Yeah, it is. It is. How? Because it creates outrage, and then comedy relieves that pressure. Like, believe me, there's a lot of blowback, and believe me, there's a lot of debate and discussion. But also believe me, when someone does do some politically incorrect, really good stand-up, people go fucking bonkers. They love it. It's like one of the best times ever right now to do stand-up. People go fucking apeshit. Oh, yeah. No, it's incredible material. But I'm just saying, for, you know, comics that are running into issues with getting banned or whatnot. I mean... Well, who's running into issues with getting banned? I mean, I think, you know, one, Owen. Yeah, yeah, Owen's had some issues. Yeah. Um, and, you know, you can make some arguments that Owen's not doing so well right now, um, but he's also developing his following because of the fact that there's people that don't agree with him being banned. He's a very specific example. Um, other people that are being banned, do you know of other... Who? What other stand-up comedians? You've had a bunch of comics on. Maybe they haven't been fully banned from social media, but they've had their performances, uh, shut down. Who was that one guy? Who was that? Oh, Nimesh. Nimesh. But that was at a college. See, but it's the same thing.
[1051] Yes.
[1052] But universities have been bad for that for a long time.
[1053] They're the most sensitive of all audiences, and they're the ones who believe the most that they're going to change the world, that their ideals are rock solid, and that they have to push back against anything that opposes them.
[1054] Ari was temporarily banned for...
[1055] That was an accident.
[1056] Yeah.
[1057] The Ari thing was he was joking around and they thought he was making a legitimate death threat.
[1058] He's joking around with a good friend of ours.
[1059] The algorithms and the moderators are just not.
[1060] We can't just be having this happen all the time.
[1061] And then they just keep saying, oh, sorry.
[1062] Oh, sorry.
[1063] There has to be a new approach completely.
[1064] It can't just be, oh, let them back on and just keep doing what they're doing.
[1065] Like, we need to completely re-approach how moderation is happening.
[1066] the whole policy situation, the transparency situation.
[1067] It's not just a matter of going to the overlords and saying, can I please come back?
[1068] Right.
[1069] That's not suitable for the communication structure of the planet Earth.
[1070] Well, I think what's not suitable is that commerce should not dictate how human beings are allowed to openly communicate with each other.
[1071] And one of the things that Jack said that's kind of contrary to his company's actions was that he believes the ability to communicate is a fundamental right, like the ability to get electricity.
[1072] Like, if you're in the KKK, you can still order electricity.
[1073] So should you be able to just distribute information?
[1074] If people say no, then you have to say, okay, well, who's to decide what can and cannot be distributed, and then who's to decide if they can go somewhere else?
[1075] And then what happens if you tell a person they can't go anywhere?
[1076] Then things get really weird.
[1077] We're looking at more of a community moderation structure.
[1078] so that we've even been considering like a juror system so that if we make a bad decision and someone appeals it, then the community can potentially make the decisions as opposed to us.
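The juror-style appeal flow being described, drawing a random jury from the community and letting a majority vote override the platform's call, could be sketched like this. The function name, jury size, and majority threshold are made-up illustrations, not Minds' actual design:

```python
import random

# Hypothetical sketch of community-jury moderation: an appealed decision
# goes to a randomly drawn jury, and a simple majority reinstates the post.
# Jury size and threshold are assumptions, not any real platform's parameters.
def decide_appeal(post, community, jury_size=12, seed=None):
    rng = random.Random(seed)
    jury = rng.sample(community, jury_size)      # random jurors, no repeats
    votes_to_reinstate = sum(1 for juror in jury if juror(post))
    return votes_to_reinstate > jury_size // 2   # majority wins
```

The point of the random draw is that a bad call by the site only stands if a majority of randomly chosen users, not the company, agrees with it.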
[1079] Or, you know, but then when you go far enough into the decentralization world, it just becomes impossible.
[1080] So we sort of have to decide.
[1081] I think that's where it's going to go.
[1082] So I don't know.
[1083] That's the uncensorable internet. And this idea that we can do things and then just delete them, like in the GDPR, the European privacy laws, they have this whole idea of the right to be forgotten online. Which is very difficult, because deleting things from any database, especially a blockchain, is not easy. So the idea that you can go on the internet, do crazy shit, and then just have it taken away, it's a paradox, because privacy means control, but, you know, it doesn't jibe with the way that technology works to just be able to delete things, like you're writing to a database. That's not even how the universe probably really works. Like, you can't just say, oh, I just went and punched that guy in the face in the bar, and I just want to delete that from having happened. Right. Yeah. Yeah.
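The point about blockchains being hard to delete from can be made concrete with a toy append-only log: each entry commits to the hash of the previous one, so removing an old entry invalidates everything after it. This is a generic illustration of hash chaining, not any specific blockchain's format:

```python
import hashlib

# Toy append-only chain: each entry stores the previous entry's hash,
# so history can be verified but not quietly edited or deleted.
def add_entry(chain, data):
    prev = chain[-1]["hash"] if chain else "0" * 64
    digest = hashlib.sha256((prev + data).encode()).hexdigest()
    chain.append({"data": data, "prev": prev, "hash": digest})

def is_valid(chain):
    prev = "0" * 64
    for entry in chain:
        expected = hashlib.sha256((prev + entry["data"]).encode()).hexdigest()
        if entry["prev"] != prev or entry["hash"] != expected:
            return False
        prev = entry["hash"]
    return True

chain = []
for event in ["posted photo", "punched guy in bar", "posted apology"]:
    add_entry(chain, event)

assert is_valid(chain)       # intact history verifies
del chain[1]                 # try to "be forgotten"
assert not is_valid(chain)   # every later link now fails to verify
```

On a real blockchain the copies are also replicated across many machines, which makes the "right to be forgotten" even harder than this single-copy sketch suggests.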
[1084] And again, I think that what we're dealing with now is like you have to interface with it, right?
[1085] You have to interface with your computer.
[1086] You have to interface with your phone to access all this stuff.
[1087] My real concern is that that's just a temporary step and that we're going to just consistently and constantly be interfaced with all of each other.
[1088] You know, Elon brought something up when he was on the podcast called Neural Link.
[1089] And he didn't want to fully describe it, because he said he couldn't.
[1090] But he said it's going to be live within, you know, a matter of X amount of months, and he was talking about it increasing the bandwidth between human beings and information in a radical way that's going to change society. That's what I'm talking about. Yeah, he's talking about an injection, basically doing it through your throat, and it's a neural lace, and it just, you know, threads around your brain. What? Yeah. You serious? Yeah, that's what he's talking about. Yeah, yeah, yeah. He's a damn alien trying to turn us into robots. So... But the question is, what's the nature of those robots? Robots are going to exist. They exist, right? But should you shoot them into your brain? If you were dying of cancer, would you go for it? Yeah, I'd want to see God. Let's see what's up. So, like, do the nanobots, you know, that Kurzweil talks about, like, do we have control as a community over those robots? What's the code running those? So, are they infallible? I mean, what if they crash?
[1091] I mean, our fucking tricaster crashes every other podcast.
[1092] Yeah, whether it's open source or free or not, it makes no difference to whether it can fuck up your brain.
[1093] Right.
[1094] What if somebody puts that shit in and then for whatever reason, they have a blown fuse and they stomp on the gas and drive right into a tree?
[1095] It depends on the level of risk you're willing to take.
[1096] I mean, you see some of those videos.
[1097] Like, I've cried at those videos, where, like, the woman hears for the first time, and you're like, oh, shit.
[1098] Yeah, yeah, yeah, yeah, yeah.
[1099] And people seeing color for the first time, putting on certain glasses that allow them to see color.
[1100] Yeah, all this stuff is amazing.
[1101] I mean, all that stuff is very cool.
[1102] And in talking to David Sinclair, and he was talking about emerging technologies with reversing aging and age -related diseases, I mean, we're entering into an incredibly strange time for the influence of technology and innovation on human beings, on our bodies and our brains.
[1103] And we're going to have to decide, you know, how far you want to go on this ride.
[1104] Next stop, Far Rockaway. Like, when do you get... Now I live there. Yeah, I lived on the beach there. I'm going... Well, it's... No, I don't know. But you know what I'm saying? It's like one of those things, like, where do you get off? Like, when, where do you go, okay, that's enough? You know, like, with you, you're deleting Facebook, you deleted Instagram, and you go, I'm just going to be on Minds, and that's enough. I'll go on other places. I mean, there are alternatives that are getting very big. Yes. And so, together, like, Signal has tens of millions of users.
[1105] I don't know what that is.
[1106] I've never heard of it.
[1107] That's like the encrypted messaging app that...
[1108] Do you know it?
[1109] I don't...
[1110] It's open source.
[1111] What is it?
[1112] Snowden is on their advisory board or whatnot.
[1113] What is Signal?
[1114] It's just a messaging app.
[1115] So a messaging app, like a WhatsApp or like a Twitter?
[1116] Okay, WhatsApp.
[1117] So you have to know the person and then contact them through it.
[1118] Yeah, but we're considering using the Signal protocol for our messaging.
[1119] system, because our messaging system needs an upgrade. But all of us together are going to be able to create sort of like a group of apps that are sort of a more open, freedom-supporting, privacy-respecting alternative.
[1120] And like, so we're not going to solve it by ourselves.
[1121] And it would be way easier if one of these big companies would just switch gears and start doing things the right way.
[1122] I mean, we've spent eight years building this.
[1123] If one of the big companies, Google, Facebook, had just been free and open source, we would have spent the last seven years building on top of them.
[1124] Right.
[1125] Because, you know, they already did something cool that they're sharing with everybody.
[1126] So, actually, closed source projects stifle innovation.
[1127] Because, if you think about it, we had to reinvent the wheel. We went and built an alternative with much of the similar functionality.
[1128] Think about how much further the world would be if everyone was building on top of more common protocols. But you're looking at it in terms of your own personal benefit. You're looking at it in terms of Minds' personal benefit. I mean, you created this thing. It was not just purely for altruistic reasons. It's a business, right? So if they had established this open source network that was Facebook, and you just came along and built yours, well, yeah, that would be great for you. But why would that be great for them?
[1129] I mean, they're obviously in a business.
[1130] Now, the problem with the business is this business is the business of distributing information.
[1131] And then we have to decide, okay, at what point in time do we allow these, air quotes, overlords, to dictate what can and cannot be distributed.
[1132] And how did this happen?
[1133] Because in the beginning, I bet it didn't happen.
[1134] I bet in the beginning you could just put on whatever the fuck you wanted.
[1135] And then they had to deal with that.
[1136] And then they had to figure out after a while, okay, maybe we shouldn't have this on.
[1137] Like, hey, if we're going to sell advertising, we really should maximize the amount of clicks. Okay, how do we do that? Well, we put things in people's feeds that they want to see. We put things that people want to debate about and argue about, and political things, and all sorts of different things that excite them and get them to be engaged with the platform. That's their business. The business is... I mean, it's no different in a lot of ways than Amazon or than any other business that wants to grow. Like, how do they grow? Well, they grow by maximizing their profits and by maximizing the amount of eyes that get to their advertising, so they get more clicks and more people get engaged.
[1138] That's what their business is.
[1139] You're deciding by saying if they were open source, look how much further along the world would be.
[1140] They would be further along too.
[1141] I don't know if they would agree with that.
[1142] I think they're worth fucking cajillions of dollars, so they've figured it out.
[1143] Well, it just depends on whether or not you think that people have a right to know what is going on.
[1144] I mean, it's like food transparency.
[1145] I'll talk about that. I will talk about that until the end of time. We're interfacing with this, and it's affecting us. I agree. I fully agree with what you're saying. I'm playing devil's advocate by saying that, in their position, they have a business, and their business is to make money. And they're going to lose because of what they're doing, because it's not sustainable. Even after, post-hearings, they're losing active users. Are they?
[1146] But I thought their business went up after the hearings. Probably.
[1147] Did it?
[1148] But it's not going to last.
[1149] I mean, look at...
[1150] Why do you say that?
[1151] It's just, the game's over.
[1152] It's going to take a long time for us to build it up as, you know, all of these different organizations and companies working together.
[1153] But Linux, for instance, is the operating system that most banks use. It is the most popular operating system in the world.
[1154] It's in your, it's open source.
[1155] Yeah, it's open source.
[1156] It's in your phone.
[1157] It's everywhere.
[1158] It got there because of that, because everyone used it and
[1159] incorporated it into their product.
[1160] Facebook, they are all using free and open source software in their stacks.
[1161] They're just not sharing their product with everybody else.
[1162] So they're benefiting from it, but not giving back.
[1163] And, you know, I almost feel like I shouldn't even be saying that they should just pivot because, you know, that's their only chance to survive.
[1164] So this is based on your estimations of the future?
[1165] Yeah, it just seems like things are becoming more open.
[1166] Is that possible because you engage with a lot of other super nerds and you guys all have these similar ideas? You just have to look at what's happening with Bitcoin. I don't know what's happening with Bitcoin. Bitcoin and Ethereum and, you know, lots of other blockchains are growing really fast. Maybe the, you know, the price is separate. The development energy, the number of people who are building apps on top of Bitcoin and Ethereum, is growing massively.
[1167] There's a whole new infrastructure that's like a common protocol that people can build on.
[1168] The price is secondary.
[1169] That's not even what Bitcoin and Ethereum are really about.
[1170] It's a decentralized database.
[1171] So this is just where the internet is meant to be decentralized.
[1172] It sort of started out that way.
[1173] And then we moved into this, like, Web 2 silo system, with just these massive
[1174] companies that are controlling everything. But it's going to keep coming in waves. Okay, again, to play devil's advocate: the vast amount of users are not using those platforms. The vast amount of users are using these controlled platforms, like Facebook and Instagram and Twitter. Like, if you're talking about, I'm just guessing, but if you're talking about the gross number of human beings that are interacting with each other on social media, they're mostly on controlled networks. You're saying this is not going to last, but there's no evidence that it isn't going to last. There's tons of evidence. What is the evidence? Wikipedia. What happened to Encarta? Remember that disc you put in your computer that was your encyclopedia? Where is that? No one uses it. Okay, that's different. That's not a social media network. The social media networks that people are using are almost all controlled. Right? Yeah... No, it's going to take a very, very long time. How long?
[1175] I would say 10 years.
[1176] And what do you think is going to be the catalyst?
[1177] What's going to cause these people to make this radical shift to open source?
[1178] I think we have to be, we have responsibility to be competitive, functionally.
[1179] Minds does.
[1180] Yeah, we do.
[1181] We do.
[1182] We're moving there fast.
[1183] We just hired a ton of new developers.
[1184] And it's going to take time.
[1185] We're not there yet.
[1186] Right.
[1187] But once we have functionally competitive products that you wouldn't even know the difference and there's enough people there, then it's basically the decision of, you know, am I going to choose the one that respects my privacy and freedom or the one that doesn't?
[1188] And people are, kids don't like Facebook.
[1189] Everyone is sick of it.
[1190] We're just drug addicts.
[1191] Hmm.
[1192] Is that what it is?
[1193] They're just sucked into this thing where you constantly want to check and see who's writing what. Yeah. And there's monopolies, arguably. Yeah, yeah. Right, especially when Facebook owns Instagram. Right. Um, what if they bought Twitter as well? They almost did, I think. Google almost did. What if Google steps in and buys everything? Then you're like, oh, no. They probably could, right? Yeah, they easily could. They could probably buy everything. Apple could, with cash. Yeah, yeah. Tim Cook could come in with a big purple pimp suit on, just slap down a briefcase. Bitch. I just wonder... And, like, look, all these executives... I, you know, Jack seems like a cool person. He's a very nice guy. It's not... I just sense so much inconsistency. And, you know, he's talking about Bitcoin like it's this important new internet money, simultaneously, which is... He knows the infrastructure is open, but then his platforms are the opposite.
[1194] Why is he so inconsistent?
[1195] It's like it's just hypocritical to the maximum.
[1196] I think it's partly because it's a giant business, you know, and I think when you have an obligation to your shareholders and to maximize profits, and when you're trying to maximize profits too, and there's this universal growth model where every year just has to get a little bit bigger, otherwise you're fucking up as a CEO.
[1197] Like, you don't have to experience that with Minds.
[1198] You're one of the co-founders. How many people are involved? It's like 15 of us now. And do you have, like, a board where you sit around, where you make critical decisions? Is that stressful as fuck? Yeah. Luckily, we've started off from the point where we're saying, okay, we're embedding principles into how we're doing things, so we're not in a position where we would ever change that. For us to do that would just be a total waste of time. Right. So we're making it harder for ourselves to make money in the beginning. We're making it harder for ourselves to grow, because we are not going to compromise people's privacy in order to do those things. And so we're just going to build up slowly, steadily, and just get there when we get there. How much time a day is this? How much of an obligation is this for you? Same as any job. It sucks, because, you know, my wife, Allie, would say, like, it's too blurred, my life, because it's like, what is pleasure, what is... I mean, it probably happens with you, too. Like, when you're on your phone, your family doesn't know if you're working or if you're doing something for fun. It's like your work is sort of in the digital realm. Partially. A lot of it, yeah. So it's like, I just need to put it down. Like, no phones in bed, these kinds of things. Like, strict lines. Yeah, that is huge. Strict lines are huge. Um, yeah, putting your phone in a physical place and pushing it away, you know, that's huge. It's just the compulsion to look and check, like, Instagram feeds to see if there's any cool pictures. Like, what the fuck am I doing? Like, why am I compelled to do this? There's no benefit. Like, occasionally, if I'm bored, like, I'm in the dentist's office, you have 10 minutes, all right, see what the fuck's going on in the news.
[1199] Like, maybe, occasionally.
[1200] But there's so much of your time that's dedicated to that.
[1201] So much of it.
[1202] It's so taxing and it's so involved and so many people are doing it.
[1203] I went to a restaurant the other day and I was looking around and fucking everyone was sitting at a table looking at their phone.
[1204] It's weird.
[1205] Have you ever did this stack game?
[1206] What's that?
[1207] Just like, if you're out to dinner with a bunch of people, everyone puts their phone in a stack in the middle.
[1208] And if you touch it, then you pay.
[1209] Oh, I'd rather just...
[1210] Just not.
[1211] Not.
[1212] Yeah.
[1213] People wipe their butts and don't fucking wash their hands and touch their phone.
[1214] And you know, your phone is filled with all kinds of dirty shit.
[1215] They've, like, done these swab tests of phones.
[1216] They're covered with E. coli.
[1217] And people are gross.
[1218] I'm not...
[1219] A little germaphobe?
[1220] Keep my fucking hands clean, bro.
[1221] I'm not, but I know your phone probably has your butt all over it.
[1222] True.
[1223] Just be honest.
[1224] True. We don't have to do it. But I know what you're saying. Like, it's a good idea. You know the guys who run Joe Beef in Montreal? It's this amazing restaurant. Fred and Dave. And they were talking about it, that when they go to dinner, they shut their phone off. Like, I'm a good guest, a good table guest. I shut my phone off. I don't engage it, don't check it. It's, you know, it's a similar thing to podcasting, in a way, in that one of the good benefits of podcasting is that for three hours or two hours or whatever the fuck you're doing, you're going to sit down and you're just going to engage with a person. Just you and I, me and Bill, we're just talking, right? And we're not checking our phone, we're not looking at the television, we're not looking at the laptop.
[1225] There's no distractions.
[1226] And that is one of the rare moments in life where you get to talk to someone for several hours.
[1227] And over the last, you know, nine years that I've been doing this podcast, it's benefited me
[1228] tremendously, just in having real conversations with people.
[1229] We're just sitting across from somebody for hours, just talking to them.
[1230] Like, getting better at understanding how people think, getting better at understanding how I think, getting way better at communicating and, you know, knowing when to talk and when not to talk and what questions to ask, and trying to understand the thought process that another person has.
[1231] And, you know, you walk out of that with some lessons, like real, legit, tangible lessons.
[1232] Those fucking don't happen when you're staring at your phone while you're talking to people. It, like, cuts all that off. The conversation stays shallow. You miss important points. Like, I'm sorry, what were you saying? You do that kind of shit, and, like, then the other person knows you're not engaged, and it's just weird. Yeah, it's all shades of gray. I mean, it's done incredible things for, like, democratizing the ability to share information, so it's not just these juggernaut media companies that are the only places that can share information. So it's incredible. It's crucial.
[1233] We need everyone to have the ability to share and so that you can check and because maybe you're more likely to get the reality of what's going on the world from your news feed than the big companies.
[1234] But...
[1235] We need management skills.
[1236] Personal management skills.
[1237] Yeah.
[1238] And I think we need to look at them the same way we look at like alcohol consumption and, you know, even poor food choices.
[1239] Like you can have a cheat day and eat a bunch of pizza and some ice cream like The Rock does.
[1240] No one's going to get hurt, right?
[1241] But most of the time you should probably take care of your meat vehicle.
[1242] I think the same thing can be said of your mind.
[1243] Like, you can have a day.
[1244] Like, I have a day a week where I will fucking plop down on the couch, and I don't give a fuck.
[1245] I just watch bullshit on TV and just relax, because I know that I'm redlining it six days a week, you know, and I'm doing three different things at a time.
[1246] I have three different jobs.
[1247] I'm working out.
[1248] I'm trying to take care of my family.
[1249] I'm writing comedy material.
[1250] And then, oh, let me see some documentary around some wacky fucking cult or whatever the hell I'm going to watch.
[1251] And I don't feel guilty when I do that because I know that I've kind of, air quotes, earned it, you know?
[1252] But I think that that's mental management.
[1253] And I think we certainly need, we need personal management when it comes to the use of electronic devices.
[1254] Yeah, personal challenges.
[1255] Yeah.
[1256] Challenge psychology is really fucking interesting to me. Like, you guys do the October thing.
[1257] We're thinking about doing it twice a year now.
[1258] I did this one with some of my friends.
[1259] Jamie just going to know a shit.
[1260] Are you in it too?
[1261] No, no. Jamie doesn't get in.
[1262] We did one called the 100 Burpee Challenge.
[1263] I haven't done burpees in, I did them today because I was coming out.
[1264] I was like, I can fucking do burpees today.
[1265] Nice.
[1266] But 100 for time, every day for 100 days straight.
[1267] And you are drenched after going for time, 100 burpees. Oh, yeah. Ridiculous. So it's hard. And that was the best, most disciplined I've ever been working out. It was, like, me and five friends. My friend's mom did it, too. And it changed my life. Really? It was ridiculous. How did it change your life? I felt better than I have ever felt, by far. And that was, like, a year ago, and I've trailed off. But I think, like, but not just physical challenges, like, digital ones, too. And, like, with the ice bucket thing, that was crazy shit. I didn't get involved. I didn't do it, either. I'm, like, just watching it happen. Yeah. It was just really powerful. I'm like, I'm not throwing water on my head during a fucking drought. Stop. Yeah, everybody, stop. It's not fixing anything. How about I just write a check? I'll give you some money. Yeah, film yourself writing the check. Yeah, stop. Yeah, stop. You know, throw a glass of water in my face when I'm done with the check.
[1268] Just stop.
[1269] But getting communities to sort of pressure each other into doing things.
[1270] In a positive way.
[1271] In a positive way.
[1272] Well, that was what the Sober October thing kind of turned out to be about.
[1273] And there's a lot of lessons learning that, too.
[1274] You know, you learn lessons about your reliance on either substances or things.
[1275] And one of the things that I learned from the Sober October challenge, the last one, was that when you engage in really rigorous physical activity six and seven days a week, you don't give a fuck.
[1276] Like, you don't give a fuck.
[1277] Like all the chatter, the internal chatter, it just goes away.
[1278] All the negative chatter, like, it's like taking a pill.
[1279] Like, I don't give a fuck pill.
[1280] It's amazing.
[1281] It's really amazing.
[1282] Because I think a lot of personal anxiety that people carry around with them is a physical energy that's not being expressed because I think the body has certain demands and certain potential.
[1283] And in order to have this certain potential, like your potential for athletic output, you have to have this energy source, right?
[1284] And this body energy source when not expressed and when you're sitting in a cubicle all day, day after day after day, it builds this like internal, anxious feeling and tension.
[1285] And that becomes your normal, the normal line, the normal frequency at which you operate. You operate under this intense sort of anxious state and you feel like, well, this is life.
[1286] Goddamn, I'm depressed, or, goddamn, I'm anxious.
[1287] Yeah, I got anxiety.
[1288] I got to see a shrink.
[1289] I got this.
[1290] If you just blow that shit out every day, every day, you know, you burn off 2 ,000 calories and you fucking run for five miles and you do kettlebells and chin ups and fucking hit the bag for five rounds, dude, that shit goes away.
[1291] You don't give a fuck.
[1292] And then you get to look at things with real clarity.
[1293] So there was a lesson learned in that.
[1294] And then that lesson was only learned because we decided to challenge each other and push ourselves.
[1295] Do you think it would be too draconian to have, like, a company 100-burpees-a-day policy?
[1296] Yeah, you know why, man?
[1297] I just don't think you should tell people what to do.
[1298] Yeah.
[1299] Their job is the job.
[1300] And then everything else is like a cult.
[1301] You know, it's like, no, we're only going to wear white robes.
[1302] We don't need anything about white robes. Okay, when do you start fucking everybody and taking their money? Because that always comes next. Yeah. Because it's in vain. Yeah, you can't force people. It's like, they need to be convinced. But it wouldn't be a bad... Well, the problem is, if you were... You could have some sort of a company-wide challenge where you invite people. No, because that's what I'm saying. Like, you would shame them into doing it, or you would, uh, you know, somehow or another make it seem like they would advance in the company more if they played along. It could be. Yeah, I'm not going to do it. Thinking about doing what? You thinking about doing it? I just did, maybe. I mean, because it feels so good. Yes, yes. Well, you should encourage it. But I almost feel like, um... Yeah, but you don't want to shame people. Like, there's some brilliant people that don't work out at all. They're brilliant, but they just, whatever, for whatever reason, that's their choice. You know, it should be your choice to go out like Christopher Hitchens and just fucking drink every day and smoke cigarettes, and one day you get cancer, and you're like, well, you know... I mean, the way he described it, like burning the candle at both ends, it gave a beautiful, brilliant light. Yeah. He wouldn't have had those ideas if he had done it another way. It's very possible that's true. And most of the madness that we see in brilliant artists, like, it's very possible that madness would not be expressed if they had their shit together. There was something that Sam Harris was saying the other day on your show, just about the free will stuff.
[1303] And I think that that connects to this information theory kind of thing.
[1304] So if we're just sort of a conglomerate of these actions and we're like flowing the actions through our body in unique ways, I mean, do you accept his theory on free will?
[1305] Well, it's not his theory.
[1306] It's a conventional theory of determinism that a lot of people are embracing, and I think there's definitely some merit to it. However, you and I both know that you choose whether or not you decide to do something, right? Someone says something to you that's kind of shitty, and you choose whether you decide to email them back something shitty. You have that initial impulse, like, well, hey, man, fuck you. You have that initial impulse, you think on it, you sleep on it. But why are you thinking on it and sleeping on it? Are you doing that because of determinism?
[1307] Are you doing that because you're trying to be a better person?
[1308] And are you trying to be a better person because of all the factors that played out in your life?
[1309] Like environment, genes, life experience, all those things.
[1310] Right.
[1311] I mean, it's a really good, it's a really good discussion.
[1312] So do you own the words that you're saying right now?
[1313] It's a good question.
[1314] Larry Lessig, who was on here the other day.
[1315] Yeah.
[1316] You guys didn't even talk about this, but he basically is one of the founders of Creative Commons and this whole licensing structure for content, like what we're saying right now, you know, this is going to be licensed.
[1317] I don't know how.
[1318] How's it going to be right here?
[1319] This discussion is going to be licensed?
[1320] You, yeah.
[1321] Okay.
[1322] I mean, I think you're licensing it in a certain way.
[1323] Okay.
[1324] So you have the ability to license it however you want.
[1325] Right.
[1326] You could say, hey, anyone can take this and cut it up and remix it, or you could say, no, it's locked down.
[1327] But he helped create this whole licensing array of like six different licenses.
[1328] One says you can do absolutely anything you want with this.
[1329] Another says, you can share.
[1330] it, but you can't make money off it.
[1331] There's a handful.
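The license family Bill is describing here (a handful of licenses built from a few composable conditions) can be sketched roughly like this. The license names and conditions are the real Creative Commons ones; the data modeling itself is just an illustration, not any official CC API:

```python
# Illustrative sketch of the Creative Commons license family.
# Each license combines up to four conditions:
#   BY = attribution required, SA = share-alike,
#   NC = noncommercial only, ND = no derivatives.
CC_LICENSES = {
    "CC0":         set(),                 # effectively "do anything you want"
    "CC BY":       {"BY"},
    "CC BY-SA":    {"BY", "SA"},
    "CC BY-NC":    {"BY", "NC"},          # share it, but no money off it
    "CC BY-NC-SA": {"BY", "NC", "SA"},
    "CC BY-ND":    {"BY", "ND"},
    "CC BY-NC-ND": {"BY", "NC", "ND"},
}

def allows(license_name: str, commercial: bool, derivative: bool) -> bool:
    """Return True if the named license permits the proposed use."""
    conditions = CC_LICENSES[license_name]
    if commercial and "NC" in conditions:
        return False
    if derivative and "ND" in conditions:
        return False
    return True
```

So a remix sold for profit is permitted under CC BY but not under CC BY-NC-ND, and attribution is a condition of every license except CC0, which matches the point made later that attribution is the key part of the structure.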
[1332] And so the free will stuff is connected to how we're dealing with information.
[1333] Because if you think about it, realistically, we don't own what we're saying.
[1334] We're a part of it.
[1335] Right.
[1336] We're a conduit.
[1337] We're a unique conduit.
[1338] So I don't think it aligns with how the universe works to really lock down information.
[1339] I think that it makes sense probably in certain short-term business, but, you know, I think we have to open it up to what's really going on.
[1340] What do you mean about like locking down information?
[1341] Like source code, like classified files, like our content, like music, like video.
[1342] And now how does this connect to determinism and whether or not you have free will?
[1343] because are you the creator of your information?
[1344] Well, you are certainly if you put in the work.
[1345] Like, let's say you decide to write a book.
[1346] I mean, you put hundreds and hundreds of hours into this book and edit this book.
[1347] And then you release the book and someone says, no, you didn't, you didn't create that.
[1348] You're a product of determinism, and I'm going to just steal your book.
[1349] That's intellectual theft.
[1350] Intellectual theft is real.
[1351] It's certainly real in terms of like the creation of content.
[1352] It is.
[1353] If you're a stand-up comedian and someone takes your countless hours of work and steals it, that's intellectual theft.
[1354] For sure.
[1355] And then they try to pawn it off as their own through their own selfish needs, a selfish means.
[1356] That's why attribution is the key part of the Creative Commons license structure.
[1357] Always saying if you come up with a joke, you know, it came from here.
[1358] But what if it's a profit?
[1359] Like say if you wrote a book.
[1360] and I say, hey, this is a great book written by Bill Ottman.
[1361] Give me five bucks for it.
[1362] I'm putting it up on my site.
[1363] Do it.
[1364] Fuck that.
[1365] I mean, here's the thing.
[1366] I don't think.
[1367] Somebody makes all the money off of your book because they have a better platform to sell your book and they don't give it to you at all.
[1368] And you wrote the book.
[1369] You spent all the time.
[1370] You did all the work.
[1371] I would, for certain content that I create, completely give it away.
[1372] That sounds like a guy who's never written a book.
[1373] I've written a book.
[1374] Did you?
[1375] I mean, I've written a lot of content, yeah.
[1376] But have you written a book?
[1377] I give away, I give, I've not published, but yeah.
[1378] But the book that you sell, like if you were an author, but say if you were, I'm not saying people should be forced to do this.
[1379] Okay, I'm just, I'm just saying that this, I think, is how creativity happens.
[1380] And I just don't, people deserve to make money on their content.
[1381] Right.
[1382] And you deserve to own your stuff.
[1383] But I don't think that that's actually.
[1384] how the universe works. And I don't think it's acceptable to say, oh, you know, free will doesn't exist, I own your content. Yeah, that's why I'm struggling to see how they're connected. Because if you're not the originator, then you're not the owner. That's a weird argument, because you are the originator. Stephen King wrote all Stephen King books. You're the unique conduit. You are the originator of that specific configuration of information. Right. And you deserve to be able to do everything you're saying with it. I'm just saying that, I don't know, it's complex. Well, it is complex. If you're saying that for all human beings, essentially all of your actions have been determined by a lot of factors that are outside of your control, whether it's genetics, life experience, education, your environment, is that what's causing you to put out a fucking brilliant record?
[1385] It's part of it.
[1386] Right.
[1387] But you, maybe you have 50%.
[1388] Everything else has 50%.
[1389] Or there's, I don't know what the percentage is.
[1390] But if you're a musician and someone like Spotify comes along and says, boot, do you even make that, dude?
[1391] So we're just going to put it on Spotify and make millions and give you pennies.
[1392] That's not what I'm at.
[1393] indicating.
[1394] I'm saying Led Zeppelin uses the blues.
[1395] Well, more than that.
[1396] More than that.
[1397] There's a real plagiarism.
[1398] So, but that doesn't that doesn't mean that those aren't great records, obviously.
[1399] It's true.
[1400] It is true.
[1401] Yeah, I mean, Led Zeppelin is a legit gray area.
[1402] You know, Bill Burr called me up and left this really disturbed message.
[1403] He was like really bummed out when he watched it, and Bill's a musician.
[1404] He's a drummer, and when he saw videos of Led Zeppelin's music played next to the band that used to open for Led Zeppelin, we played it on the podcast, and we were like, holy shit, they just stole stuff. They just stole giant chunks and riffs. And I mean, they made it better, I guess, but... Yeah, but that's a different thing than Stephen King's book. Why? Because Stephen King had to spend countless hours in front of his laptop trying to go over each and every sentence and each and every paragraph, to suck you in and rope you in, all this work. No, they stole stuff, dude. They stole certain phrases. But you think he didn't use a single phrase anywhere in any of his books that he didn't pull from somewhere? No, he certainly has. Yeah, he certainly has. I don't think it's the same, though. I think it's similar. Led Zeppelin also had to spend countless hours recording that performance to get it to the level of awesomeness that we heard. That wasn't easy to do. They just did a bitch-ass move, and they didn't pay those people. Yeah. No matter what, you should be attributing if you're taking ideas. Put it in the footnotes. Why does it hurt? It doesn't make your art worse. Because then they'd have to admit they stole the riff for Stairway to Heaven from their opening band, and then people would go, what? And then they would see it. Then they would look at Led Zeppelin differently. But, you know, human beings are fucking severely flawed. I don't know if I buy that, this idea that you're saying in terms of authors creating content. I'm not trying to sell something. No, I know. But if they are, I don't think someone should be able to copy their stuff and sell it. I don't think they should either. But what do you think they should be able to do? You think, okay, you should be able to decide. So if you're the content creator, you should... Okay, I agree with that.
[1405] Yeah.
[1406] Yeah, it's, it is, I mean, obviously I play devil's advocate a lot, but that's how you get to the bottom of these conversations, but it is a very complicated issue.
[1407] The complicated issue of who you are and why you are who you are at this moment versus who you are a decade ago or two decades ago, it's, it's all very weird, you know, I mean, you go back and think about stuff from high school and you're like, Jesus, what, am I really even that person?
[1408] Like, I think about, like, I've talked to my sister about stuff that happened when we're in high school.
[1409] Hey, you remember that guy?
[1410] And oh, he said to say hi.
[1411] And like, was that even me?
[1412] Do I even know that person?
[1413] Like, is that really me?
[1414] I was like, if I, if I see them again, I'll be like, oh yeah.
[1415] Oh, yeah.
[1416] We had, uh, 10th grade science together.
[1417] Oh, yeah.
[1418] Huh.
[1419] Crazy.
[1420] Like, I ran into a guy from my high school like a couple of weeks ago.
[1421] It was weird.
[1422] It was so weird.
[1423] You know, he remembered some strange story from English class.
[1424] And I was like, wow, you remember that?
[1425] Like, how weird.
[1426] And while he was talking me, I'm like, is that even really me?
[1427] Like, is he even really talking about me because I don't have any connection to the stuff that he's saying.
[1428] And I understand that he has this vague, distant, ghost-like memory in his mind of some slide images that he's pieced together, that he recognizes as a past interaction.
[1429] I mean, it's fucking strange.
[1430] It's super strange, too, like, in, you know, 2050 or 2045, whatever.
[1431] You know, if your body can be replaced one piece at a time as time goes on, then your body literally, you could survive, but your body is going to be like almost completely different.
[1432] Exactly.
[1433] It's like the boat analogy.
[1434] Was it Graham Hancock that used that analogy?
[1435] Somebody used this analogy of certain boats that are like really ancient boats that are on display and every single piece of them from the original boat has been replaced because they rotted away.
[1436] and you're like, okay, what am I looking at?
[1437] What is this really?
[1438] Yeah, and that's kind of us.
[1439] And once that becomes a physical thing, you know, I met the guy who got his arm and his leg bitten off by a shark.
[1440] You ever see that guy?
[1441] He's got carbon fiber arms and legs.
[1442] John Joseph brought him to the comedy store.
[1443] Yeah, I met him at the UFC when you gave him tickets.
[1444] Oh.
[1445] I shook his hand.
[1446] I was like, oh, shit, that was weird.
[1447] Super nice guy, but he's got this carbon fiber hand and forearm that moves around like a hand, and he shakes your hand, and then he walks with no limp. He's got this carbon fiber... I mean, I think from the knee down, the shark bit his leg off. It's fascinating. You're like, okay, you're still a person, you're still here. And... There he is. There's the gentleman right there. Paul de Gelder. Paul de Gelder, super nice guy. But that is a fake arm that he's got, from his arm being chomped off by a fucking shark. So, see where it is, from his right thigh, like mid-thigh down, and his right elbow down, all that shit chewed off by a shark. And he's still jacked. Look at him. No excuses. And that's going to keep becoming more biological, right? Well, the real concern is... Remember the Six Million Dollar Man? Do you remember that television show? No. You're younger than me. There was a show called The Six Million Dollar Man. And the Six Million Dollar Man had been in some sort of a pilot accident, and the gentleman, "We can rebuild him. We can make him better than he was. Better, stronger, faster." And they gave him these bionic parts. They gave him a bionic arm, and they gave him bionic legs, and he could run like 60 miles an hour, he could run crazy fast. And he had these artificial arms and an artificial leg. And they had a Bionic Woman, same shit, except she was hot. And she had artificial legs, and I think she could see things other people couldn't see.
[1448] Like one day, I mean, that was cool.
[1449] Well, you'd look at that.
[1450] You're like, wow, look what you could do.
[1451] Like, he got in a fight with Bigfoot on the TV show.
[1452] It's really stupid.
[1453] But one day, people are going to be given the option.
[1454] Do you want to keep your legs?
[1455] Or do you want to get these legs that allow you to jump over a building?
[1456] I'm curious if there's really, like, superhuman projects that are going on, where people actually can have these abilities.
[1457] We know that with classified information, there's stuff we don't know, that there are extraordinary projects going on. So, you know, this being the future, I feel like there's a disconnect between the state of technology on planet Earth right now, what the public has access to, versus what the, you know, black projects have access to. And that is really not cool, because it's not fair for humanity to not understand what is going on.
[1458] I think that's true, but I also think that most of the state-of-the-art stuff is peer-reviewed, right?
[1459] I mean, there's so many different people working on these different technologies, like CERN, you know, they're working on the Large Hadron Collider, or anything else.
[1460] There's so many different people working on it. The people that are at the forefront of the technology... Unless they're all gobbled up by the dark government. They're, you know, the people at the head of the line kind of understand where the technology is at currently. For sure. For you and I, we don't know what the fuck's going on. But I think you're right. I think there's probably some government programs where they scoop up the wisest and the brightest, you know, like they got Oppenheimer and got him to develop the Manhattan Project. There's probably some shit going on right now. What do you think's happening? What do you know, Bill? Tell me. I want there to be huge Freedom of Information Act reform.
[1461] We know there are trillions going into the black budget.
[1462] So Trevor Paglen wrote a cool book, I think it was him, called Blank Spots on the Map.
[1463] And it just talks a lot about the black budget.
[1464] So we know it exists.
[1465] We know, I don't know.
[1466] But it's holding us back. Maybe... I'm not saying everything should be shared, because what if you have, like, a bioweapon? Right, right. You know? So we need to understand... I think that we need to push the threshold of what the public has access to. Like, we need to go way deeper. It's complicated. Yeah, it really is, right? It really is. I really appreciate your perspective, and I really appreciate your point of view, and your ethics, and what you're working towards with Minds. That's one of the reasons why I wanted to talk to you. I think it is important. And as much as I fuck around and play devil's advocate, I do that to try to get to, you know, how you're thinking and why, and whether or not you've had these arguments in your own mind. But I think, ultimately... I've said this before, and I don't know if it makes sense, because again, I'm not that smart, but I really wonder if there are bottlenecks for progress that we're going to run into.
[1467] And I think ultimately information is one of the big ones.
[1468] And information also in a lot of ways is money.
[1469] You know, I mean, when we think of money, we're thinking of ones and zeros that are being moved around on bank accounts.
[1470] It's data.
[1471] I mean, it's attributed to different people and you get to do more things because you have more of these numbers and more of these things.
[1472] But what is it really?
[1473] It's not gold -based anymore.
[1474] It's not a physical material object that you're coveting.
[1475] Now it's some weird thing.
[1476] And it's kind of like information on a database.
[1477] And what if we get to a certain point in time?
[1478] And I sort of feel like in this weird, vague, abstract way, we're moving towards this.
[1479] It's one of the things I want to really step back and wonder about, this trend towards socialism and social democratic thinking.
[1480] I wonder what that is.
[1481] And I honestly think that we're moving towards this idea that, hey, you know, we've got a lot of fucking problems that could be cured if you move some of that money around.
[1482] And but should you be able to move some of that money around?
[1483] And what happens if that money becomes something different? What if people start developing social currency instead of financial currency?
[1484] What if your ability to do things was based on how much you actually put in.
[1485] I mean, we're assuming, right?
[1486] We assume that the way we do things now, where if you want to buy a car, you have to have $35,000.
[1487] That's how much a Mustang costs, and you got to bring it to the bank, and this and that, and you get approved for a loan.
[1488] But what if we get to a time in the future where it's not these pieces of paper that give you material objects, but rather your own actions and deeds provide you a social currency that allows you to go on vacations, or it allows you to eat at restaurants, it allows you to do things.
[1489] And there's this running.
[1490] tally.
[1491] That's not outside of the realm of possibility.
[1492] No, I think reward systems within everything that we're using are going to rise up.
[1493] I mean, that's what we're already kind of doing.
[1494] I mean, we reward tokens for activity.
[1495] We're going to see that rise up in more things that we're doing.
[1496] But what I'm saying is, if it's a social currency, and your own personal behavior allows you to access more freedoms or more goods or more things, it would encourage positive behavior and community-based behavior, because that would be the only way to advance.
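The rewards-for-activity mechanic being described here can be sketched as a toy ledger. Everything in this sketch, the action names, the point values, the function names, is hypothetical, purely an illustration of the idea, not how Minds or any real platform implements its reward system:

```python
# Toy sketch of an activity-based "social currency" ledger.
# All action names and point values below are made-up illustrations.
from collections import defaultdict

# Assumed per-action rewards (hypothetical numbers).
REWARDS = {"post": 2, "comment": 1, "upvote_received": 3}

balances = defaultdict(int)  # user -> accumulated social currency

def record_activity(user: str, action: str) -> int:
    """Credit the user for an action and return their new balance."""
    balances[user] += REWARDS.get(action, 0)
    return balances[user]

def can_afford(user: str, cost: int) -> bool:
    """Check whether a user's accumulated currency covers a cost."""
    return balances[user] >= cost
```

The point of the sketch is the mechanic itself: access to goods is gated on a running tally of contributions rather than on transferred cash, so the only way to "advance" is to put activity in.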
[1497] I mean, obviously, this is a long time down the line.
[1498] But when the first caveman, you know, traded the first fucking shiny rock for the first spearhead, you know, whatever it was that they did that started this whole inevitable trend towards money, this is not something that has to be this way forever.
[1499] You know, and I wonder when we're looking at the distribution of information, which is arguably, not arguably, it's never, never been like what we have today.
[1500] There's never been a time in human history where everyone had so much access to information that you used to have to pay for.
[1501] You used to have to go to schools.
[1502] You used to have to earn your way to the position where you could open the very books that had all this information in it.
[1503] Now you just get it off your phone.
[1504] It's instant.
[1505] And this is a whole different way of interfacing with information.
[1506] And I think this is going to affect higher learning institutes.
[1507] I think it's going to affect a lot of different things.
[1508] But I wonder if this all can be applied ultimately someday, maybe not in our generation, but someday to money, that people start using social currency.
[1509] And that social currency is going to be almost like we have some sort of a database of social currency in this country. A distributed database, yeah. As long as the government can be running on open systems. I think the reason we struggle with, you know, trusting the government to distribute wealth is because it's so inefficient. We want to be deciding where it goes. Well, they're also corrupt as fuck. I mean, there's no doubt about that. At the end of the day, that's a giant problem, period, if the people that are deciding what we can and can't do with information are also corrupt. There's laws that allow them to be corrupt, but that doesn't mean they're not corrupt, right? I feel like the only politicians that I would support at this point, that I want to be pulling us in a direction like that, are ones making their own position irrelevant, basically building open, secure voting systems that allow the planet or the country to decide and vote on what we're doing.
[1510] I mean, I, you know, I just think that we need more accurate representation of the consciousness of the communities.
[1511] And it shouldn't just be these singular people deciding for everybody.
[1512] We should have, we have the tech.
[1513] Right.
[1514] And by the time they get in there, they're so compromised by the special interest groups that are helping them out and all the different people that are contributing to their campaign fund.
[1515] Do you see anybody like that on the horizon?
[1516] I think that there are, but not specifically right now.
[1517] I don't see anyone talking about open systems and secure voting and completely changing the way that we're making decisions.
[1518] But I think that's probably just because they don't know about it.
[1519] I think there would be a lot of politicians who would be okay with that or want us to move in that direction.
[1520] but I think we need more technologists, scientists in these positions, building the things that we're using.
[1521] Yeah, and with an ethic of freedom.
[1522] All right.
[1523] Dude, great conversation, man. I really appreciate it.
[1524] Tell people how they can get on Minds, how they can check it out.
[1525] And do you guys have an app as well?
[1526] We have an app.
[1527] You go to minds.com slash mobile to get the app.
[1528] We're not on Google Play.
[1529] We are still in the Apple store.
[1530] Google Play won't let you in?
[1531] No. How come?
[1532] They're scared.
[1533] They're scared.
[1534] Yeah, the nipple.
[1535] But you find me, minds.com slash Ottman.
[1536] Hopefully, we'll get you on there.
[1537] Yeah, well, I'm on.
[1538] I just, yeah, I haven't posted anything.
[1539] All right.
[1540] But you sent me my account.
[1541] Yes, thank you.
[1542] Appreciate it.
[1543] Let's do it.
[1544] Thanks, buddy.
[1545] Thank you.
[1546] Thanks for coming on, man. It was really fun.
[1547] I think we got a lot out of it.
[1548] Thanks.
[1549] That's great.