#1236 - Jack Dorsey

The Joe Rogan Experience


Full Transcription:

[0] Three, two, one.

[1] Boom.

[2] Hello, Jack.

[3] What's up?

[4] Nice to meet you, man. Nice to meet you finally.

[5] Yeah, keep this sucker like a fist from your face.

[6] It's always good.

[7] First of all, dude, you started a company.

[8] When you started Twitter, when you guys first started, did you have any idea?

[9] There's no way you could have had any idea what it would be now.

[10] No. But one of the things I always try to emphasize with people when people are like, oh, Twitter's crazy.

[11] I'm like, how could it not be crazy?

[12] There's never been anything like it before.

[13] Like, imagine trying to predict that the president of the United States uses Twitter to threaten other countries. Yeah. I mean, who the fuck saw that coming?

[14] Nobody saw that coming. What did you think it was going to be when you first did it?

[15] Well, you know, we were building this thing for ourselves, and that's how everything starts. We wanted to use it. We wanted to, you know, stay connected with each other, like a group text, all right?

[16] Like a group text, we loved our phones.

[17] We loved technology.

[18] We actually started this as a hackweek project out of a failed company called Odeo.

[19] It's podcasting.

[20] I remember that.

[21] I remember Odeo.

[22] Super early on.

[23] We were really creative folks, but we weren't that passionate about where podcasting was going in our particular domain.

[24] We just got a lot of competition early on.

[25] iTunes just released their podcast directory.

[26] But we knew we wanted to work together. We knew we loved this idea of one-button publishing, this idea of collaboration, this idea of being anywhere and being able to share what was happening.

[28] That was the idea.

[29] I mean, that was it and that's what we wanted it to be.

[30] And I think the most beautiful and also sometimes uncomfortable aspect of Twitter is it, we really learned what it wanted to be.

[31] And the people helped create it.

[32] Like everything that we hold sacred now: the @ symbol, the hashtag, the retweet.

[33] Those were not invented by me or the company.

[34] Those were things that we discovered, things that we discovered people using.

[35] And we just observed it.

[36] And we noticed what they were trying to do.

[37] They were trying to talk with one another.

[38] They were with the hashtag.

[39] Has anybody figured out when the first use of hashtag something was created?

[40] Yeah, it was actually our lead designer Robert Andersen, who leads our design of the Cash App.

[41] Hired him for Square later on.

[42] But he was the first one.

[43] He was actually communicating with his brother.

[44] And he put @Buzz. His brother's name is Buzz.

[45] And it just kind of spread.

[46] It wasn't en masse, but people were doing it.

[47] But what was most interesting is not what they were doing, but what they wanted to do with it.

[48] They wanted to address each other.

[49] And that changed the company completely.

[50] That changed the service because it went from just broadcasting what's happening to conversation and to being able to address anyone publicly out in the open, which came with it a lot of power and also a lot of issues as well.

[51] Yeah.

[52] The use of hashtags, like looking up, you know, hashtag Fyre Fest, or hashtag whatever, anytime there's something weird that's in the news. That's such a unique way to find things. To go on Twitter and to utilize that, I mean, it's interesting that this guy just did it just to contact his brother.

Well, that was the @ symbol. The hashtag was this guy Chris Messina, and he was trying to tag around topics that he was tweeting about.

[53] And again, that spread.

[54] All we did was made it easier.

[55] We made it more accessible.

[56] We enabled everyone to do it.

[57] With the @ symbol, we made a page that collected all mentions of your name.

[58] With the hashtag, we allowed people to search immediately, so you could tap on the keyword, and you would see everyone talking about that or tweeting about that specific hashtag.

[59] So these things were just emergent behaviors that we didn't predict, and they became the lifeblood of the service.
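The mention page and tap-to-search hashtag behavior described here can be sketched as a tiny index. This is a hedged illustration, not Twitter's code: the class, regexes, and sample tweets are all my own invention.

```python
import re
from collections import defaultdict

# Illustrative sketch of the two emergent features described above:
# a page collecting all @-mentions of a name, and a tap-to-search
# lookup for hashtags. Names and structure are hypothetical.

MENTION = re.compile(r"@(\w+)")
HASHTAG = re.compile(r"#(\w+)")

class TweetIndex:
    def __init__(self):
        self.mentions = defaultdict(list)   # username -> tweets mentioning them
        self.hashtags = defaultdict(list)   # tag -> tweets using it

    def add(self, tweet):
        for user in MENTION.findall(tweet):
            self.mentions[user.lower()].append(tweet)
        for tag in HASHTAG.findall(tweet):
            self.hashtags[tag.lower()].append(tweet)

index = TweetIndex()
index.add("@buzz check out the game tonight")
index.add("Reasons to stay #VoteRemain")
index.add("Polls are open #VoteLeave")

print(index.mentions["buzz"])       # the "mentions page" for @buzz
print(index.hashtags["voteleave"])  # tapping a hashtag runs this lookup
```

The point of the sketch is that neither feature required new syntax from users; the service only had to start indexing a convention people had already invented.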

[60] What's fascinating to me about something like Twitter or even something like YouTube is that there's not a lot of other ones like it.

[61] There's just this one thing.

[62] Like, how does that happen where this one thing sort of gets adopted by everybody and takes over and then just becomes this overwhelmingly massive platform?

[63] I mean, there's really, there's Vimeo, there's a few other video services, but nothing on the scale of YouTube.

[64] And that's the same thing with Twitter.

[65] There's nothing on the scale of distributing information in a quick, short, 280 character form like that.

[66] We, I don't think we could plan for it.

[67] I don't think we could necessarily build for that.

[68] Someone said recently to, we just, you know, gathered a bunch of our leadership last week in Palm Springs for an offsite.

[69] And someone said recently that Twitter was discovered.

[70] And I think what's behind all that is that it hits something foundational.

[72] It hits something essential.

[73] And my co-founder Biz likes to say that Twitter can never be uninvented.

[74] It's here.

[75] It changed everything.

[76] The use of it has been revolutionary.

[77] And it's just a simple idea of, you know, if you could text with the entire world, if you could actually reach anyone in the world or anyone could see what you're thinking, which I think is also the beautiful thing about text and the medium, you can actually get at someone's raw thoughts and anyone in the world can see that instantaneously.

[78] It becomes this subconscious.

[79] It becomes this like global consciousness.

[80] And it gets to some really deep places in society and some of those places are pretty uncomfortable.

[81] Well, it also gets to some really deep places psychologically.

[82] There's a some, there's a weirdness to it, right?

[83] There's a weirdness to sending text, particularly anonymously, you know, and there's so many accounts that are just an egg, you know, and so many accounts where they're clearly designed.

[84] Like sometimes someone will tweet something mean to me, and I'm like, hmm, I wonder what this person's up to.

[85] So I go to their site, and it's just them tweeting mean shit at people all day long.

[86] Like, it's probably some angry person at work, and they're like, I'm just going to find people and fuck with them all day.

[87] Yep.

[88] When did you realize, or when did you realize, I'm sure you're aware of it, when did you realize that this was almost out of your control in terms of like the scale of it?

[89] There wasn't one moment.

[90] There wasn't one moment that it just felt completely resonant.

[91] It's just, it's unfolded into the next thing and the next use case, and it just keeps surprising us with how people are using it.

[93] We, you know, it definitely, um, recently, I think we've identified some of the areas of the service that we need to pay a lot more attention to.

[94] Um, Twitter is unique in that it has two main spaces.

[95] One, which is your timeline.

[96] And those are the people that you follow.

[97] And, uh, you know, when you follow someone, they've earned that audience.

[98] And then it has this other world where anyone can insert themselves into the conversation.

[99] They can actually mention you and you'll see that without asking for it.

[100] You can insert yourself into hashtags and to search.

[101] And these are areas that people have taken advantage of.

[102] And these are the areas where people have gamed our systems to, in some cases, artificially amplify, but also just to spread a lot of things with a velocity that wasn't possible before.

[103] Now, when this is all happening, what's the conversation like at Twitter, when you're recognizing that this is happening that people are kind of gaming the system?

[104] Like, how do you guys, how do you mitigate it?

[105] What's the discussion?

[106] Well, early on, it was pretty surface level.

[107] Like how do we change some of the app dynamics?

[108] But more recently, we're trying to go a lot deeper and asking ourselves a question: when people open Twitter, what are we incentivizing?

[109] What are we telling them to do when they open up this app?

[110] We may not explicitly be doing that, but there's something that we're saying without being as clear about it.

[111] So what does the like button incentivize?

[112] What does the retweet incentivize?

[113] What does the number of followers in making that number big and bold incentivize?

[114] So I'm not sure what we should, I'm not sure if we should incentivize anything, but we need to understand what that is.

[115] And I think, you know, right now we do incentivize a lot of echo chambers because we don't make it easy for people to follow interests and topics.

[116] It's only accounts.

[117] We incentivize a lot of outrage and hot takes because of some of the dynamics in the service not allowing a lot of nuance in conversation earlier on.

[118] Pseudonyms, this ability to not use your real name, incentivize some positive things: they allow for whistleblowers and journalists who might fear for their career or even worse their life under certain regimes, but they also allow for, like the example you mentioned, just random fire and the spread of abuse and harassment throughout.

[119] So those are the things that we're looking at, and how do we enable more of the conversation to evolve?

[120] How do we increase the credibility or reputation of accounts?

[121] How do we identify credible voices within a particular domain, not just through this very coarse-grained blue verified badge? If you're an expert in a particular topic, how do we recognize that in real time and show that, so that we can provide more context about who you're talking to?

[123] And if you want to engage in a deeper conversation or just ignore, mute, or block them.

[124] But what is the conversation like while you're at work?

[125] Like when you're realizing that all this stuff is happening, and you're realizing that now, I mean, particularly because the president uses it so often, it's such a, I mean, it's this preferred platform for communicating with the people.

[126] I mean, even more so than addresses.

[127] It's very strange.

[128] What's the conversation like in the office when you're trying to figure out, hey, what's our responsibility here?

[129] Like, how are we supposed to handle this?

[130] How do we, I mean, you're, in some ways, what Twitter is doing is, it's really kind of, it's flavoring the public narrative.

[131] It's flavoring the way we communicate with each other in our culture, worldwide.

[132] Yeah, I mean, the conversation has definitely evolved.

[133] I think in the past we just got super reactive.

[134] We were reacting to all the negative things that we were seeing, and that led to a lot of short-term thinking.

[135] More recently, we've just looked much deeper.

[136] We don't just react to the present day; we look for some of the patterns.

[137] And, you know, we have a company that is not just serving the people of this particular country, the United States.

[138] This is global.

[139] We have global leaders all around the world using us in different ways.

[140] Some, you know, more with a higher velocity, some recognize more of the power.

[141] Some put out statements, some lead conversations.

[142] But it's looking at all those dynamics and not trying to hyper-focus on any one particular one, because if we do, we're only building it for one portion of the population or only one perceived present-day crisis.

[143] What I'm trying to get at was, like, okay, when things come up, like say if you find out that there's people from ISIS that are using Twitter and posting things, what is the conversation like? How do we deal with this? Do we leave this up? Do we recognize this as free speech? Do we only take it down if they're calling for murder or hate speech? How do you handle that?

Well, it evolves. I mean, we first saw ISIS when the world saw ISIS, and we needed to change our policy to deal with it.

What was the initial reaction to it? So once you realized that people from ISIS were making Twitter accounts, and they were trying to recruit people and doing all these things.

[144] What was the thought process?

[145] Oh, it was a question.

[146] Like, what are we going to do about this?

[147] We haven't experienced this before.

[148] We need to...

[149] Nobody has.

[150] I mean, you're essentially pioneers.

[151] No, yeah, but there are people who have experience in different forms, in different mediums.

[152] So we reach out to our government partners, for instance, or law enforcement partners.

[153] We reach out to our peer companies to ask if they're seeing the same things that we're seeing.

[154] We have a bunch of civil society groups that we talk to to get their take on it as well.

[155] And we try to balance that across, you know, various spectrums, whether it be organizations that are more focused on preventing online harassment, all the way to the ACLU and the EFF, who are protecting the First Amendment online.

[156] So we try to get as many perspectives as possible, take that, and then make some informed decisions.

[157] But also realize that we're probably going to make some mistakes along the way, and all we can do to correct some of that is just be open about where we are. And that's probably where we failed the most in the past: we just haven't been open about our thinking process, what led to particular decisions, how our terms of service evolved. Terms of service as an area in our industry is just a mess. No one reads them. You know, you sign up for these services and you quickly hit accept.

[159] Yeah.

[160] And we expect people to read these rules of the road, but they haven't read them.

[161] Have you ever read them?

[162] I have read them.

[163] You've read your own.

[164] Have you ever read Facebook's?

[165] I haven't read Facebook's.

[166] Yeah.

[167] I'm not on Facebook.

[168] You're not on it?

[169] I'm not on Facebook.

[170] Fuck Facebook, right?

[171] I'm just kidding.

[172] What about Instagram?

[173] You ever read theirs?

[174] I was one of the first 10 users of Instagram.

[175] Really?

[176] Kevin was an intern.

[177] Kevin Systrom was an intern at Odeo, and I was one of the first investors in Instagram, and I love the service.

[178] I don't think I've ever read their terms of service.

[179] Yeah, that's what I'm saying.

[180] Even you.

[181] Even me. But I read ours, and one of the things I noticed right away is, you know, you read our terms of service.

[182] And one of the first things that we put at the top of the page was copyright and intellectual property protections.

[184] You go down, you scroll down, you see everything about violent threats and abuse and harassment and safety.

[185] And it's not that the company intended for that to be the order.

[186] It just, we just added things as we went along.

[187] But even a read of that puts forth our point of view.

[188] Like we're actually putting copyright infringement above the safety, the physical safety of someone.

[189] So we need to re-look at some of these things, how they've evolved, and how we reacted.

[190] But is it above just because it's listed second?

[191] I mean, they're essentially all in the same one sheet.

[192] They're all in the one sheet.

[193] When you bring it up, when you discuss it first, is that really critical?

[194] They're all part of the terms of service.

[195] Yeah, but I think that ordering matters.

[196] Like, what do we consider to be most important?

[197] Right.

[198] And we have to consider physical safety to be the one thing that we protect the most.

[199] So physical threat?

[200] Physical threats, doxing, anything that impinges on someone's physical safety.

[201] This is an area where I don't think technology and services like ours have focused on enough.

[202] We haven't focused on the off -platform ramifications of what happens online.

[203] So what do you do? Like, here's a good for instance: this situation with this young kid who had the MAGA hat on, and the Native American gentleman who was in front of him banging the drum, and then people are calling for this kid's name. They want his name, they want his address, including Kathy Griffin. How do you handle something like that? Because that's essentially a request for doxing.

Yeah, and that is a new vector that we haven't seen en masse. These are the cases that bring up entirely new things, so we have to study it. We have to see how we reacted, what happened with the network. But this goes back to the incentives. Like, we are incentivizing this very quick reaction, and it's taking away from some of the more considered work that we need to do to really diagnose what's happening in the moment.

[204] And it's such an interesting case study to see how that evolved over just 48 hours.

[205] Yes.

[206] That's one of the most fascinating news cycles or stories in the news cycle in quite a while, because it's nuanced.

[207] There's many different levels to it.

[208] Yeah.

[209] And a lot of, like, really knee -jerk reactions.

[210] Totally.

[211] And, but we helped that.

[212] How'd you help it?

[213] Well, it's just, that's how some of the dynamics of the service work.

[214] But is that how some of the dynamics of the service work, or is it the way people choose to use the service?

[215] Like, if you're a thoughtful person, you wouldn't just, like, for instance, the original image that was distributed came from an account that's now banned, right?

[216] And so it was discovered that that account was a troll account.

[217] How does that happen and what was the thought process behind that?

[218] Because the image that they posted was a legitimate image.

[219] It really did happen.

[220] It was a part of an actual occurring event.

[221] So why did you ban the person or the troll account that put it up?

[222] I don't know about this particular case, but it's likely that it was found.

[223] And there's a lot of what you see on the surface of Twitter, and some of the actions that we take on the surface.

[224] But where we spend a lot of our enforcement is actually what's happening underneath.

[225] So in many cases, we have trolls or people, like the case that you mentioned, whose sole purpose is just to harass or abuse or spread particular information.

[226] And oftentimes these accounts might be connected.

[227] Or they start one account, that gets banned, they start another account. But we can actually see this through a network lens, and we can actually see some of those behaviors.

[228] So that might have been one of the reasons.

[229] I'm not sure in that particular case, but, you know, the...

[230] How do you know?

[231] Do you know because of IP addresses?

[232] Do you know because of the...

[233] A variety of things.

[234] Like, it could be trying to use the same phone number, same email address, IP addresses, device IDs, all these things that we can use to judge what's happening within the context.

[235] So we do have a lot of occurrences of suspending or temporarily suspending accounts because of activities across accounts.

[236] And that happens a ton.
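The cross-account signals listed here (same phone number, email, IP address, device ID) lend themselves to a simple clustering sketch: treat any shared identifier as a link between accounts and group the connected ones with union-find. Everything below, names and data included, is a hypothetical illustration of the "network lens" idea, not Twitter's actual enforcement system.

```python
from collections import defaultdict

def cluster_accounts(signals):
    """signals: {account: set of identifier strings}.
    Returns a list of sets, each a cluster of linked accounts."""
    parent = {a: a for a in signals}

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a

    def union(a, b):
        parent[find(a)] = find(b)

    owners = defaultdict(list)             # identifier -> accounts using it
    for account, ids in signals.items():
        for ident in ids:
            owners[ident].append(account)
    for accounts in owners.values():       # shared identifier links accounts
        for other in accounts[1:]:
            union(accounts[0], other)

    clusters = defaultdict(set)
    for a in signals:
        clusters[find(a)].add(a)
    return list(clusters.values())

# Hypothetical data: two accounts sharing an IP get clustered together.
groups = cluster_accounts({
    "troll_1": {"ip:10.0.0.7", "device:abc"},
    "troll_2": {"ip:10.0.0.7"},
    "normal":  {"ip:192.168.1.5"},
})
print(groups)
```

Under this view, banning one account and seeing its successor reappear with the same device or IP lands both in the same cluster, which is roughly the behavior described above.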

[237] But what I mean when I say we're helping this right now is some of the incentives. Like, just imagine seeing that unfold.

[238] And when you see someone with one take, it kind of emboldens others to follow along, and then this mob kind of rules.

[239] Yeah.

[240] So there has to be a way for us to incentivize a lot more considered and more nuanced introspection of what's going on.

[241] Yeah, give everybody mushrooms.

[242] It's probably the only way.

[243] I don't know.

[244] How are you going to get people to be more considerate?

[245] I mean, what, I mean, this is essentially you're engineering social behavior, right?

[246] Yeah, providing more context.

[247] Providing more context.

[248] How's that?

[249] More context.

[250] Like, in an example, let's take Brexit, for example.

[251] Okay.

[252] So if I followed a bunch of accounts that, like Boris Johnson, who was constantly giving me information about reasons to leave, I would probably only see that perspective.

[253] Nigel Farage.

[254] Yeah.

[255] And a lot of folks just will not follow accounts that have a completely different perspective or a different influence.

[256] A number of people do, hopefully journalists do, but most people won't do that work.

[257] So this is the only tool we give people, follow an account.

[258] If, however, during that time, you had followed the hashtag #VoteLeave.

[259] 95% of the conversation and the tweets you see are all reasons to leave, but there's a small percentage that shows a different perspective and a different reasoning.

[260] We don't make it easy for anyone to do that.

[261] And that is a loss.

[262] Easy for anyone to follow the alternative perspective?

[263] Follow the hashtag, follow a topic, follow an interest.

[264] And because of that, we help build an echo chamber.

[265] Right.

[266] And something that doesn't really challenge any perspective.

[267] And not to say that we should force it upon people, but we don't even make it easy for people to do in the first place.

[268] The way you do that today is you go to the Explore tab. You search for a hashtag, or you tap into a hashtag, and you can see all the conversation.

[269] But that's work.

[270] And most people just won't do the work.

[271] They'll stay in their timeline.

[272] They'll see what they need to see.

[273] And I can certainly imagine, you know, why, if I'm just following a bunch of people who have the exact same take on this, it just continues to embolden and embolden, and they see nothing of a different perspective on the exact same situation.

[274] What's interesting to me is the difference between Twitter and Instagram.

[275] Essentially, it's not just the photographs.

[276] What's weird that has happened was there's shitty people on Instagram as well.

[277] I mean, there's a lot of arguments and things along those lines, but they don't overwhelm the initial post.

[278] Whereas with Twitter...

[279] It's on a different surface.

[280] Instagram is a post.

[281] I mean, it's a post.

[282] It's not really eliciting conversation.

[283] It's eliciting comments.

[284] It's difficult to follow the conversations.

[285] I don't think there is a conversation.

[286] Well, sometimes there is.

[287] Sometimes people are going back and forth about a particular subject that's discussed in the initial post.

[288] But it's not very clear.

[289] Yeah.

[290] Whereas with Twitter, it's only conversation.

[291] It's only conversation.

[292] But even if there's a photograph, even if somebody posts a photograph on Twitter and has conversation under it, the photograph seems to be like of secondary importance.

[293] Yeah.

[294] It's super fluid and super messy too.

[296] Yeah.

[297] The thing is every, on Instagram or any blog you have this post, this statement, and you have comments underneath.

[298] Whereas with Twitter, everything is on the same surface.

[299] Right.

[300] It's all one surface.

[301] Yeah, my friend Kurt Metzger likes that about Facebook.

[302] He says because in Twitter he goes, I post something and then all these fucking morons post something and he goes, you know, Kurt, he's very animated.

[303] He's like, and their shit looks just like my shit.

[304] It's all together all piled up.

[305] He goes, but if I post something on Facebook, he goes, I have this whole thing.

[306] Like, this is the original statement.

[307] And then underneath it, yeah, you fucking say whatever you want.

[308] But no one's, no one's reading that.

[309] Like they're reading the original initial post, and it's clear that there's a differentiation between the initial post and the secondary posts.

[310] Yeah, I, you know, there's room for both models. But this conversation, most conversations, it's not you making a statement and me just reacting to that.

[311] Right.

[312] Like our conversation evolves based on what we say.

[313] We can interrupt one another.

[314] We can, you know, we can completely change the subject.

[315] I can take control of the conversation and the people who might find that interesting follow it and the folks that don't just stop listening.

[316] Whereas you can't do that in a post comment model.

[317] Yeah, it's also, text is so limited.

There, it's, I mean, it's great for just getting out actual facts, but it's also thinking. It's just so close to thinking, like there's no composition, right? You know, and that, to me, is the most beautiful thing about Twitter, but also something that, you know, can be uncomfortable. Like, I can compose my life on Instagram. I can compose my thoughts within a Facebook post, and it can look so perfect. But the best of Twitter is just super raw, and it's right at the thinking process. And I just think that's so beautiful, because it gets to consciousness. It gets to something deeper.

How so? How is it different than a post on Instagram or a post on Facebook?

The speed demands, you know, the character constraint, the speed, kind of just demands a more conscious, present-focused thinking, versus, like, stepping back and composing a letter.

Yeah, composing a letter and thinking about all the outcomes. But oftentimes people do compose it as a letter, and they break it up into separate 280-character posts.

Yeah, the threads.

What was the thought process in going from 140 to 280? Because the one thing that I liked about 140 is you can't be verbose. You can't just ramble. And, you know, like, it's great for comics because it forces us to write jokes with economy of words.

Exactly. We found a lot of resonance with journalists because of headlines.

[319] We found a lot of resonance with comics because of the rhythm.

[320] And we found a lot of resonance with hip -hop as well because of the bars and just the structure and the constraint allowed that flow.

[321] The thinking was we looked at our, you know, languages around the world and there's some languages like German, 140 characters you can't really say much.

[322] Can't really say much at all.

[323] Right, because the words are so long.

[324] There are some languages like Japanese, where 140 characters is 140 words.
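The density point is easy to see with plain Unicode counting: Python's len() counts code points, so the same 140-character budget buys a few long German words but a whole Japanese sentence. The sample strings are my own; Twitter's real counting rules (URL shortening, the later weighting of CJK characters) are more involved than this sketch.

```python
# Illustrative only: a character limit counts Unicode code points,
# and the information carried per code point varies hugely by language.

LIMIT = 140

def fits(tweet, limit=LIMIT):
    # len() counts code points, not bytes, so CJK text isn't penalized
    return len(tweet) <= limit

german = "Die Geschwindigkeitsbegrenzung wurde überschritten"  # 4 words
japanese = "速度制限を超えた"  # roughly the same statement in 8 characters

print(len(german), fits(german))
print(len(japanese), fits(japanese))
```

The contrast motivates the observation that 140 characters felt cramped in German but spacious in Japanese.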

[325] And what was interesting about Japan was, Japan is one of our largest countries.

[326] We're bigger than Facebook there.

[327] Are you not bigger than Facebook in America?

[328] No. What the fuck?

[329] I don't even use Facebook.

[330] Sorry, Facebook.

[331] I don't either.

[332] I mean, I use it in terms of, if I post something on Instagram, it goes to Facebook. Yeah. But when I go to Facebook, it just seems like a lot of, well, this seems like Twitter too, a lot of arguing. But Twitter seems to be more fun, if that makes any sense, even though there's a lot of chaos. One of my favorite things is when someone posts something stupid, and then underneath it is a bunch of GIFs. Or JIFs? How do you say GIFs? Does anybody know? Ask him. Yes, GIFs. I say GIFs. A bunch of GIFs that are hilarious.

[333] Like, I was just mocking someone relentlessly.

[334] Like, that is one of my favorite things about Twitter.

[335] When someone, like, Donald Trump posts something ridiculous, and then I'll go, and I'll look at the responses, I'm like, Baa!

[336] I don't even care.

[337] It's a public conversation.

[338] You can see how everyone interacts.

[339] But, like, it's all, the interesting thing about Twitter is there's not one Twitter.

[340] It's, like, you have politics Twitter, which can be super toxic.

[341] You have sports Twitter.

[342] You have NBA Twitter, you have MMA Twitter, you have UFC Twitter, you have K -pop Twitter, you have E -Sports Twitter, you have Black Twitter.

[343] That's Jamie's.

[344] You have all these different Twitters.

[345] And you have a completely different experience based on what Twitter you follow and what Twitter you participate in.

[346] Some of them are like super engaging, super funny.

[347] Some of them are, you want to walk away from it.

[348] Yeah, I got to a certain point where I couldn't read replies anymore.

[349] I just, mm, it's just, not that it's that toxic, the vast majority of interactions I have with people are super positive.

[350] I mean, absolutely, like more than 99%.

[351] But it's, I didn't, I don't have time and I don't have time to be constantly responding to people.

[352] And it's just the sheer numbers. I think when I got around three-million-ish followers, I'm like, I can't do this anymore.

[353] It's just, it's overwhelming, like, I don't have the resources.

[354] Yeah, I'm a huge believer in serendipity, so you look at your replies once and you might see something that just, like, strikes you, and that's enough.

[355] You don't need to read through all of them.

[356] Yeah, sometimes, but then you might miss something groovy.

[357] You might, but I also believe the most important things come back up.

[358] What I used to do a lot was go through my mentions; I essentially used it as almost a news aggregator.

[359] I would go through my mentions and people would post cool stories, and then I would retweet those.

[360] And so because people knew that I would retweet them, they would send me a lot of cool stuff.

[362] So because of that, because of reciprocating, I got a lot of really cool stuff sent my way.

[363] Yeah, yeah, you're pushing more out, expanding the network.

[364] Yeah, and I reinforced it.

[365] I just, I wanted to thank people for posting cool stuff, and they love the fact they would get a retweet, and so they would send me, like, interesting science stories or, you know, very bizarre nature stories, and I'd just be retweeting them all the time.

[366] But then after a while, I'm like, hmm, this is a lot of time.

[367] It's a lot of time.

[368] So now, essentially, what I do is I just post something, and I just kind of, I just walk away.

[369] But that, I mean, that speaks to what we want to incentivize more.

[370] We want more people contributing things back to the network, back to the public conversation.

[371] And I know it doesn't feel like this today for most people, but my ideal is someone walks away from Twitter learning something and they're actually learning something entirely new.

[372] And it might be a new perspective.

[373] That happens a lot.

[374] probably happens more often than we...

[375] Depending on who you follow.

[376] Exactly.

[377] It's all dependent on the Twitter you follow.

[378] And like the, you know, the health Twitter is amazing.

[379] You know, I learned so much. Like, I followed Rhonda Patrick and a bunch of folks who are into sauna and Wim Hof and ice baths and Ben Greenfield.

[380] And you just follow them and you just get all this new information about alternative views of how to stay healthy, how to live longer, and I can't find that anywhere else in one place like that.

[381] And then it's not just them broadcasting.

[382] When they retweet something or when they tweet something, there's a whole conversation about it.

[383] So, you know, some people say, this has not been my experience, or this is not true for me, or actually, have you seen this connected thing?

[384] And I just go down this rabbit hole and I learn so much.

[385] But that's not the experience for everyone.

[386] No. Well, yeah, it's not the experience for everyone, and it's not really, I don't think it's what everyone wants either.

[387] Sometimes people just like to go on there and talk shit.

[388] I mean, there's someone that's trapped in a cubicle right now.

[389] And they just want to go on there and get in arguments about gun control or, you know, whether or not Nancy Pelosi's the devil.

[390] I mean, this is, this is what, you know, it serves a purpose for them.

[391] Yeah.

[392] The thing that gets strange, though, is who's to decide.

[393] You know, there's this, there's this concept, there's a discussion.

[394] I should say, where some people believe that things like Twitter or Facebook or any forum where you're having a public discussion should be considered almost like a public utility.

[395] Like anyone has access to the electric power.

[396] Even if you're, you know, even if you're a racist, you still can get electricity.

[397] And some people think that you should have that same ability with something like Twitter or the same ability with something like Instagram.

[398] Obviously, we're in uncharted territory, and you are in uncharted territory. Totally. No one has been there before. So who makes the distinctions? When you see someone that's saying something that you might think is offensive to some folks, but not offensive to the person who's saying it, maybe the person who's saying it feels like they need to express themselves and this is important to say. And how do you decide whether or not this is a valid discussion or if this is, air quotes, hate speech? Which is, you know, there are some things that are hate speech, and sometimes people use the term hate speech and it's just a cheap way to shut down a conversation. Yeah. So the simple answer is we look at conduct. We don't look at the speech itself. We look at conduct. We look at how the tool is being used. And you're right in that, like, I think when people see Twitter, they see it, and they expect it to be, a public square.

[399] They can go into that public square.

[400] They can say whatever they want.

[401] They can get on a pedestal.

[402] And people might gather around them and listen to what they have to say.

[403] Some of them might find it offensive and they leave.

[404] The difference is there's also this concept of this megaphone.

[405] And the megaphone can be highly targeted now with Twitter as well.

[406] So it's not the speech.

[407] It's how it's amplified.

[408] So what do you do?

[409] Let's say there's someone in the media.

[410] Let's say it's a prominent feminist.

[411] And then you have a bunch of people, or let's say just one person, and their Twitter feed is overwhelmingly attacking this prominent feminist, just constantly attacking her, calling her a liar, calling her this, calling her that.

[412] When do you decide this is harassment?

[413] When do you decide this is hate speech?

[414] When, like how do you, I mean, this is a, this is a fictional account, right?

[415] Fictional person we're talking about.

[416] but in this, for instance, what would dictate something that was egregious enough for you to eliminate them from your platform?

[417] Well, that's a heavy action, so that's the last resort.

[418] But we look at the conduct.

[419] We look at oftentimes, as you said, like the probability of someone who is harassing one person, it's highly probable that they're also harassing 10 more people.

[420] Right.

[421] So we can look at that behavior.

[422] We can look at how many times this person is being blocked or muted or reported.

[423] And based on all that data, we can actually take some action.

[424] But we also have to correlate it with the other side of that because people go on and they coordinate blocks as well.

[425] And they coordinate harassment.

[426] And they coordinate, I'm sorry, not harassment, but reporting.

[427] Reporting a particular account to get it shut down and to, uh, take the voice off the service.

[429] So these are the considerations we have to make. But it all starts with conduct, and oftentimes we'll see coordinated conduct, whether it be that one person opening multiple accounts or coordinating with multiple accounts that they don't own to, you know, go after someone.

[430] And there's a bunch of vectors.

[431] People use retweet for that, uh, the quote tweet for that a lot as well.

[432] Like, you know, they'll quote tweet a tweet that someone finds and they'll say, look at this idiot, Twitter, do your thing.

[433] And then just this mob starts and goes and tries to effectively shut that person down.

[434] So there's a bunch of tools we can use.

[435] The permanent suspension is the last resort.

[436] One of the things that we can do is we can downrank the replies.

[437] So any of these behaviors and conduct that look linked, we can actually push farther down in the reply chain. It's all still there, but you might have to push a button to actually see it; you might have to show more replies to actually see, uh, this harassing account or what might look like harassing language. And is this manually done? No, no, no, this is all automated. It's automated? Yeah. Yeah, but how would you know? A lot of the ranking, and looking at amplification and looking at the network, is automated, right, like in terms of down-ranking.

[438] Is there a discussion as to whether or not this person's reply should be down -ranked?

[439] Like, how do you figure that out?

[440] It's a machine learning and deep learning model.

[441] So it's AI.

[442] It's AI.

[443] And they learn, you know, and we look at, um, how these things are doing and where they make mistakes, and then we improve it.

[444] It's just constantly improving, constantly learning.
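The mechanism being described here, conduct signals feeding an automated ranker that reorders but never deletes, can be sketched roughly as follows. This is a toy illustration only: the signal names, weights, and linear formula are invented, since the real system is a learned ML/deep-learning model, not a hand-tuned score.

```python
from dataclasses import dataclass

@dataclass
class ReplySignals:
    # Hypothetical conduct signals; names and features are illustrative.
    blocks: int            # how often the author has been blocked
    mutes: int             # how often the author has been muted
    reports: int           # how often the author has been reported
    reply_velocity: float  # replies per hour aimed at a single account

def conduct_score(s: ReplySignals) -> float:
    # Toy linear stand-in for the learned model: higher means more likely
    # disruptive conduct. Weights here are made up for the example.
    return 1.5 * s.reports + 1.0 * s.blocks + 0.5 * s.mutes + 0.8 * s.reply_velocity

def rank_replies(replies):
    # Nothing is removed: likely-healthy replies surface first, and
    # high-score replies sit behind a "show more replies" click.
    return sorted(replies, key=lambda r: conduct_score(r[1]))

replies = [
    ("thoughtful reply", ReplySignals(blocks=0, mutes=1, reports=0, reply_velocity=0.2)),
    ("pile-on reply", ReplySignals(blocks=12, mutes=30, reports=25, reply_velocity=40.0)),
]
ordered = rank_replies(replies)
```

The key design point matches the conversation: the ranker only changes ordering; suspension or removal is a separate, human-involved step.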

[445] Does that feel like censorship to you, like automated censorship?

[446] Because I mean, who is to decide other than people?

[447] whether or not something is valid.

[448] Well, we're not looking at the speech in this particular case.

[449] We're looking at the conduct, like the conduct of like someone in fast velocity attacking someone else.

[450] Okay.

[451] So those are the things that our technology allows.

[452] It changes the velocity.

[453] It changes how, you know, to broadcast a message that someone didn't really ask for and didn't want to hear.

[454] We don't touch.

[455] If I follow Joe Rogan, you'll see every single tweet.

[456] We don't touch it.

[457] Right.

[458] Right.

[459] But that's an audience that you earn.

[460] But in your replies page, we have a little bit more room because this is a, this is a conversation that starts up.

[461] And some people just want to disrupt it.

[462] And all we're saying is we're going to look at moving the disruption down.

[463] It's not hidden; it's still there.

[464] But, you know, you just see it a little bit farther down.

[465] Like there was a, what was the instance with Ari?

[466] I should text him right now, get him to answer me in real time.

[467] But Ari Shafir got kicked off of Twitter because he said something to Bert.

[468] Like, Bert, I'm going to fucking kill you.

[469] Bert Kreischer being our good friend.

[470] All of us are good friends.

[471] And he's like, you fucking dummy, I'm going to kill you or something like that.

[472] He took his record albums.

[473] He said he was, like, stealing them and breaking them.

[474] He jokingly got mad.

[475] Right, right, right.

[476] That's Ari, though.

[477] Bert was pretend.

[478] I think it was all bullshit, right?

[479] I don't think Bert really stole his records.

[480] Yeah, he gave him back to him eventually.

[481] He's like, I'm going to fucking kill you.

[482] So what happened there, what probably happened there, and I'm not sure of the particular case, but what probably happened there is someone might have reported that tweet.

[483] One of our agents, human agents, without context of their friendship or that relationship saw it as a violent threat and took action on it.

[484] And those are the mistakes that we're going to make.

[485] That's why we need an appeals process.

[486] Or Bert needs to keep his fucking greasy hands off Ari's records, right?

[487] That's probably not going to happen.

[488] We need to make sure that we're reacting in the right way.

[490] And like, look, we're going to make mistakes.

[491] We're trying to. The problem with the system right now is most of the work, and the burden, is actually on the victims of abuse while they're getting harassed.

[492] So a lot of our system doesn't enforce or act unless these tweets are reported, right?

[493] So we don't take suspension actions or removal of content actions unless it's reported.

[494] The algorithms rank and order the conversation, but they don't take suspension actions.

[495] They don't remove content.

[496] They might suggest to a human to look at this who might look at our rules and look at the content and try to look at the context of the conversation and then take action.

[497] But we would like to move towards a lot more automated enforcement.

[498] but more importantly, how do we highlight, how do we amplify more of the healthier discussion conversation?

[499] Again, not removing it.

[500] We're going to a world, especially with technology like blockchain, where all content that is ever created will exist forever.

[501] You won't be able to take it down.

[502] You won't be able to censor it.

[503] It won't be centralized at all.

[504] Our role is around what we recommend based on your interest and based on who you follow and helping you to get into that on ramp.

[505] But if you look at the architecture, it's a given that any time something is created, it's going to exist forever.

[506] This is what blockchain helps enable down the line.

[507] And we need to make sure that we're paying attention to that.

[508] And also realizing that, you know, our role is, like, how do we get people the stuff that they really want to see and find valuable, that they'll learn from, that will make them think, that will help them evolve the conversation as well.

[509] Now, when you say amplify the messages that you deem to be more positive, right?

[510] Like, how do you decide that?

[511] People decide it.

[512] People decide it.

[513] People decide it based on, like, are they engaging in replies?

[514] Are they retweeting it?

[515] Are they liking it?

[516] Are they...

[517] But sometimes it's really negative.

[518] Like, sometimes the people that are engaging with it are engaging with it because they're attacking someone.

[519] So is that valuable or is it just unfortunate?

[520] It's valuable.

[521] I mean, every signal is something that we can learn from and we can act on.

[522] But it's going to constantly evolve.

[523] Right.

[524] These models that we have to build will constantly have to learn what the network is doing and how people are using it.

[526] And, you know, our goal is healthy contribution back to the public conversation.

[527] That is what we want.

[528] We want to encourage people into bigger, more informative, global conversations that they'll learn from.

[529] Are you, like, constantly aware of how much this is changing society, and that you are one of the four or five different modalities, whether it's Facebook or Instagram or any of these social media companies, that are radically changing society? It's radically changing the way people communicate with each other.

[530] Like there's a giant impact on the way human beings talk and see each other and the way we process ideas and the way we distribute information.

[531] It's unprecedented.

[532] There's never been anything like that before.

[533] And you setting up something that you think is going to be a group chat.

[534] Do you remember the early days when you would say like, at, Jack is going to the movies.

[535] You would say, like, that's how we would say it.

[536] I would say, at Joe Rogan is on his way to dinner.

[537] That's how people would do it.

[538] Status, yeah.

[539] It's fucking, it was weird that somewhere along the line that morphed.

[540] It morphed because that's what the world wanted to do with it.

[541] That's what they wanted to do with it.

[542] And I just think it's so reflective of what the world is and in some cases what the world wants to be.

[543] So it's a pathway for thinking, just a pathway for people to get their thoughts out, but really a powerful one, an unprecedented method of distributing information.

[545] There's really never been anything like this before.

[546] No, no. And it won't, this mode of communicating will not go away.

[547] It'll just get faster.

[548] It will become a lot more connected.

[549] And that's why our work is so critical, to figure out some of the dynamics at play that cause more negative outcomes and more positive outcomes.

[550] I think about it because well, I think about it because it's just a hugely significant thing.

[551] But I also think about it because of podcasts because podcasts are in a similar way.

[552] Just no one saw it coming and the people that are involved in it are like, what the fuck are we doing?

[553] like me. I'm like, what am I doing?

[554] Like, what is this?

[555] Yeah.

[556] Like, for me, it's like, ooh, boy, I get to talk to guys like Ben Greenfield and Jonathan Haidt and all these different people and learn some stuff.

[557] And I've clearly learned way more from doing this podcast than I ever would have learned without it.

[558] No doubt about it, unquestionably.

[559] But I didn't, I didn't fucking plan this.

[560] So now all of a sudden there's this signal that I'm sending out to millions and millions of people, and then people like, well, you have a responsibility.

[561] I'm like, oh, great.

[562] Well, I didn't want that.

[563] I didn't want a responsibility to what I distribute.

[564] I just wanted to be able to have a freak show.

[565] Just talk to people.

[566] Like, whatever, you know, like, there's certain people that I have on, whether it's Alex Jones or anyone that's controversial, where people will get fucking mad.

[567] Why are you giving this person a platform?

[568] I go, okay, hmm, I didn't think about it that way, and I don't think that's what I'm doing.

[569] I think I'm talking to people and letting people listen.

[571] Yeah.

[572] But it's giving that person a platform, because they're saying, well, no, they'll tone down, like Milo Yiannopoulos.

[573] That was one of the arguments people gave me. Like, he toned down his platform when he was on your show so he could get more people to pay attention to him.

[574] Like, okay, but that was one of the reasons why he was exposed, was my show, because he talked about how it's okay to have sex with underage boys if they're gay, because there's, like, a mentor relationship between the older gay man and the young boy. Like, what the fuck are you talking about? And that was a big part of why he's kind of been removed from the public conversation. That was one of the things. And then there's the discussion, like, well, what is removing someone from the public conversation? If someone is very popular and they have all these people that like to listen to them, what is the responsibility of these platforms, whether it's YouTube or Twitter or anyone? What is their responsibility to decide whether or not someone should or shouldn't be able to speak?

[575] And this is a thing that I've been struggling with, and I've been, I bounce around inside my own head, and I see that you guys struggle with it, and pretty much everyone does.

[576] YouTube does.

[577] And it is a hugely significant discussion that is left to a very, you know, relatively small number of people.

[578] And this is why there's this discussion of what social media is: is it something that everybody has a right to, or is it something that should be restricted to only people that are willing to behave and carry themselves in a certain way?

[579] I believe it's something that everyone has a right to.

[580] Everyone has a right to, but you still ban people.

[581] Like, say like Alex Jones, you guys were the last guys to keep Alex Jones on the platform.

[582] You were the last ones.

[583] And I believe you hung in there until he started harassing you personally, right?

[584] No, no, no, no, no. He came to your house, he banged... No. No. He, uh, you know, he did very different things on our platform versus the others.

[585] Oh, okay.

[586] So we, we saw this domino effect over a weekend of one platform, uh, banning him and then another, another, another in very, very quick succession.

[587] Right.

[588] And, you know, people, I think would have assumed that we would just have followed suit, but he didn't violate our terms of service.

[589] Right.

[590] And afterwards, he did.

[591] And we have, you know, we have a policy.

[592] And if, you know, there's a violation, we take enforcement actions.

[593] One might be asking the account holder to delete the tweet.

[594] Another might be a temporary suspension.

[595] Another might be a permanent suspension.

[596] So, like, let's use it in terms of, like, him saying that Sandy Hook was... He did not say that on the platform.

[597] He did not say that on Twitter.

[598] He only said that on his show.

[599] I don't know all the mediums he said it in.

[600] What did he do?

[601] But what we're looking at is a conduct and what he did on our platform.

[602] So what did he do on your platform that was like, that you all were in agreement that is enough?

[603] I'm not sure what the actual, like, you know, violations were.

[604] But we have a set number of actions, and if an account keeps violating the terms of service, ultimately it leads to permanent suspension. And when all the other platforms were taking him off, we didn't find those violations, and they weren't reported. But again, it goes back to a lot of our model: people weren't reporting a lot of the tweets that may have been in violation on our service, and we didn't act on them.
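The enforcement ladder described in this exchange, report, then human confirmation, then escalating actions up to permanent suspension, can be outlined in code. This is a hedged sketch: the escalation thresholds are invented for illustration and are not Twitter's actual policy values.

```python
from enum import Enum

class Action(Enum):
    NONE = "no action"
    DELETE_TWEET = "ask account holder to delete the tweet"
    TEMP_SUSPENSION = "temporary suspension"
    PERMANENT_SUSPENSION = "permanent suspension"

def next_action(prior_violations: int, reported: bool, human_confirmed: bool) -> Action:
    # Algorithms rank conversations but never suspend on their own:
    # enforcement fires only when a tweet is reported AND a human agent
    # confirms the violation in context. Thresholds here are hypothetical.
    if not (reported and human_confirmed):
        return Action.NONE
    if prior_violations == 0:
        return Action.DELETE_TWEET
    if prior_violations < 3:
        return Action.TEMP_SUSPENSION
    return Action.PERMANENT_SUSPENSION
```

Note how the first condition encodes the point made repeatedly above: unreported content, or content a human reviewer does not confirm as a violation, gets no enforcement action at all, only ranking.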

[605] Right.

[606] Like a good instance is what's going on with Patreon.

[607] I'm sure you're aware of the Sargon of Akkad thing.

[608] He did a podcast a long time ago, I believe six months or so ago, where he used the N -word and the way he used it is actually against white nationalists.

[609] And he also said a bunch of other stuff.

[610] And they decided, Patreon decided that what he said on a podcast was enough for them to remove him from the platform, even though he didn't do anything on their platform that was egregious.

[611] And also, they had previously stated that they were only judging things that occurred on their platform.

[612] There's been a giant blowback because of that, because people are saying, well, now you're essentially policing and not based on his actions, just on concepts and the communication that he was using, the way he was talking.

[613] You're eliminating him from being able to make a living and that you're doing this because he does not fit into your political paradigm.

[614] The way you want to view the world, he views the world differently.

[615] This is an opportunity for you to eliminate someone who you disagree with.

[616] Yeah.

[617] I mean, I don't know the nuances of their policy, but like we have to pay attention to folks who are using Twitter to shut down the voices of others.

[618] Right.

[619] That's where it gets weaponized.

[620] And we also have to pay attention to where people are using it that put other folks in physical danger.

[621] And that is where we need to be most severe.

[622] But otherwise, everyone has a right to these technologies.

[623] And I think they also have a right to make sure that they have a very simple and open read of the rules.

[624] And we're not in a great state there.

[625] Our rules and our enforcement can be extremely confusing to people.

[626] What has been the one thing that came up that was perhaps the most controversial?

[627] I know my friend Sam Harris was trying to get you guys to ban Donald Trump.

[628] He was saying, if you follow your terms of service.

[629] I just did a podcast with him, actually, as well.

[630] Should come out today or tomorrow.

[631] He's a fascinating guy, Sam Harris.

[632] I love him to death.

[633] But what he was trying to do was like saying, hey, he's threatening nuclear war.

[634] Like he's saying, hey, Korea, my bombs are bigger than your bombs.

[635] like what else does the guy have to do to get you to remove him from the platform?

[636] When you guys saw that, what was your reaction to that?

[637] Was there an internal discussion about actually banning the president of the United States?

[638] Well, so two things there.

[639] One, it was the context that presidents of this country have used similar language in different mediums.

[640] They use it on radio.

[641] They use it on television.

[642] it's not just through Twitter.

[643] And even if you were to look at the presidency of Obama, it wasn't exactly the same tone or the exact same language, but there were threats around the same country.

[644] And we have to take that context into consideration.

[645] So the second thing is that the most controversial aspect of our rules and our terms of service is probably this clause around public interest and newsworthiness, where powerful figures or public figures might be in violation of our terms of service, but the tweet itself is of public interest.

[647] Yes.

[648] There should be a conversation around it.

[649] And that is probably the thing that people disagree with the most and where we have a lot of internal debate.

[650] But we also have some pretty hard lines.

[651] If we had a global leader, including the President of the United States, make a violent threat against a private individual, we would take action.

[652] We always have to balance that with, like, is this something that the public has interest in?

[653] And I believe generally the answer is yes.

[654] It's not going to be in every case, but generally the answer is yes, because we should see how our leaders think and how they act.

[655] And essentially it informs voting.

[656] that informs the conversation, that informs whether we think they're doing the right job or we think that, you know, they should be voted out.

[657] Well, it's very important to see how someone uses that platform and when someone uses it the way he uses it and then becomes president and continues to use it that way.

[658] That's when people are like, what?

[659] He's been consistent.

[660] I think he joined in 2009, 2012.

[661] You look at all of his tweets all the way back then, and it's pretty consistent with that.

[662] Yeah.

[663] I mean, he likes to insult people on Twitter.

[664] It's fun for him.

[665] He does.

[666] It's just, I never thought he would keep doing it.

[667] I thought once he became president, maybe just lock it down, try to do a good job for the country.

[668] And then, you know, after four years or eight years, just go back to his old self.

[669] Fuck you.

[670] Fuck the world.

[671] Fuck this.

[672] But no, he's just, he's just, in one way, it's hilarious.

[673] See, as a comedian, I think it's awesome because it's just, it's so hilariously stupid.

[674] It's so preposterous that he even has the time to talk about Jeff Bezos's affair, and the fact that he got caught with the National Enquirer getting text messages, and calls him Jeff Bozo. Like, don't you have shit to do, man? But as a comedian, I am a gigantic fan of folly, almost against my better judgment. I like watching it. I like watching disasters.

[675] I like watching chaos.

[676] When I see nonsense like that, I'm like, oh, Jesus.

[677] I'm drawn like a moth to a flame.

[678] But in the other part of me is like, man, this sets a very bizarre tone for the entire country.

[679] Because one of the things about Obama, love Obama or hate Obama, is that he was very measured, very articulate, obviously very well educated.

[680] And I think that that aspect of his presidency was very good for all of us because he represented something that was of a very high standard in terms of his ability to communicate, his access to words, the way he measured his words and held himself.

[681] I think that's good for us.

[682] It's aspirational.

[683] Well, it's like, look at that guy.

[684] He talks better than me. That's why he's the president.

[685] But, you know, when they see Trump, it's like, he doesn't talk better than me. He doesn't use Twitter better.

[686] He doesn't.

[687] He doesn't use Twitter better.

[688] He's not, he's just this fucking madman.

[689] But isn't it important to understand that and to see it and like to it, hopefully that informs opinions and actions?

[690] 100%.

[691] That's my point.

[692] That's my point.

[693] It's like, this is this weird gray area where I think, overall, I definitely support your decision to not ban him for violating your terms of service.

[694] Like we need to know.

[695] Yeah.

[696] You know, and it's, how do you know how many accounts are bots?

[697] How do you know how many accounts are from a Russian troll farm?

[698] How do you know that?

[699] So this is a real challenge and something that we're trying to wrap our heads around.

[700] But one of the things we're trying to do is like let's scope the problem down a bit.

[701] Let's use the technology we have available to us like Face ID, like Touch ID, like the biometric stuff to identify the humans.

[702] Let's identify the humans first.

[703] So how do you use that?

[704] Because Face ID is not really available.

[705] Is it available to you guys?

[706] Do you just leak something that you shouldn't?

[707] No, no, no, no, we haven't used it yet, but you can use it for things like, is this a human operating this, and therefore it is a human?

[709] And technology always has to change.

[710] People will find ways around that and whatnot.

[711] But if we go the opposite direction and we look for the bots, the problem with looking for the bots is people assume that they just come through our API.

[712] But the scripting has become extremely sophisticated.

[713] People can script the app, can script the website and make it look very, very human.

[714] So we're going after this problem first, trying to identify the humans as much as we can, utilizing these technologies. None of this is live right now; these are considerations that we're making, and we're trying to understand what the impact would be and how we might evolve it. But we need to, because that information would provide context for someone: like, this is an actual human that I'm talking to, and I can invest more time in it, or I can just ignore the thing because it's meaningless. Now, is Apple willing to share that with you? I mean, when you're talking about biometrics, fingerprints, or Face ID?

[716] No, no, not the data.

[717] It's just the operating system verifies that there's an individual that's an individual and it's unlocking.

[718] Right.

[719] Like, you know, our Cash App uses this, right?

[720] So Square's Cash App, when you want to make a transfer to someone, when you want to send someone money, or when you want to buy Bitcoin, we turn on Face ID and you verify that you are you and you are the owner of the phone, and then it goes.

[721] We don't get images of your face.

[722] We don't see who you are.

[723] Oh, that's what you want me to think.

[724] I know what you said.

[725] That's all locked down by the operating system, and that's what it should be.

[726] Right, sure.
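The property being described, the operating system performs the biometric match locally and the app or server learns only a signed yes-or-no, never the face or fingerprint data, can be sketched with a signed claim. Everything here is a stand-in: the key handling, claim format, and function names are invented for illustration and are not Apple's or Square's actual protocol.

```python
import hashlib
import hmac

# Hypothetical per-device secret provisioned at enrollment (illustrative only).
DEVICE_KEY = b"per-device-secret-provisioned-at-enrollment"

def device_attest(user_id: str, verified: bool) -> bytes:
    # Runs on the device: after the OS biometric check, sign only the
    # verdict. No biometric data is ever included in the token.
    claim = f"{user_id}:{'ok' if verified else 'fail'}".encode()
    sig = hmac.new(DEVICE_KEY, claim, hashlib.sha256).hexdigest().encode()
    return claim + b"." + sig

def server_accepts(token: bytes) -> bool:
    # Runs on the server: verify the signature and the verdict.
    # The server learns a boolean, nothing about the user's face.
    claim, _, sig = token.rpartition(b".")
    expected = hmac.new(DEVICE_KEY, claim, hashlib.sha256).hexdigest().encode()
    return hmac.compare_digest(sig, expected) and claim.endswith(b":ok")
```

The design choice mirrors the conversation: the biometric never leaves the operating system's enclave; only an attestation of "this is the verified owner" crosses the boundary.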

[727] Has there ever been any consideration to not allowing people to post anonymously?

[728] Well.

[729] I like what you said earlier about journalists and whistleblowers; that is critical.

[730] So look at platforms that have a real names policy.

[731] Look at Facebook.

[732] Right.

[733] Are the problems any different?

[734] I don't know because I don't go there.

[735] But from what I understand, there's still a lot of arguing there.

[736] It's the same.

[737] A lot of political arguing.

[738] A lot of old people.

[739] The same vectors, same patterns.

[740] Aren't they older?

[741] It's like older in general.

[742] It seems, I also am not really hanging out there, but it seems a little bit older.

[743] It's a lot of grannies looking at pictures of their kids, grandkids, and stuff.

[744] That's what it was made for.

[745] It's connecting with the people that you know.

[746] And that, to me, is the biggest difference with Twitter.

[747] It's connecting with the people you don't know.

[748] And you find interesting, and, like, it's around topics and stuff that you find that you want to learn more about.

[749] When you saw Zuckerberg testifying, and realizing, like, how this platform is being used and what the dangers of this are, and then you see these senators that really don't know what the fuck the technology is.

[750] And then you see these senators that really don't know what the fuck the technology is.

[751] Yeah.

[752] It really highlights how we're entering into this... Yeah. Well, not just a gap, a gap in the critical understanding of how these things work and what they are, in terms of, like, how these really important politicians, who are the ones making these decisions as to whether or not someone has violated laws or whether or not something should be curbed or regulated, don't even understand what they're talking about.

[753] No, I mean, there's...

[754] Because so few people do.

[755] Because they're not using it directly.

[756] They're not using in the way that people are using it every single day and they don't have the same experience that people have every single day.

[757] And, you know, in terms of our regulators and our governments, I think the conversation is often about how regulators will come in and start writing rules and setting expectations for how companies or services might behave.

[758] But there's a role for the company to educate.

[759] And there's a role for the company to educate on like what technology makes possible, whether it be positive and also some of the negatives that become possible as well.

[760] So I think we have a role to help educate and to help make sure that we're, you know, really pushing towards what I think the job of a regulator is.

[761] Number one, protect the individual.

[762] Number two, level the playing field.

[763] And make sure that those two things are not compromised by special interest trying to protect their own domain or profits or dominance within a particular market.

[764] What do you mean by level the playing field?

[765] Level the playing field so that an individual has the same opportunity that someone else might have or a company might have.

[766] So, okay, so like anybody.

[767] Anyone can have a Twitter account and, you know, they have at least, you know, an equal opportunity to contribute to it.

[768] And whatever they do with it will change the outcome.

[769] Some people might become very popular because they're saying stuff people want to hear.

[770] Some people won't see any following whatsoever because they're not adding anything original or interesting or different in terms of perspective.

[771] Where do you see this going?

[772] When you look at these kinds of emerging technologies, not necessarily

[773] emerging anymore, established now, but still, you know, a new thing in relative terms of human history: where do you see this going? Does it get more intrusive? Does it get deeper into our lives? When you look at new technologies like augmented reality and things along those lines, do you see new possibilities and new things that make things even more complicated? I mean, yeah, we just have to assume that we naturally use more and more technologies, more and more things become open, more and more things increase their velocity.

[774] There's more communication, not less.

[775] Like, this is not going away.

[776] And it's just a question of what we do with it.

[777] So where I want it to go, and where I want Twitter specifically to go, is, you know, I think it's existential right now that we have global conversations about some things that

[778] will become crises, climate change being one of them.

[779] There's no one nation state that's going to solve that problem alone.

[780] Economic disparity being another, the rise of AI and job displacement, and just like us offloading decisions to these algorithms, those are things that no one nation, no one community is going to solve alone.

[781] It takes the entire world to do so.

[782] So I want to make sure that we're doing our best to get people seeing these global conversations and ideally participating in them because it helps us solve the problems faster.

[783] I just believe that more open society allows us to solve problems much faster.

[784] So you in many ways see Twitter as having some sort of a social responsibility in this discussion.

[785] Totally, totally, totally.

[786] Yeah, and I think a big part of it is, like, right now, how

[787] are we ensuring that there is more healthy contribution to that global conversation?

[788] And you know, I just think it's so critical that we start talking about the things that are facing all of us, not just one nation.

[789] I do think, you know, that that's where our current model really puts the world at a disadvantage because it incentivizes more of the echo chambers which lead to things like nationalism instead of taking the broader picture and looking at what's happening around the world to all people, to all of humanity.

[790] What do you do, though, to balance the conversation or what responsibility do you think you have to balance the conversation in terms of the way conservatives view it versus the way liberals and progressives view it?

[791] Balance it.

[792] I mean, is there a responsibility?

[793] Do you have a responsibility?

[794] Or is it just leave it up to the people and let them figure it out, the same way they figured out hashtags and everything else?

[795] I think we have a responsibility to make it easier to do that.

[796] Easier.

[797] How so?

[798] Right now it's just too hard.

[799] Most people will not

[800] venture outside of a particular mindset.

[801] They will not venture out.

[802] They will not break their bubble.

[803] But right now, on the service, it's just so hard to do that.

[804] I can only follow accounts.

[805] And I have to look into, and just imagine like, you know, trying to get an understanding of your own politics.

[806] People can't just look at your bio.

[807] They have to look through all your tweets.

[808] They have to listen to a bunch of your podcasts and whatnot.

[809] And that's a bunch of work.

[810] If we shift it more towards topics and interests, at least we have the potential to see a bunch more perspectives.

[811] How do you do that, though?

[812] The simplest thing is like, follow a hashtag.

[813] Follow a topic.

[814] Like, why can't you just follow Warriors' Twitter or, you know, NBA Twitter, why do you have to go and find all the coaches and the players and the team?

[815] We can do that.

[816] We can help make that a whole lot easier for folks.

[817] So there's something like Brexit or something like that.

[818] So if you go to hashtag Brexit, you're going to get the whole conversation.

[819] You're going to get the pros, the cons, the left, the right, the whole deal, the centrists.

[820] You're going to get everybody versus following the people that you already follow that agree with what you think.

[821] The probability is higher that you'll get more.

[822] You'll get more variety of perspective.

[823] And not even, don't even follow Brexit.

[824] Follow vote leave if you want to leave.

[825] But within that topic, there might be some dissenting opinions.

[826] And you get to choose whether those inform you, whether they embolden your position or not.

[827] And again, I'm not saying that we should force it upon people, but it's not easy to even do that today.

[828] Right.

[829] The only tool we give you is finding and following the accounts.

[830] But people search hashtags.

[831] They do search hashtags, right?

[832] They don't.

[833] They stay in their timeline.

[834] I mean, a small percentage.

[835] of people, the people that really know Twitter know how to do that.

[836] But most people, they follow an account and they stay in their timeline.

[837] And their world is their timeline.

[838] Hashtags can be corrupted too.

[839] I mean, people.

[840] They can be gamed.

[841] Yeah.

[842] And taken over.

[843] I took over hashtag vegan cat.

[844] Go to hashtag vegan cat now.

[845] What is that?

[846] That's a wrap.

[847] That's mine now.

[848] I had a bit in my last Netflix special about a woman who said a bunch of horrible things to me, because I put a picture up on Instagram of some deer, and I wrote, "This is some meat from a deer that liked to kick babies and was about to join ISIS," and then I wrote hashtag vegan. Which was a mistake, right, to write hashtag vegan. But the hashtag vegan people went fucking crazy and came after me, because I entered into their timeline with meat. Have you gone to hashtag vegan cat? It's all either pictures of... see, like, it says Joe Rogan. Write that out. Thank you, I haven't laughed that hard in a while. Hashtag vegan cat is people that are feeding their fucking cat vegan food, and they're all dying.

[849] And in the special, I say, every cat looks like it's living in a house with a gas leak.

[850] Like, they're all like laying down like, where the fuck is the real food?

[851] But this is real.

[852] Generates a conversation.

[853] But the thing is, if someone does something like that, like you can, like, pick a person, you know, whatever that person is, whatever they're doing.

[854] If they have a hashtag that they utilize all the time for their movement or whatever, someone could mock

[855] them and then use that, and if it's a public figure or someone who's got a prominent voice, then all of a sudden people just take that hashtag over and start mocking them with it.

[856] But it has to be done en masse. I mean, it has to be coordinated.

[857] And sometimes people figure out how to game the system, coordinate it, and amplify that message in an unfair way.

[858] And that's what our systems are trying to recognize.

[859] How sick did that cat look?

[860] The poor fucking cat.

[861] That was a real one.

[862] That was a real hashtag vegan cat.

[863] Wow.

[864] Those poor bastards.

[865] So when you look back at emerging social media, like we go all the way back to MySpace, right?

[866] MySpace, you got Tom.

[867] Tom was sitting there in your top eight, and, you know, people would like post music that they liked, and it was never political.

[868] It was very often very surface, and for comics it was a great way to promote shows, and it was an interesting way to see things.

[869] But it was, it was like the seed that became Twitter or Facebook or any of these others.

[870] That was one of them.

[871] I think we, at least for us, like, we got more of our roots from AOL Instant Messenger and ICQ.

[872] Mm, ICQ. Because, you know, you remember the status message, where you said, like, I'm in a meeting, or I'm listening to this music, or I'm watching a movie right now?

[873] Yeah.

[874] That was the inspiration.

[875] Oh.

[876] And what we took from that was being able to, like, if you could

[877] do that from anywhere, not bound to a desk, but you could do that from anywhere, and you could do it from your phone, and you could just be roaming around and say, you know, I'm at Joe Rogan's studio right now.

[878] That is cool.

[879] I don't need my computer.

[880] I'm not bound to this, and chain to this desk.

[881] I can, I can do it from anywhere.

[882] And then the other aspect of Instant Messenger was, of course, chat.

[883] So one of the things that the status would do is you might, you might say, like, I'm listening to Kendrick Lamar right now, and I might hit you up on chat and say, like, what do you what do you think of the new album?

[884] But now it's all public.

[885] So it's just everyone can see it.

[886] That's the biggest difference.

[887] And that's, that to me is what Twitter is.

[888] MySpace was, it was profiles.

[889] And, you know, people organized around these profiles and this network that developed between people.

[890] And that is Facebook.

[891] Facebook optimized the hell out of that.

[892] And they scaled the world.

[893] We were something very different.

[894] We started with that simple status, and then people wanted to talk about it.

[895] We decided that it should be on the same surface.

[896] It shouldn't be subservient to the status.

[897] It should be part of that flow, and that's what makes Twitter, you know, so fluid.

[898] Now, when you look at this sort of metamorphosis, this evolution between those initial social media, whether it's AOL Instant Messenger, that eventually became, like, ICQ... and what was that one that gamers would use? It was like a live-stream sort of message board. No? For a while people were using TeamSpeak, but I don't know, that's more recent. It wasn't TeamSpeak. It was like you would go there and share files and stuff. I used to play a lot of online video games, and we'd play on teams, and we'd have teams go and play other teams, and we would use this sort of... it wasn't a message board, because it was all in real time. What the fuck was it called?

[899] OnLive?

[900] No. All right, forget it.

[901] Anyway, guys would go there and you could send people files through it and teams would go and meet and it would be a chat, like an online chat that would be in real time.

[902] Yeah, I mean, we have a lot of our roots in AOL and such, but also, like, IRC, Internet Relay Chat, and Usenet, which were, you know, these old internet '70s technologies.

[903] IRC is what I was talking about.

[904] IRC?

[905] Yeah.

[906] Yeah, okay, okay.

[907] So Internet Relay Chat is like this giant chat room that anyone can join, arranged around topics, and what's interesting about that is you can see people typing, you see it occurring in real time.

[908] You see it popping up in real time.

[909] Yep.

[910] You know, just I wonder like what is the next evolution of this.

[911] Because no one saw anything going from ICQ to Twitter.

[912] No one saw anything going from that to Instagram and to where we're at right now where it really does flavor the conversation of our entire culture.

[913] I mean, before it was just a thing that was happening that was happening on people's computers.

[914] Now it's a thing that's happening on people's computers and now phones and now your whole life.

[915] It's a very different influence.

[916] And I wonder, because everything does accelerate, Things constantly move forward and become more and more and more integrated into our life experience.

[917] And I wonder what is the next stage of this?

[918] I mean, you look at the secular trends, and you look at technologies like blockchain, for instance.

[919] And I think, you know, we're moving to a world where anything created exists forever,

[920] that there's no centralized control over who sees what.

[921] That these models become completely decentralized and all these barriers that exist today aren't as important anymore.

[922] What do you think of something like Gab?

[923] Gab seems to be a response to the fact that some people are getting banned from other platforms.

[924] and they're just allowing anybody to come on and say anything they want.

[925] The downside of that is, of course, the most horrible people are going to be able to say anything they want with no repercussions.

[926] The good side is anybody can say whatever they want.

[927] Yeah.

[928] I haven't studied them too much, but I do know that they have taken action on accounts as well.

[929] They have suspended accounts, and they have terms of service as well.

[930] What have they suspended accounts for?

[931] I don't know.

[932] It's probably conduct related.

[933] It might be doxing, you know.

[934] Probably, right?

[935] But it's just a question of like, you know, the rules and if you agree to the rules, then, you know, do you sign up for the service?

[936] And if not, there will be other services.

[937] But like I, you look at the trends and I think, you know, certainly things become a lot more public.

[938] Certainly things become a lot more open.

[939] Certainly the barriers and the boundaries that we have in place today become less meaningful.

[940] And, I think there's a lot of positives in that.

[941] And I also think there's a lot of danger that we need to be mindful of.

[942] Now, you as a CEO, as a guy who's running this thing, what has the experience been like for you?

[943] Because I've got to imagine that it wasn't anything that you predicted.

[944] No one predicted Twitter, right?

[945] So to all of a sudden have this responsibility.

[946] Twitter changed everything.

[947] And you're a young guy.

[948] How old are you?

[949] 42.

[950] That's young.

[951] To be in control of that much, like, what, and to have it over the time of, what has it been, 11 years?

[952] 13 years, we'll be 13 in March, yeah.

[953] Yeah, so you were really fucking young.

[954] Yeah.

[955] Like, what has that been like for you?

[956] It's been, uh, it's been both beautiful and scary and uncomfortable and learning.

[957] It's just been a ton of learning and evolving.

[958] And, like, it shows me every single day where I need to push myself and what I don't know.

[959] And I think a big part is like just the realization that we're not going to be able to do this alone.

[960] And I don't think we have to either.

[961] That's what these technologies continue to allow. If we have to have all the answers around enforcement or policy or whatnot, we're not going to serve the world.

[962] We have aspirations to serve every single person on the planet.

[963] And we have aspirations to, you know, be the first consideration for the global public conversation.

[964] And, you know, if we're the bottleneck for all this, we're not going to reach those aspirations.

[965] So it's just thinking deeply about how we might distribute more of this work and decentralize more of it and look at, look at, you know, the platform itself and, like, what we need to change to reach that reality.

[966] And I think we've got to look really deep and foundational.

[967] It goes back to, you know, your question on 140.

[968] One of the things that we saw was, you know, we shifted to 280 characters, and that, you know, this 140 characters is so sacred.

[969] You know, it became this cultural thing, and I was in love with it, and so many people are in love with it.

[970] But one of the things we noticed as we moved to 280 is that the vast majority of tweets that are broadcasts don't go above 140, even with that limitation raised.

[971] But where they do go above 140 is in replies.

[972] When people reply, they tend to go over the 140 character limit and even bump up into the 280 limit.

[973] And what we've seen it allow is just more nuance in the conversation and allows people to give more context and kind of just get their experience on the table a bit more, whereas 140 did not allow that.

[974] So we have seen that increase the health of those conversations and the discussion.

[975] So it's stuff like that that we need to question and not hold so sacred.

[976] Is there any consideration to expanding it further?

[977] Not right now.

[978] How about a million characters?

[979] No?

[980] Well, we don't have edit tweets right now.

[981] Do you think that that's good or bad?

[982] Well, if you can't edit 140 characters, you're going to be really pissed off if you write a million characters and can't edit those.

[983] You know what I would like?

[984] I would like edit, the ability to edit, like if you make a typo or something like that, but also the ability for people to see the original.

[985] Yeah.

[986] Like edit, but see the original.

[987] Like, say...

[988] We're looking at exactly that.

[989] Oh, really good.

[990] We're looking at exactly that.

[991] The reason we don't have it in the first place is we were born on SMS.

[992] We were born on text messaging.

[993] When you send a text, you can't take it back.

[994] Right.

[995] So when you send a tweet, it goes to the world instantaneously.

[996] Right.

[997] You can't take it back.

[998] So when we have to...

[999] But doesn't that exist anyway?

[1000] I mean, no matter what, if you send someone something, even if you're on Instagram, people are going to know the original.

[1001] Yeah, they screenshot it and they, you know, they do their thing.

[1002] But, like, you could build it such that, you know, maybe we introduce a five-second to 30-second delay in the sending.

[1003] And within that window, you can edit.

[1004] I'm going to need more time than that, dude.

[1005] If I fuck something up, like, someone has to tell me. Hey, man, you misspelled that word.

[1006] Ah, shit.

[1007] Did I?

[1008] God damn it.

[1009] Like, sometimes autocorrect gets you?

[1010] Totally.

[1011] But the issue with going longer than that, it takes that real -time nature and the conversational flow out of it.

[1012] So then we're delaying these tweets. And, like, when you're watching UFC, or you're watching, like, Warriors basketball, a lot of the great Twitter is just, like, in the moment, just, like, you know, the roar of the crowd.

[1013] It's like, you know, looking across at someone you're in this virtual stadium with and just saying, like, oh, my God, that shot, can you believe it?

[1014] But isn't clarity more important?

[1015] Because you're not going to give up.

[1016] It depends on the context.

[1017] Yeah.

[1018] You're still going to have the ability to communicate quickly.

[1019] Yeah.

[1020] But you also have the ability to clarify.

[1021] That's where we need to really pay attention, because if you're in the context of an NBA game, you want to be fast, you just want to be in the moment, you know, you want to be raw. But if you're in the context of considering what the president just did, or making a particular statement, then you probably need some more time, and we can be dynamic there. What's interesting to me is how few people use video. Like, I thought when you guys added video on Twitter, I'm like, wow, a bunch of people are going to be making videos and putting those up on Twitter, and it's not that often. It depends on who you follow. It's huge for some aspects of Twitter, it's less so in others. But what aspects is it huge for? A lot of sports. I mean, we see a lot of, like, just the replays and the recaps. Oh, right, for sure. Yeah, you know, aspects of a particular shot that people want to comment on. I don't, like, I think it's dangerous for us to focus too much on the medium, whether it be images or GIFs or video. It's more about the conversation around it.

[1022] Like, that's what we want to optimize for.

[1023] It's definitely very popular for sports and folly, and, you know, there's a bunch of animal attack videos.

[1024] Yeah, nature's metal is a good one and hold my beer.

[1025] That's another good one.

[1026] I mean, it's all videos.

[1027] But what I meant was people making a video, talking about something.

[1028] This is what I was trying to say.

[1029] What I think is this and that and blah, blah, blah.

[1030] You don't see a lot of that, and I thought maybe that would be something that people would adopt more.

[1031] It's a different speed.

[1032] You know, I think, like, the consumption of video, I mean, you see this in the technology right now.

[1033] Like, people are subtitling every single video because people might be in an environment where they can't turn the audio on.

[1034] Or, like, a video, like, I have to scrub through to see what's interesting.

[1035] Right.

[1036] With text, I can just see it.

[1037] And there's, it allows for a lot of serendipity to find something that I probably wouldn't have seen unless I watch the whole damn video.

[1038] So, like, the ability to clip something, the ability to, like, index in, I think is really critical.

[1039] So it's not, to me, it's not about the format.

[1040] It's about the use case and the context that you're in.

[1041] Now, going back to the responsibility that you guys have, and you in particular: when this became what it is now, and when it became evident that it was this gigantic way of changing the way human beings communicate with each other, was there ever any regret, or was there ever a moment where you're like, what the fuck have I gotten myself into?

[1042] I mean, there's, I'm always reflective of where I am and what I'm doing.

[1043] I think the biggest has been twofold.

[1044] One, how the dynamics of the service allow it to be weaponized in order to silence someone else,

[1045] or to drive them off the service entirely, which goes against the entire concept of free speech and free participation.

[1046] Like we just can't stand for that.

[1047] We need to make sure that everyone feels that they have an opportunity at a voice.

[1048] And when you have these coordinated attacks, it's not fair.

[1049] A second is around, you know, this concept of an echo chamber and a filter bubble.

[1050] I just don't feel personally good about that.

[1051] I don't feel that we thought that through enough in the early days.

[1052] I think we should have moved towards biasing the service towards topics and interests much, much sooner than we're now considering doing.

[1053] Now, when you have these considerations, when you take these actions, do you consult with psychologists or historians or people that try to put in perspective for you what the ramifications of each individual move would be?

[1054] I try to read as much as possible.

[1055] I try to talk to as many people as possible, just get a completely different perspective.

[1056] Is there any internal disagreement about actions that you take?

[1057] Oh, yeah.

[1058] There's always debate.

[1059] There's always debate.

[1060] But I think my role is to ask questions and make sure, like, what is our goal here?

[1061] What are we trying to do?

[1062] And that evolves.

[1063] That evolves.

[1064] That evolves.

[1065] And is this over the long term going to be a net positive?

[1066] for all humans, all humanity, like how do we balance the considerations of, of, you know, how we serve everyone?

[1067] And like, how do we get down to something, how do we get down to a fundamental answer and a central answer?

[1068] And that, to me, is where the real truth is, is when you can get to something foundational.

[1069] But I like, you know, I like having conversations with as many people from as many different fields as possible in getting the perspective on it.

[1070] So I ask questions all the time.

[1071] It's interesting the way you're phrasing this, too, that you are looking at this as a method to save or to help people, to serve people.

[1072] You're looking at this as a way that you can benefit society.

[1073] The society can benefit from your platform, can benefit from this ability to communicate.

[1074] You're not just looking at it as a tech company that has to remain profitable.

[1075] And that's one of the more interesting things about tech companies to me. I mean, there's been a lot of criticism, maybe justified in some ways, that tech companies all lean left.

[1076] But what is interesting to me is: name another corporation that willingly, of its own choice, takes that into consideration, that they want to serve the world and serve culture in a beneficial way, regardless of profit.

[1077] I mean, because you're not really selling anything, right?

[1078] You guys have a platform.

[1079] Obviously, it's financially viable, but you're not selling things.

[1080] Well, I mean, we do. Our model is based off people's attention.

[1081] Yeah.

[1082] And they're paying us with their attention.

[1083] And that's extremely valuable in something that we need to really, really honor.

[1084] But I agree with you.

[1085] I mean, like, look at Tesla.

[1086] You know, I just listened to the recent earnings call, and one of the things that Elon said was, look, there are two reasons for Tesla.

[1087] Number one is to advance, you know, different sources of energy and more renewable sources of energy because it's a fundamental and existential crisis that's facing all humanity.

[1088] And number two is to advance autonomy because it'll save lives and give people time back.

[1089] And, you know, then he started talking about how to make that possible.

[1090] And that's where, you know, our business comes in.

[1091] How do we make that possible?

[1092] And we have a great business.

[1093] We need to improve a bunch of it.

[1094] But it serves what we think are larger purposes, which is serving the public conversation.

[1095] We want to see more global public conversations.

[1096] We want our technology to be used to make the world feel

[1097] a lot smaller, to help see what common problems we have before us and, ideally, you know, how we can get people together to solve them faster and solve them better.

[1098] You also seem to be embracing this responsibility that you're helping to evolve culture.

[1099] And this is part of providing this method of communication: it's helping to evolve culture.

[1100] And this is this is something that is really only applicable to tech companies in some strange way.

[1101] And it's weird that so many of them share this.

[1102] Like, I was personally a little weirded out when Google took out "don't be evil."

[1103] Like, that was a big part of their operating model.

[1104] Did they take that out?

[1105] Yes.

[1106] Yes.

[1107] Right?

[1108] Make sure, because I don't want to get sued.

[1109] I'm pretty sure they removed that.

[1110] From, what would you call that, their operational directive?

[1111] Like, what is?

[1112] It was in the code of conduct.

[1113] Code of conduct.

[1114] And it's not there anymore, right?

[1115] They removed it.

[1116] So it says, yeah, it's an article that says they removed the clause.

[1117] And this is kind of a weird thing to tell people not to be evil.

[1118] It's weirder to take it out once you've already said it.

[1119] It's way weird to say, ah, fuck it.

[1120] We were wrong.

[1121] There's another way of saying that.

[1122] They changed it to do the right thing.

[1123] Yeah.

[1124] Oh, well, what

[1125] the fuck does "do the right thing" mean? Do the right thing so you can make more money? You know, like, hey, we want to make money, we'll do the right thing if it makes more money. Yeah, I mean, that's why this openness is so critical. That's why, to me, the public conversation is so important: we can talk about stuff like that. And there will be companies forming today that look at objectives and mandates like that and base their whole culture around it. And is that the right idea? Well, I don't know, but if we're not talking about it, we won't be able to answer that question.

[1126] What's also interesting because Google is so all -encompassing, right?

[1127] You have Gmail.

[1128] You have Android.

[1129] I mean, they are the number one operating system for mobile phones in the world on top of being a search engine.

[1130] There's so much involved in that company.

[1131] And again, like almost all tech companies, they heavily lean left.

[1132] And because they had that "don't be evil" as part of their code of conduct, it seemed like something that was a good idea to have, and it sort of defined what I was talking about, that tech companies are uniquely progressive. Yeah, I mean, I don't know what makes that. I think, no matter what, the internet allows for a very healthy skepticism of nearly everything.

[1133] Yeah.

[1134] I'm from Missouri.

[1135] It's the Show-Me State.

[1136] Are you really from Missouri?

[1137] Yeah, I'm from St. Louis, Missouri.

[1138] We're all skeptics.

[1139] My mom was a Democrat.

[1140] My dad was a Republican.

[1141] My dad listened to Rush Limbaugh and Hannity all the time.

[1142] I found myself somewhere in the middle.

[1143] But one of the things I appreciated, we had a ton of fights and arguments and yelling matches around the kitchen table.

[1144] But, like, I appreciate the fact that we could have them, and I felt safe to do so, and I didn't feel like... I mean, obviously they're my parents, but they weren't judging me because of what I said. And they didn't force you to be a Republican or a Democrat? They didn't force me to think a particular way. I think they were good at least at showing different perspectives, even in, you know, this union that they have. And, I don't know, it developed a skepticism in me that I think is healthy.

[1145] And I have a lot of skepticism of companies like ours and leaders like me. I think that's right.

[1146] I think that's right and people should.

[1147] And, I mean, I was formed through a lot of the ideals of the internet, you know.

[1148] I just fell in love with what it made possible.

[1149] And I never, ever want to run afoul of those ideals and, you know, the removal of barriers and boundaries and the connection that we have because of it.

[1150] And I, you know, I think often and reflect often about my role and the centralization of my role and of our company.

[1151] And I want to figure out, and help figure

[1152] out, like, how we can continue to add massive value and be an amazing business, which is us and will always be us, but at the same time be a participatory force in this greater good that the internet has really started.

[1153] And it's not led by any one individual or any one company, and that's the beauty of it.

[1154] And I want to make sure that we find our

[1155] place in that, and we can also contribute massively to it. And I think we can. It's just going to take a lot of work, a lot of introspection, and a lot of experimentation, a lot of making mistakes and failures too. Well, and it's very encouraging that you have that attitude, because, you know, a lot of people, I think, in a similar situation would try to control the narrative. They would try to reinforce their own particular perspective on things and try to get other people to adopt it, or try to push it.

[1156] And I think it is, it's very important to just have this open discussion.

[1157] And I think it's very important to review your own thoughts and ideas.

[1158] And one of the best ways to do so...

[1159] Put it out there.

[1160] Yeah, put it out there.

[1161] Put it out there and have other people review.

[1162] Peer review is a great process.

[1163] And that's flavored the way this podcast has evolved more than probably anything.

[1164] Yeah.

[1165] Oh, that's the thing.

[1166] I mean, you did this because you want to learn from people.

[1167] And the platform that you've created, millions get to learn from

[1168] it as well.

[1169] And that's just so amazing.

[1170] Like I learn from your podcast all the time.

[1171] And that's what technology makes possible.

[1172] But with that power also comes ramifications.

[1173] And if we're not talking about the ramifications and like at least being open about what we know and what we don't know.

[1174] And I think we state and post a lot more of what we know rather than what we don't know.

[1175] And that is so interesting.

[1176] Why don't you guys steal "don't be evil"?

[1177] Put that in your own shit.

[1178] Fuck you, Google.

[1179] I don't know if that's going to help anything.

[1180] What is that telling our employees to do?

[1181] Don't be evil.

[1182] It's real simple.

[1183] Don't be a dick.

[1184] Yeah, I mean, how do we get deeper and, just like, you know, see more conversation around what is, quote-unquote, evil?

[1185] Have you guys considered expanding your influence in other venues?

[1186] Like, you know, Google started off as a search engine, and now it's fucking everything.

[1187] Have you guys considered doing something similar?

[1188] I think we probably did too much of that early on, and that's what led to a bunch of issues from a corporate standpoint.

[1189] We're just trying to do too much.

[1190] Like what?

[1191] I don't know.

[1192] We were trying to be everything to everyone.

[1193] And, like, you know, we had a video thing, and we were looking at gaming stuff and messaging.

[1194] And it lost focus of what we are good at.

[1195] What we're good at is conversation.

[1196] and what we're good at is public conversation.

[1197] So we now have, as a company, we have just such an amazing focus on what that means and how that evolves.

[1198] And there's just, there's some really cool things that we can do there.

[1199] Like, we have this, we have this app called Periscope.

[1200] And one of the things that we're discovering is like a lot of people are using it to podcast.

[1201] A lot of people are using it to share their thoughts and these people come in and, you know, they chat and have a conversation.

[1202] And one of the things we did recently is we allowed the audio to play in the background.

[1203] It's super simple, but what we found was that people didn't necessarily want to watch the video of people talking.

[1204] They just want to hear what they're saying.

[1205] And that just opened the door for more types of use cases.

[1206] And there's some really exciting things coming out with Periscope that I think add a new dimension to what conversation looks like and how it is

[1207] experienced and how it evolves.

[1208] And those are the things I get really excited about.

[1209] It's like how can we make conversation better?

[1210] And how do we make it feel more live?

[1211] How do we make it feel more electric?

[1212] And how do we bring new technology into it that just opens a door for an entirely new way

[1213] of talking?

[1214] And that's the thing that I think has been most educational to me about Twitter. As we talked about, we started with this idea of sharing what was happening around you, and then people told us what they wanted it to be.

[1215] And it became this conversational medium.

[1216] It became this interest network.

[1217] And it became a thing that was entirely new.

[1218] And it, you know, we observed it.

[1219] And we learned more and more of what it wanted to be.

[1220] And as we get deeper and deeper into that, we're going to be surprised by some of the technologies that we thought would be used in this way.

[1221] But it turns out that the massive use case and the resonant use case and the fundamental use case is going to be created right before our eyes by the people using it.

[1222] Now, did you guys acquire Periscope?

[1223] We acquired Periscope.

[1224] And what was the thought process when you were acquiring it?

[1225] We like the live nature of it.

[1226] We like the broadcast aspect.

[1227] Why keep it as Periscope?

[1228] Why not have it be like Twitter live?

[1229] There's a specific community on Periscope.

[1230] And I think it's interesting from an experimentation standpoint.

[1231] We can play with ideas there.

[1232] It's a smaller playground.

[1233] Scott Adams, I think, uses it better than anybody.

[1234] Yeah, he's really good at it.

[1235] And he's one that I think has figured out, you know, just the nuance

[1236] behind it. You know, he starts every one of them with this simultaneous sip of coffee. Yeah, he gets his listeners and his viewers engaged right away, and then he just goes on. And then every now and then he'll, you know, look at the comments and riff off them. So it lets people build up, too. Like, he'll announce that he's going on and then wait a little while, say hello to some people. Yep. Then once a bunch of people are in the room, then he starts talking.

[1237] Yeah.

[1238] And I just, I find, uh, I find that so interesting because that is the future of conversation.

[1239] It's looking at the patterns.

[1240] It's looking at what people are trying to do with the thing.

[1241] And then you build technology around it.

[1242] And that becomes, that becomes the next big thing.

[1243] And we just have to hone our power of observation, hone our power of, like, connecting the dots and looking at all the patterns and what people are doing.

[1244] What's the question behind the question?

[1245] What's, what's the statement behind the statement that they're making?

[1246] And if we can get good at understanding some of those fundamental, essential things, then we've at least created the probability that most people in the world will find it useful and find it valuable.

[1247] Joey Diaz is the other person that uses Periscope better than anybody alive, but he just gets high.

[1248] He just gets baked.

[1249] He just gives you a morning, what does he call it?

[1250] Morning bong hit?

[1251] The morning joint.

[1252] The morning joint.

[1253] Yeah, he'll smoke a joint or smoke a bowl in the morning and then just sort of let everybody know what's going on in his mind. Doing something in sync with more people is interesting. Like, we've had some folks who are interested in doing meditations through Periscope. Oh, that's a great idea. I mean, if you look at the surface level, you can't imagine anything more boring than, like, watching someone meditate. But if you're actually meditating with them, there's something powerful about it. And, like, what can we do to improve that experience? What about people using it for group workouts?

[1254] Is anyone doing that?

[1255] I'm sure it's happening.

[1256] I haven't seen it personally, but I'm sure it's happening.

[1257] How much more people do you have on Twitter than Periscope?

[1258] A lot.

[1259] A lot.

[1260] A lot.

[1261] It's one of those things that I personally just have a lot of conviction around, and I have a lot of belief in the format.

[1262] And I, you know, every now and then we don't have instant hits.

[1263] It just requires a lot of patience.

[1264] And we need to really learn what it wants to be.

[1265] and sometimes that takes time.

[1266] And, you know, I think oftentimes, and I've certainly done this, we shut down things a little bit too early.

[1267] We did this at Square.

[1268] Like, we had this amazing technology and app I love called Square Wallet.

[1269] And it allowed you to, you know, you link your credit card and you have all these merchants around you here in L .A. And you could walk up to a coffee merchant.

and as you walked up, your name would pop up on the register, so you could say, like, I want a cappuccino, put it on Jack, and it would just automatically charge your card, and it would only happen if you were within, like, two feet.

[1271] We were using Bluetooth and geolocation and whatnot.
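The proximity gate described here, only opening a tab when the customer is within a couple of feet of the register, can be sketched roughly as below. This is purely a hypothetical illustration of combining geolocation with Bluetooth signal strength; the function names, thresholds, and RSSI cutoff are all made up, not Square Wallet's actual implementation.

```python
import math

def distance_feet(lat1, lon1, lat2, lon2):
    """Rough planar distance in feet between two lat/lon points.

    A flat-earth approximation is fine at the ~2-foot scale involved here.
    """
    ft_per_deg_lat = 364000.0  # approximate feet per degree of latitude
    dy = (lat2 - lat1) * ft_per_deg_lat
    dx = (lon2 - lon1) * ft_per_deg_lat * math.cos(math.radians(lat1))
    return math.hypot(dx, dy)

def should_open_tab(customer_pos, register_pos, ble_rssi,
                    threshold_ft=2.0, rssi_floor=-50):
    """Open the customer's tab only if both signals agree they are close.

    ble_rssi: Bluetooth received signal strength in dBm; stronger
    (less negative) readings mean the device is physically closer.
    """
    near = distance_feet(*customer_pos, *register_pos) <= threshold_ft
    close_signal = ble_rssi >= rssi_floor
    return near and close_signal

# Same coordinates, strong signal: tab opens.
print(should_open_tab((37.7749, -122.4194), (37.7749, -122.4194), -40))  # True
# ~400 feet away: tab stays closed no matter the signal.
print(should_open_tab((37.7760, -122.4194), (37.7749, -122.4194), -40))  # False
```

Requiring both checks is the interesting design point: geolocation alone is too coarse for a two-foot rule, and RSSI alone is too noisy, but together they make a false "charge it to Jack" much less likely.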

[1272] But we had it for about three years, and it just didn't take off, and we shut it down, and I kind of regret doing that, but it also paved the way for another thing that I didn't want to give up on, and that was the cash app.

[1273] Like for four years, it was just a slog.

[1274] Like a lot of people in the company wanted to shut down the thing.

[1275] They saw it as something that wasn't successful.

[1276] And, you know, recently the team reached number one in the app store in the United States.

[1277] Yeah.

[1278] And, like, we were up against all these incumbents like Venmo and PayPal, and it finally clicked.

[1279] And it's just because we had the patience and the conviction around our belief.

[1280] Yeah, it's a great app.

too, and the ethics behind it are really fantastic too. We're really thankful for the cash app, especially my friend Justin Wren and his Fight for the Forgotten charity. Every time you use the code word Joe Rogan, all one word, five dollars goes to that. And they've built two wells for the Pygmies in the Congo, and they've raised thousands of dollars, and they're building more wells right now. It's really, really cool. Yeah, we're really, really happy about that. Yeah, I love it. I think it's a great way to save money, too. I mean, when you can save 10% at Whole Foods, that's real.

[1282] Well, the other thing is, like, the population that we serve typically are underserved by banks or unbanked entirely.

[1283] Yes.

[1284] We are their bank account.

[1285] Right.

[1286] But more importantly, they don't have access to things like rewards.

[1287] You don't get rewards on a typical debit card or a credit card.

[1288] So, like, just, you know, going to your favorite place and getting an instant 10% off or whatever it is is out of reach for most people, because the financial institutions don't enable that.

[1289] And they won't even enable them to get in the door in the first place.

[1290] Well, if people are listening to this on YouTube, you don't know what the fuck we're talking about.

[1291] The cash app has a thing called a cash card, which is a debit card that you get with it, and there's a thing called boosts.

[1292] And with boosts, all you do is pick a boost in the app and then use your cash card as a debit card and you get these automatic discounts.

[1293] And they're real discounts.

[1294] Yep.
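The boost mechanic just described, pick one boost in the app, swipe the cash card, get the discount applied automatically, can be sketched in a few lines. The boost names and rates below are invented for illustration; this is not Cash App's actual logic or catalog.

```python
def apply_boost(amount_cents: int, selected_boost: str, boosts: dict) -> int:
    """Return the amount to charge after applying the selected boost, if any."""
    rate = boosts.get(selected_boost, 0.0)   # unknown or no boost -> no discount
    discount = round(amount_cents * rate)    # discount in whole cents
    return amount_cents - discount

# Made-up catalog: the customer has one boost active at a time, chosen in the app.
catalog = {"whole_foods_10_percent": 0.10, "coffee_shops_5_percent": 0.05}

print(apply_boost(2500, "whole_foods_10_percent", catalog))  # 2250 ($22.50)
print(apply_boost(2500, "no_boost_selected", catalog))       # 2500 (full price)
```

The point of the sketch is that the discount happens at charge time on the card itself, which is why it works for people with no credit card rewards program at all.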

[1295] And for folks with bad credit, there's no credit check.

[1296] You can direct deposit your paycheck right into the app.

[1297] And the fact that you guys do things like support Fight for the Forgotten and you're supporting UFC fighter Ray Borg's son.

[1298] He's got some serious medical bills.

[1299] It's really, really cool.

[1300] Yeah.

[1301] Yeah, I'm really proud of it.

[1302] I'm really proud of the team.

[1303] It's a very small team, but they're doing some big things.

[1304] Well, I hear a lot of good things about it, too.

[1305] I've run into people on the street that tell me they use it and they're very happy about it.

[1306] So it's nice to see, again, an emerging technology that's profitable, but yet also has a really good set of ethics.

But, yeah. Do you have "don't be evil" in the cash app? No, no, no. You should take it. It's open. No, one of our, like, equivalent operating principles within Cash and Square is, like, understanding someone's struggle. Like, how do we have the empathy for what they're struggling with? And when it comes to finance, they're struggling with a lot. Yeah, typically they're struggling with a ton. What was the thought process with, I mean, one of the things that's kind of cool about the cash app is that you can buy and sell Bitcoin with it. Yeah. Are you guys going to consider other forms of cryptocurrency as well? Not right now. So, back to the internet: I believe the internet will have a native currency. Really? It'll have a native currency. And I don't know if it's Bitcoin, but I think it will be, just given all the tests it's been through and the principles behind it, how it was created. It was something that was born on the internet, that was developed on the internet, that was tested on the internet. It is of the internet.

[1308] And the reason we, you know, enabled the purchasing of Bitcoin within the cash app is, one, we want to learn about the technology, and we want to put ourselves out there and take some risk.

[1309] We're the first publicly traded company to actually offer it as a service.

[1310] We're the first publicly traded company to talk to the SEC about Bitcoin and what that means.

[1311] And it made us uncomfortable.

We had to, you know, really understand what was going on.

[1313] And that was critical and important.

[1314] And then the second thing is that we, you know, would love to see something become a global currency.

[1315] It enables more access.

[1316] It allows us to serve more people.

[1317] It allows us to move much faster around the world.

[1318] And we thought we were going to start with how you can use it transactionally, but we noticed that people were treating it more like an asset, like a virtual gold.

[1319] And we just wanted to make that easy, like just the simplest way to buy and sell Bitcoin.

[1320] But we also knew that it had to come with a lot of education.

[1321] It had to come with constraint, because, you know, two years ago people did some really unhealthy things around, you know, purchasing Bitcoin.

[1322] They maxed out their credit cards and put all their life savings into Bitcoin.

[1323] So we developed some very simple restrictions and constraints.

[1324] Like, you can't buy Bitcoin on the cash app with a credit card.

[1325] It has to be the money you actually have in it.

[1326] And we look for day trading, which we discourage and shut down.

[1327] Like that's not what we were trying to build.

[1328] That's not what we were trying to optimize for.

[1329] We made a children's book explaining what Bitcoin is and where it came from and how people use it and where it might be going.

[1330] So we really tried to take on the role of education and to have some, like, very simple, healthy constraints that allowed people to consider what their actions are in the space.

[1331] Now, when you have something like the cash app, which is very much a disruptive technology in terms of, like, decentralization of banks and currency, and, you know, to have it where you're direct depositing a paycheck right in the app if you so choose, and then you could also buy Bitcoin, which is another disruptive technology, I mean, this is another step towards this sort of new way of doing things. Yeah. And is there pushback from any companies? Oh, yeah. Yeah. I mean, like, you just look at some of the major banks and their consideration around Bitcoin. They all love blockchain because of the efficiencies it can create for their business and potentially new business lines. But, um, you know, I think there is a... Explain blockchain, for people who don't know what we're talking about. Blockchain is a distributed ledger, and what that means is that, um, it's basically a distributed database where, you know, the source of truth can be verified at any point around the network.

[1332] And you can see, you know, this annotation around how content or how money, like, traveled.

[1333] So you don't have to go to an institution.

[1334] So the records.

[1335] Yeah, there's no centralized check.

[1336] There's no centralized control over it.

[1337] And I think that is threatening.

[1338] It's certainly threatening to certain services behind banks and financial institutions.

[1339] It's threatening to some governments as well.
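The "source of truth can be verified at any point, with no centralized check" idea can be sketched as a toy hash-chained ledger: each block commits to the hash of the block before it, so anyone holding a copy can detect tampering without asking an institution. This is a deliberately minimal illustration, not how Bitcoin actually works; real chains add proof-of-work, digital signatures, and peer-to-peer consensus on top of this structure.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministically hash a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain: list, transactions: list) -> list:
    """Append a block that commits to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})
    return chain

def verify(chain: list) -> bool:
    """Any participant can re-check the whole history independently."""
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False  # history was tampered with somewhere before block i
    return True

chain = []
append_block(chain, [{"from": "alice", "to": "bob", "amount": 5}])
append_block(chain, [{"from": "bob", "to": "carol", "amount": 2}])
print(verify(chain))                                 # True
chain[0]["transactions"][0]["amount"] = 500          # rewrite an old record
print(verify(chain))                                 # False: the chain exposes it
```

Because every copy of the ledger can run `verify` itself, there is no single database administrator whose word has to be trusted, which is exactly the property Jack describes as threatening to intermediaries.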

[1340] So I just look at this and, like, how do we embrace this technology, not react to it from a threat standpoint? Like, what does it enable us to do, and where does our value shift? And that's what we should be talking about right now: how our value shifts. And there are always really strong answers to that question. But if you're not willing to ask the question in the first place, you will become irrelevant, because technology will just continue to march on and make you irrelevant. And it's the people that are, you know, growing up with this technology, or born with the technology, only knowing that technology, who are asking the tough questions of themselves, who are going to be super disruptive to their business. And they're thinking about it right now, and they're taking actions. And, you know, we're doing that at Square and we're doing that at Twitter. And that, to me, represents longevity. That represents our ability to thrive. And we've got to push ourselves, we've got to make ourselves uncomfortable, and we've got to disrupt what we held sacred and what, you know, we think is success today.

[1341] Because otherwise it's not going to be bigger than what we have today.

[1342] Yeah, I couldn't agree more.

[1343] And I think that cryptocurrency, to me, represents one of the more interesting discussions on the Internet.

[1344] Like, what is money?

[1345] And why are we agreeing that it's these pieces of paper that the Federal Reserve prints out?

[1346] Totally.

[1347] It's a fascinating time in technology, because that, to me, was one of the last big centralized, nationalized instruments: currency, money. And when you think about the internet as a country, as a market, as a nation, it's going to have its own currency. But what's interesting about the internet as a nation is that it's the whole world. It is the whole world. So the world gets one currency. It gets one thing to communicate in, and that to me is just so freeing and so exciting.

[1348] Yeah, I'm very excited by it, and I'm also very excited by the fact that it's only been around for such a short period of time, and yet it's become a part of the global conversation.

[1349] Yeah, it's got a good brand.

[1350] I also think that it's going to open up the door to potential universal languages, and I think this is... yeah.

[1351] Yeah, that excites me a lot about Twitter.

[1352] It's like, how do we, if we want to get the world into a conversation, not a single conversation, but at least being able to see that global conversation, we've got to work on technologies that, like, instantly translate.

[1353] We've got to work on technologies where I can speak as I'm speaking right now, and in real time people are hearing it in their context and their language and their dialect.

[1354] That is amazing.

[1355] That is so exciting.

[1356] And, like, just how that evolves and how it impacts, not just communication like this, but music.

[1357] and just, like, you know, hip hop and rap... it's amazing to think about where that can go and where that can take us.

[1358] Yeah, and I think you and I are extremely fortunate to be alive right now during this time because I think it's one of the strangest and most unique times in human history.

[1359] Totally.

[1360] I don't think there's ever been a time where things have changed so radically, so quickly.

[1361] Totally.

[1362] Yeah, and I feel, you know, we're just, we're able,

[1363] through technologies like Twitter, to at least see and acknowledge some of the issues that we're still facing that were probably in the dark before.

[1364] And I think that's so critical to making any sort of improvements, any sort of evolution, and to making it better for everyone on the planet.

[1365] And, you know, as uncomfortable as Twitter sometimes makes people feel, I think it is necessary to see those things and have conversations about them, so that we can understand how we might move forward and how we might really get at the biggest problems facing us all.

[1366] And, you know, there's some huge ones.

[1367] There's some huge ones right now that, if we don't talk about them, will drive us to extinction and threaten our ability to live on this planet.

[1368] I agree.

[1369] Thank you.

[1370] Thanks for everything, man. Thanks for being here.

[1371] Thanks for doing what you're doing.

[1372] Thanks for having the attitude that you have.

[1373] I really, really appreciate it.

[1374] Thank you, Joe.

[1375] My pleasure.

[1376] Bye, everybody.

[1377] Thank you.

[1378] That was all great, man. Thank you.