Lex Fridman Podcast XX
[0] The following is a conversation with Kevin Systrom, co-founder and long-time CEO of Instagram, including for six years after Facebook's acquisition of Instagram.
[1] If you want to support this podcast, the best way is to check out the following sponsors in the description.
[2] First is Theragun, the device I use for post-workout muscle recovery.
[3] Second is NI, a company that helps engineers solve the world's toughest problems.
[4] Third is GiveWell, a directory of the best, most effective charities.
[5] Fourth is Blinkist, the app I used to read summaries of books.
[6] And fifth, Fundrise, a platform for investing in private real estate.
[7] So the choice is muscle recovery, engineering, optimal charity, books, or investment.
[8] Choose wisely, my friends.
[9] And now, on to the full ad reads.
[10] As always, no ads in the middle.
[11] I try to make this interesting, but if you skip them, please still check out our sponsors.
[12] I enjoy their stuff.
[13] Maybe you will too.
[14] This show is brought to you by Theragun, a handheld percussive therapy device that I use after workouts for muscle recovery.
[15] It's surprisingly quiet, easy to use, comes with a great app that guides you through everything you need to know.
[16] This has been a tool I use often throughout all the physical pursuits I've been under, from cardio to weightlifting to grappling, but it has been especially relevant recently because of a hamstring injury I have, and it's been really helping me on the physical therapy side, on the recovery, getting back to 100%.
[17] I mean, I can't sing its praises enough.
[18] Theragun has been just a great device for that.
[19] Anyway, try Theragun for 30 days at therabody.com slash Lex.
[20] Theragun Gen 4 has an OLED screen, a personalized Theragun app, and is both quiet and powerful, starting at $199.
[21] This show is also brought to you by NI, formerly known as National Instruments.
[22] NI is a company that has been helping engineers solve the world's toughest challenges for 40 years.
[23] Their motto is engineer ambitiously.
[24] I don't think a better motto can possibly exist for any company ever.
[25] I love ambition.
[26] I love engineering.
[27] So engineering ambitiously is pretty much as good as it gets.
[28] They have a podcast called Testing 1 -2 -3.
[29] They have amazing engineering-related articles at ni.com slash perspectives.
[30] They cover engineering, innovation,
[31] talking to people and getting their insights about how testing, failure, and rigorous, controlled error-making that leads to learning and improvement are part of their success.
[32] So it's so beautiful to be honest about your failures and to accept them as part of the process.
[33] I mean, that is what testing is.
[34] Testing is fundamental to engineering.
[36] It's not about getting it perfect in the design stage.
[37] It really is actually testing it through rigor as it meets the real world.
[38] Anyway, engineer ambitiously with NI at ni.com slash perspectives.
[39] That's ni.com slash perspectives.
[40] This show is brought to you by GiveWell.
[41] They research charitable organizations and only recommend the highest-impact, evidence-backed charities.
[42] Over 50,000 donors have used GiveWell to donate more than $750 million.
[43] The idea of giving optimally, of optimal charity, is core to this whole concept of effective altruism, which I and a bunch of other people, Joe Rogan, Sam Harris, have had a lot of conversations about.
[44] This is such a great implementation of that, because you don't have to worry, if you're donating money to charity, that it's not being used optimally. Not only does GiveWell make it easy for you to not get into trouble with charities that don't do a good job, but they help you pick out the very best, the highest impact, always backed by real evidence of a track record of previous performance. So check them out: go to givewell.org, pick podcast, and select Lex Fridman Podcast at checkout.
[45] Once again, that's givewell.org, pick podcast, and select Lex Fridman Podcast at checkout.
[46] Good luck, my friends.
[47] This episode is also brought to you by Blinkist, my favorite app for learning new things.
[48] Blinkist takes the key ideas from thousands of nonfiction books and condenses them down into just 15 minutes you can read or listen to.
[49] They have some of my favorite nonfiction.
[50] Sapiens, Meditations by Marcus Aurelius, The Beginning of Infinity by David Deutsch, the Snowden book, the list goes on and on and on.
[51] I use Blinkist for several reasons.
[52] So one is to review books that I've already read.
[53] Two is to select books that I would like to read next in full.
[54] And three is to get a summary of books I'm likely never going to get a chance to read, but that are still important to our culture, so that I can understand what people's ideas are.
[57] In fact, Blinkist just does a remarkable job of summarizing books.
[58] I feel like sometimes the key insights that Blinkist presents just capture the entirety of the book.
[59] It's almost like you didn't need to read the book.
[60] Of course, the magic of reading is if you spend a long time with those ideas, you kind of marinate in those ideas, but that doesn't mean you can't have the key ideas, and that's what Blinkist does masterfully.
[61] Go to blinkist.com slash Lex to start your free seven-day trial and get 25% off of a Blinkist premium membership.
[62] That's blinkist.com slash Lex, spelled B-L-I-N-K-I-S-T, blinkist.com slash Lex.
[63] This show is also sponsored by Fundrise, spelled F-U-N-D-R-I-S-E.
[64] It's a platform that allows you to invest in private real estate.
[65] If you're looking to diversify your investment portfolio, this is a good choice.
[66] I don't know.
[67] I keep hearing that people don't diversify, or at least don't diversify as much as I would prefer.
[68] I still believe in this age of cryptocurrency, in this age of all kinds of financial investment opportunities, it's still wise to diversify.
[69] It reduces not only the risk, but also the mental anxiety associated with investing in something.
[70] And I think Fundrise and private real estate is a really interesting sort of orthogonal solution to diversification.
[71] It's good to allocate some of your funds to this kind of investment.
[72] Anyway, I'm still a big fan of diversification, but what do I know?
[73] Definitely don't take my advice for it.
[74] But you should check out Fundrise at fundrise.com slash Lex.
[75] 150,000 investors use it.
[76] It takes just a few minutes to get started at fundrise.com slash Lex.
[77] That's fundrise.com slash Lex.
[78] This is the Lex Fridman Podcast, and here is my conversation with Kevin Systrom.
[79] At the risk of asking the Rolling Stones to play Satisfaction, let me ask you about the origin story of Instagram. Sure. So maybe some context. Like we were talking about offline: you grew up in Massachusetts, learned computer programming there, liked to play Doom 2, worked at a vinyl record store, then you went to Stanford, turned down Mr. Mark Zuckerberg and Facebook, went to Florence to study photography. Those are just some random, beautiful, impossibly brief glimpses into a life. So let me ask again, can you take me through the origin story of Instagram? Given that context, you basically set it up. All right, so we have a fair amount of time, so I'll go into some detail. But basically what I'll say is, Instagram started out of a company actually called Burbn, and it was spelled B-U-R-B-N. And a couple things were happening at the time.
[80] So if we zoom back to 2010, not a lot of people remember what was happening in the dot -com world then.
[81] But check -in apps were all the rage.
[82] What's a check-in app?
[83] Gowalla, Foursquare, Hot Potato.
[84] So I'm at a place.
[85] I'm going to tell the world that I'm at this place.
[86] That's right.
[87] What's the idea behind this kind of app, by the way?
[88] You know what?
[89] I'm going to answer that, but through what Instagram became and why I believe Instagram replaced them.
[90] So the whole idea was to share with the world what you were doing, specifically with your friends, right?
[91] But there were all the rage, and Foursquare was getting all the press.
[92] And I remember sitting around saying, hey, I want to build something, but I don't know what I want to build.
[93] What if I built a better version of Foursquare?
[94] And I asked myself, why don't I like Foursquare or how could it be improved?
[95] And basically, I sat down and I said, I think that if you have a few extra features, it might be enough, one of which happened to be posting a photo of where you were.
[96] There were some others.
[97] It turns out that wasn't enough.
[98] My co -founder joined.
[99] We were going to attack, you know, Foursquare and the likes and try to build something interesting.
[100] And no one used it.
[101] No one cared because it wasn't enough.
[102] It wasn't different enough, right?
[103] So one day we were sitting down and we asked ourselves, okay, come-to-Jesus moment, are we going to do this startup?
[104] And if we're going to, we can't do what we're currently doing; we have to switch it up.
[105] So what do people love the most?
[106] So we sat down and we wrote out three things that we thought people uniquely loved about our product that weren't in other products.
[107] Photos happen to be the top one.
[108] So sharing a photo of what you were doing where you were at the moment was not something products let you do, really.
[109] Facebook was like, post an album of your vacation from two weeks ago, right?
[110] Twitter allowed you to post a photo, but their feed was primarily text and they didn't show the photo in line or at least I don't think they did at the time.
[111] So even though it seems totally stupid and obvious to us now, at the moment then, posting a photo of what you were doing at the moment was like not a thing.
[112] So we decided to go after that because we noticed that people who used our service, the one thing they happened to like the most was posting a photo.
[113] So that was the beginning of Instagram and yes, like we went through and we added filters and there's a bunch of stories around that.
[114] But the origin of this was that we were trying to be a check -in app, realized that no one wanted another check -in app.
[115] It became a photo -sharing app, but one that was much more about what you're doing and where you are.
[116] And that's why, when I say I think we've replaced check-in apps, it became a check-in via a photo rather than saying your location and then optionally adding a photo.
[117] When you were thinking about what people like, from where did you get a sense that this is what people like? You said we sat down, we wrote some stuff down on paper. Where is that intuition? That seems fundamental to the success of an app like Instagram. Where does that idea, where does that list of three things come from exactly? Only after having studied machine learning now for a couple years have I started to make connections. Like, we can go into this later, but obviously the connections between machine learning and the human brain, I think, are stretched sometimes, right? At the same time, being able to backprop, being able to, like, look at the world, try something, figure out how you're wrong, how wrong you are, and then nudge your company in the right direction based on how wrong you are, it's like a fascinating concept, right?
[118] And I don't, we didn't know we were doing it at the time, but that's basically what we were doing, right?
[119] We put it out to, call it, 100 people, and you would look at their data.
[120] You would say, what are they sharing?
[121] What, like, what resonates, what doesn't resonate?
[122] We think they're going to resonate with X, but it turns out they resonate with Y. Okay, shift the company towards Y. And it turns out if you do that enough, quickly enough, you can get to a solution that has product market fit.
[123] Most companies fail because they sit there and either their learning rate's too slow, they're just adamant that they're right even though the data is telling them they're not, or their learning rate's too high and they wildly chase different ideas and never actually settle on one where they groove, right? And I think when we sat down and we wrote out those three ideas, what we were saying is, what are the three possible, whether they're local or global, maxima in our world, right, that users are telling us they like because they're using the product that way.
[124] It was clear people like the photos because that was the thing they were doing.
[125] And we just said, okay, like, what if we just cut out most of the other stuff and focus on that thing?
[126] And then it happened to be a multi -billion dollar business.
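To make the learning-rate analogy above concrete, here is a toy gradient descent sketch in Python; it is purely an illustration of the too-slow versus too-high point, not anything Instagram actually ran.

```python
# Toy gradient descent on f(x) = (x - 3)^2, illustrating the learning-rate analogy:
# too small a step and you crawl toward the answer, too large and you overshoot and diverge.

def descend(learning_rate, steps=20, x=0.0):
    for _ in range(steps):
        grad = 2 * (x - 3)            # gradient of (x - 3)^2
        x = x - learning_rate * grad  # nudge toward the minimum at x = 3
    return x

print(descend(0.01))  # too slow: still far from 3 after 20 steps
print(descend(0.3))   # reasonable: converges close to 3
print(descend(1.1))   # too high: overshoots and blows up
```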
[127] And it's that easy, by the way.
[128] Yeah, I guess so.
[129] Well, nobody ever writes about neural networks that miserably failed.
[130] So this particular neural network succeeded.
[131] Oh, they fail all the time, right?
[132] Yeah, but nobody writes about it.
[133] The default state is failing.
[134] When you said the way people are using the app, is that the loss function for this neural network or is it also self -report?
[135] Do you ever ask people what they like or do you have to track exactly what they're doing, not what they're saying?
[136] I once made a Thanksgiving dinner, okay, and it was for relatives.
[137] and I like to cook a lot.
[138] And I worked really hard on picking the specific dishes.
[139] And I was really proud because I had planned it out using a Gantt chart, and like, it was ready on time and everything was hot.
[140] Nice.
[141] Like I don't know if you're a big Thanksgiving guy, but like the worst thing about Thanksgiving is when the turkey is cold and some things are hot and some things aren't.
[142] Anyway.
[143] You had a Gantt chart.
[144] Did you actually have a chart?
[145] Oh, yeah.
[146] OmniPlan.
[147] A fairly expensive, like, Gantt chart thing that I think maybe 10 people have purchased in the world.
[149] But I'm one of them, and I use it for recipe planning, only around big holidays.
[150] That's brilliant, by the way.
[151] Do people do this kind of...
[152] Over -engineering?
[153] It's not over -engineering.
[154] It's just engineering.
[155] It's planning.
[156] Thanksgiving is a complicated set of events with some uncertainty with a lot of things going on.
[157] You should be planning it in this way.
[158] There should be a chart.
[159] It's not over-engineered.
[160] I mean, so what's funny is brief aside.
[161] Yes.
[162] I love cooking, I love food, I love coffee, and I've spent some time with some chefs who, like, know their stuff.
[163] And they always just take out a piece of paper and just work backwards in rough order.
[164] Like, it's never perfect, but rough order.
[165] It's just like, oh, that makes sense.
[166] Why not just work backwards from the end goal, right?
[167] And put in some buffer time.
[168] And so I probably overspecified it a bit using a Gantt chart, but the fact that you can do it, it's what professional kitchens roughly do.
[169] They just don't call it a Gantt chart, or at least I don't think they do.
[170] Anyway, I was telling a story about Thanksgiving.
[171] So here's the thing.
[172] I'm sitting down.
[173] We have the meal.
[174] And then I got to know Ray Dalio fairly well over maybe the last year of Instagram.
[175] And one thing that he kept saying was like, feedback is really hard to get honestly from people.
[176] And I sat down after dinner and I said, guys, I want feedback.
[177] What was good and what was bad.
[178] Yes.
[179] And what's funny is like literally everyone just said everything was great.
[180] And I like personally knew I had screwed up a handful of things.
[181] But no one would say it.
[182] And can you imagine now, not something as high stakes as Thanksgiving dinner, okay?
[183] Thanksgiving dinner, it's not that high stakes.
[184] But you're trying to build a product and everyone knows you left your job for it.
[185] And you're trying to build it out and you're trying to make something wonderful.
[186] And it's yours.
[187] you designed it.
[188] Now try asking for feedback.
[189] And know that you're giving this to your friends and your family.
[190] People have trouble giving hard feedback.
[191] People have trouble saying, I don't like this or this isn't great or this is how it's failed me. In fact, you usually have two classes of people.
[192] People who just won't say bad things, you can literally say to them, please, tell me what you hate most about this and they won't do it.
[193] They'll try, but they won't.
[194] And then the other class of people are just negative, period, about everything, and it's hard to parse out, like, what is true and what isn't.
[195] So my rule of thumb with this is you should always ask people, but at the end of the day, it's amazing what data will tell you.
[196] And that's why, with whatever project I work on, even now, I'm collecting data from the beginning on usage patterns. So engagement: how many days of the week do
[197] they use it?
[198] How many, I don't know, if we were to go back to Instagram, how many impressions per day, right?
[199] Is that growing?
[200] Is that shrinking?
[201] And don't be like overly scientific about it, right?
[202] Because maybe you have 50 beta users or something.
[203] But what's fascinating is that data doesn't lie.
[204] People are very defensive about their time.
[205] They'll say, oh, I'm so busy.
[206] I barely use the app, like, I'm just, you know... But, I don't know, you're posting on Instagram the whole time, so I don't know. At the end of the day, like at Facebook, you know, before time spent became kind of this loaded term, there was the idea that people's currency in their lives is time, and they only have a certain amount of time to give to things, whether it's friends or family or apps or TV shows or whatever. There's no way of inventing more of it, at least not that I know of. If they don't use it, it's because it's not great.
[207] So the moral of the story is you can ask all you want, but you just have to look at the data.
[208] And data doesn't lie, right?
[209] I mean, there's metrics.
[210] Data can obscure the key insight if you're not careful.
[211] So time spent in the app, that's one.
[212] There's so many metrics you can put at this, and they will give you totally different insight.
[213] Especially when you're trying to create something that doesn't obviously exist yet. So, you know, measuring maybe why you left the app, or measuring special moments of happiness that will make sure you return to the app, or moments of happiness that are long-lasting versus, like, dopamine short-term, all of those things. But I think, I suppose, in the beginning you can just get away with asking the question: which features are used a lot? Let's do more of that. And how hard was the decision, and, I mean, maybe you can tell me what Instagram looked like in the beginning, but how hard was it to make pictures the first-class citizen? That's a revolutionary idea. Like, at whatever point Instagram became this feed of photos, that's quite brilliant. Plus, I also don't know when this happened, but they're all shaped the same.
[214] It's like...
[215] I have to tell you why.
[216] That's the interesting part.
[217] Why is that?
[218] So a couple of things.
[219] One is data, data, like, you're right.
[220] You can overinterpret data.
[221] Like, imagine trying to fly a plane by staring at, I don't know, a single metric like airspeed.
[222] You don't know if you're going up or down.
[223] I mean, it correlates with up or down, but you don't actually know.
[224] It'll never help
[225] you land the plane. So don't stare at one metric. Like, it turns out you have to synthesize a bunch of metrics to know where to go. But it doesn't lie. Like, if your airspeed is zero, unless it's not working right, if it's zero, you're probably going to fall out of the sky. So generally you look around and you have the scan going. Yes. And you're just asking yourself, is this working or is this not working? But people have trouble explaining how they actually feel.
[226] So just it's about synthesizing both of them.
[227] So then Instagram, right?
[228] We were talking about revolutionary moment where the feed became square photos, basically.
[229] And photos first and then square photos.
[230] Yeah.
[231] It was clear to me that the biggest, so I believe the biggest companies are founded when enormous technical shifts happen.
[232] And the biggest technical shift that happened right before Instagram was founded was the advent of a phone that didn't suck, the iPhone, right?
[233] Like, in retrospect, we're like, oh, my God, the first iPhone, like, it wasn't that good.
[234] But compared to everything else at the time, it was amazing.
[235] And by the way, the first phone that had an incredible camera that could, like, do as well as the point-and-shoot you might carry around was the iPhone 4.
[236] And that was right when Instagram launched.
[237] And we looked around and we said, what will change because everyone has a camera in their pocket?
[238] And it was so clear to me that the world of social networks before it was based in the desktop and sitting there and having a link you could share, right?
[239] And that wasn't going to be the case.
[240] So the question is, what would you share if you were out and about in the world, if not only did you have a camera that fit in your pocket, but, by the way, that camera had a network attached to it that allowed you to share instantly? That seemed revolutionary. And a bunch of people saw it at the same time. It wasn't just Instagram; there were a bunch of competitors. The thing we did, I think, was, well, we focused on two things. So we wrote down those things, we circled photos, and we said, I think we should invest in this. But then we said, what sucks about photos? One, they look like crap, right?
[241] They just, at least back then.
[242] Now, my phone takes pretty great photos, right?
[243] Back then, they were blurry, not so great, compressed, right?
[244] Two, it was really slow, like, really slow to upload a photo.
[245] And I'll tell a fun story about that and explain to you why they're all the same size and square as well.
[246] And three, man, if you wanted to share a photo on different networks, you had to go to each of the individual apps and select all of them and upload individually.
[247] And so we're like, all right, those are the pain points.
[248] We're going to focus on that.
[249] So one, because they weren't beautiful, we're like, why don't we lean into the fact that they're not beautiful?
[250] And I remember studying in Florence.
[251] My photography teacher gave me this Holga camera.
[252] And I'm not sure everyone knows what a Holga camera is, but they're these old-school plastic cameras.
[253] I think they're produced in China at the time.
[254] And I want to say the original ones were like from the 70s or the 80s or something.
[255] They're supposed to be like $3 cameras for the every person.
[256] They took nice medium format film, large negatives, but they kind of blurred the light and, like, light leaked into the side.
[257] And there was this whole resurgence where people looked at that and said, oh my God, this is a style, right?
[258] And I remember using that in Florence and just saying, well, why don't we just like lean into the fact that these photos suck?
[259] and make them suck more, but in an artistic way.
[260] And it turns out that had product market fit.
[261] People really like that.
[262] They were willing to share their not -so -great photos if they looked not -so -great on purpose, okay?
[263] The second part.
[264] That's where the filters come into the picture.
[265] Yep.
[266] So computational modification of photos to make them look extra crappy to where it becomes art. Yeah, yeah.
[267] And I mean, add light leaks, add like an overlay filter.
[268] make them more contrasty than they should be.
[269] The first filter we ever produced was called X-Pro II.
[270] And I designed it while I was in this small little bed and breakfast room in Todos Santos, Mexico.
[271] I was trying to take a break from the Burbn days.
[272] And I remember saying to my co -founder, I just need like a week to reset.
[273] And that was on that trip, worked on the first filter because I said, you know, I think I can do this.
[274] And I literally iterated one by one over the RGB values in the array that was the photo and just slightly shifted them.
[276] Basically, there was a function of R, a function of G, a function of B that just shifted them slightly.
[277] It wasn't rocket science.
[278] And it turns out that actually made your photo look pretty cool.
[279] It just mapped from one color space to another color space.
[280] It was simple, but it was really slow.
[281] I mean, if you applied a filter, I think it used to take two or three seconds to render.
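For a sense of what that per-channel approach looks like, here is a minimal Python sketch using Pillow; the specific curves are invented for illustration, not the actual X-Pro II math.

```python
# Hypothetical per-channel filter in the spirit described above: walk the pixels and
# apply a function of R, a function of G, a function of B. The curves are made up,
# and the pure-Python loop is exactly the kind of thing that takes seconds per image.
from PIL import Image

def apply_filter(in_path, out_path):
    img = Image.open(in_path).convert("RGB")
    pixels = img.load()
    width, height = img.size
    for y in range(height):
        for x in range(width):
            r, g, b = pixels[x, y]
            # shift each channel slightly: warm the reds, mute the blues
            r = min(255, int(r * 1.10 + 10))
            g = min(255, int(g * 1.05))
            b = min(255, int(b * 0.90 + 20))
            pixels[x, y] = (r, g, b)
    img.save(out_path)
```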
[282] Only eventually would I figure out how to do it on the GPU, and I'm not even sure it was a GPU, but it was using OpenGL. But anyway, I would eventually figure that out and then it would be instant. But it used to be really slow. By the way, anyone who's watching or listening: it's amazing what you can get away with in a startup as long as the product outcome is right for the user. Like, you can be slow, you can be terrible, you can be... as long as you have product market fit, people will put up with a lot.
[283] And then the question is just about compressing, making it more performant over time so that they get that product market fit instantly.
[284] So fascinating because there's some things where those three seconds would make or break the app, but some things you're saying not.
[285] It's hard to know when. You know, it's the problem Spotify solved, making streaming, like, work.
[286] And, like, delays in listening to music are a huge negative, even, like, slight delays. But here you're saying... I mean, how do you know when those three seconds are okay, or do you just kind of have to try it out? Because to me, my intuition would be those three seconds would kill the app. Like, I would try to do the OpenGL thing. Right, so I wish I were that smart at the time. I wasn't.
[287] I just knew how to do what I knew how to do, right?
[288] And I decided, okay, like, why don't I just iterate over the values and change them?
[289] And what's interesting is that compared to the alternatives, no one else used OpenGL.
[290] Right.
[291] So everyone else was doing it the dumb way.
[292] And in fact, they were doing it at a high resolution. Now comes the small resolution that we'll talk about for a second.
[293] By choosing 512 pixels by 512 pixels, which I believe it was at the time, we iterated over a lot fewer pixels than our competitors, who were trying to do these enormous output images.
[294] So instead of taking 20 seconds, I mean, three seconds feels pretty good, right?
[295] So on a relative basis, we were winning like a lot.
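The 512-by-512 choice is mostly arithmetic; a rough comparison, where the competitor's output size is an illustrative assumption rather than a real spec:

```python
# Rough arithmetic on why filtering a 512x512 image is so much cheaper than filtering
# a full-resolution output (the 2048x2048 competitor size is an assumption for illustration).
instagram_pixels = 512 * 512          # 262,144 pixels per image
competitor_pixels = 2048 * 2048       # 4,194,304 pixels per image
print(competitor_pixels / instagram_pixels)  # 16x more per-pixel work for the same filter
```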
[296] Okay.
[297] So that's answer number one.
[298] Answer number two is we actually focused on latency in the right places.
[299] So we did this really wonderful thing when you uploaded.
[300] So the way it would work is, you know, you'd take your phone, you'd take the photo, and then you'd go to the edit screen, where you would caption it.
[301] And on that caption screen, you'd start typing, you'd think, okay, like, what's a clever caption?
[302] And I said to Mike, hey, when I worked on the Gmail team, you know what they did?
[303] When you typed in your username or your email address, even before you've entered in your password, like the probability once you enter in your username that you're going to actually sign in is extremely high.
[304] So why not just start loading your account in the background?
[305] Not like sending it down to the desktop.
[306] That would be a security issue, but like load it into memory on the server.
[307] Like get it ready, prepare it.
[308] I always thought that was so fascinating and unintuitive.
[309] And I was like, Mike, why don't we just do that?
[310] But like, we'll just upload the photo and like assume you're going to upload the photo.
[311] And if you don't, forget about it.
[312] We'll delete it, right?
[313] So what ended up happening was people would caption their photo.
[314] They'd press done or upload, and you'd see this little progress bar just go, and it was lightning fast, okay?
[315] We were no faster than anyone else at the time, but by choosing 512 by 512 and doing it in the background, it almost guaranteed that it was done by the time you captioned.
[316] And everyone when they used it was like, how the hell is this thing so fast?
[317] But we were slow.
[318] We just hid the slowness.
[319] It's like, these things are just, like, it's a shell game.
[320] You're just hiding the latency.
[321] That mattered to people like a lot.
[322] And I think that, so, you were willing to put up with a slow filter if it meant you could share it immediately.
[323] And of course we added sharing options which let you distribute it really quickly.
[324] That was the third part.
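Here is a sketch of that latency-hiding trick in Python; upload_photo and attach_caption are hypothetical placeholders, not real Instagram or iOS APIs, and the only point is that the slow work overlaps with the user typing a caption.

```python
# Sketch of the eager-upload trick: start uploading as soon as the photo is ready,
# so the network time overlaps with the user writing a caption.
import threading

def share_flow(photo_bytes, get_caption_from_user, upload_photo, attach_caption):
    result = {}

    def _upload():
        # the slow part: runs in the background while the user is still typing
        result["photo_id"] = upload_photo(photo_bytes)

    worker = threading.Thread(target=_upload)
    worker.start()

    caption = get_caption_from_user()  # user spends a while thinking of something clever

    worker.join()  # usually already finished, so this returns almost instantly
    attach_caption(result["photo_id"], caption)  # feels "lightning fast" to the user
```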
[325] So latency matters, but relative to what?
[326] And then there are some, like, tricks.
[327] You can get around it by just hiding the latency.
[328] Like I don't know if Spotify starts downloading the next song eagerly.
[329] I'm assuming they do.
[330] There are a bunch of ideas here that are not rocket science that really help.
[331] And all of that was stuff you were explicitly having a discussion about, like those designs and you were having like arguments, discussions.
[332] I'm not sure it was arguments.
[333] I mean, I'm not sure if you've met my co -founder, Mike, but he's a pretty nice guy.
[334] And he's very reasonable.
[335] And we both just saw eye to eye and we're like, yeah, it's like, if you make this fast or at least seem fast, it'll be great.
[336] I mean, honestly, I think the most contentious thing, and he would say this too initially, was I was on an iPhone 3G.
[337] So like the not -so -fast one, and he had a brand -new iPhone 4.
[338] I was cheap.
[339] Nice. And his feed loaded super smoothly. Like, when he would scroll from photo to photo, buttery smooth, right? But on my phone, every time you got to a new photo, it was like, kachunk, kachunk, allocate memory, like, all this stuff, right? I was like, Mike, that's unacceptable. He's like, oh, come on, man, just, like, upgrade your phone. Basically. He didn't actually say that, he's nicer than that. But I could tell he wished, like, I would just stop being cheap and just get a new phone.
[340] But what's funny is we actually sat there working on that little detail for a few days before launch.
[341] And that polished experience plus the fact that uploading seemed fast for all these people who didn't have nice phones, I think meant a lot.
[342] Because far too often you see teams focus not on performance.
[343] They focus on what's the cool computer science problem they can solve, right?
[344] Can we scale this thing to a billion users?
[345] And they've got like a hundred, right?
[346] Yeah.
[347] You talked about loss function.
[348] So I want to come back to that.
[349] The loss function is like, do you provide a great, happy, magical, whatever experience for the consumer?
[350] And listen, if it happens to involve something complex and technical, then great.
[351] But it turns out, I think, most of the time, those experiences are just sitting there waiting to be built with, like, not that complex solutions. But everyone is just, like, so stuck in their own head that they have to over-engineer everything, and then they forget about the easy stuff. I mean, also, maybe to flip the loss function there: you're trying to minimize the number of times there's an unpleasant experience, right? Like the one you mention, where when you go to the next photo, it freezes for a little bit. So it's almost, as opposed to maximizing pleasure, it's probably easier to minimize the number of, like, the friction. Yeah, and as we all know, you just make the pleasure negative and then minimize everything. So we're mapping this all back to neural networks. But actually, can I say one thing on that? Which is, I don't know a lot about machine learning, but I feel like I've tried studying a bunch. That whole idea of reinforcement learning and planning out more than the greedy single experience, I think, is the closest you can get to, like, ideal product design thinking, where you're not saying, hey, like, can we have a great experience just this one time?
[352] But like, what is the right way to onboard someone?
[353] What series of experiences correlate most with them hanging on long term, right?
[354] So not just saying, oh, did the photo load slowly a couple times or did they get a great photo at the top of their feed?
[355] But like, what are the things, that are going to make this person come back over the next week, over the next month.
[356] And as a product designer asking yourself, okay, I want to optimize, not just minimize bad experiences in the short run, but like, how do I get someone to engage over the next month?
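The greedy-versus-long-horizon point can be written as a discounted return, borrowing RL notation; this is just the analogy made explicit in toy Python, not a metric Instagram computed.

```python
# Greedy objective vs. the longer-horizon one: a discounted return over future sessions.
# Purely illustrative; the per-week "good session" scores are made up.
def discounted_return(session_quality, gamma=0.9):
    # session_quality: list of per-week scores for one user, week 0 first
    return sum((gamma ** t) * r for t, r in enumerate(session_quality))

greedy = [1.0]                       # optimizes only today's session
long_horizon = [0.7, 0.9, 0.9, 1.0]  # slightly worse today, but the user keeps coming back
print(discounted_return(greedy), discounted_return(long_horizon))  # 1.0 vs. ~2.97
```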
[357] And I'm not going to claim that I thought that way at all at the time because I certainly didn't.
[358] But if I were going back and giving myself any advice, it would be thinking, what are those second order effects that you can create?
[359] And it turns out, having your friends on the service is an enormous win.
[360] So starting with a very small group of people that produce content that you wanted to see, which we did.
[361] We seeded the community very well, I think.
[362] Ended up mattering.
[363] And so.
[364] Yeah, you said that community is one of the most important things.
[365] So it's from a metrics perspective, from maybe a philosophy perspective, building a certain kind of community within the app.
[366] See, I wasn't sure what exactly you meant by that when I've heard you say that.
[368] Maybe you can elaborate, but as I understand now, it can literally mean get your friends onto the app.
[369] Yeah.
[370] Think of it this way.
[371] You can build an amazing restaurant or bar or whatever, right?
[372] But if you show up and you're the only one there, is it like, does it matter how good the food is?
[373] The drinks, whatever?
[374] No. These are inherently social experiences that we were working on.
[375] So the idea of having people there, like, you needed to have that. Otherwise, it was just a filter app. But by the way, part of the genius, I'm going to say genius even though it wasn't really genius, was starting by masquerading as a filter app. That was awesome. The fact that you could... so we talk about single-player mode a lot, which is, like, can you play the game alone? And Instagram, you could totally play alone. You could filter your photos. And a lot of people would tell me, I didn't even realize that this thing was a social network until my friends showed up.
[376] It totally worked as a single player game.
[377] And then when your friends showed up, all of a sudden it was like, oh, not only was this great alone, but now I actually have this trove of photos that people can look at and start liking and then I can like theirs.
[378] And so it was this bootstrap method of how do you make the thing not suck when the restaurant is empty?
[379] Yeah, but the thing is when you say friends, I mean, we're not necessarily referring to friends in the physical space.
[380] So you're not bringing your physical friends with you.
[381] You're also making new friends.
[382] So you're finding new community.
[383] So it's not immediately obvious to me that it's almost like building any kind of community.
[384] It was both.
[385] And what we learned very early on was what made Instagram special.
[386] And the reason why you would sign up for it versus say, just sit on Facebook and look at your friends' photos.
[387] Of course, we were live.
[388] And of course, it was interesting to see what your friends were doing now.
[390] But the fact that you could connect with people who, like, took really beautiful photos in a certain style all around the world, whether they were travelers.
[391] It was the beginning of the influencer economy.
[392] There's these people who became professional Instagrammers way back when, right?
[393] But they took these amazing photos and some of them were photographers, right?
[394] Like, professionally.
[395] And all of a sudden, you had this moment in the day when you could open up this app and sure, you could see what your friends were doing, but also it was like, oh, my God, that's a beautiful waterfall or, oh, my God, I didn't realize there was that corner of England or, like, really cool stuff.
[396] And the beauty about Instagram early on was that it was international by default.
[397] You didn't have to speak English to use it, right?
[398] You could just look at the photos.
[399] It worked great.
[400] We did translate.
[401] We had some pretty bad translations, but we did translate the app.
[402] And, you know, even if our translation were pretty poor.
[403] The idea that you could just connect with other people through their images was pretty powerful.
[404] How much technical difficulty was there with the programming?
[405] Like, what programming language are we talking about?
[406] Oh, zero.
[407] Like, maybe it was hard for us, but I mean, we, there was nothing.
[408] The only thing that was complex about Instagram at the beginning, technically was making it scale.
[409] And we were just plain old objective C for the client.
[410] So it was iPhone only at first?
[411] iPhone only, yep.
[412] As an Android person, I'm deeply offended, but go ahead.
[413] This was 2010.
[414] Oh, sure, sure.
[415] Like, Android's getting a lot better.
[416] Yeah, yeah.
[417] So I'd take it back.
[418] You're right.
[419] If I were to do something today, I think it would be very different in terms of launch strategy, right?
[420] Android's enormous, too.
[421] But anyway, back to that moment, it was Objective-C, and then we were Python-based,
[422] which is just like, this is before Python was really cool.
[423] Like, now it's cool because all these machine learning libraries, like, support Python, right?
[424] Now it's super, now it's like cool to be Python.
[425] Back then it was like, oh, Google uses Python.
[426] Like maybe you should use Python.
[427] Facebook was PHP.
[428] Like I had worked at a small startup of some ex-Googlers that used Python.
[429] So we used it.
[430] And we used a framework called Django, which still exists
[431] and people use, for basically the backend.
[432] And then you threw a couple interesting things in there.
[433] I mean, we used Postgres, which was kind of fun.
[434] It was a little bit like hipster database at the time.
[435] Versus MySQL?
[436] MySQL.
[437] Like, everyone used MySQL.
[438] So, like, using Postgres was like an interesting decision, right?
[439] But we used it because it had a bunch of geo features built in, because we thought we were going to be a check-in app.
[440] It's also super cool now.
[441] So you were into Python before it was cool and you were into Postgres before it was cool.
[442] Yeah, we were basically like not only hipster, hipster.
[443] Hipster photo company, hipster tech company, right?
[444] We also adopted Redis early and, like, loved it.
[445] I mean, it solves so many problems for us, and turns out that's still pretty cool.
[446] But the programming was very easy.
[447] It was like sign up a user or have a feed.
[448] There was nothing, no machine learning at all, zero.
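"Sign up a user or have a feed" is roughly the level of a few Django models and a query; here is a minimal sketch in that spirit, with hypothetical models and field names rather than Instagram's actual schema.

```python
# A minimal Django-flavored sketch of "sign up a user or have a feed".
# Hypothetical models and field names, not Instagram's real schema.
from django.db import models
from django.contrib.auth.models import User

class Follow(models.Model):
    follower = models.ForeignKey(User, related_name="following", on_delete=models.CASCADE)
    followed = models.ForeignKey(User, related_name="followers", on_delete=models.CASCADE)

class Photo(models.Model):
    owner = models.ForeignKey(User, related_name="photos", on_delete=models.CASCADE)
    image = models.ImageField(upload_to="photos/")
    caption = models.CharField(max_length=300, blank=True)
    created_at = models.DateTimeField(auto_now_add=True)

def feed_for(user, limit=100):
    # reverse-chronological feed of photos from accounts this user follows
    followed_ids = Follow.objects.filter(follower=user).values_list("followed_id", flat=True)
    return Photo.objects.filter(owner_id__in=followed_ids).order_by("-created_at")[:limit]
```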
[449] Can you give some context, how many users at each of these stages?
[450] Are we talking about 100 users, a thousand users?
[451] So the stage I just described, I mean, that technical stack, lasted through probably 50 million users.
[452] I mean, seriously, like, you can get away with a lot with a pretty basic stack.
[453] Like, I think a lot of startups try to over -engineer their solutions from the beginning to, like, really scale, and you can get away with a lot.
[454] That being said, most of the first two years of Instagram was literally just trying to make that stack scale.
[455] And it wasn't, it was not a Python problem.
[456] It was, like, literally just, like, where do we put the data? Like, it's all coming in too fast. Like, how do we store it? How do we make sure to be up? How do we make sure we're on the right size boxes, that they have enough memory? Those were the issues. But can you speak to the choices you make at that stage when you're growing so quickly? Do you use something like somebody else's computer infrastructure, or do you build in-house? I'm only laughing because when we launched, we had a single computer that we had rented in some colo space in L.A. I don't even remember what it was called, because I thought that's what you did.
[457] When I worked at a company called Odeo that became Twitter, I remember visiting our space in San Francisco.
[458] You walked in, you had to wear the ear things.
[459] It was cold and fans everywhere, right?
[460] And we had to pull one out, replace one, and I was the intern, so I just held things.
[461] but I thought to myself, oh, this is how it goes.
[462] And then I remember being in a VC's office, I think it was Benchmark's office.
[463] And I think we ran into another entrepreneur and they were like, oh, how are things going?
[464] We're like, uh, you know, try to scale this thing.
[465] And they were like, well, I mean, can't you just add more instances?
[466] And I was like, what do you mean?
[467] And they're like instances on Amazon.
[468] I was like, what are those?
[469] And it was this moment where we realized how deep in it we were because we had no idea that AWS existed, nor should we be using it.
[470] Anyway, that night we went back to the office and we got on AWS, but we did this really dumb thing where I'm so sorry to people listening, but we brought up an instance, which was our database.
[471] It was going to be a replacement for our database, but we had it talking over the public internet to our little box in L .A. that was our app server.
[472] Yeah.
[473] That's how sophisticated we were.
[474] and obviously that was very, very slow.
[475] Didn't work at all.
[476] I mean, it worked, but didn't work.
[477] Only, like, later that night, did we realize we had to have it all together?
[478] But at least, like, if you're listening right now and you're thinking, you know, I have no chance.
[479] I'm going to start a startup, but I have no chance.
[480] I don't know.
[481] We did it.
[482] We made a bunch of really dumb mistakes initially.
[483] I think the question is, how quickly do you learn that you're making a mistake?
[484] And do you do the right thing immediately right after?
[485] So you didn't pay for those mistakes by, you know, by failure?
[486] failure.
[487] So, yeah, how quickly did you fix it?
[488] I guess there's a lot of ways to sneak up to this question of how the hell do you scale the thing?
[489] Other startups, if you have an idea, how do you scale the thing?
[490] Is it just AWS?
[491] And you try to write the kind of code that's easy to spread across a large number of instances, and then the rest is just put money into it?
[492] Basically, I would say a couple things.
[493] First off, don't even ask the question.
[494] Just find product market fit, duct tape it together, right?
[495] Like if you have to, I think there's a big caveat here, which I want to get to.
[496] But generally, all that matters is product market fit.
[497] That's all that matters.
[498] If people like your product, do not worry about when 50,000 people use your product, because you will be happy that you have that problem when you get there.
[499] I actually can't name many startups where they go from nothing to something overnight and they can't figure out how to scale it.
[500] There are some, but I think nowadays it's a, when I say a solved problem, like there are ways of solving it.
[501] The base case is typically that startups worry way too much about scaling way too early and forget that they actually have to make something that people like.
[502] That's the default mistake case.
[503] But what I'll say is, once you start scaling, I mean, hiring quickly people who have seen the game before and just know how to do it, it becomes a bit of, like, yeah, just throw instances at the problem, right?
[504] But the last thing I'll say on this that I think did save us, we were pretty rigorous about writing tests from the beginning.
[505] that helped us move very, very quickly when we wanted to rewrite parts of the product and know that we weren't breaking something else.
[506] Tests are one of those things where it's like, you go slow to go fast, and they suck when you have to write them because you have to figure it out.
[507] There are always those ones that break when you don't want them to break, and they're annoying, and it feels like you spend all this time.
[508] But looking back, I think that was long-term optimal; even with a team of four, it allowed us to move very, very quickly because anyone could touch any part of the product and know that they weren't going to bring down the site, or at least in general.
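A tiny example of the kind of test that pays for itself; both the function and the test are hypothetical, pytest-style, just to show how cheap the safety net can be.

```python
# "Go slow to go fast": a plain test that lets you rewrite internals later and know
# the observable behavior held. Function and test are hypothetical examples.
def rank_by_recency(photos):
    # photos: list of (photo_id, created_at_timestamp) tuples
    return [photo_id for photo_id, ts in sorted(photos, key=lambda p: p[1], reverse=True)]

def test_rank_by_recency_newest_first():
    photos = [("a", 100), ("b", 300), ("c", 200)]
    assert rank_by_recency(photos) == ["b", "c", "a"]
```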
[509] At which point do you know product market fit?
[510] How many users would you say?
[511] Is all it takes, like, 10 people, or is it 1,000?
[512] Is it 50,000?
[513] I don't think it is generally a question of absolute numbers.
[514] I think it's a question of cohorts and I think it's a question of trends.
[515] So, you know, it depends how big your business is trying to be, right?
[516] But if I were signing up 1,000 people a week and they all retained, like, the retention curves for those cohorts looked good, healthy, and even, like, as you started getting more people on the service, maybe those earlier cohorts started curving up again because now there are network effects and their friends are on the service. It totally depends what type of business you're in.
[517] But I'm talking purely social, right?
[518] I don't think it's an absolute number.
[519] I think it is a, I guess you could call it a marginal number.
[520] So I spent a lot of time when I work with startups asking them, like, okay, have you looked at that cohort versus this cohort, whether it's your clients or whether it's people signing up for the service?
[521] But a lot of people think you just have to hit some mark, like 10,000 people or 50,000 people. But really, there are seven-ish billion people in the world; most people, forever, will not know about your product.
[522] There are always more people out there to sign up.
[523] It's just a question of how you turn on the spigot.
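The cohort-versus-cohort look described above can be sketched in a few lines of Python; the data shape here (a signup week plus the set of weeks a user was active) is an assumption for illustration.

```python
# Sketch of cohort retention: for each signup week, what fraction of that cohort is
# still active N weeks later? The input shape is a hypothetical simplification.
from collections import defaultdict

def retention_curves(users, weeks_out=8):
    # users: list of dicts like {"signup_week": 0, "active_weeks": {0, 1, 3}}
    cohorts = defaultdict(list)
    for u in users:
        cohorts[u["signup_week"]].append(u)

    curves = {}
    for signup_week, members in cohorts.items():
        curve = []
        for offset in range(weeks_out):
            active = sum(1 for u in members if signup_week + offset in u["active_weeks"])
            curve.append(active / len(members))
        curves[signup_week] = curve  # a flat or rising tail is the healthy sign
    return curves
```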
[524] At that stage, the early stage, yourself, but also by way of advice, should you worry about money at all, how this thing's going to make money?
[525] Or do you just try to find product market fit and get a lot of users to enjoy using your thing?
[526] I think it totally depends, and that's an unsatisfying answer. I was talking with a friend today, he was one of our earlier investors, and he was saying, hey, like, have you been doing any angel investing lately? I said, not really, I'm just, like, focused on what I want to do next. And he said the number of financings has just gone bonkers, like, people are throwing money everywhere right now. And I think the question is, do you have an inkling of how you're going to make money?
[527] Or are you really just like waving your hands?
[528] I would not like to be an entrepreneur in the position of, well, I have no idea how this will eventually make money.
[529] That's not fun.
[530] If you are in an area, like, let's say you wanted to start a social network, right?
[531] Not saying this is a good idea, but if you did,
[532] there are only a handful of ways they've made money, and really only one way they've made money in the past, and that's ads.
[533] So, you know, if you have a service that's amenable to that and then I wouldn't worry too much about that because if you get to the scale, you can hire some smart people and figure that out.
[534] I do think that it is really healthy for a lot of startups these days, especially the ones doing, like, enterprise software, the Slacks of the world, et cetera,
[535] to be worried about money from the beginning, but mostly as a way of winning over clients and having stickiness.
[536] I like, of course you need to be worried about money, but I'm going to also say this again, which is, it's like long -term profitability.
[537] If you have a roadmap to that, then that's great.
[538] But if you're just like, I don't know, maybe never, like, we're working on this metaverse thing, I think maybe someday.
[539] I don't know.
[540] Like, that seems harder to me. So you have to be as big as Facebook to like finance that bet, right?
[541] Do you think it's possible?
[542] You said you're not saying it's necessarily good idea to launch a social network.
[543] Do you think it's possible today?
[544] Maybe you can put yourself in those shoes to launch a social network that achieves the scale of a Facebook or a Twitter or an Instagram and maybe even greater scale.
[545] Absolutely.
[546] How do you do it?
[547] asking for a friend.
[548] Yeah, if I knew, I'd probably be doing it right now and not sitting here.
[549] So, I mean, there's a lot of ways to ask this question.
[550] One is create a totally new product market fit, create a new market, create something like Instagram did, which is, like, create something kind of new, or literally out-compete Facebook at its own thing, or out-compete Twitter at its own thing.
[551] The only way to compete now, if you want to build a large social network, is to look for the cracks, look for the openings.
[552] You know, no one competed, I mean, no one competed with the core business of Google.
[553] No one competed with the core business of Microsoft.
[554] You don't go at the big guys doing exactly what they're doing.
[555] Instagram didn't win, quote unquote, because it tried to be a visual Twitter.
[556] Like, we spotted things that either Twitter wasn't going to do or refused to do, images in feed for the longest time, right?
[557] Or that Facebook wasn't doing or not paying attention to, because they were mostly desktop at the time and we were purely mobile, purely visual.
[558] Often there are opportunities sitting there.
[559] You just have to, you have to figure out like, I think like there's a strategy book.
[560] I can't remember the name, but it talks about moats, and just, like, the best place to play is where your competitor, like, literally can't pivot,
[561] because structurally they're set up not to be there.
[562] And that's where you win.
[563] And what's fascinating is, like, do you know how many people were like, images?
[564] Facebook does that, Twitter does that.
[565] I mean, how wrong were they?
[566] Really wrong.
[567] And these are some of the smartest people in Silicon Valley, right?
[568] But now Instagram exists for a while.
[569] How is it that Snapchat could then exist?
[570] Makes no sense.
[571] Like plenty of people would say, well, there's Facebook, no images.
[572] Okay, okay.
[573] Instagram, I'll give you that one.
[574] But wait, now another image -based social network is going to get really big.
[575] And then TikTok comes along.
[576] Like, the prior, so you asked me, is it possible?
[577] The only reason I'm answering yes is because my prior is that it's happened once every, I don't know, three, four, five years consistently.
[578] And I can't imagine there's anything structurally that would change that.
[579] So that's why I answer that way.
[580] And not because I know how.
[581] I just, when you see a pattern, you see a pattern.
[582] And there's no reason to believe that's going to stop.
[583] And it's subtle, too, because like you said, Snapchat and TikTok, they're all doing the same space of things.
[584] But there's something fundamentally different about like a three -second video and a five -second video and a 15 -second video and a one -minute video and a one -hour video.
[585] Right.
[586] Like fundamentally different.
[587] Fundamentally different.
[588] I mean, I think one of the reasons Snapchat exists is because Instagram was so focused on posting great, beautiful, manicured versions of yourself throughout time.
[589] And there was this enormous demand of like, hey, I really like this behavior.
[590] I love using Instagram, but man, I just wish I could share something going on in my day.
[591] Do I really have to put it on my profile?
[592] Do I really have to make it last forever?
[593] Do I really?
[594] And that opened up a door.
[595] It created a market.
[596] Right.
[597] And then what's fascinating is, Instagram had an explore page for the longest time.
[598] It was image driven, right?
[599] But there's absolutely a behavior where you open up Instagram and you sit on the explore page all day.
[600] That is effectively TikTok, but obviously focused on videos.
[601] And it's not like you could just put the explore page in TikTok form and it works.
[602] It had to be video.
[603] It had to have music.
[604] These are the hard parts about product development that are very hard to predict.
[605] But they're all versions of the same thing, with
[606] varying... if you line them up in a bunch of dimensions, they're just, like, different values on the same dimensions. Which is, like, I guess, easy to say in retrospect. But, like, if I were an entrepreneur going after that area, I'd ask myself, like, where's the opening? What needs to exist? Because TikTok exists now. So I wonder how much of what doesn't yet exist and can exist is in the space of algorithms, in the space of recommender systems, so in the space of how the feed is generated.
[607] So we kind of talk about the actual elements of the content.
[608] That's what we've been talking, the difference between photos, between short videos, longer videos.
[609] I wonder how much disruption it's possible in the way the algorithms work.
[610] Because a lot of the criticism towards social media is in the way the algorithms work currently.
[611] And it feels like, first of all, talking about product market fit,
[612] there's certainly a hunger for social media algorithms that do something different.
[613] I don't think anyone... everyone's complaining, this is hurting me and this is hurting society, but I keep doing it because I'm addicted to it.
[614] And they say, we want something different, but we don't know what.
[615] It feels like a, just different.
[616] It feels like there's a hunger for that.
[617] but that's in the space of algorithms.
[618] I wonder if it's possible to disrupt in that space.
[619] Absolutely.
[620] I have this thesis that the worst part about social networks is the people.
[621] It's a line that sounds funny, right?
[622] Because that's why you call it a social network.
[623] But what does social networks actually do for you?
[624] Like just think, you know, like imagine you were an alien and you landed and someone says, hey, there's this site.
[625] It's a social network.
[626] We're not going to tell you what it is, but just what does it do?
[627] And you have to explain it to them.
[628] Just two things.
[629] One is that people you know and have social ties with distribute updates, whether it's photos or videos, about their lives, so that you don't have to physically be with them, but you can keep in touch with them.
[630] That's one.
[631] That's like a big part of Instagram.
[632] That's a big part of Snap.
[633] It is not part of TikTok at all.
[634] So there's another big part, which is there's all this content out in the world that's entertaining, whether you want to watch it or you want to read it.
[635] And matchmaking between content that exists in the world and people that want that content turns out to be like a really big business.
[636] Search and discovery.
[637] Search and discovery.
[638] But my point is it could be video, it could be text, it could be websites, it could be, I mean, think
[639] back to, like, Digg, right, or StumbleUpon. Yeah. Right. Nice. Yeah. But, like, what did those do? Like, they basically distributed interesting content to you, right? I think the most interesting part, or the future, of social networks is going to be making them less social, because I think people are part of the root cause of the problem. So for instance, often in recommender systems we talk about two stages. There's a candidate generation step, which is just, like, of our trove of stuff that you might want to see, what small subset should we pick for you?
[640] Typically, that is grabbed from things your friends have shared, right?
[641] Then there's a ranking step, which says, okay, now given these 100, 200 things, depends on the network, right?
[642] Let's, like, be really good about ranking them and generally rank the things up higher that get the most engagement, right?
[643] So what's the problem with that?
[644] Step one is, we've limited everything you could possibly see to things that your friends, or maybe not friends but influencers, have chosen to share. What things do people generally want to share? They want to share things that are going to get likes, that are going to show up broadly. So they tend to be more emotionally driven, they tend to be more risqué, or whatever. So why do we have this problem? It's because we show people things people have decided to share, and those things self-select to be the things that are most divisive.
[645] So how do you fix that?
[646] Well, just imagine for a second: why do you have to grab things from what your friends have shared?
[647] Why not just like grab things?
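As a rough sketch of the two-stage recommender described above, here is what candidate generation and engagement-based ranking might look like, including the "less social" variant that pulls candidates from a global pool rather than from friends' shares. All names and types below are illustrative assumptions, not any platform's actual API.

```python
# Hypothetical sketch of a two-stage recommender: a candidate generation step
# followed by an engagement-weighted ranking step. Every name here (Item,
# friend_shares, global_pool) is illustrative, not a real API.

from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Item:
    item_id: str
    author_id: str
    predicted_engagement: float  # e.g. output of a trained model, between 0 and 1


def generate_candidates(
    user_id: str,
    friend_shares: Callable[[str], List[Item]],
    global_pool: Callable[[], List[Item]],
    use_global_pool: bool = False,
    limit: int = 200,
) -> List[Item]:
    """Step 1: from the trove of stuff the user might see, pick a small subset.

    The classic social-graph approach draws only from friends' shares; the
    'less social' alternative draws from a global pool regardless of who the
    user follows.
    """
    source = global_pool() if use_global_pool else friend_shares(user_id)
    return source[:limit]


def rank(candidates: List[Item]) -> List[Item]:
    """Step 2: order the candidates, highest predicted engagement first."""
    return sorted(candidates, key=lambda item: item.predicted_engagement, reverse=True)
```

Flipping `use_global_pool` to True is the whole "why not just grab things?" idea: the ranking step stays the same, and only the candidate source changes.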
[648] That's really fascinating to me. And that's something I've been thinking a lot about and just like, you know, why is it that when you log on to Twitter, you're just sitting there looking at things from accounts that you followed for whatever reason?
[649] And TikTok, I think, has done a wonderful job here, which is, like, you can literally be anyone.
[650] And if you produce something fascinating, it'll go viral.
[651] But, like, you don't have to be someone that anyone knows.
[652] You don't have to have built up a giant following.
[653] You don't have to have paid for followers.
[654] You don't have to try to maintain those followers.
[655] You literally just have to produce something interesting.
[656] That is, I think, the future of social networking.
[657] That's the direction things will head.
[658] And I think what you'll find is it's far less about people manipulating distribution and far more about what is like, is this content good?
[659] And good is obviously a vague definition that we spend hours on.
[660] But different networks, I think, will decide different value functions to decide what is good and what isn't good.
[661] And I think that's a fascinating direction.
[662] So that's almost like creating an internet.
[663] I mean, that's what Google did for web pages with, you know, PageRank, search.
[664] So, discovery. You don't follow anybody on Google. When you use a search engine, you just discover web pages. And so what TikTok is doing is saying, let's start from scratch, let's start a new internet and have people discover stuff on that new internet within a particular kind of pool of people. What's so fascinating about this is, like, the field of information retrieval.
[665] As I was studying this stuff, it would always use the word query and document.
[666] So I was like, why are they saying query and documents?
[667] Like, if you just stop thinking about a query as literally a search query, a query could be a person.
[668] I mean, a lot of the way, I'm not going to claim to know how Instagram or Facebook machine learning works today.
[669] But, you know, if you want to find a match for a query, the query is actually the attributes of the person.
[670] Their age, their gender, where they're
[671] from, maybe some kind of summarization of their interests.
[672] And that's a query, right?
[673] And that matches against documents.
[674] And by the way, documents don't have to be text.
[675] They can be videos.
[676] They're however long.
[677] I don't know what the limit is on TikTok these days.
[678] They keep changing it.
[679] My point is just you've got a query, which is someone in search of something that they want to match.
[680] And you've got the document, and it doesn't have to be text.
[681] It could be anything.
[682] And how do you match make?
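A minimal sketch of the query/document framing described here, assuming a made-up embedding of user attributes and content features into a shared vector space, with cosine similarity as the matchmaking score. Real systems learn these embeddings rather than hand-crafting them; everything below is for illustration only.

```python
# Treat the person as the "query" and any piece of content as the "document":
# embed both into the same vector space and match by similarity.
# The feature choices and the embedding are placeholders, not a real system.

import numpy as np


def embed_user(age: float, interests: np.ndarray) -> np.ndarray:
    """Turn a user's attributes (the 'query') into a vector.
    Here we just concatenate a scaled age with an interest vector."""
    return np.concatenate(([age / 100.0], interests))


def embed_document(content_features: np.ndarray) -> np.ndarray:
    """Turn a 'document' (photo, video, text...) into a vector of the same size."""
    return content_features


def match_score(query_vec: np.ndarray, doc_vec: np.ndarray) -> float:
    """Cosine similarity: higher means a better query-document match."""
    return float(np.dot(query_vec, doc_vec) /
                 (np.linalg.norm(query_vec) * np.linalg.norm(doc_vec) + 1e-9))


# Example: rank three documents for one user.
user = embed_user(age=30, interests=np.array([0.9, 0.1, 0.0]))
docs = [embed_document(np.array([0.3, 0.8, 0.2, 0.1])),
        embed_document(np.array([0.3, 0.9, 0.0, 0.0])),
        embed_document(np.array([0.3, 0.0, 0.0, 0.9]))]
ranked = sorted(range(len(docs)), key=lambda i: match_score(user, docs[i]), reverse=True)
```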
[683] And that's one of these like, I mean, I have spent a lot of time thinking about this.
[684] And I don't claim to have mastered it at all.
[685] But I think it's so fascinating about where that will go with new social networks.
[686] See, what I'm also fascinated by is metrics that are different than engagement.
[687] So the other thing from an alien perspective, what social networks are doing is they, in the short term, bring out different aspects of each human being.
[688] So first, let me say that an algorithm or a social network for each individual can bring out the best of that person or the worst of that person.
[689] There's a bunch of different parts to us, parts we're proud of and parts we're not so proud of.
[690] When we look at the big picture of our lives, when we look back 30 days from now, am I proud that I said those things or not?
[691] Am I proud that I felt those things?
[692] Am I proud that I experienced or read those things or thought about those things?
[693] Just in that kind of self-reflective way.
[694] And so coupled with that, I wonder if it's possible to have different metrics that are not just about engagement, but are about long -term happiness, growth of a human being.
[695] Where they look back and say, I'm a better human being for having spent 100 hours on that app. And that feels like it's actually strongly correlated with engagement in the long term. In the short term it may not be, but in the long term it's the same kind of thing where you really fall in love with the product. You fall in love with an iPhone, you fall in love with a car. What makes you fall in love is really being proud and, just in a self-reflective way, understanding that you're a better human being for having used the thing.
[696] And that's like where the, that's what great relationships are made from.
[697] It's not just like, you're hot and we like being together or something like that.
[698] It's more like, I'm a better human being because I'm with you.
[699] And that feels like a metric that could be optimized for by the algorithms.
[700] But anytime I kind of talk about this with anybody, they seem to say, yeah, okay, that's going to get out-competed immediately by engagement, especially if it's ad-driven.
[701] I just don't think so.
[702] I don't, I mean, a lot of it's just implementation.
[703] I'll say a couple of things.
[704] One is, to pull back the curtain on daily meetings inside these large social media companies: a lot of what management, or at least the people tweaking these algorithms, spend their time on is tradeoffs.
[705] And there are these things called value functions, which are like, okay, we can predict the probability that you'll click on this thing, or the probability that you'll share it, or the probability that you'll leave a comment on it, or the probability you'll dwell on it.
[706] Individual actions, right?
[707] And you've got this neural network that basically has a bunch of heads at the end and all of them are between zero and one and great, they all have values, right?
[708] or they all have probabilities.
[709] And then in these meetings, what they will do is say, well, how much do we value a comment versus a click versus a share versus a, and then maybe even some downstream thing, right, that has nothing to do with the item there, but like driving follows or something.
[710] And what typically happens is they will say, well, what are our goals for this quarter at the company?
[711] Oh, we want to drive sharing up.
[712] okay, well, let's turn down these metrics and turn up these metrics, and they blend them, right, into a single scalar which they're trying to optimize.
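A minimal sketch of that blending step, assuming made-up action names, probabilities, and weights: the per-action heads of a model get collapsed into the single scalar the ranking system optimizes, and "turning sharing up this quarter" just means changing a weight.

```python
# Sketch of the "value function" tradeoff: per-action probabilities from a
# multi-task model are blended into one scalar with hand-tuned weights.
# The action names, numbers, and weights are made up for illustration.

from typing import Dict

# Per-action probabilities, e.g. the sigmoid heads of a multi-task neural net.
predicted = {"click": 0.40, "share": 0.05, "comment": 0.10, "dwell": 0.70}

# Hand-tuned tradeoffs. "We want to drive sharing up this quarter" means
# turning the share weight up relative to the others.
weights = {"click": 0.5, "share": 3.0, "comment": 2.0, "dwell": 0.2}


def blended_value(p: Dict[str, float], w: Dict[str, float]) -> float:
    """Collapse the per-action probabilities into the single scalar the
    ranking system actually optimizes."""
    return sum(w[action] * p[action] for action in p)


score = blended_value(predicted, weights)  # one number used to sort the feed
```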
[713] That is really hard because invariably you think you're solving for, I don't know, something called meaningful interactions, right?
[714] This was the big Facebook pivot.
[715] And I don't actually have any internal knowledge.
[716] Like I wasn't in those meetings.
[717] But at least from what we've seen over the last month or so, it seems that by
[718] actually trying to optimize for meaningful interactions, it had all these side effects of optimizing for these other things.
[719] And I don't claim to fully understand them.
[720] But what I will say is that tradeoffs abound.
[721] And as much as you'd like to solve for one thing, if you have a network of over a billion people, you're going to have unintended consequences either way.
[722] And it gets really hard.
[723] So what you're describing is effectively a value model. This is the thing that I spent a lot of time thinking about: can you capture utility in a way that actually measures someone's happiness, and isn't just a, what do they call it, a surrogate problem, where you say, well, the more you use the product, the happier you are?
[724] That was always the argument at Facebook, by the way.
[725] It was like, well, people use it more, so they must be more happy.
[726] Turns out there are a lot of things you use more that make you less happy in the world. Not talking
[727] about Facebook, just, you know, let's think about gambling or whatever, things you can do more of that don't necessarily make you happier.
[728] So the idea that time equals happiness, obviously, you can't map utility and time together easily.
[729] There are a lot of edge cases.
[730] So when you look around the world and you say, well, what are all the ways we can model utility?
[731] That's one of those things where, please, if you know someone smart doing this, introduce me, because I'm fascinated by it, and it seems really tough.
[732] But the idea of reinforcement learning... like, everyone interesting I know in machine learning. I was really interested in recommender systems and supervised learning.
[733] And the more I dug into it, I was like, oh, literally everyone smart is working on reinforcement learning.
[734] Like literally everyone.
[735] You just made people at OpenAI and DeepMind very happy, yes.
[736] But what's interesting is it's one thing to train on a game. I mean, in that paper where they just took Atari and they used a ConvNet to basically train simple actions, mind-blowing, right?
[737] Absolutely mind -blowing.
[738] But it's a game, great.
[739] So now, what if you're constructing a feed for a person, right?
[740] Like, how can you construct that feed in such a way that optimizes for a diversity of experience, a long -term happiness, right?
[741] But that reward function, it turns out in reinforcement learning again, as I've learned, like reward design is really hard.
[742] And I don't know.
[743] Like how do you design a scalar reward for someone's happiness over time?
[744] I mean, do you have to measure dopamine levels?
[745] Like, do you have to...
[746] Well, you have to have a lot more signals from the human being.
[747] Currently, it feels like there aren't enough signals coming from the human beings using these algorithms.
[748] So for reinforcement learning to work well, you need to have a lot more data.
[749] It needs to have a lot of data, and that actually is a challenge for anyone who wants to start something, which is you don't have a lot of data.
[750] So how do you compete?
[751] But I do think back to your original point, rethinking the algorithm, rethinking reward functions, rethinking utility, that's fascinating.
[752] That's cool.
[753] And I think that's an open opportunity for a company that figures it out.
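A purely hypothetical sketch of the reward-design question raised above: blend a short-term engagement signal with some long-term wellbeing signal and score a sequence of sessions with a discounted return. Both signals, the blend, and the discounting are assumptions for illustration, not anyone's production system.

```python
# Hypothetical reward design for a feed-construction agent: mix immediate
# engagement with a slower wellbeing signal (e.g. a periodic "was this time
# well spent?" survey), then value whole sequences of sessions, not just one.


def session_reward(engagement: float, wellbeing: float, alpha: float = 0.3) -> float:
    """Blend an immediate engagement signal (0..1) with a slower wellbeing
    signal (0..1). alpha controls how much short-term engagement counts."""
    return alpha * engagement + (1.0 - alpha) * wellbeing


def discounted_return(rewards: list, gamma: float = 0.99) -> float:
    """Standard discounted return over a sequence of session rewards, so the
    agent cares about long-term outcomes, not just today's session."""
    total, discount = 0.0, 1.0
    for r in rewards:
        total += discount * r
        discount *= gamma
    return total


# Example: consistently high engagement but declining wellbeing yields a
# mediocre long-term return under this made-up reward.
sessions = [session_reward(0.9, 0.8), session_reward(0.9, 0.5), session_reward(0.9, 0.2)]
long_term_value = discounted_return(sessions)
```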
[754] I have to ask about April 2012 when Instagram, along with its massive employee base of 13 people, was sold to Facebook for $1 billion.
[755] What was the process like on a business level, engineering level, human level?
[756] What was that process of selling to Facebook like?
[757] What did it feel like?
[758] So I want to provide some context, which is I worked in corporate development at Google, which not a lot of people know, but corporate development
[759] is effectively the group that buys companies, right?
[760] You sit there and you acquire companies.
[761] And I had sat through so many of these meetings with entrepreneurs.
[762] We actually, fun fact, we never acquired a single company when I worked in corporate development.
[763] So I can't claim that I had like a lot of experience.
[764] But I had enough experience to understand, okay, like what prices are people getting and what's the process?
[765] And as we started to grow, you know, we were trying to keep this thing
[766] running, and we were exhausted, and we were 13 people. And I mean, trying to think back, I was probably 27 then, 37 now, so young on a relative basis, right? And we're trying to keep the thing running, and then, you know, we go out to raise money, and we're kind of like the hot startup at the time. And I remember going into a specific VC and saying, the terms we're looking for are a $500 million valuation.
[767] And I've never seen so many jaws drop, all in unison, right?
[768] And I was thanked and walked out the door very kindly after.
[769] And then I got a call the next day from someone who was connected to them, and they said, we just want to let you know that, like, it was pretty offensive that you asked for a $500 million valuation.
[770] And I can't tell if that was like just negotiating or what.
[771] But it's true.
[772] like no one offered us more, right?
[773] So can you clarify the number again?
[774] You said how many million?
[775] 500.
[776] 500 million.
[777] 500 million.
[778] Yeah, half a billion.
[779] Yeah.
[780] So in my mind, I'm anchored like, okay, well, literally no one's biting at 500 million.
[781] And eventually we would get Sequoia and Greylock and others together at 500 million, basically, post.
[782] It was 450 pre.
[783] I think we raised $50 million.
[784] But just like no one was used to seeing $500 million
[785] companies then.
[786] I don't know if it was because we were just coming out of the 2008 hangover and things were still in recovery mode.
[787] But then along comes Facebook.
[788] And after some negotiation, we two-x'd the number from half a billion to a billion.
[789] Yeah, it seems pretty good.
[790] You know, and I think Mark and I really saw eye to eye that this thing could be big.
[791] We thought we could, their resources would help us scale it.
[792] And in a lot of ways, it de-risked, I mean, it de-risked a lot of the employees' lives for the rest of their lives, including me, including Mike, right?
[793] I think I might have had like 10 grand in my bank account at the time, right?
[794] Like, we're working hard.
[795] We had nothing.
[796] So on a relative basis, it seems very high.
[797] And then I think the last company to exit for anywhere close to a billion was YouTube that I could think of.
[798] And thus began the giant, long bull run from 2012 all the way to where we are now, where I saw some stat yesterday about how many unicorns exist, and it's absurd.
[799] But then again, never underestimate technology and like the value it can provide and man, costs have dropped and man scale has increased and you can make businesses make a lot of money now.
[800] But on a fundamental level, I don't know, like, how do you even
[801] describe the decision to sell a company with 13 people for a billion dollars?
[802] So first of all, did it take a lot of guts to sit at a table and say 500 million or one billion with Mark Zuckerberg?
[803] It seems like a very large number with 13 people.
[804] Like, especially, it doesn't seem...
[805] It is.
[806] It is.
[807] They're all large numbers.
[808] Especially like you said before the unicorn parade.
[809] I like that.
[810] I'm going to use that.
[811] Unicorn Parade, yeah.
[812] You were at the head of the unicorn parade.
[813] It's the, yeah, it's a massive unicorn parade.
[814] Okay, so no, I mean, we knew we were worth, quote, unquote, a lot, but we didn't, I mean, there was no market for Instagram.
[815] I mean, it's not, you could mark, you couldn't mark to market this thing in the public markets.
[816] You didn't quite understand what it would be worth or was worth at the time.
[817] So in a market, an illiquid market where you have one buyer and one seller and you're going back and forth.
[818] And, well, I guess there were, like, VC firms who were willing to, you know, invest at a certain valuation. So, I don't know, you just go with your gut. And at the end of the day, I would say the hardest part of it was not realizing... like, when we sold, it was tough, because literally everywhere I'd go, restaurants, whatever, for a good six months after, there was a lot of attention on the deal, a lot of attention on the product, a lot of attention. It was kind of miserable, right? And you're like, wait, I made a lot of money, but why is this not great? And it's because, it turns out, you know... and I don't really keep in touch with Mark, but I've got to assume his job right now is not exactly the most happy job in the world. It's really tough when you're on top, and it's really tough when you're in the limelight. So the decision itself was like, oh cool, this is great, how lucky are we, right? So, okay, there's a million questions. Yeah, go. First of all, why is it hard to be on top? Why did you not feel good? Can you dig into that? I've heard, like, Olympic athletes say that after they win gold, they get depressed. Is it something like that, where it was kind of a thing you were working towards, some loose definition of success, and this surely, at least according to other startups, is what success looks like, and now, why don't I feel any better? I'm still human and I still have all the same problems. Is that the nature of it, or is it just, like, negative attention of some kind? I think it's all of the above. But to be clear, there was a lot of happiness, in terms of, like, oh my God, this is great, we won the Super Bowl of startups, right? Anyone who can get to a liquidity event of anything meaningful feels like, wow, this is what we started out to do. Of course we want to create great things that people love, but, like, we won in a big way. But yeah, there's this big, like, oh, if we won, what's next? They call it 'we have arrived' syndrome. I don't know where I can quote that from, but I remember reading about it at the time.
[819] I was like, oh, yeah, that's that.
[820] And I remember we had a product manager leave very early on when we got to Facebook and he said to me, I just don't believe I can learn anything at this company anymore.
[821] It's like it's hit its apex.
[822] We sold it.
[823] Great.
[824] I just don't have anything else to learn.
[825] So from 2012, all the way to the day I left in 2018, like the amount I learned and the humility with which I realized, oh, we thought we won.
[826] A billion dollars is cool, but like, there are a hundred billion dollar companies.
[827] And by the way, on top of that, we had no revenue.
[828] We had, I mean, we had a cool product, but we didn't scale it yet.
[829] And there's so much to learn.
[830] And then competitors and how fun was it to fight Snapchat?
[831] Oh, my God.
[832] Like, it was like Yankees Red Sox.
[833] It's great.
[834] Like, that's what you live for.
[835] You know, you win some, you lose some.
[836] But the amount you can learn through that process, what I've realized in life is that there is no end.
[837] There's always someone who has more.
[838] There's always more challenge, just at different scales.
[839] And this sounds like a little Buddhist, but everything is super challenging, whether you're like a small business or an enormous business.
[840] I say, like, choose the game you like to play, right?
[841] you've got to imagine that if you're an amazing basketball player, you enjoy to some extent practicing basketball.
[842] It's got to be something you love.
[843] It's going to suck.
[844] It's going to be hard.
[845] You're going to have injuries, right?
[846] But you've got to love it.
[847] And the same thing with Instagram, which is we might have sold, but it was like, great, there's one Super Bowl title.
[848] Can we win five?
[849] What else can we do?
[850] Now I imagine you didn't ask this, but okay, so I left.
[851] There's a little bit of like, what do you do next?
[852] Right? Like, how do you top that thing? It's the wrong question. The question is, when you wake up every day, what is the hardest, most interesting thing you can go work on? Because at the end of the day, we all turn into dirt, it doesn't matter, right? But what does matter is, can we really enjoy this life? Not in a hedonistic way, because that's, it's like the reinforcement learning thing, short-term versus long-term objectives. Can you wake up every day and truly enjoy what you're doing, knowing that it's going to be painful?
[853] Knowing that, like, no matter what you choose, it's going to be painful.
[854] Whether you sit on a beach or whether you manage 1,000 people or 10,000, it's going to be painful.
[855] So choose something that's fun to have pain.
[856] But yes, there was a lot of 'we have arrived,' and it's a maturation process
[857] you just have to go through.
[858] So no matter how much success there is, how much money you make, you have to wake up the next day and choose the hard life, whatever that means next.
[859] That's fun.
[860] The fun slash hard life.
[861] Well, hard life, that's fun.
[862] I guess what I'm trying to say is slightly different, which is just that no one realizes everything's going to be hard.
[863] Even chilling out is hard.
[864] And then you just start worrying about stupid stuff.
[865] Like, I don't know, like, did so-and-so forget to paint the house today?
[866] or like, did the gardener come or whatever?
[867] Like, or, oh, I'm so angry.
[868] My shipment of wine didn't show up.
[869] And I'm sitting here on the beach without my wine.
[870] I don't know.
[871] I'm making shit up now.
[872] But like...
[873] It turns out that even chilling, aka meditation, is hard work.
[874] Yeah.
[875] And at least meditation is like productive chilling, where you're like actually training yourself to calm down and be...
[876] But backing up for a moment, everything's hard.
[877] You might as well be like playing in the game you love to play.
[878] I just like playing and winning and...
[879] And I'm on the, on the, I'm still on the, I think the first half of life, knock on wood.
[880] And I've got a lot of years.
[881] And what am I going to do, sit around?
[882] And the other way of looking at this, by the way, imagine you made one movie and it was great.
[883] Would you just like stop making movies?
[884] No. Generally, you're like, wow, I really like making movies.
[885] Let's make another one.
[886] A lot of times, by the way, the second one or the third one, not that great.
[887] But the fourth one, awesome.
[888] And everyone forgets the second and the third one.
[889] So there's just this constant process of like, can I produce and is that fun?
[890] Is that exciting?
[891] What else can I learn?
[892] So this machine learning stuff for me has been this awesome new chapter of being like, man, that's something I didn't understand at all.
[893] And now I feel like I'm one tenth of the way there.
[894] And that feels like a big mountain to climb.
[895] So I distracted us from the original question.
[896] No, and we'll return to the machine learning, because I'd love to explore your interest there.
[897] But, I mean, speaking of sort of challenges and hard things, is there a possible world where sitting in a room with Mark Zuckerberg, with a $1 billion deal, you turn it down?
[898] Yeah, of course.
[899] What does that world look like?
[900] Why would you turn it down?
[901] Why did you take it?
[902] What was the calculation that you were making?
[903] Thus enters the world of counterfactuals and not really knowing.
[904] And if only we could run that experiment.
[905] Well, the universe exists.
[906] It's just running parallel to our own.
[907] Yeah.
[908] It's so fascinating, right?
[909] I mean, we're talking a lot about money, but the real question was, and I'm not sure you'll believe me when I say this, could we strap our little company on the side of a rocket ship and, like, get out to a lot of people really, really quickly with the support, with the talent, of a place like Facebook? I mean, people often ask me what I would do differently at Instagram today.
[910] And I say, well, I'd probably hire more carefully, because we showed up and, just like that, before I knew it,
[911] we had like 100 people on the team, then 200, then 300.
[912] I don't know where all these people were coming from.
[913] I never had to recruit them.
[914] I never had to screen them.
[915] They were just like internal transfers, right?
[916] So it's like relying on the Facebook hiring machine, which is quite sort of, I mean, it's an elaborate machine.
[917] It's great, by the way.
[918] They have really talented people there.
[919] But my point is, um, the choice was like, take this thing, put it on the side of a rocket ship that you know is growing very quickly.
[920] Like, I had seen what had happened when Ev sold Blogger to Google and then Google went public.
[921] Remember, we sold before Facebook went public.
[922] There was a moment at which the stock price was $17, by the way, Facebook stock price was $17.
[923] I remember thinking, what the, did I just do, right?
[924] And, um, now at $320-ish, I don't know where we are today, but, okay, the best thing, by the way, is, when the stock is down, everyone calls you a dope.
[925] And then when it's up, they also call you a dope, but just for a different reason, right?
[926] Like, you can't win.
[927] Lesson in there somewhere.
[928] Yeah.
[929] So, but, you know, the choice is to strap yourself to a rocket ship or to build your own.
[930] You know, Mr. Elon built his own, literally, with the rocket ship.
[931] That's a difficult choice, because there's a world... Actually, I would say something different, which is, Elon and others decided to sell PayPal for not that much. I mean, how much was it, about a billion dollars? I can't remember, something like that. Yeah. I mean, it was early, and, um, it's worth a lot more now. To then build new rocket ships... so this is the cool part, right? If you are an entrepreneur and you own a controlling stake in the company, not only is it really hard to do something else with your life, because all of the, you know, value is tied up in you as a personality attached to this company, right?
[932] But if you sell it and you get yourself enough capital and you, like, have enough energy, you could do another thing or 10 other things or in Elon's case, like a bunch of other things.
[933] I don't know, like I lost count at this point.
[934] And it might have seemed silly at the time.
[935] And sure, like if you look back, man, PayPal's worth a lot now, right?
[936] But I don't know, like, do you think Elon cares about, like, are we going to buy Pinterest or not? He created massive capital that allowed him to do what he wants to do, and that's awesome. That's more freeing than anything, because when you are an entrepreneur attached to a company, you've got to stay at that company for a really long time. It's really hard to remove yourself. But I'm not sure how much he loved PayPal versus SpaceX and Tesla. I have a sense that you love Instagram.
[937] Yeah, I loved enough to, like, work for six years beyond the deal.
[938] Which is rare, which is very rare.
[939] You chose.
[940] But can I tell you why?
[941] Sure.
[942] It's, um, there are not a lot of companies that you can be part of where the Pope's, like, I would like to sign up for your product.
[943] Like, I'm not a religious person at all.
[944] I'm really not.
[945] Yeah.
[946] But when you go to the Vatican and you're walking among the giant columns and you're hearing the music and everything, and the Pope walks in and he wants to press the sign-up button on your product.
[947] It's a moment in life, okay, no matter what your persuasion, okay?
[948] The number of doors and experiences that that opened up was, it was incredible.
[949] I mean, the people I got to meet, the places I got to go, I assume maybe like a payments company is slightly different, right?
[950] But that's why, like, it was so fun.
[951] And plus, I truly believed we were building such a great product.
[952] And I loved the game.
[953] It wasn't about the money.
[954] It was about the game.
[955] Do you think you had the guts to say no?
[956] I often think about this, like, how hard is it for an entrepreneur to say no?
[957] Because the peer pressure, so basically the sea of entrepreneurs in Silicon Valley are going to tell you, this is their dream.
[958] The thing sitting before you is the dream.
[959] To walk away from that seems nearly impossible, because Instagram could in 10 years be, you know, like we were talking about with Google.
[960] You could be making self-driving cars and building rockets that go to Mars and compete with SpaceX.
[961] Totally.
[962] And so that's an interesting decision to say, am I willing to risk it?
[963] And the reason I also say it's an interesting decision is because it feels like, per our previous discussion, if you're launching a social network company and you're successful, there's going to be that meeting, whatever that number is.
[964] If you're on this rocket ship of success, there's going to be a meeting with one of the social media, social network companies that want to buy you, whether it's Facebook or Twitter, but it could also very well be Google who seems to have like a graveyard of failed social networks.
[965] And it's, I mean, I don't know.
[966] I think about that how difficult it is for an entrepreneur to make that decision.
[967] How many have successfully made that decision, I guess?
[968] It's a big question.
[969] It's sad to me, to be honest, that too many make that decision perhaps for the wrong reason.
[970] Sorry, when you say make the decision, you mean to the affirmative.
[971] To the affirmative, yeah.
[972] Got it.
[973] Yeah.
[974] There are also companies that don't sell, right?
[975] And take the path and say, we're going to be independent, and then you've never heard of them again.
[976] Like, I remember Path, right, was one of our competitors early on.
[977] There's a big moment when they had, I can't remember what it was, like a $110 million offer from Google or something.
[978] It might have been larger.
[979] I don't know.
[980] And I remember there was this big TechCrunch article that was like, they turned it down after talking deeply about their values and everything.
[981] And I don't know the inner workings of Foursquare, but I'm certain there were many conversations over time where there were companies that wanted Foursquare as well.
[982] Recently, I mean, another company, there's Clubhouse, right?
[983] Like, I don't know.
[984] Maybe people were really interested in them, too.
[985] Like, there are plenty of moments where people say no. And we just forget that those things happen.
[986] We only focus on the ones where, like, they said yes and like, wow, like, what if they had stayed independent?
[987] And so, I don't know, I used to think a lot about this, and now I just don't, because I'm like, whatever, you know, things have gone pretty well, I'm ready for the next game. I mean, think about an athlete where, I don't know, maybe they do something wrong in the World Series or whatever, and if you let it haunt you for the rest of your career... why not just be like, I don't know, it was a game, next game, next shot, right? If you just move to that world, at least I have a next shot, right? No, that's beautiful. But, I mean, just on insights, it's funny you brought up Clubhouse. It is very true, it seems like Clubhouse is, you know, on the downward path, and it's very possible to imagine a billion-plus-dollar deal at some stage, maybe a year ago or half a year ago, from Facebook, from Google. I think Facebook was flirting with that idea too, and I think a lot of companies probably were. I wish it was more public.
[988] You know what?
[989] There's not like a badass public story about them making the decision to walk away.
[990] We just don't hear about it.
[991] And then we get to see the results of that success or the failure, more often failure.
[992] So a couple of things.
[993] One is, I would not assume Clubhouse is down for the count at all.
[994] They're young.
[995] They have plenty of money.
[996] They're run by really smart people.
[997] I'd give them like a very fighting chance to figure it out.
[998] There are a lot of times when people called Twitter down for the count, and they figured it out, and they seem to be doing well, right?
[999] So just backing up like and not knowing anything about their internals like there's a strong chance they will figure it out and that people are just down because they like being down about companies.
[1000] They like assuming that they're going to fail.
[1001] So who knows, right?
[1002] But let's take the ones in the past where like we know how it played out.
[1003] There are plenty of examples where people have turned down big offers and then you've just never heard from them again.
[1004] But we never focus on the companies because you just forget that those were big.
[1005] But inside your psyche, I think it's easy for someone with enough money to say money doesn't matter, which I think is like it's bullshit.
[1006] Of course money matters to people.
[1007] But at the moment, you just can't even grasp like the number of zeros that you're talking about.
[1008] It just doesn't make sense.
[1009] Right.
[1010] So to think rationally in that moment is not something many people are equipped to do, especially not people where, I think we had founded the company a year earlier, maybe two years, like a year and a half, and we were 13 people. But I will say, I still don't know if it was the right decision, because I don't have that counterfactual. I don't know that other world. I'm just thankful that, by and large, most people love Instagram, still do. By and large, people are very happy with the time we had there, and I'm proud of what we built. So, like, I'm cool. You know, now it's next shot, right?
[1011] Well, if we could just linger on this, Yankees versus Red Sox, the fun of it, the competition over, I would say, over the space of features.
[1012] So there are a bunch of features, like there's photos, there's one-minute videos on Instagram, there's IGTV, there's stories, there's reels, there's live.
[1013] So that sounds like a long list of too much stuff, but it's not, because it feels like they're close together but somehow, like we were saying, fundamentally distinct, each of the things I mentioned.
[1014] Maybe can you describe the philosophies, the design philosophies behind some of these, how you were thinking about it during the historic war between Snapchat and Instagram, or just in general, like this space of features that was discovered?
[1015] There's this great book by Clay Christensen called Competing Against Luck.
[1016] It's like a terrible title.
[1017] But within it, there's effectively an expression of this thing called jobs to be done theory.
[1018] And it's unclear if he came up with it or some of his colleagues did, but there are a bunch of places you can find people claiming to have come up with this jobs-to-be-done theory.
[1019] But the idea is if you, if you like zoom out and you look at your product, you ask yourself, why are people hiring your product.
[1020] Like, imagine every product in your life is effectively an employee. You know, you're the CEO of your life, and you hire products to be employees, effectively.
[1021] They all have roles and jobs, right?
[1022] Why are you hiring a product?
[1023] Why do you want that product to perform something in your life?
[1024] And what, like, what are the hidden reasons why you're, you're in love with this product?
[1025] Instagram was about sharing your life with others visually, period.
[1026] Why?
[1027] Because you feel connected with them.
[1028] You get to show off.
[1029] You get to feel good and cared about, right, with likes.
[1030] And it turns out that that will, I think, forever define Instagram.
[1031] And any product that serves that job is going to do very well.
[1032] Stories, let's take as an example, is very much serving that job.
[1033] In fact, it serves it better than the original product, because when you're large and have an enormous audience, you're worried about people seeing your stuff, or you're worried about it being permanent, so that a college admissions person is going to see a photo of you doing something.
[1034] And so it turns out that that is a more efficient way of performing that job than the original product was.
[1035] The original product still has its value, but at scale, these two things together work really, really well.
[1036] Now, I will claim that other parts of the product over time didn't perform that job as well.
[1037] I think IGTV probably didn't, right?
[1038] Shopping is like completely unrelated to what I just described.
[1039] But it might work, I don't know, right?
[1040] Products that succeed, I think, are products that all share this parent node of a job to be done that they have in common.
[1041] And then they're just like different ways of doing it, right?
[1042] Apple, I think, does a great job with this, right?
[1043] It's like managing your digital life.
[1044] and all the products just work together.
[1045] They sync.
[1046] They like, it's beautiful, right?
[1047] Even if they require like silly specific cords to work.
[1048] But they're all part of a system.
[1049] It's when you leave that system and you start doing something weird that people start scratching their head.
[1050] And I think you are less successful.
[1051] So I think one of the challenges Facebook has had throughout its life is that it has never fully, I think, appreciated the job to be done of the main product.
[1052] And what it's done is said, oh, there's a shiny object over there, that startup's getting some traction, let's go copy that thing. And then they're confused why it doesn't work. Why doesn't it work? It's because the people who show up for this don't want that. It's different. What's the purpose of Facebook? So, I remember, I was a very early Facebook user. The reason I was personally excited about Facebook is because, first of all, you can use your real name. Like, I can exist in this world, I could, like, formally exist.
[1053] I like anonymity for certain things, Reddit and so on.
[1054] But I wanted to also exist not anonymously so that I can connect with other friends of mine, not anonymously.
[1055] And there's a reliable way to know that I'm real and they're real and they're connecting.
[1056] And it's kind of like, I liked it for the reasons that people like LinkedIn, I guess.
[1057] But without the, like, not everybody is dressed up and being super polite; it's more like with friends. But then it became something much bigger than that. I suppose there's a feed, it became this, um, I mean, it became a place to discover content, to share content, that's not just about connecting directly with friends. I mean, it became something else, I don't even know what it is really. So you said Instagram is a place where you visually share your life.
[1058] What is Facebook?
[1059] Well, let's go back to the founding of Facebook and why it worked really well initially at Harvard.
[1060] And then Dartmouth and Stanford and I can't remember.
[1061] Probably MIT.
[1062] There were like a handful of schools in that first tranche, right?
[1063] It worked because there are communities that exist in the world that want to transact.
[1064] And when I say transact, I don't mean commercially.
[1065] I just mean they want to.
[1066] share, they want to coordinate, they want to communicate, they want a space for themselves.
[1067] And Facebook at its best, I think, is that.
[1068] And actually, if you look at the most popular products that Facebook has built over time, if you look at things like Groups, the Marketplace... Groups is enormous.
[1069] Yeah.
[1070] And groups is effectively, like everyone can found their own little Stanford or Dartmouth or MIT, right?
[1071] And find each other and share and communicate about something that matters deeply to them. That is the core of what Facebook was built around.
[1072] And I think today is where it stands most strongly.
[1073] It's brilliant.
[1074] It's the groups.
[1075] I wish groups were done better.
[1076] It feels like it's not a first-class citizen.
[1077] I know I may be saying something without much knowledge, but it feels like it's kind of bolted on while being used a lot. It feels like there needs to be a little bit more structure, in terms of discovery, in terms of... yeah, like, I mean, look at Reddit. Reddit is basically groups, but public and open and a little bit crazy, right? In a good way. Yeah. But there's clear product market fit for that specific use case, and it doesn't have to be a college. It can be anything. It can be a small group, a big group, it can be group messaging. Facebook shines, I think, when it leans into that.
[1078] I think when there are other companies that just seem exciting, and now all of a sudden the product shifts in some fundamental way to go try to compete with that other thing, that's when I think consumers get confused.
[1079] Even if you can be successful, even if you can compete with that other company, even if you can figure out how to bolt it on, eventually you come back and you look at the app and you're like, I just don't know why I open this app.
[1080] Like, why?
[1081] Like, there are too many things going on.
[1082] And that was always a worry.
[1083] I mean, you listed all the things at Instagram and it almost gave me a heart attack.
[1084] Like, way too many things.
[1085] But I don't know.
[1086] Entrepreneurs get bored.
[1087] They want to add things.
[1088] They want to, like, right?
[1089] I don't have a good answer for it, except for that, I think being true to your original use case and not even original use case, but sorry, actually, not use case.
[1090] Original job.
[1091] There are many use cases under that job.
[1092] Being true to that.
[1093] and like being really good at it over time and morphing as needs change, I think that's how to make a company last forever.
[1094] And I mean, honestly, my main thesis about why Facebook is in the position it is today is, if they had had a series of product launches that delighted people over time, I think they'd be in a totally different world.
[1095] So just imagine for a moment. And, by the way, Apple's entering this.
[1096] But like Apple for so long, just like product after product, you couldn't wait for it.
[1097] You stood in line for it.
[1098] You talked about it.
[1099] You got excited.
[1100] Amazon makes your life so easy.
[1101] It's like, wow, I needed this thing.
[1102] And it showed up at my door two days later.
[1103] And like both of these companies, by the way, Amazon, Apple have issues, right?
[1104] There are labor issues, whether it's here in the U.S. or in China.
[1105] There are environmental issues.
[1106] There are.
[1107] But, like, when's the last time you heard a
[1108] large chorus being like, these companies better pay for what they're doing on these things, right?
[1109] I think Facebook's main issue today is like, you need to produce a hit.
[1110] If you don't produce hits, it's really hard to keep consumers on your side.
[1111] Then people just start picking on you for a variety of reasons, whether it's right or wrong.
[1112] I'm not even going to place a judgment right here and right now.
[1113] I'm just going to say that it is way better to be in a world where you're producing hits and consumers love what you're doing, because then they're on your side.
[1114] And I think the past 10 years for Facebook have been fairly hard on this dimension.
[1115] So, and by hits, it doesn't necessarily mean financial hits.
[1116] It feels like to me what you're saying is something that brings a joy.
[1117] Yeah.
[1118] A product that brings joy to some fraction of the population.
[1119] Yeah.
[1120] I mean, TikTok isn't just literally an algorithm.
[1121] In some ways, TikTok's content and algorithm have more sway now over the American psyche than Facebook's algorithm, right?
[1122] It's visual, it's video.
[1123] By the way, it's not defined by who you follow.
[1124] It's defined by some magical thing that, by the way, if someone wanted to tweak to show you a certain type of content for some reason, they could.
[1125] But people love it.
[1126] So as a CEO, let me ask you a question, because leadership matters.
[1127] This is a complicated question. Why is Mark Zuckerberg distrusted, disliked, and sometimes even hated by many people in public? That is a complicated question. Um, well, the premise... I'm not sure I agree with the premise. And I can expand that to include an even more mysterious question for me: Bill Gates. Hmm. What is the Bill Gates version of the question?
[1128] Do you think people hate Bill Gates?
[1129] No, distrust.
[1130] Ah.
[1131] So, uh, take away one.
[1132] It's a checklist.
[1133] For Mark Zuckerberg, I think distrust is the primary one, but there's also, like, a dislike. Maybe hate is too strong a word, but just if you look at the articles that are being written and so on.
[1134] There's a dislike, and it's confusing
[1135] to me, because the public picks certain individuals and they attach certain kinds of emotions to those individuals.
[1136] Yeah.
[1137] So someone just recently said there's a strong case that founder-led companies have this problem, and that a lot of Mark's issues today come from the fact that he is a visible founder with this story that people have watched, both in a movie and as they followed along, and he's this boy-wonder kid who became one of the world's richest people. And he's no longer Mark the person; he's Mark, this image of a person with enormous wealth and power. And in today's world, we have issues with enormous wealth and power for a variety of reasons, one of which is we've been stuck inside, you know, for a year and a half, two years; one of which is a lot of people were really unhappy about, not the last election, but the last last election. And where do you take out that anger?
[1138] Who do you blame, but the people in charge?
[1139] That's one example or one reason why I think a lot of people express anger or resentment or unhappiness with Mark.
[1140] At the same time, I don't know, I pointed out to that person, I was like, well, I don't know, like I think a lot of people really like Elon.
[1141] Like, Elon, arguably, kept his, you know, factory open here throughout COVID protocols, which arguably a lot of people would be against, while saying a bunch of crazy, offensive things on the internet. He basically, you know, gives the middle finger to the SEC on Twitter. And I don't know, I'm like, well, there's a founder, and people kind of like him. So I do think that being the founder slash CEO of a company that's a social network company is like an extra level of difficulty. If life is a video game, you just chose the harder video game. So, I mean, that's why it's interesting to ask you, because you were the founder and CEO of a social network. Right, exactly. That's why I challenge it. Exactly. But you're one of the rare examples. Even Jack Dorsey is disliked, not to the same degree, but it just seems harder when you're running a social media company.
[1142] It's interesting.
[1143] I never thought of Jack as just like, I think generally he's well respected.
[1144] Yeah, I think so.
[1145] I think you're right.
[1146] But like he's not loved.
[1147] Yeah.
[1148] And I feel like you, I mean, to me, Twitter is an incredible thing.
[1149] Again, can I just come back to this point, which seems oversimplistic, but like I really do think how a product makes someone feel.
[1150] They ascribe that feeling to the founder.
[1151] So make people feel good.
[1152] So think about it.
[1153] Let's just go with this thesis first.
[1154] Sure.
[1155] I like it, though.
[1156] Amazon's pretty utilitarian, right?
[1157] It delivers brown boxes to your front door.
[1158] Sure, you can have Alexa and you can have all these things, right?
[1159] But in general, it delivers stuff quickly to you at a reasonable price, right?
[1160] I think Jeff Bezos is a wonderfully wealthy, thoughtful, smart guy, right?
[1161] But like, people kind of feel that way about them.
[1162] They're like, wow, this is really big.
[1163] We're impressed that this is really big.
[1164] But he's doing the same space stuff Elon's doing, but they don't necessarily ascribe the same sense of wonder, right?
[1165] Now let's take Elon.
[1166] And again, this is a pet theory.
[1167] I don't have much proof other than my own intuition.
[1168] He is literally about living the future.
[1169] Mars, space, it's about wonder, it's about going back to that feeling as a kid when you looked up to the stars and asked, is there life out there?
[1170] People get behind that because it's a sense of hope and excitement and innovation.
[1171] And like, you can say whatever you want, but we ascribe that emotion to that person.
[1172] Now, let's say you're on a social network and people make you kind of angry because they disagree with you or they say something ridiculous or they're living a FOMO type life where you're like, wow, I wish I was doing that thing.
[1173] I think Instagram, if I were to think back, by and large, when I was there, was not about FOMO, was not about this influencer economy, although it certainly became that way closer to the end.
[1174] It was about the sense of wonder and happiness and beautiful things in the world.
[1175] And I don't know.
[1176] I mean, I don't want to have a blind spot, but I don't think anyone had a strong opinion about it one way or the other.
[1177] For the longest time, the way people explained it to me is, if you want toxicity, you go to Facebook or Twitter;
[1178] if you want to feel good about life, you go to Instagram, to enjoy and celebrate life.
[1179] And my experience talking to people has been that they gave me the benefit of the doubt because of that.
[1180] But if your experience of the product is that it kind of makes you angry, it's where you argue.
[1181] I mean, a big part of it with Jack might be that he wasn't actually the CEO for a very long time and only became CEO again recently.
[1182] So I'm not sure how much of the connection got made.
[1183] But in general, I mean, if you hate, you know, I'm just thinking about other companies than are in tech companies.
[1184] If you hate like what a company is doing or it makes you not feel happy, I don't know, like people are really angry about Comcast or whatever.
[1185] Are they even called Comcast anymore?
[1186] It's like Xfinity or something, right?
[1187] They had to rebrand.
[1188] They became Meta, right?
[1189] It's like, but my point is if it makes you angry.
[1190] That's beautiful, yeah.
[1191] But the thing is, this is me saying this.
[1192] I think your thesis is very strong and cool,
[1193] correct, or has elements of correctness, but I still personally put some blame on individuals.
[1194] Of course.
[1195] I think, you said Elon looking up, there's something about childlike wonder to him, like, to his personality, his character.
[1196] Something about, I think, more so than others where people can trust them.
[1197] And there's, I don't know, Sundar Pichai is an example of somebody where there's some kind... it's hard to put into words, but there's something about the human being where he's trustworthy.
[1198] Yeah.
[1199] He's human in a way that connects to us.
[1200] And the same with Satya Nadella.
[1201] I mean, some of these folks, something about us is drawn to them, even when they're flawed.
[1202] Even like, so your thesis really holds up for Steve Jobs, because I think people didn't like Steve Jobs, but he delivered products that they fell in love with every time.
[1203] I guess you could say that the CEO, the leader, is also a product.
[1204] And if they keep delivering a product that people like, by being in public and saying things that people like, that's also a way to make people happy.
[1205] But from a social network perspective, it makes me wonder how difficult it is to explain to people why certain things happen.
[1206] Like to explain machine learning, to explain why certain things happen, the woke mob effect, or certain kinds of bullying, which is human nature combined with the algorithm. And it's very difficult to control how the spread of quote-unquote misinformation happens, it's very difficult to control for that. And so you try to decelerate certain parts and you create more problems than you solve, and anything that looks at all like censorship can create huge amounts of problems as a slippery slope.
[1207] And then you have to inject humans to oversee the machine learning algorithms.
[1208] And anytime you inject humans into the system, it's going to create a huge number of problems.
[1209] And I feel like it's up to the leader to communicate that effectively, to be transparent.
[1210] First of all, design products that don't have those problems.
[1211] And second of all, when they have those problems, to be able to communicate with them.
[1212] I guess that's all going to, when you run a social network company, your job is hard.
[1213] Yeah, I will say the one element that you haven't named, that I think you're getting at, is just bedside manner, which Steve Jobs, I never worked for him, I never met him in person, had an uncanny ability to have in public.
[1214] I mean, some of the best clips of Steve Jobs are from, I want to say, maybe the 80s, when he's on stage getting questions from the audience about life or...
[1215] And he'll take this question that is like, how are you going to compete with blah?
[1216] And it's super boring.
[1217] And I don't even know the name of the company.
[1218] And his answer is as if you just asked like your grandfather the meaning of life.
[1219] Yeah.
[1220] Yeah.
[1221] And you sit there and you're just like, what?
[1222] Like, and there's that bedside manner.
[1223] And if you lack that or if that's just not intuitive to you, I think that it can be a lot harder to gain the trust of people.
[1224] And then add on top of that the missteps
[1225] companies make, right? I don't know if you have any friends from the past where maybe they crossed you once, or maybe you get back together and you're friends again, but you just never really forget that thing. It's human nature not to forget. I'm Russian. You cross me once, we solve the problem. So my point is, humans don't forget. And if there are times in the past where they feel like they don't trust the company or the company hasn't had their back, that is really hard to earn back, especially if you don't have that bedside manner.
[1226] And again, like, I'm not attributing this specifically to Mark because I think a lot of companies have this issue where, one, you have to be trustworthy as a company and live by it, and live by those actions.
[1227] And then, two, I think you need to be able to be really relatable in a way that's very difficult if you're worth, like, what these people are.
[1228] It's really hard.
[1229] Yeah.
[1230] Jack does a pretty good job of this.
[1231] by being a monk.
But also, like, Jack eschews attention.
[1233] Like, he's not out there almost on purpose.
[1234] He's just working hard, doing square, right?
Like, I literally shared a desk like this with him at Odeo.
[1236] I mean, just normal guy who likes painting.
[1237] Like, I remember he would leave early on, like, Wednesdays or something to go to, like, a painting class.
[1238] Yeah.
[1239] And he's creative.
[1240] He's thoughtful.
[1241] I mean, money makes people, like, more creative and more thoughtful, like, extreme versions of themselves, right?
[1242] And this was a long, long time ago.
[1243] You mentioned that he asked you to do some kind of JavaScript thing.
[1244] We were working on some JavaScript together.
[1245] That's hilarious.
[1246] Like pre -Twitter, early Twitter days, you and Jack Dorsey are in a room together, talking about JavaScript, solving some kind of menial problem.
[1247] Terrible problems.
[1248] Yeah, I mean, not terrible, just like boring, boring widget.
I think it was the Odeo widget we were working on at the time.
[1250] I'm surprised anyone paid me to be in the room as an intern, because I didn't really provide any value.
[1251] I'm very thankful to anyone who included me back in the day.
[1252] It was very helpful.
[1253] So thank you if you're listening.
I mean, so this is Odeo, the precursor to Twitter?
First of all, did you have any anticipation that this Jack Dorsey guy could also be the head of a major social network?
[1256] And second, did you learn anything from the guy that, like, do you think it's a coincidence that you two were in the room together?
And coincidence meaning, like, why does the world play its game in a certain way where these two future founders of social networks end up in the same room?
[1258] I don't know.
[1259] It's so weird, right?
[1260] I mean, it's also weird that Mark showed up in our fraternity, my sophomore year, and we got to know each other then, like long before Instagram.
[1261] It's a small world, but let me tell a fun story about Jack.
[1262] We're at Odeo, and I don't know.
[1263] I think Ev was feeling like people weren't working hard enough or something.
[1264] Nice.
[1265] And I can't remember exactly what he had.
[1266] He created this thing where every Friday, I don't know if it was every Friday.
[1267] I only remember this happening once, but he had a, like a statuette, it's like of Mary.
[1268] And in the bottom, it's hollow, right?
[1269] And I remember on a Friday, he decided he was going to let everyone vote for who had worked the hardest that week.
[1270] We all voted.
[1271] Closed ballot, right?
[1272] We all put it in a bucket.
[1273] And he tallied the votes.
[1274] And then whoever got the most votes, as I recall, got the statuette.
[1275] And in the statuette was a thousand bucks.
I recall there was a thousand bucks. Now, it might have been a hundred bucks.
[1278] But let's call it a thousand.
[1279] It's more exciting that way.
[1280] It felt like a thousand.
[1281] It did to me for sure.
[1282] I actually got two votes.
[1283] I was very happy.
[1284] We were a small company, but as the intern, I got at least two votes.
[1285] So everybody knew how many votes they got individually?
[1286] Yeah.
[1287] And I think it was one of these self -accountability things.
[1288] Anyway, I remember Jack just getting like the vast majority of votes from everyone.
[1289] And I remember just thinking like, like I couldn't imagine he would become what he'd become and do what he would do.
[1290] But I had a profound respect that the new guy who I really liked worked that hard.
[1291] And you could see his dedication.
[1292] even then, and that people respected him.
[1293] That's the one story that I remember of him, like working with him specifically from that summer.
Can I take a small tangent on that?
[1295] Of course.
[1296] There's kind of a pushback in Silicon Valley a little bit against hard work.
Can you speak to that, the sort of thing you admired in seeing the new guy working so hard?
[1298] That thing, what is the value of that thing in a company?
[1299] See, this is, like, just to be very frank, it drives me nuts.
Like, I saw this really funny video on TikTok.
[1301] Was it on TikTok?
[1302] It was like, I'm taking a break from my mental health to work on my career.
[1303] I thought that was funny.
So it's like, oh, it's often phrased the opposite way, right?
[1305] Okay, so a couple of things.
[1306] I have worked so hard to do the things that I did.
[1307] Like, Mike and I lost years off of our lives, staying up late, figuring things out, the stress that comes with the job.
[1308] I have a lot more gray hair now than I did back then.
[1309] It requires an enormous amount of work.
[1310] And most people aren't successful, right?
[1311] But even the ones that do, don't skate by.
[1312] I am okay if people choose not to work hard because I don't actually think there's anything in this world that says you have to work hard.
[1313] But I do think that great things require a lot of hard work.
[1314] So there's no way you can expect to change the world without working really hard.
[1315] And by the way, even changing the world, you know, the folks that I respect the most have nudged the world in like a slight direction.
[1316] Slight.
[1317] Very, very slight.
[1318] Like, even if Elon accomplishes all the things he wants to accomplish, we will have nudged the world in a slight direction.
[1319] But it requires enormous amount.
[1320] There was an interview with him where he was just like, he was interviewed, I think, at the Tesla factory and he was like, work is really hard.
[1321] This is actually unhealthy.
And I can't recall the exact words, but he was, like, visibly shaken about how hard he had been working.
[1323] And he was like, this is bad.
[1324] And unfortunately, I think to have great outcomes, you actually do need to work at like three standard deviations above the mean.
[1325] But there's nothing saying that people have to go for that.
[1326] See, the thing is, but what I would argue, this is my personal opinion, is nobody has to do anything, first of all.
[1327] Exactly.
[1328] They certainly don't have to work hard.
[1329] Exactly.
[1330] But I think hard work in a company should be admired.
[1331] I do too.
And you should not feel, like... you shouldn't feel good about yourself for not working hard.
[1333] Like, so for example, I don't have to work out.
[1334] I don't have to run.
[1335] I hate running.
But, like, I certainly don't feel good if I don't run, because I know it's good for my health.
Like, there's certain values, I guess, is what I'm trying to get at.
[1338] There's certain values that you have in life.
[1339] It feels like there's certain values that companies should have.
[1340] And hard work is one of the things.
[1341] I think that should be admired.
[1342] I often ask this kind of silly question just to get a sense of people, like if I'm hiring and so on.
[1343] I just ask if they think it's better to work hard or work smart.
[1344] It was helpful for me to get a sense of people from that.
[1345] Because you think, like the right...
[1346] The answer is both.
[1347] What's that?
[1348] The answer is both.
[1349] I usually try not to give them that, but sometimes I'll say both if that's an option.
But a lot of people, a surprising number, will say work smart. And they're usually people who don't know how to work smart, and they're literally just lazy. There's two effects behind that: one is laziness, and the other is ego. When you're younger and you say it's better to work smart, it means you think you know what it means to work smart at this early stage.
[1351] To me, people that say work hard or both, they have the humility to understand, like, I'm going to have to work my ass off because I'm too dumb to know how to work smart.
[1352] And people who are self -critical in this way, in some small amount, you have to have some confidence.
[1353] But if you have humility, that means you're going to actually eventually figure out what it means to work smart.
[1354] And then to actually be successful, you should do both.
So I have a very particular take on this, which is that no one's forcing you to do anything; all choices have consequences.
[1356] So if you major in, I don't know, theoretical literature, I don't even know if that's a major.
I'm just making something up.
As opposed to regular literature.
[1359] Applied literature.
[1360] Yeah, think about like theoretical Spanish lit from the 14th century.
[1361] Like just make up your esoteric thing.
And then the number of people I went to Stanford with who get out in the world and they're like, wait, what, I can't find a job? Like, no one wants a theoretical... Like, there are plenty of counterexamples of people who have majored in esoteric things and gone on to be very successful, so I just want to be clear it's not about the major. But every choice you make, whether it's to have kids... Like, I love my children. It's so awesome to have two kids, and it is so hard to work really hard and also have kids. It's really hard. And there's a reason why certain very successful people don't have... or not successful, but people who run very, very large companies or startups have chosen not to have kids for a while, or chosen not to, like, prioritize them.
[1364] Everything's a choice.
[1365] And like, I choose to prioritize my children because, like, I want to do that, right?
[1366] So everything's a choice.
[1367] Now, once you've made that choice, I think it's important that the contract is clear, which is to say, let's imagine you were joining a new startup.
it's important that that startup communicate that, like, the expectation is we're all working really, really hard right now. You don't have to join the startup, but if you do, just know. It's almost as if you join, I don't know, pick your sports team. Let's go back to the Yankees for a second. You want to join the Yankees, but you don't really want to work that hard. You don't really want to do batting practice or pitching practice or whatever for your position, right? That, to me, is wacko.
[1369] And that's actually the world that it feels like we live in in tech sometimes where people both want to work for the Yankees because it pays a lot.
[1370] But like, don't actually want to work that hard.
[1371] That I don't fully understand.
[1372] Because if you sign up for some of these things, just sign up for it.
[1373] But it's okay if you don't want to sign up for it.
[1374] There's so many wonderful careers in this world that don't require 80 hours a week.
But when I read about companies going to, like, four-day work weeks and stuff, I just chuckle, because I can't get enough done with
[1376] a seven -day week.
[1377] I don't know how.
[1378] And people will say, oh, you're just not working smart.
[1379] And it's like, no, I work pretty smart, I think, in general.
[1380] Like, I wouldn't have gotten to this point if I hadn't, like, some amount of working smart.
[1381] And there is balance, though.
[1382] So I used to be like a pretty big cyclist.
[1383] I don't do it much anymore just because of kids and, like, prioritizing other things, right?
[1384] But one of the most important things to learn as a cyclist is to take a rest day.
But to me and to cyclists, resting is a function of optimizing for the long run. It's not a thing that you do for its own merits. It's actually, like, if you don't rest, your muscles don't recover, and then you're just not training as efficiently. The successful people I've known, in terms of athletes, they hate rest days, but they know they have to do it for the long term. They think their opposition is getting stronger and stronger, and that's the feeling, but you know it's the right thing, and usually you need a coach to help you. Yeah, totally. So I use this thing called TrainingPeaks, and it's interesting because it actually mathematically shows, like, where you are on the curve and all this stuff.
[1386] But you have to, like, you have to have that rest, but it's a function of going harder for longer.
Again, it's this reinforcement learning thing, like, planning for the aggregate over the long run. But a lot of people will hide behind laziness by saying that they're trying to optimize for the long run, and they're not.
[1388] They're just not working very hard.
[1389] But again, you don't have to sign up for it.
[1390] It's totally cool.
[1391] Like, I don't think less of people for, like, not working super hard.
[1392] It's just like don't sign up for things that require working super hard.
[1393] And some of that requires for the leadership to have the guts, the boldness to communicate effectively at the very beginning.
I mean, sometimes I think most of the problems arise from the fact that the leadership is kind of hesitant to communicate the socially difficult truth of what it takes to be at this company.
[1395] So they kind of say, hey, come with us.
[1396] We have snacks.
[1397] You know, but...
[1398] Unlimited vacation.
[1399] Yeah.
[1400] You know, Ray at Bridgewater is always fascinating because, you know, people, it's been called like a cult on the outside or cult -ish.
[1401] But what's fascinating is like, they just don't give on their principles.
[1402] They're like, listen, this is what it's like to work here.
[1403] We record every meeting.
[1404] We're like brutally honest.
[1405] And that's not going to feel right to everyone.
[1406] And if it doesn't feel right to you, totally cool.
[1407] Just go work somewhere else.
[1408] But if you work here, you are signing up for this.
[1409] And that's, that's been fascinating to me because it's honesty up front.
[1410] It's a system in which you operate.
[1411] And if it's not for you, like no one's forcing you to work there, right?
[1412] I actually did.
So I did a conversation with him and kind of got stuck in a funny moment, which is, at the end, I asked him to give me honest feedback on how I did in the interview.
And he didn't really give it, I don't think.
[1415] He was super nice.
He asked me, he's like, well, tell me, did you accomplish what you were hoping to accomplish? I was like, that's not... I'm asking you, as an objective observer of two people talking, how did we do today? And then he gave me this politician answer: well, I feel like we've accomplished successful communication of ideas, which is, I'd love to spread some of the ideas in Principles and so on. Back to my original point, it's really hard, even for Ray Dalio, to give feedback. And one of the other things I learned from him, and just people in that world, is, like, man, humans really like to pretend that they've come to some kind of meeting of the minds. Like, if there's conflict, if you and I have conflict, it's always better to meet face to face, right, than on the phone.
[1417] Slack is not great, right?
[1418] Email's not great.
[1419] But face -to -face, what's crazy is you and I get together and we actively try to, even if we're not actually solving the conflict, we actively try to paper over the conflict.
[1420] Oh, yeah, it didn't really bother me that much.
[1421] Oh, yeah, I'm sure you didn't mean it.
[1422] But like, no, in our minds, we're still there.
[1423] Yeah.
[1424] So this is one of the things that as a leader, you always have to be digging, especially as you ascend.
[1425] Straight to the conflict.
[1426] Yeah, but as you ascend, no one wants to tell you you're crazy.
No one wants to tell you your idea is bad.
[1428] And you're like, oh, I'm going to be a leader.
[1429] And the idea is, well, I'm just going to ask people.
[1430] No one tells you.
[1431] So like, you have to look for the markers knowing that literally just people aren't going to tell you along the way and be paranoid.
[1432] I mean, you asked about selling, you know, the company.
I think one of the biggest differences between me and a lot of other entrepreneurs is, like, I wasn't completely confident we could do it. Like, that we could be alone and actually be great.
[1436] And if any entrepreneur is honest with you, they also feel that way.
[1437] But a lot of people are like, well, I have to be cocky and just say, I can do this on my own.
[1438] We're going to be fine.
[1439] We're going to crush everyone.
[1440] Some people do say that.
[1441] And then it's not right.
And they fail.
[1443] But being honest in that moment with yourself, with those close to you.
[1444] And also, uh, you talked about the personality.
[1445] of leaders and who resonates and who doesn't.
[1446] It's rare that I see leaders be vulnerable, rare.
[1447] And one thing I tried to do at Instagram, at least internally, was like, say when I screwed up and like point out how I was wrong about things and point out where my judgment was off.
[1448] Everyone thinks they have to bat a thousand, right?
[1449] Like, that's crazy.
The best quant hedge funds in the world bat, like, 50.001%. They just take a lot of bets, right?
[1452] Renaissance.
[1453] They might bat 51%, right?
[1454] But holy hell, like, the question isn't, are you right every single time and you have to seem invincible?
[1455] The question is, how many at -bats do you get?
And are you better on average, right, with enough bets and enough at-bats that your aggregate can be very high?
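(For the curious, here is a quick toy simulation of that point; the win rate, edge, and bet sizing are made-up numbers, not any real fund's statistics. The idea is simply that a tiny edge per bet barely matters over a handful of bets but compounds over many.)

```python
import random

def simulate(win_rate=0.51, bets=1_000, edge=0.01, seed=0):
    """Toy model: each bet grows or shrinks a notional bankroll by `edge`,
    winning with probability `win_rate`. Purely illustrative."""
    random.seed(seed)
    bankroll = 1.0
    for _ in range(bets):
        if random.random() < win_rate:
            bankroll *= 1 + edge
        else:
            bankroll *= 1 - edge
    return bankroll

print(simulate(bets=1_000))     # slight edge, few at-bats: modest result
print(simulate(bets=100_000))   # same edge, many at-bats: compounds dramatically
```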
[1457] I mean, Steve Jobs was wrong at a lot of stuff.
[1458] The Newton was too early, right?
Macs, not quite right.
[1460] There was even a time where he said, like, no one will ever want to watch video on the iPod.
[1461] Totally wrong.
[1462] But who cares if you come around and realize your mistake and fix it?
[1463] It becomes just like you said, harder and harder when your ego grows and the number of people around you that say positive things towards you grows.
I actually think it's really valuable. Like, let's imagine a counterfactual where Instagram became worth, like, $300 billion or something crazy, right?
[1465] I kind of like that my life is relatively normal now.
[1466] When I say relatively, you get what I mean.
[1467] I'm not making a claim that I live a normal life.
[1468] But like I certainly don't live in a world where there are like 15 Sherpas following me, like fetching me water or whatever.
[1469] Like that's not how it works.
[1470] I actually like that I have a sense of humility of like I may not found another thing that's nearly as big.
[1471] so I have to work twice as hard or I have to like learn twice as much.
[1472] I have to, we haven't talked about machine learning yet, but my favorite thing is all these like famous, you know, tech guys who have worked in the industry pontificating about the future of machine learning and how it's going to kill us all.
[1473] And like, I'm pretty sure they've never tried to build anything with machine learning themselves.
[1474] Yes.
So there's a nice line between people that have actually built stuff with machine learning, like actually programmed something, or at least understand some of those fundamentals, and the people that are just saying philosophical stuff for journalists and so on. It's an interesting line to walk, because the people who program are often not philosophers. No, or don't have the attention, they can't write an op-ed for the Wall Street Journal. Like, it doesn't work. So it's nice to be both a little bit, like, to have elements of both. My point is, the fact that I have to build stuff from scratch, or that I choose to, like, it's humbling. Yeah. I mean, again, I have a lot of advantages, but my point is, it's awesome to be back in a game where you have to fight. That is, that's fun. So being humble, being vulnerable, it's an important aspect of a leader, and I hope it serves me well. But, like, I can't fast-forward 10 years to know. That's just my game plan.
[1476] Before I forget, I have to ask you one last thing on Instagram.
What do you think about the whistleblower, Frances Haugen, recently coming out and saying that Facebook is aware of Instagram's harmful effect on teenage girls, as per their own internal research studies on the matter?
What do you think about this baby of yours, Instagram, being under fire now, as we've been talking about, under the leadership of Facebook?
You know, I often question, where does the blame lie? Is the blame at the people that originated the network, me, right? Is the blame at, like, the decision to combine the network with another network with a certain set of values? Is the blame at how it gets run after I left? Like, is it the driver or is it the car, right? Is it that someone enabled these devices in the first place, if you go to an extreme, right?
[1480] Or is it the users themselves just human nature?
[1481] Is it just the way of human nature?
[1482] Sure.
[1483] And like the idea that we're going to find a mutually exclusive answer here is crazy.
[1484] There's not one place.
[1485] It's a combination of a lot of these things.
[1486] And then the question is like, is it true at all?
[1487] Right.
[1488] Like I'm not actually saying that's not true or that it's true, but there's always more nuance here.
[1489] Do I believe that social media has an effect on young people?
[1490] Well, it's got to.
[1491] They use it a lot.
[1492] And I bet you there are a lot of positive effects.
[1493] And I bet you there are negative effects, just like any technology.
[1494] And where I've come to in my thinking on this is that I think any technology has negative side effects, the question is, as a leader, what do you do about them?
[1495] And are you actively working on them or do you just like not really believe in them?
[1496] If you're a leader that sits there and says, well, we're going to put an enormous amount of resources against this.
[1497] We're going to acknowledge when there are true criticisms, we're going to be vulnerable and that we're not perfect.
[1498] And we're going to go fix them and we're going to be held accountable along the way.
[1499] I think that people generally really respect that.
But I think that where Facebook, I think, has had issues in the past is where they say things like... I can't remember what Mark said about misinformation during the election.
[1502] There was that, like, famous quote where he was like, it's pretty crazy to think that Facebook had anything to do with this election.
[1503] Like, that was something like that quote.
[1504] And I don't remember what stage he was on.
[1505] Yeah, yeah.
[1506] But ooh, that did not age well, right?
[1507] Like, you have to be willing to say, well, maybe there's, there's something there.
[1508] And, and, wow, like, I want to go look into it and truly believe it in your gut.
[1509] But if people look at you and how you act and what you say and don't believe you truly feel that way, it's not just the words you say.
but how you say them, and whether people believe you actually feel the pain of having caused any suffering in the world.
[1511] So to me, it's much more about your actions and your posture post -event than it is about debugging the why.
[1512] Because I don't know, is it, like, I don't know this research.
[1513] It was written well after I left, right?
[1514] Like, is it the algorithm?
[1515] Is it the explorer page?
Is it the People You May Know unit connecting you to, you know, ideas that are dangerous?
[1517] Like, I really don't know.
[1518] So we'd have to have a much deeper, I think, dive to understand where the blame lies.
[1519] What's very unpleasant to me to consider, now I don't know if this is true, but to consider the very fact that there might be some complicated games being played here.
[1520] For example, you know, as somebody, I really love psychology.
[1521] And I love it enough to know that the field is pretty broken in the following way.
[1522] It's very difficult to study human beings well at scale.
Because the questions you ask affect the results. You can basically get any results you want. And so you have an internal Facebook study that asks some question, of which we don't know the full details, and there's some kind of analysis, but that's just one little tiny slice into some much bigger picture. And so you can have thousands of employees at Facebook, and one of them comes out and picks whatever narrative, knowing that they become famous. Couple that with the other really uncomfortable thing I see in the world, which is journalists seem to understand they get a lot of clickbait attention from saying something negative about social networks, certain companies. Like, they even get some clickbait stuff about Tesla, especially when there's a public, famous CEO type of person. They get a lot of views on the negative, not the positive. The positive they'll get, I mean, it actually goes to the thing you were saying before, if there's a hot, sexy new product, that's great to look forward to.
They get positive coverage on that, but absent a product, it's nice to have, like, the CEO messing up in some kind of way.
And so couple that with the whistleblower and with this whole dynamic of journalism and so on, with The Social Dilemma being a really popular documentary.
It's like, all right, my concern is there's deep flaws in human nature here, in terms of things we need to deal with, like the nature of hate, of bullying, all those kinds of things. And then there's people who are trying to use that, potentially, to become famous and make money off of blaming others for causing more of the problem, as opposed to helping solve the problem. So I don't know what to think. I'm not saying this is... I'm just uncomfortable with, I guess, not knowing what to think about any of this, because a bunch of folks I know that work at Facebook on the machine learning side, like Yann LeCun...
[1527] I mean, they're quite upset by what's happening because there's a lot of really brilliant good people inside Facebook.
[1528] They're trying to do good.
And so, like, all of this press... Yann is one of them.
And he has an amazing team of machine learning researchers.
Like, he's really upset with the fact that people don't seem to understand that the portrayal does not represent the full nature of the efforts going on at Facebook.
[1532] So I don't know what to think about that.
You just, I think, very helpfully explained the nuance of the situation and why it's so hard to understand.
[1534] But a couple things.
[1535] One is, I think I have been surprised at the scale with which some product manager can do an enormous amount of harm to a very, very large company by releasing a trove of documents.
[1536] Like, I think I read a couple of things.
I think I read a couple of them when they got published, and I haven't even spent any time going deep. Part of it is, like, I don't really feel like reliving a previous life. But, um, wow, talk about challenging the idea of open culture and, like, what that does to Facebook internally, given how Facebook was built. Like, I remember my office, we had this, like, no-visitors rule around my office because we always had, like, confidential stuff up on the walls, and people were super angry
[1538] because they're like, that goes against our culture of transparency.
And, like, Mark's in the fish cube or whatever they call it, the aquarium, I think they called it, where literally anyone could see what he was doing at any point.
[1540] And I don't know.
I mean, other companies like Apple have been quiet slash locked down.
[1542] Snapchat's the same way for a reason.
[1543] And I don't know what this does to transparency on the inside of startups that value that.
[1544] I think that it's a seminal moment.
[1545] And you can say, well, you should have nothing to hide, right?
[1546] But to your point, you can pick out documents that show anything, right?
[1547] But I don't know.
So what happens to transparency inside of startups, and the culture that startups or companies in the future will grow with?
[1549] Like the startup of the future that becomes the next Facebook will be locked down.
[1550] And what does that do?
[1551] Yeah.
[1552] So that's part one.
[1553] Part two.
[1554] Like, I don't think that you could design.
a more well-orchestrated handful of events, from the 60 Minutes interview to releasing the documents in the way that they were released, at the right time. That takes a lot of planning and partnership, and it seems like she has a partner at some firm, right, that probably helped a lot with this. But man, at a personal level, if you're her, you'd have to really believe in what you are doing, really believe in it, because you are personally putting your ass on the line, right?
[1556] Like, you've got a very large company that doesn't like enemies, right?
[1557] It takes a lot of guts.
And I don't love these conspiracy theories about, like, oh, she's being financed by some person or people.
[1559] Like, I don't love them because that's like the easy thing to say.
I think the Occam's razor here is, like, someone thought they were doing something wrong and was, like, very, very courageous. And I don't know if courageous is the word, but, so without getting into, like, is she a martyr?
[1561] Is she courageous?
[1562] Is she right?
[1563] Like, let's put that aside for a second.
[1564] Then there are the documents themselves.
[1565] They say what they say.
To your point, a lot of the things that, like, people have been worried about are already in the documents or have already been said externally.
[1567] And I don't know, I'm just like, I'm thankful that I am focused on new things with my life.
[1568] Well, let me just say, I just think it's a really hard problem that probably Facebook and Twitter are trying to solve.
[1569] I'm actually just fascinated by how hard this problem is.
There are fundamental issues at Facebook in tone and in the approach of how product gets built and the objective functions.
But organizations are not people.
So, Yann and FAIR, right?
[1573] Like, there are a lot of really great people who, like, literally just want to push reinforcement learning forward.
[1574] They literally just want to teach a robot to touch, feel, lift, right?
[1575] Like, they're not thinking about political misinformation, right?
[1576] Yeah.
[1577] But there's a strong connection between what funds that research and an enormously profitable machine that has tradeoffs.
[1578] and one cannot separate the two.
[1579] You are not completely separate from the system.
[1580] So I agree.
It can feel really frustrating, if you're internal there, to feel that you're working on something completely unrelated and that your group's good.
[1582] I can understand that.
[1583] But there's some responsibility still.
[1584] You have to acknowledge, it's like the right value thing.
[1585] You have to look in the mirror and see if there's problems and you have to fix those problems.
[1586] Yeah.
You've mentioned machine learning and reinforcement learning quite a bit.
I mean, to me, social networks are one of the exciting places, recommender systems, where machine learning is applied.
[1590] Where else in the world, in the space of possibilities over the next five, 10, 20 years, do you think we're going to see impact of machine learning when you try it?
[1591] On a philosophical level, on a technical level, what do you think?
[1592] Or within social networks themselves?
Well, I think the obvious answer is climate change, right? Like, think about how much fuel, or just waste, there is in energy consumption today because we don't plan accordingly, because we take the least efficient route. Or the logistics and stuff of supply chains, all that kind of stuff. Yeah. I mean, listen, if we're going to fight climate change, one really awesome way to do it is figure out how to optimize how we operate as a species and minimize the amount of energy we consume to maximize whatever economic impact we want to have.
[1594] Because right now, those two are very much tied together.
[1595] And I don't believe that that has to be the case.
[1596] There's this really interesting, you've read it.
[1597] For people who are listening, there's this really interesting paper on reinforcement learning and energy consumption inside buildings.
[1598] It's like one of the seminal ones, right?
[1599] But imagine that at massive scale.
[1600] That's super interesting.
[1601] I mean, they've done, like, resource planning for servers for peak load using reinforcement learning.
[1602] I don't know if that was at Google or somewhere else, but like, okay, great, you do it for servers, but what if you could do it for just capacity and general energy capacity for cities and planning for traffic?
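(As a rough illustration of the kind of building-control problem being referenced, here is a minimal tabular Q-learning sketch for a made-up thermostat. The environment, rewards, and numbers are invented for illustration and are not taken from the paper or systems mentioned above.)

```python
import random

# Toy environment: keep a room near 22°C while minimizing energy use.
TEMPS = list(range(16, 29))          # discretized room temperatures (states)
ACTIONS = [0, 1]                     # 0 = cooling off, 1 = cooling on

def step(temp, action):
    drift = random.choice([0, 1])            # the room slowly heats up
    temp = max(TEMPS[0], min(TEMPS[-1], temp + drift - (2 if action else 0)))
    energy_cost = 1.0 if action else 0.0     # running the chiller costs energy
    comfort_penalty = abs(temp - 22) * 0.5   # being far from 22°C is uncomfortable
    return temp, -(energy_cost + comfort_penalty)

def train(episodes=2000, alpha=0.1, gamma=0.95, epsilon=0.1):
    q = {(t, a): 0.0 for t in TEMPS for a in ACTIONS}
    for _ in range(episodes):
        temp = random.choice(TEMPS)
        for _ in range(50):                  # 50 control steps per episode
            if random.random() < epsilon:
                action = random.choice(ACTIONS)
            else:
                action = max(ACTIONS, key=lambda a: q[(temp, a)])
            nxt, reward = step(temp, action)
            best_next = max(q[(nxt, a)] for a in ACTIONS)
            q[(temp, action)] += alpha * (reward + gamma * best_next - q[(temp, action)])
            temp = nxt
    return q

q = train()
policy = {t: max(ACTIONS, key=lambda a: q[(t, a)]) for t in TEMPS}
print(policy)   # typically: cool only once the room is warm enough to hurt comfort
```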
[1603] And, of course, there's all the self -driving cars and I don't know.
[1604] Like, I'm not going to pontificate like crazy ideas using reinforcement learning or machine learning.
[1605] it's just so clear to me that humans don't think quickly enough.
[1606] So it's interesting to think about machine learning, helping a little bit at scale.
[1607] So a little bit to a large number of people that has a huge impact.
So if you optimize, say, Google Maps, something like that, trajectory planning... or what, did MapQuest do it first?
[1609] Yeah, getting here, I looked and it was like, here's the most energy efficient route.
[1610] And I was like, I'm going to be late.
[1611] I need to take the fastest as opposed to unrolling the map.
Yeah, yeah. And that's going to be very inefficient no matter what. I was definitely, the other day, like, part of the epsilon of epsilon-greedy with Waze, where I was sent on, like, a weird route that I could tell, they're like, we just need to collect data on this road. Like, Kevin's definitely going to be the guinea pig, and great, now we have it. Did you at least feel pride going through it? Oh, I was like, oh, this is fun. Like, now they get data about this weird shortcut. And actually I hit all the green lights and it worked.
[1613] I'm like, well, this is a problem.
[1614] Bad data.
[1615] Bad data.
[1616] They're just going to imagine.
I could see you slowing down and stopping at a green light just to give them the right kind of data.
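(For readers unfamiliar with the term, a minimal epsilon-greedy sketch of the route-picking idea being joked about here. The route names and travel times are invented, and this is not how Waze actually works.)

```python
import random

# Toy epsilon-greedy route chooser: usually exploit the best-known route,
# occasionally explore an uncertain one to gather data (the "guinea pig" trip).
routes = {"highway": [], "surface_streets": [], "weird_shortcut": []}  # observed travel times

def choose_route(epsilon=0.1):
    untried = [r for r, times in routes.items() if not times]
    if untried:                                   # try every route at least once
        return random.choice(untried)
    if random.random() < epsilon:                 # explore: collect fresh data
        return random.choice(list(routes.keys()))
    # exploit: pick the route with the best average observed time
    return min(routes, key=lambda r: sum(routes[r]) / len(routes[r]))

def record_trip(route, minutes):
    routes[route].append(minutes)

# Simulated commutes: the shortcut is usually slower, but occasionally you hit all greens.
true_means = {"highway": 20, "surface_streets": 25, "weird_shortcut": 30}
for _ in range(200):
    r = choose_route()
    record_trip(r, random.gauss(true_means[r], 5))

print({r: round(sum(t) / len(t), 1) for r, t in routes.items()})
```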
[1618] But to answer your question, like I feel like that was fairly unsatisfying.
[1619] And it's easy to say climate change.
But what I would say is at Instagram, everything we applied machine learning to got better for users and it got better for the company.
[1621] I saw the power.
[1622] I didn't fully understand it as an executive.
And I think that's actually one of the issues. And when I say I didn't fully understand, I mean the mathematics of it.
[1624] Like, I understand what it does.
[1625] I understand that it helps.
[1626] But there are a lot of executives now that talk about it.
And the way that they talk about it is the way they talked about the internet, like, 10 years ago. They're like, we're going to build mobile.
[1628] And you're like, what does that mean?
[1629] They're like, we're just going to do mobile.
[1630] And you're like, okay.
So my sense is the next generation of leaders will have grown up having had classes in reinforcement learning, supervised learning, whatever, and they will be able to thoughtfully apply it to their companies and the places it is needed most.
[1633] And that's really cool.
[1634] Because I mean, talk about efficiency gains.
[1635] That's what excites me the most about it.
[1636] Yeah.
[1637] So there's, it's interesting, just to get a fundamental first principles understanding of certain concepts of machine learning.
So, supervised learning, from an executive perspective: you have to have a lot of humans label a lot of data. So the question there is, okay, can we gather a large amount of data that can be labeled well? And that's the question Tesla asked: can we create a data engine that keeps sending an imperfect machine learning system out there, and whenever it fails, it gives us data back, we label it by human, and we send it back and forth? Then there is, and this is what Yann LeCun's excited about, self-supervised learning, where you do much less human labeling and there's some kind of mechanism for the system to learn by itself on human-generated data.
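(A schematic sketch of that "data engine" loop, kept deliberately tiny: the "model" is just a single threshold and the "human labeler" is an oracle function, both stand-ins invented for illustration rather than anything Tesla actually runs.)

```python
import random

# Deploy an imperfect model, harvest the cases it gets wrong, have "humans"
# label them, retrain, repeat.
random.seed(0)

def true_label(x):           # stands in for the human labeler / ground truth
    return x > 0.6

def predict(x, threshold):
    return x > threshold

def evaluate(threshold, samples):
    return sum(predict(x, threshold) == true_label(x) for x in samples) / len(samples)

threshold = 0.3              # initial, poorly calibrated model
labeled = []                 # accumulated human-labeled data

for round_num in range(5):
    field_data = [random.random() for _ in range(1000)]        # what the fleet sees
    failures = [x for x in field_data if predict(x, threshold) != true_label(x)]
    labeled += [(x, true_label(x)) for x in failures]          # "send back, label by human"
    # retrain: pick the threshold that best fits all labeled data so far
    candidates = [i / 100 for i in range(100)]
    threshold = max(candidates,
                    key=lambda t: sum((x > t) == y for x, y in labeled))
    print(f"round {round_num}: threshold={threshold:.2f}, "
          f"accuracy={evaluate(threshold, field_data):.3f}")
```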
And then there's reinforcement learning, which is basically applying the AlphaZero technology that allows, through self-play, learning how to solve the game of Go and achieving incredible levels at the game of chess.
Can you formulate the problem you're trying to solve in a way that's amenable to reinforcement learning, and can you get the right kind of signal at scale? Because you need a lot of signal. And that's kind of fascinating, to see which part of a social network you can convert into a reinforcement learning problem. The fascinating thing about reinforcement learning, I think, is that we now have learned to apply neural networks to approximate, you know, the Q function, basically the values for any state and action.
[1642] And that is fascinating because we used to just like, I don't know, have like a linear regression, like hope it worked and that was the fanciest version of it.
[1643] But now you look at it, I'm like trying to learn this stuff.
[1644] And I look at it, I'm like, there are like 17 different acronyms of different ways you can try to apply this.
[1645] No one quite agrees like what's the best.
Generally, if you're trying to, like, build a neural network, there are pretty well-trodden ways of doing that.
You use Adam, use ReLU; like, there are just general good ideas.
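(A minimal sketch of that "well-trodden" recipe applied to the Q-function idea discussed above: a tiny Q-network with ReLU activations trained with Adam on a made-up one-dimensional toy problem. It assumes PyTorch is installed and is purely illustrative.)

```python
import random
import torch
import torch.nn as nn

q_net = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 2))  # state -> Q(left), Q(right)
optimizer = torch.optim.Adam(q_net.parameters(), lr=1e-3)
gamma, epsilon = 0.9, 0.2

def step(state, action):                      # action 0 = left, 1 = right
    nxt = max(0, min(4, state + (1 if action else -1)))
    reward = 1.0 if nxt == 4 else -0.05       # reach the goal at position 4, small cost per move
    return nxt, reward, nxt == 4

for episode in range(500):
    state = 0
    for _ in range(20):
        s = torch.tensor([[float(state)]])
        action = random.randrange(2) if random.random() < epsilon else int(q_net(s).argmax())
        nxt, reward, done = step(state, action)
        with torch.no_grad():
            target = reward + (0.0 if done else gamma * q_net(torch.tensor([[float(nxt)]])).max())
        loss = (q_net(s)[0, action] - target) ** 2   # one-step TD error
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        state = nxt
        if done:
            break

# After training, the greedy policy should mostly say "go right" at every state.
print([int(q_net(torch.tensor([[float(s)]])).argmax()) for s in range(5)])
```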
[1648] And in reinforcement learning, I feel like the consensus is like it totally depends.
[1649] And by the way, it's really hard to get it to converge.
And it's noisy. And so there are all these really interesting ideas around building simulators, you know, like, for instance, in self-driving, right?
[1651] Like you don't want to like actually have someone getting an accident to learn that an accident is bad.
[1652] So you start simulating accidents, simulating aggressive drivers, just simulating crazy dogs that run into the street.
[1653] Wow, fascinating, right?
[1654] Like my mind starts racing.
[1655] And then the question is, okay, forget about self -driving cars.
[1656] Let's talk about social networks.
[1657] How can you produce a better, more thoughtful experience using these types of algorithms?
And honestly, in talking to some of the people that work at Facebook and old Instagrammers, most people are like, yeah, we tried a lot of things, didn't quite ever make it work.
[1659] I mean, for the longest time, Facebook ads was effectively a logistic regression, okay?
I don't know what it is now, but if you look at the paper they published back in the day, it was literally just a logistic regression.
[1661] It made a lot of money.
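(A bare-bones logistic regression click predictor of the general sort being described; the features, weights, and data below are synthetic and have nothing to do with the actual ads system.)

```python
import math
import random

random.seed(1)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Synthetic training data: 3 binary features, click probability depends on them.
def make_example():
    x = [random.random() < 0.5 for _ in range(3)]
    p_click = sigmoid(2.0 * x[0] + 1.0 * x[1] - 1.5 * x[2] - 1.0)
    return [float(v) for v in x], 1.0 if random.random() < p_click else 0.0

data = [make_example() for _ in range(20_000)]
weights, bias, lr = [0.0, 0.0, 0.0], 0.0, 0.1

for epoch in range(5):
    for x, y in data:
        p = sigmoid(sum(w * xi for w, xi in zip(weights, x)) + bias)
        error = p - y                       # gradient of log loss w.r.t. the logit
        weights = [w - lr * error * xi for w, xi in zip(weights, x)]
        bias -= lr * error

print([round(w, 2) for w in weights], round(bias, 2))  # roughly recovers the generating weights
```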
[1662] So even at these like extremely large scales, if we are not yet touching what reinforcement learning can truly do, imagine what the next 10 years looks like.
[1663] Yeah.
[1664] How cool is that?
[1665] It's amazing.
[1666] So I really like the use of reinforcement learning as part of the simulation, for example, like with self -driving cars, it's modeling pedestrians.
[1667] So the nice thing about reinforcement learning, it can be used to learn agents within the world.
[1668] So they can learn to behave properly.
[1669] Like you can teach pedestrians to, like you don't hard code the way they behave.
[1670] They learn how to behave.
[1671] In that same way, I do have a hope, what is it, Jack Dorsey talks about healthy conversations.
[1672] You talked about meaningful interactions, I believe.
[1673] Like simulating interactions, so you can learn how to manage that.
[1674] It's fascinating.
So most of your algorithm development happens in virtual worlds.
[1676] And then you can really learn how to design the interface, how you design a bunch of aspects of the experience, in terms of how you select what's shown in the feed, all those kinds of things.
It feels like if you can connect reinforcement learning to that, that's super exciting.
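(A toy sketch of that idea: a simulated feed where a simple epsilon-greedy policy is tuned against simulated users, with a reward that mixes engagement and a crude "healthy interaction" term. Every number and the reward design are invented for illustration; no real system is implied.)

```python
import random

random.seed(2)
CONTENT = ["friend_photo", "news", "outrage_bait"]

def simulated_user_response(item):
    engagement = {"friend_photo": 0.5, "news": 0.4, "outrage_bait": 0.8}[item]
    wellbeing  = {"friend_photo": 0.6, "news": 0.3, "outrage_bait": -0.7}[item]
    clicked = random.random() < engagement
    return (1.0 if clicked else 0.0) + 0.8 * wellbeing   # reward trades off the two terms

rewards = {c: [] for c in CONTENT}
for _ in range(5000):
    if random.random() < 0.1 or any(not r for r in rewards.values()):   # explore
        choice = random.choice(CONTENT)
    else:                                                                # exploit
        choice = max(CONTENT, key=lambda c: sum(rewards[c]) / len(rewards[c]))
    rewards[choice].append(simulated_user_response(choice))

print({c: round(sum(r) / len(r), 2) for c, r in rewards.items()})
# With the wellbeing term included, the policy settles on friend_photo rather than outrage_bait.
```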
[1678] Yep.
[1679] And I think if you have a company and leadership that believe in doing the right things and can apply this technology in the right way, some really special stuff can happen.
It is most likely going to be a group of people we've never heard about, starting up from scratch, right?
[1681] And you asked if, like, new social networks could be built.
[1682] I've got to imagine they will be.
[1683] and whoever starts it might be some kids in a garage that took these classes from these people, you, right?
[1684] And they're building all of these things with this tech at the core.
[1685] So I'm trying not to be someone who just like throws around reinforcement learning as a buzzword.
[1686] I truly believe that it is the most cutting edge in what can happen in social networks.
[1687] And I also believe it's super hard.
[1688] Like it's super hard to make it work.
[1689] It's super hard to do it at scale.
[1690] super hard to find people that truly understand it.
[1691] So I'm not going to say that like, I think it'll be applied in social networks before we have true self -driving.
[1692] Let me put it that way.
[1693] We could argue about this for a long time, but yes, I agree with you.
[1694] I think self -driving is way harder than people realize.
[1695] Oh, absolutely.
[1696] Let me ask you in terms of that kid in the garage or those couple of kids in the garage, what advice would you give to them if they want to start a new social network or a business, what advice would you give to somebody with a big dream and a young startup?
[1697] To me, you have to choose to do something that even if it fails, like it was so fun, right?
[1698] Like, we never started Instagram knowing it was going to be big.
[1699] We started Instagram because we loved photography.
[1700] We loved social networks.
I had seen what other social networks had done, and I thought, hmm, maybe we could do a spin on this. But, like, nowhere was our fate predestined.
[1703] It wasn't like, it wasn't written out anywhere that everything was going to go great.
[1704] And I often think the counterfactual, like, what if it had not gone well?
[1705] I would be like, I don't know, that was fun.
[1706] We raised some money.
[1707] We learned some stuff.
[1708] And does it position you well for the next experience?
[1709] That's the advice that I would give to anyone wanting to start something today, which is like, does this meet with your ultimate goals?
Not wealth, not fame, none of that, because all of that, by the way, is bullshit. Like, you can get super famous and super wealthy, and I think generally those are not things that... Again, it's easy to say, with a lot of money, that somehow it's not good to have a lot of money. It's just, I think that complicates life enormously in a way that people don't fully comprehend. So I think it's way more interesting to shoot for, can I make something that people love, that provides value in the world, that I love building, that I love working on, right? That's, um, that's what I would do if I were starting from scratch. And by the way, in some ways I will do that personally, which is, choose the thing that you get up every morning and you're like, I love this, even when it's painful. Even when it's painful. What about a social network specifically? If you were to imagine, put yourself in the mind of someone... Compete against myself? I can't give out ideas. Okay, I got you. No, but, like, high level. You'd, like, focus on community?
[1711] I said that as a half joke.
[1712] In all honesty, I think these things are so hard to build that, like, ideas are a dime a dozen.
[1713] But you have to talk about keeping it simple.
Can I tell you my model? It's a liberating idea.
[1715] My model is, it's three circles and they overlap.
[1716] One circle is, what do I have experience at slash what am I good at?
[1717] I don't like saying what am I good at because it just seems like what do I have experience in, right?
[1718] What can I bring to the table?
What am I excited about? That's the other circle.
What gets me going?
[1721] What's just super cool, right, that I want to work on because even when this is hard, I think it's so cool I want to stick with it.
[1722] And the last circle is like, what is the world need?
[1723] And if that circle ain't there, it doesn't matter what you work on because there are a lot of startups that exist that just no one needs or very small markets need.
[1724] But if you want to be successful, I think if you're good at it, you have, sorry, if you're good at it, you're passionate about it and the world needs it.
[1725] I mean, this sounds simple, but not enough people sit down and just think about those circles and think, do these things overlap.
[1726] And can I get that middle section?
[1727] It's small, but can I get that middle section?
[1728] I think a lot about that personally.
[1729] And then you have to be really honest about the circle that you're good at and really honest about the circle that the world needs and I suppose really honest about the passion like what do you actually love yeah as opposed to like some kind of dream of making money all those kind of stuff like literally love doing I had a former engineer who decided to start a startup and I was like are you sure you want to start a company versus like join something else because um being a coach of an NBA team and playing basketball are two very very different things and And like not everyone fully understands the difference.
[1730] I think you can kind of do it both.
[1731] And I don't know.
[1732] Jury's out on that one because like they're in the middle of it now.
[1733] So but it's really important to figure out what you're good at, not be full of yourself, like truly look at your track record.
What's the saying? It ain't bragging if you can do it.
But too many people are deluded and, like, think they're better at things than they actually are, or think there's a bigger market than there actually is. When you confuse your passion for things with a big market, that's really scary, right? Like, just because you think it's cool doesn't mean that it's a big business opportunity. So, like, what evidence do you have? Again, I'm a strict rationalist on this, and, like, sometimes people don't like working with me because I'm pretty pragmatic about things. I'm not Elon. Like, I don't sit and make bold proclamations about visiting Mars. That's just not how I work. I'm like, okay, I want to build this really cool thing that's fairly practical. I think we could do it, and it's in this way. And what's cool, though, is that's just my sweet spot. I just can't, with a straight face, talk about the metaverse. I can't. It's just not me. What do you think about the Facebook renaming? Don't take that as a dig. I just literally mean, like, I'm fairly... I like to live in the next five years and, like, what things can I get out in a year that people will use at scale?
[1737] And so it's just, again, those circles, I think, are different for different people, but it's important to realize that, like, market matters, you being good at it matters, and having passion for it matters.
[1738] Your question, sorry.
[1739] Well, on this topic in terms of funding, is there, by way of advice, was funding in your own journey, helpful, unhelpful, like, is there a right time to get funding?
[1740] You mean venture funding?
[1741] Or any, borrow some money from your parents?
[1742] I don't know.
[1743] Yeah.
[1744] Like, is money get in the way?
[1745] Does it help?
[1746] Is the timing important?
[1747] Is there some kind of wisdom you can give there?
[1748] Because you were exceptionally successful very quickly.
[1749] Funding helps as long as it's from the right people.
[1750] That includes yourself.
[1751] And I'll talk about myself funding myself in a second, which is like, because I can fund myself doing whatever projects I can do, I don't really have another person putting pressure on me except for myself.
[1752] And that creates strange dynamics, right?
[1753] But let's like talk about people getting funding from a venture capitalist initially.
We raised money from Matt Cohler at Benchmark.
[1755] He's brilliant, amazing guy, very thoughtful.
[1756] And he was very helpful early on.
[1757] But I have stories from entrepreneurs where they raise money from the wrong person or the wrong firm where incentives weren't aligned.
[1758] They didn't think in the same way and bad things happened because of that.
The boardroom was always noisy.
[1760] There were fights.
[1761] We just never had that.
[1762] Matt was great.
[1763] I think capital these days is kind of a dime a dozen, right?
[1764] As long as you're fundable, like it seems like there's money out there is what I'm hearing.
[1765] It's really important that you are aligned.
[1766] that you think of raising money as hiring someone for your team rather than taking money if capital is plentiful, right?
[1767] It provides a certain amount of pressure to do the right thing that I think is healthy for any startup.
[1768] And it keeps you real and honest because they don't want to lose their money.
[1769] They're paid to not lose their money.
[1770] The problem, you know, maybe I could depersonalize it, but like I remember having lunch with Elon.
[1771] It only happened once.
[1772] And I asked him, like, I was trying to figure out what I was doing after Instagram, right?
[1773] And I asked him something about, like, angel investing.
[1774] And he looked at me with a straight face.
[1775] He was like, why the F would I do that?
[1776] Like, why?
[1777] Like, I was like, I don't know.
[1778] Like, you're connected.
[1779] Like, seems like, he's like, I only invest in myself.
[1780] I was like, ooh, okay.
You know, like, note the confidence.
[1782] I was just like, what a novel idea.
[1783] It's like, yeah, if you have money.
Like, why not just put it behind your own bets and, like, enable your visiting Mars or something, right?
[1785] Like, that's awesome, great.
[1786] But I had never really thought of it that way.
[1787] But also with that comes an interesting dynamic where you don't actually have people who are going to lose that money telling you, hey, don't do this or, hey, you need to face this reality.
[1788] So you need to create other versions of that truth teller.
[1789] And whatever I do next, that's going to be one of the interesting challenges is how do you create that truth -telling situation.
And that's part of why, by the way, I think someone like Jack, when you start Square, you have money, but you still bring on partners, because I think it creates a truth-telling-type environment.
[1791] I'm still trying to figure this out.
[1792] Like, it's an interesting, it's an interesting dynamic.
[1793] So you're thinking of perhaps launching some kind of venture where you're investing in yourself?
I mean, is that in the books, potentially?
[1795] I'm 37 going on 38 next month.
[1796] I have a long life to live.
[1797] I'm definitely not going to sit on the beach, right?
[1798] So I'm going to do something at some point, and I got to imagine I will help fund it, right?
So the other way of thinking about this is you can park your money in the S&P, and this is a bad example because the S&P's done wonderfully well the last year, right?
[1800] Or you can invest in yourself.
[1801] And if you're not going to invest in yourself, you probably shouldn't do a startup.
[1802] It's kind of the way of thinking about it.
And you can invest in yourself in the way Elon does, which is basically go all in on this investment. Maybe that's one way to achieve accountability, is, like, you're kind of screwed if you fail. Man, yeah. I personally like that. I like burning bridges behind me so that I'm fucked if it fails. Yeah, yeah, yeah. It's really important, though. One of the things I think Mark said to me early on that sticks with me, that I think is true.
[1804] We were talking about people who had left like operating roles and started doing venture or something.
[1805] He was like a lot of people convince themselves they work really hard.
[1806] Like they think they work really hard and they put on the show and in their minds they work really hard.
[1807] But they don't work very hard.
[1808] There is something about lighting a fire underneath you and burning bridges such that you can't turn back.
[1809] Yeah.
[1810] That I think, you know, we didn't talk about this specifically, but I think you're right.
[1811] There is, you need to have that because there's this self -delusion at a certain scale.
[1812] Oh, I have so many board calls.
[1813] Oh, like, we have all these things to figure out.
It's like, this is one of the hard parts about being an operator.
[1815] It's like there are so many people that have made a lot of money not operating, but operating is just one of the hardest things on earth.
[1816] It is just so effing hard.
[1817] It is stressful.
[1818] It is you're dealing with real humans.
[1819] You're not just like throwing capital in and hoping it grows.
[1820] I'm not undermining the VC mindset.
[1821] I think it's a wonderful thing and needed and so many wonderful VCs I've worked with.
[1822] But yeah, like when your ass is on the line and it's your money, it's talk to me in 10 years.
[1823] We'll see how it goes.
Yeah, but like you're saying, that's a source: when you wake up in the morning and you look forward to a day full of challenges, that's also where you can find happiness.
[1825] Let me ask you about love and friendship.
[1826] Sure.
[1827] What's the role in this heck of a difficult journey you have been on of love, of friendship?
[1828] What's the role of love in the human condition?
[1829] Well, first things first, the woman I married, my wife, Nicole, no way I could do what I do if we weren't together.
She had the filter idea.
[1831] Yeah, exactly.
[1832] We didn't go over that story.
Everything is a partnership, right? And to achieve great things, it's not about, like, someone pulling their weight in places. It's not like someone supporting you so that you could do this other thing. It's literally, like, you know, Mike and I and our partnership as co-founders is fascinating, because I don't think Instagram would have happened without that partnership. Like, either him or me alone, no way. We pushed and pulled each other in a way that allowed us to build a better thing because of it.
[1834] Nicole, sure, she pushed me to work on the filters early on.
[1835] And yes, that's a fun story, right?
[1836] But the truth of it is being able to level with someone about how hard the process is and have someone see you for who you are before Instagram and know that there's a constant you throughout all of this and be able to call you when you're drifting from that, but also support you when you're trying to stick with that.
[1837] That's, I mean, that's true friendship slash love, whatever you want to call it.
[1838] But also for someone not to care, I remember Nicole saying, hey, I know you're going to do this Instagram thing.
I guess it was Burbn at the time.
[1840] You should do it because, you know, even if it doesn't work, we can move to like a smaller apartment and it'll be fine.
[1841] Like, we'll make it work.
[1842] How beautiful is that, right?
[1843] Yeah.
[1844] That's almost like a super power.
[1845] It gives you permission to fail, and somehow that actually leads to success.
[1846] But also, she's like the least impressed about Instagram of anyone.
[1847] She's like, yeah, it's great, but like, I love you for you.
[1848] Like, I like that you're like a decent cook.
[1849] That's beautiful.
[1850] That's beautiful, with the Gantt chart and Thanksgiving, which I still think is a brilliant, effing idea.
[1851] Thank you.
[1852] Big ridiculous question.
[1853] Have you... you're old and wise at this stage.
[1854] So have you discovered meaning to this whole thing?
[1855] Why the hell are we descendants of apes here on earth?
[1856] What's the meaning of it?
[1857] What's the meaning of life?
[1858] I haven't.
[1859] And I am in.
[1860] So the crazy thing, so the best learning for me has been, like, no matter what level of success you achieve, you're still worried about similar things, maybe on a slightly different scale.
[1861] You're still concerned about the same thing.
[1862] You're still self-conscious about the same things.
[1863] And actually, going through that moment is what makes you believe there's got to be, like, more machinery to life, or purpose to life, and that we're all chasing these materialistic things. But you start realizing, like, it's almost like, you know, the Truman Show, when he gets to the edge and he, like, knocks against it.
[1864] He's like, what?
[1865] Like there's this awakening that happens when you get to that edge that you realize, oh, like sure, it's great.
[1866] It's great that we all chase money and fame and success.
[1867] You hit the edge.
[1868] And I'm not even claiming I hit an edge like Elon's hit an edge.
[1869] Like there's clearly larger scales.
[1870] But what's cool is you learn that like it doesn't actually matter and that there are all these other things that truly matter.
[1871] That's not a case for working less hard.
[1872] That's not a case for taking it easy.
[1873] That's not a case for the four-day workweek.
[1874] What that is a case for is designing your life exactly the way you want to design it. Because, I don't know, I think we go around the earth, you know, the sun, a certain number of times, and then we die, and then that's it.
[1876] That's me. Are you afraid of that moment?
[1877] No, not at all.
[1878] In fact, or at least not yet.
[1879] Listen, I'm like a pilot.
[1880] Like, I do crazy things.
[1881] And, no, if anything, I'm like, ooh, I've got to choose mindfully and purposefully the thing I am doing right now, and not just fall into it, because you're going to wake up one day and ask yourself why the hell you spent the last 10 years doing X, Y, or Z.
[1882] Yeah.
[1883] So I guess my, like, shorter answer to this is: doing things on purpose, because you choose to do them, is so important in life, and not just, like, floating down the river of life, hitting branches along the way, because you will hit branches, right?
[1884] But rather, like, literally plotting a course, and not having a 10-year plan, but just choosing every day to opt in.
[1886] That, I think, has been more like... I haven't figured out the meaning of life by any stretch of the imagination, but it certainly isn't money, and it certainly isn't fame, and it certainly isn't travel. It's way more, like, opting into the game you love playing.
[1887] Every day, opting in.
[1888] Just opting in.
[1889] And, like, don't let it just happen.
[1890] You opt in.
[1891] Kevin, uh, it's great to end on, uh, love and the meaning of life.
[1892] This is an amazing conversation.
[1893] It's a lot of fun, thank you.
[1894] You gave me, like, a light into some fascinating aspects of this technical world, and I honestly can't wait to see what you do next.
[1895] Thank you so much.
[1896] Thanks for having me. Thanks for listening to this conversation with Kevin Systrom.
[1897] To support this podcast, please check out our sponsors in the description.
[1898] And now, let me leave you with some words from Kevin Systrom himself.
[1899] Focusing on one thing and doing it really, really well can get you very far.
[1900] Thank you for listening and hope to see you next time.