Armchair Expert with Dax Shepard
[0] Welcome, welcome, welcome to armchair expert, experts on expert.
[1] I'm Dick Rather, and I'm joined by Parenthood and Mouse.
[2] Welcome.
[3] Dick Rather, that's a rough name.
[4] I can't believe I've made it this far with that name.
[5] I had to overcome it.
[6] The adversity.
[7] The adversity.
[8] We have talked a bazillion times since we both saw The Social Dilemma, which is on Netflix and which we really were infected by, and that, of course, became a gateway drug to doing Rabbit Hole, the New York Times podcast, which we loved.
[9] And so we had an opportunity to talk to Tristan Harris, who was front and center in The Social Dilemma, and we think you will enjoy him immensely.
[10] Tristan is a computer scientist who spent three years as Google's design ethicist, developing a framework for how technology should ethically steer the thoughts and actions of billions of people from screens.
[11] He is now co-founder and president of the Center for Humane Technology, whose mission is to reverse human downgrading and realign technology with humanity.
[12] He's really the head honcho on this topic.
[13] He is, yeah, he is the, what would we say?
[14] He, vanguard.
[15] He's the vanguard.
[16] He really is.
[17] He was the first one out there really sounding an alarm.
[18] Yeah.
[19] And we all are in debt to this man's huge ethical moral compass.
[20] So please enjoy Tristan Harris.
[21] Wondery Plus subscribers can listen to Armchair Expert early and ad-free right now.
[22] Join Wondery Plus in the Wondery app
[23] or on Apple Podcasts, or you can listen for free wherever you get your podcasts.
[24] Hey, Dax.
[25] How you doing?
[26] Good.
[27] How are you doing?
[28] Good.
[29] Do you know Monica?
[30] Monica, Padman.
[31] Hi there.
[32] Hey, Monica.
[33] How you doing?
[34] Good.
[35] So excited to have you.
[36] We've been wanting to have you on for years, I think.
[37] I think this is years in the making.
[38] Yeah.
[39] Excellent.
[40] I'm just excited to meet you, talk to you, and go through some of the many amazing things you are an expert on.
[41] First of all, I'm curious, where are you?
[42] I am in Scottsdale, Arizona at the moment, because believe it or not, my house and everything that I owned burned down in the recent Santa Rosa fires.
[43] No way.
[44] Oh, no. Yeah, so I've lost everything.
[45] Wow.
[46] You normally live in Santa Rosa?
[47] I normally live in the North Bay.
[48] I had moved up to Santa Rosa, which was my deceased family's home during the quarantine and pandemic, and then these fires three weeks ago just took everything.
[49] Oh, my gosh.
[50] That's awful.
[51] But it's also happening in the middle of when this, you know, film has come out.
[52] And so the most important thing I can do for the world is just keep focusing on the film.
[53] So that's where my attention is going.
[54] Being the social dilemma.
[55] Yeah, the social dilemma, yeah.
[56] Okay, great, great, great.
[57] I wanted to make sure that I wasn't in the dark about another film.
[58] We have seven films out right now.
[59] Oh, my God, you're a mini studio, mini major.
[60] Santa Rosa is Charles Schulz country, right?
[61] That's exactly right, yeah. Did any of that stuff burn down? Did any of, like, an ice rink he had built, and he used to have like a Christmas pageant and stuff? I think they did, and I don't really remember everything there. The airport is named after him, the Charles Schulz Airport, and there are cute little life-sized figurines of the Peanuts characters, if I recall, all around downtown Santa Rosa. That's right, yeah. I think I have a picture of my then two-year-old hugging the Snoopy statue. Yeah, oh yeah. Where did you grow up? Did you grow up in Santa Rosa? I grew up originally in San Francisco, and when I was about 11, my mother wanted to move up to Santa Rosa to fulfill her dream of riding horses and being near the state parks.
[62] So I went to high school up in Santa Rosa and then later back down the bay to Stanford for college to study computer science.
[63] And what was happening career -wise for mom in San Francisco?
[64] Are you, like, a second-generation technology person?
[65] No, the opposite, actually.
[66] I was the only one in my family who really even touched a computer, and I was just obsessed as a child born in the year of the Macintosh would be.
[67] I was born in 1984, so I didn't see the Macintosh ad, but what's really interesting, to loop the story back around, is that the co-founder of our nonprofit Center for Humane Technology is Aza Raskin, whose father, Jef Raskin, invented the Macintosh project, or started it at least, at Apple.
[68] Oh, really?
[69] Yeah.
[70] And my life has been largely defined by, I think, the impact of the Macintosh on my early childhood.
[71] And yeah, that's kind of an interesting place to theoretically start, which is prior to the Macintosh. And again, I'm not super savvy about this, but I will say, what used to be seemingly kind of a utilitarian pursuit of people who liked making simple programs on these early computers, with the Mac comes, in my opinion, a culture or a movement.
[72] It transcends that space a little bit.
[73] Was that your experience with it?
[74] Did it feel like more of an emotional connection to that thing?
[75] It actually did, yeah.
[76] I didn't know any of the lore behind the Macintosh, you know, who Steve Jobs was or Andy Hertzfeld or Bill Atkinson, some of these characters who, you know, invented and put all of this culture and soul into this computer.
[77] But it's funny because not even knowing that history, I felt this kind of weird, mystical almost connection to the creativity behind it.
[78] Yes.
[79] And it really was, you know, as the saying goes, and as we say in the film, that the Macintosh and the computers were a bicycle for our minds.
[80] They would be giving us kind of leverage for the kind of conceptual creativity and capacity that we could have as human beings.
[81] And that's a really optimistic view, obviously, of technology that I still believe is possible.
[82] We just went astray with these business models that kind of pulled us into this mini dark ages.
[83] Well, and then to drill down a little bit on what you just said, which I think could be useful, is, yes, the bicycle is a tool.
[84] You do a great job of describing that, meaning a tool, a rake, a lawnmower.
[85] It sits in your shed until you want to use it to perform some tasks that it's going to assist you in.
[86] And that is not how we interact with these tools, our smartphones, our computers.
[87] They are actually engaging us at all times, right?
[88] We didn't wake up necessarily with the game plan of like, you know what, let's try to get five, six hours on that thing today.
[89] No, no, it handles that for you.
[90] Yeah, that's exactly right.
[91] I mean, it's important to state, if you go back to the Marshall McLuhan kind of theories of what is media, what is technology, and it's an extension of thinking and action.
[92] So all technologies are extending and reshaping the way we make sense of the world and the kind of actions that we might take.
[93] So a bicycle is going to change your basic sensemaking and choice making, the menu of options that might occur to your mind of where do I want to go today or what do I want to do today, because your bicycle extends that.
[94] So it's important to say that first.
[95] However, as you said, a bicycle doesn't have an agenda about what it wants from you.
[96] Right.
[97] Right.
[98] Well, it wants you to lubricate the chain occasionally and keep the air pressure at optimum level.
[99] I guess at some existential level it wants to survive, but it doesn't have any means of doing that except by, you know, being able to make you care about that.
[100] But, you know, a bicycle doesn't have an avatar, voodoo-doll-like model of each of us that uses trillions of data points to figure out predictively how do I get you to drive more to places like McDonald's and less to places like parks and playgrounds, because I need to make
[101] money from you using me in a very particular way.
[102] And that's the difference, as we say in the film, between a kind of tools-based technology environment and an addiction- and manipulation-based technology environment.
[103] That's the thing that's really changed, that we talk about in The Social Dilemma: the business model of Facebook, YouTube, TikTok, Twitter, LinkedIn, Instagram, etc. They all depend on us using them in very particular ways that involve hours a day of screen time.
[104] But that's not really the core harm, and I'm sure we'll get into this more.
[105] It's really the erosion of the life support systems that make up a society, that make a society work.
[106] Because we need our mental health.
[107] We need a common view of reality.
[108] We need shared facts and truth.
[109] We need to have a basic compassion for each other.
[110] And each one of those dimensions of our society are things that are not inside of the business model's interests.
[111] And that's really why we have to change the broader system.
[112] All right.
[113] And let's go back to the bike analogy.
[114] So yes, the bike changes culture in that.
[115] You used to only be able to have lunch, you know, a mile away, or whatever you're willing to walk.
[116] Now, going five miles away is an option.
[117] So because the bike has no agenda, you'll just end up at some coffee shop or some restaurant, and you'll interact with people there, and they'll be of all variety because it hasn't selected for anything.
[118] And then you might like the food or not, and then you may ride your bike to another place.
[119] But you are generally in charge of, oh, I didn't dig what was going on at that coffee shop, but I kind of like what was going on at that one.
[120] You're still in the driver's seat, whereas the AI bike is going to take you to a coffee shop it knows you'll like because it's so, so good at predicting what you will like, because it has memorized every choice you've made over the last 10 years online, right?
[121] I think people lose sight of the fact that we don't have a hundredth of the sense of what we are like that our computer does, because we don't have the capacity to even remember all the times we've made decisions.
[122] But seeing all of our decisions accumulated over 10 years, it is very easy to recognize a pattern within that if you can see it from that 30,000-foot view, right?
[123] So we know 90% of the time, if you're making a choice to eat at 3 p.m. and you didn't eat breakfast, we know pretty much you're going to the shittiest place imaginable, and so on and so forth, right?
[124] And so the bike is now telling you what you like and predicting what you will like.
[125] And it'll rule out ever giving you something you don't like, which is so often how we grow and evolve: we don't think we like a French restaurant, but a friend drags us, and then, lo and behold, escargot is not bad.
[126] It shouldn't be good, but by God, someone made me try it, and now I like this.
[127] That's exactly right.
[128] And then the question is also, what is the perceptual tool that a bike looks at from us to know what we like?
[129] Because, for example, let's give the bike an eyeball, and let's give that bike an eyeball called time spent.
[130] So the only way it knows what we like is about the amount of time that we spend there.
[131] Now, let's say there's like a certain place that we could go:
[132] the conspiracy theory cafe.
[133] And whenever we go to the conspiracy theory cafe, you spend five hours there talking to mind-blowing people who will tell you about aliens and government conspiracy theories and COINTELPRO and MKUltra.
[134] It's fascinating.
[135] And it keeps you there for hours and hours and hours.
[136] And let's say the bike doesn't have any other way to know what you like except by the time that you spent.
[137] It's erroneous data in that if its goal is to give you what you want, what you want isn't necessarily what you spend time doing.
[138] So first and foremost, and I hope it's not arrogant to say, but it would be great if it had gotten back to you how often we talk about The Social Dilemma.
[139] We just absolutely loved it.
[140] It terrified us, and we've also found a couple of great companion pieces that go along with that, I would argue.
[141] Feels Good Man is a great kind of companion.
[142] Have you seen that documentary?
[143] No, I haven't seen it, no. It's about how the artist who created Pepe the Frog had his cute little creation, which he had done many comic books and shorts with, just get owned by the alt-right, and what role Pepe the Frog had in the last election, which is so fascinating. So that's a really good, detailed account of how this stuff can work and drive people further and further and further in an increasingly fundamentalist, militaristic, you-name-it direction. And then, have you heard Rabbit Hole, the New York Times podcast? Rabbit Hole, yeah, I know Kevin very well, too, yeah.
[144] Oh, okay, great.
[145] So we loved it.
[146] We loved it.
[147] Like seeing your movie, then those other two, for us, caused a critical mass, where we're just like, oh, my goodness, wow, wow, wow, wow, wow.
[148] And I got to give you a lot of credit because I do remember when you first hit the scene giving interviews.
[149] I imagine it was six, seven years ago.
[150] When did you quote defect from Google?
[151] Yeah, well, the internal to Google, the first presentation that I was sort of calling a moral call to arms was back in 2013 and kind of 2012 time period.
[152] But the first TED Talk and 60 Minutes piece, which is probably the biggest public exposure was in 2017.
[153] So that was about four years ago, three years ago.
[154] Okay.
[155] And I remember at that time seeing you thinking many things.
[156] One, wow, this guy must have some integrity because he's most certainly said goodbye to many millions of dollars by doing this.
[157] That was kind of my first.
[158] I'm very financially driven and fearful.
[159] So that was my first thought.
[160] Second was, oh, wow, he's explained that we've transitioned into an attention economy, which at that time was a bit of an abstract premise to me. And I thought, well, how does one monetize time spent?
[161] It's still going to be a weird economy, but sure, in the Yuval Harari sense, it's all a story, and whatever we buy into works.
[162] So, okay, it wasn't until I noticed Netflix started publishing as a result of how good their platform was, how many hours had been spent watching Adam Sandler movies, right?
[163] So it was an insane amount of hours.
[164] I can't remember now, but it was in the hundreds of millions or maybe even a billion hours spent, whatever it was, I thought to myself, oh, wow, now I get it.
[165] Adam Sandler has a value that's very quantified because a billion hours has been spent staring at him.
[166] And I'm like, okay, he was right.
[167] That was about a year and a half after I first heard you talk.
[168] And again, I still didn't understand the full implications of what you were saying.
[169] And I can't imagine that, A, you knew the full implications.
[170] and then B, that you could have even told us at that time, because I didn't even understand an attention economy.
[171] So backing all the way up to you working at Google as an ethicist, right, you're in charge of what the ethics of the products are.
[172] Is that an accurate description?
[173] Well, yeah.
[174] I mean, just to make sure I don't overstate my level of authority, you know, I was concerned about how do you design products that are going to influence two billion people's attention whether you want to or not.
[175] Like, you know, so you have the power of a god like Zeus, but if you bump your elbow, you might scorch half the earth if you're not really thoughtful about, you know, where does attention really go?
[176] And there hadn't been a role or a name for what I was researching, which is what I called at the time design ethics.
[177] How do you ethically design, for two billion people, sort of, you know, the steering of the nervous systems and the kind of paleolithic cognitive biases that we have buried in our brains?
[178] How do you ethically push those buttons?
[179] Because no matter what choice you make, if you use a red notification color versus a blue notification color, you're going to get very different kind of outcomes with how the human nervous system reacts.
[180] And so really, the role became kind of a new field of design ethics: how do you ethically design based on a more vulnerability-centric view of the human animal?
[181] Yes.
[182] And while you were at Google, you at a certain point wrote this document, and it's a very Jerry Maguire moment, I would say.
[183] And as you came into work, you noticed it was on many people's computers, and people were really talking about it and they were all saying, oh, my gosh, I'm so glad someone said this.
[184] We too are experiencing these problems now; we've created things that are in our home.
[185] I look at my phone much too often.
[186] I'm addicted to my, that was your case, I think.
[187] You realize, well, I'm very addicted to my email.
[188] I can't stop checking it and interacting with it.
[189] Which is funny, by the way, because a lot of people in the film didn't understand that line.
[190] Like, who in the world is addicted to their email?
[191] Because most people just get junk.
[192] But, you know, maybe if you lived from the slightly earlier era and you...
[193] Big time.
[194] Yeah.
[195] Yeah.
[196] And it just doesn't end.
[197] It's a 24/7 news cycle.
[198] And an obligation list, right?
[199] You're always behind to get back to someone.
[200] And so if you don't actually have in your own mind a way to conceptualize what it means to be complete, then basically I've obligated you for the rest of your life because you're never going to be done getting back to all those people.
[201] And that's the thing also that I was seeing at Google back in the time was the growth rate of what was going to happen.
[202] Because, being at Google, you actually get bombarded with emails from thousands of people.
[203] You get just so much information.
[204] It felt like kind of getting a front row seat to the future where you're seeing, oh my God, what is this going to look like when everyone's getting thousands of emails and thousands of notifications?
[205] We better do something now to protect people who are going to be coming along with us.
[206] We're handing these phones out to the developing world.
[207] What are we going to do to the collective attention of two billion people?
[208] So that's kind of where the concerns came.
[209] It felt like an opportunity.
[210] We could change this before it got too bad.
[211] And we could really make it different.
[212] And Google was in this unique position because of Android and because of Chrome as a browser where they kind of were shaping the rules of the attention economy.
[213] We could really make a choice, almost like a government can make a choice, or a city can make a choice about what is the width of the sidewalks, and are there stop signs, and are there traffic lights?
[214] Right.
[215] You know, or not for attention.
[216] Or do we just sort of let it run, you know, there's no speed limits, there's no stop signs.
[217] Everyone just go as fast as you can.
[218] And it's a race to the bottom for who can kind of, you know, get wherever you want as fast as possible, let alone if you crash into, you know, kids or whatever you do.
[219] Yeah.
[220] And that's kind of what the initial effort at Google was about.
[221] So what happened is there was much fervor at Google amongst the employees, and it had been brought up to the higher-ups numerous times, and you at that moment thought, wow, this is going to really change things.
[222] And I'm very excited about this, and I'm glad it was received this way.
[223] And then virtually nothing happened, right?
[224] It just kind of passed as any super hot button topic in the news cycle would.
[225] where it's like, it's all we care about for 36 hours, and then now we don't, because there's too much other shit to care about.
[226] Correct, yeah.
[227] I mean, think about how do you hold the attention of a company to say that the attention economy itself is important.
[228] Well, there's no external forces that are sort of driving an agenda that says, hey, this is the number one thing we've got to work on.
[229] And so, while, you know, I want to really give credit to Google and some of their generosity in letting me even try to work on this for several years, because there were a couple of executives who especially carved out space, you know, for me to even do that research.
[230] But it's really unfortunate that we weren't able to get anything really done.
[231] And that's what led me to realize, you know, there's always this question, you know, do you change things from the inside or the outside?
[232] You know, you could be inside of Exxon and you're in charge of a billion-dollar budget of, you know, oil exploration and how much investment goes into renewables.
[233] You know, if you could just change Exxon by 1%, that would make this huge difference because Exxon is massive, you know, or do you go to the outside to Extinction Rebellion and to Greenpeace and you try to change how Exxon operates by putting pressure on And the same thing was true at Google, right?
[234] Like, I was there in the inside, and it was so seductive.
[235] If I could only change it by 2%, you know, and then you could change how Android works for a billion people, that would change the world.
[236] That would be incredible.
[237] And that's why I stayed for two and a half or so years trying to do so.
[238] But I really realized that it took this external movement, both with 60 Minutes back in the day, later TED Talks, and now with The Social Dilemma, which, by the way, we just heard the Netflix numbers, I think, two days ago.
[239] The Social Dilemma was watched by 38 million households in the first 28 days, which I think broke records for documentaries on Netflix.
[240] And if you assume that many families watch the film, that's over 40 million, 45, maybe 50 million people.
[241] So it's unbelievable the kind of global public attention that this issue now has.
[242] And just to loop back with the story, 60 Minutes, that piece with Anderson Cooper called The Addiction Code, really talking about the addiction aspect of the attention economy, is what led Apple and Google to launch these digital well-being screen time features.
[243] When you look at your phone, it shows you how long did you spend on different apps?
[244] How many times did you check your phone?
[245] Those things changed because of this external pressure.
[246] So it was really a demonstration that you needed the external pressure.
[247] And in response to having seen you on 60 Minutes, which is my all-time favorite show, I had to go out and find an actual app that would keep track of how much time I was spending.
[248] At that time, that wasn't even an option from the manufacturer.
[249] Well, not only was it not an option, they were actively denying that option, right?
[250] Like apps were trying to get made, but they were not letting them in the marketplace at that time.
[251] Yeah, that's right.
[252] There's an app, actually. Strangely, Tim Kendall, who's in the film, who's the former president of Pinterest and who also brought the business model of advertising to Facebook.
[253] He's that insider who's in the film, The Social Dilemma.
[254] He actually now runs a company called Moment, which was one
[255] of those first companies that did let you screenshot your battery page and would reverse engineer how much time you were spending on different apps.
[256] And this is a perfect example, by the way, of why we need the technology companies to take responsibility because we can't build our own solution to this problem.
[257] It's sort of like saying, we're building a nuclear power plant in your neighborhood.
[258] And by the way, if you have a problem, you have to go design and sew your own hazmat suit in case something melts down.
[259] That's actually what's inhumane about our current system: we have systemic problems and we put the responsibility on you, the individual, just like, you know, BP saying, you know, we really care about climate change.
[260] So we built this carbon calculator.
[261] So now you can calculate how much carbon you're using as opposed to we need to change our practices and accelerate the transition.
[262] Yeah.
[263] That's a great analogy.
[264] Right.
[265] What are you going to do?
[266] Create your own car at home.
[267] Create your own everything.
[268] Okay.
[269] I was just going to point out the irony.
[270] I love that this was stated in the documentary, which is it is both utopia and dystopia.
[271] So the notion that the problem is also getting slowly addressed on the platform Netflix is ironic, right?
[272] And I think it became clear in Rabbit Hole: I don't think any, well, certainly there are some nefarious folks, whatever.
[273] But in general, I don't think there were malicious intentions with many of these things.
[274] I think many of these algorithms were created with a goal in mind that did not seem bad, but then just in practice turned out to be something else that no one could have foreseen or just did not foresee.
[275] And I'm also a little bit sympathetic to these companies because you want them to own it and yet they don't want to be liable.
[276] This is where I think there's a little bit of an issue in our justice system where there's not a lot of latitude for companies to acknowledge their errors and then try to rectify the situation without having to deal with punitive damages, right?
[277] It's a little bit of a catch-22 for these companies.
[278] And I do think many of these companies are led by very ethical people.
[279] You're getting to exactly the right point, which is, let's say you find out that your product causes mental health problems for teenagers.
[280] Well, are you ever going to proactively go out and say that, hey, we're working on reducing the mental health problems for teenagers?
[281] You're never going to say that.
[282] And then also, if working on that problem means fundamentally changing the success metrics of your company, say you're Instagram, and the success is time spent, and high time spent is correlated with, you know, teenage depression, which it is.
[283] Obviously, isolation and self-harm and teen suicide.
[284] There are some horrible metrics that have gone up with the increased use of especially sort of the teenage apps.
[285] You're not going to be able to admit that that problem exists.
[286] You know, there's a famous writer, Upton Sinclair, who said you can't get someone to question something that their salary depends on them not seeing.
[287] And I think, you know, you talked about how in many cases these were unforeseen consequences, but then later on, these were known consequences, but were not dealt with.
[288] And I actually think this is why the companies secretly kind of rely on outside pressure, like the film The Social Dilemma, like 60 Minutes, like many of these really hardworking activists who I know, who've been in the space for a long time, the civil society orgs, who are screaming at the top of their lungs: look what's happening in Myanmar, look what's happening with the conspiracy theory correlation matrix and Facebook groups, look what's happening with YouTube recommendations.
[289] You know, it's these groups who are pushing from the outside who are driving, I think, some of the most change.
[290] At the same time, I want to say I know, and we work with regularly at the Center for Humane Technology, many of the insiders who are leading the various key products and features inside these companies, and we find really good -hearted people who are trying to do the best they can.
[291] But they have kind of created a Frankenstein where once you've steered, let's say, 50 million people into QAnon or other fringe conspiracy theory groups, the damage has kind of already been done, right?
[292] And we can get more into some of those aspects, which I think are really the most existential.
[293] You know, there's many different societal ills that are coming from technology, but the real breakdown is our inability to have faith or trust in a shared information environment, to believe the same things, and to even interpret reality in the same ways.
[294] Because, as you know with your brain, once you have a hammer, everything looks like a nail, and technology has sort of, like Moses, parted us into two different sort of hammer-seeking-nail infrastructures, where we have, you know, polarized societies around the world, because Facebook profits the more that each of us have our own individual personalized Truman Show reality, as opposed to one shared reality that we can actually have a conversation about.
[295] Yeah, because in the shared reality, you're going to get information you don't find that pleasing, and you're going to get information that challenges your assumptions, and most people aren't going to spend a lot of time having their assumptions challenged.
[296] It's kind of counter to what we enjoy about the internet.
[297] Stay tuned for more Armchair Expert, if you dare.
[298] What's up, guys?
[299] It's your girl Keke, and my podcast is back with a new season, and let me tell you, it's too good.
[300] And I'm diving into the brains of entertainment's best and brightest, okay?
[301] Every episode, I bring on a friend and have a real conversation.
[302] And I don't mean just friends.
[303] I mean the likes of Amy Poehler, Kel Mitchell, Vivica Fox, the list goes on.
[304] Follow, watch, and listen to Baby,
[305] This Is Keke Palmer on the Wondery app or wherever you get your podcasts.
[306] We've all been there.
[307] Turning to the internet to self-diagnose our inexplicable pains, debilitating body aches, sudden fevers and strange rashes.
[308] Though our minds tend to spiral to worst-case scenarios, it's usually nothing, but for an unlucky few, these unsuspecting symptoms can start the clock ticking on a terrifying medical mystery.
[309] Like the unexplainable death of a retired firefighter, whose body was found at home by his son, except it looked like he had been cremated, or the time when an entire town started jumping from buildings and seeing tigers on their ceilings.
[310] Hey listeners, it's MrBallen here, and I'm here to tell you about my podcast.
[311] It's called MrBallen's Medical Mysteries.
[312] Each terrifying true story will be sure to keep you up at night.
[313] Follow MrBallen's Medical Mysteries wherever you get your podcasts.
[314] Prime members can listen early and ad-free on Amazon Music.
[315] So I want to quickly, if I can in layman's terms, just explain kind of what happened with YouTube, as I understand it, which is, in Rabbit Hole, they get this great person who volunteered to be a part of this.
[316] And I will say, in general, when I hear about a QAnon person, I'm seeing them at the end of the line, right?
[317] Or if they're a very alt-right, now white-nationalist misogynist, I'm meeting them as that person.
[318] And it's hard for me to imagine that they might not have been that person, years ago.
[319] And so this guy turns over his entire viewing history, and you can just watch him be led, you know, a micron at a time further and further away from where he started.
[320] And he starts as an environmental science major.
[321] He's probably, you know, a centrist at worst or something.
[322] And just because the algorithm had figured out that it's not enough to give people what they want, you also have to give people something new, and they know if you enjoy this thing, you'll enjoy this incremental shift over to the right.
[323] Now again, none of this I don't think was malicious or had a bad intention.
[324] But the algorithm, once it takes off, it's just, it gets so perfect at knowing exactly where to bring you to get you to spend more and more time.
[325] And this guy kind of wakes up as a really far alt-right misogynist.
[326] Extremist.
[327] who is sympathetic to white nationalist movements because they have a right, too, to be proud of themselves.
[328] You know, and he just wakes up and goes, oh my God, who am I?
[329] Yeah.
[330] Can I give you two more examples of that?
[331] Yeah.
[332] It's important to rewind the clock and realize that, you know, we're now more than 10 years into this mass psychology experiment where three billion people's thoughts have been wired up to supercomputers that are steering what we all watch and look at.
[333] And as you said, no matter where you start, if you imagine a spectrum, where on one side you have the calm Walter Cronkite sort of section of YouTube, you know, calm, rational discourse, thoughtful, slow, maybe a little bit boring, but, you know, some kind of shared water cooler reality.
[334] On the other side of the spectrum, you have Crazy Town or Extreme Town.
[335] You have conspiracy theories, anorexia videos, insanity, you know, whatever's going to keep you glued, no matter where you start.
[336] You could imagine, you could start even in the Calm section, or you could start in Crazy Town.
[337] If I'm YouTube, and along that spectrum, what's the next set of videos I'm going to recommend to you?
[338] Am I going to recommend you towards the calm section, towards that section, or am I going to recommend you towards Crazy Town?
[339] You're always going to tilt the floor towards Crazy Town.
[340] So if you imagine these platforms taking the entire floor of humanity and then just tilting it by three degrees, several examples of this, if you were a teenage girl and you landed at a dieting video on YouTube, and maybe you even type that in, right?
[341] You started, like, oh, you're searching for diet videos.
[342] But then YouTube's trying to calculate for that right-hand sidebar, and we should lay out one more fact for people.
[343] What percentage of YouTube's
[344] traffic comes from that right-hand sidebar of recommendations?
[345] Like, if you had to guess.
[346] Versus what you initially searched?
[347] Yeah, versus what you searched for or picked out at the very beginning, right?
[348] Well, I'm skewed because I listen to Rabbit Hole.
[349] Yeah, tell us, tell us.
[350] So you know, well, you know.
[351] Well, I don't even know.
[352] I just, I imagine it's in the 80s or 90s or something.
[353] So the last time YouTube released this number because I think they got scared by how good it was.
[354] They used to be very proud of it.
[355] It was more than 70% of the watch time, of a billion hours a day.
[356] That's 700 million hours
[357] that are controlled by what a machine is recommending.
[358] So if I told you, you know, if AI was controlling the world, would we know?
[359] You know, AI is controlling the minds that are controlling the information flows that go into each of us that then make up the choices that we make downstream.
[360] So it's kind of tapped into mind control happening way upstream.
[361] And that might sound salacious, but I'm sure we can defend the legitimacy that these things really have taken control.
[362] And as you said, you know, if a teen girl watches this dieting video, on the right-hand side, what does it recommend? Anorexia videos, or what are called thinspiration videos, because they're very good at keeping teenagers' attention.
[363] For those people who look like they clicked on those.
[364] And this is the equivalent of driving down the 5 freeway in L.A. And according to YouTube, we should give you what you pay attention to: while you're driving down and you see a car crash, everyone's eyes, literally every single car, their eyeballs veer to the right, and they see the car crash.
[365] So according to these tech companies and the AI, everyone loves car crashes.
[366] We should just feed you a bottomless supply of car crashes.
[367] And this is the exact way that these systems work.
[368] You know, for parents who I think right now are, you know, forced to have YouTube be the new babysitter, you sit your kids down to maybe watch, you know, some videos, and they're watching World War II videos or something.
[369] And then next thing you know, they watch, it recommends Holocaust denial videos, because those were really good at keeping kids' attention.
[370] And then they come to the dinner table saying the earth is flat, the Holocaust didn't happen, and you say, why does the world feel like it's going crazy everywhere all at once?
[371] And it's really because these things for 10 years have been steering us just slowly, bit by bit, into this crazier view of reality.
[372] And I'm saying this.
[373] I know it can sound frightening or dystopian, but it's really important that if we can all collectively see that this has happened, I can snap my fingers like hypnosis and say we can wake up from this trance now.
[374] We have a very artificially inflated view of our polarization or how much we should believe conspiracy theories because this is what it's been doing.
[375] Yes.
[376] I also just want to make a little bit of an analogy as a writer of movies.
[377] So if you can just think, first and foremost, it's very acknowledged that we are storytelling machines.
[378] That's how we pass on all the information that kept us alive.
[379] So we have this great, great capacity to understand, to retell and to understand stories.
[380] Now, a principle in writing that is undeniable is you will never see a movie where you see a car chase at the beginning of the movie and then see a second car chase in the middle.
[381] The second car chase will always be longer, more spectacular, and have more explosions.
[382] And then the third car chase at the end has to be the biggest, craziest
[383] one, because you have to over-deliver.
[384] You have to raise the ante each time or we, as story followers, will lose interest.
[385] You can't go backwards.
[386] So that's another way in which these YouTube videos are hacking into something very primitive in how we think, communicate, and what we're attracted to.
[387] So I had just one part of this that I would love to just say out loud, which is, I don't think I'm unique in believing that I'm in charge of my opinions and that I'm in charge of what I think is just and unjust and ethical or not.
[388] I think we all have a sense that we're really anchored in a self.
[389] And I think we're all, we all dramatically underestimate just how persuadable we are.
[390] And this is a kind of faux arrogance we all have, that we know who we are at our core and that we couldn't possibly be led astray.
[391] And the fact is that's just not the case, right?
[392] You had many classes on persuasion at Stanford.
[393] You know, could you just kind of give us a sense of how fallible and vulnerable we are to persuasion?
[394] Before we get into the fallibility and vulnerability, which we always talk about and is really an important frame, I mean, the biggest, most obvious example of us not choosing our own thoughts and the contents of our own minds is the fact that we speak in a language that we didn't choose to speak in.
[395] I don't speak English because I chose that.
[396] I speak English in the accent that I have. There's actually a great New York Times thing where, if you just answer 25 questions about which words you use for various objects in our environment, it'll actually pinpoint the exact zip code that you live in.
[397] Because the word choices are that specific to our geography.
[398] Yeah.
[399] But literally with something like 20 questions of which words you would use about various objects, like what do you call the dividing line in a street or something like that or what do you call, you know, the access lane or these kinds of things, they can actually pick the exact place that you live, which speaks to the fact that, you know, we are all operating with accents and dialects of language that we didn't choose.
[400] We just grew up in the spaces that we did.
[401] But on the persuasion front, you know, my background originally, you know, as a kid in San Francisco with my mother, she would take me to the magic shop.
[402] And I was really fascinated by magic because, even as a child, you could do something where adults didn't understand how the trick worked.
[403] And I found that fascinating.
[404] You did a really impressive bit of coin work in social dilemma.
[405] Yeah, we enjoyed that.
[406] I love magic.
[407] She's so horny for magicians.
[408] It's crazy.
[409] I went on my birthday this year.
[410] That's amazing.
[411] Yeah.
[412] I feel so lucky to be able to have gotten to know certain magicians I've had such a huge, you know, magician crush on for a while, like whether it's Apollo Robbins or Derren Brown, over the last couple of years.
[413] It's just astonishing when you really see the level of craft in a magician knowing something about how other people's minds make meaning that other people don't know about themselves.
[414] That's what makes the illusion work, is the fact that there is an asymmetry of knowledge.
[415] If the asymmetry of knowledge wasn't there, that I know something about you that you don't know about yourself, if you knew the thing that I know, then the trick doesn't work.
[416] Right.
[417] And magic is a very visceral experience of all of us being influenceable.
[418] Maybe it's been a while since a listener out there has seen a magic show, but I actually was recently at the Magic Castle down in L.A. before coronavirus hit.
[419] And even being a magician myself, you still are just blown away by the craft of what these people can do.
[420] And later, you know, I studied in a class at Stanford, at the Persuasive Technology Lab, a persuasive design course, which originally is really about how to use persuasion for good.
[421] Like, how do you embed persuasion into technology?
[422] For example, could you help people, you know, build habits for exercising or for cheering up their friends?
[423] Is this BJ Fogg's class?
[424] This is BJ Fogg's class, yeah.
[425] We interviewed him.
[426] What a sweetie pie.
[427] We like him.
[428] He's wonderful.
[429] And there's a false narrative out there that somehow we are enemies or something like that.
[430] Oh, he spoke highly of you.
[431] He said, like, two of the people he was most proud of having had in his class, you were one of them.
[432] Yeah.
[433] It was you and the Instagram guy.
[434] Or you guys were in the same class, right?
[435] And how bizarre that is.
[436] We were, yeah.
[437] I was in the class with the co -founder of Instagram, Mike Krieger, who's been a very close friend of mine.
[438] And there's actually a photo of the three of us at South by Southwest in 2011, which felt like a very historic photo somehow in that weird moment when it was taken.
[439] But, you know, just to complete that story, Mike and I, before he built Instagram, we worked on a project together in that persuasion class called Send the Sunshine, which was about, you know, in periods of bad weather, people get something called seasonal affective disorder.
[440] I have it.
[441] Self-diagnosed, but I definitely have it.
[442] Yeah.
[443] I have it today.
[444] It's cloudy here.
[445] I think she has a biological component, having come originally from India where there's a lot of sunshine.
[446] Evolutionary.
[447] Yeah, because some of my other Indian acquaintances suffer from it at a disproportionate level.
[448] This is armchair anthropological observation, but please continue.
[449] Well, you know, our environment really matters in terms of, you know, how uplifted we feel, from the cleanliness of our room to the weather outside.
[450] So this project was basically, and this is keep in mind back in 2006 even before the iPhone, so it's kind of a futuristic idea, which was, let's say you have two phones, two feature phones.
[451] And in one of the zip codes, the weather service just knows that you live in a zip code.
[452] where you've had bad weather for five or six or seven days in a row.
[453] What it does is it texts your other friend and says, hey, your friend, Dax has had bad weather.
[454] Do you want to take a photo of the sunshine?
[455] Because we know you're in a zip code with good weather.
[456] Oh, wow.
[457] And send some sunshine to him.
[458] And so you get this view of this kind of puppeteer who's trying to orchestrate something really positive.
[459] Yeah.
[460] You're trying to orchestrate compassion or love or thinking of you kind of vibes, right?
[461] And so this is an example of persuasive technology that could be used to help bring people a little bit closer together.
[462] You might even say, to make sure we're charitable and even-handed with Facebook, that the birthdays feature is reasonable, right?
[463] I mean, to be able to know there's this one day a year where you get to celebrate your friends and it's helpful to have computers remind us and tell us what those days are.
[464] That's persuasive technology.
[465] And you start to realize that persuasion is kind of everywhere.
[466] It's, you know, our world is a choice architecture.
[467] You know, the height of shelves in a supermarket controls what we look at and how often we buy some things over others, you know, the fact that when you're at the end of a grocery store line and there's that last, you know, a bit of chocolate and sweets and gum and things like that, you know.
[468] End caps?
[469] End caps, yeah.
[470] And so our world is, we are living already in physical choice architectures, physically persuasive environments, but if you now put the phone in our hands, this phone totally reshapes the menu of choices that we are picking from.
[471] And it provides, in any moment of anxiety or boredom, an instant excuse.
[472] You always have something sweeter on life's menu you can choose, which is to look at your phone rather than be with yourself.
[473] When we had Thich Nhat Hanh, who's the famous spokesperson for world peace and for mindfulness, he's a Zen Buddhist monk from Vietnam who's passed away, I believe, recently, and we brought him to Google.
[474] And the way he described his concerns about technology is that it's never been easier to run away from ourselves.
[475] Yeah.
[476] Because these things provide an infinite excuse for a quick hit of pleasure or novelty, which is way more satisfying than the discomfort of
[477] whatever is plaguing us, you know, when the lights go down.
[478] The stupid example I give is my wife and I went to this hotel for two days.
[479] And I just said on the way up, I'm not going to look at my phone for two days.
[480] First day, I was like, bored, agitated, grumpy.
[481] Second day, I found paper and some pencils.
[482] And I drew all day.
[483] And I loved it.
[484] And I was like, oh, my God, when's the last time I just sat and drew?
[485] And it was like a meditation, right?
[486] And I was shocked at my own level of being enslaved
[487] by the never-ending blinking machine.
[488] Yeah, I had a friend who had created something called Camp Grounded, which was actually an experience for three days, which revolved around unplugging from your phone, but it was much deeper than that.
[489] When you arrived at this camp, there were three people in hazmat suits who would come up to you and take your phone and put it in a plastic-sealed bag along with your driver's license, and they would do this whole ritual to kind of take it away, and then they give you a new name for the weekend.
[490] So if your name is Tristan, then your new
[491] name might be Presence for the three days that I'm there.
[492] And then the entire time that you're there, everyone has a different name.
[493] You can't talk about work.
[494] There's no technology and there's no time.
[495] So everyone also takes your watch away.
[496] Because time is also an interesting thing, in that it's a technology that really restructures our relationship to pressure and attention and things.
[497] So you don't have a watch.
[498] You can't talk about time.
[499] You can't talk about work.
[500] You don't talk about age.
[501] And you just have a new name.
[502] And so you kind of get to see who am I when everything is just taken away, and it's very uncomfortable, and people have these brief moments of kind of feeling anxious and everything, but then it becomes this kind of freeing human experience where everyone... Was it an orgy by day two? I have to imagine everyone just started fucking like the first original Game Boy. That's an interesting, uh, you know, it didn't go in that direction. Maybe a different redwood forest would have steered people differently. But it's actually amazing, because when there's no competition for attention, everyone cares a lot more about each other and gets to be a lot more curious about each other, because there's nothing else.
[503] The other people are the source of entertainment at that point.
[504] Yeah, it turns out people are really interesting if you actually have the patience and not just one way, but both people have the patience to get curious about who each other are.
[505] To your point earlier, yeah, you don't have this endless bit of homework, which is your email, all the responses you owe, and you don't have this pressure of what's supposed to happen at this time and that time.
[506] Yeah, there's a lot that goes away right there.
[507] Completely, yeah.
[508] And this experience was created by Levi Felix, who was an amazing human who unfortunately passed away a few years ago from brain cancer.
[509] We were actually the same age.
[510] But he created these weekends at Camp Grounded where hundreds of people had totally life -changing experiences.
[511] And it wasn't just about the disconnection from technology; it was about reconnecting with what it means to be human, including being in a redwood forest, you know, with other people.
[512] Your next story you tell, if the person dies at the end, that'll be the third strike, okay?
[513] Because that was back to back.
[514] No more allowed.
[515] The Buddhist monk.
[516] The Buddhist monk passed.
[517] Yes, yes, yes, yes.
[518] I forgot about that.
[519] Or at least tell a couple where people are still thriving.
[520] Okay, so I have some thoughts.
[521] I rewatched it this morning while I was working out.
[522] And just one sentence, again, I want to repeat, that's so profound that people should really ask themselves about, which is: if you're not paying for the product, you are the product.
[523] I feel like that's the easiest way to assess what's happening.
[524] It's kind of the old poker table saying if you can't recognize the fish, you're the fish.
[525] That's a powerful way to think about this, right?
[526] And I've heard of people who concentrate in your field talk about this kind of fork in the road we had, which was we made a decision to either pay for this service or to receive it for free from other people who would be footing the bill, which would be advertisers and this and that.
[527] So we didn't choose the paid version.
[528] Of course, I'm in a position where I could pay for the paid version, and I would like the paid version, but it is inherently a little undemocratic, isn't it?
[529] If that were the case, it would just exacerbate the income inequality.
[530] How would we get around that?
[531] Is that part of what a potential solution could be, one that we picked that had a set fee, and then we just, we had to pony up and pay for it?
[532] Yeah, well, maybe first for the audience, it might be good to explain why, when we are the product, this is such an unaffordable outcome.
[533] One thing we like to say is free is the most expensive business model we've ever come up with because when the product is free, as you say, we are the product being sold.
[534] And specifically, Jaron Lanier is very quick to point out in the film that the subtle, imperceptible change in our beliefs and behavior and identity, that is the product.
[535] So the ability to shape, just by 1% or 0.1%, what we're thinking, feeling, and doing, that ability to shift someone is the very thing that's being sold.
[536] Wait, don't go further for one second.
[537] That's a big concept that I want to let set in.
[538] And yes, I think a lot of us have a kind of baseline understanding that we're the product in that we're being advertised to and we're going to go spend money and buy these products.
[539] But that's incomplete is what you're saying, and that there is something more dangerous on the table, which is not only will you buy this product, but I can change your mind about this product, whatever that product may be.
[540] It could be a political product.
[541] That could be anything.
[542] And that the real value for anyone that would invest in this is to lead the world somewhere they would want them to be led.
[543] That's right.
[544] The ability to shift minds.
[545] Now, when I say this, a lot of people might think, well, I'm not influenced by advertising, right?
[546] Not just some people, almost everyone believes that they're not influenced.
[547] There's something called the Dunning-Kruger effect where everyone believes that they're, you know, they're better.
[548] I think 90% of people believe that they're better-than-average drivers, but of course that distribution doesn't work out.
[549] Also, with the Dunning-Kruger, the stupidest person talks the most, right?
[550] That in general is part of the Dunning-Kruger.
[551] There's so many aspects of the ways that we overestimate our capacities.
[552] But this is critical that it's not just about the advertising.
[553] Facebook's incentive is you are more profitable when you are addicted, anxious, polarized, attention-seeking, outraged, and disinformed than if you were actually a thriving citizen of our democracy.
[554] Because in each of those cases, you're spending time on the platform, right?
[555] If you are not addicted, you're not worth as much as someone who's addicted.
[556] If you're not anxious in a way that causes you to check in more often with an unconscious habit, you're not as profitable as someone who is anxious and checking in.
[557] If you're not attention-seeking, like you don't care about the number of people who looked at the video that you posted and how many comments it got, you're worth less if you're not attention-seeking than if you are attention-seeking.
[558] So in each of these cases, we are worth more if we're domesticated into a certain new kind of human.
[559] You can think of it as just like we domesticate cows, I'm sorry if this feels disgusting to people, but I think it's important to really make sure the metaphor lands that just like we don't have regular cows anymore or wild cows, we have the kinds of cows that give us the most milk and give us the most meat because that's the kind of cow that fits our economy.
[560] Well, we are becoming the kind of human that has, you know, shorter attention spans, is more addicted, more distracted, more outraged, more polarized, more attention-seeking and more disinformed, because each of those things are part of what makes the companies more money.
[561] Again, not because anyone at Facebook twists their mustache and says, this is how we're going to make all this money, but because the machines, the supercomputers, that they pointed at our brains calculate what thing can they show us and how quickly can they show it to us.
[562] And should it auto play or should it wait five seconds?
[563] Or should we make people hit reshare instantly?
[564] Or should we have them wait and meditate on a mountain and then share only when they know if it's true?
[565] All of the worst aspects of society are more profitable than the best aspects of society, which is actually what puts each of us in the same boat as human beings.
[566] Like, the good news is, no one wants this system, because it's an economy based on cannibalizing our own life support systems that make us really human.
[567] Whether you're a Republican or a Democrat or you're a child or an 80-year-old, each of us are in the same boat together because it's really our lower-level human instincts for each of us.
[568] Well, and once we're suffering, we start seeking comfort.
[569] And we generally, when suffering and feeling fearful, seek comfort in the quickest, easiest, most disposable ways that, of course, then create even more suffering.
[570] And, you know, it's like the sugar spiral people get in, or the lack of exercise leads to less exercise.
[571] You know, all these things we seem to always be spiraling up or spiraling down.
[572] That's exactly right.
[573] And I think a humane technology world is one in which you get virtuous cycles where more virtuous behavior creates more virtuous profits, creates more virtuous behavior, creates more virtuous habits, creates more virtuous ways of living, creates a more virtuous society.
[574] And so what we're looking for, as you said, is not these sort of downward spirals of human downgrading, where the smarter the machines get at predicting our behavior and manipulating us, the more predictable we become.
[575] They make more money, so they get bigger machines that can better predict our behavior, and it becomes a vicious loop: they gain increasing power and we become more predictable.
[576] Let's reverse that so that humans become more free, more wise, more thoughtful, more virtuous, and the machines are more in service of us, as opposed to using us as the resources to be mined.
[577] Here's where we get into some big philosophical issues with all of this, which is we live in a capitalist market economy.
[578] How do you incentivize that?
[579] How could that ever be incentivized?
[580] Well, I think this now gets to the question you were asking about, you know, hey, we could have paid subscription models for these products. Like, how much have people paid, you know, your listeners, for a Facebook account recently?
[581] Nothing.
[582] But how are they worth $750 billion?
[583] Because obviously our attention is the thing that's being strip -mined for profit.
[584] So if the business model instead was subscription, we pay $10 a month, let's say.
[585] Yeah.
[586] You know, it's important to say that, I think, basic phone service is not free either.
[587] We have to pay for a network, for access to a network.
[588] Yeah.
[589] So this is really not a radical idea.
[590] You know, with Netflix and peak entertainment, the rise of, say, Game of Thrones, we pay for subscription television and we get Game of Thrones instead of the kind of race to the bottom for clickbait.
[591] So what we're really paying for is a world that we want to live in.
[592] Now, obviously, anytime you have a paid access economy, it introduces inequalities.
[593] We want to make sure that we're equalizing.
[594] However, to say that we want to equalize what we have now is like Coca-Cola saying, well, how else are we going to give the entire world diabetes if we don't, you know, use sugar?
[595] Right?
[596] You wait for the diabetics to come after you.
[597] I'm on their hit list.
[598] We're delivering the wrong product.
[599] We have to make sure that we're not delivering garbage that downgrades human civilization so we can't actually survive as a species.
[600] We want to deliver, ubiquitously and democratically, the kind of self-reinforcing, virtuous, humane technology that actually has our interests at heart.
[601] I thought of something today while pounding the bench press and listening to your movie.
[602] Has anyone thought about this technique?
[603] Now, there seems to be an agreed upon analysis of happiness in that the UN always rates Sweden as number one and God knows what's the last place one.
[604] There seem to be some pretty dependable metrics by which we can measure happiness, right?
[605] And we can measure suffering.
[606] We can measure on some level flourishing.
[607] Why isn't there an independent government agency that studies what a year on a platform results in?
[608] So let's say that every one of these platforms, Facebook, Twitter, Instagram, from the day you joined, one year later, you would be asked a series of questions, and then they would give all of these different products a rating.
[609] So people are going to get 30 % more miserable.
[610] If I see that as I'm signing into Facebook versus I go to Instagram and let's say I get 2 % happier, I feel like that would be the equivalent of putting ingredients on labels and putting caloric intake and fat count and all that.
[611] No one has a sense of whether this thing's going to make them more miserable or not.
[612] And I trust that people wouldn't pick a product that's going to make them more miserable.
[613] How could that work or work in tandem with something?
[614] Yeah, well, this is definitely a line of thinking that people have gone down.
[615] It's a really important one.
[616] I think we have to make sure we're setting the right goalposts for what our objective is.
[617] So if the goalpost is happiness, the brave new world that Aldous Huxley conceived of where everyone has soma and pleasure, you know, a world where we're all just getting happy is not a world where we necessarily solve climate change, deal with racial injustice, or, you know, make a more equal and fair society.
[618] Obviously, a world where Facebook is delivering happiness in that way might just be kind of amusing ourselves to death or sort of distracting us.
[619] So I think what we need to make sure is that our technology environment, if we set the right goalposts, conditions humans to have the kind of capacities that can solve our most pressing problems, including having the well-being, mental health, relationships, and connection that enable us to be full and thriving.
[620] That would require people to trust science, which, more and more, there are many, many people who don't. And part of those metrics is purpose, is community, is, you know, not just the, yeah, I'm on two hits of ecstasy and you gave me a lollipop measure of happiness.
[621] But again, if there was one we could agree upon, that was truly flourishing and using all the cognitive behavioral things we now know, and sorry, please continue.
[622] Yeah, no, that's exactly right.
[623] I mean, so you quickly kind of reverse engineer Maslow's pyramid and you end up with, okay, how do we make sure technology is enabling the kind of full, holistic, thriving, flourishing, and embodiment of our wisest values?
[624] Then you quickly enter into this kind of abstract philosophical conversation about what does it mean for us to be, you know, so wise, and how do we know that we're flourishing, and can Facebook, or should we wait for technology to deliver on all those benefits, shouldn't humans also have the capacities intrinsic in themselves to achieve their own ends, and so on. Can I interrupt you for one second? Because it is a point you made earlier in the documentary, which is, it occurred to you at one point, wow, not only are two billion people being affected by this, but they're ultimately being affected by a handful of 25- to 30-year-old white males.
[625] And so there's a, in anthro, we say naive realism.
[626] There's a tendency to believe that your hierarchy, your Maslow pyramid, and your culture is one that would transcend when in fact, if you're an anthro person, that's not true.
[627] So, yeah, you're designing something, and my suggestion would be that implicit in it is the assumption that other cultures would want our Maslow pyramid.
[628] And that's exactly what we've seen.
[629] It's a kind of digital colonialism, especially since these companies, you know, especially Facebook with its Free Basics program, has gone into countries like Myanmar or Ethiopia and made a deal with the telecom provider.
[630] So if you were getting your very first phone, you've never been on the internet, you know, your phone comes with Facebook.
[631] That's the deal that they do with the telco provider.
[632] And you get Facebook access free, but everything else costs money.
[633] And this creates an asymmetry where now Facebook has crowded out competitors and alternatives.
[634] And people's first experience with the internet is Facebook.
[635] There is no HTTP colon.
[636] There is no WW