#291 – Jonathan Haidt: The Case Against Social Media

Lex Fridman Podcast


Full Transcription:

[0] The following is a conversation with Jonathan Haidt, social psychologist at NYU and critic of the negative effects of social media on the human mind and human civilization.

[1] He gives a respectful but hard-hitting response to my conversation with Mark Zuckerberg, and together he and I try to figure out how we can do better, how we can lessen the amount of depression and division in the world.

[2] He has brilliantly discussed these topics in his writing, including in his book, The Coddling of the American Mind, and in his recent long article in The Atlantic, titled "Why the Past 10 Years of American Life Have Been Uniquely Stupid."

[3] When Teddy Roosevelt said in his famous speech that it is not the critic who counts, he had not yet read the brilliant writing of Jonathan Haidt.

[4] I disagree with John on some of the details of his analysis and ideas, but both his criticism and our disagreement are essential

[5] if we are to build better and better technologies that connect us.

[6] Social media has both the power to destroy our society and to help it flourish.

[7] It's up to us to figure out how we take the latter path.

[8] And now a quick few seconds mention of each sponsor.

[9] Check them out in the description.

[10] It's the best way to support this podcast.

[11] We got Uncruise for adventure, Notion for startups, Blinkist for non-fiction, Magic Spoon for snacks, and Eight Sleep for sweet, sweet naps.

[12] Choose wisely, my friends.

[13] And now, onto the full ad reads, no ads in the middle.

[14] I hate those.

[15] I try to make these ad reads interesting, but if you skip them, please check out our sponsors.

[16] I enjoy their stuff.

[17] Maybe you will too.

[18] This show is brought to you by Uncruise Adventures.

[19] Small ships, cruises, sailing to Alaska, Mexico, Costa Rica, Hawaii, and so on.

[20] You can go hiking, kayaking, whale watching, and see the northern lights.

[21] There's something not just humbling, but somehow revealing about who we are and what we're doing here when you look at nature.

[22] There's something about the calmness, the quiet of water stretching out to the horizon.

[23] There's something about trees as far as the eye can see

[24] that reminds you what this is all about and somehow it's impossible to put it into words.

[25] That's why it's great to take little journeys like this.

[26] You can save $500 on Alaskan Adventures through June for bookings made within 30 days of departure.

[27] Go to uncruise.com slash pages slash lex.

[28] This show is also brought to you by Notion, a note-taking and team collaboration tool.

[29] It combines note-taking, document sharing, wikis, project management, and much, much, much more into one space that's simple, powerful, and beautifully designed.

[30] Obviously, I've been using Notion for a long time now.

[31] What I really mean to say is all the cool kids have been telling me for a long time about Notion.

[32] I'm originally, if I might admit, an Emacs person, so in terms of an IDE perspective, what I use for programming is still Emacs, but much, much less so.

[33] I use the more optimized tools for the job.

[34] For note -taking, specifically, Notion is just far above anything else I've used.

[35] It's an incredible tool.

[36] And it's also good for the work, the company setting, the startup setting.

[37] And they're running a special offer for startups.

[38] You can get up to $1,000 off the team plan.

[39] To give you a sense, that's almost a year of free Notion for a team of 10.

[40] Go to notion.com slash startups.

[41] This show is also brought to you by Blinkist, my favorite app for learning new things. Blinkist takes the key ideas from thousands of nonfiction books and condenses them down into just 15 minutes that you can read or listen to. Yeah, actually, The Rise and Fall of the Third Reich, unfortunately, is not on there. There are not many books about Hitler or Stalin on there, perhaps for a good reason. But the best, I would say, non-fiction books are all on there.

[42] So you're talking about Sapiens, Meditations by Marcus Aurelius.

[43] It goes on and on.

[44] I use it for all kinds of reasons.

[45] So one, of course, is to review the books I've already read.

[46] Two is to select the books I want to read in the future.

[47] And three, perhaps most powerfully, is books that I just don't have time to read, but are culturally and intellectually important.

[48] This life is about hard choices.

[49] And for the books you don't choose, sometimes Blinkist can save your ass.

[50] As a listener, you can get a free seven-day trial and 25% off of a Blinkist premium membership at blinkist.com slash lex.

[51] This episode is also brought to you by an old and favorite sponsor of this podcast.

[52] My fellow traveler on this journey of the podcast is Magic Spoon.

[53] There's something beautiful in the fact that this podcast was originally, and still is, sponsored by cereal.

[54] And that connects to my sort of teen years, when cereal was such a source of joy, but it was also a source of a gigantic amount of sugar. That's not the case with Magic Spoon, which is why it's awesome.

[55] It's low-carb, keto-friendly, zero grams of sugar, 13 to 14 grams of protein, only 4 net grams of carbs, 140 calories in each serving, and it's freaking delicious.

[56] My favorite flavor is cocoa.

[57] They have a bunch of other ones.

[58] Peanut butter is pretty good, too.

[59] They have a 100% happiness guarantee, so if you don't like it, they refund it.

[60] Nothing else in life, I would say, has a 100% happiness guarantee, so maybe you should look to Magic Spoon

[61] to get such guarantees about happiness, because the rest of life is full of suffering, friends.

[62] Save $5 off your order if you go to magicspoon.com slash Lex and use code Lex.

[63] This episode is also brought to you by Eight Sleep and its Pod Pro mattress.

[64] There are very few things I enjoy as much as a good day

[65] nap, even more than sleep sometimes. A good nap, like 20 minutes.

[66] I'll sometimes actually drink coffee right before, take the nap, and I wake up refreshed in the way that I don't feel refreshed in any other condition.

[67] There's something magical about this power nap, and I am ready to take on the day, even if it's like 11 p.m. at night.

[68] And then I ruined my whole sleep schedule, but that doesn't matter because happiness is what matters.

[69] And the best way to achieve happiness is a nap on an Eight Sleep mattress, or an Eight Sleep Pod Pro cover on top of a mattress. They have said Pod Pro cover that you can just add to your own mattress. There's two options, it's up to you. You can do all this at eightsleep.com slash lex, and you can save $200 at checkout on the Pod Pro cover. Eight Sleep currently ships within the United States, Canada, and the United Kingdom.

[70] Enjoy your sleep.

[71] Enjoy your nap, friends.

[72] This is the Lex Fridman Podcast.

[73] To support it, please check out our sponsors in the description.

[74] And now, dear friends, here's Jonathan Haidt.

[75] So you have been thinking about the human mind for quite a long time.

[76] You wrote The Happiness Hypothesis, The Righteous Mind, The Coddling of the American Mind, and today you're thinking, you're writing a lot about social media and about democracy.

[77] So perhaps if it's okay, let's go through the thread that connects all of that work.

[78] How do we get from the very beginning to today with the good, the bad, and the ugly of social media?

[79] So I'm a social psychologist, which means I study how we think about other people and how people affect our thinking.

[80] And in graduate school at the University of Pennsylvania, I picked the topic of moral psychology.

[81] and I studied how morality varied across countries.

[82] I studied in Brazil and India.

[83] And in the 90s, I began, this was like I got my PhD in 1992.

[84] And in that decade was really when the American culture war kind of really began to blow up.

[85] And I began to notice that left and right in this country were becoming like separate countries.

[86] And you could use the tools of cultural psychology to study this split, this moral battle between left and right.

[87] So I started doing that.

[88] And I began growing alarmed in the early 2000s about how bad polarization was getting.

[89] And I began studying the causes of polarization, you know, bringing moral psychology to bear on our political problems.

[90] And I was originally going to write a book to basically help the Democrats stop screwing up, because some of my research showed that people on the right understand people on the left.

[91] They know what they think.

[92] You can't grow up in America without knowing what progressives think.

[93] But I grew up generally on the left, and I had no idea what conservatives thought until I went and sought it out and started reading conservative things like National Review.

[94] So originally, I wanted to actually help the Democrats to understand moral psychology so they could stop losing to George W. Bush.

[95] And I got a contract to write The Righteous Mind.

[96] And once I started writing it, I committed to understanding conservatives by reading the best writing, not the worst.

[97] And I discovered, you know what?

[98] You don't understand anything until you look from multiple perspectives.

[99] And I discovered there are a lot of great social science ideas in the conservative intellectual tradition.

[100] And I also began to see, you know what?

[101] America is actually in real trouble.

[102] And this is like 2008-2009.

[103] Things are really, we're coming apart here.

[104] So I began to really focus my research on helping left and right understand each other and helping our democratic institutions to work better.

[105] Okay, so all this is before I had any interest in social media.

[106] I was on Twitter, I guess, like 2009, and not much.

[107] Didn't think about it much.

[108] And then, so I'm going along as a social psychologist studying this, and then everything seems to kind of blow up in 2014, 2015, at universities.

[109] And that's when Greg Lukianov came to me in May of 2014 and said, John, weird stuff is happening.

[110] Students are freaking out about a speaker coming to campus that they don't have to go see.

[111] And they're saying it's dangerous, it's violence, like what is going on?

[112] And so, anyway, it was Greg's ideas about how we were

[113] teaching students to think in distorted ways that led us to write The Coddling of the American Mind, which wasn't primarily about social media either.

[114] It was about, you know, this sort of rise of depression and anxiety.

[115] But after that, things got so much worse everywhere.

[116] And that's when I began to think, like, well, something systemically has changed.

[117] Something has changed about the fabric of the social universe.

[118] And so ever since then, I've been focused on social media.

[119] So we're going to try to sneak up to the problems and the solutions at hand from different directions.

[120] I have a lot of questions about whether it's fundamentally the nature of social media that's the problem,

[121] or the decisions of the various human beings that lead the social media companies

[122] that are the problem.

[123] Is there still some component that's highlighted in the coddling of the American mind?

[124] That's the individual psychology at play, or the way parenting and education work to sort of emphasize the anti-fragility of the human mind as it interacts

[125] with the social media platforms and the other humans through the social.

[126] So all that beautiful mess.

[127] That should take us an hour or two to cover.

[128] Or maybe a couple of years, yes.

[129] But so let's start, if it's okay.

[130] You said you wanted to challenge some of the things that Mark Zuckerberg has said in a conversation with me. What are some of the ideas he expressed that you disagree with?

[131] Okay.

[132] There are two major areas that I study.

[133] One is what is happening with teen mental health?

[134] It fell off a cliff in 2013.

[135] It was very sudden.

[136] And then the other is, what is happening to our democratic and epistemic institutions.

[137] That means knowledge generating, like universities, journalism.

[138] So my main areas of research where I'm collecting the empirical research and trying to make sense of it is what's happened to mental health and what's the evidence that social media is a contributor, and then the other areas, what's happening to democracies, not just America, and what's the evidence that social media is a contributor to the dysfunction.

[139] So I'm sure we'll get to that because that's what the Atlantic article is about.

[140] But if we focus first on what's happened to teen mental health.

[141] So before I read the quotes from Mark, I'd like to just give the overview.

[142] And it is this.

[143] There's a lot of data tracking adolescents.

[144] There are self-reports of how depressed, anxious, and lonely they are.

[145] There's data on hospital admissions for self-harm.

[146] There's data on suicide.

[147] And all of these things, they bounce around somewhat, but they're relatively level in the early 2000s.

[148] And then all of a sudden, around 2010 to 2013, depending on which statistic you're looking at, all of a sudden, they begin to shoot upwards, more so for girls in some cases, but on the whole, it's like up for both sexes.

[149] It's just that boys have lower levels of anxiety and depression, so the curve is not quite as dramatic.

[150] But what we see is not a small increase.

[151] It's not like, oh, 10%, 20%. No, the increases are between 50 and 150%, depending on which group you're looking at.

[152] you know, suicide for preteen girls, thankfully it's not very common, but it's two to three times more common now, or by 2015, it had doubled.

[153] Between 2010 and 2015, it doubled.

[154] So something is going radically wrong in the world of American preteens.

[155] And what we, so as I've been studying it, I found, first of all, it's not just America.

[156] It's identical in Canada and the UK.

[157] Australia, New Zealand are very similar.

[158] They're just after a little delay.

[159] So whatever we're looking for here.

[160] But yet it's not as clear in the Germanic countries.

[161] In continental Europe, it's a little different, and we can get into that when we talk about childhood.

[162] But something's happening in many countries, and it started right around 2012, 2013.

[163] It wasn't gradual.

[164] It hit girls hardest, and it hit preteen girls the hardest.

[165] So what could it be?

[166] Nobody has come up with another explanation.

[167] Nobody.

[168] It wasn't the financial crisis.

[169] That wouldn't have hit preteen girls the hardest.

[170] There is no other explanation.

[171] The complexity here and the data is, of course, as everyone knows, correlation doesn't prove causation.

[172] The fact that television viewing was going up in the 50s and 60s and 70s doesn't mean that that was the cause of the crime.

[173] So what I've done, and this is work with Jean Twenge, who wrote the book iGen, is, because I was challenged, you know, when Greg and I put out the book The Coddling of the American Mind, some researchers challenged this and said, oh, you don't know what you're talking about.

[174] You know, the correlations between social media use and mental health, they exist, but they're tiny.

[175] It's, you know, like a correlation coefficient of 0.03 or, you know, a beta of 0.05, you know, tiny little things.

[176] And one famous article said it's no bigger than the correlation of bad mental health and eating potatoes, which exists, but it's like it's so tiny.

[177] It's zero, essentially.

[178] And that claim, that social media is no more harmful than eating potatoes or wearing eyeglasses.

[179] It was a very catchy claim, and it's caught on, and I keep hearing that.

[180] But let me unpack why that's not true, and then we'll get to what Mark said, because what Mark basically said, here, actually, I'll read it.

[181] And by the way, just to pause real quick, is you implied, but just made it explicit, that the best explanation we have now as you're proposing is that a very particular aspect of social media is the cause, which is not just social media, but the like button and the retweet, a certain mechanism of virality that was...

[182] invented, or perhaps some aspect of social media is the cause.

[183] Good idea.

[184] Let's be clear.

[185] Connecting people is good.

[186] I mean, overall, the more you connect people, the better.

[187] Giving people, the telephone was an amazing step forward, giving them free telephone, you know, free long distance is even better.

[188] Video was, I mean, so connecting people is good.

[189] I'm not a Luddite.

[190] And social media, at least the idea of users posting things, like that happens on LinkedIn, and it's great.

[191] It can serve all kinds of needs.

[192] What I'm talking about here is not the internet.

[194] It's not technology.

[195] It's not smartphones.

[196] And it's not even all social media.

[197] It's a particular business model in which people are incentivized to create content.

[198] And that content is what brings other people on.

[199] And the people on there are the product which is sold to advertisers.

[200] It's that particular business model, which Facebook pioneered, which seems to be incredibly harmful for teenagers, especially for young girls, 10 to 14 years old is where they're most vulnerable.

[201] And it seems to be particularly harmful for democratic institutions because it leads to all kinds of anger, conflict, and the destruction of any shared narrative.

[202] So that's what we're talking about.

[203] We're talking about Facebook, Twitter.

[204] I don't have any data on TikTok.

[205] I suspect it's going to end up having a lot of really bad effects because the teens are on it so much.

[206] And to be really clear, since we're doing the nuance now in this section, lots of good stuff happens.

[207] There's a lot of funny things on Twitter.

[208] I use Twitter because it's an amazing way to put out news, to put out when I write something. You know, you and I use it to promote things.

[209] We learn things quickly.

[210] Well, there could be. Now, this is harder to measure, and I'll try to mention it, because so much of our conversation will be about rigorous criticism.

[211] I'll try to sometimes mention what are the possible positive effects of social media in different ways.

[212] So, for example, in the way I've been using Twitter, not the promotion or any of that kind of stuff, it makes me feel less lonely to connect with people, to make me smile, a little bit of humor here and there.

[213] And that at scale is a very interesting effect, being connected across the globe, especially during times of COVID and so on.

[214] It's very difficult to measure that.

[215] So we kind of have to consider that and be honest that there is a trade -off.

[216] We have to be honest about the positive and the negative.

[217] And sometimes we're not rigorous in a scientific way about the negative, and that's what we're trying to do here.

[218] And so, that brings us to the Mark Zuckerberg email.

[219] Okay.

[220] But wait, let me just pick up on the issue of tradeoffs because people might think, like, well, like how much of this do we need?

[221] If we have too much, it's bad.

[222] No, that's a one-dimensional conceptualization.

[223] This is a multi-dimensional issue.

[224] And a lot of people seem to think like, oh, what would we have done without social media during COVID?

[225] Like we would have been sitting there alone in our homes.

[226] Yeah, if all we had was, you know, texting, telephone, Zoom, Skype, multiplayer video games, WhatsApp, all sorts of ways of communicating.

[227] Oh, and there's blogs and the rest of the internet.

[228] Yeah, we would have been fine.

[229] Did we really need the hyperviral platforms of Facebook and Twitter?

[230] Now, those did help certain things get out faster.

[231] And that did help science Twitter sometimes, but it also led to huge explosions of misinformation and the polarization of our politics to such an extent that a third of the country, you know, didn't believe what the medical establishment was saying.

[232] And we'll get into this, the medical establishment sometimes was playing political games that made them less credible.

[233] So on net, it's not clear to me. If you've got the internet, smartphones, blogs, all of that stuff, it's not clear to me that adding in this particular business model of Facebook, Twitter, TikTok, that that really adds a lot more.

[234] And one interesting one we'll also talk about is YouTube.

[235] I think it's easier to talk about Twitter and Facebook.

[236] YouTube is another complex beast that's very hard to, because YouTube has many things.

[237] It's a content platform, but it also has a recommendation system.

[238] Let's focus our discussion on perhaps Twitter and Facebook, but you do, in this large document that you're putting together on social media, called

[239] Social Media and Political Dysfunction: A Collaborative Review, with Chris Bail.

[240] That includes, I believe, papers on YouTube as well.

[241] It does.

[242] But yeah, again, just to finish up with the nuance, yeah, YouTube is really complicated because I can't imagine life without YouTube.

[243] It's incredibly useful.

[244] It does a lot of good things.

[245] It also obviously helps to radicalize terrorist groups and murderers.

[246] So, you know, I think about YouTube the way I think about the Internet in general, and I don't know enough to really comment on YouTube.

[247] So I have been focused, and it's also interesting.

[248] One thing we know is teen social life changed radically between about 2010 and 2012.

[249] Before 2010, they mostly weren't on every day because they didn't have smartphones yet.

[250] By 2012 to 14, that's the era in which they almost all get smartphones, and they become daily users. So the girls go to Instagram and Tumblr, they go to the visual ones; the boys go to YouTube and video games.

[251] Those don't seem to be as harmful to mental health or even harmful at all.

[252] It's really Tumblr, Instagram, particularly, that seem to really have done in girls' mental health.

[253] So now, okay, so let's look at the quote from Mark Zuckerberg.

[254] So at 64 minutes and 31 seconds on the video, I time-coded this.

[255] This is the very helpful YouTube transcript.

[256] YouTube's an amazing program.

[257] You ask him about Frances Haugen, you give him a chance to respond, and here's the key thing.

[258] So he talks about what Frances Haugen said.

[259] He said, no, but that's mischaracterized.

[260] Actually, on most measures, the kids are doing better when they're on Instagram.

[261] It's just on one out of the 18.

[262] And then he says, I think an accurate characterization would have been that kids using Instagram, or not kids, but teens, is generally positive for their mental health.

[263] That's his claim, that Instagram is overall, taken as a whole, Instagram is positive for their mental health.

[264] That's what he says.

[265] Okay.

[266] Now, is it really?

[267] Is it really?

[268] So first, just the simple, okay, now here, what I'd like to do is turn my attention to another document that we'll make available.

[269] So I was invited to give testimony before a Senate subcommittee two weeks ago, where they were considering the Platform Accountability and Transparency Act.

[270] Should we force the platforms to actually tell us what our kids are doing?

[271] Like, we have no idea other than self-report.

[272] We have no idea.

[273] They're the only ones who know, like, the kid does this, and then over the next hours, the kid is depressed or happy.

[274] We can't know that,

[275] but Facebook knows it.

[276] So should they be compelled to reveal the data?

[277] We need that.

[278] So you raise just to give people a little bit of context.

[279] And this document is brilliantly structured with questions, studies that indicate that the answer to a question is yes, indicate that the answer to a question is no, and then mixed results.

[280] And questions include things like: does social media make people more angry or affectively polarized?

[281] That's the one that we're going to get to.

[282] That's the one for democracy.

[283] Yes, that's for democracy.

[284] So I've got three different Google Docs here because I found this is an amazing way, and thank God for Google Docs.

[285] It's an amazing way to organize the research literature, and it's a collaborative review, meaning that, so on this one, Jean Twenge and I put up the first draft, and we say, please, comment, add studies, tell us what we missed.

[286] And it evolves in real time.

In any direction, the yes or the no? Oh, yeah, we specifically encourage that because, look, the center of my research is that our gut feelings drive our reasoning.

[288] That was my dissertation.

[289] That was my early research.

[290] And so, Jean Twenge and I, we're going to obviously preferentially believe that these platforms are bad for kids, because we said so in our books.

[291] So we have confirmation bias.

[292] And I'm a devotee of John Stuart Mill.

[293] The only cure for confirmation bias is other people who have a different confirmation bias.

[294] So these documents evolve, because critics then say, no, you missed this, or they say, you don't know what you're talking about.

[295] It's like, great, say so.

[296] Tell us.

[297] So I put together this document and I'm going to put links to everything on my website.

[298] If users, sorry, if listeners, viewers go to jonathanhaidt.com slash social media.

[299] It's a new page I just created.

[300] I'll put everything together in one place there, and we'll put those in the show notes.

[301] Like links to this document and other things like it that we're talking about.

[302] So yeah, so the thing I want to call attention to now is this document, this document here with the title "Teen Mental Health Is Plummeting, and Social Media Is a Major Contributing Cause."

[303] So Ben Sasse and Chris Coons are on the Judiciary Committee.

[304] They had a subcommittee hearing on Nate Persily's bill, the Platform Accountability and Transparency Act.

[305] So they asked me to testify on what do we know, what's going on with teen mental health.

[306] And so what I did was I put together everything I know with plenty of graphs to make these points that first, what do we know about the crisis?

[307] Well, that the crisis is specific to mood disorders, not everything else.

[308] It's not just self-report.

[309] It's also behavioral data, because suicide and self-harm go skyrocketing after 2010.

[310] The increases are very large, and the crisis is gendered, and it's hit many countries.

[311] So I go through the data on that.

[312] So we have a pretty clear characterization, and nobody's disputed me on this, on this part.

[313] So can we just pause real quick?

[314] Just so for people who are not aware.

[315] So self-report, just how you kind of collect data on this kind of thing.

[316] You have a self -reported a survey.

[317] You ask people.

[318] Yeah, how anxious are you these days?

[319] Yeah.

[320] How many hours a week do you use social media?

[321] That kind of stuff.

[322] And you can collect large amounts of data that way, because you can ask a large number of people that kind of question.

[323] But then there's, I forget the term you used, but more, so non-self-report data.

[324] Behavioral data, that's right.

[325] Where you actually have self -harm and suicide numbers.

[326] Exactly.

[327] Exactly.

[328] So there are a lot of graphs like this.

[329] So this is from the National Survey on Drug Use and Health.

[330] So the federal government and also Pew and Gallup, there are a lot of organizations that have been collecting survey data for decades.

[331] So this is a gold mine.

[332] And what you see on these graphs over and over again is relatively straight lines up until around 2010 or 2012.

[333] And on the X axis, we have time, years going from 2004 to 2020.

[334] On the Y axis is the percent of U.S. teens who had a major depression in the last year.

[335] That's right.

[336] So when this data started coming out around, so Jean Twenge's book, iGen, 2017, a lot of people say, oh, she, you know, she doesn't know what she's talking about.

[337] This is just self -report.

[338] Like Gen Z, they're just really comfortable talking about this.

[339] This is a good thing.

[340] This isn't a real epidemic.

[341] And literally the day before my book with Greg was published, the day before, there was a psychiatrist in The New York Times who had an op-ed saying, relax, smartphones are not ruining your kid's brain.

[342] And he said, it's just self -report.

[343] It's just that they're giving higher rates.

[344] There's more diagnosis.

[345] But underlying, there's no change.

[346] Now, that's theoretically possible, but all we have to do is look at the hospitalization data for self-harm and suicide, and we see the exact same trends.

[347] We see also a very sudden, big rise around, between 2009 and 2012, you have an elbow, and then it goes up, up, up.

[348] So that is not self-report.

[349] Those are actual kids admitted to hospitals for cutting themselves.

[350] So we have a catastrophe, and this was all true before COVID.

[351] COVID made things worse, but we have to realize, you know, COVID's

[352] going away, kids are back in school, but we're not going to go back to where we were because this problem is not caused by COVID.

[353] What is it caused by?

[354] Well, just, again, to just go through the point, then I'll stop.

[355] I just feel like I want to get out the data to show that Mark is wrong.

[356] So first point, correlational studies consistently show a link.

[357] They almost all do, but it's not big.

[358] Equivalent to a correlation coefficient around 0.1, typically.

[359] That's the first point.

[360] The second point is that the correlation is actually much larger than for eating potatoes.

[361] So that famous line wasn't about social media use.

[362] That was about digital media use.

[363] That included watching Netflix, doing homework on everything.

[364] And so what they did is they looked at all screen use.

[365] And then they said, this is correlated with self-reports of depression and anxiety.

[366] Like, you know, 0.03.

[367] It's tiny.

[368] But they said that clearly in the paper.

[369] But the media has reported it as: social media is 0.03, or tiny.

[370] And that's just not true.

[371] What I found digging into it, and you don't know this until you look: there are more than 100 studies in the Google Doc.

[372] Once you dig in, what you see is, okay, you see a tiny correlation.

[373] What happens if we zoom in on just social media?

[374] It always gets bigger, often a lot bigger, two or three times bigger.

[375] What happens if we zoom in on girls on social media?

[376] It always gets bigger, often a lot bigger.

[377] And so what I think we can conclude, in fact, one of the authors of the potato studies herself concludes, Amy Orben says, I think I have a quote from her, she reviewed a lot of studies, and she herself said that, quote, the associations between social media use and well-being, therefore, range from about r = 0.15 to r = 0.10.

[378] So that's the range we're talking about.

[379] And that's for boys and girls together.

[380] And a lot of research, including hers and mine, shows that for girls, it's higher.

[381] So for girls, we're talking about correlations around 0.15 to 0.2. I believe Jean Twenge and I found it's about 0.2 or 0.22.

[382] Now, this might sound like an arcane social science debate, but people have to understand, public health correlations are almost never above 0.2.

[383] So the correlation of childhood exposure to lead and adult IQ, a very serious problem, that's 0.09.

[384] Like, the world's messy, and our measurements are messy.

[385] And so if you find a consistent correlation of 0.15, like, you would never let your kid do that thing; that actually is dangerous.

[386] And it can explain.

[387] When you multiply it over tens of millions of kids spending, you know, years of their lives, you actually can explain the mental health epidemic just from social media use.
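
To make that multiplication concrete, here is a small numerical sketch. Only the r ≈ 0.15 figure comes from the conversation; the population size, exposure split, and clinical cutoff below are invented purely for illustration and are not from any of the studies discussed.

```python
# Illustrative sketch only: how a "small" correlation compounds at scale.
# Assumed numbers: r = 0.15 (from the conversation); population size,
# exposure split, and clinical cutoff are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000            # hypothetical teen population
r = 0.15                 # assumed exposure-outcome correlation

exposure = rng.standard_normal(n)
# Construct an outcome variable with correlation r to exposure.
outcome = r * exposure + np.sqrt(1 - r**2) * rng.standard_normal(n)

cutoff = 1.5             # arbitrary "clinical" threshold in z-units
heavy = exposure > 1.0   # top ~16% of users, an arbitrary split

rate_heavy = (outcome[heavy] > cutoff).mean()
rate_rest = (outcome[~heavy] > cutoff).mean()
excess = int((rate_heavy - rate_rest) * heavy.sum())

print(f"above cutoff, heavy users:   {rate_heavy:.1%}")
print(f"above cutoff, everyone else: {rate_rest:.1%}")
print(f"excess cases among heavy users: {excess:,}")
```

Even with these made-up parameters, a correlation that looks small on paper moves tens of thousands of people across the threshold in a population of a million.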

[388] Well, and then there's questions.

[389] By the way, this is really good to learn, because I quit potatoes and it had no effect.

[390] And as a Russian, that was a big sacrifice.

[391] I'm quite literal, actually, because I'm mostly eating keto these days, but it's funny that they're actually literally called the potato studies.

[392] Okay, but given this, and there's a lot of fascinating data here, there's also a discussion of how to fix it.

[393] What are the aspects that if fixed would start to reverse some of these trends?

[394] So let's just linger on the Mark Zuckerberg statements.

[395] So first of all, do you think Mark is aware of some of these studies?

[396] So if you put yourself in the shoes of Mark Zuckerberg and the executives at Facebook and Twitter, how can you try to understand studies like the Google Doc you put together, to try to make decisions that fix things?

[397] Is there a stable science now that you can start to investigate?

[398] And also, maybe you can comment on the depth of data that's available, because ultimately, you argue that the data should be more transparent, should be provided.

[399] But currently, if it's not, all you have is maybe some leaks of internal data.

[400] That's right.

[401] And we could talk about the potential.

[402] You have to be very sort of objective about the potential bias in those kinds of leaks.

[403] You want to, it would be nice to have non-leaked data.

[404] Like, yeah, it would be nice to actually have academic researchers able to access, in de-individuated, de-identified form,

[405] the actual data on what kids are doing and how their mood changes and, you know, when people commit suicide, what was happening before?

[406] It would be great to know that.

[407] We have no idea.

[408] So how do we begin to fix social media, would you say?

[409] Okay.

[410] So here's the most important thing to understand.

[411] In the social sciences, you know, we say is social media harmful to kids?

[412] That's a broad question.

[413] You can't answer that directly.

[414] You have to have much more specific questions.

[415] You have to operationalize it and have a theory of how it's harming kids.

[416] And so almost all of the research is done on what's called the dose response model.

[417] That is, everybody, including most of the researchers, is thinking about this like, let's treat it like sugar.

[418] You know, because the data usually shows a little bit of social media use isn't correlated with harm, but a lot is.

[419] So, you know, think of it like sugar, and if kids have a lot of sugar, then it's bad.

[420] So how much is okay?

[421] But social media is not like sugar at all.

[422] It's not a dose response thing.

[423] It's a complete rewiring of childhood.

[424] So we evolved as a species in which kids play in mixed age groups.

[425] They learn the skills of adulthood.

[426] They're always playing and working and learning and doing errands.

[427] That's normal childhood.

[428] That's so you develop your brain.

[429] That's so you become a mature adult, and that was true until the 1990s.

[430] In the 1990s, we dropped all that.

[431] We said it's too dangerous.

[432] If we let you outside, you'll be kidnapped.

[433] So we completely, we began rewiring childhood in the 90s before social media.

[434] And that's a big part of the story.

[435] I'm a big fan of Lenore Skenazy, who wrote the book Free-Range Kids.

[436] If there are any parents listening to this, please buy Lenore's book Free-Range Kids, and then go to letgrow.org.

[437] It's a nonprofit that Lenore and I started with Peter Gray and Daniel Shuchman to help change the laws and the norms around letting kids out to play.

[438] They need free play.

[439] So that's the big picture.

[440] They need free play.

[441] And we started stopping that in the 90s; we reduced it.

[442] And then Gen Z, kids born in 1996 and later, they're the first people in history to get on social media before puberty.

[443] Millennials didn't get it until they were in college.

[444] But Gen Z, they get it because you can lie.

[445] You just lie about your age.

[446] So they really begin to get on around 2009, 2010, and boom, two years later, they're depressed.

[447] It's not because they ate too much sugar necessarily.

[448] It's because even normal social interactions that kids had in the early 2000s, largely, well, they declined, because now everything's through the phone.

[449] And that's what I'm trying to get across, that it's not just a dose response thing.

[450] It's imagine one middle school where everyone has an Instagram account and it's constant drama.

[451] Everyone's constantly checking and posting and worrying and imagine going through puberty that way versus imagine there was a policy.

[452] No phones in school.

[453] You have to check them in a locker.

[454] No one can have an Instagram account.

[455] All the parents are on board.

[456] Parents only let their kids have Instagram because the kid says, everyone else has it.

[457] And that's, we're stuck in a social dilemma, we're stuck in a trap.

[458] So what's the solution?

[459] Keep kids off until they're done with puberty.

[460] There's a new study, actually, by Amy Orben and Andy Przybylski showing that the damage is greatest for girls between 11 and 13.

[461] So there is no way to make it safe for preteens or even 13-, 14-year-olds.

[462] We've got a, kids should simply not be allowed on these business models where you're the product.

[463] They should not be allowed until they're 16.

[464] We need to raise the age and enforce it.

[465] That's the biggest thing.

[466] So I think that's a really powerful solution, but it makes me wonder if there's other solutions, like controlling the virality of bullying; sort of, if there's a way to use social media that's more productive for childhood.

[467] So, of course, one thing is putting your phone down, but first of all, from the perspective of social media companies, it might be difficult to convince them to do so.

[468] And also, for me, as an adult who grew up without social media, social media is a source of joy. So I wonder if it's possible to design the mechanisms, both to challenge the ad-driven model, but actually just technically the recommender system and how virality works on these platforms. If it's possible to design a platform that leads to growth, anti-fragility, but does not lead to depression, self-harm, and suicide. Like, finding that balance and making that the objective function, not engagement or something else.

[469] I don't think that can be done for kids.

[470] So I am very reluctant to tell adults what to do.

[471] I have a lot of libertarian friends, and I would lose their friendship if I started saying, oh, it's bad for adults and we should stop adults from using it.

[472] But by the same token, I'm very reluctant to have Facebook and Instagram tell my kids what to do without me even knowing or without me having any ability to control it.

[473] As a parent, it's very hard to stop your kid.

[474] I have stopped my kids from getting on Instagram, and that's caused some difficulties, but they also have thanked me because they see that it's stupid.

[475] They see what the kids who are heavily on it post; they see that the culture of it is stupid, as they say.

[476] So I don't think there's a way to make it healthy for kids.

[477] I think there's one thing which is healthy for kids, which is free play.

[478] We already robbed them of most of it in the 90s.

[479] The more time they spend on their devices, the less they have free play.

[480] Video games are a kind of play.

[481] I'm not saying that these things are all bad.

[482] But, you know, 12 hours of video game play means you don't get any physical play.

[483] So anyway, physical play is the way to develop physical anti-fragility.

[484] And especially social skills.

[485] Kids need huge amounts of conflict with no adult to supervise or mediate.

[486] And that's what we robbed them of.

[487] So anyway, we should move on because I get really into the evidence here, because I think the story is actually quite clear now.

[488] There was a lot of ambiguity.

[489] There are conflicting studies, but when you look at it all together, the correlational studies are pretty clear and the effect sizes are coming in around 0.1 to 0.15, whether you call that a correlation coefficient or a beta.

[490] It's all in standardized beta.

[491] It's all in that sort of range.
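
A side note on the "correlation coefficient or a beta" phrasing: with a single predictor, the standardized regression slope and the Pearson r are the same quantity, so the 0.1 to 0.15 range can be read either way. Here's a quick numeric check with synthetic data; the 0.15 slope is just a stand-in, not any study's actual estimate.

```python
# Toy check: with one predictor, standardized beta equals Pearson r.
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(5000)             # e.g. social media use (arbitrary units)
y = 0.15 * x + rng.standard_normal(5000)  # e.g. symptom score plus noise

r = np.corrcoef(x, y)[0, 1]               # Pearson correlation coefficient

# Standardized beta: regression slope after z-scoring both variables.
xz = (x - x.mean()) / x.std()
yz = (y - y.mean()) / y.std()
beta = np.polyfit(xz, yz, 1)[0]

print(f"Pearson r:         {r:.4f}")
print(f"standardized beta: {beta:.4f}")   # identical up to floating point
```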

[492] There's also experimental evidence.

[493] We collected true experiments with random assignment, and they mostly show an effect.

[494] And there's eyewitness testimony.

[495] You know, the kids themselves, you talk to girls and you poll them.

[496] Do you think overall Instagram is good for your mental health or bad for it?

[497] You're not going to find a group saying, oh, it's wonderful.

[498] Oh, yeah.

[499] Yeah, Mark, you're right.

[500] It's mostly good.

[501] No, the girls themselves say, this is the major reason.

[502] And I've got studies in the Google Doc where there have been surveys.

[503] What do you think is causing depression and anxiety?

[504] And the number one thing they say is social media.

[505] So there's multiple strands of evidence.

[506] Do you think the recommendation is, as a parent, that teens should not use Instagram, Twitter? Although ultimately, maybe in the long term, there's a more nuanced solution.

[507] There's no way to make it safe there.

[508] It's unsafe at any speed.

[509] Well, I mean, it might be very difficult to make it safe.

[510] And in the short term, while we don't know how to make it safe, put down the phone.

[511] Well, no, hold on a second.

[512] Play with other kids via a platform like Roblox or multiplayer video games.

[513] That's great.

[514] I have no beef with that.

[515] You focused on bullying before.

[516] That's one of five or seven different avenues of harm.

[517] The main one, I think, which does in the girls, is not being bullied.

[518] It's living a life where you're thinking all the time about posting because once a girl starts posting.

[519] So it's bad enough that they're scrolling through, and this is, everyone comments on this, you're scrolling through and everyone's life looks better than yours, because it's fake, and all that you see are the ones the algorithm picked that were the nicest.

[520] Anyway, so the scrolling, I think, is bad for the girls.

[521] But I'm beginning to see, I can't prove this, but what I'm beginning to see from talking to girls, from seeing how it's used, is once you start posting, that takes over your mind.

[522] And now you're basically, you're no longer present because even if you're only spending five or six hours a day on Instagram, you're always thinking about it.

[523] And when you're in class, you're thinking about how are people responding to the post that I made between periods, you know, between classes.

[524] I mean, I do it.

[525] You know, I try to stay off Twitter for a while, but now, you know, I've got this big article.

[526] I'm tweeting about it.

[527] And I can't help it.

[528] Like, I check, you know, 20 times a day.

[529] I'll check.

[530] Like, what are people saying?

[531] What are people saying?

[532] This is terrible.

[533] And I'm a, you know, 58-year-old man. Imagine being a 12-year-old girl going through puberty.

[534] You're self-conscious about how you look.

[535] And I see some young women.

[536] I see some professional young women.

[537] Women in their 20s and 30s who are putting up sexy photos of themselves.

[538] Like, and this is so sad.

[539] So sad.

[540] Don't be doing this.

[541] Yeah.

[542] See, I, the thing where I disagree a little bit is, I agree with you in the short term.

[543] But in the long term, I feel it's the responsibility of social media, not in some kind of ethical way, not just in an ethical way, but it'll actually be good for the product or for the company to maximize the long-term happiness and well-being of the person.

[544] So not just engagement.

[545] But the person is not the customer.

[546] So the thing is not to make them happy.

[547] It's to keep them on.

[548] That's the way it is currently with the ad-driven model.

[549] If we can get a business model, as you're saying, I'd be all for it.

[550] And I think that's the way to make much more money.

[551] So, like a subscription model, where the money comes from the users paying?

[552] It's not.

[553] That would work, wouldn't it?

[554] That would help.

[555] So subscription definitely would help, but I'm not sure it's so much.

[556] I mean, a lot of people say it's about the source of money, but I just think it's about the fundamental mission of the product.

[557] If you want people to really love a thing, I think that thing should maximize your long-term well-being.

[558] It should.

[559] In theory, in morality land, it should.

[560] I don't think it's just morality land.

[561] I think in business land, too.

[562] But that's maybe a discussion for another day.

[563] We're studying the reality of the way things currently are.

[564] And they are, as they are, as the studies are highlighting.

[565] So let us go then from the land of mental health for young people to the land of democracy.

[566] By the way, in these big umbrella areas, is there a connection, is there a correlation, between the mental health of a human mind and the division of our political discourse?

[567] Oh, yes.

[568] Oh, yes.

[569] So our brains are structured to be really good at approach and avoid.

[570] So we have circuits.

[571] The front left circuit is an oversimplification, but there's some truth to it.

[572] There's what's called the behavioral activation system, front left cortex.

[573] It's all about approach, opportunity, you know, kid in a candy store.

[574] And then the front right cortex has circuits specialized for withdrawal, fear, threat.

[575] And of course, students, you know, I'm a college professor.

[576] And most of us think about our college days like, you know, yeah, we were anxious at times, but it was fun.

[577] And it was like, I can take all these courses, I can do all these clubs, all these people.

[578] Now imagine if in 2013 all of a sudden students are coming in with their front right cortex hyperactivated.

[579] Everything's a threat.

[580] Everything is dangerous.

[581] There's not enough to go around.

[582] So the front right cortex puts us into what's called defend mode as opposed to discover mode.

[583] Now let's move up to adults.

[584] Imagine a large, diverse, secular liberal democracy in which people are most of the time in discover mode: you know, we have a problem, let's think how to solve it. And this is what de Tocqueville said about Americans: like, there's a problem, we get together, we figure out how to solve it. And he said, whereas in England and France, people would wait for the king to do it, but here, you know, we roll up our sleeves and do it. That's the can-do mindset. That's front left cortex, discover mode. Now, if you have a national shift of people spending more time in defend mode, then everything that comes up, whatever anyone says, you're not looking like, oh, is there something good about it?

[585] You're thinking, you know, how is this dangerous?

[586] How is this a threat?

[587] How is this violence?

[588] How can I attack this?

[589] How can I, you know, so if you imagine, you know, God up there with a little lever like, okay, let's push everyone over into, you know, more into Discover mode.

[590] And it's like joy breaks out, age of Aquarius.

[591] All right, let's shift them back into, let's put everyone in defend mode.

[592] And I can't think of a better way to put people in defend mode than to have them spend some time on partisan or political Twitter where it's just a stream of horror stories, including videos about how horrible the other side is.

[593] And it's not just that they're bad people.

[594] It's that if they win this election, then we lose our country, or then it's catastrophe.

[595] So Twitter, and again, we're not saying all of Twitter, you know, most people aren't on Twitter, and the people that are on it are mostly not talking about politics, but the ones that are talking about politics are flooding us with stuff.

[596] All the journalists see it.

[597] All mainstream media is hugely influenced by Twitter.

[598] So if we put everyone, if there's more sort of anxiety, sense of threat, this colors everything.

[599] And then you're not, you know, the great thing about a democracy and especially a, you know, or a legislature that has some diversity in it is that the art of politics is that you can grow the pie and then divide it.

[600] You don't just fight zero sum.

[601] You find ways that we can all get 60% of what we want.

[602] And that, that ends when everyone's anxious and angry.

[603] So let's try to start to figure out who's to blame here.

[604] Is it the nature of social media?

[605] Is it the decision of the people at the heads of social media companies that they're making in the detailed engineering designs of the algorithm?

[606] Is it the users of social media that drive narratives, like you mentioned journalists that want to maximize drama in order to drive clicks to their off-site articles?

[607] Is it just human nature that loves drama?

[608] Can't look away from an accident when you're driving by it.

[609] The reason I ask these questions is to see, can we start to figure out what the solution would be, to alleviate, to de-escalate the division?

[610] Not yet, not yet.

[611] Let's first, we have to understand, you know, as we did on the teen mental health thing: okay, now let's lay out what is the problem, what's messing up our country, and then we can talk about solutions.

[612] So it's all the things you said, interacting in an interesting way.

[613] So human nature is tribal.

[614] We evolved for intergroup conflict.

[615] We love war.

[616] The first time my buddies and I played paintball, I was 29.

[617] And we were divided into teams with strangers to shoot guns at each other and kill each other.

[618] And we all afterwards, it was like, oh my God, that was incredible.

[619] Incredible.

[620] Like, it really felt like we'd opened a room in our hearts that had never been opened.

[621] But as men, you know, testosterone changes our brains and our bodies and activates the war stuff, like we've got more stuff.

[622] And that's why boys like certain team sports, it's play war.

[623] So that's who we are.

[624] It doesn't mean we're always tribal.

[625] It doesn't mean we're always wanting to fight.

[626] We're also really good at making peace and cooperation and finding deals.

[627] We're good at trade and exchange.

[628] So, you know, you want your country to, you want a society that has room for conflict.

[629] Ideally over sports.

[630] Like, that's great.

[631] That's totally, it's not just harmless.

[632] It's actually good.

[633] But otherwise you want cooperation to generally prevail in the society.

[634] That's how you create prosperity and peace.

[635] And if you're going to have a diverse democracy, you really better focus on cooperation, not on tribalism and division.

[636] And there's a wonderful book by Yascha Mounk called The Great Experiment that talks about the difficulty of diversity and democracy and what we need to do to get this right and to get the benefits of diversity.

[637] So that's human nature.

[638] Now let's imagine that the technological environment makes it really easy for us to cooperate.

[639] Let's give everyone telephones and the postal service.

[640] Let's give them email.

[641] Like, wow, we can do all these things together with people far away.

[642] It's amazing.

[643] Now instead of that, let's give them a technology that encourages them to fight.

[644] So early Facebook and Twitter were generally lovely places.

[645] You know, people old enough to remember will remember, like, they were fun.

[646] There was a lot of humor.

[647] You didn't feel like you're going to get your head blown off no matter what you said.

[648] 2007, 2008, 2009.

[649] It was still fun.

[650] These were nice places mostly.

[651] And like almost all the platforms started off as nice places.

[652] But, and this is the key thing in the article, in the Atlantic article on Babel, on after Babel.

[653] The Atlantic article, by the way, is "Why the Past 10 Years of American Life Have Been Uniquely Stupid."

[654] Yeah.

[655] My title in the magazine was "After Babel: Adapting to a World We Can No Longer Share."

[656] That's what I proposed.

[657] But they A/B tested, what's the title that gets the most clicks?

[658] And it was why the past 10 years have been uniquely stupid.

[659] So Babel, the Tower of Babel is a driving metaphor in the piece.

[660] So, first of all, what is the Tower of Babel?

[661] What's Babel?

[662] What are we talking about?

[663] So the Tower of Babel is a story early in Genesis, where the descendants of Noah are spreading out and repopulating the world.

[664] And they're on the plane of Shinar, and they say, let us build us a city with a tower to make a name for ourselves lest we be scattered again and so it's a very short story there's not a lot in it but it looks like they're saying we don't want God to flood us again let's build a city in a tower and to reach the heavens and God is offended by the hubris of these people acting again like gods and he says here's the key line he says let us go down and confuse their language so that they may not understand one another So in the story, he doesn't literally knock the tower over, but, you know, many of us have seen images or, you know, movie dramatizations where a great wind comes and the tower is knocked over and the people are left wandering amid the rubble, unable to talk to each other.

[665] So I've been grappling.

[666] I've been trying to say, what the hell happened to our society, you know, beginning in 2014, what the hell is happening to universities?

[667] And then it's spread out from universities.

[668] It hit journalism, the arts, and now it's all over companies.

[669] What the hell happened to us?

[670] And it wasn't until I reread the Babel story a couple of years ago that I thought, whoa, this is it.

[671] This is the metaphor.

[672] Because, you know, I'd been thinking about tribalism and left-right battles and war, and that's easy to think about.

[673] But Babel isn't like, you know, and God said let half of the people hate the other half.

[674] No, it wasn't that.

[675] It's God said, let us confuse their language, that they, none of them can understand each other ever again, or at least for a while.

[676] So it's a story about fragmentation, and that's what's unique about our time.

[677] So Meta or Facebook wrote a rebuttal to my article.

[678] They disputed what I said, and one of their arguments was, oh, but, you know, polarization goes back way before social media, and, you know, it was happening in the 90s, and they're right, it does.

[679] And I did say that, but I should have said it more clearly with more examples.

[680] But here's the new thing.

[681] Even though left and right were beginning to hate each other more, we weren't afraid of the person next to us. We weren't afraid of each other. Cable TV, you know, Fox News, whatever you want to point to about increasing polarization, it didn't make me afraid of my students. And that was new. Around 2014, 2015, we started getting articles, you know, "I'm a liberal professor, and my liberal students terrify me." That was in Vox in 2015, and that was after Greg and I had turned in the first draft of our coddling article. And surveys show over and over again, students are not as afraid of their professors.

[682] They're actually afraid of other students.

[683] Most students are lovely.

[684] It's not like the whole generation has lost their minds.

[685] What happens is a small number, a small number are adept at using social media to destroy anyone that they think they can get credit for destroying.

[686] And the bizarre thing is it's rarely about what ideas you express.

[687] It's usually about a word.

[688] Like, he used this word, or this, you know, this was insensitive, or, you know, I can link this word to that.

[689] So it's, they're not engaging with ideas and arguments.

[690] It's a real sort of gotcha, prosecutorial thing.

[691] Almost like a, you know, it's like a witch trial mindset.

[692] So the unique thing here is there's something about social media in those years that a small number of people can sort of be catalyst for this division.

[693] They can start the viral wave that leads to a division that's different than the kind of division we saw before.

[694] It's a little different than a viral wave.

[695] Once you get some people who can use social media to intimidate, you get a sudden phase shift.

[696] You get a big change in the dynamics of groups.

[697] And that's the heart of the article.

[698] This isn't just another article about how social media is polarizing us and destroying democracy.

[699] The heart of the article is an analysis of what makes groups smart and what makes them stupid.

[700] And so, because, as we said earlier, you know, my own research is on post hoc reasoning, post hoc justification, rationalization.

[701] The only cure for that is other people who don't share your biases.

[702] And so if you have an academic debate, it's like the one I'm having with these other researchers over social media, you know, I write something, they write something.

[703] I have to take account of their arguments and they have to take account of mine.

[704] When the academic world works, it's because it puts us together in ways that things cancel out.

[705] That's what makes universities smart, what makes them generators of knowledge.

[706] Unless we stop dissent.

[707] What if we say on these topics, there can be no dissent?

[708] And if anyone says otherwise, if any academic comes up with research that says otherwise, we're going to destroy them.

[709] And if any academic even tweets a study contradicting what is the official word, we're going to destroy them.

[710] And that was the famous case of David Shor, who, in the days after George Floyd was killed and there were protests, and the question is, are these protests going to be productive?

[711] Are they going to backfire?

[712] Now, most of them were peaceful, but some were violent.

[713] and he tweeted a study.

[714] He just simply tweeted a study done by an African-American, I think sociologist, at Princeton, Omar Wasow.

[715] And Wasow's study showed that when you look back at the 60s, you see that violent protests tended to backfire, peaceful protests tended to work.

[716] And so he simply tweeted that study.

[717] And there was a Twitter mob after him.

[718] This was insensitive.

[719] This was anti -black, I think he was accused of.

[720] And he was fired within a day or two.

[721] So this is the kind of dynamic that is not caused by cable TV.

[723] This is not caused.

[724] This is something new.

[725] Can I just, on a small tangent there, in that situation, because it happens time and time again, you highlight it in your current work, but also in The Coddling of the American Mind: is the blame on the mob, the mechanisms that enable the mob, or the people that do the firing?

[726] The administration does the firing.

[727] It's all of them.

[728] Well, can I, I sometimes feel that we don't put enough blame on the people that do the firing, which is, that feels like, in the long arc of human history, that is the place for courage and for ideals, right?

[729] That's where it stops.

[730] That's where the buck stops.

[731] So if there's going to be new mechanisms for mobs and all that kind of stuff, there's going to be tribalism.

[732] But at the end of the day, that's what it means to be a leader is to stop, stop the mob at the door.

[733] But I'm a social psychologist.

[734] Which means I look at the social forces at work on people.

[735] And if you show me a situation in which 95% of the people behave one way, and it's a way that we find surprising and shameful, I'm not going to say, wow, 95% of the people are shameful.

[736] I'm going to say, wow, what a powerful situation.

[737] We've got to change that situation.

[738] So that's what I think is happening here, because there are hardly any, in the first few years, you know, it begins around 2018, 2019, and it really enters the corporate world.

[739] there are hardly any leaders who stood up against it.

[740] But I've talked to a lot, and it's always the same thing.

[741] You have these, you know, people in their, usually in their 50s or 60s, generally they're progressive or on the left.

[742] They're accused of things by their young employees.

[743] They don't have the vocabulary to stand up to it, and they give in very quickly.

[744] And because it happens over and over again, and there's only a few examples of university presidents who said, like, no, we're not going to stop this talk just because you're freaking out, no, we're not going to fire this professor because he wrote a paper that you don't like.

[745] There are so few examples, I have to conclude that the situational forces are so strong.

[746] Now, I think we are seeing a reversal in the last few weeks or months.

[747] A clear sign of that is that the New York Times actually came out with an editorial from the editorial board saying that free speech is important.

[748] Now, that's amazing that the Times had the guts to stand up for free speech because, you know, they're the people, well, what's been happening with the Times is that they've allowed Twitter to become the editorial board.

[749] Twitter has control over the New York Times, and the New York Times literally will change articles.

[750] I have an essay in Politico with Nadine Strossen, Steven Pinker, and Pamela Paresky on how the New York Times retracted and changed an editorial by Bret Stephens, and they did it in a sneaky way and they lied about it.

[751] And they did this out of fear because he mentioned IQ.

[752] He mentioned IQ and Jews.

[753] And then he went on to say it probably isn't a genetic thing.

[754] It's probably cultural.

[755] He mentioned it.

[756] And the New York Times, I mean, they were really cowardly.

[757] Now, I think they, from what I hear, they know that they were cowardly.

[758] They know that they should not have fired James Bennet.

[759] They know that they gave into the mob.

[760] And that's why they're now poking their head up above the parapet.

[761] And they're saying, oh, we think that free speech is important.

[762] And then, of course, they got their heads blown off because Twitter reacted like, how dare you say this?

[763] Are you saying racist speech is okay?

[764] But they didn't back down.

[765] They didn't retract it.

[766] They didn't apologize for defending free speech.

[767] So I think the Times might be coming back.

[768] Can I ask your opinion on something here?

[769] What, in terms of the Times coming back, in terms of Twitter, being the editorial board for prestigious journalistic organizations, what's the importance of the role of Mr. Elon Musk in this?

[770] So, you know, it's all fun and games, but here's a human who tweets about the importance of freedom of speech and buys Twitter.

[771] What are your thoughts on the influence, the positive and the negative possible consequences of this particular action?

[772] So, you know, if he is going to succeed in, if he's going to be one of the major reasons why we decarbonize quickly and why we get to Mars, then I'm willing to cut him a lot of slack.

[773] So I have an overall positive view of him.

[774] Now, where I'm concerned and where I'm critical is we're in the middle of a raging culture war.

[775] And this culture war is making our institutions stupid.

[776] It's making them fail.

[777] This culture war, I think, could destroy our country.

[778] And by destroy it, I mean, we could descend into constant constitutional crises, a lot more violence.

[779] You know, not that we're going to disappear, not that we're going to kill each other, but I think there will be a lot more violence.

[780] So we're in the middle of this raging culture war.

[781] it's possibly turning to violence, you need to not add fuel to the fire.

[782] And the fact that he declared that he's going to be a Republican and the Democrats are the bad party.

[783] And, you know, as an individual citizen, he's entitled to his opinion, of course, but as an influential citizen, he should at least be thoughtful.

[784] And more importantly, companies need, and I think would benefit from, a Geneva Convention for the culture war, because they're all being damaged by the culture war coming into the companies.

[785] What we need to get to, I hope, is a place where companies do, they have strong ethical obligations about the effects that they cause, about how they treat their employees, about their supply chain, they have strong ethical obligations, but they should not be weighing in on culture war issues.

[786] Well, if I can read the exact tweet, because part of the tweet I like. He says, in the past I voted Democrat because they were mostly the kindness party, but they have become the party of division and hate, so I can no longer support them and will vote Republican. And then he finishes with, now watch their dirty tricks campaign against me unfold. Okay, what do you make of that? Like, what do you think he was thinking, that he came out so blatantly as a partisan? Because he's probably communicating with the board, with the people inside Twitter, and he's clearly seeing the lean, and he's responding to that lean. He's also opening the door to potentially bringing back the former president onto the platform. And also, he's probably looking at the numbers of the people who are behind Truth Social, saying that, okay, it seems that there's a strong lean in Twitter in terms of the left. And in fact, from what I see, it seems like the current operation of Twitter is: the extremes of the left get outraged by something, and the extremes of the right point out how the left is ridiculous.

[787] Like, that seems to be the mechanism.

[788] And that's the source of the drama, and then the left gets very mad at the right that points out the ridiculousness, and there's this vicious kind of cycle.

[789] Exactly.

[790] That's the polarization cycle.

[791] That's what we're in.

[792] There's something that happened here. There's a shift, where there's a decline, I would say, in both parties, towards being shitty.

[793] Look, look, whatever's going on with the parties, that's not the issue.

[794] The issue is, should the most important CEO in America, the CEO of some of our biggest and most important companies.

[795] So let's imagine five years from now, two different worlds.

[796] In one world, the CEO of every Fortune 500 company has said, I'm a Republican because I hate those douchebags, or I'm a Democrat because I hate those Nazi racists.

[797] That's one world where everybody puts up a thing in their window, everybody, it's culture war everywhere, all the time, 24 hours a day.

[798] You pick a doctor based on whether he's red or blue.

[799] Everything is culture war.

[800] That's one possible future, which we're headed towards.

[801] The other is we say, you know what, political conflict should be confined to political spaces.

[802] There is room for protest, but you don't go protesting at people's private homes.

[803] You don't go threatening their children.

[804] You don't go doxing them.

[805] We have to have channels that are not.

[806] culture war all the time.

[807] When you go shopping, when you go to a restaurant, you shouldn't be yelled at and screamed at.

[808] When you buy a product, you should be able to buy products from an excellent company.

[809] You shouldn't have to always think, what's the CEO?

[810] I mean, what an insane world, but that's where we're heading.

[811] So I think that Elon did a really bad thing in launching that tweet.

[812] That was, I think, really throwing fuel on a fire and setting a norm in which businesses are going to get even more politicized than they are.

[813] And you're saying specifically the problem was that he picked the side.

[814] As the head of, yes, as the CEO, as a head of several major companies, of, you know, of course we can find out what his views are.

[815] You know, it's not like, I mean, actually, with him, it's maybe hard to know, but, you know, it's not that a CEO can't be a partisan or have views, but to publicly declare it in that way, in such a really insulting way, this is throwing fuel on the fire, and it's setting a precedent that corporations are major players in the culture war.

[816] I'm trying to reverse that.

[817] We've got to pull back from that.

[818] Let me play devil's advocate here.

[819] So, because I've gotten a chance to interact with quite a few CEOs, there is also a value for authenticity.

[820] So I'm guessing this was written while sitting on the toilet.

[821] And I could see, in a day from now, him saying, LOL, just kidding.

[822] There's a, there's a humor.

[823] There's a lightness.

[824] There's a chaos element.

[825] and that chaos is not...

[826] Yeah, that's not what we need right now.

[827] We don't need more chaos.

[828] Well, so yes, there's a balance here.

[829] The chaos isn't engineered chaos.

[830] It's really authentically who he is.

[831] And I would like to say that there's, I agree with that.

[832] That's a trade-off because if you become a politician, so there's a trade-off between, in this case, maybe authenticity and civility, maybe, like being calculating about the impact you have with your words.

[833] versus just being yourself.

[834] And I'm not sure. Calculating is also a slippery slope.

[835] Both are slippery slopes.

[836] You have to be careful.

[837] So when we have conversations in a vacuum, and we just say like what should a person do, those are very hard.

[838] But our world is actually structured into domains and institutions.

[839] And if it's just like, oh, you know, talking here among our friends, like we should be authentic, sure.

[840] But the CEO of a company has fiduciary duties, legal fiduciary duties to the company.

[841] he owes loyalty to the company.

[842] And if he is using the company for his own political gain or other purposes or a social standing, that's a violation of his fiduciary duty to the company.

[843] Now, there's debate among scholars whether your fiduciary duties are to the shareholders.

[844] I don't think it's the shareholders.

[845] I think many legal experts say the company is a legal person.

[846] You have duties to the company.

[847] Employees owe a duty to the company.

[848] So he's got those duties.

[849] And I think he, you know, you can say he's being authentic, but he's also violating those duties.

[850] So it's not necessarily he's violating a law by doing it, but he certainly is shredding any notion of professional ethics around leadership of a company in the modern age.

[851] I think you have to take it in the full context because you see that he's not being a political player.

[852] He's just saying, quit being douchey.

[853] Suppose the CEO of Ford says, you know what, let's pick a group.

[854] I shouldn't do a racial group because that would be different.

[855] Let's just say, you know what, left-handed people are douche people.

[856] I hate them.

[857] Like, why would you say that?

[858] No, no, no. Let's talk about left-handed people.

[859] What you said now is not either funny or light-hearted, because "I hate them."

[860] It wasn't funny.

[861] I'm not picking on you.

[862] I'm saying that statement.

[863] Words matter.

[864] There's a lightness to the statement in the full context.

[865] If you look at the timeline of the man, there's ridiculous memes and there's non-stop jokes.

[866] My big problem with the CEO of Ford is there's never any of that.

[867] Not only is there none of that, there's not a celebration of the beauty of the engineering of the different products.

[868] It's all political speak, channeled through multiple meetings of PR.

[869] There's levels upon levels upon levels where you think that it's really not authentic.

[870] And there you're actually, by being polite, by being civil, you're actually playing politics.

[871] Because all of your actual political decision -making is done in the back channels.

[872] That's obvious.

[873] Here, here's a human being authentic and actually struggling with some of the ideas and having fun with it.

[874] I think this lightness represents the kind of positivity that we should be striving for.

[875] It's funny to say that because you're looking at these statements and they seem negative, but in the full context of social media, I don't know if they are.

[876] But look at what you just said.

[877] In the full context, you're taking his tweet, tweets in context.

[878] You know who doesn't do that?

[879] Twitter.

[880] Like, that's the Twitter.

[881] The rule of Twitter is there is no context.

[882] Everything is taken in the maximum possible way.

[883] There is no context.

[884] So this is not like, you know, yes, I wish we did take people in context.

[885] I wish we lived in that world.

[886] But now that we have Twitter and Facebook, we don't live in that world anymore.

[887] So you're saying there is a bit of responsibility for people with a large platform to consider the fact that there is this fundamental mechanism of Twitter, where people don't give you the benefit of the doubt.

[888] Well, I don't want to hang it on a large platform, because then that's what a lot of people say, like, well, you know, she shouldn't say that because she has a large platform, and she should say things that agree with my politics.

[889] I don't want to hang it on large platform.

[890] I want to hang it on CEO of a company.

[891] CEOs of a company have duties and responsibilities.

[892] And, you know, Scott Galloway, I think is very clear about this.

[893] He criticized Elon a lot as being a really bad role model for young men.

[894] Young men need role models, and he is a very appealing, attractive role model.

[895] So I agree with you, but in terms of being a role model, I think, I don't want to put a responsibility on people, but yes, he could be a much, much better role model.

[896] Yeah, I mean, to insult sitting senators by calling them old.

[897] I mean, that's, you know.

[898] Yeah.

[899] I won't do both-sidesism of, like, well, those senators can be assholes, too.

[900] Yeah, yes, yes, yes.

[901] Fair enough.

[902] Respond intelligently, as I tweeted, to unintelligent treatment.

[903] Yes.

[904] Yes.

[905] Yes.

[906] So the reason I like, and he's now a friend, the reason I like Elon is because of the engineering, because of the work he does.

[907] No, I admire him enormously for that.

[908] But what I admire on the Twitter side is the authenticity, because I've been a little bit jaded and worn out by people who have built up walls, people in the position of power, the CEOs and the politicians who built up walls and you don't see the real person.

[909] That's one of the reasons I love long-form podcasting, especially if you talk more than 10 minutes.

[910] It's hard to have a wall up.

[911] It all kind of crumbles away.

[912] So I don't know, but yes, yes, you're right.

[913] That is a step backwards to say, at least to me the biggest problem is to pick sides, to say, I'm not going to vote this way or that way.

[914] That's, like, leave that to the politicians.

[915] You have much, like, the importance of social media is far bigger.

[916] than the bickering, the short-term bickering of any one political party.

[917] It's a platform where we make progress, where we develop ideas through sort of rigorous discourse, all those kinds of things.

[918] So, okay, so here's an idea about social media, developed through social media, from Elon, which is, you know, everyone freaks out because they think, either, you know, oh, he's going to do less content moderation.

[919] The left is freaking out because they want more content moderation.

[920] the right is celebrating because they think the people doing the content moderation are on the left.

[921] But there was one, I think it was a tweet, where he said three things he was going to do to make it better.

[922] And so defeat the bots or something.

[923] But he said, authenticate all humans.

[924] And this is a hugely important statement.

[925] And it's pretty powerful that this guy can put three words in a tweet.

[926] And actually, I think this could change the world.

[927] Even if the bid fails, the fact that Elon said that, that he thinks we need to authenticate all humans is huge.

[928] because now we're talking about solutions here. What can we do to make social media a better place for democracy, a place that actually makes democracy better? As Tristan Harris has pointed out, with social media and digital technology, the Chinese are using it really skillfully to make a better authoritarian nation. And by better, I don't mean morally better, I mean more stable, successful. Whereas we're using it to make ourselves weaker, more fragmented, and more insane. So we're on the way down. We're in big trouble.

[929] And all the argument is about content moderation.

[930] And what we learned from Frances Haugen is that, what, five or ten percent of what they might call hate speech gets caught, one percent of violence and intimidation. Content moderation, even if we do a lot more of it, isn't going to make a big difference.

[931] All the power is in the dynamics, the changes to the architecture.

[932] And as I said in my Atlantic article, what are the reforms that would matter for social media?

[933] And the number one thing I said, The number one thing I believe is user authentication or user verification.

[934] And people freak out and they say like, you know, oh, but we need anonymous.

[935] Like, yeah, fine, you can be anonymous.

[936] But what I think needs to be done is anyone can open an account on, you know, Twitter, Facebook, whatever, as long as you're over 16, and that's another piece.

[937] Once you're 16 or 18, at a certain age, you can be treated like an adult.

[938] You can open an account and you can look, you can read, and you can make up whatever fake name you want.

[939] But if you want to post, if you want the viral amplification on a company that has Section 230 protection from lawsuits, which is a very special privilege, I understand the need for it, but it's an incredibly powerful privilege to protect them from lawsuits.

[940] If you want to be able to post on platforms that, as we'll get to in the Google Doc, there's a lot of evidence are undermining and damaging democracy, then there's a minimal responsibility the company has to meet.

[941] Banks have know your customer laws.

[942] You can't just walk up to a bank with a bag of money that you stole and say, here, deposit this for me. My name's John Smith.

[943] You have to actually show who you are.

[944] And the bank isn't going to announce who you are publicly, but you have to, if they're going to do business with you, they need to know you're a real person, not a criminal.

[945] And so there's a lot of schemes for how to do this.

[946] There's multiple levels.

[947] People don't seem to understand this.

[948] Level zero of authentication is nothing.

[949] That's what we have now.

[950] Level one, this might be what Elon meant.

[951] authenticate all humans, meaning you have to at least pass a CAPTCHA or some test to show you're not a bot.

[952] There's no identity, there's nothing, just something that, you know, it's a constant cat and mouse struggle between bots and humans.

[953] So we try to just filter out pure bots.

[954] The next level up, there are a variety of schemes that allow you to authenticate identity in ways that are not traceable or kept.

[955] So, whether you show an ID, whether you use biometrics, whether you have something on the blockchain that establishes identity, whether it's linked to a phone, whatever it is.

[956] There are multiple schemes now that companies have figured out for how to do this.

[957] And so if you did that, then in order to get an account where you have posting privileges on Facebook or Twitter or TikTok or whatever, you have to at least do that.
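As a rough illustration, the tiers described here could be modeled as a simple posting gate: anyone can read under a pseudonym, but posting requires at least a proof-of-humanness check. This is only a sketch of the idea; the level names and rules are illustrative, not any platform's actual system.

```python
from enum import IntEnum

class VerificationLevel(IntEnum):
    # Illustrative tiers matching the levels described above.
    NONE = 0      # level zero: no check at all (the status quo)
    HUMAN = 1     # level one: passed a CAPTCHA-style "not a bot" test
    IDENTITY = 2  # next level: identity established (ID, biometrics, phone)

class Account:
    def __init__(self, name: str, level: VerificationLevel):
        self.name = name   # can be any pseudonym; identity stays private
        self.level = level

def can_read(account: Account) -> bool:
    # Anyone can open an account, browse, and read.
    return True

def can_post(account: Account) -> bool:
    # Posting (viral amplification) requires at least proof of humanness;
    # a stricter platform might require IDENTITY instead.
    return account.level >= VerificationLevel.HUMAN

anon = Account("wanderer42", VerificationLevel.NONE)
human = Account("wanderer42", VerificationLevel.HUMAN)
print(can_read(anon), can_post(anon))  # reading is open, posting is not
print(can_post(human))                 # verified humans can post
```

The point of the gate is exactly the asymmetry Haidt describes: anonymity for reading, a minimal bar for amplification.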

[958] And if you do that, you know, now the other people are real humans too.

[959] And suddenly our public square is a lot nicer because you don't have bots swarming around.

[960] This would also cut down on trolls.

[961] You'd still have trolls who use their real name.

[962] But this would just make it a little scarier for trolls.

[963] Some men turn into complete assholes.

[964] They can be very polite in real life.

[965] But some men, as soon as they have the anonymity, they start using racial slurs.

[966] They're horrible.

[967] One troll can ruin thousands of people's day.

[968] You know, I'm somebody who believes in free speech.

[969] And so there's been a lot of discussions about this.

[970] And we'll ask you some questions about this too.

[971] But there is the tension. There is the power of a troll to ruin the party. Yeah, that's right. So, like, this idea of free speech, boy, do you have to also consider, if you want to have a private party and enjoy your time: challenging, lots of disagreement, debate, all that kind of stuff, but fun. No, like, annoying person screaming, not just disagreeing, but just, like, spilling the drinks all over the place.

[972] Yeah, all that kind of stuff.

[973] So, see, you're saying it's a step in the right direction to at least verify the humanness of a person while maintaining anonymity.

[974] So that's one step, but the further step, that maybe doesn't go all the way because you can still figure out ways to create multiple accounts and you can...

[975] But it's a lot harder.

[976] So actually, there's a lot of ways to do this.

[977] There's a lot of creativity out there about solving this problem.

[978] So if you go to the Social Media and Political Dysfunction Google Doc that I created with Chris Bail, and then you go to Section 11, proposals for improving social media.

[979] So we're collecting there now some of the ideas for how to do user authentication.

[980] And so one is Worldcoin.

[981] There's one, human-id.org.

[982] This is a new organization created by an NYU Stern student who just came into my office last week, working with some other people.

[983] And what they do here is they have a method of identity verification that is keyed to your phone.

[984] So you do have to have a phone number.

[985] And of course, you can buy seven different phone numbers if you want, but it's going to be about $20 or $30 a month.

[986] So nobody's going to buy 1,000 phones.

[987] So yeah, you know, you can have more than one unique ID, but most people have just one, and nobody has a thousand.

[988] So there are just things like this that would make an enormous difference.
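A toy sketch of how a phone-keyed scheme like that might enforce one verified account per number, storing only a salted hash rather than the number itself. This is a guess at the general mechanism, not human-id.org's actual protocol.

```python
import hashlib

class PhoneKeyedRegistry:
    """Toy registry: at most one verified account per phone number.
    The platform stores only a salted hash, never the raw number."""
    def __init__(self, salt: str):
        self.salt = salt
        self.claimed: set[str] = set()

    def _token(self, phone: str) -> str:
        # One-way token: the registry can detect reuse of a number
        # without being able to recover the number itself.
        return hashlib.sha256((self.salt + phone).encode()).hexdigest()

    def register(self, phone: str) -> bool:
        token = self._token(phone)
        if token in self.claimed:
            return False  # this number already backs an account
        self.claimed.add(token)
        return True

reg = PhoneKeyedRegistry(salt="demo-salt")
print(reg.register("+1-555-0100"))  # True: first claim succeeds
print(reg.register("+1-555-0100"))  # False: duplicate rejected
```

Since a second number costs real money each month, the cost of fake accounts scales with their count, which is the deterrent being described.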

[989] So here's the way that I think about it.

[990] Imagine a public square in which the incentives are to be an asshole, that the more you kick people in the shins and spit on them and throw things at them, the more people applaud you.

[991] Okay, so that's the public square we have now.

[992] Not for most people, but as you said, just, you know, one troll can ruin it for everybody.

[993] If there's a thousand of us in the public square and 10 are incentivized to, you know, kick us and throw shit at us, like, it's no fun to be in that public square.

[994] So right now, I think Twitter in particular is making our public square much worse.

[995] It's making our democracy much weaker, much more divided, it's bringing us down.

[996] Imagine if we change the incentives.

[997] Imagine if the incentive was to be constructive.

[998] And so this is an idea that I've been kicking around.

[999] I talked about with Reid Hoffman last week, and he seemed to think it's a good idea.

[1000] And it would be very easy to, rather than trying to focus on posts, which post is fake or whatever, focus on users: which users are incredibly aggressive?

[1001] And so, people who just use a lot of obscenity and exclamation points.

[1002] AI could easily code nastiness or just aggression, hostility.

[1003] And imagine if every user is rated on a one to five scale for that.

[1004] And the default, when you open an account on Twitter or Facebook, the default is four.

[1005] You will see everybody who's a four and below.

[1006] But you won't even see the people who are fives, and they don't get to see you.

[1007] So they can say what they want, free speech.

[1008] We're not censoring them.

[1009] They can say what they want.

[1010] But now there's actually an incentive to not be an asshole.

[1011] Because the more of an asshole you are, the more people block you out.
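A minimal sketch of that default-threshold idea. The scores and the threshold of four are from the conversation; the per-user averaging and the sample numbers are illustrative, and a real system would get per-post scores from a classifier rather than having them handed in.

```python
from statistics import mean

def user_aggression(post_scores: list[float]) -> float:
    """Average per-post aggression (1 = civil, 5 = hostile).
    Averaging over many posts means one misrated post barely
    moves a user's overall rating."""
    return mean(post_scores)

def visible_feed(posts, ratings, threshold=4.0):
    # Default account setting: you see everybody rated at or below 4.
    # Fives can still say what they want -- they just are not shown.
    return [(author, text) for author, text in posts
            if ratings[author] <= threshold]

ratings = {
    "alice": user_aggression([1, 2, 1]),    # ~1.3: constructive
    "troll": user_aggression([5, 5, 4.8]),  # ~4.9: hidden by default
}
posts = [("alice", "Interesting point!"), ("troll", "You idiot!!!")]
print(visible_feed(posts, ratings))  # only alice's post survives
```

Nothing here is censorship in the removal sense; the incentive comes purely from the default visibility setting.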

[1012] So imagine our country goes in two directions.

[1013] In one, things continue to deteriorate, and we have no way to have a public square

[1014] in which we could actually talk about things.

[1015] And in the other, we actually try to disincentivize being an asshole and encourage being constructive.

[1016] What do you think?

[1017] Well, this is because I'm an AI person, and I very much, ever since I talked to Jack about the health of conversation, I've been looking at a lot of the machine learning models involved.

[1018] And I believe that nastiness classification is a difficult problem to do automatically.

[1019] I'm sure it is.

[1020] So I personally believe in crowdsourcing

[1021] this labeling, but in an objective way, where it doesn't become viral mob cancellation type of dynamics, but more sort of objectively, almost out of context, with only local context: is this a shitty thing to say at this moment?

[1022] Because here's the thing.

[1023] Well, wait, no, but we don't care about individual posts.

[1024] No, no, but all that matters is the average.

[1025] The posts make the man. They do, but the point is, as long as we're talking about averages here.

[1026] If one person has a misclassified post, it doesn't matter.

[1027] Right, right.

[1028] Yeah, yeah.

[1029] So, but you need to classify posts in order to build up the average.

[1030] That's what I mean.

[1031] So I really like that idea, the high level idea of incentivizing people to be less shitty.

[1032] Yeah.

[1033] Because that's the incentive we have in real life.

[1034] Yeah, that's right.

[1035] It's actually really painful to be a full -time asshole, I think, in physical reality.

[1036] That's right.

[1037] You'd be cut off.

[1038] It should also be painful to be an asshole on the internet.

[1039] There could be different mechanisms for that.

[1040] I wish AI was there.

[1041] I wish machine learning models were there.

[1042] They just aren't yet.

[1043] But how about this?

[1044] So one track is we have AI machine learning models and they render a verdict.

[1045] Another track is crowdsourcing.

[1046] And then whenever the two disagree, you have, you know, staff at Twitter or whatever, you know, they look at it.

[1047] And they say, what's going on here?

[1048] And that way you can refine both the AI and you can refine whatever the algorithms are for the crowdsourcing.

[1049] And because, of course, that can be gamed, and people can say, hey, let's all rate this guy as really aggressive.

[1050] So you wouldn't want just to rely on one single track.

[1051] But if you have two tracks, I think you could do it.
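The two-track scheme could be sketched like this, where an ML model and a crowdsourced panel each rate on the same scale and disagreements escalate to human staff. The 1-to-5 scale matches the earlier discussion; the tolerance value is made up for illustration.

```python
def moderate(ai_score: float, crowd_score: float,
             tolerance: float = 1.0) -> str:
    """Two-track rating: an ML model and a crowdsourced panel each
    rate a user's aggression on a 1-5 scale. When they agree (within
    `tolerance`), use their average; when they disagree, escalate to
    human review, which also yields training data for both tracks."""
    if abs(ai_score - crowd_score) <= tolerance:
        return f"rating:{(ai_score + crowd_score) / 2:.1f}"
    return "escalate-to-human-review"

print(moderate(4.0, 4.6))  # tracks agree -> combined rating
print(moderate(1.5, 4.9))  # tracks disagree -> human review
```

Requiring two independent tracks to agree is what makes either one harder to game on its own, which is the point being made here.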

[1052] What do you think about this word misinformation that maybe connects to our two discussions now?

[1053] So one is the discussion of social media and democracy.

[1054] And then the other is the coddling of the American mind.

[1055] I've seen the word misinformation misused or used.

as a bullying word, like racism and so on, which are important concepts to identify, but are nevertheless often overused.

[1057] Yes.

[1058] Does that worry you?

[1059] Because that seems to be the mechanism from inside Twitter, from inside Facebook, to label information you don't like versus information that's actually fundamentally harmful to society.

[1060] Yeah.

[1061] So I think there is there is a meaning of disinformation that is very useful and helpful, which is when you have a concerted campaign by Russian agents to plant a story and spread it, and they've been doing that since the 50s or 40s even.

[1062] That's what this podcast actually is.

[1063] It's a disinformation campaign by the Russians.

[1064] Yeah, you seem really Soviet to me, buddy.

[1065] It's subtle.

[1066] It's between the lines.

[1067] Okay, I'm sorry.

[1068] But so I think to the extent that there are campaigns by either foreign agents or, you know, just by the Republican or Democratic parties, there have been examples of that.

[1069] There are all kinds of concerted campaigns that are intending to confuse or spread lies.

[1070] This is the Soviet, the fire hose of falsehood tactic.

[1071] So it's very useful for that.

[1072] All the companies need to have pretty large staffs, I think, to deal with that because that will always be there.

[1073] And that is really bad for our country.

[1074] So Renée DiResta is just brilliant on this.

[1075] Reading her work has really frightened me and opened my eyes about how easy it is to manipulate and spread misinformation, and especially polarization.

[1076] The Russians have been trying since the 50s, they would come to America and they would do hate crimes.

[1077] They would spray swastikas on synagogues, and they'd spray anti-Black slurs.

[1078] They try to make Americans feel that they're as divided as possible.

[1079] Most of the debate nowadays, however, is not that.

[1080] It seems to be people are talking about what the other side is saying.

[1081] So, you know, if you're on the right, then you're very conscious of the times when, well, you know, the left wouldn't let us even say could COVID be from a lab?

[1082] Like, you literally would get shut down for saying that.

[1083] And it turns out, well, we don't know if it's true.

[1084] But there's at least a real likelihood that it is.

[1085] And it certainly is something that should have been talked about.

[1086] So I tend to stay away from any such discussions.

[1087] and the reason is twofold.

[1088] One is because they're almost entirely partisan.

[1089] It generally is, each side

[1090] thinks what the other side is saying is misinformation or disinformation, and they can point to certain examples.

[1091] So we're not going to get anywhere on that.

[1092] We certainly are never going to get 60 votes in the Senate for anything about that.

[1093] I don't think content moderation is nearly as important as people think.

[1094] It has to be done and it can be improved.

[1095] Almost all the action is in the dynamics, the architecture, the virality.

[1096] and then the nature of who is on the platform, unverified people, and how much amplification they get.

[1097] That's what we should be looking at, rather than wasting so much of our breath on whether we're going to do a little more or a little less content moderation.

[1098] So the true harm to society, on average and over the long term, is in the dynamics of social media, not in the subtle choices of content moderation, a.k.a. censorship.

[1099] Exactly.

[1100] There've always been conspiracy theories.

[1101] You know, the Turner Diaries is this book written in 1978.

[1102] It introduced the replacement theory to a lot of people.

[1103] Timothy McVeigh had it on him when he was captured in 1995 after the Oklahoma City bombing.

[1104] It's kind of a Bible of that fringe, violent, racist, white supremacist group.

[1105] And that, so, you know, the killer in Buffalo was well acquainted with these ideas.

[1106] They've, you know, they've been around.

[1107] But, you know, this guy's from a small town.

[1108] I forget where he's from.

[1109] You know, but he was, and as he says in his manifesto, he was entirely influenced by things he found online.

[1110] He was not influenced by anyone he met in person.

[1111] Ideas spread and communities can form, these like micro communities, can form with bizarre and twisted beliefs.

[1112] And this is, again, back to the Atlantic article.

[1113] I've got this amazing quote from Martin Gurri.

[1114] Let me just find it.

[1115] But he, Martin Gurri, he was a former CIA analyst, wrote this brilliant book called The Revolt of the Public, and he has this great quote.

[1116] He says, he talks about how in the age of mass media, we were all, in a sense, looking at a mirror, looking back at us.

[1117] And it might have been a distorted mirror, but we had stories in common, we had facts in common.

[1118] It was mass media.

[1119] And he describes how the flood of information with the internet is like a tsunami washing over.

[1120] It has all kinds of effects.

[1121] And he says, this is in a comment in an interview in Vox.

[1122] He says, the digital revolution has shattered that mirror, and now the public inhabits those broken pieces of glass.

[1123] So the public isn't one thing.

[1124] It's highly fragmented, and it's basically mutually hostile.

[1125] It's mostly people yelling at each other and living in bubbles of one sort or another.

[1126] And so, you know, we now see clearly there's this little bubble of just bizarre, you know, nastiness, in which, you know, the killer in Christchurch and the killer in Norway and now in Buffalo, you know, they're all plugged into a community, and posts flow up within that community by a certain dynamic.

[1127] So we can never stamp those words or ideas out.

[1128] The question is not, can we stop these from existing?

[1129] The question is, what are the platforms by which they spread all over

[1130] the world and into every little town, so that whatever small percentage of young men are vulnerable to this get exposed to it.

[1131] It's in the dynamics and the architecture.

[1132] It's a fascinating point to think about, because we often debate and think about the content moderation, the censorship, the ideas of free speech, but you're saying, yes, that's important to talk about, but much more important is fixing the dynamics.

[1133] That's right, because everyone thinks, if there's regulation, it means censorship. At least people on the right think regulation equals censorship.

[1134] And I'm trying to say, no, no, that's only if all we talk about is content moderation.

[1135] Well, then yes, that is the framework.

[1136] You know, how much or how little do we, you know.

[1137] But I don't even want to talk about that because all the action is in the dynamics.

[1138] That's the point of my article.

[1139] It's that the architecture changed and our social world went insane.

[1140] So can you try to steelman the other side?

[1141] So the people that might say that social media is good for society overall, both in the dimension of mental health, as Mark said, for teenagers, teenage girls, and for our democracy. Yes, there's a lot of negative things, but that's slices of data. If you look at the whole, which is difficult to measure, it's actually good for society, and to the degree that it's not good, it's getting better and better. Is it possible to steelman their point? Yeah, it's hard, but I should be able to do it. I need to put my money where my mouth is. That's a good question.

[1142] So on the mental health front, you know, the argument they usually make is, well, you know, for communities that are cut off, especially LGBTQ kids, they can find each other.

[1143] So it's, it connects kids, especially kids who wouldn't find connection otherwise.

[1144] It exposes you to a range of ideas and content.

[1145] And it's fun.

[1146] In the studies you looked at, are there inklings of data, maybe early data, that show positive effects in terms of self-report data? Or how would you measure positive effects behaviorally?

[1147] It's difficult.

[1148] Right.

[1149] So if you look at how do you feel when you're on the platform, you get a mix of positive and negative, and people say they feel supported.

[1150] And this is what Mark was referring to when he said, you know, there were like 18 criteria, and on most it was positive and on some it was negative.

[1151] So if you look at how do you feel while you're using the platform?

[1152] You know, look, most kids enjoy it.

[1153] They're having fun.

[1154] But some kids are feeling inferior, cut off, bullied.

[1155] So if we're saying what's the average experience on the platform, that might actually be positive.

[1156] If we just measured the hedonics, like how much fun versus fear is there.

[1157] It could well be positive.

[1158] But what I'm trying to, okay, so is that enough steelmanning?

[1159] Can I, that's pretty good.

[1160] You held your breath.

[1161] Yeah.

[1162] But what I'm trying to point out is this isn't a dose response sugar thing.

[1163] Like, how do you feel while you're consuming heroin?

[1164] Like, while I'm consuming heroin, I feel great.

[1165] But am I glad that heroin came into my life?

[1166] Am I glad that everyone in my seventh grade class is on heroin?

[1167] Like, no, I'm not.

[1168] Like, I wish that people weren't on heroin and they could play on the playground.

[1169] But instead, they're just, you know, sitting on the bench shooting up during recess.

[1170] So when you look at it as an emergent phenomenon, a changed childhood, it doesn't matter what the feelings are while you're actually using it.

[1171] We need to zoom out and say, how has this changed childhood?

[1172] Can you try to do the same for democracy?

[1173] Yeah.

[1174] So we can go back to, you know, what Mark said in 2012 when he was taking Facebook public, and, you know, this is in the wake of the Arab Spring.

[1175] I think people really have to remember what an extraordinary year 2011 was.

[1176] It starts with the Arab Spring.

[1177] dictators are pulled down.

[1178] Now, people say, you know, Facebook took them down.

[1179] I mean, of course, it was the citizens, the people themselves, who took down dictators, aided by Facebook and Twitter, and I don't know if it was texting.

[1180] There were some other platforms they used.

[1181] So the argument that Mark makes in this letter to potential shareholders or investors is, you know, we're at a turning point in history and, you know, social media is rewiring.

[1182] We're giving people the tools to rewire their institutions.

[1183] So this all sounds great.

[1184] Like this is the democratic dream.

[1185] And what I read about in the essay is the period of techno -democratic optimism, which began in the early 90s with the fall of the Iron Curtain and the Soviet Union.

[1186] And then the internet comes in.

[1187] And, you know, people my age remember how extraordinary it was.

[1188] How much fun it was.

[1189] I mean, the sense that this was the dawning of a new age.

[1190] And there was so much optimism.

[1191] And so this optimism runs all the way.

[1192] from the early 90s, all the way through 2011, with the Arab Spring.

[1193] And, of course, that year ends with Occupy Wall Street.

[1194] And there were also big protest movements in Israel and Spain and a lot of areas.

[1195] Martin Gurri talks about this.

[1196] So there certainly was a case to be made that Facebook in particular, but all these platforms, these were God's gift to democracy.

[1197] What dictator could possibly keep out the Internet?

[1198] What dictator could stand up to people connected on these digital media platforms?

[1199] So that's the strong case that this is going to be good for democracy.

[1200] And then we can see what happened in the years after.

[1201] Now, first of all, so in Mark's response to you, so here, let me read from what he said when you interviewed him.

[1202] He says, I think it's worth grounding this conversation and the actual research that has been done on this, which by and large finds that social media is not a large driver of polarization.

[1203] He says that.

[1204] Then he says, most academic studies that I've seen actually show that social media use is correlated with lower

[1205] polarization.

[1206] That's a factual claim that he makes, which is not true. But, well, actually, wait, it's tricky, because he says the studies he has seen.

[1207] So I can't, so it might be that the studies he has seen say that.

[1208] But if you go to the Google Doc with Chris Bail, you see there's seven different questions that can be addressed.

[1209] And on one of them, which is filter bubbles, the evidence is very mixed.

[1210] And he might be right that Facebook overall doesn't contribute to filter bubbles.

[1211] But on the other six, the evidence is pretty strongly on the yes side.

[1212] It is a cause.

[1213] He also draws a line between the United States versus the rest of the world.

[1214] Right.

[1215] And there's one thing true about that, which is that polarization has been rising much faster in the U.S. than in any other major country.

[1216] So he's right about that.

[1217] So we're talking about an article by Matthew Gentzkow and a few other researchers.

[1218] It's a very important article.

[1219] We've got it in the political dysfunction database.

[1220] And we should say that in this study, there's, like I started to say, there's a lot of fascinating questions, organized by whether studies indicate yes or no. Question one is, does social media make people more angry or affectively polarized?

[1221] Question two is, does social media create echo chambers?

[1222] These are fascinating, really important questions.

[1223] Question three is, does social media amplify posts that are more emotional, inflammatory, or false?

[1224] Question four, does social media increase the probability of violence?

[1225] Question five is, does social media enable foreign governments to increase political dysfunction in the U.S. and other democracies?

[1226] Question six, does social media decrease trust?

[1227] Seven is, does social media strengthen populist movements? And then there's other sections, as you mentioned.

[1228] Yeah, that's right.

[1229] But once you operationalize it as seven different questions, you know, like, so one is about polarization, and there are measures of that, the degree to which people say they hate the other side.

[1230] And so in this study by Boxell, Gentzkow, and Shapiro, 2021, they looked at all the measures of polarization they could find going back to the 1970s for about 20 different countries.

[1231] And they show plots.

[1232] You have these nice plots with red lines showing that in some countries it's going up, like the United States especially, in some countries it's going down, and in some countries it's pretty flat.

[1233] And so Mark says, well, you know, if polarization's going up a lot in the U.S. but not in most other countries, well, maybe Facebook isn't responsible.

[1234] But so much depends on how you operationalize things.

[1235] Are we interested in the straight line, regression line, going back to the 70s?

[1236] And if so, well, then he's right in what he says.

[1237] But that's not the argument.

[1238] The argument isn't that, you know, it's been rising and falling since the 70s.

[1239] The argument is that it's been rising since 2012 or so.

[1240] And for that, now, I've been emailing with the authors of the study, and they say there's not really enough data to do it statistically reliably, because there's only a few observations after 2012.

[1241] But if you look at the graphs in their study, they actually do provide something.

[1242] As they pointed out to me, they do provide a statistical test if you break the data at the year 2000.

[1243] So actually, polarization is going up pretty widely if you just look after 2000, which is when the Internet would be influential.

[1244] And if you look just after 2012, you have to just do it by eye.

[1245] But if you do it on their graphs by eye, you see that actually a number of countries do see a sudden sharp upturn, not all, not all by any means.
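The test being described, breaking the series at a year and asking whether the trend steepens afterward, can be sketched as a piecewise linear regression. This is an illustrative toy on synthetic data, not the Boxell, Gentzkow, and Shapiro analysis; the series, breakpoint, and numbers are all made up.

```python
# Illustrative sketch (not the actual study) of a "break the data at
# a year" test: fit a trend that is allowed to change slope at a
# breakpoint, and inspect the estimated slope change.
import numpy as np

years = np.arange(1980, 2021)
rng = np.random.default_rng(0)
# Synthetic polarization series: flat before 2000, rising after, plus noise.
polarization = 0.02 * np.maximum(years - 2000, 0) + rng.normal(0, 0.01, years.size)

# Design matrix: intercept, overall trend, extra slope after the break.
X = np.column_stack([
    np.ones_like(years, dtype=float),
    (years - years[0]).astype(float),
    np.maximum(years - 2000, 0).astype(float),
])
coef, *_ = np.linalg.lstsq(X, polarization, rcond=None)
slope_change = coef[2]  # > 0 means the trend steepened after 2000
print(f"estimated slope change at 2000: {slope_change:.3f}")
```

With enough observations on both sides of the break, the third coefficient directly estimates how much the trend changed; the point in the conversation is that with only a few post-2012 observations, the same test lacks statistical power.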

[1246] But my point is, Mark asserts, he points to one study and he points to this over and over again.

[1247] I have had two conversations with him.

[1248] He pointed to this study both times.

[1249] He asserts that this study shows that polarization is up some places, down other places.

[1250] There's no association.

[1251] But actually, we have another section in the Google Doc where we review all the data on the decline of democracy.

[1252] And the high point of democracy, of course, it was rising in the 90s.

[1253] But if you look around the world, by some measures it begins to drop in the late 2000s, around 2007-2008;

[1254] by others,

[1255] it's in the early to mid-2010s.

[1256] The point is, by many measures, there's a drop in the quality and the number of democracies on this planet that began in the 2010s.

[1257] And so, yes, Mark can point to one study, but if you look in the Google Doc, there are a lot of other studies that point the other way, especially about whether things are getting more polarized.

[1258] Not in all countries, but in a lot.

[1259] So for the problem, you've provided several proposals for solutions.

[1260] Do you think Mark, do you think Elon or whoever is at the head of Twitter would be able to implement these changes, or does there need to be a competitor social network to step up?

[1261] If you were to predict the future, now this is you giving sort of financial advice to me. That I can't do.

[1262] Definitely not financial advice.

[1263] I can give you advice.

[1264] Do the opposite of whatever I've done.

[1265] Okay, excellent.

[1266] But what do you think?

[1267] when we talk again in 10 years, what do you think we would be looking at if it's a better world?

[1268] So you have to look at the dynamics of each change that needs to be made and you have to look at it systemically.

[1269] And so the biggest change for teen mental health, I think, is to raise the age from 13.

[1270] It was set to 13 in COPPA in like 1997 or '96 or whatever, '98, whatever it was.

[1271] It was set to 13 with no enforcement.

[1272] I think it needs to go to 16 or 18 with enforcement.

[1273] Now, there's no way that Facebook can fix this on its own. So look, on Instagram, the age is 13, but they don't enforce it.

[1274] And they're under pressure to not enforce it because if they did enforce it, then all the kids would just go to TikTok, which they're doing anyway.

[1275] But if we go back a couple years, when they were talking about rolling out Facebook for kids, because they need to get those kids.

[1276] They need to get kids under 13.

[1277] There's a business imperative to hook them early and keep them.

[1278] So I don't expect Facebook to act on its own accord and do the right thing.

[1279] The regulation is the only way.

[1280] Exactly.

[1281] When you have a social dilemma, you know, like what economists call a prisoner's dilemma, or a social dilemma, which is a prisoner's dilemma generalized to multiple people.

[1282] And when you have a social dilemma, each player can't opt out because they're going to lose.

[1283] You have to have central regulations.
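The game-theoretic logic here can be made concrete with a toy public-goods payoff. This is just a sketch with made-up numbers, not a model of any actual platform's economics: each "player" chooses whether to do the right thing (cooperate) at a private cost, while the benefit is shared by all.

```python
# Toy N-player social dilemma (public-goods game). All numbers are
# illustrative; the point is only the structure of the incentives.

def payoff(i_cooperate: bool, n_others_cooperating: int, n_players: int,
           benefit: float = 2.0, cost: float = 1.0) -> float:
    """Payoff to one player: a share of the benefit produced by all
    cooperators, minus a private cost if this player cooperates."""
    n_coop = n_others_cooperating + (1 if i_cooperate else 0)
    return benefit * n_coop / n_players - (cost if i_cooperate else 0.0)

n = 5
# Whatever the others do, defecting is individually better...
for k in range(n):
    assert payoff(False, k, n) > payoff(True, k, n)
# ...yet universal cooperation beats universal defection, which is why
# no single player can opt out and a central rule binding everyone helps.
assert payoff(True, n - 1, n) > payoff(False, 0, n)
print("defection dominates, but all-cooperate beats all-defect")
```

The asymmetry the assertions check is exactly the dilemma described above: each platform loses by unilaterally doing the right thing, so only a regulation that binds all players at once changes the equilibrium.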

[1284] I think we have to raise the age.

[1285] The UK Parliament is way ahead of us.

[1286] I think they're actually functional.

[1287] The U.S. Congress is not functional.

[1288] So the parliament is implementing the age-appropriate design code, which may put pressure on the platforms globally to change certain things.

[1289] So anyway, my point is, we have to have regulation to force them to be transparent and share what they're doing.

[1290] There are some good bills out there.

[1291] So I think that if the companies and the users, if we're all stuck in a social dilemma in which the incentives against doing the right thing are strong, we do need regulation on certain matters.

[1292] And again, it's not about content moderation, who gets to say what, but it's things like the Platform Accountability and Transparency Act, which is from Senators Coons, Portman, and Klobuchar.

[1293] This would force the platforms to just share information about what they're doing.

[1294] Like, we can't even study what's happening without the information.

[1295] So that, I think, is just common sense.

[1296] Senator Michael Bennet introduced the Digital Platforms Commission Act of 2022, which would create a body tasked with actually regulating and having oversight.

[1297] Right now, the U.S. government doesn't have a body.

[1298] I mean, the FTC can do certain things.

[1299] We have things about antitrust, but we don't have a body that can oversee or understand these things that are transforming everything and possibly severely damaging our political life.

[1300] So I think there's a lot of, oh, and then the state of California is actually currently considering a version of the UK's age-appropriate design code, which would force the companies to do some simple things, like not sending alerts and notifications to children at 10 or 11 o'clock at night, just things like that to make platforms less damaging.

[1301] So I think there's an essential role for regulation.

[1302] And I think if the U.S. Congress is too paralyzed by politics, if the U.K. and the EU and the state of California and a few other states, if they enact legislation, the platforms don't want to have different versions in different states or countries.

[1303] So I think there actually is some hope, even if the U.S. Congress is dysfunctional.

[1304] So there is, because I've been interacting with certain regulation that's designed to hit Amazon, but it's hitting YouTube.

[1305] YouTube folks have been talking to me, which is recommender systems.

[1306] The algorithm has to be public, I think, versus private, which completely breaks things.

[1307] It's way too clumsy a regulation, where the unintended consequences break recommender systems, not for Amazon, but for other platforms.

[1308] That's just to say that government can sometimes be clumsy with the regulation.

[1309] Usually is.

[1310] And so my preference is that the threat of regulation, in a friendly way, encourages change.

[1311] You really shouldn't need it.

[1312] My preference is great leaders lead the way in doing the right thing.

[1313] And honestly, this goes back to our earlier, maybe naive, disagreement: I think it's good business to do the right thing in these spaces.

[1314] Sometimes it is.

[1315] Sometimes it loses you most of your users.

[1316] Well, I think it's important because I've been thinking a lot about World War III recently.

[1317] And it might be silly to say, but I think social media has a role in either creating World War III or avoiding World War III.

[1318] It seems like so much of wars throughout history have been started through very fast escalation.

[1319] And it feels like just looking at our recent history, social media is the mechanism for escalation.

[1320] And so it's really important to get this right, not just for the mental health of young people, not just for the polarization of bickering over small-scale political issues, but literally the survival of human civilization.

[1321] So there's a lot at stake here.

[1322] Yeah, I certainly agree with that.

[1323] I would just say that I'm less concerned about World War III than I am about Civil War II.

[1324] I think that's a more likely prospect.

[1325] Yeah, yeah, yeah.

[1326] Can I ask for your wise, sage advice to young people?

[1327] So advice number one is put down the phone, don't use Instagram and social media.

[1328] But to young people in high school and college, how to have a career or how to have a life they can be proud of?

[1329] Yeah, I'd be happy to because I teach a course at NYU in the business school called Work, Wisdom, and Happiness.

[1330] And the course is, you know, it's advice on how to have a happy, you know, a successful career as a human being.

[1331] But the course has evolved so that it's now about three things: how to get stronger, smarter, and more sociable.

[1332] If you can do those three things, then you will be more successful at work and in love and friendships.

[1333] And if you are more successful in work, love, and friendships, then you will be happier.

[1334] You will be as happy as you can be, in fact.

[1335] So the question is, how do you become stronger, smarter, and more sociable?

[1336] And the answer to all three is it's a number of things.

[1337] It's you have to see yourself as this like complex adaptive system.

[1338] You've got this complicated mind that needs a lot of experience to wire itself up.

[1339] And the most important part of that experience is that you don't grow when you are with your attachment figure.

[1340] You don't grow when you're safe.

[1341] You have an attachment figure to make you feel confident to go out and explore the world.

[1342] In that world, you will face threats, you will face fear, and sometimes you'll come running back.

[1343] But you have to keep doing it because over time, you then develop the strength to stay out.

[1344] there and to conquer it.

[1345] That's normal human childhood.

[1346] That's what we blocked in the 1990s in this country.

[1347] So young people have to get themselves the childhood, and this is all the way through adolescence and young adulthood.

[1348] They have to get themselves the experience that older generations are blocking them from out of fear and that their phones are blocking them from out of just, you know, hijacking almost all the inputs into their life and almost all the minutes of their day.

[1349] So go out there, put yourself out in experiences.

[1350] You are antifragile, and you're not going to get strong unless you actually have setbacks and criticisms and fights.

[1351] So that's how you get stronger.

[1352] And then there's an analogy in how you get smarter, which is you have to expose yourself to other ideas, to the ideas of people that criticize you, people that disagree with you.

[1353] And this is why I co-founded Heterodox Academy, because we believe that faculty need to be in communities that have political diversity and viewpoint diversity, but so do students.

[1354] And it turns out students want this.

[1355] The surveys show very clearly, Gen Z has not turned against viewpoint diversity.

[1356] Most of them want it, but they're just afraid of the small number that will sort of shoot darts at them if they, you know, if they say something wrong.

[1357] So anyway, the point is you're antifragile, and so you have to realize that to get stronger.

[1358] You have to realize it to get smarter.

[1359] And then the key to becoming more sociable is very simple.

[1360] It's just always looking at it through the other person's point of view.

[1361] Don't be so focused on what you want and what

[1362] you're afraid of; put yourself in the other person's shoes. What's interesting to them? What do they want?

[1363] And if you develop the skill of looking at it from their point of view, you'll be a better conversation partner, you'll be a better life partner.

[1364] So there's a lot that you can do.

[1365] I mean, I could say, you know, go read The Coddling of the American Mind.

[1366] I could say go read Dale Carnegie, How to Win Friends and Influence People.

[1367] But take charge of your life and your development, because if you don't do it, then the older, overprotective generation and your phone are going to take charge of you.

[1368] So on antifragility and The Coddling of the American Mind, if I may read just a few lines from Chief Justice John Roberts, which I find really beautiful.

[1369] So it's not just about viewpoint diversity, but it's real struggle, absurd, unfair struggle that seems to be formative to the human mind.

[1370] He says, from time to time in the years to come, I hope you will be treated unfairly, so that you will come to know the value of justice.

[1371] I hope that you will suffer betrayal because that will teach you the importance of loyalty.

[1372] Sorry to say, but I hope you will be lonely from time to time so that you don't take friends for granted.

[1373] I wish you bad luck again from time to time so that you will be conscious of the role of chance in life and understand that your success is not completely deserved and that the failure of others is not completely deserved either.

[1374] and when you lose, as you will from time to time, I hope every now and then your opponent will gloat over your failure.

[1375] It is a way for you to understand the importance of sportsmanship.

[1376] I hope you'll be ignored so you know the importance of listening to others, and I hope you will have just enough pain to learn compassion.

[1377] Whether I wish these things or not, they're going to happen.

[1378] And whether you benefit from them or not will depend upon your ability to see the message in your misfortunes.

[1379] He read that at a middle school graduation.

[1380] Yes, at his son's middle school graduation.

[1381] That's what I was trying to say, only that's much more beautiful.

[1382] Yeah, and I think your work is really important, and it is beautiful, and it's bold and fearless, and it's a huge honor that you would sit with me. I'm a big fan.

[1383] Thank you for spending your valuable time with me today, John.

[1384] Thank you so much.

[1385] Thanks so much, Lex.

[1386] What a pleasure.

[1387] Thanks for listening to this conversation with Jonathan Haidt.

[1388] To support this podcast, please check out our sponsors in the description.

[1389] And now, let me leave you with some words from Carl Jung.

[1390] Everything that irritates us about others can lead us to an understanding of ourselves.

[1391] Thank you for listening and hope to see you next time.