The Daily
[0] From the New York Times, I'm Michael Barbaro.
[1] Today, Rabbit Hole.
[2] Episode 4.
[3] Tell me what Internet is.
[4] Can you explain what Internet is?
[5] What, do you write to it like mail?
[6] How do I become a member of Internet?
[7] Now, as you may or may not know, I have poured over 12 years of my life into my YouTube channel, this YouTube channel, heart, mind, body, soul, money, resources, travel, expertise. And I stand by what I have done.
[8] And then let me tell you what happened in a 24-hour period.
[9] I have a strike against my channel.
[10] Okay, so Kevin, not long after you and I went down to West Virginia and talked to Caleb, things at YouTube started changing.
[11] Right.
[12] Apparently, you see, I had violated community standards.
[13] So as stories like Caleb's have started coming out about how YouTube and its recommendation algorithm may be pushing people toward these extreme views.
[14] This was a direct shot across the bow now.
[15] YouTube has started telling some video makers that their content is against YouTube's new policies.
[16] What they want to do is to have me start to self-censor and say, well...
[17] Like Stefan Molyneux.
[18] And...
[19] So I've just been told I'm on double secret probation.
[20] Gavin McInnes.
[21] Some sort of 30-day watch on YouTube or if I do anything wrong, I'm done forever.
[22] People who had been associated with the alt-right.
[23] What content could possibly be acceptable?
[24] They also affected people like Steven Crowder.
[25] The demonetization specifically targeting our opinions as unpopular speech.
[26] Who didn't really have the same association with the alt-right, but just like Stefan Molyneux and Gavin McInnes, his videos got demonetized.
[27] For being, quote, what they term now, on the border.
[28] Like, they couldn't make...
[29] any ad money off of their YouTube videos anymore.
[30] Yeah, meaning it's gotten a lot harder for some of them to make a full -time living, making YouTube videos.
[31] It renders us unable to create a show for you, and if it keeps going this way, the channel will cease to exist.
[32] I am going to need to figure out what to do.
[33] I am begging you.
[34] Philosophy begs you, the future begs you.
[35] Please help out the show at freedomainradio.com.
[36] They are shutting down the discussion.
[37] Other people, like Alex Jones... They disappeared me. ...just got banned altogether.
[38] If this isn't 1984, baby, I don't know what it is.
[39] So, a few months ago...
[40] Give me a level, Kevin Roose.
[41] Our colleague, Julia Longoria, flew out to San Francisco.
[42] We are here in San Bruno, California, a little south of San Francisco.
[43] And the two of us drove down to YouTube's headquarters.
[44] That building that looks like an insurance company, or something.
[45] That's YouTube.
[46] And why are we here?
[47] And we went there to talk to YouTube CEO, Susan Wojcicki.
[48] She's someone who has seen kind of the entire evolution of the modern internet.
[49] She's got the steering wheel of this giant, powerful artificial intelligence that runs YouTube.
[50] Hey, guys, how's it going?
[51] Hi.
[52] Do you guys have Ben?
[53] We're here for an interview.
[54] Sorry, go ahead.
[55] No, race.
[56] You need to check.
[57] I got you.
[58] No worries.
[59] Thank you.
[60] Thanks.
[61] And anything that she does with that power, it has all these downstream consequences.
[62] So a couple years ago, I guess now, they had a shooting here.
[63] A person who was a YouTube creator who was upset about her channel and ad money and things like that, she actually came onto this campus with a gun and shot three people, and shot herself.
[64] It was a really horrible, traumatic thing for the people here.
[65] And so as a result, they've really stepped up security.
[66] So I'm not surprised that we're getting scoped out by the security guys.
[67] So they lead us inside.
[68] So this is the lobby.
[69] And we go down this set of back stairs.
[70] I've been here, but not to the basement.
[71] Is this just studios down here?
[72] And that's where we meet.
[73] Hey, how are you?
[74] Susan.
[75] Hi, I'm Gina.
[76] Hi, hi.
[77] Nice to meet you, too.
[78] Before we get into the interview itself, what is Susan like as a person?
[79] Like, what was your first impression of her?
[80] Yeah, let's grab you a chair.
[81] So how are you?
[82] Doing okay, how are you?
[83] She doesn't really have, like, tech CEO vibes.
[84] I did not major in computer science.
[85] I have a history and literature undergraduate degree.
[86] She doesn't really get the same kind of attention as Mark Zuckerberg
[87] or Jack Dorsey.
[88] What was an early memory for you of the internet and what you thought it was?
[89] Well, I first saw the internet probably in my late 20s.
[90] But she's actually been around the tech world for longer than either of them.
[91] And of course I remember Netscape.
[92] Like Google actually started in her garage.
[93] What does that mean?
[94] It's sort of a wild piece of Silicon Valley trivia.
[95] She was working at Intel and living in Silicon Valley.
[96] And these two Stanford grad students named Sergey Brin and Larry Page, they were looking for a place to house their brand new startup, this search engine called Google.
[97] And eventually they convinced her to come over and help them build it as a business.
[98] And she became Google's 16th employee.
[99] And so, you know, then you became history's most successful landlord.
[100] And you, you know, you got to Google and started, you know, working in
[101] the tech world.
[102] And in 2014, you know, made the decision that you were going to come over and run YouTube.
[103] And what do you remember about YouTube when you came in?
[104] Like, what was it like?
[105] Well, first of all, you know, when I was first asked whether or not I wanted to have this role, I had been running our advertising business.
[106] And so Larry asked me, Larry Page, and I immediately said yes.
[107] I have always had a love for the visual arts.
[108] I was a photographer.
[109] I love creative.
[110] I had created Google's image search, and I could just see clear as day that YouTube had huge potential in front of it.
[111] And so in many ways, I thought, oh, I'll go to YouTube and I'll get to work with these fun entertainment creators.
[112] It probably seemed like you were going from a very serious job to a very fun job.
[113] Yeah.
[114] YouTube at the time was really much more of an entertainment site.
[115] It was not seen as a very serious site.
[116] Yesterday, the internet blew up over this video of a rat carrying a whole slice of pizza.
[117] And actually one of the areas that I really pushed our team a lot on was actually freshness because I would get frustrated.
[118] I would come to YouTube repeatedly and I would keep seeing the same videos.
[119] One of the areas I'd push them on, and I've
[120] consistently pushed them on, is actually exploring new areas, which is, of course, the hardest area for us to discover: interests that you haven't necessarily told us that you're interested in.
[121] Or you might not know you're interested in.
[122] Or you might not know, but we think they're interesting.
[123] Like, that exploratory part is a really important part, and I do think we've gotten better at it.
[124] And we certainly have gotten better at predicting what people are interested in.
[125] So part of why I wanted to talk to Susan was to ask her about this 2015-2016 period, this period when YouTube was trying to engineer this new algorithm that ended up opening the door to more polarization and extreme views and was also right when Caleb was getting introduced to all of these new characters.
[126] And it was kind of striking that almost right off the bat, she owned the fact that she had driven that change.
[127] I want to talk more about the recommendations thing.
[128] As you know, I've been very interested in it.
[129] And one of the people I've talked to and wrote about was a guy named Caleb Cain.
[130] Just like to sort of ask the most blunt possible version of this question, like what did you think of that story?
[131] I mean, I thought the story was that you were trying to understand how our recommendation systems were working and what the impact of them was.
[132] And what was her take on Caleb's story?
[133] Well, she didn't really get into the specifics of his story, but.
[134] And it wasn't just you.
[135] There are many people that have raised questions or concerns on how our recommendation systems work.
[136] And we wanted to make sure we're taking that seriously.
[137] She did acknowledge that YouTube had taken from stories like his that they needed to start making some changes.
[138] And we have taken it seriously.
[139] And I think you know we've made a lot of changes to how our systems work.
[140] I want to talk about all those changes.
[141] But I'm struck by the fact that you said that one of the challenges for the algorithm and your design of it was getting people to explore new interests.
[142] That strikes me as something that happened to him, where he went to YouTube looking for self-help videos.
[143] He was going through a rough patch.
[144] He wanted to find something to cheer him up.
[145] And then he got introduced to these people who would do self-help, but then they would also talk about politics and they would go on each other's shows and they were creating this kind of bridge between like the self-help part of YouTube and the more political part of YouTube.
[146] Was that something that you observed happening on the platform?
[147] I mean, it's interesting that you say that because I guess I want to say, going back to my early initial days, which is that we couldn't get people interested in news or get people interested in politics.
[148] I don't think we had any indication that this was something that people were interested in.
[149] Like, people were interested in gaming and music, entertainment.
[150] They came to laugh.
[151] They came to escape in many ways.
[152] She went back to this idea that YouTube just wasn't seeing data that suggested that people were looking for politics on YouTube.
[153] I acknowledge that there are many political commentators
[154] and a lot of political views that have emerged on the platform.
[155] But that was not an initial part of how YouTube worked.
[156] And I guess the only reason I'm bringing that up is because I do want to say that, you know, YouTube, for all its creators, for, you know, millions of creators, there is a set of creators who do find success with that, but it's a small set.
[157] And in the end, she said that essentially, like, this kind of political content, it's still a really small percentage of everything else on YouTube.
[158] And is that true?
[159] Like, because it seems like there's a lot of it, I guess.
[160] I mean, there's no way to know for sure without having access to their internal data.
[161] But like, even if it were true that only 1% of YouTube videos consisted of this kind of political content, because YouTube is so big, that would still translate to millions and millions of people around the world watching it.
[162] Yeah, and we've talked to some people who were at YouTube
[163] earlier in its history, around the time that you came and even before then, who said, you know, that there was sort of this obsession with growth, that there was a very strong push to expand the watch time on the platform, and that any challenges that were brought to management around that, these things just weren't given a real hearing.
[164] Does that resonate with you at all?
[165] And I asked about Guillaume, too, about the red flags that he had tried to raise back when he was at YouTube.
[166] I mean, I think I've certainly heard people say that.
[167] And I mean, you know, like I came from a company that was very focused on quality, quality of information.
[168] It was always like the most important thing we also always prioritize was quality.
[169] And she didn't really deny it, but she pivoted pretty fast to her work at Google and this concept of quality, which is basically like the term that Google uses to talk about its search engine and how it wants people to
[170] get good, reliable information when they go looking for something.
[171] And so I tried to figure out how do you reconcile that with a company that is a more entertainment-based company, right?
[172] So what does that mean to have quality?
[173] If the main thing that you're doing is gaming videos, cat videos, and music videos, then what does that really mean for quality?
[174] And I think what she's getting at is that at the time, like, YouTube just didn't really think that it was capable of doing much harm.
[175] And I guess the reason I'm bringing that up is that one of the biggest realizations for me was that we needed to start adapting and changing and that we needed a very different set of ways of managing information than you want to manage entertainment.
[176] Was there a moment that crystallized that for you?
[177] Well...
[178] The moment a celebration became a nightmare.
[179] In 2016.
[180] People were screaming to...
[181] There was a terrorist attack that had happened in France.
[182] In the French resort town of Nice.
[183] Susan says that in 2016, on Bastille Day.
[184] In the coastal city of Nice, just as the fireworks ended, a truck plowed into the crowd.
[185] When an ISIS terrorist attacked a crowd.
[186] At least 84 people were killed, dozens of others were hurt.
[187] And the government there declared a state of emergency.
[188] And I remember reading that and being just extremely upset and thinking, our users need to know about it.
[189] YouTube decided that for people in France, it was actually going to push news about the attack onto their home pages.
[190] The attack turned the streets into a scene of chaos.
[191] And for people looking for information about the attack.
[192] The mayor of Nice warning people to stay indoors.
[193] Fears tonight of another terror attack.
[194] They were actually going to prioritize videos from what they considered authoritative sources.
[195] But that didn't perform
[196] very well on our platform.
[197] She says that basically the engineers at YouTube were telling her, like, the users don't want to actually see it.
[198] No one is clicking on these videos.
[199] That's just not what they want to see.
[200] And so what do you do as a platform?
[201] Do you show it to them anyway?
[202] And so that's actually, I remember that very clearly, because that was the first time I said to them, you know, it doesn't matter.
[203] We have a responsibility.
[204] Something happened in the world, and it's important for our users to know.
[205] And according to her, she said basically, like, too bad, we're showing it to them anyway.
[206] And it was the first time we started using the word responsibility and the fact that we needed to put information on our site that was relevant, even if our users were not necessarily engaging with it in the same way that they would with the entertainment videos.
[207] So if I'm understanding this right, what Susan is saying is that the Nice attack marked the first time that YouTube took this idea of responsibility and prioritized that over watch time.
[208] Right.
[209] At what point did that sense of responsibility extend beyond, like, a big news event?
[210] It took a little while.
[211] I mean, I think over 2017 and 18 and into 2019, I think pressure was starting to build on YouTube.
[212] CNN reports that YouTube ran ads from large brands like Adidas, Amazon, and Hershey before videos which promoted extreme content.
[213] There were reports in the Times and other places.
[214] You're about to meet a man who says he was radicalized by alt-right figures via their persuasive YouTube videos.
[215] Regulators, parents.
[216] YouTube is one of the greatest agents of extremism that we might have ever created at this scale.
[217] Advertisers, they were all chiming in.
[218] Good morning, everyone.
[219] The Subcommittee on Consumer Protection and Commerce will now come to order.
[220] Congress actually held hearings about these Internet platforms.
[221] Congress has unfortunately taken a laissez-faire approach to regulation of unfair and deceptive practices online over the past decade.
[222] And platforms have let them flourish.
[223] And invited experts, including former Google employees.
[224] So there you are.
[225] You're about to hit play on a YouTube video and you hit play and then you think you're going to watch this one video and then you wake up two hours later and say, oh my God, what just happened?
[226] And the answer is because you had a supercomputer pointed at your brain.
[227] To talk about how YouTube was essentially built to pull people into these polarizing rabbit holes.
[228] It's sad to me because it's happening not by accident, but by design.
[229] And that's when YouTube started making these big changes last year.
[230] They disappeared me. So I've just been told I'm on double secret probation.
[231] Apparently, you see, I had violated community standards.
[232] You know, one of the biggest changes that we made to our recommendation system, and it was probably one of the later changes we made, and I think it's because it's a harder one to grapple with, which is that we realized that there was a set of content that even if people were repeatedly engaging with it, we thought it was important to make the recommendations more diversified and to show them more quality content alongside.
[233] What she's essentially said is that there's a certain kind of video where, like, even if a lot of people are watching it and it's generating all this viewing time, the site will actually intervene.
[234] You know, if a video has hate speech in it or if it's made by like a neo -Nazi who's denying the Holocaust happened, that will come totally down off YouTube.
[235] But there's this other class of videos that YouTube has a harder time categorizing.
[236] You know, we began to identify content that we called borderline content and that if users were repeatedly engaging with this borderline content, we will show them other content alongside that we've ranked as more high-quality content.
[237] And these videos, like, they don't explicitly violate YouTube's rules, but they're also, like, not the kind of thing that YouTube wants to boost and recommend.
[238] So what they've done is essentially to train algorithms to identify these kinds of videos and then demote them in people's feeds and in their sidebars so that they don't get seen as often.
[239] And what that has done is it's actually
[240] had a 70% reduction of views of borderline content in the U.S. And we've been in the process of rolling that out globally.
[241] It's in most English language countries right now and a few large markets like France and Germany, but we continue to roll that out.
[242] And how do you define, like, borderline content?
[243] Like what's borderline versus just, you know, edgy jokes?
[244] How do you sort of separate the two?
[245] Yeah.
[246] I mean, it is a complicated and nuanced area.
[247] And I mean, we have a whole process,
[248] an algorithm, to find it.
[249] And I guess what I'm wondering is like these are sort of interventions in an automated system, right?
[250] Like you set up this automated system, let it run, realized, like, some of the things that it produces, that maybe those aren't creating an optimal experience for people.
[251] And then, you know, humans come in and sort of tinker with it to produce different results.
[252] Do you feel like when you...
[253] Well, hopefully we do more than tinker.
[254] We really have a lot of advanced technology and we work very scientifically to make sure we do it right.
[255] Right.
[256] I guess a sort of overarching question is, do you think there was an overreliance in YouTube's early days on automation, on machines?
[257] Like, do you think there was a sort of underappreciation for the role that sort of judgment and human intuition played in this stuff?
[258] No. I mean, I think both are really important.
[259] And if we look at how our systems work, we definitely have a lot of, we have basically a system that is very heavy with humans, that is extended with machines. There is an automated component, but then there's also, of course, a human component. So is she saying that humans are now doing more work than the AI? No. I mean, like, yeah, like, they do have more people looking at videos than they used to. We have a number of raters that all watch and review videos, and based on the feedback, we identify a set of content that looks borderline, and then based on those borderline...
[260] But YouTube is so big.
[261] They could hire a hundred times as many people and they still wouldn't be able to watch a fraction of everything that's being uploaded to YouTube on any given day.
[262] So it's still the case then that most videos that I'm recommended that anyone who goes to YouTube is recommended, those recommendations are coming from different algorithms being run by an artificial intelligence.
[263] Yeah, the robots are still very much running the show, but what she's saying is that they're trying to give humans a bigger role in supervising them.
[264] You can't see Andrew, but she's waving us off, but we'll, can we have like a couple more minutes?
[265] We're good?
[266] Okay, well, we'll start wrapping it up.
[267] I do have some more questions.
[268] One is, I wanted to ask about the shooting here in 2018.
[269] I wonder what that was like for you.
[270] Yeah.
[271] I mean, it was, first of all, just a horrible event to go through.
[272] We have a report of subject with a gun.
[273] This will be from the YouTube building.
[274] Like when something like that happens to you, you can't really process that it's happening.
[275] A flood of YouTube employees streaming out towards safety.
[276] Some still clutching tightly to their laptops.
[277] Turning what was an ordinary normal lunch hour on this tech campus into an all too familiar shooting scene in America.
[278] You don't know, is there one shooter, are there five shooters?
[279] Where are they?
[280] New details about the woman police say shot three people at YouTube's headquarters before taking her own life.
[281] I'm being discriminated and filtered on YouTube.
[282] And my old videos that used to get many views stop getting views.
[283] Accusing the website of filtering her channels to keep her videos from getting views, something she blames on new closed-minded YouTube employees.
[284] I'm very thankful in some ways that, you know, nobody was killed.
[285] I think it also just showed any kind of information company, you know, has some risk of people being upset with information that's reported.
[286] And so, you know, my key takeaway was that we need to make sure that our employees are always safe and that we're doing everything we possibly can.
[287] And I wonder, was that a moment for you where you realized the stakes of the decisions you made here, in a different way?
[288] Did it change the way you looked at your responsibility?
[289] I mean, information is a tough business.
[290] I mean, you are in the information business yourselves and you understand that sometimes people can get really upset and they can get really angry about ways that they're covered or ways that information is displayed.
[291] And just like a journalistic organization like yourselves, you need to do what's right.
[292] I want to look back on this time and feel that I made the right decisions.
[293] I don't want to say, oh, I allowed this type of content because I was afraid.
[294] Yeah, I mean, I remember sort of that day, and, you know, other shootings since that day that have been sort of related in some way to the internet, and just feeling like a sense of loss.
[295] Like I grew up on the internet.
[296] I love the internet.
[297] I have wondered if we're ever going to get back to a time when these things feel fun and useful and like they're not leading to these culture wars and polarization. Like, do you think it will ever feel like that about the internet again?
[298] I love the internet.
[299] I think that the internet has enabled so many different opportunities for people that wouldn't have had it.
[300] And I think you're right.
[301] Like, in the last couple of years, there's been a lot of scrutiny.
[302] And it's because of the size that we are and because people realize the significance of our platform.
[303] And I recognize that.
[304] And I take that very seriously.
[305] My focus on responsibility is probably one of the most important things I'll do in my life.
[306] Why?
[307] Because I think we live at this time where there's tremendous change.
[308] And so, yes, we've had years of all this fun and gaming and cat videos, but there are a lot of really important decisions to be made about what this future will hold and what will platforms be and how will they operate in the future.
[309] And those rules haven't been written yet, or we're in the process of writing them.
[310] Yeah, well, thank you.
[311] Thank you.
[312] I appreciate making time.
[313] Thank you.
[314] Thank you.
[315] So, Kevin, this is where things stood a couple months ago.
[316] Mm -hmm.
[317] But then came the coronavirus.
[318] Right.
[319] A handful of people is spreading the idea on social media that the rollout of 5G cell towers is responsible for the COVID-19 epidemic.
[320] Some of the towers have even been set on fire in the U.S. So this April, as we were all dealing with the coronavirus and quarantining in our houses, I started to see... Let's talk about what's really going on.
[321] As usual, something just since the beginning hasn't seemed right with this coronavirus.
[322] The internet was filling up with misinformation.
[323] You're saying that silver solution would be effective.
[324] Totally eliminated, kills it, and deactivates it within 12 hours.
[325] There were all these miracle cures.
[326] Just drinking hot water can kill it.
[327] If you drink bleach, you will not get the coronavirus.
[328] Vitamin C can kill it, no problem.
[329] You can prevent this from happening with vitamin C. There were these conspiracy theories about...
[330] Anyone else find it coincidental that Dr. Fauci is on the Leadership Council for the Bill and Melinda Gates Foundation?
[331] Oh, it's all coming out about Fauci working with Gates.
[332] He knew all about it.
[333] Bill Gates or the Jews are behind this.
[334] We are quite literally watching a bio -warfare drama play out before us.
[335] Stuff about like...
[336] Is this being caused by 5G towers?
[337] Yes.
[338] 5G cell phone towers causing coronavirus.
[339] Anybody want to make one guess as to where the first completely blanketed 5G city in the world was.
[340] Exactly.
[341] Like the exact sort of thing that Susan would consider low-quality content.
[342] Right.
[343] Another incredible sight in this surreal ordeal as we battle this coronavirus pandemic.
[344] The Capitol steps, the scene of a protest.
[345] Of course, like a lot of misinformation, this didn't just stay online.
[346] Are you concerned about this virus?
[347] I was in the beginning until I've done my research and found out the realities and the media's overreach on it and that it's not as serious as they made it out to be.
[348] It actually contributed to these protests that were going on at state capitals all around the country, these sort of anti-lockdown protesters who were refusing to go along with social distancing and the other recommendations that medical and health authorities were making about how to keep society safe.
[349] The disease is not that serious that we should quarantine.
[350] We don't quarantine for the flu.
[351] We should not quarantine for COVID-19.
[352] Hi.
[353] Is that you in your closet there?
[354] Hello.
[355] Yes.
[356] I'm in my closet.
[357] I'm in my office.
[358] I've told my kids not to come in.
[359] So in early April, are you recording on your end?
[360] or can you record on your end with, like, a voice memo or something?
[361] I don't normally record myself.
[362] I got in my very fancy home studio.
[363] Susan, if you could hold the phone as if you're talking on the phone, that'll be the right placement.
[364] And Julia and I got Susan back on the phone.
[365] The last time we met, we talked about all of the ways that you were trying and had tried to improve the quality of the information on YouTube.
[366] And since then, like, there's this coronavirus,
[367] which seems like maybe kind of the highest-stakes possible version of that mission, where the difference between people getting good information and bad information can literally be life or death.
[368] So what are you doing now to make sure that people are getting good information about the virus?
[369] From the very beginning, we took this extremely seriously.
[370] And we took our team and decided what are all of the things that we can do to be as responsible as possible.
[371] And right away, she said, like, all these coronavirus conspiracy theories and miracle cures.
[372] Just drinking hot water can kill it.
[373] Very quickly, we were reviewing all of the videos, so we are very proactive.
[374] And anything that would be harmful or anything that would promote medically unproven methods.
[375] If you drink bleach, you will not get the coronavirus.
[376] We have been removing that type of content.
[377] You can prevent this from happening
[378] with vitamin C. She said YouTube has been aggressively hunting down and deleting them as fast as they can.
[379] Hi, I'm Dr. John Brooks with the CDC.
[380] We put links to the CDC.
[381] Hello, everyone.
[382] And welcome to this regular press conference on COVID-19 from Geneva.
[383] The World Health Organization, or to the associated equivalent globally, in the home feed, under videos that were about coronavirus and in our searches.
[384] Let's work together to keep ourselves healthy, our family's healthy, and our communities healthy.
[385] On every video about the coronavirus, they, like, put this little link that directs people to, like, the CDC's website or other health authorities.
[386] So the 5G story is complete and utter rubbish.
[387] It's nonsense.
[388] It's the worst kind of fake news.
[389] We have a new shelf that we launched that gives users information from authoritative sources.
[390] Far-fetched conspiracy theories like this are really damaging because they undermine public health efforts to stop
[391] the spread of the virus by convincing some people that it's not the real problem.
[392] They created this feature that basically like when you search for information about the coronavirus, the first thing you see is this section called Top News that contains essentially like information from authoritative, trusted sources.
[393] Staying inside saves lives.
[394] Stay home.
[395] They also got popular YouTubers to make these social distancing PSAs.
[396] I really think togetherness is this superpower of our species.
[397] Let's do it together.
[398] We will keep each other company.
[399] So they weren't just hearing it from YouTube.
[400] They were hearing it from their favorite creator.
[401] Coming over with me. Each day can be a different thing.
[402] Which they then promoted to all their subscribers.
[403] And when we looked at the combined number of subscribers that they have, it's over 1.5 billion subscribers.
[404] You can slow the growth of this and save lives.
[405] We have everyone at YouTube working throughout this crisis
[406] to make sure that they are following what's happening, making changes on the policy, making sure that we are taking down content, making sure that we're adjusting whatever changes we need to our algorithm.
[407] And as she's going through her list of everything that YouTube is doing on the coronavirus, it really strikes me like how much work it takes to make sure that this algorithm that YouTube has built, that it's not leading people in a dangerous direction.
[408] I mean, I could go on.
[409] We did a lot of different steps here to make sure we were doing everything possible.
[410] Yeah, I mean, just personally, like, I've been impressed by how hard it's been to find misinformation about the coronavirus on YouTube.
[411] And I guess I'm wondering, like, why isn't it always like this?
[412] Like, what is preventing YouTube from taking this approach all the time to every subject?
[413] Well, we do take a responsibility approach to everything that we do.
[414] So what's a little bit different about this one is there is a very clear authority, which is a world health organization.
[415] And there are recommendations that health organizations are making.
[416] And as a result, it's very clear to us that anything that would contradict social distancing, for example, would be a violation of our policies.
[417] And in a lot of other areas, there may be some agreement on the science, but there could be politically different views.
[418] And so we want to enable a broad range of views.
[419] So Susan's saying that she's basically comfortable doing this kind of work when it comes to something like the coronavirus because it's just science and there are clear authority figures to turn to.
[420] But politics is a completely different story.
[421] And maybe there's some obvious answer to this, but like, why is that?
[422] We want to enable a broad range of views.
[423] We want to make sure that the full spectrum of political views or social ideas are represented there.
[424] But in this case, it's very clear that there's an authority and information behind that authority.
[425] I mean, I think on an ideological level, there are a lot of people at YouTube who still want it to be this open platform where every kind of view is welcome and is equal.
[426] And I think there's also this more practical angle to it, which is that they don't want to be seen as putting their thumb on the scale.
[427] They don't want politicians and regulators, people who might try to break them up or enforce new rules on
[428] them, to think that they're taking a partisan stand.
[429] Right.
[430] But I feel like, because of the time that we're living in, in part due to internet platforms like YouTube and the powerful AIs that have been designed to capture and keep our attention at all costs, that, like, now everything feels polarizing and political, even everything around the coronavirus.
[431] And so it sounds like Susan is saying, you know, okay, we recognize that we helped let this genie out of the bottle.
[432] Right.
[433] We didn't mean to.
[434] We were just trying to entertain.
[435] We're now going to try and put it back.
[436] But I just wonder, like, in the environment that they helped to create, if that is even possible.
[437] Yeah.
[438] I mean, it's kind of a reckoning for them.
[439] YouTube is part of Google.
[440] And I've been at Google for over 20 years now.
[441] And, you know, it's an information
[442] company, which means our goal, our mission is to deliver users the most relevant information, right, and that it's accurate.
[443] And that's true for YouTube, too, that we want to deliver accurate, useful information.
[444] And I think in the information area, it's very important.
[445] Yeah.
[446] And I guess it strikes me as like there's a trust crisis right now.
[447] I mean, people just, they don't trust necessarily the institutions that maybe they would have years ago to sort of give them accurate information.
[448] So you're kind of trying to rebuild trust, it strikes me, by, like, you know, placing content from organizations like the CDC and the WHO prominently on YouTube.
[449] Do you feel like that's part of what you're trying to do right now is to kind of help people on YouTube understand that they actually can trust these sort of mainstream authorities?
[450] Well, part of what we want to do for YouTube is make sure that we have all of the voices represented on the platform.
[451] And YouTube started out with a lot of individuals just posting videos about what was happening in their lives.
[452] But what we've really done over the last couple of years is reach out to make sure that news organizations, trusted scientific, health, and government officials all also have the resources needed to be able to come onto the platform on YouTube.
[453] And what I've seen happen with COVID -19 is it's really accelerated public health officials, recognizing that YouTube is an important medium to be communicating with users about anything involving health.
[454] And so, you know, in many ways, I think this would have happened naturally.
[455] It might have taken a few years.
[456] It just accelerated a few years.
[457] And we plan to continue to work with all of these organizations to make sure they can get the right information out to their users.
[458] Thank you.
[459] Thank you.
[460] I think we're good.
[461] What Susan is saying, this change in YouTube, it's actually like a fundamental shift in the YouTube universe.
[462] Like for its entire existence, YouTube has been defined as an alternative media space.
[463] And the people that were there were like these insurgents and upstarts, these kind of like rebels and misfits.
[464] And now YouTube is basically saying when it comes to certain things, we're going to let the establishment win.
[465] Which is tricky because the establishment isn't always right.
[466] Like the World Health Organization has changed its guidance on things like face masks during the pandemic.
[467] So the ramifications of this change, they're not totally clear yet.
[468] But what is clear is that this is going to be a huge battle.
[469] Old school media does not like internet personalities because they're scared of us.
[470] We have so much influence and such a large voice.
[471] Because the internet culture that grew up on YouTube, not only has it gotten bigger than mainstream culture.
[472] Why is this an article?
[473] Because clickbait, that's why.
[474] And not only does it distrust a lot of mainstream institutions.
[475] If there's anything I've learned about the media from being a public figure, it's how they blatantly misrepresent people for their own personal gain.
[476] But over time, it's basically come to despise them.
[477] I'm still here.
[478] I'm still making videos.
[479] Nice try, Wall Street Journal.
[480] Try again, motherfuckers.
[481] To hear more, go to whatever app you're
[482] using to hear me right now.
[483] Search for rabbit hole, two words, and hit subscribe.