The Daily
[0] From The New York Times, I'm Michael Barbaro.
[1] This is The Daily.
[2] Today.
[3] The death toll from a mass shooting targeting Muslims in New Zealand rose from 49 to 50 over the weekend, after officials found another body at the Al Noor Mosque, where most of the deaths occurred.
[4] Kevin Roose on why this attack was made by and for the internet.
[5] It's Monday, March 18th.
[6] Would you mind just telling us one more time what happened in the mosque?
[7] When shooting started, it started from the hallway.
[8] So I could hear.
[9] So, to-to-to-to-to-to-to, then the magazine is finished, then he refill again and came back again.
[10] And I saw all the plastering coming down from the wall and the ceiling.
[11] Okay.
[12] That was when I knew it was a gunshot.
[13] Right, okay.
[14] So immediately on the impulse, he's a lot.
[15] When he was shooting at me and there's a fence here, he was shooting at me, I duck here, and I come here, and I find the gun somewhere here, and a dead body here as well.
[16] I feel now, I repeated the story a lot, but this is a good idea to say that.
[17] Over the weekend, through dozens of interviews with survivors, a story began to emerge of what happened on Friday inside the mosques in Christchurch.
[18] Was that your regular mosque?
[19] Or were you visiting that mosque?
[20] Regular mosque.
[21] The shooting began at the Al Noor Mosque, where Farid Ahmed and his wife, Husna, who had moved to New Zealand from Bangladesh, were attending afternoon prayer.
[22] The ladies' room was on the right-hand side.
[23] So all the ladies were there.
[24] And my wife is always a leading person for ladies.
[25] She had a philosophy.
[26] She always used to tell me, I don't want to hold any position, and I want to prove that you don't need to have any position to help people.
[27] She was like a magnet and exactly the same thing happened.
[28] The shooting started, she started instructing several ladies and children to get out.
[29] And she was screaming, come this way, hurry up, this and that, you know, she was doing all these things.
[30] And then she took many children and ladies into a safe garden. Then she was coming back, checking about me, because I was in a wheelchair.
[31] Do you mind me asking why you're in the wheelchair?
I was run over by a car. He was a drunk driver. It was 1998. And it happened.
I'm sorry. So she went out of the mosque, and then she came back?
Yeah, she was coming back, and once she was approaching the gate, then she was shot.
[32] She came back in to fetch you.
[33] Yes, yes.
[34] Farid learned hours later that Husna was one of the 42 people police say were killed at the mosque.
[35] So she was busy with saving lives, you know, forgetting about herself.
[36] And that's what she is.
[37] She always has been like this.
[38] Six minutes after firing the first shot, and as police raced toward the Al Noor Mosque, the shooter drove to a second mosque, the Linwood Mosque, four miles east.
[39] Tell me, I'm sorry, what was your name?
[40] Abdul Aziz.
[41] And the mosque that you go to, is it mixed Pakistani?
[42] I mean, that mosque, what was, who was there that day?
[43] That mosque, we got from every race, from Malaysia, from Philippine, from Afghanistan, from every, every sort of country.
[44] My colleague Damien Cave spoke with Abdul Aziz, who was praying at the Linwood Mosque with his four sons when he heard gunshots.
[45] Aziz ran toward the shots, grabbing the first thing he could find, a credit card machine, which he flung at the attacker.
[46] The shooter dropped a gun, and Aziz picked it up.
[47] And I picked up the gun, and I checked it.
[48] They had no bullets, and I was screaming to the guys, come here.
[49] I'm here.
[50] I just wanted to put more focus on me than go inside the masjid.
[51] But unfortunately, he just got himself to the masjid.
[52] Then I heard more shooting sound and I see he's shooting inside the masjid.
[53] Moments later, when the gunman went to his car to retrieve more weapons, Aziz followed him.
[54] He tried to get more gun from his car.
[55] When he's seen me, I'm chasing with a gun.
[56] He sat in his car.
[57] And I just got that gun and threw it at his window like an arrow, and blast his window, and he thought probably I shot him or something, and the gun's come back and just he drive off.
[59] Aziz used the gun to shatter the gunman's car window, which many witnesses believe is what prompted him to speed away, rather than re-enter the mosque and kill more people.
[60] Minutes later, video shows the suspect being pulled by police from his car, two and a half miles down the road, where two more guns and homemade explosives were also found.
[62] I want to speak specifically about the firearms used in this terrorist act.
[63] There were two semi-automatic weapons and two shotguns.
[64] On Sunday, New Zealand's Prime Minister, Jacinda Ardern, said that the suspect, an Australian citizen, would be tried in New Zealand, and that her government would meet today to discuss the country's gun laws.
[66] I can tell you one thing right now.
[67] Our gun laws will change.
[68] Funerals for all 50 victims are expected to be held in the coming days.
[69] As the police commissioner confirmed this morning, 50 people have been killed and 34 people remain in Christchurch Hospital.
[70] 12 of them in the intensive care unit in critical condition.
[71] A four-year-old girl remains in a critical condition at Starship Hospital in Auckland.
[72] Islamic burial rituals typically require bodies to be buried as soon as possible, and usually within 24 hours.
[73] But New Zealand authorities say that the process of identifying the victims and returning them to their families could take several more days.
[74] It is the expectation that all bodies will be returned to families by Wednesday.
[76] I want to finish by saying that while the nation grapples with a form of grief and anger that we have not experienced before, we are seeking answers.
[77] We'll be right back.
[78] Kevin, I want to talk to you about the moments before this mass shooting began.
[79] What do you know about those?
[80] Well, what we know comes from a video that the gunman live-streamed on Facebook while this was all happening.
[81] He taped himself in the car on his way over to the mosque, listening to music, talking.
[82] And right before he gets out of the car and goes into the mosque, he pauses and says, remember, lads, subscribe to PewDiePie.
[83] And when I heard that, I just, like, I knew, oh, this is, this is something different than we're used to.
[84] What do you mean?
[85] What is PewDiePie?
[86] And why does that reference matter?
[87] So PewDiePie is this really popular YouTube personality.
[88] He has the most subscribers of anyone on YouTube.
[89] Some people think he's offensive.
[90] Some people really like him.
[91] He's got this whole fan base.
[92] And a few months ago, his fans started sort of spamming this phrase, subscribe to PewDiePie in an attempt to kind of keep him from being eclipsed by another account that was going to have more followers than him.
[93] So it sort of became this competition and then it became this joke and now subscribe to PewDiePie is just kind of like a thing that people say on certain parts of the internet.
[94] It's just kind of like a signifier like I understand the internet, you understand the internet, this is how we're going to signal to each other that we understand the internet.
[95] And this is what he's signaling in saying that.
[96] Yeah, so I have that in my head, and then I see all these other signs that something is weirdly kind of internet-y about all of this.
[97] Like, there's this post on 8chan, which is kind of like a scummy message board that lots of extremists and weirdos go on.
[98] And in the post, the gunman links to the Facebook stream before it happens.
[99] The Facebook stream that he will record of the massacre itself.
[100] Exactly.
[101] And then he pastes a bunch of links to copies of his manifesto.
[102] He has a 74-page manifesto that he wrote.
[103] And some of the stuff was fairly standard, hard-right ideology.
[104] Very fascist, very white nationalist.
[105] Muslims are kind of like the primary target for white nationalists around the world. He calls them invaders, says they're taking over.
[106] You know, this is a sort of classic white nationalist trope.
[107] And then there was all this kind of meta-humor, saying that he was radicalized by video games, which is another thing that internet extremists love to sort of troll the media with.
[108] Like, you know, he posted previews of his gun on Twitter.
[109] The whole thing just kind of felt like it just set this shooting up as like almost an internet performance.
[110] Like it was native to the internet.
[111] And it was born out of, and aimed into, this culture of extremely concentrated internet radicalism.
[112] But underneath it all is white nationalism, white supremacy, whatever you want to call it, a kind of racism that has always existed.
[113] So why does the internet's role in this feel especially different to you?
[114] I want to make clear that like this is not just a tech story, right?
[115] There's a real core of anti-Muslim violence here, Islamophobia, far-right ideology.
[116] That's all very, very important.
[117] And we should focus there.
[118] But I think there's this other piece that we really need to start grappling with as a society, which is that there's an entire generation of people who have been exposed to radical extremist politics online, who have been fed a steady diet of this stuff.
[119] It's transformed by the tools that the internet provides.
[120] So I've talked to a lot of white nationalists, unfortunately, and, you know, when I ask them how they got into this, a lot of them will say, I found a couple of videos on YouTube, and then, you know, I found some more videos on YouTube, and it kind of started opening my eyes to this ideology, and pretty soon you're a white nationalist.
[121] And that's different from historically how extremism has been born.
[122] I mean, you know, if you go to the library and you, like, take out a book about World War II, right, as you're about to finish it, like, the librarian doesn't say, here, here's a copy of Mein Kampf, you might like this.
[123] There's not this kind of like algorithmic nudge toward the extremes that really exists on social media and has a demonstrated effect on people.
[124] Walk me through this algorithmic nudge.
[125] I want to make sure I understand what you're referring to.
[126] This is pretty specific to YouTube, but that's where a lot of this stuff happens.
[127] So on YouTube, there's this, you know, recommendations bar.
[128] And after a video plays, another one follows it.
[129] And historically, the way the algorithm that chose which video came next worked is that it would try to keep you on the site for as long as possible.
[130] It would try to maximize the number of videos you watched, the amount of time you spent, which would maximize the ad revenue.
[131] Right, it would maximize lots of things.
[132] And so it turned out that what kept people on the site for longer and longer periods of time was gradually moving them toward more extreme content.
[133] You start at a video about, you know, spaceships and you'd end on something that was questioning, you know, whether the moon landing was a hoax.
[134] Or you'd start at a video about some piece of U.S. history and, you know, five videos later, you're at kind of a 9/11 conspiracy theory video.
[135] Just these kind of like gradual tugs toward the stuff that the algorithm decides is going to keep you hooked.
[136] And in a lot of cases, that means making it a little more extreme.
[137] And what's the white nationalist version of this nudge?
[138] I mean, there's a ton of white nationalism on YouTube.
[139] You know, from the conversations I've had with people in this movement, it's sort of central to how these ideas spread.
[140] Like, you start watching some videos about politics.
[141] Maybe they're about Trump.
[142] Then you start watching some videos by sort of more fringy kind of far-right characters.
[143] And all of a sudden, you are watching someone's video who is espousing open white nationalism.
[144] And you're not exactly sure how you got there, but you keep watching.
[145] And for some percentage of people, you internalize that.
[146] So it's a kind of computer-driven on-ramp or onboarding.
[147] Yeah, and this has been studied.
[148] Like, this is a well-documented phenomenon. YouTube has done some things to try to fix the algorithm and, you know, make it so that it's not sending you down these rabbit holes, but it's still a pretty observable effect.
[149] And what's your understanding of why these platforms didn't act years ago to police, to delete, these hate-filled videos, this content that, through the algorithmic nudges you described, directs people further and further toward extremism?
[150] They had no reason to.
[151] I mean, they were making a lot of money.
[152] They saw their responsibility as providing a platform for free speech.
[153] They were very hesitant to kind of seem like they were censoring certain political views.
[154] They were committed to free speech.
[155] And I think that's kind of the original sin that's baked into all of this.
[156] That's, like, part of how this was born: this idea that we just provide the platform.
[157] And if people signal to us that they like something, we'll show them more of it.
[158] And maybe we'll show them something that pushes the envelope a little bit more.
[159] And we're not optimizing for truth.
[160] We're not optimizing for things that we think are healthy for people.
[161] We're just giving them what they want.
[162] And they're trying to change that now, some of them.
[163] There's a reckoning now where these platforms have come to understand that this is the role that they've played and they're trying to correct it.
[164] But there's a lot of people who have already been sucked up into this world, who have been radicalized, and who may not be coming back.
[165] It's going to be very, very tricky to slow the thing that has been set into motion.
[166] And I don't even know if it's possible.
[167] At this point.
[168] Yeah.
[169] These platforms have played, and are playing, a pivotal role in how these extremist groups gather momentum and share their ideas and coalesce into real movements and grow.
[170] And, like, that's the part that I don't think they've completely reckoned with and that I don't think we've completely reckoned with.
[171] I think we're still sort of coming to terms with the fact that there's this pipeline for extremism and we know how it runs, we know where it happens, we know who's involved, and we know that sometimes it has these devastating, tragic consequences.
[172] What I've been thinking is just how inevitable this feels.
[173] What do you mean?
[174] I've been watching these people in these kind of dark corners of the Internet multiplying and hardening and becoming more extreme.
[175] And, like, it was inevitable.
[176] This is the nightmare, right?
[177] This is the worst possible version of something that could happen and be broadcast on the Internet.
[178] And it's not getting better.
[179] And it's going to be with us for a long time.
[180] But it also strikes me that in a way, and in a pretty awful way, this gunman and the way he has approached this massacre is kind of reflecting back how the internet functions.
[181] Because I'm thinking about him making a video of this attack, which in a sense means he's making content that feeds that loop that we're discussing, perhaps feeds this algorithm that possibly fed him, that he's basically putting something back into the system.
[182] Yeah, and I saw this happening on these platforms, like in real time.
[183] You mean on Friday?
[184] Yeah, so if you went onto 8chan, which is the website where all this stuff was posted, the comments below this post were all about, you know, let's save these videos so that we can re-upload them somewhere else in case 8chan gets taken down.
[185] Let's spread this, let's seed this all over the internet.
[186] I mean, there's no doubt in my mind that this guy was very aware of how his video and his manifesto would kind of filter through the internet and get refracted and picked up and analyzed.
[187] This was a very deliberate act, not only of murder and violence, but also of media creation.
[188] I mean, this was, in a way, like, engineered for internet virality.
[189] And then it did go viral.
[190] Yes, Twitter and Facebook and YouTube, all the platforms tried to take down the video as soon as it popped up, but it just kept popping back up.
[191] It's very hard to contain, so it's still out there.
[192] I mean, yeah, I'm looking right now at something posted, you know, six hours ago.
[193] It's the video of the shooting, and it's still up.
[194] And I don't think it'll ever fully disappear.
[195] Kevin, thank you very much.
[196] Thank you.
[197] Here's what else you need to know today.
[198] Ethiopian officials say that information retrieved from the data and voice recorders on the Boeing jetliner that crashed last Sunday establishes further similarities to an earlier crash of the same Boeing model in Indonesia.
[199] The officials did not specify the similarities, but the disclosure is another indication that the causes of the two crashes may be related.
[200] The Ethiopian crash led to the worldwide grounding of the Boeing jet, a 737 Max, whose automated flight control system is believed to have contributed to the Indonesian crash.
[202] That system is now a focus of the investigation into the Ethiopian crash.
[203] And the Times reports that a campaign by the Trump administration to prevent foreign governments from using Chinese telecommunications equipment, especially equipment made by Huawei, is failing.
[204] Several U.S. allies, including Britain, Germany and India, have rejected the White House's argument that Chinese technology poses a national security threat that could potentially allow China's government to disrupt their communications, and are refusing the U.S. request to ban the equipment in their countries.
[205] That's it for The Daily.
[206] I'm Michael Barbaro.
[207] See you tomorrow.