The Daily
[0] It's not altogether uncommon in investigations for us to turn up information that is shocking and disturbing.
[1] The challenge is when in the course of your reporting, you come across something so depraved and so shocking that it demands attention.
[2] People have to know about this, but nobody wants to hear about it.
[3] How do you tell that story?
[4] From the New York Times, I'm Michael Barbaro.
[5] This is the Daily.
[6] Today, a months-long Times investigation uncovers a digital underworld of child sexual abuse imagery that is hiding in plain sight.
[7] In part one, my colleagues, Michael Keller and Gabriel Dance, on the almost unfathomable scale of the problem, and just how little is being done to stop it.
[8] It's Wednesday, February 19th.
[9] Gabriel, tell me how this investigation first got started.
[10] So it all began with a tip.
[11] Early last year, we got a tip from a guy, and this guy was looking up bullets.
[12] Bullets for guns.
[13] Bullets for guns.
[14] And he was actually looking for a very specific weight of bullet on Microsoft's Bing search engine.
[15] And while he was looking up these bullets, he started getting results of children being sexually abused. And the guy was horrified. He didn't understand why he was seeing these images, couldn't stand to look at them, and so he reported it to Bing and heard nothing. And full of outrage, he wrote us a letter to our tip line and described what he was looking for, described the kind of images he was getting back.
[16] He says, New York Times, can you please look into this for me?
[17] So I actually emailed my colleague, Michael Keller, and I asked him to look into it.
[18] So in the tip, they had listed the search terms they had used.
[19] So we tried to replicate it.
[20] We put it into Bing.
[21] And we saw a flash on the screen of images of children.
[22] And so I wrote back to Gabe and said, yeah, this checks out.
[23] You could type words into Bing and get back explicit images of children.
[24] So this is not the dark web.
[25] This is just a regular commonplace search engine.
[26] That's right.
[27] So a few things went through my head.
[28] First of all is, well, we need to document this because, as most of us know, things on the internet change all the time.
[29] It's possible they came down soon after, et cetera.
[30] But we were very unsure what kind of legal liabilities we had when it came to documenting anything regarding this imagery.
[31] So we emailed Dave McCraw, who's the head counsel at the New York Times, to ask them, you know, what can we do?
[32] What can't we do?
[33] How do we go about investigating where this imagery is online?
[34] And doing it without somehow violating the law?
[35] That's right.
[36] And David wrote back immediately and said, there is no journalistic privilege when investigating this, you have no protections, and you have to report it immediately to the FBI.
[37] And so that's what we did.
[38] We submitted a report both to the FBI and also to the National Center for Missing and Exploited Children, which is the kind of government-designated clearinghouse for a lot of these reports.
[39] And what did they tell you?
[40] They weren't able to tell us anything about the report we submitted, but it made us wonder: how common is it that they get these kinds of reports, how many images are out there, how many images are flagged to them each year?
[41] And they were able to tell us that.
[42] And that number was frankly shocking to us.
[43] The handful of images that the tipster stumbled across was just a tiny portion of what the National Center sees every single day.
[44] They told us that in 2018 alone, they received over 45 million images and videos.
[45] Wow.
[46] Forty-five million images a year.
[47] That's more than 120,000 images and videos of children being sexually abused every day, every single day.
[48] But to put it in perspective, 10 years ago, there were only 600,000 images and videos reported to the National Center.
[49] And at that time, they were calling it an epidemic.
[50] So in just a decade, it went from 600,000 reports to 45 million.
[51] Yeah.
[52] So we were really curious.
[53] How does a problem called an epidemic 10 years ago become such a massive issue now?
[54] And one of the first things we learned was that we did try and tackle it back then.
[55] The national epidemic of grown men using the internet to solicit underage teens for sex.
[56] As more and more parents become aware of the dangers, so have lawmakers in Washington.
[57] In the mid to late 2000s, as the internet was being more widely adopted, this issue of online child sexual abuse really got on the radar of Congress.
[58] There was even a bill being introduced by Debbie Wasserman Schultz.
[59] The internet has facilitated an exploding multi-billion dollar market for child pornography.
[60] There were multiple hearings.
[61] I'm here today to testify about what many of my law enforcement colleagues are not free to come here and tell you.
[62] They heard from law enforcement.
[63] We are overwhelmed, we are underfunded, and we are drowning in a tidal wave of tragedy.
[64] They were overwhelmed with the number of reports that were coming in.
[65] Unless and until the funding is made available to aggressively investigate and prosecute possession of child pornography, federal efforts will be hopelessly diluted.
[66] They, in many cases, had the tools to see where offenders were, but not enough staff to actually go out and arrest the perpetrators.
[67] We don't have the resources we need to save these children.
[68] Hello, thank you for inviting me to speak today.
[69] My name is Alicia Kozakiewicz.
[70] A Pittsburgh resident, I am 19 years old, and a sophomore in college.
[71] There was also a very chilling testimony from a victim of child sexual abuse.
[72] For the benefit of those of you who don't know, don't remember those headlines, I am that 13-year-old girl who was lured by an internet predator and enslaved
[73] by a sadistic, pedophile monster.
[74] In the beginning, I chatted for months with Christine, a beautiful red-haired 14-year-old girl, and we traded our school pictures.
[75] Too bad that hers were fake.
[76] Yeah, Christine was really a middle-aged pervert named John.
[77] And he had lots of practice at his little masquerade because he had it all down, the abbreviations, the music, the slang, the clothes, he knew it all.
[78] John slash Christine was to introduce me to a great friend of hers.
[79] This man was to be my abductor, my torturer.
[80] I met him on the evening of January 1st, 2002.
[81] Imagine.
[82] Suddenly, you're in the car, terrified, and he's grabbing onto your hand and crushing it, and you cry out, but there's no one to hear.
[83] In between the beatings and the raping, he will hang you by your arms while beating you, and he will share his prize pictures with his friends over the internet.
[84] Bogeyman is real, and he lives on the net.
[85] He lived in my computer, and he lives in yours.
[86] While you are sitting here, he's at home with your children.
[87] Task forces all over this country are poised to capture him, to put him in that prison cell with the man who hurt me. They can do it.
[88] They want to do it.
[89] Don't you?
[90] Alicia's testimony really moved people.
[91] People responded.
[92] And eventually, about a year later, the bill passes unanimously.
[93] And what is this new law supposed to do?
[94] So this law, the 2008 Protect Our Children Act, is actually a pretty formidable law with some pretty ambitious goals.
[95] It was supposed to, for the first time ever, secure tens of millions of dollars in annual funding for investigators working on this issue.
[96] And it required the Department of Justice to really study
[97] the problem and put out reports to outline a strategy to tackle it.
[98] And what has happened since this ambitious law was put into place?
[99] In many ways, the problem has only gotten worse.
[100] Even though the number of reports has grown into the millions, funding is still pretty much what it was 10 years ago.
[101] And even though the government was supposed to do these regular reports, they've only done two in 10 years.
[102] And that's an issue because if you don't have anyone studying the size of the problem, you don't have anyone raising alarm bells and saying, hey, we need more resources for this.
[103] So they didn't study it, and they didn't increase the funding in a way that would match the scale at which the problem is growing.
[104] Yeah, it really looked like they had this law in 2008, and then everyone really took their eye off the ball.
[105] So we called.
[107] We called Congresswoman Debbie Wasserman Schultz, who was one of the leading proponents of this bill to figure out what happened.
[108] And we were really gobsmacked to hear that she was unaware of the extent of the failings.
[109] She sends a letter to Attorney General William Barr, laying out a lot of our findings, requesting an accounting.
[110] As far as we know, she hasn't heard anything.
[111] So even the person who co -wrote the law was unaware that it was pretty much failing.
[112] She knew about the funding, but even she didn't know the things had gotten this bad.
[113] And we wanted to figure out, now 10 years later, what kind of effect is this having on law enforcement, on the people on the ground working these cases?
[114] And what we heard from them really shows what happens when everyone looks away.
[115] We'll be right back.
[116] Gabriel, Michael, what happens when you start reaching out to law enforcement?
[117] So Mike and I probably spoke with
[118] about 20 different Internet Crimes Against Children task forces.
[119] And these are the law enforcement agencies responsible for investigating child sexual abuse.
[120] To be honest, most of the time, as an investigative reporter, generally law enforcement, I mean generally anybody, but especially law enforcement, is not particularly interested in speaking with us.
[121] Usually they don't see much to gain.
[122] But surprisingly, when it came to this issue, they were not only willing to speak with us, but they were interested in speaking with us.
[123] Why do you think that was?
[124] It's partly because we were using the right terminology.
[125] And by that, I mean, we were asking them about child sexual abuse imagery, not child pornography.
[126] And what exactly is the distinction?
[127] Well, legally, they're basically the same.
[128] But for the people who deal with this type of crime day in, day out, who see these images and who speak with survivors, they know that calling it child pornography implies a bunch of things that are generally incorrect.
[129] One is that it equates it with the adult pornography industry, which is legal and made up of consenting adults, whereas children cannot consent to the sexual behavior.
[130] The other thing is that the crimes depicted are heinous, and that looking at each one of them is essentially looking at a crime scene.
[131] And for that reason, they prefer to call it child sexual abuse imagery.
[132] But I think they also talk to us, because for the law enforcement who deal with this, they very much feel that the issue is undercovered and underdiscussed, especially considering the seriousness of the crime.
[133] I mean, we had the kind of coordination and cooperation from these law enforcement officers that we rarely see from anybody whatsoever.
[134] They let us go out on raids with them.
[135] They provided us with detailed reports.
[136] They talked to us about specific cases.
[137] They were really, really open because they felt that, as a nation, we were essentially turning our backs on children.
[138] And once you have that access, what do you find?
[139] What we learned talking with all these law enforcement officers was just how this world operates.
[140] A lot of the departments told us about the high levels of turnover they have.
[141] We had one commander who said, back when he was an investigator, he saw one image that was so shocking to him
[142] that he quit and served a tour in Iraq.
[143] That was his escape.
[144] To even see this imagery once changes your life.
[145] And these people look at it all day long.
[146] And then on top of that, they have to deal with the fact that, you know, their funding has not gone up whatsoever.
[147] They're being funded at a level that means they can't do proactive investigations anymore.
[148] So they're not sitting in chat rooms trying to catch what many of them think are the worst criminals.
[149] They're unable to do anything, really, other than respond to the utter flood of reports coming in from the National Center.
[150] And because of the sheer number of reports coming in, they're forced to make some truly terrible decisions on how to prioritize who they're investigating.
[151] Like what?
[152] The FBI told us that in addition to, of course, looking for anybody who's in immediate danger, they have to prioritize infants and toddlers.
[153] When we first got into this, we didn't even consider the idea of infants.
[154] And to hear that the FBI, and later the LAPD, would say the same thing, that they were prioritizing infants and toddlers and essentially not able to respond to reports of anybody older than that.
[155] I mean, it really left us pretty speechless.
[156] So we're learning a lot from speaking with law enforcement, but they also only have a small part of the picture.
[157] We also are thinking about this tip that we got, where the tipster was able just to find these images on search engines very easily.
[158] And so we still have this big question of how easy is it to come across this material?
[159] And how do you go about answering that question?
[160] When we initially started trying to replicate the tipster's search, we had to stop because we didn't want to be searching for these illegal images.
[161] But then we discovered a technology that would allow us to keep investigating without having to look at the images.
[162] It's called PhotoDNA, and it essentially creates a unique digital fingerprint for each image.
[163] And as the National Center is receiving these reports and identifying these illegal images of child sexual abuse, they keep track of these digital fingerprints, and other companies can tap into that database to see: is this image that I have in that database of known images?
[164] And we stumbled upon a service from Microsoft that actually allows you to do just that.
[165] It'll take a URL of an image and tell you if it matches an image already in that database.
[166] So we wrote a computer program that would replicate that initial search from the tipster and record all of the URLs. The key part of it, though, was that it blocked all images.
[167] So suddenly you can now search for these images and find where they live on the Internet without illegally looking at them.
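The matching step described here — fingerprint an image, check membership in a database of known fingerprints, never render the image itself — can be sketched in a few lines. This is a minimal illustration, not Microsoft's actual PhotoDNA API: real PhotoDNA computes a robust perceptual hash that survives resizing and re-encoding, while SHA-256 below is only an exact-match stand-in, and the function and variable names are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in for PhotoDNA: the real system uses a perceptual hash
    # that tolerates edits; SHA-256 only matches byte-identical files,
    # but it illustrates the match-without-viewing workflow.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of fingerprints of known illegal images,
# standing in for the National Center's hash list.
KNOWN_FINGERPRINTS = {fingerprint(b"<bytes of a known image>")}

def is_known_image(image_bytes: bytes) -> bool:
    # Record only match / no-match; the bytes are never displayed.
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS

print(is_known_image(b"<bytes of a known image>"))  # True: a match
print(is_known_image(b"unrelated bytes"))           # False: no match
```

The reporters' program worked analogously: fetch each search-result URL, block the image from ever being rendered, and log only whether the fingerprint matched the known database.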
[168] Right.
[169] So we started doing this test across a number of different search engines.
[170] And to sit there and watch as the program starts ticking away, and to see the first time it flashes that it got a match, and then to see it match again.
[171] Bing, from Microsoft. And match again: Yahoo.
[172] And match again: another one called DuckDuckGo.
[173] I mean, I think both our jaws dropped.
[174] We didn't find any on Google, but on other search engines powered by Microsoft data, we found a number of matches.
[175] Dozens, dozens of images.
[176] You're saying the company that came up with this technology to help track these images is the very same company whose search engine allows people to find them, view them, keep viewing them.
[177] Right.
[178] As soon as we started running this computer program using Microsoft to detect illegal imagery, we were finding illegal imagery on Microsoft.
[179] So it was very clear that they were not using their own services to protect users from this type of image.
[180] And Microsoft told us that this was the result of a bug that they fixed, but about a week later, we re-ran the test and found even more images.
[181] Wow.
[182] So it sounds like you're finding all these images on Microsoft-powered search engines.
[183] So how much of this in the end is just a Microsoft problem?
[184] It's more complicated than that.
[185] We were performing this limited search on images just to test search engines.
[186] But we're also, at the same time, reading over 10,000 pages of court documents.
[187] These are search warrants and subpoenas from cases where people were caught trading this material.
[188] And what becomes clear pretty quickly is that every major tech company is implicated.
[189] In page after page after page, we see social media platforms: Facebook, Kik, Tumblr.
[190] We see cloud storage companies, Google Drive, Dropbox.
[191] We read one case where an offender went on a Facebook group to ask other people for advice, saying, hey, how do I share this?
[192] How do I get access to it?
[193] How do I get access to children?
[194] And they say, download Kik to talk to the kids, and download Dropbox, and we can share links with you.
[195] Wow.
[196] And from these documents, it becomes clear that the companies know.
[197] I mean, there's so many cases for each company that they all know.
[198] And so the question becomes, what are they doing about it?
[199] Tomorrow on The Daily, a victim's family asks that same question.
[200] We'll be right back.
[201] Here's what else you need to know today.
[202] President Trump has granted clemency to a new round of high-profile individuals, including Bernard Kerik, the former New York City Police Commissioner, who was convicted on eight felony charges, including tax fraud.
[203] Michael Milken, the financier convicted of insider trading.
[204] And Rod Blagojevich, the former governor of Illinois, who was convicted of trying to sell Barack Obama's Senate seat after he became president.
[205] Yes, we have commuted the sentence of Rod Blagojevich.
[206] He served eight years in jail.
[207] It's a long time.
[208] Asked about Blagojevich, whom Trump ordered released from prison four years early,
[209] the president mentioned his wife's public pleas on Fox News and Blagojevich's appearance on Trump's NBC TV show, Celebrity Apprentice.
[210] I watched his wife on television.
[211] I don't know him very well.
[212] I met him a couple of times. He was on, for a short while, The Apprentice years ago.
[213] He seemed like a very nice person.
[214] And on Tuesday, after a new poll showed him surging in the Democratic primary race, Michael Bloomberg qualified for the next presidential debate scheduled for tonight in Las Vegas.
[215] The poll, conducted by NPR, PBS, and Marist College, showed Bloomberg with 19% support, putting him in second place behind Bernie Sanders.
[216] That's it for the Daily.
[217] I'm Michael Barbaro.
[218] See you tomorrow.