Betrayal: Weekly XX
[0] I'm John Walczak, host of the new podcast, Missing in Arizona.
[1] And I'm Robert Fisher, one of the most wanted men in the world.
[2] We cloned his voice using AI.
[3] In 2001, police say I killed my family and rigged my house to explode.
[4] Before escaping into the wilderness.
[5] Police believe he is alive and hiding somewhere.
[6] Join me. I'm going down in the cave.
[7] As I track down clues.
[8] I'm going to call the police and have you removed.
[9] Hunting.
[10] One of the most dangerous fugitives in the world.
[11] Robert Fisher.
[12] Do you recognize my voice?
[13] Listen to Missing in Arizona every Wednesday
[14] on the IHeart Radio app, Apple Podcasts, or wherever you get your favorite shows.
[15] The Podium is back with fresh angles and deep dives into Olympic and Paralympic stories you know, and those you'll be hard-pressed to forget.
[16] I did something in '88 that hasn't been beaten.
[17] Oh, gosh, the U.S. Olympic Trials is the hardest and most competitive meet in the world.
[18] We are athletes who are going out there, smashing into each other, full force.
[19] Listen to The Podium on the IHeart app or your favorite podcast platform weekly and every day during the Games to hear the Olympics like you've never quite heard them before.
[20] In 2020, in a small California mountain town, five women disappeared.
[21] I found out what happened to all of them, except one, a woman known as Dia, whose estate is worth millions of dollars.
[22] I'm Lucy Sheriff.
[23] Over the past four years, I've spoken with Dia's family and friends, and I've discovered that everyone has a different version of events.
[24] Hear the story on Where's Dia?
[25] Listen on the IHeart Radio app, Apple Podcasts, or wherever you listen to podcasts.
[26] Topics featured in this episode may be disturbing to some listeners.
[27] Please take care while listening.
[28] This is a crime that thrives in the shadows, and people needed to hear what was actually going on.
[29] One of the biggest problems reporting on this is nobody wants to hear about the problem because of how awful it is.
[30] I'm Andrea Gunning, and this is a betrayal bonus episode.
[31] In episode four, you heard from New York Times reporters Michael Keller and Gabriel Dance, as they spoke about their 2019 investigative piece on child sexual abuse material.
[32] It's called "The Internet Is Overrun With Images of Child Sexual Abuse.
[33] What Went Wrong?"
[34] If you have a chance to read it, look it up because it's superb investigative reporting.
[35] There's a link in our show notes to the article.
[36] We wanted to dive a little deeper.
[37] How are crimes being reported?
[38] What role are technology companies playing?
[39] And how is the government responding?
[40] Here's Michael Keller.
[41] This is a crime that thrives in the shadows.
[42] And people needed to hear what was actually going on.
[43] Reporter Gabriel Dance.
[44] One of the biggest problems reporting on this is nobody wants to hear about the problem because of how awful it is.
[45] And to be honest, we were nervous.
[47] But we know from season one of Betrayal that our audience is genuinely interested in letting the light in on dark stories.
[48] One of Michael and Gabriel's most important revelations was that our legislators don't really want to hear about it.
[49] State lawmakers, judges, and members of Congress have avoided attending meetings and hearings when it was on the agenda.
[50] They just aren't showing up.
[51] One of the big things was the failures of the federal government to live up to its own promises that it made around 2008 to develop a strong national response.
[52] The government had not really followed through on its grand plans.
[53] The high-level position at DOJ was never fully created.
[54] The strategy reports that were supposed to come out on a regular basis? There have only been two of them over the last decade.
[55] You know, the number of these reports has risen, but federal funding to these special task forces has largely remained flat.
[56] I mean, there's so much of these offenses going on.
[57] There's so many reports.
[58] There's not enough police in the United States seemingly to solve this problem.
[59] ICAC, or the Internet Crimes Against Children Task Force, is working on the front lines every day.
[60] There's at least one ICAC in every state.
[61] Hearing what they go through daily, it's truly harrowing.
[62] What I will say is speaking with members of these ICAC task forces, I was always in such admiration and awe of their work.
[63] Dealing with this kind of content and this kind of horrible crime, and really the survivors and how hurt some of them are, sometimes for the rest of their lives... We spoke with an ICAC guy in Kansas who had served in the Iraq War,
[64] and he said that he would almost rather go back and serve another tour than continue in his position dealing with these types of crimes.
[65] He had said that he worked in ICAC, and then to take a break, he did a tour in Iraq, and then came back and felt like, all right, now I can go back and keep doing this work.
[66] I'm in awe of these law enforcement officers.
[67] They choose this work because they want to save children.
[68] But it really is akin to war in an emotional sense.
[69] Some people, like, viscerally cannot deal with this issue because it is truly one of the most awful crimes that we commit against one another.
[70] And the descriptions... Michael and I probably read hundreds and hundreds of search warrants and legal documents that would describe videos and photos and the acts in them. One strategy ICAC uses to write reports is to turn the video off when documenting the audio, and turn the sound off when documenting the video, because it's too much to handle at the same time.
[71] These ICAC task force members can only do so much with what they are given.
[72] They triage the cases, often prioritizing the youngest victims.
[73] But they can only investigate about a third of all the tips because the caseload is so overwhelming.
[74] Of course, predators are the biggest problem and bear the most responsibility.
[75] But we need to acknowledge there's another culpable participant when it comes to the explosion of CSAM: technology companies.
[76] Before the Internet, the U.S. Postal Service was the leading reporter of CSAM and was stopping the dissemination of material via the mail.
[77] However, with millions of images plaguing the Internet, is it time we started holding technology companies responsible for their lack of action?
[78] What I do think they're certainly responsible for is allowing this problem to get very serious before they started to take responsibility for their role in it.
[79] As early as 2000, tech companies knew this was a very serious problem and were doing nothing to solve it.
[80] So I would say that tech companies are certainly responsible for allowing the problem to spiral out of control in the early part of this century, and I'm encouraged that, from what we've seen, several of them have begun to take the problem much more seriously.
[81] If the technology exists to root out criminal behavior, why aren't tech companies deploying it?
[82] Microsoft, along with Professor Hany Farid, came up with a technology called PhotoDNA.
[83] This takes a database of image fingerprints, and whenever a photograph gets uploaded to an internet platform, that company can scan it to see if it's in the database of verified illegal imagery.
[84] And so that's the main tool that tech companies use, which is great because it's largely automated and easy to use.
[85] It's been around for a long time.
[86] So a company like Facebook or others that are doing automated scanning can generate a large number of reports just through this software. They also generally have a team of human moderators that review it, which serves an important role of verifying what was found and also escalating it if there's evidence of actual hands-on abuse of a child.
[87] If you talk with most technology policy people, one perspective that you hear a lot is that technology companies don't have that much pressure to get rid of harmful content on their platforms because they don't face any legal liability for it.
[88] You know, technology companies, of course, would say we have every reason to get rid of this harmful content.
[89] We don't want to be a place for exploitation.
[90] The legislative solutions that have been proposed so far try to go after Section 230 of the Communications Decency Act, which shields technology companies from any liability for content that users post.
[91] There have been a few proposals to try and change that, both from Democrats and Republicans.
[92] It's been one of the few areas of bipartisan support.
[93] Those proposals have not gone through, but over the last few years, you do see people trying to find ways to increase the incentives for tech companies to clamp down on this more.
[94] Let's take Facebook's parent company Meta as an example.
[95] Meta is the leading reporter of child sexual abuse material to the National Center for Missing and Exploited Children.
[96] Almost all of the illegal content gets transmitted through their Messenger app.
[97] That isn't necessarily because it has the most CSAM; it's because they are using PhotoDNA and finding offenders.
[98] Currently, Messenger does not encrypt its messages.
[99] However, Meta has announced that this year, it will make end-to-end encryption the default.
[100] Meta executives have admitted that encryption will decrease its ability to report CSAM, saying, if it's content we cannot see, then it's content we cannot report.
[101] The Virtual Global Task Force, a consortium of 15 law enforcement agencies, is practically begging Meta not to do it.
[102] Meta's CEO Mark Zuckerberg stated, encryption is a powerful tool for privacy, but that includes the privacy of people doing bad things.
[103] When billions of people use a service to connect, some of them are going to misuse it for truly terrible things, like child exploitation, terrorism, and extortion.
[104] The more communications are encrypted, the less capable tech companies are of using these automated scanning tools to find and report CSAM.
[105] That's a much broader conversation that should be had, and oftentimes it gets shorthanded to "everything should be encrypted" or "nothing should be encrypted."
[106] I'm John Walczak, host of the new podcast Missing in Arizona.
[107] And I'm Robert Fisher, one of the most wanted men in the world.
[108] We cloned his voice using AI.
[109] In 2001, police say I killed my family.
[110] First mom, then the kids.
[111] And rigged my house to explode.
[112] In a quiet suburb.
[113] This is the Beverly Hills of the valley.
[114] Before escaping into the wilderness. There was sleet and hail and snow coming down. They found my wife's SUV right on the reservation boundary, and my dog, Blue. All I could think of is he's going to snipe me out of some tree. But not me. Police believe he is alive and hiding somewhere. For two years, they won't tell you anything. I've traveled the nation. I'm going down in the cave. Tracking down clues. They were thinking I picked him up and took him somewhere. If you keep asking me this, I'm going to call the police and have you removed. Searching for Robert Fisher, one of the most dangerous fugitives in the world.
[115] Do you recognize my voice?
[116] Join me. An exploding house.
[117] The hunt.
[118] Family annihilation.
[119] Today.
[120] And a disappearing act.
[121] Listen to Missing in Arizona every Wednesday on the IHeart Radio app, Apple Podcasts, or wherever you get your favorite shows.
[122] The Podium is back with fresh angles and deep dives into Olympic and Paralympic stories you know.
[123] And those you'll be hard pressed to forget.
[124] I did something in '88 that hasn't been beaten.
[125] Oh gosh.
[126] The U.S. Olympic Trials is the hardest and
[127] most competitive meet in the world.
[128] We are athletes who are going out there, smashing into each other, full force.
[129] Listen to The Podium on the IHeart app or your favorite podcast platform weekly and every day during the games to hear the Olympics like you've never quite heard them before.
[130] In the summer of 2020, in the small mountain town of Idlewild, California, five women disappeared in the span of just a few months.
[131] Eventually, I found out what happened to the women, all except one.
[132] A woman named Lydia Abrams, known as Dia.
[133] Her friends and family ran through endless theories.
[134] Was she hurt hiking?
[135] Did she run away?
[136] Had she been kidnapped?
[137] I'm Lucy Sheriff.
[138] I've been reporting this story for four years, and I've uncovered a tangled web of manipulation, estranged families and greed.
[139] Everyone, it seems, has a different version of events.
[140] Hear the story on Where's Dia?, my new podcast from Pushkin Industries and IHeart Podcasts.
[141] Listen on the IHeart Radio app, Apple Podcasts, or wherever you listen to podcasts.
[142] The encryption conversation is often complicated by this particular issue, which gets held out as a wedge issue by both sides, by law enforcement on one hand, and by tech companies and people who believe that all communications should be encrypted on the other. I think there can be more nuance to that conversation, particularly when you come to platforms and social media networks where adults can engage with children.
[143] Just by definition, children are at such a disadvantage.
[144] Something that's important to note as well is that many of these social networks also give predators an opportunity to engage with children in a way that was never before possible.
[145] You have documented cases of grown men going on Facebook, pretending to be children, and then sexually extorting other children into sending images of themselves, after which they continue to force them to produce more and more imagery.
[146] Gabe is referring to what is commonly known as sextortion:
[147] tricking a young person into sending an image and then essentially blackmailing the child into sending more with threats of exposure or harm.
[148] The encryption debate won't be solved anytime soon, but it's clear that protecting children from abuse is not enough of a reason to compel for-profit tech companies to consider changing their approach.
[149] Social media websites and messaging platforms are ground zero for the production and sharing of CSAM.
[150] Through the dark web and encrypted groups, appalling communities have developed.
[151] Take the site Welcome to Video.
[152] This darknet site, hosted in South Korea, amassed more than 250,000 child exploitation videos in only two years.
[153] Welcome to Video created a community of users who bought and traded appalling content.
[154] Videos were sold for Bitcoin.
[155] According to an April 2022 article in Wired magazine, the site's upload page instructed: "Do not upload adult porn."
[156] The last two words were highlighted in red for emphasis.
[157] The page also warned that uploaded videos would be checked for uniqueness, meaning only new material would be accepted.
[158] In a lot of online groups, these images are like a currency.
[159] In order to gain access to people's collections, it's required that you produce new, never-before-seen images.
[161] So you also have that dynamic where people that want to get images are pushed into abusing children and documenting that abuse and sharing it online.
[162] Welcome to Video was brought down by a joint effort between the FBI and the South Korean government.
[163] It was the result of dogged detective work and internet sleuthing.
[164] And while it was hosted in South Korea, many of its users were United States citizens.
[165] There are so many people who don't realize just how big this problem is and how close to home it actually hits.
[166] So with all of this information we have, what can we do to make the public more aware of this problem?
[167] What I came away with as the clearest call to action from our reporting is spreading awareness and educating parents and encouraging them to educate their children.
[168] This is not necessarily a problem that tech companies can solve, and they certainly don't seem determined to solve it.
[169] We spoke with a few online child safety experts who had a few pieces of advice.
[170] One brought up the idea that, you know, the industry is not in the business of promoting safety, and she said that she would love to see, whenever she buys a cell phone, a pamphlet that comes along with it explaining how to keep your children safe with this device.
[171] The key thing is to not keep abuse secret.
[172] The less we talk about this, the more the offenders have an advantage.
[173] They thrive on the feelings of guilt and blame that a child may have if they were tricked into sending a nude photograph.
[174] That shame is really what gives them more power.
[175] If you or someone you know has been a victim of sextortion, you can get help.
[176] Email the National Center for Missing and Exploited Children or call 1-800-THE-LOST.
[177] Many thanks to Michael Keller and Gabriel Dance from the New York Times.
[178] See our show notes for a link to their article.
[179] "The Internet Is Overrun With Images of Child Sexual Abuse.
[180] What Went Wrong?"
[181] Since we spoke with Michael and Gabriel, Meta has been caught up in controversy again.
[182] A recent investigation by the Wall Street Journal and researchers at Stanford University and the University of Massachusetts Amherst found that Instagram was helping to link predators and people selling child sexual abuse material. Its algorithms connected accounts offering to sell illicit sex material with people seeking it.
[183] According to the Wall Street Journal, Instagram allowed users to search for terms that its own algorithms know may be associated with illegal material.
[184] And it's not like they were hiding it.
[185] Instagram enabled people to search hashtags like #pedowhore and #preteensex, then connected them to accounts advertising CSAM for sale.
[186] If that wasn't troubling enough, a pop-up screen warned users, "These results may contain images of child sexual abuse," and then offered users options.
[187] One of them was see results anyway.
[188] Meta has set up an internal task force to address the problem.
[189] If you would like to reach out to the Betrayal team, email us at BetrayalPod@gmail.com.
[190] That's Betrayal P-O-D at gmail.com.
[191] To report a case of child sexual exploitation, call the National Center for Missing and Exploited Children's CyberTipline
[192] at 1-800-THE-LOST.
[193] If you or someone you know is worried about their sexual thoughts and feelings towards children, reach out to stopitnow.org.
[194] In the United Kingdom, go to stopitnow.org.uk.
[195] These organizations can help.
[196] We're grateful for your support.
[197] And one way to show support is by subscribing to our show on Apple Podcasts.
[198] And don't forget to rate and review Betrayal.
[199] Five -star reviews go a long way.
[200] A big thank you to all of our listeners.
[201] Betrayal is a production of Glass Podcasts, a division of Glass Entertainment Group, in partnership with IHeart Podcasts.
[202] The show was executive produced by Nancy Glass and Jennifer Faison.
[203] Hosted and produced by me, Andrea Gunning, written and produced by Carrie Hartman.
[204] Also produced by Ben Fetterman, associate producer, Kristen Malkuri.
[205] Our IHart team is Ali Perry and Jessica Kreincheck.
[206] Special thanks to our talent Ashley Litton and production assistant Tessa Shields.
[207] Audio editing and mixing by Matt Dalvecchio.
[208] Betrayal's theme composed by Oliver Baines.
[209] Music library provided by Mide Music.
[210] And for more podcasts from IHeart, visit the IHeart Radio app, Apple Podcasts, or wherever you get your podcasts.
[211] I'm John Walczak, host of the new podcast, Missing in Arizona.
[212] And I'm Robert Fisher, one of the most wanted men in the world.
[213] We cloned his voice using AI.
[214] In 2001, police say I killed my family and rigged my house to explode.
[215] Before escaping into the wilderness.
[216] Police believe he is alive and hiding somewhere.
[217] Join me. I'm going down in the cave.
[218] as I track down clues.
[219] I'm going to call the police and have you removed.
[220] Hunting.
[221] One of the most dangerous fugitives in the world.
[222] Robert Fisher.
[223] Do you recognize my voice?
[224] Listen to Missing in Arizona every Wednesday on the IHeart Radio app, Apple Podcasts, or wherever you get your favorite shows.
[225] The Podium is back with fresh angles and deep dives into Olympic and Paralympic stories you know, and those you'll be hard-pressed to forget.
[226] I did something in '88 that hasn't been beaten.
[227] Gosh, the U.S. Olympic Trials is the hardest and most competitive meet in the world.
[228] We are athletes who are going out there, smashing into each other, full force.
[229] Listen to The Podium on the IHeart app or your favorite podcast platform weekly and every day during the games to hear the Olympics like you've never quite heard them before.
[230] In 2020, in a small California mountain town, five women disappeared.
[231] I found out what happened to all of them, except one, a woman known as Dia, whose estate is worth millions of dollars.
[232] I'm Lucy Sheriff.
[233] Over the past four years, I've spoken with Dia's family and friends, and I've discovered that everyone has a different version of events.
[234] Hear the story on Where's Dia?
[235] Listen on the IHeart Radio app, Apple Podcasts, or wherever you listen to podcasts.