Betrayal: Weekly XX
[0] I'm John Walczak, host of the new podcast, Missing in Arizona.
[1] And I'm Robert Fisher, one of the most wanted men in the world.
[2] We cloned his voice using AI.
[3] In 2001, police say I killed my family and rigged my house to explode.
[4] Before escaping into the wilderness.
[5] Police believe he is alive and hiding somewhere.
[6] Join me. I'm going down in the cave.
[7] As I track down clues.
[8] I'm going to call the police and have you removed.
[9] Hunting.
[10] One of the most dangerous fugitives in the world.
[11] Robert Fisher.
[12] Do you recognize my voice?
[13] Listen to Missing in Arizona every Wednesday
[14] on the iHeartRadio app, Apple Podcasts, or wherever you get your favorite shows.
[15] The Podium is back with fresh angles and deep dives into Olympic and Paralympic stories you know, and those you'll be hard-pressed to forget.
[16] I did something in 88 that hasn't been beaten.
[17] Oh, gosh, the U.S. Olympic trials is the hardest and most competitive meet in the world.
[18] We are athletes who are going out there, smashing into each other, full force.
[19] Listen to The Podium on the iHeart app or your favorite podcast platform, weekly and every day during the Games, to hear the Olympics like you've never quite heard them before.
[20] In 2020, in a small California mountain town, five women disappeared.
[21] I found out what happened to all of them, except one, a woman known as Dia, whose estate is worth millions of dollars.
[22] I'm Lucy Sheriff.
[23] Over the past four years, I've spoken with Dia's family and friends, and I've discovered that everyone has a different version of events.
[24] Hear the story on Where's Dia?
[25] Listen on the iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts.
[26] Hey guys, it's Andrea Gunning with some big betrayal news.
[27] I have been on location with some of the people you heard in season two, Ashley Avea and their family, to shoot a docuseries for Hulu.
[28] I'll let you know when the docuseries is available on Hulu later this year.
[29] We're also excited to announce that Betrayal will become a weekly series starting this summer.
[30] Thanks to your support of this podcast, we'll be able to bring you many real-life stories of betrayal, making this community even stronger.
[31] So if you've been thinking about sharing your story, now is the time.
[32] Email us at betrayalpod@gmail.com.
[33] That's betrayal P-O-D at gmail.com.
[34] I want to share some news that affects parents and children everywhere.
[35] Our second season of Betrayal focused on families destroyed by child sexual abuse material,
[36] also called CSAM.
[37] The National Center for Missing and Exploited Children has reviewed over 322 million images and videos of child sexual exploitation.
[38] It's hard to wrap your head around that.
[39] It's why we couldn't stay away from the topic last season.
[40] It's also been a big issue in Washington recently.
[41] Betrayal producer Carrie Hartman has been following developments.
[42] Carrie, I know you watched it.
[43] What did you see?
[44] Yeah, I watched it.
[45] It was fascinating.
[46] The Senate Judiciary Committee, they subpoenaed five CEOs of some of the biggest tech companies, Discord, Snap, Meta, X, you know, formerly Twitter, and TikTok.
[47] And the committee wants to advance several bills that address online safety for children.
[48] And this hearing, it got a ton of publicity.
[49] And at the beginning, Senate Judiciary Chair Dick Durbin explained how the committee was feeling.
[50] These apps have changed the ways we live, work, and play. But as investigations have detailed, social media and messaging apps have also given predators powerful new tools to sexually exploit children. Your carefully crafted algorithms can be a powerful force on the lives of our children. Today we'll hear from the CEOs of those companies. Their constant pursuit of engagement and profit over basic safety have all put our kids and grandkids at risk.
[51] But the tech industry alone is not to blame for the situation we're in.
[52] Those of us in Congress need to look in the mirror.
[53] This was a major issue for two New York Times reporters that you talked with earlier this season.
[54] Yeah, why don't we actually revisit that interview with Gabriel Dance and Michael Keller?
[55] We spoke with people who said that as early as 2000, tech companies knew this was a very serious problem, and were doing nothing to solve it.
[56] In 2009, when they introduced scanning technology, we knew that it could be effective in helping stem the problem.
[57] Still, tech companies were not using it.
[58] I would say if you talk with most technology policy people, their answer would be technology companies don't have that much pressure to get rid of harmful content on their platform because Section 230 of the Communications Decency Act shields technology companies from any liability for content that users post.
[59] Can you explain more about what Section 230 does?
[60] Okay, so Section 230 means any lawsuit holding a tech company liable for damages won't go anywhere.
[61] They have immunity.
[62] So if Facebook, Discord, Snapchat, or X is storing or transmitting images of CSAM, for example, parents can't hold the company responsible. And try to imagine if it was your child's photo, and if that child was tricked into sending it.
[63] But Section 230 was passed almost 30 years ago back in 1996.
[64] No one could have imagined back then TikTok or Instagram, or even sextortion.
[65] People still had their photos developed at the drugstore.
[66] And I have to tell you how real this is.
[67] I mean, this happened to a close friend of mine, to her child.
[68] You take a vulnerable kid and a savvy adult with no conscience and no barriers.
[69] Right.
[70] So why was there a hearing now?
[71] It seems in recent months that frustration with tech's immunity is just getting bigger on both sides of the aisle.
[72] And look, this isn't the first time Congress has summoned tech leaders for a shaming session.
[73] But I was really curious.
[74] Was this more than a shaming session?
[75] So I reached out to Politico Technology reporter Rebecca Kern.
[76] She was in the room for this whole thing.
[77] And she shared some of her thoughts.
[78] Oh, interesting.
[79] I've been covering efforts in Congress to regulate social media companies and how they handle kids' online safety issues.
[80] Typically, there's a lot of posturing from the senators, but in the room, the emotion was very palpable, because this time the committee members invited families whose children have died
[81] as a result, they say, of content they've been exposed to on the platforms.
[82] A number of children have committed suicide over cyberbullying, and over a new phenomenon that I know you guys have covered in the podcast called sextortion, where organized criminal groups create fake accounts that pose as other children, extort illicit images from children, and then extort them financially.
[83] Oh my gosh.
[84] Yeah.
[85] And the committee chair, Dick Durbin, co-sponsored the Stop CSAM bill.
[86] That bill would hold platforms responsible if they host CSAM or make it available.
[87] And you're probably thinking, well, who would make those images available?
[88] But haven't you ever searched for something? Like, you just took up skiing recently, right?
[89] So you want to see more images of skiing.
[90] And then the platform's algorithm recommends more content because they think that you like that.
[91] Well, it does the same thing with nefarious and dangerous content.
[92] And Senator Ted Cruz went after meta on exactly that point.
[93] Mr. Zuckerberg.
[94] In June of 2023, the Wall Street Journal reported that Instagram's recommendation systems were actively connecting pedophiles to accounts that were advertising the sale of child sexual abuse material.
[95] In other words, this material wasn't just living on the dark corners of Instagram.
[96] Instagram was helping pedophiles find it by promoting graphic hashtags, including #pedowhore and #preteensex, to potential buyers.
[97] Instagram also displayed the following warning screen to individuals who were searching for child abuse material.
[98] These results may contain images of child sexual abuse.
[99] And then you gave users two choices.
[100] Get resources or see results anyway.
[101] In what sane universe is there a link for "see results anyway"?
[102] How did Mark Zuckerberg respond to that?
[104] There's no good answer for that.
[105] But here's what he said.
[106] Well, because we might be wrong.
[107] We try to trigger this warning, or we tried to, when we think that there's any chance that the results...
[108] Okay, you might be wrong.
[109] Here's more from Rebecca Kern.
[110] Tech companies will admit it, and it is for sure not something they want on their platforms.
[111] They don't want to be hosting CSAM and they take great efforts to remove it.
[112] And I will give them credit.
[113] They invest millions of dollars into AI and machine learning to detect it early.
[114] But it's still there, and it gets spread across multiple platforms.
[115] These companies are self-policing and self-reporting, but we're depending on them to find it and shut it down.
[116] It's interesting that you bring that up, because a senator from Rhode Island, Senator Sheldon Whitehouse, commented exactly on that issue.
[117] We are here in this hearing because as a collective, your platforms really suck at policing themselves.
[118] In my view, Section 230 is a very significant part of that problem.
[119] Listen, there were great soundbites from senators, but that doesn't translate to policy, right?
[120] Rebecca Kern pointed out that Section 230 served an important purpose, at least for a while.
[121] We wouldn't be leading the globe in these innovations without Section 230 and allowing them to flourish without lawsuits.
[122] But a lot of other senators are saying, okay, we allow them to flourish and grow.
[123] Now we need to rein them in.
[124] And we're an outlier in the whole globe.
[125] Europe has been able to pass regulations and hold them accountable.
[126] And so a lot of people said it's time to take away this quote-unquote sweetheart deal that we have given to tech companies.
[127] I'm John Walczak, host of the new podcast, Missing in Arizona.
[128] And I'm Robert Fisher, one of the most wanted men in the world.
[129] We cloned his voice using AI.
[130] In 2001, police say I killed my family.
[131] First mom, then the kids.
[132] And rigged my house to explode.
[133] In a quiet suburb.
[134] This is the Beverly Hills of the Valley.
[135] Before escaping into the wilderness.
[136] There was sleet and hail and snow coming down.
[137] They found my wife's SUV.
[138] Right on the reservation boundary.
[139] And my dog flew.
[140] All I could think of is him sniping out of some tree.
[142] But not me. Police believe he is alive and hiding somewhere.
[143] They won't tell you anything.
[144] For two years, I've traveled the nation.
[145] I'm going down in the cave.
[146] Tracking down clues.
[147] They were thinking that I picked him up and took him somewhere.
[148] If you keep asking me this, I'm going to call the police and have you removed.
[149] Searching for Robert Fisher.
[150] One of the most dangerous fugitives in the world.
[151] Do you recognize my voice?
[152] Join me. An exploding house.
[153] The hunt.
[154] Family annihilation.
[155] Today.
[156] And a disappearing act.
[157] Listen to Missing in Arizona every Wednesday on the iHeartRadio app, Apple Podcasts, or wherever you get your favorite shows.
[158] The Podium is back with fresh angles and deep dives into Olympic and Paralympic stories you know and those you'll be hard-pressed to forget.
[159] I did something in 88 that hasn't been beaten.
[160] The U.S. Olympic trials is the hardest and most competitive meet in the world.
[161] We are athletes who are going out there smashing into each other full force.
[162] Listen to The Podium on the iHeart app or your favorite podcast platform weekly and every day during the Games to hear the Olympics like you've never quite heard them before.
[163] In the summer of 2020, in the small mountain town of Idlewild, California, five women disappeared in the span of just a few months.
[164] Eventually, I found out what happened to the women, all except one, a woman named Lydia Abrams, known as Dia.
[165] Her friends and family ran through endless theories.
[166] Was she hurt hiking?
[167] Did she run away?
[168] Had she been kidnapped?
[169] I'm Lucy Sheriff.
[170] I've been reporting this story for four years and I've uncovered a tangled web of manipulation, estranged families and greed.
[171] Everyone, it seems, has a different version of events.
[172] Hear the story on Where's Dia?, my new podcast from Pushkin Industries and iHeart Podcasts.
[173] Listen on the iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts.
[174] Did any comments stand out to you while you were watching?
[175] There were a lot of them, but this one from Amy Klobuchar kind of got me: When a Boeing plane lost a door in midflight several weeks ago, nobody questioned the decision to ground a fleet of over 700 planes.
[176] So why aren't we taking the same type of decisive action on the danger of these platforms when we know these kids are dying?
[177] She has a point, right?
[178] When everyone is worried about their own physical safety, boom, it's done.
[179] And I got to tell you about another moment that really took the room down.
[180] And that was when Meta CEO Zuckerberg testified that social media doesn't really do any harm to kids.
[182] With so much of our lives spent on mobile devices and social media, it's important to look into the effects on teen mental health and well -being.
[183] I take this very seriously.
[184] Mental health is a complex issue, and the existing body of scientific work has not shown a causal link between using social media and young people having worse mental health outcomes.
[185] Did he say that with a straight face?
[186] He did, and there was some laughter.
[187] I mean, it was one very short moment of levity, but, you know, it's just so absurd.
[188] You don't have to be a social scientist or a psychologist to understand that social media impacts kids a lot.
[189] Was there anyone there defending the work of technology companies?
[190] I mean, there are ways they've enriched all of our lives.
[191] Can you even remember life before Amazon?
[192] Life before Amazon?
[193] You mean going to a store and having to wait in line?
[194] No. No, of course not.
[195] But all kidding aside, some senators mentioned that and did praise these companies for adding some value to society.
[196] But this hearing wasn't set up for pushback.
[197] It was really about these tech companies being told draconian measures are coming if you don't do a better job.
[198] But outside of this, there is an advocacy group for the tech companies called NetChoice, and they are pushing back pretty hard.
[199] They have filed several lawsuits against states that are tired of waiting for the federal government to do something.
[200] Can you give me an example?
[201] Sure.
[202] There's one: NetChoice is suing the Ohio Attorney General over the Social Media Parental Notification Act.
[203] This law requires companies to obtain parental consent before individuals younger than 16 can use platforms like Facebook, Instagram, YouTube, and Snapchat.
[204] So NetChoice does not support any of these bills being pushed by the Judiciary Committee.
[205] What do they support?
[206] Well, free speech is what they hang their hat on.
[207] Free speech, free speech all the way.
[208] But one thing that they did promote that will be familiar to our season two listeners is to hold child abusers accountable by prosecuting more of them.
[209] You know, far too many reports of CSAM offenses are not investigated, not prosecuted. We talked about this, Andrea, like they're triaged, right?
[210] There's not enough law enforcement to go after all the people that are breaking these laws.
[211] And when they're able to go after them, they can prosecute them and at least put them away for some kind of prison time.
[212] But despite NetChoice, there was some movement on one of the bills, called KOSA, or the Kids Online Safety Act.
[213] Now, this bill wouldn't repeal Section 230.
[214] So we asked Rebecca Kern, what would it do?
[215] That one specifically would hold tech companies accountable by imposing a duty of care for them to make sure that their recommendation systems, their algorithms, do not recommend, quote-unquote, harmful content.
[216] That is the key word: how do you define harmful?
[217] For them, they're saying it's suicide content, eating disorder content.
[218] And Rebecca pointed out that some groups are worried about KOSA moving forward.
[219] Progressive LGBTQ groups are saying we're worried that this bill also empowers state attorneys general to sue over harmful content and how they would define content, maybe like trans content or LGBTQ content, that these communities would want to see on the platforms.
[220] Some conservative-leaning AGs may want to take that down.
[221] And so they said this could have an inadvertently negative impact for certain vulnerable youth.
[222] While the CEOs were on the hot seat, and, you know, the day before they were called to the hearing, they did make some concessions that are worth mentioning.
[223] Here is X CEO Linda Yaccarino.
[224] X supports the Stop CSAM Act.
[225] The Kids Online Safety Act should continue to progress, and we will support the continuation to engage with it and ensure the protections of the freedom of speech.
[227] And, you know, Snap CEO Evan Spiegel also came out in support of KOSA.
[228] And look, it's not everything, but maybe it's a start.
[229] Here's Politico's Rebecca Kern again.
[230] These are the constant battles these platforms have to deal with, between privacy, which is such a strong protection in our country, and free speech, and other protections, and safety.
[231] And there's, you know, no real mandate to put safety first.
[232] Do you think Section 230 has a chance of being repealed?
[233] I asked Rebecca that question, and she seemed pretty doubtful.
[234] You know, it's not just the law passing, but it's the lawsuits that would follow, and how many years would it be caught up in court?
[235] I can't help but wonder, did this hearing make a difference?
[236] If you're asking, will it create more safety for children online?
[237] I think there is a reason for hope.
[238] There was some movement we've never seen before, but people need to keep applying pressure because that does make a difference.
[239] Thank you to Politico's Rebecca Kern for her insight.
[240] And thanks to our listeners for your support of Betrayal.
[241] Remember, if you want to share your story for the new weekly series of Betrayal coming this summer, email us at betrayalpod@gmail.com.
[242] That's betrayal P-O-D at gmail.com.
[243] Betrayal is a production of Glass Podcasts,
[244] a division of Glass Entertainment Group, in partnership with iHeart Podcasts.
[245] The show was executive produced by Nancy Glass and Jennifer Faison.
[246] Hosted and produced by me, Andrea Gunning.
[247] Written and produced by Carrie Hartman.
[248] Also produced by Ben Federman.
[249] Associate producer, Kristen Malkyrie.
[250] Our iHeart team is Ali Perry and Jessica Kreinschek.
[251] Audio editing and mixing by Matt Dalbekio.
[252] Betrayal's theme composed by Oliver Baines. Music library provided by Mide Music.
[253] And for more podcasts from iHeart, visit the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.