Armchair Expert with Dax Shepard
[0] Welcome, welcome, welcome to armchair expert experts on expert.
[1] I'm Monica Lily Padman.
[2] I'm joined by Dax Shepard.
[3] Dax Randall Shepard.
[4] Oh, so sorry, I always forget your middle name.
[5] It doesn't seem like that would be your middle name.
[6] That it is, and I love it.
[7] Who are you named after?
[8] Where does that one come from?
[9] Oh, my uncle.
[10] Your uncle Randy.
[11] Yep.
[12] Yeah.
[13] You know what I just realized?
[14] Oh, my God, what?
[15] I have an uncle.
[16] You're named after your uncle Randy, and my middle name is named after my aunt Lily.
[17] Your Uncle Lily.
[18] You have a ruin.
[19] I'm sorry.
[20] Uncle Lily would be funny, though.
[21] Well, sure.
[22] But I was angry because my uncle Randy was Randy, not Randall.
[23] And as like a six-year-old, I'm like, no, Randall, all Randys are really Randall.
[24] I'm like, I want to be Randy.
[25] You wanted to be Dax Randy Shepard?
[26] Well, I'm glad it's Randall.
[27] Okay.
[28] Randall sounded like Randolph Hearst.
[29] Yeah, that's cool.
[30] Oh, okay.
[31] Today we have Barry Meier on.
[32] Barry Meier is a Pulitzer Prize-winning writer and former New York Times journalist who wrote the 2003 nonfiction book Pain Killer: A "Wonder" Drug's Trail of Addiction and Death.
[34] So he was way ahead of this curve that we're now seeing, all these great investigations of the opioid epidemic.
[35] He was like on it.
[36] Yeah.
[37] And he has a new book called Spooked: The Trump Dossier, Black Cube, and the Rise of Private Spies.
[38] I find this to be so interesting.
[39] It's a multi-billion-dollar-a-year business, people hiring spies to dig up dirt on people, sometimes to intimidate them.
[40] It feels like a TV show, but it's real life.
[41] I know.
[42] And some of his discoveries are kind of troubling because I was hoodwinked.
[43] Yeah.
[44] I don't like being hoodwinked.
[45] My main job is to prevent being hoodwinked.
[46] So please enjoy Barry Meier.
[47] Also, a quick reminder: in July, it's time for Spotify.
[48] So get that app on your phone because we will be there exclusively in July.
[49] Wondery Plus subscribers can listen to Armchair Expert early and ad-free right now.
[50] Join Wondery Plus in the Wondery app or on Apple Podcasts.
[51] Or you can listen for free wherever you get your podcasts.
[52] He's an armchair expert.
[53] Hi.
[54] Hi.
[55] Hi.
[56] Hi.
[57] Hi.
[58] Seven times a week and half the time we're to blame.
[59] So, yeah.
[60] Well, I'll blame you.
[61] And I just want to let you know that I have a 23-year-old daughter, and this is her favorite podcast.
[62] Oh, that's flattering.
[63] She sounds like a very intelligent young woman.
[64] She's extraordinarily intelligent.
[65] So thank you very much.
[66] You've earned me like 30 seconds of grace with my daughter.
[67] Good.
[68] I have two daughters, and I know how fleeting being cool in their eyes is.
[69] It's very quick.
[70] It's like the cherry blossom.
[71] You must enjoy them while they're in bloom.
[72] Indeed.
[73] So, you know, what do you think about the fact that your first huge book, well, that could be inaccurate.
[74] Your humongous book, Pain Killer, which ultimately led to some congressional hearings, you started investigating in 2001.
[75] And what's it like for you to now see this second swell of interest in documentaries being made and another book?
[76] How does it feel to have been at the vanguard of that and to see where it's growing into?
[77] You know, it's pretty remarkable because, like most journalists...
[78] I've written about this problem.
[79] I've brought this problem to light, so it's over.
[80] I've solved it.
[81] Give yourself a pat on the back.
[82] Like the day after Pain Killer came out in 2003, I realized I didn't solve it.
[83] And along the way, I kept learning more and more things.
[84] I mean, Pain Killer, from a publishing perspective, was a complete disaster.
[85] I mean, it probably sold a couple hundred copies.
[86] It went out of print after the first year.
[87] It never even was published as a paperback.
[88] And it was sort of a blessing in disguise because all the publication rights came back to me. So then I was able to republish the book in 2018.
[89] And the reason for republication was, as you noticed, this whole second wave had started.
[90] All the litigation had started.
[91] And I also had come into possession of this extraordinary document, which was this prosecution memo, this memo that had been drawn up by the prosecutors who were going after Purdue in the mid-2000s, in which they laid out their reasons for, like, wanting to indict the top executives of the company and bring very serious charges against them, you know, send them to jail.
[92] So it's been one of those stories where it just keeps going and going and going to the point where, as you mentioned, I'm now a character in books that are being written about this.
[93] Yeah.
[94] Yeah.
[95] Have you watched the HBO documentary that's currently out, Crime of the Century?
[96] He's in it.
[97] You're in it.
[98] Yeah, yeah.
[99] Yes, yes.
[100] And it's funny, I'm in it, albeit briefly.
[101] And it looks like I'm in like a cold storage locker, kind of the white room.
[102] Like a forensic scientist.
[103] Yeah, yeah, that John Oliver broadcasts from the white cube.
[104] And it was... I'm up in western Massachusetts.
[105] We rented, or Alex Gibney, the director, rented this really weird-looking house up here that's, like, built like a gingerbread cottage.
[106] It's gigantic.
[107] It was like one of these things that, like, you know, some nutty person built.
[108] Well, you know, obviously the parallels between big pharma and big tobacco are not hard to draw.
[109] And similarly, they're very incentivized to have kind of an impenetrable fortress in rings of denial around them.
[110] And so, yeah, it's not shocking to me that the inner circles of it are penetrated as someone perhaps starts cooperating or whistleblowers come to the forefront.
[111] Like, yeah, the true depth of it, and I just want to say, so I'm a recovering addict.
[112] I've abused oxycodone, OxyContin, you name it.
[113] So I think I have a unique lens I'm viewing it through.
[114] And Monica and I brought this up because she's reading the other book.
[115] Empire of Pain.
[116] So as we started talking about this, I have kind of this bizarre libertarian bent, I guess, even though I'm not a libertarian, where I say, you know, if people abuse things, that is on the abuser.
[117] I'm an addict.
[118] I take responsibility for me. If it weren't this thing, I'd find another thing to be addicted to.
[119] I just love being addicted to stuff.
[120] I don't think it's coca farmer's responsibility.
[121] I think it's mine.
[122] All that to say, when I start seeing the level of deception and conspiracy as far as them manipulating the medical literature, them manipulating consensus in the medical community, employing doctors and experts, paying off perhaps people at the FDA.
[123] Now, I don't think it's a fair fight.
[124] I don't think it's just an addict against an appealing product.
[125] I think it's a real conspiracy that a human individual can't compete with the force of that.
[126] Well, you know, when I first started investigating OxyContin and Purdue Pharma, and this is going back 20 years to 2001, that is, in fact, a thing that really attracted me to the story.
[127] People that abuse drugs have done so forever.
[128] They're going to find the best drug they can to abuse, and why wouldn't you?
[129] If you're a connoisseur, you're going to go find the best thing.
[130] They're drug addicts.
[131] That doesn't mean they're lazy.
[132] Exactly.
[133] Well, it doesn't mean they don't have taste.
[134] When I started digging into the world, this is many years ago, of the Sackler brothers who own Purdue Pharma, the company and its marketing techniques, I was stunned and appalled.
[135] I've been a newspaper reporter most of my career, so all the cynicism, the I've-seen-it-before, comes with that.
[136] But this is nothing that I'd ever seen before.
[137] And the depth of manipulation, as you say, the depth of misrepresentation, the depth of lying.
[138] And the crime of lying to make money was really appalling.
[139] So did Purdue Pharma, the maker of OxyContin, seek to addict America and sell its drugs exclusively to people who are drug addicts?
[140] I don't think so.
[141] But did they create a huge basket of supply?
[142] And did they push the use of this drug onto patients who subsequently became dependent on or addicted to it?
[143] And did they do that ruthlessly, without any scientific merit, and continue to do it even after there was clear-cut evidence that this product had run amok?
[144] That's the issue they're being held accountable for.
[145] And that's the reason why their names are being taken off museums and medical schools.
[146] Yeah.
[147] And to your point, so again, there's all these buckets of people.
[148] I think, was it the number 300,000 since '01 or something, that have OD'd from opioids?
[149] Is that the number?
[150] I think it's much higher than that.
[151] I don't have the most current number on the top of my head, but I think it's well over a half a million people.
[152] Okay, so it's a staggering number.
[153] The scope of it is so enormous that I don't think people have looked at it from 30,000 feet and seen like, oh, this is a mega tragedy.
[154] And whereas I have no sympathy for myself or other addicts like me, who I have great sympathy for is the people whose gateway to this was a doctor they trusted.
[155] And of course we should listen to doctors.
[156] And we should even at times go against our intuition and trust that they know what's best.
[157] So there's this really sacred relationship that was exploited, and that's ruthless.
[158] And I agree with you.
[159] I don't think anyone sat there twirling their mustache going like, ooh, let's get everyone addicted to our product.
[160] But if you say I want the sales of this drug to be $7 billion next year and you fail to recognize what that would represent, that everyone on it was taking vast amounts of it, it's kind of just a willful neglect of the truth.
[161] Yes, and to your point, they also suppressed information showing that this drug was getting onto the street.
[162] They knew from their sales reps that doctors were reporting back, that people were abusing it, people were diverting it to the street, and they never, at any point in time, said, okay, time out, let's figure out what's going on here.
let's try to correct this while it's unfolding, because the money was so staggering, the money was so fantastic, and they tried to torpedo anyone, including myself, who was investigating the company, trying to bring the truth to light.
[164] One of the moments of the documentary that I watched that was just so unbelievably unethical is when they were discussing patients who were showing major signs of addiction in their behavior, and they're calling it drug seeking, which is what it is. Their response was to double down with another round of the doctors talking to camera, going, what you're seeing is not drug seeking.
[165] You're seeing someone that's still undermedicated. Up the dose.
[166] Like, talk about a fork in the road.
[167] And they actually chose to say, no, those are all signs of being undermedicated.
[168] Give them way more.
[169] It's just like, whoa, did they drink the Kool-Aid at that point?
[170] Yeah, did they believe it?
[171] I know some of the doctors must have thought that was true.
[172] One of the thoughts that I came away with from writing this book back in the early 2000s: people always think, and I often think, that money is a great motivator, that, okay, these doctors, they're going out and they're shilling this drug for money, they're paying for their kids' private schools or for fancy foreign vacations or whatever the case may be.
[173] There's a grain of truth to that, and in some cases, that is the situation.
[174] But there is something even more powerful, and we see it all the time in our society, and that is ideology.
[175] And what Purdue was able to do was inculcate many people within the medical profession with this ideology that pain was being undertreated, that patients were suffering because they weren't getting adequate pain medication.
[176] They took this drug that was intended primarily for use in end-of-life cancer treatment, severe pain for a few days, and basically ramrodded it into the big breadbasket of American medicine so that your local dentist, your local doctor, a sports medicine doctor, whatever, were prescribing it.
[177] And by their very nature, in order for you to get pain relief from this drug, an opioid drug, you have to keep upping the dose, upping the dose, and upping the dose as time marches on.
[178] And there are all sorts of horrible consequences.
[179] Addiction is but one of them.
[180] There's a whole range of consequences.
[181] I think it's relevant to point out, I guess, people's probable fantasy of what this thing is.
[182] There's only one version, right?
[183] Someone's just, like, gobbling as many as they can to get as fucked up as humanly possible, and that's that. Whereas in my case, I was coasting at a pretty manageable level where it wasn't very detectable to anybody.
[184] I was by no means falling out in grocery stores with my kids, but despite my, quote, responsible use of it, my tolerance is going up daily.
[185] So that element of it is not even like, I'm not even getting obliterated, but before long, holy shit, I need however many hundreds of milligrams of this.
[186] So that's the part that no one can manage.
[187] There is no playbook for your body getting immune to it and the requirement going up.
[188] That really is beyond anyone's control.
[189] You're absolutely right.
[190] And the flip side of that is that once you get really cranked up on a drug like OxyContin and you want to come off, the withdrawal is, as you probably know from your own experience, you know, unbearable, horrible. You've got flu-like symptoms.
[191] You're basically having DTs.
[192] And that became, even for pain patients who became dependent and addicted to these drugs, that became a real disincentive because who wants to go through that?
[193] Like, maybe I'm addicted, but wow, the last time I tried to get off of this, it was a nightmare.
[194] So maybe I'll just stay on it.
[195] Yeah, the really insidious part of it is exactly what you're saying.
[196] So over 16 years ago, I was an alcoholic, a full-blown alcoholic.
[197] I quit that.
[198] And then, of course, in the first week of quitting alcohol, there's the mental racket telling you, oh, you can do it responsibly.
[199] You can have two, right?
[200] That's one thing you're battling.
[201] But now if you add on another side of it, which is, like, physically you are in immense pain and anguish and you know that's the antidote, it's just a whole other level. That's what that drug offers that differs, I think, from most other things people are addicted to: it's downright gruesome to get off of.
[202] I interviewed a couple people from the military who had gotten very addicted to OxyContin because of wartime injuries in Iraq and Afghanistan. When people came out, they had lost a limb, they had gotten severely hurt.
[203] The medical doctors in the military were giving them OxyContin.
[204] And this ex-soldier, Shane, who was a wonderful human being, explained to me this decision point that he came to, which was that he loved his wife, he loved his kids, but he was taking so many drugs that he was just sitting on the couch all day, watching TV, getting really pissed off if they were interrupting him or wanting his attention.
[205] And then just one day he decided, this has got to stop or I'm going to kill myself.
[206] And he just went cold turkey and went through this horrible ordeal, but emerged out the other side of it, thankfully, and regained his life, regained control over his life.
[207] Yeah.
[208] Okay.
[209] So when you were investigating Purdue, did they ever deploy anyone to mess with you?
[211] I mean, they have such resources.
[212] Did they put anyone on you?
[213] Which leads to your current book, Spooked: The Trump Dossier, Black Cube, and the Rise of Private Spies.
[214] I wonder, were they watching you?
[215] I have no doubt they were.
[216] There's a huge amount of documentation that is still under wraps in litigation.
[217] But I know that there are many documents in there.
[218] If you do a text search for Barry Meier, you're going to have enough reading for a couple of days.
[219] And if you do a text search for Kroll Associates, a very famous private spying firm, you're also going to get a lot of text.
[220] So basically, Kroll was looking at me. They were looking at other reporters.
[221] They were looking at people who...
[222] Can I interrupt you for one second?
[223] Is it Nick Kroll?
[224] Yeah, that's Nick Kroll's dad's company.
[225] It's Nick Kroll's dad.
[226] That's crazy.
[227] Yeah, it's Nick Kroll's dad.
[228] Can you believe it?
[229] Wow.
[230] Yeah.
[231] And we love Nick Kroll.
[232] So this just got really dynamic.
[233] Jules Kroll is his name; he essentially created the modern-day private spying industry.
[234] Wow.
[235] Yes.
[236] Wow.
[237] Yes.
[238] And if Nick's comedic acting career doesn't work out, and I hope it stays good.
[239] Yeah, he's amazing.
[240] We love him.
[241] He's a good guy.
[242] But if it doesn't work out, he's got an out.
[243] He can go work for his dad.
[244] His brother is working with his dad.
[245] They've since changed their name to K2 Integrity.
[247] A couple of years ago, the company that they ran, and I tell the story in Spooked, hired this broken-down British comedy producer.
[248] In Britain, they have what are called prank shows.
[249] They're kind of like a Candid Camera show where, like, you know, everyone is in on the joke except the person who's the target of it, you know, like Borat and all that kind of stuff.
[250] The term for that in England is a prank show.
[252] And so they came up with a brilliant idea.
[253] They were hired by someone with an interest in asbestos, that wonderful building material that kills people, to infiltrate these public health advocates who were lobbying to try to get asbestos banned in the developing world.
[254] So what they did was to have this former comedy producer present himself as a documentary filmmaker and infiltrate this group by suggesting he was going to make a big documentary exposing the dangers of asbestos.
[256] This guy, Rob Moore, who I write about in the book, decided this thing was so horrible, and he kind of became a double agent and started ratting them out to people.
[257] Oh, wow.
[258] It's a very wild story.
[259] Wow.
[260] Yeah, it's good.
[261] Man, so I guess the first thought that comes to mind when you tell me stories like this is the amount of money it would take to execute something like this.
[262] Like, just knowing the budget of a documentary film, the money's got to be vast.
[263] Like, whatever this lobbying group is that wants to keep asbestos in the developing world, it must be millions of dollars, right?
[264] I mean, do you have any sense of what, like, the price tag is for these?
[265] They're enormous.
[266] I didn't know what the specific price tag for this was, but let's talk about another private spying firm.
[267] This is the Israeli firm Black Cube.
[268] They were the ones who were investigating women who were accusing Harvey Weinstein of sexual misconduct.
[269] And they were being paid millions of dollars.
[270] And they were also offered, as a rider, an additional, I believe it was $3 million; that's in the book.
[271] I'm blanking out on the exact figure.
[272] If they succeeded in killing a story that the New York Times was writing about Harvey Weinstein.
[273] So the stakes are enormous.
[274] They're unlimited.
[275] Yeah.
[276] Wow.
[277] This industry, these people make millions of dollars.
[278] There's no real great estimate, but the best I could get is that it's like a one and a half billion dollar industry.
[279] And so the money is phenomenal.
[280] Okay.
[281] Again, not to get stuck on this.
[282] So I read, I want to say it was Ronan's book on Weinstein.
[283] Whatever book I read, there was a character in it who was employed by Black Cube.
[284] And he was, I don't know how to say this tactfully, but he was kind of a low-rent gumshoe.
[285] Igor.
[286] Yeah, yeah, yeah, yeah, yeah, Igor's my friend.
[287] Yeah, I liked him immediately from hearing that description.
[288] Yeah, so what shocked me when I was reading the description of this was, he ultimately kind of was turned into a double agent as well, as I recall?
[290] He kind of tips off Ronan and lets him know what's going on.
[291] I got to know Igor; a different slice of Igor is in my book.
[292] I kind of recount Ronan's story, but what's fantastic about Igor is that he is from Ukraine.
[293] He's like a Ukrainian Jew, much like Alexander Vindman, who testified during the first Trump impeachment.
[294] And they were kind of contemporaries.
[295] You know, they both came to the United States at the same time.
[296] And, like, Igor, yeah, he's sort of like a guy who, like, if you're an insurance company and you think someone is defrauding you by pretending they're sick or whatever, he'll go sit outside your house on a cold night eating pizza to make sure you're not out there weightlifting or something like that.
[297] Right, right.
[298] Doing wind sprints in the front yard.
[299] Yeah, once he caught wind of what was going on, he was incredibly horrified.
[300] His thought was, look, I came to the United States.
[301] This country welcomed us when we were escaping this horrible treatment in Eastern Europe.
[302] And here I am.
[303] I'm being used by this Israeli company to try to prevent the exercise of free speech and try to thwart the press from doing its job.
[304] In my book, he's sort of described as someone who had much more integrity than all these high-priced lawyers that were working for David Boies's law firm, that had signed off and were using Black Cube to gather dirt.
[305] And, you know, one of the things that struck me as I was working on the book with this case and in many other cases is that you have all these people that put on suits and ties or very nice dresses every day, go to work at law firms, and find it morally acceptable to employ con men like the schlumps and schmucks that work for Black Cube because they think that's how they can win their cases.
[306] Hey guys, guess what I did today?
[307] I hired these people who go out and pretend that they're someone that they're not and run this con game in order to try to trick people into, like, giving them information.
[308] Yeah, and the people being tricked are often victims of sexual abuse.
[309] Yes.
[310] So again, as is always the case, like, I get distracted by these other kinds of human inevitabilities.
[311] So the fact that you have this multi-million-dollar deal and you think you're hiring, like, this Black Cube, elite-espionage-type, CIA-level, MI5 thing, and ultimately the human propensity is to just hire another vendor to do stuff, who hires another vendor, and then ultimately there's cost savings within that.
[312] And then now you have a guy who they're not paying nearly enough to have allegiance to them.
[313] The incentives for him are not high enough.
[314] And then you think, wow, even this elite organization did the stupid thing that everyone does, which is, someone builds this great company and ultimately they decide to pay people $5 an hour.
[315] And then they're shocked that those people, A, did a bad job, B, weren't loyal, all these things.
[316] Like, there's something about I love that even in this type of a field, people are still trying to save a dollar, and then it ultimately unravels their whole thing.
[317] I mean, that's just a side note.
[318] It's arrogance.
[319] It's just complete arrogance and this feeling on their part that they can pull any shenanigan they want, any con they want, and get away with it.
[320] And they usually do.
[321] Yeah, but a lot of the activity is just patently illegal, right?
[322] Can you tell us some of the things that their operatives do that are just clearly illegal?
[323] You would think it's illegal.
[324] Your listeners will think it's illegal.
[325] I like to think it should be illegal, but it's not.
[326] If I con you out of money, I could go to jail.
[327] But if someone pays me to defraud you out of information, I'm good to go.
[328] Bob's your uncle.
[329] Yeah.
[330] We're living in a society that allows this kind of stuff to happen.
[331] And it really shouldn't.
[332] It would be really easy to make a lot of this illegal because most people think it should be illegal.
[334] So let's go through a couple of the cases because I'm dying to know like what issue they're employed to look into and then how one of these operations works on the ground.
[335] So tell us about the Trump dossier. Which is crazy, because when it was happening it felt so novel and exciting, I would have imagined I would have remembered the details for life, and here we are.
[336] I've already kind of forgotten what the Trump dossier was all about.
[338] Well, the Trump dossier falls into the general category of what's known as political opposition research, or oppo, right, where an investigator will go out on behalf of a candidate or a political party and dig up dirt on their opponent.
[339] And then it'll be used to muddy them up, smear them, or in some cases, reveal things that are very important to know about them.
[340] Fusion GPS, which is run by two former Wall Street Journal reporters, they were first hired by a major Republican businessman who was trying to prevent Donald Trump from getting the 2016 presidential nomination.
[341] He was a supporter of Marco Rubio.
[342] So they were first hired to dig up dirt on Trump.
[343] And they were mainly looking at his old business deals.
[344] I mean, anyone who's ever heard of Donald Trump or followed Donald Trump or lived in New York as I do and sort of had to see Donald Trump, you know, knew that this guy was like a schnook and a schmuck.
[346] Yeah, it's not hard.
[347] It's not hard, right?
[348] And he had like this big trail of lawsuits and all kinds of garbage.
[349] Weird associations with people and maybe a lot of mobsters would go to his casinos and blah, blah, blah, blah.
[350] So that's all kind of like cut-and-dried sort of stuff.
[351] Something that like any reporter could find out and do.
[352] But where things really get interesting is in the spring of 2016, when it's clear that Donald Trump is going to be the presidential nominee, the Republican donor who's been funding their efforts to date drops out.
[353] And these two former reporters now go to a law firm that represents the Hillary Clinton campaign.
[354] And they say to him, hey, guess what?
[355] We've been digging up a lot of stuff about Donald Trump.
[356] We think you guys can use it.
[357] We'd love to keep doing it.
[358] And maybe Russia would be a great place to look for info on Donald Trump.
[359] And the lawyer goes like, great, let's go for it.
[360] Yeah.
[361] Stay tuned for more Armchair Expert, if you dare.
[362] What's up, guys?
[363] It's your girl Keke, and my podcast is back with a new season.
[364] And let me tell you, it's too good.
[365] And I'm diving into the brains of entertainment's best and brightest, okay?
[366] Every episode, I bring on a friend and have a real conversation.
[367] And I don't mean just friends.
[368] I mean the likes of Amy Poehler, Kel Mitchell, Vivica Fox, the list goes on.
[369] So follow, watch, and listen to Baby, This Is Keke Palmer on the Wondery app or wherever you get your podcasts.
[371] We've all been there.
[372] Turning to the internet to self-diagnose our inexplicable pains, debilitating body aches, sudden fevers, and strange rashes.
[373] Though our minds tend to spiral to worst-case scenarios, it's usually nothing, but for an unlucky few, these unsuspecting symptoms can start the clock ticking on a terrifying medical mystery.
[374] Like the unexplainable death of a retired firefighter, whose body was found at home by his son, except it looked like he had been cremated, or the time when an entire town started jumping from buildings and seeing tigers on their ceilings.
[375] Hey listeners, it's MrBallen here, and I'm here to tell you about my podcast.
[376] It's called MrBallen's Medical Mysteries.
[377] Each terrifying true story will be sure to keep you up at night.
[378] Follow MrBallen's Medical Mysteries wherever you get your podcasts.
[379] Prime members can listen early and ad-free on Amazon Music.
[380] And so one of these reporters, Glenn Simpson, who's a character throughout the book, knows a former MI6 agent, Christopher Steele, whose name is probably familiar.
[381] And who...
[382] Do you remember him, Monica?
[383] He was the one who originally claimed that Trump had been peed on by some sex workers, and there was maybe a video of that and leverage.
[384] Okay.
[385] Yeah, so they hire him.
[387] And he's like some retired MI6 spook, who's now a private spy.
[388] And they had known each other.
[389] And Glenn says to him, like, go do this.
[390] I'll pay you.
[391] Blah, blah, blah, blah, blah.
[392] And so Steele dispatches a person he calls his collector, right?
[393] We don't know his identity at that time, who he is at that time, to go and collect damaging information about Donald Trump.
[394] And so the result of that was this so-called dossier, which was all this stuff: that there was, yes, this pee tape, that Michael Cohen, Trump's lawyer, had gone to Prague to meet with Russian agents, that other members of Trump's campaign were compromised.
[395] Admittedly, all these guys, they weren't like the most attractive people in the world, right?
[396] They were kind of awful.
[397] But nonetheless, as I sort of unwind the story of the dossier throughout the book, you begin to realize that this stuff that Christopher Steele and these two former reporters are shopping to journalists, to try to get them to write stories about the dossier, is, like, insane and ridiculous and untrue in most cases.
[398] Can I pause you?
[399] I want to make a public service announcement.
[400] Okay?
[401] Go right ahead.
[402] This is exactly why the fourth estate needs to be defended at all costs.
[403] I'm guessing you probably voted for Hillary in that election.
[404] You don't have to tell me. I can't imagine you're a Trump supporter.
[405] And yet you're presented with some facts that are pretty much saying the left's Russia investigation was probably, to some degree, a witch hunt.
[406] And you have an ethical obligation, that I'm blown away by, that you just got to tell the truth.
[407] As you find the truth, you must tell the truth, even if it is counter to whatever your political leanings might be.
[408] This is like, for us on the left, right, this is like dangerous shit right here.
[409] You're totally right.
[410] What it says to me is, like, we're as bad as the Hillary email dump. Both sides are fucking gnarly, and that's exactly why no one trusts each other.
[411] Okay, continue.
[412] I just wanted to point out that, like, you're going against what I would assume your interests are here at this point.
[413] Yes, because ultimately, my interest is the truth.
[414] I mean, I have no interest in perpetrating or continuing a false narrative.
[415] Whatever it is, you know, not this, not anything.
[416] And I think as journalists, yet I didn't vote for Trump.
[417] I think Trump's a horrible human being.
[418] But the way I feel has nothing to do with what I report or write about, right?
[419] I mean, those two things can exist very comfortably with each other.
[420] But that's why this is the most sacred thing for us in a democracy to protect.
[422] I'm just kind of pointing out.
[423] And you're absolutely right.
[424] And part of my motivation for writing the book, and I hope it's something that people take away as they're reading the book, is that it's precious.
[425] That right is precious.
[426] That trust is precious.
[427] And it can't be squandered.
[428] And this is a case where it was squandered.
[429] And it was actually squandered in such a way that Trump supporters, Trump himself, whomever was able to turn it and use it against people who were trying to get the truth about all of Trump's activities, not related to this.
[430] This guy, Peter Strzok, who used to be at the FBI and was involved in the Trump investigation, basically said that the narrative that was being promoted, the collusion narrative, if you will, was ultimately so damaging
[431] to examining Trump and all of Trump's misdeeds because every time someone would go and start digging up stuff about Trump and the really bad stuff he was involved with or his administration was involved with, he would bring up the Russia card and say, oh, well, they were selling that phony Russia thing and he's still doing it, right?
[432] To this day, he's still doing it.
[433] And many of his supporters still believe it.
[434] And so I thought there's a public service to be done in telling the story.
[436] Well, look, it erred clearly, especially if these people were successful in getting journalists to bite on this, to report on this.
[437] That's a big misstep for journalism.
[438] But the way to save it is to do what Bayer did and pull Tylenol.
[439] It's to do what science does: shit, this number is no longer adding up the way we said.
[441] We must amend it.
[442] So this type of truth-telling and taking responsibility is so vital so that we don't lose that.
[443] Look, we all believe bad things about Donald Trump.
[444] It's not hard to believe that.
[445] And there's plenty of evidence for it, right?
[446] Mm -hmm.
[447] But believing that and then saying, well, because I believe that, I believe that he also colluded with Moscow.
[448] That's a pretty big jump.
[449] Now, if there's evidence of that, fine.
[450] That's great.
[451] Bring it on.
[452] But if it isn't, then we have a problem.
[453] And in some ways, the problem began when BuzzFeed, you know, this news organization posted the dossier in early 2017, and then it just exploded.
[454] It was flubber, right?
[455] It just had its own momentum, yeah.
[456] People you and I watch on MSNBC and CNN and everywhere else, and in various publications, just embraced this and were chasing this.
[457] And it was an all -consuming media obsession for three years.
[458] And what I tried to do in the book is kind of follow that trail and bring readers into the rooms where people are being misled about this stuff or they're interacting in ways or they're misreporting things.
[459] I mean, there's some reporting that looks like it was totally made up out of whole cloth.
[460] And it may be because the people who reported it believed what their story.
[461] sources were telling them.
[462] It may be because they made this shit up themselves.
[463] But in either case, the people who did this should, one, be held accountable for it and, two, sort of acknowledge what they did.
[464] And I'd like to think that my book, in some ways, is part of the accountability process.
[465] But I don't know if there's ever going to be an acknowledgments chapter written by news organizations saying, you know what, we fucked up, we shouldn't have done this, and we're going to change our practices.
[466] So when we give the public information that has, in some form, come from a hired spy, we're going to let them know about it.
[467] So they're going to be able to contextualize this information and use it in whatever way they want.
[468] I mean, no one in their right mind woke up one morning with the idea that Donald Trump was a Russian puppet and he was colluding with Moscow and all these things were going on.
[469] They were all coming from some place.
[470] And people who describe themselves as quote, unquote, investigative reporters, they need to sort of think about how they present information.
[471] They need to be honest with readers and viewers.
[472] And they need to make it clear that some of this information is coming from people who are being paid to plant the information.
[474] I think part of the problem is there can be a political bent to these different news organizations, too.
[475] And so if there's a, quote, right -wing organization, they are likely, maybe this is not fair, but they're probably not going to take accountability for that level of fake news that they're putting out.
[476] So I'm sure there's this counter effect on the left in those news organizations to say, well, if they're not going to take accountability and we are going to, it's just going to make us look less effective.
[478] I mean, I'm not saying they shouldn't, but I think this is part of the whole issue is, well, they're not doing it.
[479] So should we do it?
[480] I totally understand what you're saying, because there is a lot of disinformation, misinformation coming out of right-wing media.
[481] My view was there's nothing I can do about that.
[482] That's that.
[483] I think of myself kind of as a, you know... I was at a major mainstream publication.
[485] New York Times, I mean, many people think it's a left -wing publication, but God knows it's about as mainstream as they come.
[486] So mainline news organizations need to hold themselves to a higher standard because they don't want to become hyperpartisan.
[487] They don't want to fall in the trap that you're describing, which is, hey, these other guys do it, so it's cool for me. Can I ask your opinion on... it seems like what has changed is there are so many more players that news organizations like the New York Times are now, in essence, whether you want to call it that or not, competing with for eyeballs.
[488] So as you mentioned, like BuzzFeed puts it out there.
[489] Now, New York Times can't pretend that's not in the world.
[490] It somehow needs to be addressed.
[491] Also, you have now, because there's so many more outlets, there's so many more layers of what we would call news.
[492] Is it entertainment?
[493] Is it this?
[494] There's all these strata.
[495] And so because there's so many more players, perhaps the ethics are as broad as the field itself has gotten.
[496] Do you think that's what's driven this?
[497] In fact, I think it's a very strong part of it, and I write about that in the book.
[498] You have all these new news organizations that didn't exist 15, 20 years ago.
[499] News organizations have become hyper-partisan, you know, so you have, like, Fox over here, and now the even crazier OAN to the right of that.
[500] You've got MSNBC on the other side. Things that never would have made the news are now making news because there's more of, like, a demand for, like, oh, we can get a hit on this or we can get clicks on this.
[502] So the way that I described it in the book is that when you take those factors and you throw in Donald Trump, two former Wall Street Journal reporters, and a former British spy, all the elements were in place for a media clusterfuck of epic proportions.
[503] Yes. I mean, this all was the substrate for this whole rocket ship to be launched from. Yeah, man. And then I just want to add, too, it kind of goes back to your investigation in Painkiller, which is: we always would recognize that if the government goes to John DeLorean and says, hey, do you want to make two million dollars by taking this cocaine over here, that's entrapment. So he's not going to serve any time for that, because we acknowledge the government shouldn't entrap people. And when I watch The Crime of the Century, you know, you're taking a gal that worked at a strip club who's putting herself through college.
[504] You bring her into this company to sell pharmaceuticals.
[505] And you're incentivizing and entrapping and encouraging her to break the law.
[506] And yet that poor woman ends up going to prison, not anyone that incentivized her.
[507] And so it's this endless shit flowing downstream in deniability.
[508] and it's kind of repugnant because the ultimate victims of it are always the lowest people on the totem pole.
[509] We don't seem to ever go after the people who have incentivized the behavior.
[510] There are two systems of justice.
[511] There's the justice for most of us and there's the justice system for the wealthy.
[512] And not only was that true in the case of Painkiller and similar other corporate crimes, but in my new book it's also relevant because if you're a business person and you think somebody is screwing with you for one reason or another, you hire a bunch of private spies to gather up evidence, and you've got the lawyers and the information to, like, walk that stuff over to prosecutors.
[513] So the nature of the two tiers of justice don't affect who gets crapped out at the bottom.
[514] It's who gets fed in at the top.
[515] Yeah.
[516] And then the only time that ever happens in, like, some egalitarian way is when, like, internet sleuths compile enough evidence against NXIVM.
[517] But the odds of them kind of fueling and funding a full investigation to put on the plate of someone is so rare for that to happen.
[518] You bring up NXIVM and I wonder why.
[519] Because as I see it, that was an investigation that no one wanted to take on and it was largely led by defectors and parents.
[520] and through their endless pursuit of media attention finally got someone to act, right?
[521] So they had to make the whole thing happen.
[522] They had to go out and get the public will for it.
[523] They had to provide all the evidence.
[524] They had to record phone calls.
[525] And that was done without any funding. But I just would say the ratio is probably one to a million of it happening in that fashion.
[526] Tell me why NXIVM interests you.
[527] Well, I broke the NXIVM story.
[528] Get the fuck out of here.
[529] Oh, I'm so embarrassed.
[530] Have you seen The Vow?
[531] I have seen The Vow.
[532] You have not seen how fucking awesome I am in The Vow.
[533] Oh, shit.
[534] Come on.
[535] Man. Boy, I got egg on my face.
[536] I got... I should have been like, I deserved an Emmy for what I do in The Vow.
[537] I'm rewatching it now.
[538] Now that I'm a fan.
[539] Episode five.
[540] I'm going back then.
[541] I'm going back.
[542] Okay.
[543] Yeah, great.
[544] Yeah.
[545] Boy, that was, will we call that serendipitous?
[546] Yeah.
[547] I thought, oh, you're, like, trying to set me up to say something about The Vow.
[548] And then I started thinking, maybe he doesn't know about me and The Vow.
[549] It was the latter.
[550] Yeah.
[551] No, but I go into a big spiel about how NXIVM was, like, being promoted as a self-help organization.
[552] That's how it was luring people in.
[553] So I am being interviewed by the filmmakers.
[554] And I say to them something to the effect of, look, I get self-help, but I'm so fucking helpless, no self-help program could help me. And so I was totally immune to this shit.
[555] Well, I'll say another thing about you.
[556] Let me try to wow you with my brilliance.
[557] I think I figured out exactly what Raniere did, what his magic thing was.
[558] Because he did not go after rejects of society.
[559] He recruited and groomed very intelligent people.
[560] And I think his key breakthrough was they were intelligent people that also very much wanted validation for being intelligent.
[561] So when they asked him something, when they wanted him to be prophetic, they'd say, well, what is this?
[562] And he would go, well, what do you think it is?
[563] He never answered a fucking question.
[564] He'd ask, what do you think it is?
[565] And they would whip up a theory that they believed in that they thought he would believe in.
[566] And so he forced them to come up with all the answers that they were looking for because they wanted to impress him.
[567] And they would end, and he would smile, and he'd go, yes.
[568] And maybe he'd make an adjustment, but he never actually gave you scripture.
[569] He forced you to give yourself scripture.
[570] And I was like, this is brilliant.
[571] He's preying on people who want him to think they're intelligent.
[572] And you don't need Raniere's approval for being intelligent.
[573] That's why you're immune.
[574] I need approval all the time.
[575] But I got a pass where he was concerned.
[576] But to drag it back to private spies, you know, he, Raniere, used private spies a lot.
[577] There was a, like in the 90s, there was this private eye in New York who was kind of a forerunner of Black Cube because he was Israeli.
[578] He claimed to be a Mossad agent.
[579] And he was, like, investigating people that were questioning Raniere or trying to expose Raniere.
[580] And the Village Voice had the greatest piece on him, when the Village Voice still existed.
[581] And the headline was Super Agent Schmuck.
[582] It was basically an expose about the fact that this guy had no Mossad experience.
[583] And all this legend that he created about himself was bullshit.
[584] One of the things that really interested me about Christopher Steele, the former MI6 agent in my book Spooked, was his legend.
[585] Like he was supposed to be incredibly smart, incredibly knowledgeable about Russia.
[586] Many reporters and book writers described him as having broken open the FIFA soccer scandal and helped solve the mystery of the poisoning of this former KGB agent in London in the mid-2000s.
[587] And I'm going like, okay, well, that might be true, but let me see what I can find out.
[588] And it turns out he did very little when it came to breaking the FIFA investigation and all these people that were involved with this KGB agent who was murdered had never heard of Christopher Steele.
[589] So maybe he did have some involvement, but it was a lot less than the way it was being billed to journalists that then amplified those stories.
[590] It is crazy how ubiquitous the Mossad thing is, because even here in Hollywood, I have numerous friends that have either done gun training...
[591] This guy is the best.
[592] He was former Mossad.
[593] Someone's personal security.
[594] Oh, they have former Mossad.
[595] A personal trainer.
[596] Oh, this fitness coach, man. He knows.
[597] Former Mossad.
[598] I'm thinking, are there any Mossad in Israel?
[599] Are they all here coaching people on how to be?
[600] Right.
[601] Right.
[602] And what did they do for the Mossad?
[603] I mean, were they like?
[604] Analysts that read computers?
[605] Your receptionist or clerks or whatever the case may be.
[606] I mean, you turn on TV and it doesn't make a difference what channel it is.
[607] It could be CNN, Fox, MSNBC.
[609] And there are all these former FBI people, CIA people, military intelligence people.
[610] They're opining about all this stuff, right?
[611] Oh, well, Jim's the former CIA agent.
[612] What do you think about this, Jim?
[613] And he goes, well, blah, blah, blah.
[614] And I'm thinking to myself, like, who the fuck is this person?
[615] I mean, like, what do they know?
[616] What do they really do?
[617] I probably know as much about it as they do.
[618] Maybe they were some, like, mook bureaucrat their entire career.
[620] And now they've, like, dressed themselves up in a suit and they're pretending to know a hell of a lot more than they actually do.
[621] Well, I think humans, as animals, we like archetypes.
[623] We like stories.
[624] So it's like, oh, this is the expert.
[625] Here's the credential.
[626] I can trust this.
[627] And this is how these private spies sell themselves to customers too.
[628] Hey, I'm an ex-MI6 guy.
[629] Yeah, so even Black Cube.
[630] Like, if you go and you have the meeting with Black Cube and they wow you with all this stuff, they don't mention they're going to hire Igor for $30 a day.
[631] Like, they leave that part out because it's selling this story.
[632] That's something they've seen in a movie.
[633] Yes, it's a legend.
[634] But they must be really effective.
[635] Could you give me some examples of when they've been hugely successful?
[636] Obviously, the Trump dossier, that was hugely successful.
[637] It led to an impeachment virtually.
[638] So what's another case of when they've been really effective?
[639] So, I mean, Black Cube has had some effective cases.
[640] They were very effective in helping these two British businessmen fend off criminal charges against them.
[641] And they were then able to recoup millions of dollars in legal expenses from the British government.
[642] Private investigators, private spies, do do legitimate work.
[643] But increasingly, they're doing lots of really, really skeevy things.
[644] And there are many stories of that in the book.
[645] Yeah.
[646] Could you tell us how they're using digital tactics?
[647] Well, Black Cube is a wonderful example.
[648] So you've got a new job.
[649] Your acting career is over.
[650] I'm sorry to tell you that.
[651] That's not a surprise.
[652] Black Cube is willing to hire you because you have talent.
[653] But we're not going to send you out as Dax.
[654] We're going to create a new persona for you, a new legend for you.
[655] So when you go out and try to con somebody into giving you information, you're going to have a new name.
[656] and there's going to be your Facebook page with all your imaginary friends on it, your LinkedIn listing with all your imaginary jobs on it.
[657] If someone gets really curious about you and they click on your employer, it's going to bring them to a phony website where there's stock photographs of this company and wonderful comments about what a great employee you were.
[658] They used social media to create this digital mirage.
[659] So that's one aspect of it.
[660] And then there's also the other side of it, which is this insidious cyberspying, where there is a lot of hacking, computer hacking going on, or where spyware programs are planted into people's phones.
[661] So that way the investigators can follow them, they can get all their text messages, they can get all their emails.
[663] There was one great contract that I saw where the private spies were promising to provide the client with a listing of all the porno sites that their targets visited.
[664] And I thought, wow, people have to be really careful with that kind of stuff happening, right?
[665] But once they get into you, into your computer, that's one side of digital spying.
[666] The other side is when someone approaches you under what's known as a pretext, a made-up character, and digital social media technology has been used to, like, really flesh out that mirage.
[667] So it appears that you are a real person.
[668] Now, the text message part and the spyware software, that has to be illegal. At that point they've crossed some line. Yes, phone hacking is illegal, but it's usually done remotely. So let's say you're a private investigator and you think, you know what, I know someone who hacks phones. That person's not going to be living next door to you. They're not going to be living in the United States. They may be living in India. There was recently a very big case where hundreds of jobs were being farmed out from various customers, private investigators, to this hacker in India.
[669] So basically there's like a daisy chain or one of those Russian stacking dolls where, you know, you've got the company here, then you pull out, or then you've got the lawyer, then you pull it out, you've got the private investigator, then you pull out the next one, there's this one middleman, then you pull out the next one, there's another middleman, and eventually you get to the bottom, and you get the hacker.
[670] But there's so many layers going down that everyone's got like deniability.
[671] Yeah.
[672] Oh, wow.
[673] Am I misremembering this?
[674] I feel like I remember, and this is the moment I said, oh, well, by God, no one's safe.
[675] Didn't one of the British tabloids have access to the prime minister's phone like 12 years ago or something?
[676] Yeah, well, it was a big phone hacking scandal.
[677] I think that's what you're referring to, where the News of the World, Rupert Murdoch's newspaper, had phone hackers, and it was very commonplace in England.
[678] I mean, one of the characters in the book did a lot of pretext work.
[679] I'm sure he would deny that he did phone hacking, but maybe he did.
[680] But it was very commonplace.
[681] And so there's this huge scandal where this hacking was caught, and there was like hearings.
[682] I don't remember, if you remember, like, when Rupert Murdoch and one of his sons went to testify, someone threw a pie at them.
[683] Oh, yeah, yeah, yeah, that Wendi, you know, Wendi Deng, Rupert Murdoch's then-wife, interceded, threw herself between her man and the...
[684] But what the book does is kind of, as I'm laying out these stories, these characters, you're learning, hopefully, about the history of this type of cyber spying and how it started and how it's now being used.
[685] Oh, wow.
[686] It doesn't even feel real.
[687] It feels like all of this couldn't happen in the United States.
[688] That's something that happens in another country.
[689] In Russia.
[690] Well, it doesn't feel like it would happen in Britain to me either.
[691] So I guess my last question for you, this has been so fun, by the way.
[692] Thank you so much.
[693] I really hope people buy and read Spooked: The Trump Dossier, Black Cube, and the Rise of Private Spies.
[694] And I know this is an anti -journalistic question to even ask.
[695] Because I think it was Bob Woodward.
[696] We talked to Bob Woodward, and he's only levied a verdict one time in his entire life.
[697] And it was his last book.
[698] He doesn't levy verdicts.
[699] But your overall feeling, what percentage of them are good?
[700] As you said, there are good reasons to investigate people.
[701] And if your local government's not doing it and you're funded and you could help break open something, then that seems like a noble pursuit.
[702] What percentage of the work they're doing would you say is good and what is evil?
[703] Well, I can't give you a percentage breakdown.
[704] Let me give you an anecdote.
[705] As you mentioned, as I discussed, there are many private investigative firms.
[706] that are honorable and they do ethical, legal work.
[707] But increasingly, what some of these investigators said to me is that more and more, they're going into law firms to pitch their services to them.
[708] Like, if you have cases, we're great, here's what we've done, blah, blah, blah.
[709] Invariably, what they're getting back is like, well, can you guys do what Black Cube did?
[710] Do you guys do that?
[711] And that's what they want.
[712] That's what is wanted more and more, because at the end of the day people want to win, and they're willing to do anything to win so long as they don't get caught doing it. Monica, have you seen the film Michael Clayton? It's a great film, isn't it? The greatest. It's one of my favorite movies of all time. And then in my mind I'm like, you know, what percentage, how much have they deviated for creative license? Like, they might not shoot your ankle up with some kind of agent.
[713] But short of that, man, it seems like maybe everything else is on the buffet of options.
[714] Yeah.
[715] Oh, my goodness.
[716] This was so enlightening.
[717] Thank you so much for your time, Barry.
[718] And how hilarious I bring up NXIVM.
[719] I know.
[720] It was a really nice moment.
[721] It was real.
[722] Something real happened today.
[723] Everyone should check out Spooked: The Trump Dossier, Black Cube, and the Rise of Private Spies.
[724] Just sounds like one exciting story after another of just the depths of all this.
[725] Thanks so much.
[726] You made me happy.
[727] You made my daughter Lily happy.
[728] And it was great to share some time with you.
[729] Well, let me leave you with this.
[730] Lily, your dad is so cool.
[731] Oh my God.
[732] I want to be your dad.
[733] And you should think he's the coolest as well.
[734] Be well.
[735] Great luck with the book.
[736] It's been so fun talking to you.
[737] I look forward to your next investigation.
[738] Take care.
[739] All right.
[740] Cheers.
[741] Stay tuned for more armchair expert if you dare.
[742] My favorite part of the show, the fact check with my soulmate Monica Padman.
[743] Good morning.
[744] Good morning.
[745] I tried to act like we didn't just do a fact check, but we did.
[746] Don't say that.
[747] Really?
[748] No. They don't like that.
[749] No, I would hate that.
[750] You would?
[751] I think they like honesty.
[752] Yeah, they do.
[753] Yeah.
[754] It's tricky.
[755] You got to be honest and give them what they want.
[756] This is not always the same.
[757] But I'm still wearing that white t -shirt.
[758] I know, and I'm still thinking about the boy who gave you his number.
[759] Oh, wow.
[760] Because I was wondering, like, I guess if he's a fan of the show, he does know your age.
[761] But you also read young.
[762] I could imagine being 27 and not thinking you're older than me. I agree.
[763] I don't know how often we talk about age on here.
[764] I know.
[765] I mean, I'm doing fast math a lot.
[766] I'm saying 13 years younger than me. It comes up every time a guest is around your age.
[767] That's true.
[768] Oh, that is true.
[769] That is true.
[770] But you know what we talk about all the time is when there's a guest who's younger than me. it's like a big shock that they're younger.
[771] So maybe it is coming across like I'm much younger than I am, which I'm not.
[772] And you look young.
[773] I do.
[774] Yeah.
[775] I do.
[776] But I'm on the eve of 34 years.
[777] Yeah.
[778] It's exciting.
[779] 34 is like not a real age.
[780] Either was 33.
[781] Yeah, I'd agree.
[782] 32 for some reason stands out of my mind.
[783] It was a big year.
[784] Okay.
[785] A very good year.
[786] Really 30 to 35 are all nothing years, kind of, I think, and then 35 feels different.
[788] I learned something about silly Valentine, by the way.
[789] Oh, my God.
[790] I don't think he wrote that song.
[791] I think that's what I found out.
[792] Funny Valentine.
[793] Silly stupid Valentine.
[794] I think he re -recorded it.
[795] Oh.
[796] Yeah.
[797] I think that's what I figured out in Montana.
[798] I think the original was on the radio.
[799] Oh, or did that person redo it?
[800] I will never know.
[801] Okay.
[802] This was a great episode.
[803] Barry Meyer, opioids.
[804] Ooh.
[805] Hot topic.
[806] Wait, opioids? Or that was his first book, and now his new book's not about opioids?
[807] This is the private eyes?
[808] Yeah.
[809] Okay, okay, okay.
[810] Private eyes and opioids.
[811] And opioids.
[812] And NXIVM, which you just fell backwards into.
[813] That was crazy.
[814] Crazy.
[815] Oh, my God.
[816] In a weird way, if I'm him, I'd prefer it came out that way.
[817] Because it'd be one thing if I had, you know, I knew that about him and then I was going to bring that up about him.
[818] But just like I was off talking about the power of journalism, not realizing it was him.
[819] That's better.
[820] Well, kind of, but he had to then tell you it was him.
[821] Yeah.
[822] Maybe he felt like that was going to be a brag.
[823] But I got to be honest, if someone was in here and they were just going on about how great Hit and Run is, not knowing I had anything to do with it.
[824] That's pretty much the ultimate.
[825] That's true.
[826] Compliment.
[827] You know there's no pandering happening.
[828] Yeah.
[829] That's true.
[830] And people with that facial recognition ailment.
[831] I'm always nervous now saying disorder.
[832] Oh.
[833] That gift of facial amnesia.
[834] If you have that, you could probably love Hit and Run, sit down with me, and have no fucking clue.
[835] Because you wouldn't recognize me as the guy that was in that movie.
[836] Okay.
[837] But wouldn't they still see you?
[838] Oh, they can't remember what they saw, right?
[839] No, yeah, because I have a friend that has it, and I ask him, what does Brad Pitt look like?
[840] And he's like, I can tell you what his hair looks like.
[841] I cannot tell you what his face looks like.
[842] But don't they see the face kind of upside down?
[843] No, you're conflating.
[844] Now, we saw a 60 Minutes that explored this condition, and what they demonstrated was that our recognition of people's faces is very precarious.
[845] It doesn't take much at all for us to not be able to recognize faces.
[846] And all they did is they took eight or ten faces, turned them upside down, put them on the screen.
[847] And there's the 10 most famous people in the world, and we got like two of them.
[848] Right.
[849] Okay.
[850] They're just showing that it's a very...
[851] Oh, yeah.
[852] All one needs to do is flip the thing upside down.
[853] You cannot see that that's Tom Cruise.
[854] I'm going to type in upside down faces.
[855] All right.
[856] See what we get.
[857] Upside down face.
[858] That's you.
[859] That's your face.
[860] You're looking at.
[861] Oh, my God.
[862] Okay.
[863] So they just can't...
[864] They see it as a big...
[865] blur.
[866] Well, when they're looking at it, they see it.
[867] Okay, that's what I'm not understanding.
[868] But the points that your brain memorizes, like the little data points on the face, they're precarious.
[869] So if you just tilt the face upside down, you lose those data points and you don't know what face you're looking at.
[870] You can see the face.
[871] Right.
[872] Okay.
[873] But you do not recognize it as Tom Cruise or Oprah, which is wild.
[874] Why don't stuff?
[875] How can you go through life, though?
[876] I don't understand.
[877] I understand, like, every time you meet someone, it's like you haven't met them before.
[878] Well, like, in my friend's case, like, he knows I'm coming to pick him up.
[879] So when a human arrives, he knows it's me. Oh my, they can be taken advantage of so easily.
[880] Well, that's true.
[881] Oh, no. This is upsetting.
[882] I didn't really realize it was that intense.
[883] Well, the other thing I would wonder, boy, there's two outcomes of it that I want to explore.
[884] One part of me thinks how exciting to see everyone's face for the first time every time.
[885] That seems exciting.
[886] But then secondly, when you love someone and they just continue to get more and more attractive, what does it do to that phenomenon?
[887] But, okay, so when they wake up every morning next to their wife, they forget what their wife has looked like.
[888] Right, but they turn over and they look and they know they're next to their wife.
[889] So they're just looking at her face and they're not thinking much of it.
[890] But if they had to pick their wife's face out of a lineup, they'd be in trouble.
[891] No, that's bad.
[892] No, no, no, no. Yeah.
[893] That's how it is.
[894] This is like 50 First Dates, the saddest movie that's ever been made.
[895] Yeah.
[896] Where she keeps forgetting him and he has to make her fall in love with him every day.
[897] That might be the perfect relationship for me. They'd have to earn it every single time.
[898] Oh, yeah.
[899] You would like that.
[900] Yeah.
[901] Every day you get these people.
[902] Okay, if you guys want to do this, just type in 60 Minutes, face blindness.
[903] Oh, face blindness.
[904] And then, um, yeah, I mean, you can't do it.
[905] Right.
[906] Isn't that wild?
[907] It's crazy.
[908] Who are the people in that photo?
[909] John Travolta.
[910] Oh, wow.
[911] We all know.
[912] Sandra Bullock, Jennifer Aniston, Denzel.
[913] Jen Aniston. Imagine not getting Jen Aniston.
[914] You can experience it, right?
[915] Like, so when you're looking it upside down, you can, you can, experience exactly what they're experiencing like you see the face right you can talk to the person you see all the elements that make up their face but it's just not queuing you to remember that that's jennifer aniston huh but i guess if i was looking at this and talking to this and then tomorrow i saw that again i would know like it would still look i don't know um anywho that was scary why did we talk about that Because we were talking about opioids.
[916] We were talking about private investigators.
[917] Oh, sitting with me having loved Hit and Run.
[918] There we go.
[919] We found our way back.
[920] Wow.
[921] I felt stoned for a second.
[922] I was like, why can't we remember?
[923] No. Okay, because that's different.
[924] I'm razor sharp, yeah, on an opioid.
[925] I don't think so.
[926] You think you're razor sharp on them?
[927] I don't think I have much diminished mental capacity.
[928] Whereas stoned, when I've been stoned, I'm confused.
[929] I'm like, what were we talking about?
[930] Where are we going?
[931] You know, you can't keep anything straight for more than like 10 minutes.
[932] Yeah, that's...
[933] That's not a desirable state for me. Me either.
[934] No. I want to be razor sharp.
[935] I'm all gacked up on opioids.
[936] No. I'm teasing.
[937] You don't like those jokes.
[938] I understand.
[939] Yeah, too soon.
[940] It just makes me scared.
[941] Okay.
[942] What's the name of the Ronan Faro book that you mentioned?
[943] His book is Catch and Kill?
[944] How many people have died from opioid addiction in the last year?
[945] Opioids were involved in 49,860 overdose deaths in 2019, which was 70.6% of all drug overdose deaths.
[946] The 500,000 number is over the course of like the last eight years or something. We were using hundreds of thousands at points, but that was when viewed in the whole story of Purdue.
[947] Oh, got it.
[948] Yeah.
[949] Oh, but a friend of mine, yeah.
[950] I just learned last night that a friend of mine OD'd on what he thought was Xanax that had fentanyl in it. He lived, but had two Narcans administered by the police. Oh, my God. You know, you'd be right to expect that an opioid would be laced with it, a fake OxyContin. If we haven't covered this a trillion times: there's no source of real OxyContin anymore, because they've shut off that big waterfall. So there are all these mills in Mexico that make pills that look just like Oxy, but really the active ingredient is fentanyl. That makes sense. But to get a Xanax, that's a different drug, that's a benzo, and they're putting fentanyl in that. So basically you can't take pills anymore, because there's fentanyl in everything. Like, even if you want to experiment with drugs and get high, I don't have a position on that. If you're not an addict and you want to get high and you can do it responsibly, great. But you can't take pills responsibly anymore, because you don't know what's in them. That's a bummer for druggies, you know?
[951] Yeah.
[952] Oh, that's really scary.
[953] It is.
[954] That's what's leading to all these ODs.
[955] People don't know they're taking fentanyl.
[956] And heroin is laced with fentanyl now.
[957] I mean, look, these are highly, insanely addictive drugs.
[958] I guess there's a part of me that's, like, glad there's a bigger deterrent for some of them, like heroin.
[960] I would be, if it translated into risk-reward and people not doing it, but that's not how it works.
[961] No. I think, like, you can relate to drinking alcohol a lot.
[962] And if you were alive in Prohibition, you would find yourself drinking lots of alcohol that you didn't know where it came from.
[963] I just know you would.
[964] Like, you would still be social and you and your friends would still drink.
[965] And you'd be getting alcohol that was made in some people's bathtubs.
[966] And you would just, you'd accept the risk of it.
[968] Yeah, that is true.
[969] But I, and I love alcohol.
[970] Yeah.
[971] I can go, I'll say it.
[972] That's great.
[973] I love it.
[974] But if fentanyl was ending up in drinks, uh-huh.
[975] And in wine bottles, and I wouldn't know.
[976] I really do think for me, I'd be like, I can't.
[977] Yeah.
[978] Yeah.
[979] Which, I mean, but if I was an addict, maybe I wouldn't be able to resist it.
[980] Well, you're already endeavoring on something that you know has some lethal outcome.
[982] Like, if you shoot heroin, you know people OD from heroin, but you tell yourself, well, I won't be that greedy.
[983] I won't shoot that much.
[984] So that's the first thing you tell yourself.
[985] And then the second thing is like, yeah, fentanyl's out there.
[986] But what I'll do is I'll shoot a tiny bit, see how that lands.
[987] And if it's fine, I'll know my batch is good.
[988] And then I'll do my normal.
[989] Like, there's all these ways I'm sure people talk themselves into ignoring the risk.
[990] I'm going to add one thing, though.
[991] You will not like this.
[992] Population control?
[993] No. It's easy to look at them as crazy.
[994] I don't.
[995] Or just engage in very risky behavior.
[996] And it's easy to see it that way.
[997] People don't really want to recognize that some 65,000 people will die of a heart attack this year, induced by a terrible diet.
[998] Yeah.
[999] And so if you know what it's like to pull up at a fast food restaurant when you are unhealthy from eating, and make the decision to get it anyway, you actually know what it feels like to make that decision.
[1000] Oh, yeah.
[1001] You know, I think a lot of us are doing things that result in our untimely demise, and we've labeled some as crazy and some as totally acceptable, and the outcome's the same.
[1002] Far more people are dying from terrible eating habits.
[1003] I don't think we've labeled that as crazy so much as maybe pathological. I mean, they are illegal, so that's its own category, but that's the part that I think allows people to think of those people as a separate group who make decisions that are so reckless. Right, right. And I'm only pointing out that just as many people are dying from another decision everyone makes, and no one's really vilifying them. Definitely, I agree. Yeah. All of this is sad and hard. I mean, it's just hard to be a person and make decisions that are healthy for you. Like, it's hard.
[1004] Yeah.
[1005] And the only thing I'm urging is when people are inclined to go, like, I can't relate to that.
[1006] Like, who would take pills at this point knowing fentanyl?
[1007] Well, who would go eat this food that you know you're going to have coronary disease from?
[1008] Who would go and do this stuff that's going to give you cancer?
[1009] Most people have something that if they dig deep, like they're putting themselves at risk doing.
[1010] Right.
[1011] Oof.
[1012] Oof.
[1013] Was that all the facts?
[1014] Yeah.
[1015] Yeah.
[1016] Well.
[1017] All right.
[1018] This was sad.
[1019] This was sad.
[1020] I hope everyone's okay.
[1021] Love you.
[1022] Don't eat pills.
[1023] Don't.
[1024] Please.
[1025] Just don't.
[1026] Just so.
[1027] Yeah.
[1028] Yeah.
[1029] And exercise, you know.
[1030] And go to therapy.
[1031] Yeah.
[1032] All right.
[1033] Love you.
[1034] Love you.
[1035] Follow Armchair Expert on the Wondery app, Amazon Music, or wherever you get your podcasts.
[1036] You can listen to every episode of Armchair Expert early
[1037] and ad-free right now by joining Wondery Plus in the Wondery app or on Apple Podcasts.
[1038] Before you go, tell us about yourself by completing a short survey at Wondery.com slash survey.