Armchair Expert with Dax Shepard
[0] Welcome, welcome, welcome to armchair expert, experts on expert.
[1] I'm joined by resident expert, Expert Mouse.
[2] Hello, Expert Shepard.
[3] How are you?
[4] I'm great.
[5] It's a sunny day in Los Angeles.
[6] No SAD for you today.
[7] No, it's happy.
[8] It's a happy day.
[9] We have a really fantastic expert on today.
[10] Susan Liautaud, or if you're in France, Susan Liautaud.
[11] I think you did that right.
[12] You think so?
[13] Yeah.
[14] It's hard for me. It's hard.
[15] There's a couple hundred vowels
[16] in her name.
[17] Let's count them.
[18] One, two, three, four, five, six, seven vowels out of 13 letters total.
[19] Wow, that's impressive.
[20] That's a heavy proportion of vowels.
[21] Now, who is Susan?
[22] She is the founder of Liautaud & Associates Limited, a consultancy in ethics matters internationally.
[23] Susan also teaches ethics courses at Stanford University and founded a nonprofit, independent cross-sector laboratory and collaborative platform for innovative ethics called The Ethics Incubator.
[24] She has a new book out now called The Power of Ethics, How to Make Good Choices in a Complicated World.
[25] I think you'll find her refreshing.
[26] Like I think when you think of an ethicist, you think of someone telling you like, that's wrong, that's wrong, that's immoral.
[27] She's the opposite.
[28] That's right.
[29] And we get to reference Chidi from The Good Place.
[30] We sure do.
[31] The only other ethicists we know.
[32] So buckle up for Susan Liautaud.
[33] Wondery Plus subscribers can listen to Armchair Expert early and ad-free right now.
[34] Join Wondery Plus in the Wondery app or on Apple Podcasts.
[35] Or you can listen for free wherever you get your podcasts.
[36] He's an armchair expert.
[37] So sorry, Susan.
[38] We're brand new to this.
[39] We've never done this before.
[40] Oh, well, first of all, it's really nice to meet both of you.
[41] But Rob has had to deal with me on the sound check. I am the most technologically inept person you will ever speak to.
[42] Well, I would implore you to meet my mother.
[43] You might get an ego boost.
[44] But he was awesome.
[45] So thank you for that.
[46] Do you think there's an added layer of insult to injury when you're a Stanford professor?
[47] Well, I think there are two added layers of insult to injury for me. One is that I do ethics of tech.
[48] Oh, wow.
[49] Yeah.
[50] And the other is that I have five children.
[51] who decided one year that my New Year's resolution was going to be that I was going to become technologically independent.
[52] And 12 months later, they retracted it and said that they needed to give me New Year's resolutions that had some hope of actually ever coming true, that were actually grounded in reality.
[53] It sounds like you succumbed to the common problem of aiming way too high on a resolution.
[54] Absolutely.
[55] Absolutely.
[56] Baby steps.
[57] Gave them up after that.
[58] Yeah.
[59] Let's just start with the
[60] idea that you have five children.
[61] As someone who owns two of them, I mean, I just really, I can't hop into your mindset.
[62] Can you tell me how you ended up with five children?
[63] Well, first of all, they're adults now.
[64] The youngest just turned 18.
[65] And so it's kind of crazy.
[66] You made it.
[67] Yeah, that's kind of a, it's kind of crazy.
[68] But no, the second and third were twins.
[69] Okay.
[70] And then it was sort of like it was always one against two.
[71] So I really wanted to have a fourth.
[72] I feel so grateful and I just really love kids.
[73] Yeah.
[74] And, you know, survived, kind of.
[75] I mean, they're, they are adults now.
[76] Well, can you help us understand?
[77] I know there's a fluctuating ratio.
[78] So one is one kid and then two is like, feels like four all of a sudden.
[79] You think it's just going to double, but like quadruples.
[80] Then I've heard three, it kind of diminishes because it's already a shit show.
[81] What happens at five?
[82] Well, let's just say there's a general
[83] lowering of standards.
[84] Right, right, right.
[85] What qualifies as matching socks or a nutritional meal is, you know, I developed a very open mind.
[86] Yeah, I always noticed that, like, the fourth and fifth kid have the helmet on because their heads are so flat, because they're not getting picked out of the crib nearly as much.
[87] The other thing that happens, and this is actually really true, is that the fifth child, actually, they sort of learn to fend for themselves.
[88] So my youngest son cooks, he's a really great cook.
[89] I think he sort of figured that he would just get forgotten if he didn't, you know, learn some skills.
[90] Yeah, it really makes you question the whole approach because, yeah, generally that fifth one is like the most competent human being on planet Earth because they got no help.
[91] Completely.
[92] Yeah.
[93] So first most interesting thing about you, when I read about you, is that you've lived just everywhere for long periods of time.
[94] And I think for a lot of us, that seems scary or, you know, adventurous, but perhaps lonely.
[95] I wonder what it was that made you curious to live so many places.
[96] So Paris, it was about the fact that my husband's French and he was there.
[97] So after law school, it was I was going to move to Paris or I wasn't going to get married.
[98] So that pretty much settled it.
[99] And it was a little scary because I started, I was working at a law firm and I really needed that job.
[100] and there wasn't, you know, at the time, you didn't, as an American lawyer, you didn't just go get another job at an American law firm in Paris.
[101] You had to be sent from New York.
[102] But, you know, it was great.
[103] And then we moved back to California for a while because of his job.
[104] So I'm from the Bay Area.
[105] And then we moved to London.
[106] And, you know, honestly, I thought London was going to be a lot like the U.S. You know, people speak English, Anglo-Saxon, et cetera.
[107] Yeah.
[108] Way more different from the U.S. even than Paris, other than language.
[109] In what way?
[110] Just sort of attitude towards how we interact with each other.
[111] Like, for example, you know, we had neighbors and just nobody talked to each other, you know, and even sort of food and just sort of social norms, just way less relaxed.
[112] I found that, you know, at least in France, and maybe that's, you know, in fairness, maybe that's because I was kind of coming into my husband's group of friends.
[113] Yeah.
[114] But people are, you know, really social, really friendly, really open-minded about
[115] a lot of things.
[116] London's phenomenal, but I guess it was also that I expected it to be more like the U.S. Yeah.
[117] This has been an ongoing debate on this show for three years now, because I learned in an anthro class that there were all these war babies, right?
[118] Right.
[119] Do you know about the phenomena of the World War II war babies?
[120] No. Okay, an inordinate amount of female English citizens became pregnant by American GIs.
[121] and disproportionate to the amount of American female service women that were there, they did not become impregnated at all by the Englishmen, or very low percentage.
[122] So an anthropologist studied this.
[123] I cannot find this article I read in college, hence the debate.
[124] It's an elusive article.
[125] But what the conclusion of that was is that culturally in England, men have the brake pedal, and women are the pursuers.
[126] and here, of course, it's reversed.
[127] So you had two people with a gas pedal and no brake pedal.
[128] And then the English man and the American female had no shot of getting past first base because they both had the brake pedal.
[129] We've read other explanations since then, because in England, kissing is like step 18 out of 20 before sex, whereas in America kissing's like number two.
[130] So the women felt like they were much further along on the progression towards sex.
[131] So that's a compelling thing.
[132] That is so interesting.
[133] Yeah, it is fascinating.
[134] Because I agree.
[135] I think, oh, yeah, we all like the same music.
[136] It's all interchangeable.
[137] We're virtually the same.
[138] Speak the same language, yeah.
[139] Mm -hmm.
[140] So when you were a lawyer, when you went into it, I'm curious what mindset you had because I find myself regularly debating with Monica and my wife when we watch any kind of legal documentary.
[141] Well, generally, if they don't like the defendant, then they're very suspicious of the defense's tactics, say like the O.J. defense team walking them through the house and having put up certain photos on the wall and whatnot.
[142] And you're already bristling.
[143] You don't like my depiction of this.
[144] I don't think that's unethical, in my opinion.
[145] I don't think that has to do with whether I like O.J. or not.
[146] I think if I liked him, I still would think, ew, that's...
[147] But see, I thought there was stuff in the Making a Murderer trial, and we love those
[148] defense attorneys, where I thought they did stuff that appeared to me to be the same.
[149] I guess my point is I'm regularly arguing from the point of view of there are no ethics when you're the defense attorney.
[150] Your only ethical directive is to give the best possible defense ever for your client.
[151] That's your ethic.
[152] And of course, that then bumps up against many other ethics that lawyer may hold or not hold.
[153] And I wonder what your opinion on that is and how you felt about it when you were involved with law?
[154] So I was never a criminal lawyer.
[155] I was doing corporate law, but I have to admit to being a bit of an addict to documentaries like that, or even, you know, David E. Kelley TV shows. Like back in the day, there was a show called Boston Legal.
[156] I was a total addict or suits or whatever.
[157] Yeah.
[158] So I'll confess to, you know, I'll share an opinion, which is I agree with you.
[159] The idea that you can do whatever is legal that is in the interest of your client, is a problem for me because one of the reasons I got into this whole thing and got so fascinated by it is that the law falls short so often.
[160] And, you know, sometimes it falls short, like I have this story in the book about an abused woman, and sometimes it falls short because it doesn't protect someone until after they're hurt.
[161] And sometimes it falls short, like fast forward to technology.
[162] It's just lagging so far behind the technology that it's not protecting anybody from the ills of social media or from what we don't know about AI or whatever.
[163] So it's sort of the same thing there.
[164] The law just isn't enough because you can do an awful lot when you're a defense attorney that, you know, as Monica is saying, is not necessarily what anyone would consider ethical.
[165] Oh, yeah.
[166] And prosecution, too.
[167] I'm not just putting it on the defense.
[168] True.
[169] Yeah, generally, I'm more, I'm more angry when the prosecution does it because they have the full weight of the government behind them.
[170] And they're also working in concert with the law enforcement agents.
[171] So it's already so stacked against the defendant that I generally speaking am far more critical of the prosecution side than I am the defense.
[172] I am too just because we have this innocent until proven guilty and the consequences of somebody going to jail who's innocent.
[173] I mean, I'm sure you've seen these things, in particular the horribly disproportionate number of times that happens to black Americans.
[174] And it just literally makes my stomach turn.
[175] Yes.
[176] Or the people who are put in an interrogation room for 11 hours, and they are not equipped to endure that. You know, as many humans are just not. Nobody's equipped to endure that. Yeah. I have fantasies that it's me and I keep telling them to fuck off, and go ahead, bring it longer, let's go to 10 a.m. So... but you're probably right, it's at some point. I would like superhuman stamina. So, not a problem for you. Well, no, it's more that I live to defy authority, I think. So I think I would be driven, yeah, I would be so driven to... But we keep giving them to, you know.
[177] There's a lot of different ways to skin that cat.
[178] Right.
[179] But I have a great personal example.
[180] So we led this kind of campaign to stop paparazzi from taking photos of children.
[181] The children of famous people, obviously selfishly motivated because I didn't want my children to be photographed.
[182] And we debated some paparazzos on television.
[183] And we kept getting into this circular thing, right, which was, well, the First Amendment protects... And I was always like, yes, I fully recognize that the First Amendment protects your right to do that.
[184] I'm not making a rights plea.
[185] I'm making an ethical plea.
[186] Do you think it's okay for a four -year -old to be crying as men jump out of a bush?
[187] Like, that's what I'm here to debate, the ethics of it, and the ethics of the consumer.
[188] So what do they say to that?
[189] Because honestly, so I saw that.
[190] I thought that was fascinating.
[191] And I thought it was great that you did that.
[192] And I have to say, like, there is no world in which it's okay to photograph children.
[193] Right.
[194] It seems pretty self -evident.
[195] For one thing, if we want to put it in more formal ethics terms, for one thing, children can't consent any more than they can consent to medical treatment or, you know, sex or anything else, right?
[196] Yeah.
[197] And you, the parents, you're not consenting.
[198] So there's no consent.
[199] You don't get to do things to other people's children, and especially in a social media world.
[200] Like, to me, that is just... there's just no way that that's okay. And I think it's really interesting that you're trying to get a law to make it so. But, like, why should you need a law to make it so? Well, and I was never under the illusion we would get a law to make it so. So there was another campaign, simultaneous to ours, led by other actors, and I think they were super well-intentioned, but I was like, this is just never... Ultimately, this will end up in the Supreme Court, and it'll never... they're not going to make an exception in the First Amendment for this. It has to be like an entirely different plea.
[201] But England, oddly enough, where you're at, they are really, they have, you know, great rights to protect the press.
[202] And yet it is illegal to photograph kids in England.
[203] Yeah, the freedom of the press and free speech in England is even more liberal than it is, you know, and that isn't to say that the British press, especially the British tabloid presses, I'm sure you know very well, behaves well.
[204] But, you know, the other thing about it is who's buying these photos?
[205] Like, if people are taking them, it's because there's a market for them.
[206] Yes.
[207] Publications are buying them online or otherwise and people are looking at them.
[208] And so I always sort of look at all of the different parties to who's responsible and there are some that are more direct like the photographers.
[209] But, you know, if there was no place to sell them, would they really be taking those pictures?
[210] Oh, I couldn't agree more.
[211] And again, I was also sympathetic to the person ultimately at the end, down river: the consumer who buys the magazine.
[212] I don't fault them.
[213] A, who's not curious to see what Brad Pitt's kid looks like?
[214] Like, I'm not judgmental of that.
[215] I get it.
[216] And if the photo is one of the kids smiling, I have no way to know how it was garnered.
[217] I don't know what it is, you know, I don't know how many pictures of him crying there were before I see the one of him smiling.
[218] So, you know, that's a lot to ask the consumer to recognize.
[219] So, yeah, part of it was just like educating people who like that stuff.
[220] Like, hey, it's a little darker than you might think.
[221] There's guys living in bushes outside of our house and they follow us everywhere.
[222] Yeah, but it's a whole process.
[223] And then, yes, ultimately we had to go to the picture houses and say, like, here's a coalition of people who will never do an interview for you again.
[224] You know, you've got to play ball.
[225] Like, there has to be a financial incentive.
[226] And that's ultimately what I think worked.
[227] Well, I've done a lot of work recently looking at where can consumers make a difference.
[228] And I think there are places consumers can make a difference.
[229] And then there are places where you still need regulation.
[230] There are never going to be enough people who come off of Facebook that, you know, you're just going to clear up all social media addiction and online bullying and all the other things we've seen lately.
[231] Everyone has to play ball, as you say.
[232] Like the regulators, the company, the people who use it, et cetera.
[233] So let's get into that one because that one, again, to the frustration of everyone that loves me, as a hobby, I really, really try to make an argument for the other side.
[234] So first and foremost, I recognize Facebook is making everyone feel terrible.
[235] Most of the social media is, I think it's polarizing everyone.
[236] I think it's driving
[237] people to rabbit holes.
[238] I think it's radicalizing people.
[239] I think there's so many things you could say about it.
[240] Now, the way we're in this problem is you would know the name of the bill.
[241] I have forgotten it, but basically it was decided early on that these social media platforms would not be considered publishers and that the content on there would not be their responsibility.
[242] And so that is what has led to all this trouble ultimately and why it's so hard to regulate, if I understand it correctly.
[243] So because YouTube is a self-publishing platform, YouTube can't be held responsible for some knucklehead putting a silly video up on there.
[244] And now the argument I understand and I recognize is valid is like, you couldn't have had any of these things if they're being held responsible as publishers.
[245] They just simply wouldn't exist.
[246] So the option was really they don't exist at all or they're going to exist with no legal liability.
[247] So that's a really interesting way to put it.
[248] There are a couple things.
[249] First of all, I should say, I am very pro -technology when it's societally beneficial and when it's ethically deployed.
[250] So I'm pro -tech.
[251] I'm pro -innovation.
[252] Look at all the good that Facebook is doing.
[253] Look at all the people in the world for whom, like, it's tantamount to the internet or being able to connect with families, and that would be their only way.
[254] Or Google is giving them, you know, access to search or whatever, or even, you know.
[255] Organ matches.
[256] There's like crazy stories of people's lives being saved.
[257] Yeah.
[258] Totally.
[259] So I think we need to make sure we can.
[260] keep the positive.
[261] And I think we're in a really tricky bind with the law that you mentioned, Section 230.
[262] I mean, I think the issue is that we can't have binary.
[263] They're not liable at all or they're liable for any old thing.
[264] I mean, these companies are increasingly capable of controlling the really awful stuff.
[265] And they do.
[266] I mean, they've made progress.
[267] They just need to make more progress.
[268] Yeah, they need to approach it as if they're going to get outpaced by their competitor. Like, put that same level of investment and power behind that aspect. No, exactly. Because if they can do it for child pornography, they can do it for inciting violence, right? Now, I'm not a technical person, as we established at the very beginning of this call, when I put my batteries in backwards in the corner. Okay. Um, but nonetheless, you know, they are doing it for certain things, which means they can do it for more things. And I think a lot of what we hear from the companies is, you know, they just don't want the slippery slope.
[269] So they don't want to touch this law at all because they think if they touch it at all, it's going to be the slippery slope to the binary that you described.
[270] I think everybody needs to take a deep breath and say, we're not going binary here.
[271] We're going to try to maximize the opportunities of these platforms.
[272] And we're going to try to mitigate the risks.
[273] And the risks are really serious.
[274] Okay.
[275] So I'm with you in that and I am a proponent of that approach.
[276] But I immediately see a very legitimate concern for
[277] people that are not in my party, but on the right, because seemingly most of the proposed regulations would disproportionately affect the free speech of people on the right, if we're being dead honest.
[278] I mean, QAnon, the inciting violence, these are all, all the ones that are currently in the news are pretty much right issues.
[279] Would you, I mean, can we agree on that a little bit?
[280] I know it's not blanket.
[281] Yeah, no, I see where you're going with it.
[282] And it makes a lot of sense.
[283] And it's a really, really important question.
[284] I think anybody inciting violence, and by inciting violence, I mean, like, in a really direct, like, go here and do this now kind of way, doesn't belong on these platforms.
[285] And, you know, how you take them off, do you take the person off?
[286] Do you take just the offensive speech off?
[287] And offensive is not the right word.
[288] Do you take just the inciting violence speech off?
[289] How do you do it?
[290] That's a different question.
[291] You know, it appears that now a disproportionate amount of this speech is coming from one side of the political spectrum.
[292] Yes.
[293] And again, I have to force myself to imagine I'm on the right and I'm watching the BLM protests and I'm seeing a ton of violence and I'm seeing deaths and I'm seeing destruction.
[294] And I on the left am saying, well, you're confusing and conflating looters with peaceful protesters.
[295] So you're trying to make that one group, and that's how I can easily delineate that.
[296] But I'm sure on the right, they're looking at the knuckleheads who stormed the Capitol and going, well, no, those are just anarchist idiots.
[297] And they don't represent the thing I'm talking about.
[298] And so...
[299] Yeah, but I would agree with them.
[300] I think most people who are logical would agree, would say, yeah, those aren't just regular run-of-the-mill Republicans.
[301] They're extremists.
[302] And if we call it what it is, which is that, that should be fine.
[303] You should be happy that we're saying that, like, actually, we don't think they're like you.
[304] We think they're their own fringe thing and they need to be controlled.
[305] Yeah, I just imagine if the right owned all the tech companies, which they don't, the left owns them all, that they would have had a different opinion of what should be policed as far as promoting BLM rallies.
[306] And that makes me nervous.
[307] Yeah, it makes me nervous too.
[308] I mean, I really think that we need to make sure that when we talk about free speech, you know, we're not protecting inciting violence, no matter what the political views are of the person inciting violence.
[309] So as Monica says, I totally agree with the way Monica just put it.
[310] Like, extremists, it doesn't matter what your political party is.
[311] You don't get a platform.
[312] But in full honesty, if I'm being dead honest, I actually will excuse stuff on the left that I would never on the right, which is I hear people being very critical of riots in black neighborhoods where they burn down buildings.
[313] And I say, yeah, I don't agree with burning down buildings.
[314] but I'll tell you what, the news is in Ferguson now, and it is national, and you have not created a system by which you're going to listen to them if they play ball, so this is their last option.
[315] And guess what?
[316] You're now listening, so you can't argue it's ineffective.
[317] So I just know how skewed and biased I am, because I believe in that movement, and I'm very quick to explain why that's the only political option for a lot of people.
[318] Yeah, so do I. And I think I'm probably guilty as well.
[319] And the thing that is the most exasperating for me in all of that is: how is it possible that it takes burning down buildings to see racist police?
[320] They knelt during the national anthem.
[321] They actually, there was an attempt to do it peacefully and no one listened.
[322] And in fact, they penalized those people.
[323] So yeah, but all that to say, emotionally that's how I feel about it.
[324] I have to imagine there's people on the right that are saying you're not listening unless we act like idiots.
[325] And I totally disagree, but I just, I think it's always worth challenging that.
[326] I completely agree with you.
[327] I want to make sure that we're listening to the right.
[328] I don't have a lot of space for extremists, no matter what, their political party, but absolutely want to listen to all parts of the political spectrum.
[329] Because the reason we are where we are is that we are not listening to all parts of the political spectrum.
[330] Yeah, so let's get into the binary thing because you have six forces in your book that are present in every ethical dilemma.
[331] And then the first is banish the binary.
[332] And this is something that Monica and I are on our high horse about all the time.
[333] We're so critical of all binary thinking.
[334] The vast majority of issues we'll have to confront.
[335] They're not black and white.
[336] They're gray.
[337] It's a sliding scale.
[338] Yeah, tell us about the complexity of really what most things are.
[339] Yeah.
[340] So, you know, I loved your tagline, the messiness of being human.
[341] And for years, I've been starting my classes by saying, you know, welcome to ethics on the edge.
[342] We're going to muck around in uncertainty.
[343] You know, students look at me like, what are we in for here?
[344] But there are a couple things about this gray.
[345] One is that the world is moving so fast.
[346] So what binary does is it sort of says, you know, this is ethical or it's not ethical.
[347] And by the way, I don't allow the words ethical or unethical in my ethics classes.
[348] It's just too easy. The other thing it does is, it is a sort of... and we are definitely at a historic moment of taking sides. Everybody's got to be on a side. And so that destroys all kinds of potential for really seizing opportunity. It also destroys potential for things we can talk about later if you like, like healing, connecting. But the biggest problem is that, you know, you just miss out on so much. And so, in terms of our ethical responsibility, do we really want to shut down social media when so many people are dependent on it?
[349] Do we really want to shut down facial recognition technology completely when it might help us find a terrorist or a lost child?
[350] We don't want the police using it in any old way.
[351] We certainly don't want biased facial recognition technology being used by police that can then end up where we started the conversation, which is innocent people going to jail.
[352] But we also want the upside.
[353] So I, you know, I think that the world is so complex that in most of these cases, and in particular, a lot of cases around technology or gene editing.
[354] We want it to help cure cancer.
[355] We don't want to see what happened in China with the rogue scientist He Jiankui, who, you know, if he did what he said he did, you know, edited embryos.
[356] To make them impervious to HIV, which, on top of being potentially unethical, was also completely unnecessary.
[357] But, you know, when you call somebody ethical in a way, you're sort of giving them a pass forever.
[358] We're only as ethical as our last decision.
[359] I couldn't agree with you more.
[360] I also think that labeling ethical, there's another way to look at it, which is like good and evil.
[361] Those to me are the most dangerous things because if we label Hitler evil, well, we're done thinking about that.
[362] There's no other causality needed to explain that.
[363] So then when we don't investigate how he came to be, we cannot prevent it from happening again.
[364] So evil or good to me is always so incomplete, and then there's nothing to gain from applying it to anything.
[365] Right.
[366] And also you're kind of stuck in it.
[367] So the other thing is that as much as I don't think anybody deserves a life pass and a little sticker that says ethical, I don't think anybody or, you know, few people deserve to be condemned as unethical, right?
[368] And so one of the things I talk a lot about in the book is sort of resilience and recovery.
[369] And, you know, and we can talk about that as sort of national healing
[370] at the moment.
[371] But I just think that, you know, by calling somebody unethical, you just miss all kinds of opportunity for recovery.
[372] The final piece of it is that I'm an ethics optimist, but I'm also an ethics realist.
[373] And I tell my students all the time, you can do your ethics in theory.
[374] You can cherry pick what you like and do your ethics outside of reality, but reality is always going to come back to bite.
[375] And reality is messy, like you guys say in your tagline.
[376] Yeah.
[377] So, okay, great.
[378] This will open up something I've been ruminating on for a while, which is I read the Moses book about Robert Moses, who built virtually all of New York City.
[379] He built the freeways.
[380] Highly unethical things were done.
[381] You know, poor people were kicked out of their neighborhoods, all kinds of wreckage that he created.
[382] And also, he built the freeway system.
[383] And he made New York the greatest city in America.
[384] And I look at our current society where everything's binary, so it's good or bad, and everyone has a voice, and everyone can start a boycott, and everyone can get on the news. I wonder, is there an appetite for anyone to do something that is unpopular but necessary, or that is only 52% beneficial to society? Even 48%, not great.
[385] I wonder how we would do any of these public work projects in our present day.
[386] Does that make any sense?
[387] Yeah, it makes a lot of sense.
[388] So I haven't read the book, but certainly, I mean, when we look at things, I think we need to ask.
[389] So I ask the question when and under what circumstances would you do something.
[390] And I think we don't do enough considering how we go about something.
[391] You know, there are a lot of ways to do something and get to the same end.
[392] And very often we sort of see only one route.
[393] Well, there are a lot of ways to move less privileged families and get that freeway.
[394] You know, there are really humane ways of doing it and they're really awful ways of doing it.
[395] Now, they may not like getting moved, you know, and that may be a risk and that may be a downside that is for the betterment of society worth taking.
[396] But I don't think we spend enough time thinking about how we get to the results that we want to get.
[397] Yeah, and again, I just wonder if we could just do it theoretically.
[398] For more Armchair Expert, if you dare.
[399] We've all been there.
[400] Turning to the internet to self-diagnose our inexplicable pains, debilitating body aches, sudden fevers, and strange rashes.
[401] Though our minds tend to spiral to worst-case scenarios, it's usually nothing, but for an unlucky few, these unsuspecting symptoms can start the clock ticking on a terrifying medical mystery.
[402] Like the unexplainable death of a retired firefighter, whose body was found at home by his son, except it looked like he had been cremated, or the time when an entire town started jumping from buildings and seeing tigers on their ceilings.
[403] Hey listeners, it's Mr. Ballin here, and I'm here to tell you about my podcast.
[404] It's called Mr. Ballin's Medical Mysteries.
[405] Each terrifying true story will be sure to keep you up at night.
[406] Follow Mr. Ballin's Medical Mysteries wherever you get your podcasts.
[407] Prime members can listen early and ad-free on Amazon Music.
[408] And my podcast is back with a new season.
[409] And let me tell you, it's too good.
[410] And I'm diving into the brains of entertainment's best and brightest, okay?
[411] Every episode, I bring on a friend and have a real conversation.
[412] And I don't mean just friends.
[413] I mean the likes of Amy Poehler, Kel Mitchell, Vivica Fox.
[414] The list goes on.
[415] So follow, watch, and listen to Baby,
[416] This Is Keke Palmer on the Wondery app or wherever you get your podcasts.
[417] So if they were going to build a highway here in Los Angeles, that was going to... or let's even make it better: it's an all-electric thing that's going to be great for the environment.
[418] It's going to do all these things.
[419] But it has to be built somewhere.
[420] And then people will say, well, why don't you build it through Beverly Hills?
[421] And then you go, well, the reality of the marketplace is that that is unaffordable for any nation to undertake.
[422] Okay.
[423] So there's roadblock one.
[424] Where can you afford to build it?
[425] Well, unfortunately, it's going to be the people that have already been shit on.
[426] That's where you can build something.
[427] That's a very hard proposition to make.
[428] And I think currently it would just be like, well, we
[429] can't do it.
[430] No, I think you're right.
[431] I think there's very little appetite for things that are difficult and for things that are politically incorrect.
[432] I mean, look at COVID.
[433] COVID is in many ways the individual making a sacrifice for the good.
[434] You know, everybody wearing a mask, everybody respecting social distancing so that for the good of society or, you know, having a certain vaccine rollout mechanism that is for the good of society, even though lots of people who are in their 40s really need to be out and about working and taking care of children and really want to get vaccinated, right?
[435] Yeah.
[436] So we have this, you know, individual versus society in a lot of cases.
[437] But I think you're absolutely right.
[438] I think we've gotten to a point where the really tough decisions, I think that that would sit on the back burner.
[439] So COVID.
[440] So the thing, my current frustration is, and there was actually a pretty good New York Times piece semi-addressing this, which is because of our binary identity, left, right, both sides are completely painted into a corner.
[441] So if you're right, you're almost obligated to host a no masks party.
[442] You have to do that, which is insane.
[443] And then in my opinion, if you're on the left, you have to spend your days going through other people's social media and trying to shame them for a situation you deem unsafe that you don't really know a fucking thing about.
[444] So I look at both those sides and I'm like, I don't agree with either of you.
[445] I think people are sometimes evaluating their mental health against their physical health.
[446] You know, there's a lot going on.
[447] What I try to do is to extract COVID, you know, even though it is, especially in the U.S., a very politically charged reality. I find that very, very unfortunate.
[448] But the first thing is, you know, it's so embarrassing, yeah.
[449] The first thing is, you know, we start with science, right?
[450] We'll listen to people like, you know, people I've heard on your show, Vivek Murthy, or, you know, Sanjay, right?
[451] Right?
[452] I mean, we don't get to have our opinions about the science.
[453] We get, you know, opinions are very important.
[454] Facts are very important.
[455] They are not the same thing.
[456] So parties with no masks, kind of a non-starter for me, not because that's my opinion, but because I listen to Dr. Fauci and Sanjay Gupta, right?
[457] It's a non -starter.
[458] And on the other hand, the shame thing. So I always say, like, there is no place for blame, shame, or guilt in ethics.
[459] And it's not just that there's no place for them.
[460] They are actual creators and drivers of unethical behavior.
[461] And the worst I ever saw was in London.
[462] We were told that the government, there was actually a minister, and I won't mention her name, who was telling the citizens of London to actually shame your neighbor.
[463] Like if you were supposed to be quarantining, but, you know, maybe you had an emergency, so you put 12 masks on and you ran to the pharmacy to get a prescription, you know, the next thing you know you'd be all over social media.
[464] It's ridiculous.
[465] People have no idea what they're seeing.
[466] They don't know if somebody has been tested or is having an emergency or who knows.
[467] I mean, and by the way, like, why would that be a constructive step?
[468] It drives me crazy.
[469] I almost hate that more than I hate the no-mask house party, which is absolutely bonkers.
[470] It's just.
[471] No, no, but it's not that bonkers because, you know, this quest to shame people, I mean, shame is just so toxic.
[472] And it's something that all of us feel at one point or another.
[473] in our lives for different reasons.
[474] But the idea that you're going to make somebody else feel that, that you're going to intentionally try to create that awful feeling, and even worse publicly on social media, is, you know, again, a non-starter.
[475] I do like that about your approach.
[476] It's kind of you start by saying, perfect is not even the goal.
[477] I'm not aiming for per... I mean, maybe I'm aiming for perfect.
[478] I have no illusion that you can be perfectly ethical.
[479] And within that, in my opinion, there's a lot of room for understanding and forgiveness and recognition that, yeah, we're all, you know, we're giving it a shot, some better than worse.
[480] And because as you say, it leads to actually more and more unethical behavior, that cycle.
[481] Yeah.
[482] Actually, do you mind if I take a crack at slightly rephrasing what you just said?
[483] I agree with it completely.
[484] Please do, because I wrote it down and I couldn't find it.
[485] No, no. It's just that it's so important to me. So, you know, as you say, perfection is not possible.
[486] And in my view, it's also not a laudable goal.
[487] But here's what perfection does in sort of ethics terms.
[488] I mean, since it's not possible, there are only two reactions that can happen.
[489] One is people are going to cheat to try to achieve it.
[490] So perfection can take a lot of different forms.
[491] It can be like an unrealistic sales target.
[492] It can be like unrealistic looks.
[493] So you have this epidemic of teenage girls
[494] trying to have plastic surgeons create what they want their manipulated selfie to look like, really dangerous things.
[495] But so cheating to try to achieve it.
[496] But the other thing is the epidemic of mental health issues we have, especially with teenagers, university students, doesn't matter what socioeconomic situation, et cetera.
[497] It's so terrifying.
[498] And it's largely because of that.
[499] And it's sort of like, you know, so many young people especially feel like, well, if I don't just get this next thing, whether it's
[500] get into this school or get this one score or get this one job, that life is kind of over.
[501] And there's no joy in that.
[502] Yeah.
[503] I have this really inexpensive guitar sitting behind me that I have because this is going to sound crazy to you, I'm sure.
[504] But about three or four years ago, I was watching something, and Bill Gates said, you should learn something new every week.
[505] Uh -huh.
[506] So I decided I was going to do something.
[507] Now, if you're Bill Gates, you can, you know, learn everything.
[508] there is to know about malaria nets in an hour because you're Bill Gates.
[509] I was not trying to get a PhD in mosquitoes.
[510] I just figured I'm going to take something that's like half an hour a week.
[511] I've always wanted to play the guitar.
[512] It's going to be no pressure to be good, just half an hour of learning something new.
[513] So fast forward three years.
[514] I still have my, you know, $90 guitar.
[515] And I know more about the guitar than I did.
[516] But I am terrible.
[517] And that's fine.
[518] It was all about you don't have to be good.
[519] I'm going to go further.
[520] I don't want to really be around anyone that's perfect.
[521] What fucking stories will they have?
[522] My favorite stories are when people shit the bed, make a mistake, do a dumb parenting decision, are a terrible husband at one point.
[523] You know, all that stuff is what is attractive to me about people, funny enough.
[524] Yeah, if I sit down and you do a classical guitar performance, I'll dig it for five minutes and I'll be like, okay, great, you're perfect.
[525] That's that.
[526] We're done with that.
[527] Yeah.
[528] I mean, my five children are the experts
[529] on how imperfect I am, and that's something that's so important to me. And especially when I'm working with young people, if there's one thing you learn in my class... I don't care what you take away from this class or what grade you get.
[530] If there's one thing you learn from this class, it's that you need to banish the word perfection from your vocabulary.
[531] Yeah, yeah, I'm with you.
[532] Let's start with a question I should have asked right out of the gates, which is, and it'll sound offensive maybe or dumb, but why would people even have an interest in being, quote, ethical?
[533] What is the incentive for people to explore ethics in making ethical decisions?
[534] I actually think it's a really, really great question because for many years when I started studying this and started working with it, the reaction was always the same.
[535] It was almost like an allergy.
[536] People thought, you know, ethics is going to be really inconvenient.
[537] Yeah.
[538] So first we had all these kind of, let's just tick a box.
[539] Let's just say if we're a company that we have a code of ethics and that we've had this training
[540] and we've had that review at the end of the year.
[541] But actually, the way I see ethics, it's really about being much more conscious of the impact you have on people around you and the consequences that you're creating for yourself and for the world around you.
[542] So you use the word stories, Dax, and for me it's that.
[543] It's, you know, how well or not so well we integrate ethics into our decisions determines our stories.
[544] And it determines our influence on everybody else's stories.
[545] So it's really about how we're connecting with other people and how we're having an impact on the world around us.
[546] I ultimately think everything's selfish.
[547] So for me, the incentive to try to be as moral as possible is that I like myself more, which is ultimately what I'm striving to be is someone that I look in the mirror and I'm proud to be.
[548] So for me, that's my incentive.
[549] And I don't know what other people's incentives would be.
[550] I think for my wife, where she's just genuinely, intrinsically concerned about everyone in the world, I'm not.
[551] For her, that's easy.
[552] You know, she really wants to have an impact on the whole world around her.
[553] I'm coming more from, maybe, I guess, the individualist point of view, which is I just don't want to hate myself and I don't want to live with myself when I act shitty.
[554] The starting point of this is sort of, who do you want to be?
[555] Right.
[556] And you don't need to be the person you want to be all the time.
[557] That's the perfection thing, right?
[558] I mean, we think we know who we want to be, and maybe that evolves also.
[559] So I think it's a lot about that.
[560] It's a lot about how do you want to define yourself and present yourself to the world and who do you want to be for yourself?
[561] And ethics isn't about you have to be spending all day, every day, worrying about have you been as nice as possible to person X or whatever.
[562] It's really about just being thoughtful.
[563] I talk about listening to what people are really saying and not listening to what you expect to hear or what you want
[564] to hear or even what you think they should be feeling.
[565] So your book, The Power of Ethics, How to Make Good Choices in a Complicated World.
[566] Your hope is to democratize ethics.
[567] And so here's one thing, as a self -admitted, I just said, selfish person.
[568] I do sometimes make an excuse for people because I also think thinking about what's ethical is a bit of a privilege.
[569] Like if my hour to hour thought is how the fuck am I paying for the rent, how am I bringing home food to my kids?
[570] I'm not then thinking, oh, is the food I'm bringing home to my kids also supporting factory farming?
[571] There's no room for that.
[572] You know, I'm pretty sympathetic to like, it does feel like a little bit of a luxury, at least in my own life, I found that as my needs were met, I had more bandwidth to think about these things.
[573] You raise such an important point, and I completely agree.
[574] So when I talk about democratizing ethics, the first thing I mean is making ethical decision-making accessible to everybody, like almost a habit.
[575] It's not that there's some perfect process, but I have this framework in the book, and it's like, it just becomes sort of how you deal with the world.
[576] Then we look at the choices.
[577] There's no question that if you don't know how you're going to put the next meal on the table, you're not worried about organic.
[578] Right, right.
[579] And if your only way to communicate with someone at the other side of the world is Facebook, you're just not going to be all that worried that there's all the kind of stuff that we saw in The Social Dilemma going on, because that's the only way you're going to be able to communicate with your family.
[580] But the other thing I mean by democratizing ethics is that the world has gotten so complicated, and in particular things like artificial intelligence and gene editing and all of that, that we have this really complicated expertise that is lodged in the brains and the power of a very few people.
[581] Yeah.
[582] And what I really want to say is we can't have a society where only people who really understand gene editing get to make the decisions about how that's unleashed on society.
[583] And we ought to be able to have a voice about these things.
[584] I don't need to understand how an algorithm works.
[585] To have a voice. I should be able to have a voice on whether or not we're using potentially racially biased facial recognition technology by police.
[586] So it's one thing to monopolize social media.
[587] It's one thing to monopolize search.
[588] It's a whole other thing to have
[589] people in power, or people with very, very sophisticated knowledge, monopolizing our ethics.
[590] Yeah.
[591] What an unforeseeable experiment that's happened.
[592] Where we're all sharing, A, a language, whether you think so or not, we're sharing ones and zeros.
[593] We're sharing a binary platform.
[594] And we're beginning to think in the way it guides us to think, which is so unimaginable.
[595] It's quite scary.
[596] It is really scary because some of it is not human intervention even that's making it happen.
[597] And tell me, Dax, if this is, if I'm right about this, because I'm not an anthropologist, do you feel more in control if you knew that humans were making decisions?
[598] And you could then deal with those humans.
[599] But these algorithms are kind of, you know, putting us in bubbles or, you know, polarizing us, et cetera.
[600] Yeah, it's really, really interesting to try to comprehend motive in a situation where it's a computer who has no ego.
[601] It only has a directive, get people to consume more of this media for longer without any concern over what the media is or why it's more appealing.
[602] Yeah, it's quite scary.
[603] And then, of course, we had Tristan Harris on and we're all alarmed about it.
[604] But the notion that we've, you know, in many ways, created things that even the creators have no idea what the end result will be and that they do end up with a momentum of their own.
[605] And they kind of just wake up going, oh, wow, all that increased viewership came at this price.
[606] We missed that.
[607] You just hit on something that I think is so important.
[608] So one of the things I think about with ethical decision making, especially with people who do have the time, you know, and creators of this technology do have the time, is that we should be thinking about the consequences that could happen, not just the consequences that we know will happen.
[609] And the other thing is we need to be thinking about, particularly with technology, short, medium and long -term consequences all the time of the decision.
[610] And kind of what's happening is we've become like serial short-termists.
[611] It's like, well, I'm going to make a decision that suits me for this week.
[612] And then I'll come back on Sunday and I'll make the decision for the next week.
[613] But the problem is exactly what you just described: the longer-term consequences, it was like, how did that happen?
[614] I do believe most of those people were like good, you know, folks.
[615] I don't think there was a twirling mustache person at YouTube.
[616] No, I think, no, these people are not going to work thinking, how can I be unethical?
[617] Yeah.
[618] You say that you want people to recognize the power that they have over their ethics and societies at large.
[619] And I'd be curious to hear how you think that works, because I'm a little bit of a pessimist sometimes.
[620] So I would love to hear how we do have power and choice and how that.
[621] can affect the people around us in our society at large.
[622] Also, you know, and again, it's not about sort of having a sit-down for an hour and putting your dilemma through, is it Plato or is it Aristotle?
[623] I just think every little thing adds up.
[624] Sort of read an article and express a view or think twice before you post a photo of a baby on social media.
[625] Not that there's any problem doing it, but just, you know, make sure that it's right for you.
[626] Yeah.
[627] And, you know, by the way, as I say in the book, I never tell people what to do.
[628] because I'm not going to be the one living out the consequences.
[629] And I certainly don't know anybody else's life as well as they do.
[630] But I think that decision by decision, we really can have an impact.
[631] But bigger things, like just thinking about, do we take the car keys away from an elderly grandmother? That's someone who could potentially do harm, and yet it is taking away their independence. Just thinking through, okay, how do we handle this?
[632] I regret not doing philosophy.
[633] I love the ones where it's almost impasse.
[634] I don't know.
[635] You know, I'm just so certain of things, but I love the ones that are like, you know, the hypothetical one about sleeping with a sister on a vacation.
[636] Yeah, that's Jonathan Haidt's.
[637] Thought experiment.
[638] Moral dumbfounding experiment.
[639] You present something that you really can't make an argument against, but your gut is so certainly telling you it's wrong that you will find many, many ways to explain why it's wrong, even though there isn't really any kind of logical attack.
[640] Yeah.
[641] I mean, I think, so those are really interesting.
[642] And the thing is, though, that what we see is that a lot of these dilemmas have actual real potential consequences.
[643] Like whether or not you take the car keys away from an elderly grandmother could have serious, you know, it might not.
[644] This person might drive and be fine and everything's great.
[645] They might pull over and save a motorist.
[646] Exactly.
[647] Now, let's talk about the fun fact that Mike Schur, who created the television show my wife was on, The Good Place, loves you
[648] and has spoken highly of your work on numerous occasions.
[649] The thing I liked about the show where it went is that it attempted to explain the deep complexity of the 21st century.
[650] I just got into a debate the other day in our kitchen about this.
[651] It was like someone was talking about the poor wages that some company is paying the people in Vietnam or whatever country the manufacturing facility is in and how it's unacceptable,
[652] it needs to get shut down, and then I raised the point.
[653] Yeah, I mean, it's fucking terrible.
[654] There's no two ways about it, but I don't know that when you talk to the person that's getting 20 cents a day, they're not going to be pissed you took away the 20 cents a day.
[655] Like it's, it's not easy.
[656] It's not just, yeah, clearly that's unethical.
[657] They're paying him that.
[658] I agree on that part.
[659] But I don't know that the solution is to remove it.
[660] That's complicated.
[661] And I thought the show did a good job.
[662] And I think he was modeling a lot of what he wrote after your work.
[663] Just pointing out
[664] how increasingly complicated the world is and how hard the decisions we have to make are.
[665] Yeah.
[666] So, I mean, I first of all, I have to say, I think he's just brilliant.
[667] And if you ever needed a cure for regretting not being a philosophy major, Chidi should be the cure.
[668] Yeah, yeah, yeah.
[669] He's a total stress case, like of all of the characters in that, you know, he's going through John Stuart Mill and Bentham and all of these things.
[670] Everything has to be just so.
[671] And I think an episode I just watched recently, he says something like, and I'll get this wrong, so Monica, with your brilliant fact checking, you know, but he says something like I can turn any situation or any place I encounter into a living hell.
[672] Yeah, yeah.
[673] You know, and I just thought that was the most brilliant.
[674] I mean, that's typical.
[675] Like, that's the most brilliant line.
[676] And actually, the people who are actually figuring stuff out, like the character that Kristen plays, you know, it's like street smarts or like actual
[677] thinking.
[678] That's what I mean by democratizing ethics.
[679] Like, you don't need six degrees from an Ivy League school.
[680] You don't need to have been raised in any kind of place.
[681] Like, anybody can do this.
[682] Anybody can think a little more about how am I affecting someone else.
[683] So I thought it was brilliant for that.
[684] But the other thing is this idea that at one point in this show, they keep getting reset.
[685] They don't know if they're like the 187th version of themselves or the 56th version of themselves.
[686] And I just thought that was so brilliant because I think it's really important in today's world that we realize that we don't get to erase.
[687] We can learn, we can recover, we can heal.
[688] But to me, one of the most dangerous things about the assault on truth is that it distorts memory.
[689] It distorts history.
[690] And so that whole thing really, really resonated with me. Well, yeah, we're in a very unique time where all of our secrets are now encapsulated in this device that
[691] could always be referenced, which is a little bit dangerous.
[692] I feel like one of two things are going to be the outcome.
[693] Either everyone gets so fucking good at lying or we all acknowledge we all have a ton of warts and fucked up relationships and broken promises and we all finally take a breath and go, oh yeah, we're all pieces of shit, but we're going to be okay.
[694] We're going to strive to not be.
[695] I just don't know which direction it's going, but it does feel unique to me. Yeah, no, I think you're right.
[696] But I think it would be really great if we all could take a deep breath.
[697] Now, again, I go back to some things are binary.
[698] You know, I mean, sexual assault and racism, like, no, I'm not taking any deep breaths there.
[699] But in terms of just like the kind of stuff you described, like we all didn't have the best moment in a relationship kind of thing.
[700] I mean, first of all, how does it happen that it's everybody else's business?
[701] Yeah.
[702] Like, nothing happens in private anymore.
[703] So I think that's really complicated.
But, you know, you raise a really, really great example with, you know, workers in a foreign country and their low salary and how they're treated and all.
[705] And is it all right if we just come back to that for a minute?
[706] Yeah.
[707] Yeah.
[708] I'd love to hear your opinion on it.
[709] Yeah.
[710] Okay.
So this came up with Bangladesh with these terrible stories a few years ago where, in one of the factories, there was a fire, and another one collapsed.
[712] And as you might expect, all of the workers inside were women.
[713] And then there was some also very bad behavior, like trying to trap the women in these things.
[714] And the companies that were involved, and there were many, there were many clothing manufacturer companies, they all said, well, you know, the supply chain, it's like 100 units down.
[715] It's this one, outsources to this one, outsources to this one, and we can't know.
[716] And the exact question you raised came up, which was, do we shut these things down?
[717] But that's the only source of income for these women.
[718] And this goes back to my when and under what circumstances or my how do you do it question.
My response to the companies is you can know your supply chain if you just spend some more money.
[720] This is not impossible to know.
[721] This is not like science we haven't discovered.
[722] This is a chain of contracts.
[723] And you can treat those workers better.
[724] You don't have to shut down.
[725] It's not binary.
[726] You just have to do a little more to make sure you know who you're dealing with.
[727] And by the way, it's not good enough to sign a contract, you have to actually inspect what people are doing everywhere along the way.
[728] Yeah, so I totally agree with that end of the equation.
[729] I think we have a right to demand.
[730] Yeah.
[731] Now, my fear is that there is a slight hint of what we would call naive relativism, where because I want a microwave, I assume everyone wants a microwave.
[732] I don't love when we have this kind of moral imperialism.
[733] It would be wrong for me to accept 15 cents an hour.
[734] So I'm going to tell this person over there that it's wrong for them to have 15 cents an hour.
[735] But I'm not asking that person, what would be your choice?
Would you like me, on your behalf, to get this place shut down, or keep your 15 cents an hour?
[737] I would feel better if I knew what that person's perspective was.
And I wasn't just over here deciding what everyone else's standards and morals should be.
[739] Yes, I completely agree with that.
[740] So I do think there's a difference between being the creator of a deadly fire or, you know, factory collapsing and sort of, you know, arguing about wages or whatever.
[741] Yes.
[742] But yeah, I mean, I think your fundamental point is so important.
[743] And it comes back to what I said earlier about listening, you know, and this is the whole democratizing ethics thing.
[744] We don't get to decide their ethics for them.
[745] Right.
[746] And we might find them reprehensible at times.
[747] But I have to also then respect other people's cultures and points of view and how they want to do things.
[748] So it's very tricky.
[749] I love it.
[750] It's very, very tricky.
[751] It's very, very tricky.
[752] I'd say, like, let's take a more nuanced one.
[753] It's so much more juicy.
And we just had Dr. Michael Eric Dyson on, who said he ended up coming to the defense of the Virginia governor who was found to have been in blackface when he was in medical school.
[755] You know, right on the surface, I haven't done blackface.
[756] I can't agree with that decision.
[757] And he said, despite that history, look at the body of work that the man has done since then, and he's got policies that are going to help.
[758] And so you could pick the guy with no skeleton in his closet, but he has no plan to help black communities.
[759] Or you could pick the guy with the skeleton in his closet, who I believe has a real platform, or rather a policy.
[760] So I look at the question in a slightly different way, which is I don't think we get a net ethics score.
[761] It isn't like you were bad here and you're good here, and the good is 10, and the bad is five, so your net score is five.
[762] So, like, to give you a corporate example of that, it's like we manipulated the foreign exchange rate, but we give women in a particular developing country a savings and loan program.
[763] I mean, so I think we have to be careful that we don't, like, get to net out our ethics scores.
[764] But on the other hand, I, you know, like I said, I'm really about resilience and recovery.
[765] So to me, what matters is three things.
Does the person tell the truth?
[768] I did this.
[769] And it was racist.
[770] So the example of that is Canadian Prime Minister Trudeau.
Like, I don't know if you've seen any of this footage, but there's this video of him on an airplane talking to the media where he says, I did this.
[772] There's no question it was racist.
[773] I should have known it was racist then.
[774] I'm not telling you it wasn't racist then.
[775] Yeah.
[776] And, you know, and so the first thing is tell the truth.
[777] The second is to take responsibility.
[778] So he clearly did that.
And then the third is to have a plan so that you aren't going to have it happen again.
So it's slightly different from saying, well, I like his policy now, so, you know, we're going to just, like, give him a pass. I think that ethical resilience has to start with truth and, you know, responsibility.
[782] I think that is the key.
[783] I think that's so the key.
[784] I think it's more about how you react to the thing than the thing quite often.
[785] Yeah.
[786] Well, I mean, we have to give people space to recover.
We all err ethically, I mean, in different ways.
[788] Stay tuned for more armchair expert, if you dare.
[789] The example where it really didn't work is, I don't know if you saw the news footage of Harvey Weinstein in the courtroom being convicted, and he's saying, but I didn't do it.
[790] Yeah.
[791] Yeah.
That's a case where you could make this case. It's intellectually possible to make a case where he goes, hey, look, these people want a job.
[794] They're willing to be with me to get this job.
[795] I'm being honest about it.
[796] I don't think there's a huge moral imperative about prostitution.
[797] If someone wants to sell their body for sex, I'm in favor of that as long as they are in full control of that and they have consent.
So you can make this intellectual argument of, oh, well, is that some form of an agreement, an arrangement, an exchange?
And then as you hear the cases, it becomes incredibly obvious that was not the case.
[801] People were terrified.
[802] They weren't making that arrangement with him.
[803] It was not a transaction.
[804] Even if they were, it was an abuse of power.
[805] So, like, regardless, I mean, I know that's a different, maybe the whole conversation, but even if they wanted the job and they were super happy to be there, he's abusing his power in all these situations.
Yeah, it's an abuse of power, but is it an abuse of power if a john has money and a prostitute needs money? There's a power imbalance.
[808] One person has money.
[809] The other person doesn't.
[810] Yeah.
[811] I mean, I do think they're getting taken advantage of.
[812] Yeah, I do.
[813] Yeah, I don't.
[814] I don't think prostitutes, oh, some, I think some prostitutes are obviously getting taken advantage of.
There are people who are held against their will, blah, blah, but I do believe there's a world in which someone decides they're going to be a prostitute, male or female, and that's a decision, I think.
And it's genuinely, you're talking about cases where it's genuinely free choice.
Yes.
Yeah.
And I think, I mean, I think, Monica, you raise an interesting point. You know, you talk about abuse of power. Dax used the word fear. I mean, all these things drive the spreading of unethical behavior, pressure, fear. And he had, like, a supercharged version of fear, because it wasn't just, you're not going to get in this movie. And you know this industry a million times better than I do.
[816] But it was like, you're not going to get in any movie ever.
[817] Like, he just had such, you know, such influence over the industry, apparently.
[818] Yeah, he's a disgusting monster.
[819] That's my assessment of him.
[820] Can't disagree with you.
[821] Yeah, it's pretty, the verdict's in on that.
[822] Yeah.
[823] It becomes very interesting when you switch genders.
So a comedian pointed out, if his son came home from a meeting with a female executive and she said, you have to sleep with me to be in this superhero movie.
[825] He'd say, oh, my God, what superhero are you going to be?
[826] Like, it does all of a sudden.
[827] And again, I understand the context, which is it's a patriarchy and they have the power and it's being exploited quite often.
[828] And when you flip it, it is not the same.
But just to tell you, like, when I work with companies, there are plenty of cases where women managers, for example, will invite a bunch of guys who are more junior to a totally inappropriate place, and they don't really want to be there, but they're there. The women are not understanding that it's not because they're women that it's okay. So, you know, your scenario actually does happen, like, in the mainstream corporate world. Like, I've seen it.
Right, right. I guess I'm only pointing that out just to say, like, once again, it's just very complicated. I think each individual case is so complicated, and we want the cookie-cutter solution to all this stuff, and sadly it's just going to take more work and more brain power and more patience and time.
[830] But what do you guys think about the question about, okay, we have somebody who we've decided is a total creep like Harvey Weinstein?
[831] And as you say, Monica, convicted, like literally convicted.
[832] So what do you think about, you know, what do we do with all of his works?
[833] What do you say to people who say we should just get rid of it all?
[834] Yeah, it's such an interesting.
[835] I mean, my favorite movie ever is a Miramax movie, which was run by them.
So, but I can't, but it's not fair for me to say, well, that's my favorite movie, so we should keep it.
[838] Just out of curiosity, what is it?
Good Will Hunting.
[840] So good.
[841] I think that's a personal choice.
[842] I don't think the Cosby show should be removed from TV.
[843] But I totally understand if you don't feel comfortable watching the Cosby show, because all you can see is him and all the stuff he did to all those women when you're watching it.
[844] And same with like Michael Jackson.
[845] Like, I think it's okay if you're like, you know what?
[846] I just can't listen to him anymore.
[847] But also, should it be taken off the radio?
[848] Probably not?
[849] I don't know.
[850] I have less of a hard time.
I think it's an absurd proposition that a monster like Michael Jackson, who created so much pain and damage, we would then punish him by not enjoying the thing he did that was good.
[852] So he's dead and I'm going to penalize myself for his crimes.
[853] That to me seems absurd.
I love Off the Wall.
[855] Also, Quincy Jones.
I'm going to penalize Quincy Jones, who produced that album, because this guy was a monster.
[857] That's, yeah.
[858] I mean, I see all of your points.
Unless it's a stand-up where literally there's one human being involved, you have to also recognize you're punishing everyone involved.
There's people that are mad at the author of Harry Potter, J.K. Rowling, and my thought is, okay, so boycott Harry Potter.
[861] So who does that hurt?
[862] It doesn't hurt her.
[863] She's already got a billion dollars.
[864] So you're not going to inflict any pain on her for that opinion.
[865] It's just, it's not going to happen.
So now you're going to deny the immense pleasure that it has given Monica's life.
[867] It's her favorite series of books she's ever read.
It accounts for a decade of her life.
[870] And so I'm going to, we're going to punish her?
[871] It doesn't make sense to me. Yeah.
[872] You know, I think so you both make so many good points.
[873] So in the book, I talk a lot about what you just said, Dax, about, you know, so many other people were involved in these movies.
[874] Like, we're going to destroy the careers and the ability of these people to have their work seen and appreciated because this one guy, I have a really, really hard time with it.
[875] I totally see your point, Monica, that like everyone chooses for themselves.
For some people, like, I had someone tell me the other day that, you know, he just can't watch a Woody Allen movie.
[877] Sure.
[878] Because Woody Allen is in the movie.
[879] And it's like, you know, but probably if Woody Allen wasn't in the movie, he might be able to enjoy it.
[880] So I think you're totally right.
[881] But the other thing is that I'm obsessed with truth.
[882] There's no alternatively factual ethics.
[883] So in a way, if we start erasing bits of the artistic history, we're erasing truth, you know.
[884] And I have a really hard time with that.
[885] I do too.
[886] I don't know why people are afraid of these things as being a conversation starter.
[887] Like, oh, that's interesting.
[888] Things have really changed, haven't they?
We still show them movies from the 80s.
[890] You know, most movies in the 80s have a guy pursuing a girl in a way that certainly wouldn't be cool today.
And I say, that's weird.
[893] That guy didn't listen at all when she said she wants to get out of here, huh?
[894] I don't need to get rid of it.
In fact, I think it's a great example of, instead of me telling her in theory what it can look like, I can show her what it looked like, and that's how it used to be, and doesn't that suck, and here's where we're going.
Yeah. And also, I think, you know, the further back you go, the more you would have to just erase. I mean, it's like, how far back are you going to go, and how are you going to draw the line, and who's going to draw the line?
Yes.
Like, who's going to decide that Michael Jackson, you draw it there, but not with, I don't know, somebody else? I think that's a really problematic situation.
Well, that's the unavoidable, if you want to talk about truth: it's all proportional to how talented the artist was.
[897] So it's like, no one's got trouble throwing certain artists away.
[898] Picasso's a hard one to throw away.
[899] Exactly.
[900] You know, and you got to be truthful about that.
[901] They add some value to humanity that is so immense.
[902] It deserves, some are easy.
[903] Some rapper I've never heard more than two songs from.
He, you know, kidnapped someone.
[905] Yeah, let's get rid of it.
Now, Michael Jackson, who, by all accounts, was a monster, it gets harder.
I mean, a different way of saying what you were saying about other people's work being affected: these pieces of art, you know, influence the arts more generally.
[908] So like Michael Jackson had an influence on dance, had an influence on, you know, videos, had an influence on all these things.
It would be like saying, you know, I've spent some time, for crazy reasons, the last week, like, reviewing lyrics of Wu-Tang Clan.
[910] And I mean, these guys are like, they're not perfect people either, but God, are they cool in a lot of ways.
And, you know, I'm just looking at this going, how do they get in the same song Chinese sword fighting, chess, the Emancipation Proclamation, guns, and, like, your relationship with your girlfriend?
[912] I don't know how you do that all in one song, you know, like I don't have that talent for sure.
But I was just looking at it similarly, like, thinking about, okay, this really iconic group that defined that genre, can you imagine if you just said, okay, well, we're just going to take them away?
[914] Yeah, no. You know, I tried to make this argument a couple years ago.
[915] I think what it's more telling of is how much we value art versus science.
[916] Because if we found out tomorrow, Einstein had raped 350 women, we would hate Einstein, but we would not throw away the theory of relativity.
[917] It would be insane for us to do that.
[918] It benefits us.
[919] So in my opinion, what people are really saying is, well, art's just not as valuable.
Because if it were, we would never consider throwing it away.
[921] We would never consider getting rid of Picasso's art if we thought of it having the same value as science.
[922] Right.
[923] But I think, you know, just to come back to truth, well, first of all, science, people think of it as fact and you can't sort of discard fact.
I'd argue that Thriller is a fact.
[925] The whole world loved it.
[926] A billion people.
[927] I totally agree with you.
[928] It is a fact that it existed, that it had this influence, the numbers you just cited.
Yeah, I mean, Thriller was, like, you know, incredible.
[930] But, you know, like more than one thing can be true.
[931] Michael Jackson can be this amazing artist and be a total creep in certain ways, you know, and also be incredibly generous in other ways.
[932] Like, we're complex people, and I'm not in any way, and I want to be really clear about this.
[933] I have zero tolerance for sexual misconduct.
[934] Yeah, same.
[935] Yeah.
Also, if Michael Jackson's songs were about having young boys to his room, it would be easy. We would be like, yeah.
There's songs about falling in love with women, which is horseshit, or getting Billie Jean pregnant, you know. They're not even, the Bill Cosby episodes are not about him, quote, mentoring young aspiring actors, you know.
Yeah, that's a really good point. It's complicated.
But I don't know about that, because there are songs about being horrible to women and treating women horribly, and they're playing, and we're not taking those off either. Or, talking about, like, even if those songs were about that, there are other songs that also exist about that, and those for me are hard to hear. So the one the other day I was listening to, it's on Yacht Rock Radio, I love it, you know, Delta always used to sing it. The chorus is, you know, if I could fly, I'd pick you up, I'd bring you into the night, show you a love that you never knew. So everyone knows the chorus of that song. The opening line of that song, I just realized the other day, was, she was 16 years old.
[937] And I'm like, oh, oh, this whole song is about wanting to show this young girl what an adult love is.
[938] This is disgusting.
[939] And because it's in the song, guess what?
[940] I don't really like the song anymore.
[941] Now, if I found out the dude was just a weirdo, but the song wasn't about that, it would be different for me. I guess that's what, I mean, again, I just think it's personal.
I think it's what you can tolerate, what you associate with.
And, yeah, no, it is personal. And also, we can not engage with it personally. We can decide we're not going to watch a movie or listen to the music, but still not remove it from, you know, from access, so to speak.
Well, yeah, erasing it from the archaeological record and making the decision for everyone else, I'm a little hesitant to do that. Even the song, 16 years old, it's not for me anymore, but, you know, it existed.
Yeah, yeah, yeah.
And again, when I heard it, the first thing I said to my daughters, because she always sings it, was, you know, that song's really weird.
The guy singing it is, like, 30 years old, and it's about a 16-year-old.
[945] Like, it's so gross.
[946] Yeah, because you do end up, it becomes almost, it can get close to like, well, don't talk about Hitler or the Holocaust because that's bad and that was a bad thing.
But really, no, you do have to talk about it, because you have to prevent that from happening again, and you have to tell people that was horrible.
[949] So I guess keeping it out there helps the conversation continue?
[950] I think so.
[951] Well, I certainly think the conversation continuing is such a great point, Monica.
I mean, and actually, Elie Wiesel's Nobel Peace Prize speech talks about how we can't forget.
[953] Now, that doesn't mean we have to turn everything into this, you know, really, really difficult conversation.
[954] You know, this is this thing that existed in this time.
[955] And boy, we have a different view of that today.
[956] And how great is that to handle it that way?
[957] Yeah.
[958] Well, Susan, you're radical.
[959] I'm so glad we got to talk to you.
Everyone should buy The Power of Ethics: How to Make Good Choices in a Complicated World.
If you loved The Good Place, you will love this book, because Mike loves this book.
[962] And we love Mike.
[963] And we love Mike.
[964] Well, yes, Monica and Kristen are deeply in love with Mike.
[965] Yeah.
[966] And so I'm threatened by Mike, but I recognize the brilliance.
[967] Okay, well, it's important to clarify that.
[968] I wanted to own all my biases.
[969] Well, so fun talking to you.
[970] I really appreciate it.
[971] Yeah, no, thank you both so much.
It's really such an honor to speak to you both.
[973] Thank you.
[974] All right.
[975] We'll talk again.
[976] All right.
[977] Take care.
[978] Bye.
And now my favorite part of the show, the fact check with my soulmate, Monica Padman.
[980] Okay, we have a special guest.
[981] on today's fact check for just a few minutes.
[982] Hopefully hours.
[983] Oh, my God.
But yes, we have, the only person that's been talked about more than Aaron Weakley on this show is you, Ashok.
[985] You come up quite a lot.
[986] Yeah.
[987] I'm not sure.
[988] What are you talking about?
[989] I didn't suspect you did.
[990] He doesn't really understand the simulation.
[991] I tried to explain it to him yesterday, but I said it's part of the, he's not supposed to understand it because it's his.
[992] Okay.
I'm going to walk Ashok through it.
[995] First of all, welcome.
[996] I'm so excited you're on.
For real, it's an honor to be here, my debut.
[998] Oh, Jesus.
[999] Uh -oh.
[1000] Let me walk you through the simulation, okay?
[1001] Well, of course you saw the Matrix, right?
[1002] No. Of course he has not.
[1003] You're the only person in America that didn't see the Matrix.
[1004] Wow, so we got uphill battle then.
But you're familiar with the idea, right, that maybe in the future, people will be able to plug into a computer, and in their mind they'll be having this full experience, but in real life, they would just be hooked to a computer.
[1007] Yeah.
[1008] So here's our theory.
[1009] We believe that you, Ashok, are somewhere.
[1010] You are hooked to a computer, and you are living this incredible simulation.
[1011] Because your life is suspiciously wonderful.
[1012] Would you agree?
[1013] Yeah, I can't complain.
[1014] The story is too good to not be a simulation.
[1015] How old are you when you moved?
[1016] 25.
[1017] 25.
[1018] And in your wildest dreams, what did you think America was going to be like?
[1019] You know, came as an engineer, you know, thinking it's going to be everything about technology.
[1020] But shocked a little bit when I first came, you know, watched the wrestling.
[1021] And I said, the very first time I saw that within a week, I said, this is not real.
[1022] Wrestling?
[1023] Oh, my God.
[1024] Oh, my God.
[1025] Oh, boy.
[1026] I said, this can't be happening in the United States.
[1027] You know, it's still the same.
Dad, we need to scoot your chair up because it's hitting that thingy.
[1029] It's not going to make noise.
[1030] So this is great.
[1031] So I wanted to watch wrestling.
They said there was wrestling on TV. So I started watching it, and within the first 30 seconds, I knew this is not real.
[1034] I knew this is not real.
[1035] Oh, right, right, right.
[1036] So you did watch like big time wrestling and realize, oh, this is all, it's a show.
[1037] And the United States, I was not expecting that.
[1038] And prior to coming, you had seen, I assume, movies that were set.
[1039] Oh, yes, absolutely.
[1040] And your idea of America maybe was kind of from movies?
[1041] Yes, movies and, you know, the textbooks we studied and all that mostly came from the United States also.
[1042] The authors, you know, university professors from United States wrote our textbooks for engineering and all that.
[1043] And so you came here and you thought, okay, I'm going to be an engineer and my life will be like, what?
[1044] You know, I'm going to have a wife and I want to have children.
[1045] Did you have a plan of where you wanted to live?
[1046] Did you think you were going to end up in Atlanta?
[1047] No, no, no. I had no plan.
[1048] I came to Chicago because my sister was there.
[1049] And then, you know, I got a job in Milwaukee, Wisconsin, three months after I moved here.
[1050] I got another job at Kansas City, Kansas.
Well, I think the stereotype is that most parents that are Indian, first generation, their children, there's this tremendous pressure on them to become a doctor or some kind of higher education, master's degree, Ph.D., all that.
[1053] And you didn't pressure Monica or Neil to do that.
[1054] No, probably because of my engineering background, because in this country, many of the people who came from India who are engineers because of the ups and downs of the economy, recessions, and they're getting laid off and jobs and all that.
[1055] Many just left the profession and started owning motels and all kinds of other businesses.
[1056] Dairy queens and whatever, you know.
And many of them who started those were originally engineers, many of them.
[1058] They started doing that, and they became very successful.
You know, initially they would look down upon, you know, those professions.
[1060] But then they made a lot of money.
[1061] My dad really only cares about money.
[1062] No, no, no, no, no. Well, no, that's okay.
[1063] Dax only cares about money, too.
[1064] No, no, no, what I'm saying.
[1065] So, you know, then, you know, you come to realize, you know, there are many different ways of making a good living or whatever they want to do.
[1066] But with Monica's case and the logic and Neil's case, even if you wanted, we don't have much choice.
[1067] You know, they're going to do what they're going to do anyway.
[1068] Well, especially in Monica's case, yeah.
[1069] So we got in a fight.
[1070] Well, first of all, I need to thank you.
[1071] We have a very successful show.
[1072] And the whole reason we have a successful show is because of you and Monica, because you guys argue all day long.
[1073] And so she's great at it.
[1074] And all Monica and I do is argue all day long.
[1075] And when I saw you guys together, I was like, oh, I get it.
[1076] This is why Monica knows how to get along with me, because she just argues all day long.
[1077] So I have you to thank for this dynamic relationship that has been successful.
[1078] But was she a brat?
[1079] Because we fought about this a couple days ago.
[1080] No, no, she was not.
[1081] You can answer truthfully if you want.
[1082] No, truthfully, I don't think she was.
[1083] Thank you.
[1084] I don't think she was ever.
[1085] Right.
[1086] What about when she was screaming for her mother to make her milkshakes?
[1087] I don't even remember that.
[1088] You don't think that's bratty?
[1089] He wasn't paying very much attention.
[1090] My mom might have a different opinion on the Is She a Brat question.
She was, Dax, we never, I don't think we ever questioned at all what she did in school or anything like that.
[1092] I don't even know what classes she took.
[1093] This is great.
[1094] Yeah, there was nothing.
You know, it was just, she was on autopilot. There was no question about, you know, how she was going to do.
[1097] You know, it was a little shocking for us when she said, I was going to major in drama.
[1098] I said, drama.
[1099] That was not a great night.
Yeah, 0.1% employment rate.
[1101] Oh, zero, probably.
Yeah, yeah, 0.01.
[1103] Yeah, because while you never told us we need to be doctors or this or that, you definitely told us to get a secure job.
[1104] Yes.
[1105] And so I did the opposite of that.
[1106] Then, in our own minds, we decided, well, you know, if you can't do anything, you can always go to law school, you know.
[1107] Right.
That was in our, you know, we had it in our own minds, well, you know, she's going to do this for a while, and then finally she can go to law school.
[1109] Yeah, she'll get this out of her system and then be a trial lawyer.
[1110] You never fall into any of my imagination of it.
[1111] So I'm sure this won't work out either.
[1112] But when I have this fantasy of I raised my children in a much different culture, I just wonder what it's like for you to have had this daughter who is a cheerleader in America.
[1113] Like watching that where you're just like, oh, wow, I mean, she is an American.
[1114] Look at this.
[1115] She's a cheerleader.
[1116] Well, in my case, it has to be, you know, if I have to be honest with you, my wife grew up here.
[1117] She was six years old when she came.
[1118] So I've already seen that part very closely.
[1119] It's not like, you know, if I had traditionally married somebody from India who grew up there, it would have been a little different.
[1120] In this case, it's half, you know, almost three quarters.
[1121] She, you know, her parents were Indian, so she had a lot of that, but still, you know, she grew up here, went to school, college, and everything here.
[1122] Right.
[1123] She even got a southern accent, which Monica didn't know.
[1124] Yeah, right.
[1125] So it's a southern accent, you know, Savannah.
So she wasn't, like, a cheerleader or any, like, all of, I really leaned in hard to very American life.
[1128] Yes, she was, all I was trying to say is my thing was not direct from me to her.
[1129] It was there was in between.
[1130] She was not like Monica, but it was not like a traditional Indian who grew up in India.
[1131] So for my transition from going there, there was an in between.
[1132] And then, you know, when she said she was going to do a cheer lady.
[1133] I didn't know what it was.
[1134] Sure.
[1135] Even if you know what it is, it doesn't make a ton of sense, but yeah.
[1136] Yeah, it was going down there, you know.
Then she was told you have to have gymnastics and all this, and she was going to go try out.
[1138] So I said, she's not going to make it.
[1139] So that's okay.
[1140] She can go try.
[1141] No confidence.
[1142] Yeah.
[1143] She can try out for that.
[1144] She'll be a lawyer soon as well.
[1145] Because, you know, they said you have to have a lot of gymnastics.
She never had any gymnastics or anything like that.
[1148] Yes, I did.
[1149] My little, not, you couldn't, you know.
[1150] They were, I remember Jane all that, heavy duty.
[1151] Well, yeah, I wasn't like that, but I had some skills.
[1152] Is he familiar?
If I referenced the picture of you in the white dress, does he know which one that is?
[1154] No. There's a picture of me. Yeah, remember, we were looking at it earlier.
[1155] I'm in a white dress and I look four, but I'm actually two, and I'm touching some grass.
You do not look four.
[1157] You look 18 months.
[1158] I look four.
[1159] Even my mom agreed.
She said, Indian babies, they look old.
[1162] They look mature.
And their hair and facial features, she says they look old, and then at some point it balances out.
[1164] Oh, well, I don't know if I agree with this assessment, but that's fine.
This is your, y'all's story, not mine.
[1166] This picture of her has provided so much happiness in our circle.
[1167] When people are sad, we send them the picture of Monica in that white dress.
My best friend, Aaron Weakley, had COVID.
[1169] He was so depressed.
[1170] He was miserable.
[1171] Monica sent him that picture, and he was immediately in a good mood.
[1172] And I just always think how in love with that little girl were you?
[1173] Oh, wow.
[1174] I mean, she was the cutest.
[1175] Also, you know, first being the girl, you know, we were pretty close at that.
[1176] You know, I have a lot of pictures, you know, hanging around.
[1177] We were going to all this place.
[1178] Yes, there's no doubt about it.
[1179] I don't know if she tells you.
[1180] 16, that's a different story.
[1181] Sure.
Well, that's the bratty years I'm talking about.
What was it like when you flew to Chicago and you were sitting in this audience with 5,000 people and then that little girl in the white dress came out on stage and the place went insane?
[1185] Well, obviously very proud.
[1186] I mean, you know, feeling I can't believe this.
[1187] Who could believe it?
[1188] We can't believe it.
[1189] Of course that, yeah.
[1190] We can't believe it.
[1191] Yeah.
[1192] And even after the show, people saying, you know, we're coming after that, waiting after the show and, you know, telling us, we love Monica and all this.
[1193] I mean, people you don't even know.
[1194] I mean, you know, obviously it was, we kind of not expected that because we know we're going to the show.
[1195] But, you know, you're sitting there.
[1196] Oh, boy, this is unbelievable.
[1197] No other way of explaining.
[1198] We've done a lot of live shows, maybe 20 or 25.
[1199] And that is by, that'll be the best show of all times.
[1200] to see you guys afterwards was so fun and wonderful.
[1201] Really enjoyed it.
[1202] I mean, it was kind of, you know, it took a while to sink in, you know, the feeling.
[1203] Yeah.
[1204] It's hard to comprehend.
[1205] It is for us.
Well, you know, in your case, you know, it's nothing new for you, these things.
[1207] I mean, you're a celebrity.
[1208] You've done movies and all kinds of stuff.
[1209] So it's not the same.
[1210] It's still silly.
[1211] It's like, this doesn't happen to people.
[1212] There's always that feeling, well, this doesn't happen to people.
[1213] Especially, you know, for us with Monica, I think I recall, she said she was going to do with you, I'm going to do a podcast.
[1214] I didn't know, I didn't even know what a podcast was.
[1215] Of course not, yeah.
[1216] How about law school?
How about you and Dax open a law firm instead?
[1218] Because, you know, podcasts were not that known.
[1219] Right.
No. We're the luckiest two human beings in America, for sure.
[1222] So that brings us to the simulation.
[1223] So, you know, this story is, I find it impossible from your perspective.
[1224] The fact that you didn't even arrive here until 25 and then you had this wonderful career, which still continues as an engineer, and then you have this little cheerleader.
[1225] She becomes rich and famous.
[1226] That's all impossible.
[1227] This doesn't happen.
[1228] So obviously, it's your simulation, and I'm, unfortunately, I'm not even real.
[1229] I feel real, but I'm not.
I'm just a computer algorithm to facilitate your fantasy.
[1232] You too.
[1233] I know.
[1234] You're the only real person here.
[1235] You're the only real person on planet Earth.
[1236] But isn't he also kind of a character?
Like, this version of him is a character, because his real self is podded up.
[1238] Right.
[1239] And we don't know where that is.
[1240] Like we don't even know if this geopolitical stuff even exists where you're hooked up to the computer.
[1241] I don't know.
I told Monica the other day that you might be, like, a six-foot-eight basketball player.
Yeah, we don't even know what you look like in real life.
Yeah.
Okay, have you ever listened to an episode of the podcast?
Yes, not very many, maybe, yeah, four or five.
I prefer you to keep that number low.
Yeah, yeah, yeah.
You know, do you remember, one of the episodes, who she was the guest of, I guess, or the host of?
Oh, right, right, right.
Was that, and then Hillary, we have not watched, she over here, no.
Might have just been that one, Monica.
[1244] The worst one for you to listen to from.
[1245] No, no, no, I think what did, Hassan was there?
[1246] Oh, yeah, Hassan, yeah.
Hasan Minhaj.
[1248] Oh, okay.
He only listens to the Indian ones.
[1250] Yeah, and Kumail Nanjiani and...
[1251] Was he on?
[1252] Sanja, yeah, Kumail.
We've had a disproportionate amount of Indians on, actually.
[1254] If you add it up, we've had way more Indians than are represented in the population.
[1255] Or Desay, Desay.
[1256] Desi.
[1257] Desi.
[1258] Desi.
[1259] Desi.
[1260] Desi.
[1261] Desi.
Yeah, Desi, it simply means native or Indian.
[1263] I got to get better at that.
[1264] I sometimes want to correct him, but I'm glad you did it.
[1265] Well, also, I don't think you really knew how to say it either.
[1266] I don't.
[1267] Yeah, she does not know any.
[1268] I don't know.
I don't think Monica or me know a single word of an Indian language.
[1270] No. Not a single word.
[1271] Oh, who is mad at you about your last name, Monica?
[1272] Well, Pari is always mad at me about it.
[1273] Yeah, Pari.
He's in our neighborhood, and he really doesn't like that my last name's Padman.
[1276] Tell your father he's Indian, so this is why he has such a strong opinion.
[1277] He says Padman is not a name in India, and he's very frustrated.
[1278] It should have been changed to.
[1279] What do you want it to be changed to?
[1280] He says, if you're going to change it, you should have made it Padma.
[1281] No, Padma is a female name.
[1282] I know.
[1283] Well, look, he has a lot of opinions.
[1284] It's a short name.
[1285] It's not, I did not do it.
[1286] My father did it.
[1287] What was the original name?
[1288] The original name is hard.
It's Padmanabhan.
[1290] Ooh, I like that.
Padmanabhan is the original name.
[1292] And my father, he was in Borneo, Singapore, that area.
[1293] Then he was dealing with a lot of the British people.
And then he changed his name from Padmanabhan to Padman.
[1295] Padman is what's called in India, you know, Padman here, Padman.
[1296] But, Monica, you probably would not want me to pronounce that correctly.
[1297] Because you would think that sounded like an Indian accent.
[1298] You could try it.
[1299] If I said, Monica Podman?
[1300] Podman, yeah.
[1301] You got it, Putman.
[1302] See, you don't like it, right?
[1303] Look at the look on her face.
[1304] I'm not crazy about it.
[1305] She's triggered.
[1306] Podman.
[1307] Okay, my last question.
[1308] This is my last one for real.
[1309] Can I go with you to India someday?
[1310] Yes.
[1311] Yes.
[1312] You would take me, especially if your daughter paid for all of us.
[1313] What do you want the paying part?
[1314] Yeah, you know.
[1315] We'll go.
I want to go for real. I want to go to Kerala really bad.
[1318] Yes.
[1319] It's a beautiful place.
[1320] It's a beautiful place.
[1321] It's a unique state in India.
[1322] It's got the highest literacy anywhere.
[1323] The literacy rate is almost 100%.
[1324] Oh, my God.
[1325] No wonder I'm so good at reading.
[1326] Wow.
[1327] Wow.
[1328] Especially some parts of Kerala, the females had power from a long time ago.
[1329] That was originally.
[1330] The system itself is designed that way.
[1331] Oh, wow.
It's a matriarchal society.
[1333] Oh, that's so cool.
[1334] So if you go back to the history of Kerala, even many, many, many years ago, you see a lot of high ministers, chief justices and all that female from a very, very long time in that state.
[1335] Women have always held a lot of power in that state.
[1336] Dad, I have one more question.
Did you think grandma was hot when you first met, when you first met Mom?
[1338] I came from an Indian background.
[1339] You don't think that.
There's your mother-in-law, come on.
[1341] I know.
That wouldn't stop me.
I've lived here so many years, but coming here, some of my Indian thought process doesn't go away. So you don't even think that.
[1344] Do you think in India they are less sexual?
No. Come on, they're home to 1.3 billion people or whatever.
[1347] The numbers don't lie.
[1348] The numbers don't lie, Monica.
[1349] Well, you're saying you didn't think like that.
[1350] Well, maybe more respect thing.
[1351] Yeah, relatives and all that.
You don't think of them as sexual objects or anything like that.
[1353] I don't think that's Indian.
[1354] I think that's most people.
[1355] They probably don't sexualize their relatives, but Dax is an anomaly, so we had to check.
[1356] No, I think that's pretty common from my area of Michigan.
[1357] I'm going to claim cultural influences for that.
[1358] Well, this has been awesome.
[1359] I'm so happy that you let us talk to you.
[1360] Oh, it was a pleasure in my debut to this show.
[1361] Yeah, you'll have to listen to this one.
[1362] This will be the second one you listen to.
[1363] All right.
Thanks, Dax.
[1365] I appreciate it.
[1366] Great seeing you.
[1367] I can't wait to see you again.
[1368] Take care.
[1369] Okay.
[1370] I'm going to stay, Dad, because I have to finish.
[1371] Don't fall down.
[1372] I can hear my mom in the other room questioning.
[1373] Oh, really?
[1374] Interrogating.
[1375] Yeah, she's jealous.
[1376] Oh, well, that's in her...
[1377] Not now.
[1378] Next time.
[1379] Next time.
[1380] Uh -oh.
[1381] Did she think so for real?
[1382] Yeah.
[1383] I think you caused a big fight.
[1384] Oh.
[1385] I hope not.
[1386] That was so fun.
[1387] That's my father.
[1388] I love him.
[1389] Yeah.
[1390] I really respect him.
Were you sad he didn't answer it truthfully about my grandma?
[1393] There's no fucking way he didn't think she was hot.
[1394] It's not, I don't care what, like, cultural parameters he's claiming.
[1395] I mean, eyes are eyes, faces are faces.
[1396] Sorry, Susan.
[1397] My dad really commandeered your fact check.
[1398] Yeah.
[1399] Maybe even the whole week is probably what happened.
[1400] People are going to be so excited about that.
[1401] He didn't really give us any answers about the simulation.
I really think we're going to have to talk to David Farrier about this.
[1403] I mean, I think this is part of it.
[1404] I think if it's you, you don't even know about simulations.
[1405] You don't even see the Matrix.
[1406] Like, you don't know.
[1407] And if you find out, you implode.
[1408] I don't know, everyone in your life dies.
[1409] Well, you didn't watch Westworld, did you?
[1410] No. Okay, so there's this fascinating thing about Westworld because the characters inside of it are AI robots and they're acting out basically these people who come to this tourism destination to live out fantasies and it's an old West town.
[1411] And everyone's AI.
[1412] And when they show like the AI a photo, because the AI is, they think they live in the Old West.
[1413] But if they show the AI like a photo of someone getting on an airplane and they'll say, what do you, what do you see here?
[1414] And they'll go, well, I don't really see anything there.
[1415] Like they've, they somehow have figured out how to prevent them from even being able to see a photo of present day.
[1416] So do you think that's what happened to my dad seeing my grandma?
[1417] Well, no, I think that's what happened when I was, the whole time I was talking to your dad about the simulation.
[1418] He's like, I don't know exactly what you're talking about.
[1419] They've put a block around him even really entertaining it.
[1420] It maybe is just like white noise, everything I was saying.
He probably just barely got through that.
[1422] He's probably like, I hope this is over soon.
[1423] He wasn't paying attention.
[1424] It's really funny because some of our first arguments were, I think you should tell your dad you love him more.
You're like, that's not really where he's at.
[1426] That's not what he's in the market for.
[1427] And I was like, that can't be humans are this way.
And then as I've come to know him, you know, I'm dead wrong. And yeah, I want him to tell me that he can't believe this little girl is doing that, the little girl from the white dress.
[1430] But it's not how he communicates love.
[1431] He does it in so many ways.
He's so available to you and endlessly taking care of you and trying to help you. Anything you would ever ask him to do, he would handle for you.
But he doesn't want to take that sentimental walk with me. Mm-mm.
[1434] No. No. He doesn't.
And it's not because of trauma.
[1436] It's like, you know, it's not something to get over.
[1437] I think that's the thing that's hard.
And this is really cultural relativism, I think, a lot of it is.
[1439] Yeah.
[1440] There probably are plenty of Indians who are very emotional and whatever.
[1441] Expressive.
[1442] Yeah, expressive.
[1443] But it's not like, oh, my dad, like, isn't emotional and he's got to get in touch with his feelings more.
[1444] Like, it's not causing a problem.
[1445] It's not a pathology at all, yeah.
Yeah, I think some Westerners specifically have this idea of love, what love looks like, that is not universal and is not better. It's not worse, but it's not better. It just is different.
I like that. I know it can look different, and, like, really know it. I have no question. I'm never like, God, I wonder if my parents love me. I mean, it's just so obvious they do. Everything they've ever done has proven that. But yeah, it's not in, um, an emotional way.
[1448] Even like, you know, I was thinking, so we were, I watched Contagion for the last time.
[1449] Well, I don't want to commit to that.
[1450] But for the last time during the pandemic, I really bookended it.
[1451] I don't know.
[1452] I don't know.
[1453] You might have one left in you.
We were watching it, me and my mom and my dad.
[1455] And there's a lot of seizures that happened in that movie.
[1456] And you see them and they're pretty gruesome.
[1457] Like they look really bad.
[1458] And you see all the foam and it's very scary.
[1459] And I could feel like kind of a shift in energy in the room when that happened.
[1460] And I was like, oh, they're definitely like in their head about it with me. And I thought like, oh, should I say something?
[1461] Should we address it?
[1462] Should I make them feel better?
[1463] Should I?
[1464] And then I was like, no, we don't do that.
[1465] We just don't do it.
[1466] And it's okay.
[1467] Yeah.
[1468] I think trying to put this square peg in a round hole sometimes, it's just not necessary.
[1469] I agree.
[1470] Yeah, yeah, yeah.
[1471] One upside of it, I would argue, is there are a lot of parents that are at great ease saying, I love you, but then their actions don't necessarily prove that.
[1472] Yeah.
[1473] And I guess if I had to pick, I think I'd want the actions to prove it.
[1474] Oh, yeah.
[1475] The lip service of it all.
It can be a Band-Aid of sorts in absence of true dedication and sacrifice for someone.
[1477] Well, yeah, even in my circle, like Ken Kennedy, he will say I love you.
[1478] But maybe not as overtly or as frequently as, say, Ryan or Eric would.
But if anyone in my friendship group has proven that they love me time and time again, it's Kenny.
[1480] When my dad was dying, I was like, I want to take him to the house.
[1481] Now he's in a wheelchair and he wants to go to the house.
I don't know what to do, there are steps. And, I mean, literally four hours later, Larry calls me, he's like, there's a ramp.
[1484] I built a ramp.
[1485] I know.
[1486] Sweetest boy.
[1487] I know.
[1488] Exactly.
[1489] Like acts of service.
[1490] I didn't even say.
[1491] He didn't even, like, say, oh, I need a ramp, or would you help me?
[1492] Nothing.
He just heard that that was my problem.
[1495] And he dropped everything and solved it.
[1496] Yeah.
[1497] I love you, Ken Kennedy.
[1498] I love you, Ken Kennedy, too.
[1499] And I'm not afraid to say it.
[1500] So I guess we should talk about Susan.
[1501] It's a ding, ding, ding.
[1502] Oh, dingles?
Because one of the facts is a quote from Chidi, and we just interviewed William Jackson Harper, a.k.a. Bill, a.k.a. Chidi.
[1504] Minutes ago.
[1505] Okay.
So she says a quote, a Chidi quote, Susan does, something like, I can turn any situation or any place I encounter into a living hell.
[1507] I'm actually having a hard time finding this quote, but there are so many good quotes from the good place.
And one that I thought was relevant was when Michael tells Chidi, if soulmates do exist, they're not found.
[1509] They're made.
[1510] Ah, that's nice.
[1511] It is nice.
[1512] That's my desert island example that you hate.
[1513] Yeah.
[1514] Well, you like it in this context.
[1515] Well, no. I like it because Mike wrote it.
[1516] I'm actually proving my initial theory that anything I say, if it came out of Mike's mouth, it would be really good.
[1517] No. Mike's not saying, that quote isn't saying you can find your soulmate in any person.
That's what your theory is.
[1519] It's saying it's made.
[1520] Yeah.
[1521] But it's not saying every person could be your soulmate.
[1522] That's what you think.
[1523] I mean, that's what you've said.
[1524] All that was said was it's made.
[1525] Like that you're extrapolating now that he didn't say X, Y, or Z. What he said was it's made.
[1526] Well, I don't think I can extrapolate that he didn't say because he didn't say.
[1527] That's true.
[1528] That's true.
[1529] It's not an extrapolation.
[1530] But in doing that, you're kind of doing a null hypothesis where you're suggesting that he doesn't believe that.
[1531] But we don't know that because that's not in the sentence.
[1532] Correct.
[1533] Okay.
[1534] I don't know what he believes, but I know what you believe, which is that.
[1535] Catch him in.
No, I don't think soulmate status can be found on an island with anybody, but I think you could love deeply any human you put in the time to love.
[1538] Yeah.
[1539] I mean, I think that's a beautiful theory.
I don't think it's boo-hicky.
[1541] Okay.
You're not poo-pooing it.
It's not boo-hicky and it's not a poo-poo.
[1544] It's something to strive for, definitely.
[1545] Anyway, so I'm really sorry.
[1546] I can't find that quote.
[1547] Oh, ding, ding.
[1548] Somebody will find it in the comments.
[1549] Ding, motherfucking ding.
We just interviewed your father, who comes from a country with a long history of arranged marriages.
[1552] And those things often work out at a higher percentage than when you go pick.
[1553] Yeah, that's true.
[1554] The arranged marriage is being locked on an island with one person.
[1555] And you have no option.
And they rarely divorce.
I'm sure now in India it's the same, as rampant as everywhere.
[1558] But it really, really, really was not.
[1559] Right, right.
[1560] Even with my parents.
[1561] Like, it was not an option.
The really bad stuff could be happening, and that would not even come up on the table.
[1564] Yeah.
[1565] Until probably recently, you know, the mentality of, oh, there's just no option.
[1566] I have to make it work.
I have to find something I love in this person, is what arranged marriage is.
[1568] I feel like an arranged marriage can almost only get better and a love marriage can almost only get worse.
[1569] Yeah, exactly.
[1570] And in an arranged marriage, I think you're like, okay, well, here's a person.
[1571] And then you start learning about them.
[1572] Then you start finding things you like.
[1573] And, yeah, I do think it's kind of the opposite in a love marriage.
[1574] It's like, I love this person so much.
[1575] And then you start finding out more things.
[1576] Like, oh, actually.
[1577] And the chemicals wear off.
[1578] Who has it right?
[1579] I wonder what the middle road is.
[1580] What's the Sweden Road?
[1581] It's all right.
[1582] Nothing's wrong.
[1583] It's just different ways of doing it.
[1584] This is so unromantic.
[1585] But I am a little bit of a proponent of not leading with the passion, per se.
[1586] The white hot flame.
[1587] I think that's a little more addicty.
than, say, picking someone who you admire or think has integrity, that that be the leading thing.
[1589] Well, because it's lasting.
[1590] Those are, like, character -driven and not chemically driven.
[1591] Yeah.
[1592] It's not as exciting, but I do think it's probably a more sustainable approach.
[1593] Yeah.
I just really, this episode is really fun because it's about ethics, and I feel like we could argue this, and do, for hours and hours and hours.
[1596] So I liked that we got to talk to her about it.
[1597] It is the most fertile ground for argument because there isn't an answer.
[1598] There's not an objective measuring of anything.
[1599] I want to try.
[1600] I'm tempted to dance because it's so set up right now.
[1601] Do it.
So I love this song by Anderson .Paak called Suede.
[1603] He uses bitch a lot in the song.
Now, if I call you a bitch, yeah, it's because you're my bitch. And as long as no one else calls you a bitch, then there won't be no problems.
[1606] And so I sent you this song, and I just can't stop listening to it.
[1607] And even when I sent it, I said, well, the messaging's a little problematic, but maybe you can enjoy this song.
[1608] And you completely understandably were like, yeah, the lyrics are a little rough.
[1609] And I'd be worried about a young dude listening to this and thinking that's how you should be talking about women.
[1610] Really solid argument.
What has occurred to me a little bit, or the way I've been trying to think of formulating my thoughts on it to you, is, when black people first took the N-word back and they started saying it, the biggest chorus against them saying it was white people.
[1612] White people were the ones saying, you shouldn't say that it's a really derogatory thing, it's bad for your people.
They were the ones that were on this kind of moral high ground about that they shouldn't say the N-word.
[1614] And I like that they took that word, and it's their word, and they made it empowering.
And what I think is that white people never should have had a fucking opinion about that, nor a say, nor a moral high ground.
And I think culturally, the world that Anderson .Paak comes from is so different than the one you and I come from.
And the women in that culture are so different from the women in the culture you grew up in, that, not unlike the N-word, it is very conceivable that the women in his life, and the women in his culture, and the women in his community, think of that word as they think of the N-word.
[1619] And that they're cool with this culture, and it's you guys that are not.
[1620] Like, we're fine with...
[1621] But I'm a woman.
[1622] But you're not a black woman.
[1623] You're not a black woman that grew up in the ghetto.
[1624] But the analogy is flawed.
[1625] If it was a black female artist singing.
[1626] and they were saying it, then, sure.
[1627] She's taking it back.
[1628] But that's something.
[1629] We're talking about this song, which is a man singing.
[1630] I know, but if you listen to black female artists, they use it a ton too.
So I think culturally, Anderson .Paak calls all black dudes the N-word.
[1632] And he calls all black women the B word.
[1633] And they both have taken this word back.
[1634] And they're using it how they want.
[1635] And they're comfortable with it.
And I think when people get really critical of it, like, oh, black music is so misogynistic.
[1638] It's like, well, why don't you ask a black woman first if that's the case or the people that buy that music if that's the case?
[1639] I just think there's potentially a little bit of a similar dynamic going on where it's like we're claiming a moral high ground about a topic, misogyny, in a culture that we're not from.
[1640] What if they surveyed every single black woman in America?
[1641] And then the survey came out that 100 % of them love that song and don't feel like it disparages them.
[1642] If that were the result of that, would you honor it?
[1643] Is it to be consumed only by black people?
[1644] Because I don't like it.
[1645] But he's not singing about you.
He's not, and he's not, when he talks about all these other N-words, he's not singing about me. He's singing about his black friends and black dudes.
He's not singing about me, it's not about me. None of his music's about me. And those songs are not about you either.
But bitch is not specific to that culture.
[1650] It is a general word about women.
[1651] It's not specific.
They may use it, just like white people use it, just like everyone uses it.
[1653] They might use it differently in their culture.
[1654] Like, for you to get called a bitch means a very specific thing from your culture, from Duluth, Georgia.
[1655] If you get called a bitch, it's very derogatory.
[1656] It is very misogynistic.
[1657] And that is a fact.
And if you polled every white woman in America and said, would you like to be called a bitch in this song and only referred to as a bitch, and they all said, no.
Well, yeah, I would know that in their culture, that's not their thing.
But in a world where if you could, if you could poll every single black female and they all said, we don't give a shit about that, that's your thing, would you at that point be okay with it?
[1661] I guess if that were really the case, I would have to be okay with it.
And it's so blanket to say "in that culture."
I mean, there's black people everywhere, you know, and it's not like...
Like, hip-hop culture, because I'll make it more specific, where the dudes only call each other the N-word.
Of course, lawyers at law firms that are black aren't calling the other lawyers the N-word every time they talk about them.
But in the hip-hop culture, they're saying, my N-word this, my N-word that, that's my N-word.
[1667] That is the culture.
I can't speak on the N-word because I am not black, but I do feel that I can speak on the word bitch because I'm a woman, and that is a word that is widely used to disparage women.
I see what you're saying. It does extend to you, it is used against you as a woman. But I'm saying, if it's not about you or any of the people outside of that hip-hop culture, you would then be choosing to think it's about you, wouldn't you?
Well, that's what everyone does when they listen to music.
I'm saying there can be a subgroup. So yes, you're a woman, and the word bitch is applied to all women. But also within that, there are black women, and black women have a different culture than women per se at large, or there are variations.
[1670] Again, I don't think that's fair to black people to say black women have a very specific culture.
[1671] No, black women who live in this area have a specific culture, black women who live in this.
Like, Joy probably doesn't have the same opinion as every black person.
[1673] I'm not saying that.
[1674] You know I'm not saying that.
[1675] I am not saying that.
I'm proposing the question: if the community, the culture, and the people who make the music, consume the music, and enjoy the music are all on board with it, and there's no victim, and no one feels disempowered by it, are we allowed, from the outside... What we can agree upon is we are definitely not in any one of those communities or cultures. We're not black, and we're not in the hip-hop world, and we're not in the inner-city world. We're not in any of those. So is it our position to be able to judge his music, that's for him and his community and his fans, with the standard of our thing?
[1677] That's my, that's kind of what I've been mulling over.
[1678] Yeah, it's a, it's a really good question.
[1679] I think it's, I think it's hard to say.
I mean, I brought this up to you when you first were texting me about this.
[1681] To me, this is in some ways similar to Afghanistan and the way women are treated there.
[1682] Is it our place to say that's a problem if the women there are going along with it and they're fine with it and it's part of the culture?
[1683] We do condemn that behavior.
To me, it's similar.
[1685] It's from the outside saying that's a problem.
[1686] Yep.
[1687] There's two distinctions to be made there.
[1688] One is there is a very, very loud and vocal cadre of women from those countries that say, I want to be able to drive.
[1689] I want to be able to go to school.
[1690] I want, like, vocally saying, I am the victim of this.
[1691] Yeah.
[1692] And I think a lot of black women would say the same thing about a lot of those songs.
[1693] My hunch is they don't listen to that if that's not their thing.
[1694] But they would, they have an opinion that it's probably not helping.
[1695] I'm sure that's a, yeah, I'm sure that's an opinion.
[1696] But there are, and a lot of liberals have fallen on this.
This is a big Sam Harris issue: a lot of liberals will defend the hijab.
[1698] Because a lot of the women, some percentage of the women do like the hijab.
Or they do like how they're...
[1700] Exactly.
And so it's very dicey for us to say, well, because we don't want our face covered, that's a universal human right.
[1703] Exactly.
[1704] That's my point.
[1705] I think you stand on the side with Sam.
You stand, I mean, you've said it before, maybe you've changed your mind, but that is something we need to take issue with: the treatment of women in those places.
[1707] But yeah, many of them are totally fine and great with it.
[1708] Well, I think both exist.
[1709] I'm not on either side of that.
Where they're saying, this is restrictive and I have no options and no life, then those people need to be freed from that.
[1711] But the people that are saying, I absolutely fucking love this.
[1712] I don't think those people should be forced to not live that way if they are enjoying it.
[1713] So I'm of both minds.
[1714] It's very much, I think, in keeping with just having majored in anthropology.
[1715] But there are lines to be drawn.
[1716] Female circumcision?
[1717] No, because that's a child.
And the child can't even tell you if they're going to grow up grateful that they don't have a clitoris anymore.
[1720] Yeah, but we do it to males.
[1721] And I think it's preposterous.
[1722] I've been very vocal about that.
[1723] It's a biblical answer to not having soap.
[1724] It's ridiculous.
[1725] Yeah, I mean, I personally agree, but it's also cultural.
[1726] People, when they think of female circumcision, they get really hot and bothered, they get very upset.
[1727] But if you said, well, is your son circumcised?
[1728] Most of them would say yes.
[1729] Right.
[1730] Right, and this is where intention's incredibly important.
So the full intention of male circumcision has always been cleanliness, so that you don't get an infection in your dick and die of it.
[1732] The full intention of female circumcision is so women will never experience pleasure during sex so they won't cheat on their husbands.
[1733] So one has a very, very blatant objective of controlling women.
[1734] And the other one has the objective of preventing infection.
[1735] But it's still genital mutilation to a baby who doesn't get to decide.
[1736] Ultimately.
[1737] It is crazy.
I'm not for circumcision, but I do think we can acknowledge there's a huge difference in the motivation for female circumcision and male circumcision.
[1739] I don't really know how we got on this.
[1740] I'm just, but here's the other thing with the song.
[1741] And I don't know.
Like, I'm not saying people shouldn't listen to it. They should do whatever they want.
[1743] But like you're listening to it.
[1744] Yeah, I love it.
[1745] And you love it.
[1746] So it's not just consumed by the community that you're talking about that it was made for.
[1747] It's consumed by everyone.
[1748] Well, anyone who likes it.
[1749] Sure.
[1750] Anyone who likes it.
[1751] But some people who like it aren't in that community.
[1752] Mm -hmm.
[1753] So if you're singing it and I'm sitting there, I am not someone who likes that.
[1754] Agreed.
[1755] But that's an issue between you and I. Like, you tell me, hey, I don't like hearing you say bitch every other word.
[1756] And I go, oh, my God, absolutely.
[1757] I'm sorry.
I'm more questioning whether we are entitled to lay a judgment on Anderson .Paak for the song.
[1759] I'm not judging anyone.
[1760] I just, I don't think it's helpful to women to have more songs where they're being referred to as a bitch or someone's property.
I don't either, unless those women have taken bitch back in a way that black people have taken the N-word back.
If that has happened for them, and that's an empowering thing and they like their girlfriends to call them bitch, and their boyfriend, then I'm super into it.
[1764] I don't think the word is objectively good or bad.
[1765] I think it's all about the intention and context and all that stuff.
[1766] Yeah, but I guess that circles back to what I was saying at the beginning.
[1767] If it was a female artist, black artist saying it, I could buy into that more.
[1768] I could definitely be like, well, yeah, they took it back.
[1769] Don't care.
[1770] They're making it theirs.
[1771] That's great.
[1772] But it's a man. Like, it's still a man. I know.
But black female artists also refer to men in that hip-hop community as the N-word every time, because that's what they want to be called.
[1774] And so the girls are doing it to the guys.
[1775] So if, in fact, the girls like being called bitch, I don't, I'm not, I don't actually have a position.
I've not interviewed enough people or read a poll.
But if, in fact, the women want to be called that and the black dudes want to be called the N-word, I think that's cool for them.
[1778] Yes.
[1779] If everyone's feeling good about it.
[1780] I don't think they are.
But if they are, great.
[1782] Well, that was a fun dance.
[1783] I liked it.
Do your feet hurt?
[1785] No, I can dance for hours.
[1786] Get your dad back down here.
[1787] I want to talk to him about it.
[1788] All right.
[1789] Well, I love you.
[1790] I can't wait for you to get home tomorrow.
[1791] Tomorrow.
[1792] Tomorrow, tomorrow.
[1793] Bye.
[1794] I love you, bye -bye.
[1795] Follow Armchair Expert on the Wondry app, Amazon Music, or wherever you get your podcasts.
[1796] You can listen to every episode of Armchair Expert early and ad free right now by joining Wondry Plus in the Wondry app or on Apple Podcasts.
Before you go, tell us about yourself by completing a short survey at Wondry.com slash survey.