Armchair Expert with Dax Shepard
[0] Welcome, welcome, welcome to armchair expert experts on expert.
[1] I'm Dax Randall Shepard.
[2] I'm joined by Monica Lily Padman.
[3] Hi there.
[4] Hello there.
[5] I wish we knew Nita's middle name.
[6] I'm going to guess.
[7] Well, first I'll just say our guest is Nita Farahany.
[8] She is a leading scholar on the ethical, legal, and social implications of emerging technology.
[9] She is a professor of law and philosophy at Duke University.
[10] So what do we think for Nita?
[11] Melanie.
[12] Melanie Farahany.
[13] I like it.
[14] It has a good ring.
[15] It really sounds right.
[16] Good job.
[17] I'm glad you picked it.
[18] This is a really mind-bending episode.
[19] It's about the future of the technology that's really already here to some degree and going to be exponentially getting better and better.
[20] And in a nutshell, machines are very close to being able to read your mind.
[21] Yeah, and she does an interesting job of laying out the pessimistic side and the optimistic side.
[22] She's pretty optimistic, as you'll hear, a little bit more than me and you were.
[23] Yeah, she's kind of, like, neutral.
[24] She's warning us and she's excited.
[25] Yeah.
[26] As am I. She has a book about this topic out right now called The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology.
[27] This conversation spans all kinds of wonderful things because she's also a law professor so we get into all kinds of good legal stuff.
[28] Yeah, there's some philosophy stuff that's interesting.
[29] She was radical and I loved it.
[30] Please enjoy Nita Melanie Farahany.
[31] You can listen to Armchair Expert early and ad-free right now.
[32] Join Wondery Plus in the Wondery app or on Apple Podcasts.
[33] Or you can listen for free wherever you get your podcasts.
[34] He's an armchair expert.
[35] It's been so cold and rainy here.
[36] It's not sunny, but it's better.
[37] It was rainy when I got in last night, but this morning I was supposed to have a meeting with somebody who happily canceled on me. And so I sat poolside and had breakfast and like the glory.
[38] sun with a cup of tea.
[39] Where's the man?
[40] I was saying it that Kara.
[41] Oh, right here.
[42] Right there.
[43] Oh, that little outdoor area is wonderful, isn't it?
[44] It's so fantastic.
[45] It feels like Europe in there, doesn't it?
[46] It feels completely like you're...
[47] I was like, I'm in a little Italian.
[48] It was lovely, ate breakfast on that little courtyard.
[49] And I was like, I'm so in L.A. because I asked the guy, like, well, what do you recommend?
[50] And he was like, well, obviously the chia pudding.
[51] I was like, I don't know anywhere in Durham that I could get chia pudding.
[52] So I was like, all right.
[53] And I was like, is that filling?
[54] And he was like, well, for me it is.
[55] So I'm like, I don't know how many calories a day you're having.
[56] So, yes, it's a great feeling.
[57] Yeah, I was like, okay then.
[58] Yeah, that would be great.
[59] Did you try it?
[60] I did.
[61] It was great.
[62] I mean, I like chia pudding.
[63] Chia pudding is wonderful.
[64] It's really wonderful.
[65] I mean, I've gotten out of the practice of making it, but I have made it for myself off and on, because it's so easy to put it together overnight.
[66] Yes.
[67] This is going to sound so bougie, but the chia pudding at the Four Seasons in Hawaii
[68] is one of the best foods I've ever had.
[69] The Four Seasons in Honolulu?
[70] Yes.
[71] So that's fantastic.
[72] I have to go to a meeting there this summer and we're staying there for a week with the family.
[73] Every day.
[74] Every day.
[75] So just to, like, one-up you on bougie, I'm gonna...
[76] You're gonna die.
[77] I'm gonna love it.
[78] It's a week-long law conference.
[79] But I was like, it's in Hawaii.
[80] Obviously the entire family is gonna go.
[81] And there's a little restaurant downstairs.
[82] I took a bunch of pictures last time I was there.
[83] When they've got the torches going.
[84] It's so tropical, it's outdoors.
[85] Right on the beach.
[86] It's incredible.
[87] I'm looking forward to it.
[88] I'm super excited.
[89] I'm going to come to this law conference.
[90] I think you should.
[91] Do they need a comedian to speak?
[92] So the drafting committee I'm on is about updating the definition of death.
[93] So yes.
[94] Updating the definition of death.
[95] Yes.
[96] Okay, really quick, before you give us the real answer, can we hypothesize on what possible wiggle room could exist within the definition of death?
[97] But it's for legal purposes.
[98] It's about brain death.
[99] Yeah, yeah, yeah, yeah, uh-huh, uh-huh.
[100] Yeah, this just happened to actor Tom Sizemore.
[101] So he just died a few days ago.
[102] He did?
[103] Yes, and he had had an aneurysm and was brought to the hospital in Burbank and then remained there for, I don't know, maybe a couple weeks.
[104] You know, he's gone, but he was only 61.
[105] He was declared brain dead?
[106] Yes.
[107] I'm out on a limb saying that.
[108] I read the article.
[109] They didn't use the word brain dead, but it was quite clear that he had no brain activity since being brought in from the aneurysm.
[111] Anne Heche?
[112] There was a big one out here.
[113] Anne Heche?
[114] Yeah.
[115] So she was declared brain dead.
[116] What?
[117] She was.
[118] I mean, now I'm like, did we get the right name?
[119] Isn't it funny?
[120] Like, we'll say.
[121] We got to find the right name here.
[122] If we were at dinner, we would be popping all these off, like, total facts.
[123] She was in a car accident, I think.
[124] I think she drove her car into a house.
[125] I don't think she died on the scene.
[126] Okay.
[127] And I think she was later declared brain dead.
[128] Wait, so she's dead.
[129] Oh, yes, she died.
[130] I don't know about any of this.
[131] You know what this is like, the Academy Awards when they do the In Memoriam?
[132] Yes, it's like you're sitting there.
[133] Like he's dead?
[134] Yes, you were on a media blackout somehow this year and these are the things that happen.
[135] And Heche might have even been the year prior, but regardless.
[136] So time does, it wasn't on TikTok.
[137] It was all over sent to do.
[138] But it was interesting to look at the media commentary about it, because I think she had already been declared brain dead, but they talked about her being on life support, which is a confusing connection.
[140] Right.
[141] So she was being mechanically ventilated, which is not life support.
[142] It's not?
[143] Well, it's not.
[144] You're not alive.
[145] Oh, I got you.
[146] Here comes the definition.
[147] That's the challenging thing: after the advent of mechanical and respiratory support, being able to use mechanical ventilation, somebody who would otherwise die a circulatory and respiratory death, right?
[148] Their heart would stop beating.
[149] They would stop breathing.
[150] Those people now could be mechanically ventilated and so they could have no brain activity.
[151] So you would phrase it as corpse support?
[152] Well, I would say that they are brain dead.
[153] The rest of their physical body and organs are receiving organ support.
[154] So maybe organ support over life support would be the preferred terminology for you?
[155] I wouldn't say it's my preferred terminology.
[156] I just think it's where a lot of the debate is right now over what medicine has made possible and impossible.
[157] Well, it brings up an interesting sci -fi thing.
[158] Several people have been cryogenically frozen in hopes that at some point technology will be able to revive them, right?
[159] I talk about this in my book.
[160] Right.
[161] So Walt Disney is rumored to be somewhere on ice.
[162] And there are a bunch of people who've paid many, many, many, many thousands of dollars so that when they die, they are immediately either their entire body if they can afford it or just their brains are frozen.
[163] Okay.
[164] So why not just hook yourself up to a machine and keep rocking the body?
[165] Well, it's going to naturally deteriorate, though.
[166] First, the body would still age.
[167] Yeah.
[168] And so I think if you could be cryogenically frozen before all of the rest of the body deteriorated.
[169] And then it's not like you can indefinitely be supported without ever getting an infection.
[170] Right.
[171] You're in a hospital.
[172] It's about the most dangerous place to just hang out.
[173] Yeah.
[174] Someone will die like 19 months shy of some crazy thing that could have reversed it.
[175] That's true every day.
[176] People died all the time, for example, before heart transplants.
[177] Somebody died the day before the first successful heart transplant, right?
[178] So I think that's true, but I think if you know that there is technology right around the corner, you're right, that more people may try to find ways.
[179] One's a little far-flung, like, I'm going to be cryogenically frozen, and hopefully in 2180, they know how to bring your brain back online.
[180] There's a lot of leaps in technology before then.
[181] But if you have a tumor, there could be a mechanical process.
[182] There could be something that really would be known, like, God, man, they're like 12 months out from cracking this and being able to deploy it, but you're going to be dead next week.
[184] In that situation, maybe you want to keep the body supported.
[185] So all this is transhumanism's agenda.
[186] In the second-to-last chapter of my book, The Battle for Your Brain, what I do is look at the ways in which neurotechnology might make some of the agenda, like super longevity or brain uploading or brain-to-brain communication, possible, and how we should be thinking about that and who gets to decide that.
[187] And the scenario you just outlined, you know, who's going to afford that?
[188] Altered Carbon.
[189] I really liked that show.
[190] No, is that worth it?
[191] It was really good.
[192] Spoiler alert.
[193] It's a world in which consciousness can be fully uploaded.
[194] And then the really rich people have cloned their bodies as the perfect younger selves that they want.
[195] And then they just have a bunch of those kind of incubated and waiting.
[196] And then when something happens like they're in a car accident or they got shot or whatever, their consciousness just instantly gets transferred.
[197] Yeah, exactly.
[198] You just transfer over to the new body.
[199] And then there's a whole, yeah, and it might be, you know, less prime.
[200] And you're like, uh-oh, it's just, like, got a big bruise here.
[201] Do I want that?
[202] I don't want that.
[203] I, like, don't like that bruise.
[204] I'm going to go for this next body.
[205] Who can afford that, and what kind of segmentation does that create?
[206] And then also there are all the people who think that that's no longer real death because there's only one kind of real true death.
[207] So they're the ones who have rejected all of the technology.
[208] When I write, because science fiction, I think, is oftentimes so good at, you know, being able to tease out these issues.
[209] In writing nonfiction, I mix in so much of the science fiction films that I've seen, including a lot of animated ones, because I have to sit through a lot of animated shows with my kids these days.
[210] Yeah, I'm with you.
[211] But even those raise interesting questions.
[212] Do religious people get excited about the idea of this?
[213] A friend of mine, Jonathan Moreno, wrote this wonderful book a number of years ago now called The Body Politic, like the body is the new politic. And what he talked about, which I reference in my book, is how the body politic doesn't fall out on traditional conservative and liberal or religious and non-religious lines. There are these really odd bedfellows where you can have very, very strongly religious, ultra-right conservatives who are together with the farthest left you can imagine, because they coalesce around these issues
[214] in odd ways.
[215] This could be the olive branch.
[216] Well, it could be.
[217] Right, but they come together in rejection, oftentimes even for the same reasons.
[218] But, you know, they're not necessarily talking to each other either.
[219] Well, I would argue, too, there's this weird, instead of it being left, right, more of a circle, right?
[220] And there is a part where they kind of conjoin around, like, super militant, anti-vaccine, liberal, organic fundamentalists.
[221] There's some overlap there.
[222] No, that's true.
[223] I mean, I have a few friends who are as liberal as they come, but also as crunchy, organic as they come and make really questionable scientific and health choices as a result.
[224] Well, if we look at the most simple definition of conservatism, it's like, no, there are great ideals that we have that are worth defending and protecting. And there's a kind of caloric or food conservatism, which is: we used to have wholesome food, and now everything's contaminated with GMOs and all this stuff.
[226] And so it's the same tenet, really.
[227] It's like there was a better time we need to get back to it.
[228] Even as far as like births, there's this home birth, which is kind of like anti-hospital.
[229] That's very liberal.
[230] But then anti-hospital, anti-vaccine, anti-all of that institution is also very conservative too.
[231] So yeah, they meet.
[232] I was on the other end of the spectrum with the births of my children, which was: which medical practice do I need to go to that is affiliated with the level three nursery? Because I want to be delivering at a hospital that has the highest level of intensivists available in case anything goes wrong.
[233] Yeah, that's how I would feel.
[234] Where are you drawing the line of what tech is?
[235] Anyways, your book is very tasty.
[236] I hope you didn't really eat it.
[237] Well, it's figuratively tasty.
[238] And it brings up all these great things.
[239] We've already dived kind of right into the middle of some of them.
[240] But the book is The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology.
[241] We have a lot of experts on.
[242] We've talked to a lot of folks that are impressive.
[243] You have too many degrees.
[244] It's literally way too many degrees.
[245] It's a paragraph on Wikipedia.
[246] I know.
[247] I know.
[248] It's exciting.
[249] B.A. in genetics from Dartmouth.
[250] A master's in biology from Harvard.
[251] A PhD from Duke and a law degree.
[252] Do I have that right?
[253] And an M.A. from Duke, too.
[254] And an M.A. from Duke.
[255] Yeah.
[256] Now, the M.A. was on the way to the Ph.D., so it doesn't really count.
[257] It still counts.
[258] We'll count it.
[259] Before we get into your work, I need to psychoanalyze.
[260] Wait, how long are you in school?
[261] First, I need that number.
[262] I did a lot of degrees concurrently, and we could go into the psychoanalysis of it.
[263] I'll say this, which is we have a joke in my family, my dad.
[264] You're Persian, yeah?
[265] I'm Iranian, yeah.
[266] Oh, that's interesting.
[267] I keep hearing you say Iranian, but everyone out here that's Persian says Persian.
[268] Well, I mean, it's interesting because I grew up saying Persian, and nobody seemed to know what that was.
[269] And so I've just sort of leaned into, I'm Iranian-American, right?
[270] But Persian identifies you with the culture.
[272] And I think especially right now, for those of us who strongly identify with the culture, but not at all with the country and where it's become and the regime and any of the things that they're doing with the state, it's understandable to focus on the culture.
[273] But Persian predates Iranian.
[274] It does.
[275] Persia predates Iran.
[276] Did your parents come over in the 70s?
[277] My dad came over in '69.
[278] My mom came over a year later.
[279] They were already engaged, but my dad came here to do his medical residency.
[280] And my mom was finishing school and came over to join him a year later.
[281] They were always intending to go back to Iran and in fact got all the way up to my dad getting on a plane the next day but being called by the hospital saying, no, no, you should wait.
[282] Everything was coded because of the surveillance state and the worries.
[283] It was like the hospital wing has some construction.
[284] You really had to wait a few months.
[285] And then a few months later, the Shah fled Iran.
[286] They were watching the situation, but they had wanted to go back.
[287] But my parents really prized education, being both immigrants, but also because of how my father grew up with little access to education.
[288] He grew up in a small village where there was no school past fourth grade.
[289] His parents were illiterate.
[290] He had to walk, literally. We've all heard the story, like, I walked uphill both ways.
[291] And when I finally went to Iran, I saw he really did walk uphill both ways.
[292] But he walked for like six kilometers a day to go to school through sixth grade.
[293] They finally moved.
[294] And through kind of perseverance and drive, he was able to accomplish a lot in education.
[295] So he always said to us, as long as you are in school, I will do whatever I can to support you.
[296] And then when it came to me, he was like, all right, at some point, enough is enough.
[297] You're going to have to.
[298] Time to like go do something.
[299] You're running out of degrees.
[300] Exactly.
[301] Exactly.
[302] Like, there's got to be an end game here.
[303] I didn't mean infinite.
[304] But if you start in genetics, what were you originally?
[305] I was originally pre-med.
[306] Really, I had taken a high school genetics class, thought it was fascinating.
[307] Went to college and was like, I love the stuff.
[308] It's super interesting.
[309] And took organic chemistry.
[310] I did take orgo.
[311] I actually made it through the entire curriculum of pre-med.
[312] But what I found was that every single internship, it was like I was cheating.
[313] I would find the policy -based thing rather than the true science -based thing.
[314] So, I mean, I worked in a hospital, but I worked on a research study about spontaneous abortions and trying to understand the causes of them.
[315] And I went to Kenya and worked on women's health and reproductive rights rather than going to work in hospitals.
[316] And I was always leaning toward that.
[317] And so by the time I got to applying to medical school, I was like, you know, I think I'm going to take a little time and figure out if this is what I really want to do.
[318] And I was really terrified to tell my parents that because I was going to be their doctor and coming out of Iranian culture.
[319] You're going to be the father so proud.
[320] They need just one doctor.
[321] They just need one doctor.
[322] And there's pretty much like there's doctors and then there's everybody else, right?
[323] That was my grandfather, too.
[324] He was like, where is my doctor?
[325] Exactly.
[326] And I'm the third and youngest, so it was all riding on me. And you shit the bed.
[327] I hate to say it.
[328] I did.
[329] I totally did.
[330] I totally did.
[331] But they were so supportive of me. And my dad actually was like, oh, thank God.
[332] You would be a terrible doctor.
[333] I'm sure you could handle it intellectually.
[334] But he was just like, your personality is not well-fitted for this.
[335] There's a high degree of disillusionment among current doctors.
[336] I've talked to so many doctors who say, like, I wouldn't want my kid to be a doctor, which is so shocking.
[337] When I finally sat down to talk about it with him, he was like, medicine is not what...
[339] It was in the 70s.
[340] Yeah, and he was like, it's not what you see.
[341] It's not the relationship I've had with my patients.
[342] And that was, like, the growing height of managed care, you know, and HMOs, as I was going in.
[343] So from there, I really tried to figure out what I wanted to do.
[344] And I knew that I was really interested in this intersection of policy and science.
[345] And so a lot of those degrees were an exploration of that, which is I was really interested in behavioral sciences.
[346] I did the master's while working at a consulting company to try to figure out if I was to go get a PhD or do an MD-PhD or go do a JD-PhD or something.
[347] And there was this question in this class on behavioral genetics about behavioral genetics being used in the criminal justice system.
[348] There was a whole chapter on it.
[349] And I was like, that's what I want to do with my life.
[350] And what was the tantalizing concept?
[351] They were testing criminal defendants for a gene for criminality.
[352] And I thought, wow.
[353] Like a predictive gene.
[354] Yeah.
[355] And they had found, like, an increased incidence of XYY. I mean, this all turned out to be BS, but it was fascinating to me at a philosophical level, which was: what would that mean for why we hold people responsible and who we hold responsible?
[356] One terrible outcome is we're going to preemptively manage or surveil people who have this genetic disposition.
[357] Conversely, we're maybe not going to hold people accountable who've murdered because they have it.
[358] It's like, what side of that do you like?
[359] Neither for me. The double-edged sword of it, right?
[360] But so I went to law school ultimately after putting out applications to kind of every program and every combination of program.
[361] My first day of criminal law, I walked up to my law professor and I was like, hey, so I'm here because I really want to think about, and write about, the role of behavioral sciences in criminal law.
[362] And he's like, okay, cool.
[363] I want to support that.
[364] Let's try to get through the first semester of law school.
[365] He ended up being a great mentor and co-author with me. But like, yeah, I did the PhD concurrently with the JD.
[366] It took a couple of years past the JD to finish the dissertation.
[367] And so a lot of those degrees I did, really I did them all within, I think, like seven years.
[368] Oh, wow.
[369] All the graduate degrees.
[370] So you boogie.
[371] Your father didn't have that much to complain about.
[372] Also, philosophy's in there.
[373] Yeah, I got a PhD in philosophy.
[374] This is lovely.
[375] Probably our favorite people to talk to are philosophers and or psychologists who trade heavily in philosophical debates and thought experiments.
[376] All of those things.
[377] Yes.
[378] We often quote the many Jonathan Haidt thought experiments that are so juicy and fun to get into.
[379] And we make some up.
[380] Some successful, some.
[381] Yeah, some are rough.
[382] You gave a TED talk in 2018 called When Technology Can Read Minds, how will we protect our privacy?
[383] So this is five years ago, and you're starting to raise the flag on a bunch of stuff, and you ask the question, should we have a right to Cognitive Liberty?
[384] And tell us from the five years ago standpoint some of the things you are pointing out.
[385] And I guess let's start with how the brain works, what it emits, how we can actually measure that and look at it.
[386] What I described in the TED Talk was just one aspect of it, which is an average person thinks thousands of thoughts each day, right?
[387] As a thought takes form, you have neurons that are firing in the brain.
[388] When you have a particular dominant thought or experience, like you're happy, you're sad, you're engaged or bored, hundreds of thousands of neurons are firing at the same time, and they're giving off tiny electrical discharges, and those discharges can be picked up through neurotechnology, EEG, which is electroencephalography.
[390] I tried to say it nine times when I was doing my research.
[391] Say it one more time for us.
[392] Electroencephalography.
[393] Incephalography.
[394] I guess maybe now I hear you.
[395] Encephalography.
[396] It's a rough one.
[397] Can you do it really fast?
[398] No, I can barely do it really slow.
[399] I was just admitting to that.
[400] You can just go with EEG.
[401] I'm going to always stick with EEG.
[402] But there are other modalities.
[403] Like there's fNIRS, functional near-infrared spectroscopy.
[404] That's a cool one that's coming later that has better resolution in the brain.
[405] We can measure electrical activity in the brain.
[406] We can measure using infrared lights.
[407] Using technology, now the average person can wear these devices that pick up the electrical activity in their brain to some degree.
[408] If you have many, many electrodes on your head, you can pick up more of the activity.
[409] When you show the image of it, it's kind of like these clouds of neurological activity, or clouds of electricity.
[411] Yeah, so I showed a brain visualizer, which was cool because you could see different areas of the brain lighting up.
[412] And that's not quite precise, because really it's not like you're using some parts of your brain sometimes and other parts of your brain at other times.
[413] But what is possible to see with EEG is you can pick up these different patterns, different brain waves, and the different frequencies at which those brain waves are represented as you have any kind of brain state.
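To put a rough technical frame on the "brain waves at different frequencies" idea: the sketch below, which is illustrative and not from the episode, shows how a single EEG channel is conventionally reduced to power in the standard delta, theta, alpha, beta, and gamma bands. The sampling rate and window length are assumptions.

```python
# Minimal sketch (illustrative, not from the episode): turning raw EEG into
# the conventional frequency-band powers discussed here. Assumes one channel
# sampled at 256 Hz; scipy's Welch method estimates the power spectrum.
import numpy as np
from scipy.signal import welch

FS = 256  # sampling rate in Hz (assumption)
BANDS = {  # standard EEG band conventions, in Hz
    "delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
    "beta": (13, 30), "gamma": (30, 45),
}

def band_powers(eeg):
    """Integrate the power spectral density over each frequency band."""
    freqs, psd = welch(eeg, fs=FS, nperseg=FS * 2)  # 2-second windows
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = float(np.trapz(psd[mask], freqs[mask]))
    return powers

# Toy check: a strong 10 Hz rhythm should show up as dominant alpha power.
t = np.arange(0, 10, 1 / FS)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
print(band_powers(eeg))
```

Pattern-recognition systems like the ones discussed next typically start from features like these rather than from the raw voltage trace.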
[414] And what's happened since 2018 until now is AI has gone nuts, right?
[415] I mean, our ability to actually like take huge data sets and do pattern recognition is now extraordinary.
[416] So, yeah, let me just put a real fine point on that.
[417] So the images you show, us humans would look at that, and you're going to see some areas of overlap.
[418] You're going to go, oh, okay, these two areas are kind of firing.
[419] They're emitting electricity, but they're kind of clustering over this area.
[420] We know what this area of the brain does.
[421] It's so rudimentary what we could probably ascertain by looking at this data.
[423] But now we have AI, which is capable of recognizing patterns on a magnitude so much greater than humans.
[424] And so much faster.
[425] So in 2018, when you were giving the TED Talk, it was almost before its time.
[426] Right.
[427] And the thing that was cutting edge or rumored to be was that there was some AI that could actually recognize the pattern on an EEG that they would be able to tell you what either color, shape, or maybe a single letter that the person was thinking, which already just that is insane.
[428] Yep.
[429] At that point in 2018, there's some really potentially exciting things about that when we think of the pros.
[430] So in 2018, first, I'd been thinking about the stuff for a really long time.
[431] I'd even been playing with it for a long time.
[432] I suffer from chronic horrible migraines.
[433] I am not good at meditating on my own.
[434] And with these devices, some of the early applications are: you can use software that takes whatever brain activity the device is picking up, sends it by Bluetooth to your phone, and then gives you some information, some feedback.
[435] Like, yes, your brain state is in a meditative state or yes, your brain state is super stressed out.
[436] Like alpha waves or?
[437] Alpha waves, beta waves.
[438] And it's really the balance between them.
[439] So you got alpha, beta, delta, thamagata, theta, theta, not.
[440] Theta. Alpha, delta, beta, gamma.
[441] They just received one of your degrees.
[442] I hate to tell you.
[443] I think they did.
[444] Yeah, they took one back.
[445] Like the Olympics, we'll talk about Marion Jones later, but.
[446] Absolutely.
[447] Just like that.
[448] Yeah.
[449] But so in any event, they can pick up the balance between these waves and then give you real -time feedback.
[450] So you're trying to meditate.
[451] You're trying to get your brain state into this meditative state.
[452] And you can get neurofeedback.
[453] You can get auditory feedback, like birds chirping.
[454] Like, yeah, you're there.
[455] My guess then would be that also triggers this reward-center learning process that would help then reinforce that activity and make it even easier for you to get into it in the future.
[457] Totally right.
[458] And sustain it for longer periods of time.
[459] Right.
[460] Very helpful and you can build up, you know, these kind of muscles.
[461] There are these little golf games that you can play to try to, if you can keep your concentration and focus, you can get the golf ball into the hole.
[462] And that can help people who have everything from ADHD to just people who are trying to improve their ability to concentrate and we're all distracted all of the time, right?
[463] And so having that ability can be pretty neat.
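The feedback loop she describes, the balance between waves plus an auditory cue, can be sketched on top of the band_powers helper above. The threshold and cue names here are invented for illustration, not taken from any real device.

```python
# Sketch of the neurofeedback loop described above (the threshold and cue
# names are hypothetical). The idea: reward the user when the alpha-to-beta
# balance rises above their own calibrated baseline.
def neurofeedback_cue(powers, baseline_ratio):
    ratio = powers["alpha"] / max(powers["beta"], 1e-12)
    if ratio > 1.2 * baseline_ratio:  # 20% above baseline: assumed target
        return "birds chirping"       # positive cue: "yeah, you're there"
    return "silence"                  # neutral cue: keep trying
```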
[464] There were already some emerging health applications, too.
[465] One's going to really be pertinent to Monica, who has some mild epilepsy.
[466] One of the things I was really interested in, and this wasn't in 2018, this is more recently.
[467] So I was looking at some of the health applications for consumer brain wearables because consumer devices have far fewer electrodes.
[468] They pick up less of that brain activity in the firing.
[469] And one of them that was really cool coming out of Israel.
[470] It's called Epiness.
[471] And with epilepsy, oftentimes you don't know.
[472] Yeah.
[473] Right.
[474] A lot of the times you don't know.
[475] Well, sometimes you can get some early warning.
[476] I think some people, yeah, they have a sense of like it's about to happen.
[477] But moments before.
[478] Can I tell you one anecdotal fun thing just for fun?
[479] My uncle had it so severely and I witnessed many of them and there was absolutely zero warning, right?
[480] We'd be at dinner.
[481] He had it so bad.
[482] This was the apex.
[483] He got the surgery that cut a corridor in his brain, and he didn't have any of the side effects that were potential, which were fascinating; they said you could have a radically different personality afterwards. But anyways, he was on a work trip in Chicago. He was at the window of a McDonald's drive-thru, looking at the woman, waiting to find out how much he owed, and she said, sir, are you okay? And he said, what? She said, do you want me to call an ambulance? Silent seizure: he had rolled his car, and it landed on its wheels.
[484] He then drove into McDonald's, ordered and drove up, had no idea any of it had happened, had to go to the hospital. I mean, he was severely injured.
[485] Which is so weird, because you do have a postictal period.
[486] There's like a period of time where you're not online.
[487] You're just totally not online.
[488] In his not -online thing, he just drove into McDonald's and ordered some food.
[489] And then, yeah, she was like, you're severely injured.
[490] Is that wow.
[491] Okay.
[492] Well, okay.
[493] So epilepsy, obviously a huge problem.
[494] And it's a huge problem because some people, of course, are treatment resistant.
[495] Drugs are not effective.
[496] It doesn't control all seizures.
[497] And they don't get warning time.
[498] So this company and a number of other ones too have been looking for early electrical changes in the brain that can be picked up.
[499] And this company, I haven't seen the data, just what they've shared with me, and they've published some results and have been out there over the past few years trying to get regulatory approval, claims that they can pick up an epileptic seizure an hour in advance with a consumer brain wearable.
[500] Really?
[501] And that would be transformative.
[502] Yes.
[503] You know, benzodiazepines, things like Ativan.
[504] You could administer within that hour.
[505] You could, right?
[506] And you can't stay on those long term because they're addictive.
[507] They don't work over time.
[508] You build up tolerance to them.
[509] But if you know that, even somebody who's treatment resistant could take just-in-time medication.
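As a rough illustration of what "pick up a seizure an hour in advance" implies computationally (a generic sketch, not Epiness's actual algorithm, which has not been published in detail): a pretrained model scores each window of streamed brain data, and an alert fires when predicted risk crosses a threshold.

```python
# Generic sketch, not Epiness's actual method: score streaming EEG feature
# windows with a pretrained classifier and alert when predicted risk crosses
# a threshold, early enough for just-in-time rescue medication.
def seizure_monitor(window_stream, model, threshold=0.8):
    """window_stream yields feature vectors (e.g. band powers per window);
    model is assumed to expose a scikit-learn-style predict_proba."""
    for features in window_stream:
        risk = model.predict_proba([features])[0][1]  # P(pre-seizure state)
        if risk >= threshold:
            yield "ALERT: elevated seizure risk in the coming hour"
```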
[510] My parents wear Apple Watches, which is good.
[511] My mother has developed an abnormal heart rhythm, and it gives her real-time alerts because of the ECG and the heart rate monitor, and that's part of what alerted her recently when she needed to get medical attention.
[512] I think real-time wearable sensors can be really powerful, and we haven't used real-time wearable brain sensors for people to be able to pick up urgent health information and to even be able to track and know what urgent health information is.
[513] Like one time I had a migraine, I was with my parents.
[514] It was the first time I ever had a vertigo attack.
[515] And I lost sensation on the left side of my body right afterwards.
[516] So first, everything started spinning.
[517] Then I lost sensation.
[518] And I thought I had a stroke.
[519] And so I was flipping out.
[520] I was glad my parents were with me because I said, I can't feel the left side of my face.
[521] I can't move the left side of my face.
[522] And my dad was like, it is moving.
[523] I promise you, it is actually, you can't feel it, but it's moving.
[524] This is where having a dad that's a doctor is helpful.
[525] Because if you just, your dad's like a muffler installer and he tells you, you're like, I don't know if he knows.
[526] And I don't know if I'm going to trust that.
[527] I love him so much, but I don't know if he knows.
[528] But, you know, what if you could differentiate between migraine and stroke from a sensor, right?
[529] Because there are a number of people who go in with stroke and are misdiagnosed with migraines.
[530] They have a terrible headache and they're sent home.
[531] And so I think the potential for treating our brain health the same way as we treat the rest of our physical health through brain sensors could be transformative.
[532] That's not what my talk was about because there's a real scary downside potential of all of this.
[533] Yeah, I just wanted to paint a rosy picture before we got into critical.
[534] I mean, I'm actually giving another TED talk in April and I'm focusing on a lot of those really positive aspects of it because I don't think that the answer is let's ban neurotechnology.
[535] I mean, there is tremendous promise.
[536] It's like that great documentary, which is like, no, social media or the internet is both dystopia and utopia.
[537] It's both things.
[538] One other thing I think worth pointing out is, like, quadriplegics are going to be able to type with their brains.
[539] Well, it's not going to just be quadriplegic.
[540] But so first, yeah, for implanted brain computer interface and not even implanted because some of the wearables are really going to help people who have lost the ability to move.
[541] So implanted neurotechnology, which is only right now in about 40 people worldwide, has already restored for some people the ability to speak.
[542] There was an article that just came out a few weeks ago about an ALS patient who had lost the ability to communicate, because it's a progressive motor disease where you lose even the ability to blink, who was able to communicate from their brain using a brain-computer interface at a rate of 62 words per minute.
[543] Whoa.
[544] Which is really extraordinary.
[545] I think at the end of Stephen Hawking's life he was only able to communicate 15 words a minute using the kind of best available technology and that was extraordinary technology at the time.
[546] So I think that that, That's amazing.
[547] Or the ability even to just regain some independence, turn on lights, operate your computer, move a wheelchair.
[548] There's a lot that this technology can do for both people who have lost abilities.
[549] But, you know, meta is also looking at neural interface for all of us to be able to type with brain activity.
[550] Elon's, what is it, hyperlink?
[551] What's it called?
[552] Neuralink.
[553] But that's implanted, right?
[554] Meta wants you to just be able to wear a watch and do it.
[555] And then think and it type.
[556] Or like type on a virtual keyboard and have it type.
[557] Now we're going to pause for a philosophical part of this.
[558] When you're laying out all the future potential for these things, there's a few things that are seemingly exciting.
[559] The apex of it would be to be able to transfer a thought to another human being without being encumbered by communication and everything else.
[560] That seems incredible.
[561] To truly be understood.
[562] Some people have verbal dexterity, some don't.
[563] It could be this amazing thing.
[564] But when you describe the kind of consumer product applications and it's like, oh, you would think I need to send that person a text and it would do it.
[565] What I immediately thought of is that these hurdles that we have might actually be what's keeping the amount of information, which is already overwhelming.
[566] It's some kind of a threshold where if every time someone who meant to text me could, and conversely me, I can't tell you how many times I'm in the shower and I think, oh my God, I got to text so -and -so.
[567] It's the fifth time I thought that.
[568] Well, that's great because I've reduced by one fifth the amount of output.
[569] And then what you'd be receiving, I don't know that the goal should be when we think it, we can make it happen.
[570] There might be something healthy about a natural laziness or the hurdles or the mechanics that actually reduce the amount of output.
[571] We don't all need to fucking communicate as much as we're probably inclined to do every second.
[572] It's kind of like, what did email do to us?
[573] Yeah, no, I agree.
[574] So this would be email times a quadrillion.
[575] It doesn't have to be, though.
[576] You're imagining a world in which all this does is lead us to be hyper-efficient.
[577] And it's really about removing barriers to sending more text messages.
[578] Uh -huh.
[579] But I'm hoping that's not the future we go down, that it's not just that we've removed friction.
[580] And so therefore, everything gets exponentially more stressful.
[581] It's almost like you'd be sharing everyone's thoughts, and everyone's thoughts are already too voluminous, and it would be overwhelming to have that level of ease.
[582] I have a hard time imagining that what is going to happen is that you're going to have a brain that doesn't have the ability to filter out or turn off any incoming communication.
[583] No, even if it was as simple as it still goes through this interface, my phone.
[584] Right.
[585] But everyone who's ever had the thought in their head, I want to say hi to Dax now is doing it because there's no effort required.
[586] My inbox now is going to be completely unnavigable.
[587] I'm not going to know what's high priority.
[588] I'm hoping that people will learn to continue to, say, self-filter, right?
[590] So, I mean, just because...
[591] That's optimistic.
[592] That's very optimistic.
[593] You know, I would say I am definitely optimistic about humanity.
[594] One thing that is really interesting is I was walking through the airport to get here and I had a tight connection.
[595] And I noticed that I had to go through a bunch of human hurdles.
[596] And those humans were people who were standing in totally inappropriate places with their phones looking down, unaware of their surroundings because they were stopping to interact with their phones.
[597] You think it would clean up that clutter?
[598] People would just be cruising.
[599] No, it's not that.
[600] It's the sense in which rather than being really thoughtful about our interactions with technology, we're just addicted and glued to technology all the time.
[601] And I hope that brain-computer interfaces, or neural interfaces, or wearable brain sensors don't make it so frictionless that all that happens is our brains are addicted to technology at all times, because the hurdle to me walking over and turning on the television is gone, because I can just look up and turn on the television; or the hurdle to interacting with my phone is gone, because I can just scroll through the phone by thinking about it.
[602] Like, I don't want to be connected all of the time.
[603] Stay tuned for more armchair expert, if you dare.
[621] Given what we've seen of how people have embraced and used technology over the last 15 years, what on earth would make you optimistic that that's not exactly what would happen?
[622] I mean, truly, how could you possibly think?
[623] The easier it gets, the worse it gets around every corner.
[624] It seems crazy to me that it would get completely effortless and that it would go down.
[626] Here's what I predict.
[627] Just like the like button is a heart button and that's addictive, right?
[628] Here's how it would be pitched to you.
[629] I think in my head 12 times a week I should call my mom.
[630] When I have that thought, it'd just send my mom a little nudge thinking about you and she'd feel great.
[631] That'd be lovely for her.
[632] And then I may or may not have to call her as much.
[633] It would be pitched like, wouldn't it be nice to know when your kids are thinking about you?
[634] Yes.
[635] Okay, so that's just one isolated thing.
[636] So all I got to do is think about her.
[637] She gets a little nudge.
[638] She feels good.
[639] Now we're going to be living in the burbling dopamine dump of getting nudged all day long by everyone.
[640] I barely have the time to carve out each day to be able to call my mom and let her know I'm thinking about her.
[641] But if I could, 10 times a day, send her little internal hugs to let her know that I'm thinking about her.
[642] I think that would be awesome.
[643] So you're quickly putting it in a dystopian world, the idea of interconnectedness.
[644] What I'm saying is that what is well documented.
[645] We are in a dopamine deficit disorder and it's driven by this exact thing.
[646] So what sounds really good, mom would know she's loved.
[647] Mom would quickly get on the treadmill of dopamine deficit cycle.
[648] No, you're right.
[649] I mean, I write about this in the book.
[650] I write about the concept of mental manipulation.
[651] We're not supposed to be told we're loved 80 times a day.
[652] We love that, but we're not designed as a human to do that.
[653] You and I are in heated agreement over the idea that addictive technology is terrible.
[654] I write about this in the book, and I'm really worried about it as a violation of our freedom of thought.
[655] And part of the idea of cognitive liberty is to create the space for us to be able to have self -determination over our brains and mental experiences.
[656] And I believe that technology has taken it away.
[657] Now, is there a way in which neuro technology can enable it?
[658] That's not what I'm arguing for.
[659] I think you and I have two different concerns is what it is.
[660] You have a legal and civil rights and civil liberties concern?
[661] We share this concern about like buttons, about dopamine, about brain heuristics, about technology hacking into the...
[663] Hedonic treadmill.
[664] Totally 100 % agree with you about all of that.
[665] But you fear the state.
[666] I fear the individual.
[667] I think that's where we differ if I had to sum up our positions.
[668] Yeah.
[669] I think I have more faith in the individual.
[670] Have you been on Twitter?
[671] Well, but there it's the individual interacting with a corporation that has designed a platform.
[672] And I'm not going to call out Twitter in particular.
[673] All of them have, right?
[674] Yeah, yeah, yeah.
[675] have designed platforms to do exactly what you're talking about, right?
[676] Which is to hijack and manipulate our brains to need.
[677] Delay load speeds to string out the anticipation.
[678] I mean, Tristan Harris wrote about all this stuff beautifully.
[679] And that manipulation of our brains, that's part of this idea of cognitive liberty that I'm writing about.
[680] We need to reset the terms of service.
[681] It's the individual interacting with corporations and the individual interacting with governments.
[682] and it's technology that is designed to be able to really hijack our cognitive liberty.
[683] So I think we agree where I'm optimistic and maybe it's impossible to be optimistic is I don't think individuals would have chosen that path or would choose that path.
[684] I think it is that corporations have chosen how technology is deployed in society and we have fallen into that trap.
[685] Yeah.
[686] And so the question is, could neurotechnology be designed differently if the terms of service were different?
[687] And I think if we reset with a right to cognitive liberty so that we're dictating what it means to have autonomy over our brains and mental experiences, we wouldn't start with deploying the technology in that same way.
[688] Let's earmark that, because I want you to lay out your actual argument, which is great.
[689] There's criminality issues to be considered.
[690] There's military applications that need to be worried about.
[691] There's definitely a ton of civil liberties things that are very legitimate for you to lay out.
[692] My worry is, and we're already so far gone down this road, that everyone now thinks their thoughts are super precious and really important, that every thought you have needs to be shared, can be shared, a lot of people can hear it, that you don't really have to think about what you're putting out there, you can just do it. And this feels like a very extreme version of that, where you have any thought and you've shared it immediately. But I'm not arguing for it to be shared either, right? I'm actually arguing for the opposite. I'm worried about it being exposed rather than keeping it inside.
[693] I want you to have that private space.
[694] Now, you're right.
[695] Neurotechnology could ease your interaction with other technologies so that you overshare even more than you already do.
[696] But what I worry about is that involuntarily, your brain is now being decoded, being commodified, being shared in ways that you never would have wanted, and that you ought to have mental privacy, where you can have a whole bunch of thoughts that you hopefully never share with other people.
[697] Right.
[698] But isn't that just regular thoughts?
[699] Well, it's regular thoughts today, including if you're bored or frustrated or engaged.
[700] There are thoughts, like your deep inner monologue, and then there are brain states that can be decoded.
[701] And the brain states that can be decoded are things like you're bored, you're excited.
[702] And that is something that you keep to yourself, and I don't think that that should be revealed without your expressed desire to do so.
[703] So the dystopian things that you can predict or you fear for sure is that as this AI gets better at reading EEG, and there'll be improvements, as you said, there's an infrared thing coming.
[704] So as different things are available, you even mentioned there's sensors in the headrests of the train conductors in China that are monitoring for fatigue and mental load and stress and emotional state.
[705] Yeah, we're wearing earphones, right now, right?
[706] I mean, there are major earphone manufacturers partnering with neurotech companies to embed sensors into the cups that go around your ears, and earbuds that have brain sensors inside of them that can pick up EEG activity from inside your ear.
[707] So it's being integrated into multifunctional devices like your watch, your earbuds, your headphones.
[708] So it's not some extra device that you're going to wear.
[709] It's just that brain activity is going to be picked up while you're listening to music.
[710] Yeah, it's not going to be cumbersome or noticeable.
[711] And you have this little passage.
[712] I don't know if you got it from someone or you wrote it yourself.
[713] It was in italics, so I'm not sure.
[714] But it paints a picture of someone working in the near future at a desk and they get a little chime on their computer and it's rewarding them for their clarity of thought and their productivity and maybe their bonus is based on that.
[715] And that's great.
[716] But then come to find out a coworker is under investigation for some kind of embezzlement.
[717] And now all brain activity of all employees has been logged in this hard drive.
[718] And so they're going to really be able to easily look for some correlation and brain patterns from the accused to see who was working in concert with this person.
[719] And then the fear of the person at the desk is, oh, shit, well, I was working on something with them on the side for a startup, which wasn't totally above board for the company.
[720] That's going to look suspicious.
[721] And the notion that all of this would have been recorded, all your mental states, and there would actually be an AI that could read the mental state and realize you were fantasizing about the coworker, that this is one of the dystopian outcomes of it.
[722] I wrote that.
[723] It's in the introduction to my book, and I actually had that exact scenario animated.
[724] Because I wanted to create a little trailer to help people understand it.
[725] I showed that at the beginning of a talk that I gave at the World Economic Forum in Davos.
[726] That little dystopian trailer went absolutely viral on TikTok.
[727] I started getting death threats about it, because people thought I was advocating for that, right?
[728] That I was saying, like, this is what we ought to be doing.
[729] But I wrote that to try to help people understand the full spectrum of what's possible today with existing technology.
[730] So it goes from a person wearing these earbuds, which already exist, that can pick up brainwave activity that allow them to track their focus at work.
[731] And that's what a lot of companies are selling these devices to corporations for now to enable people to track their focus and attention.
[732] And this is already happening in China, where they already have workers required to wear brain sensors to track their fatigue levels or to track their emotional stress throughout the workday?
[733] That even sounds benevolent.
[734] Like, oh, if we notice you're really struggling, we might give you a break.
[735] Yeah.
[736] The sales pitch is that, right?
[737] But, I mean, people really obviously do not like workplace surveillance.
[738] And there are more than 5 ,000 companies worldwide that are already using SmartCap, another one of these companies' technologies.
[739] And so I start there, right, which is a realistic, like, this-is-happening-today scenario.
[740] Then some of the things that are imagined by a lot of the companies as coming on board in the next couple of years, which is swiping your computer screen because you can do things like left, right, and that can be decoded from brain activity.
[741] So to eventually replace your mouse and your keyboard using brain sensors instead of having to use a mouse.
[742] Repetitive stress disorder, cut down on carpal tunnel.
[743] And then, you know, the person's kind of scrolling through it and sees that there are some changes in their sleep patterns, because lots of companies have launched these; LG, just at CES, launched a pair of earbuds that you can wear while sleeping to track your brain activity.
[744] So you notice an unusual pattern and you send it off to your doctor and say, can you take a look at this data?
[745] People are doing this all the time with their quantified-self data.
[746] I mean, really, and they're like, what are we supposed to do with all this?
[747] That's why a doctor's going to turn it over to AI.
[748] Yes.
[749] And say, can you look at this and see if there's anything unusual?
[750] The whole thing becomes virtual.
[751] Yeah.
[752] And then the idea that employers would track brain metrics, well, I mean, they're already tracking everything else in the workplace, from keystrokes on; 80% of companies during the pandemic admitted that they're using, you know, this pejoratively named bossware, even turning on webcams at home offices to see what people are doing.
[753] And so the idea that they wouldn't use brain metrics to see like who's paying attention and who's focused.
[754] By the way, a company came up to me after my talk at the World Economic Forum and was like, that was so interesting.
[755] We're the perfect case study for you because we've already tested that technology out on thousands of employees.
[756] And we're about to like sell this worldwide.
[757] And I was like, okay, wow.
[758] You imagine it and, you know, it's happening.
[759] Yeah.
[760] So that's the next piece of this scenario.
[761] And then there's all this really interesting brain data that shows when people work together, you start to see patterns of synchronization between brain activity.
[762] And menstrual cycles.
[763] And menstrual cycles.
[764] I mean, so police subpoena data all the time from third parties.
[765] And here's something that's interesting.
[766] I was just reading an MIT Tech Review article about a conference that just happened, where one of my colleagues out of Canada, Jennifer Chandler, mentioned that a brainwave data company had told her that law enforcement had subpoenaed brain data from an epileptic patient who has implanted electrodes, to figure out if the person was having a seizure when they assaulted a police officer.
[767] So this idea that brain data could be subpoenaed by law enforcement, which I talk about in the book, apparently has already happened.
[768] Oh my God.
[769] I don't know all the details of that case.
[770] I've reached out to learn more about it.
[771] But that's the kind of scenario that I imagine.
[772] And part of the reason I imagine that is that there are at least a couple of cases that I talk about in the book where police have used Fitbit data to try to challenge, for example, a person's alibi who says, no, no, I was sleeping at the time.
[773] Yeah.
[774] And their heart rate was 120.
[775] Yep.
[776] Or it shows that they were active and moving at the time, or it confirms when the person says, no, no, I was sleeping.
[777] Already, from Apple Watches with sensors to Fitbits, this data is being subpoenaed by law enforcement.
[778] And so the idea that you wouldn't have brain data subpoenaed, I think people would be very naive to think that that wouldn't also be just as likely to be subpoenaed.
[779] So I spell that out to kind of give people a very realistic sense of, here's what you can already do with the technology, and understand what that means for us.
[781] Could you add to it how we can see whether people lean liberal or conservative, how we can actually visually detect that?
[782] Yeah.
[783] So there's very good data, at least for better imaging technologies like functional magnetic resonance imaging, fMRI.
[784] So for a long time, people have been doing these studies to see whether there are differences in brain structures or in how the brain reacts.
[785] And with EEG, they have found similar findings that you can see how a person reacts to messaging, for example.
So you could put a bunch of statements in front of them, like a survey or on a screen, or even show them a bunch of images of Democrats who are well-known and a bunch of images of Republicans who are well-known, and see how their brain reacts to those different images.
And you can, with a pretty high degree of accuracy, if you, you know, go through enough different images and messages, get a pretty good sense of a person's political ideology.
[788] Yeah.
[789] And when you're applying for a job, you're not trying to go in there saying I'm left or right.
[790] You don't know what the boss is.
[791] You'd like to keep that private.
So the political ramifications, I think it would be very obvious to anybody why that's something that you should be able to keep private at every turn: for employment, for being seen as a threat to the government.
You use the example, of course, of talking to your family members in Iran during the Green Revolution, and them being afraid to talk to you on the phone and tell you what's going on in fear that somehow they're being bugged.
[795] Well, it could be much deeper than that.
[796] They wouldn't even be able to hide it if the phone they're talking on has this software, right?
[797] Well, and purportedly in China, given the mandated use of brain sensors, they're already doing that kind of political ideology testing as well.
[798] That's a really terrifying possibility with everything that's been going on in Iran.
[799] You know, I have a lot of first cousins and aunts and uncles who still live in Iran.
[800] And similarly, they can't talk with us.
[801] They can't be open because they're worried about their phones being tapped at all times.
[802] It really almost doesn't even matter what you can decode from brain sensors.
[803] If the state is mandating its citizens wear them, the chilling effect that that has on people to even be able to think freely is really profound.
[804] You start policing your thoughts.
[805] That's right.
And they can just say, well, we can read your every thought.
[807] They can't, right?
[808] But people don't know that.
[809] And the fear that they could or the fear that there's some advanced AI or technology and you have these brain sensors, you have to wear them.
[810] It's truly the last bastion of freedom.
[811] And if people in an environment like that are worried, they will start to censor their own thoughts to the extent that they can.
And then the ability to ever oppose oppressive regimes, it's terrifying.
It makes me think immediately of the New York cannibal cop trial.
[814] There was a police officer that was engaged in some kind of social network sex scenario.
[815] He's communicating with someone online.
[816] His sexual fantasies are he wants to eat this person.
[817] And then there was a trial.
[818] They were trying to convict him of this.
[819] And for me, as a very big civil liberties advocate, I'm like, your fantasy can be whatever the fuck you want.
[820] There's a big gap between fantasizing for sexual arousal and committing a crime.
[821] And that needs to be truly safeguarded.
[822] Now, this is like five steps beyond that.
[823] Agreed.
Another researcher, his name is Kent Kiehl, who has been studying psychopathy for a very long time.
[825] And he's been looking at different brain changes that you might be able to see through imaging as early as even five years old.
[826] I worry about that same kind of thing with that technology.
[827] You know, I worry about the thought police, but I worry about tracking people and believing, like, oh, this person has a likelihood of being a psychopath.
[828] Now, we don't know if that means they're going to be like an excellent rugby player or if they're going to be homicidal and, like, kill people.
[829] Or they're going to start the next Fortune 500.
[830] Right, exactly.
[831] Yeah, a lot of these people track high on the sociopathy scale.
[832] They track high on all these different scales.
[833] And self -fulfilling prophecy.
Are they just going to, oh, they're going to own it.
[836] Right.
[837] And if you take a child and you stigmatize them and say, like, you have the brain of a psychopath.
[838] Yeah.
[839] And likewise, if you do that for brain sensors, I worry a lot about that.
[840] Me too.
[841] It's almost like the DSM, too.
[842] It just puts us in these boxes of what normal is.
[843] I mean, I worry about that even with the kinds of testing we're doing for employment now.
[844] So these cognitive tests and personality tests for employment, it's like reducing humans to these little puzzle pieces.
You have five key...
[846] Exactly.
[847] Like, here are the key personality types or the key cognitive traits.
[848] that doesn't necessarily make the best employee or the person who's most likely going to flourish there.
[849] Yeah, exactly.
We're looking for the Picasso that's not on any chart.
[851] And we're not going to find them through cognitive batteries of testing or personality testing.
[852] I worry a lot about this kind of reductionism.
[853] I talk about that in the book.
[854] I talk about what one research group called the seductive allure of neuroscience, which is people start to overweight claims that are based on neuroscience.
[855] And neuroscience seems to be more seductive than most.
[856] Yeah, because it's quantifiable.
[857] It seems to people like it's this objective truth, right?
It's as if consciousness, even if observed and quantified, is objective.
[859] Well, it feels like the objective end to psychology that is subjective.
[860] That's exactly right.
[861] And for any psychological trait that can be reduced to neuroscience, people think it's like the holy grail of truth.
[862] Yes.
And that risk I worry about, about taking people and turning them into little bits and bytes that we treat them based on.
[864] You know, it's also how disinformation spreads.
[865] This seductive allure of neuroscience is very dangerous.
[866] We have to resist it.
[867] It's like turbocharged DSM, like DSM through AI.
[868] Yes.
[869] Where we can actually now put you in the, quote, normative or typical bracket.
We've been talking about the ability to observe, but I think there's also some really interesting stuff about how we will not just observe, but augment and change with this technology.
So my favorite part that I read, because again, it dips into the philosophical a lot: we start with Marion Jones, the incredible sprinter.
[872] And this notion of cheating, you have a very provocative and fun and civil liberties forward kind of argument about people using ADHD medication to, quote, cheat on tests or cheat in studying.
[873] So would you just introduce that because I really love exploring this.
[874] Sure.
So I would say I'm definitely against the grain in my views on this.
[876] Even your own school has a policy.
In school, on this, which is: I do not believe using cognitive enhancers is cheating.
I think that policies that treat it as cheating really are just on the wrong side of history and on the wrong side of human flourishing.
So the Duke story is that some students a number of years ago petitioned the administration, apparently, to put using drugs like Adderall, Ritalin, or Concerta on the cheating policy, not the drug policy.
And the cheating policy, if you cheat, you know, it's expulsion, and it's also a different thing.
[881] The stigmatization of being a cheater and being expelled is different than somebody who is an addict and needs help.
You know, they didn't consult any of the bioethicists on campus.
[883] I know that.
[884] To ask them.
[885] They got this great department.
[886] They're like, nah.
[887] Yeah, forget that.
[888] We don't need to talk to them.
[889] Exactly.
And made it a part of the cheating policy.
[891] We should find some experts.
[892] They're down the hall.
[893] Yeah, fuck it.
[894] Let's just wing it.
[895] Let's listen to the students.
Let's take the strong normative stance against it.
[897] This one student's so afraid someone else will get a great grade on their test.
[898] It literally was.
They were like, this is an unfair competitive advantage.
[901] And I think people confuse these arguments.
They confuse the idea that not everybody has access with treating it as cheating.
[903] And I think they are separate problems.
There's the question of whether or not enhancing, using something like a drug or a device or study aids or anything, is cheating.
Tutors for the SAT.
Tutors or exercise.
[907] Let's talk about diet as a kid when you have a single parent.
[908] Let's talk about attention.
[909] Books read to you as a kid.
So many things impact the brain and brain development, versus access, which is a real and important issue that we as a society need to address, across so many different technologies and across so many different ways in which healthy brains do not develop equally.
[911] So I look at the kind of sports analogy, which is what most people turn to.
[912] They hinge it generally on just like steroids and sports.
[913] Exactly.
[914] And steroids in sports, first of all, I think sports is a limited set of artificial rules that we have developed.
[915] We've said these are games, these are the rules of the games, and these are how they have to be played.
[916] And the game is defined by the rules.
[917] That's right.
[918] That's all it is.
[919] That's all it is.
[920] A list of rules.
[921] That's right.
[922] And I say life is not that.
[923] Life is not a game.
[924] It is not a set of rules.
It is not zero-sum.
[926] There's not a first place winner.
[927] That's right.
[928] And there are competitions.
[929] There are competitions for spots into jobs.
[930] There are competitions for spots into everything.
[931] But life and the human experience, which is about our brains and mental experiences, about our bodies, about our interaction with other people, about our interaction with the environment, all of that can be enhanced, whether it's through drugs or devices or nutrition or any of these other things.
And I believe cognitive enhancers should be part of a person's self-determination to decide whether or not they want to use them in order to maximize their own flourishing.
I don't say that without limits, which is, you know, if they're super unhealthy for you, if they're likely to cause all kinds of health problems, then there are understandable needs for oversight and regulation to make sure there are safe and effective drugs available.
It's just, treating it as a category that's normatively wrongful seems just crazy to me.
Really, what about coffee?
[935] Right, and that's one of the examples I do.
[936] Coffee, exercise, right?
[937] Where are we going to draw the line?
[938] Also, and they just don't acknowledge this in sport, and I think you do a great job.
You tell the history of this cross-country skier, who was so dominant, to a level that's not yet been matched; he beat somebody by 40 seconds.
[940] They ended up doing some tests on him.
[941] And what they found out is by some genetic mutation, he happened to create a lot more EPO than anyone else.
[942] And if you have a lot of EPO, you have more red blood cells, more oxygen.
And that's exactly what Lance Armstrong was doing: he was elevating his EPO levels.
[944] And he's a disgraced fallen champion.
[945] And this other guy is the greatest member of a sport to ever live.
[946] And you're like, okay.
One guy hit the genetic lottery and is a hero.
[948] The other guy augmented himself to be just like the hero and he's a villain.
[949] This is a little precarious.
[950] Go ahead, Monica.
[951] She loves sports.
[952] She's a state champion.
[953] She loves the Olympics.
[954] But do we all have the right to be in the Olympics?
[955] So one of the things I say about sports is we've decided what it is we want to celebrate.
[956] and what we're cheering for and what the rules are.
[957] As arbitrary as it is, I think, we have decided that we would rather celebrate the skier who has...
[958] Exactly, rather than the person who was like, wow, that looks like that would be really helpful for sports and therefore I will figure out a way to synthetically create it.
[959] Again, it's exactly the same advantage, but we've decided that it's naturally honed talent through work and genetic lottery that we want to celebrate.
[960] Right.
[961] And that's the game that we've decided.
And Saturday Night Live did a really funny skit on this a number of years ago.
[963] First all -drug Olympics.
[964] Exactly.
[965] Rips his arms off.
[966] Exactly.
[967] And like the difference between baseball and schmaseball, like we could decide that really we just want to watch people hit home runs all day long.
And so we just want steroid baseball.
[969] Well, by the way, they've done this in bodybuilding.
[970] They're like, we don't give a fuck.
[971] Right.
[972] Go crazy.
[973] Let's see how big you guys can get.
[974] And you're like, oh, wow, they can do that.
[975] You think the Olympics or these sports organizations are abiding by a law.
[976] That's not what's going on.
[977] They can just test or not test.
[978] Yeah.
[979] There's no law that they have to test people.
[980] I think first of all, I just sort of want us to set sports aside and say, okay, we've created an artificial universe, we've decided what we want to celebrate, we could make the rules differently if we wanted to, but we have decided not to.
You know, you want to have rules about steroids in sports, fine, have rules about steroids in sports.
[982] It's just not the right analogy when it comes to the brain.
[983] Now, I do think people often, they're just not clear enough in their fears.
[984] I look at that and I go, it's dicey to be using methamphetamine.
[985] Yep.
[986] Okay?
[987] It's hard to just dabble in methamphetamine.
That's a very addictive drug, one of the most addictive, in fact. The recovery rate once you've shot methamphetamine is the lowest; it's like one in a hundred are going to beat it. That's problem number one. Problem number two: when does it stop? So you use it to study for this one test. Okay, I can buy that. So that you can get this grade, so that you can get this mark on your thing, so that you can get this job. Like, don't you realize you'll always have to do that? If the justification is to get this thing that gets you to another level, which will also require that thing, well then, by God, the thing's going to be required.
So I just think people aren't honest with themselves about the fact that you're not endeavoring on this just one time.
[990] That needs to be talked about.
[991] Totally agree with you and say, if we want to talk about a health justification as opposed to cheating, like let's have that conversation.
[992] And then let's focus on developing drugs that are not methamphetamines that are good.
Like, a lot of the nootropics are not methamphetamines, and they can enhance the brain.
[994] Modafinil is not a methamphetamine, right?
[995] It has a different mechanism of action.
[996] Maybe psychologically addictive, but not physically in the same way as methamphetamines are.
[997] We don't know because we don't study it for cognitive enhancement.
[998] Yes.
And I feel like if we can shift the conversation, because what I lay out in the book for cognitive liberty doesn't say you have an absolute right to self-determination.
[1000] To blow cocaine right before you.
[1001] I say it's a relative right.
[1002] I mean, you know, there's societal interest and there's individual interest.
And if everybody's going to be addicts that society is going to have to take care of, of course we can take those things into account.
[1004] But that's the conversation we should be having rather than it's cheating to enhance your brain.
[1005] And can I throw in an innocuous example that also serves our argument really well?
I think this one doesn't trigger people, and it's an identical thing, which is, people who audition for symphonies often take beta blockers to keep their heart rate at a level that doesn't shift their brain into midbrain thinking.
[1007] We're fine with that.
[1008] It's the identical thing.
[1009] So it's just weird that because one's an upper and one's a blocker, because those two things are in different categories in our mind, we treat it differently.
[1010] But no one would object to someone keeping themselves from panicking so that they can perform.
The chapter after this one, Breaking Your Brain, I write about propranolol, and I write about the fact that we should be treating them the same.
And I don't use the example you just used.
[1013] I use it as a way to extinguish fear memories.
[1014] It is a thing that people across the board are using for fear.
I used it to try to address PTSD. We can break our brains, we can speed up our brains.
And if we treat those as about self-determination over our brains and mental experiences, then we can shift the conversation to more useful ones.
[1017] Like, how do we develop safe and effective drugs?
[1018] What is a healthy level?
[1019] What is a healthy level?
How do we make sure that people don't feel pressured to take them forever, that there's a way to get off that train, or that if they do take them forever, they're healthy and safe to do so, right?
[1021] Yes.
Now, I've waited as long as I possibly could. Whenever I hear about any of this, my brain just goes immediately, as a recovering addict, to, oh my God, will we have the headset I can put on that gives me exactly what I want? It's a triple uptake inhibitor, I get the cocaine feeling, and then in 45 minutes it goes off, and then none of the downside and none of the craving. Are we going to be able to augment our electricity enough to have the perfect drug?
I hope so. You know, I don't know.
[1023] So I'll tell you what's interesting.
[1024] There's some cool research that's been going on with virtual reality and surgery, seeing whether or not they could use it in lieu of or to decrease the amount of anesthesia that a person needs.
[1025] And the hope initially was that it would be like the brain's own circuitry of pain medication.
[1026] That isn't the mechanism so far that has worked out.
[1027] What has worked out is people are super distracted and immersed.
But to me it seems, theoretically, like if we can go for a run and get endorphins.
[1029] Well, the chemistry sets in our head.
[1030] The drug doesn't add new chemicals to the brain.
[1031] But the problem is you do get addicted.
[1032] We like it.
[1033] It doesn't seem like a bad thing to have the endorphin rush and the craving to do so.
[1034] That's the hack.
[1035] You get it after you reach like 15 minutes of elevated anaerobic activity.
[1036] There you go.
[1037] Yeah.
[1038] So it's linked to a very positive thing.
[1039] And then you get an even better.
[1040] It's like a video game.
It's an unlocked level.
[1042] So you have to earn it.
And you can only earn it.
People would run themselves to death.
[1045] And maybe some of it is like cognitive fitness training too, right?
[1046] So you have to do 15 minutes of cognitive fitness training or meditation, right?
[1047] Things that bring your stress level down at the end of 15 minutes of stress reduction.
[1048] MDMA hits you.
For 18 minutes, or whatever number they figure out will do no harm.
[1050] And you can earn it per minute.
Yes, I mean, you bank it.
[1052] I like this line of research.
[1053] I think we should invest in the startups that are doing it.
[1054] Okay.
[1055] So we earmarked it.
[1056] But this is where I'm going to now go back to a little bit of what my actual fear is.
[1057] Part of me thinks people are like, oh, my God, the government's watching, blah, blah, blah.
[1058] One of my thoughts is like, they don't give a fuck about you, first of all.
[1059] No one's monitoring you.
[1060] No one knows.
[1061] Okay.
Also, we've had some cases where the NSA actually had tapped phone lines and stuff.
[1063] They can't even go through it.
[1064] They just gather all this shit.
[1065] They have no ability to synthesize it, look at it.
[1066] I mean, you're right.
[1067] You're right.
[1068] That's going to be helpful.
[1069] I worry a lot more about corporations.
[1070] And the misuse is different.
[1071] It's not the surveillance state in the government, although I worry.
I mean, look, I'm Iranian-American, and I worry about, you know, government surveillance.
Oh, they follow you like crazy.
[1074] Exactly, right?
They profile the shit out of me. I do worry about government use and misuse of the technology.
[1076] And I write about that.
But I think one of the things that really changed for me from 2018, when I gave that first TED talk, until I wrote the book and now, was a growing concern about corporate use and misuse.
It's insidious: the addiction to technologies and the neuromarketing and the micro-targeting and the revealing and commodification of information about us.
[1079] And a big part of the book is the worry about corporate use and misuse of the brain data.
[1080] Okay, so here is my thought about this.
[1081] Truthfully, do you think people will care?
Because so far, people have accepted that they have a device on them right now that records their every movement in time and space, and it records 100% of their financial, entertainment, and romantic choices. To have access to someone's phone is to know nearly everything about their behavior that is knowable.
[1083] And they don't give a fuck.
[1084] Like Orwell's Big Brother, all this stuff.
[1085] If you told people we're going to have Big Brother, that's scary.
[1086] But in practice, no one cares.
[1087] They're kind of saying they want their privacy, but they don't.
And weirdly, I've even had this experience with myself, where it's like, yeah, they know a lot about me. The ads I'm seeing, I actually like.
So this thing that we were so fearful of, these targeted ads, actually makes my life a little more enjoyable.
[1091] I have to watch all the ones I don't like.
[1092] So that's interesting.
[1093] I kind of like what I see now.
[1094] I just think if people have already become completely comfortable with that, I mean, you could already subpoena someone's phone.
[1095] There's no alibi.
[1096] Who has an alibi?
[1097] The phone knows everywhere you've been every second of the day and everything you did while you were in there.
[1098] We've already given that over and no one gives a shit.
[1099] Do you think we could possibly get people to care?
So first, technology gets normalized very quickly, and surveillance gets normalized very quickly. At least initially, I'm hoping people will care, and getting there before it is normalized is the reason that I'm sounding the alarm now. Because right now, when I talk to people, it truly scares the shit out of people: the idea that their mental privacy, their freedom of thought, the kind of last bastion, the one space where they thought, okay, fine, take my phone, take my GPS location, take everything else, because I can think anything I want.
[1101] Right.
[1102] And the idea that maybe not, maybe you can't think anything you want without other people knowing it, that really seems to touch a nerve when I talk to people about it.
[1103] But I worry because I see lots of examples already.
[1104] It touches a nerve when you tell them that.
[1105] That's right.
[1106] But I'm fearful that when it's happening, they won't care just like they don't care right now.
[1107] Their privacy practices do not align with their privacy preferences, right?
[1108] So what you do and what you say you want do not align well.
And I think the chilling dystopian effect of that when it comes to the brain is so problematic that I want us to get out ahead of it, because right now, when people hear about it, they care about it.
[1110] You're right.
[1111] Now is the time because they haven't been seduced by it yet.
[1112] That's right.
[1113] If we adopt a right to cognitive liberty now, it changes the default rules which we haven't done in the past, right?
The default rules always start in favor of the corporations and the government, and disfavor individual privacy and choices.
[1115] So we flip it.
And we say, okay, this technology is going mainstream this year, fine, we're going to do it with a different set of terms of service before we're addicted to it. And, you know, you say, like, I like the targeted advertisements. I have a three-year-old and an eight-year-old, and I feel sick to my stomach when my three-year-old wants to get my attention and a ding on my phone makes me check my phone, and she comes over to me and says, mommy, mommy, pay attention to me. The idea that the ding could invade that mental space that I have, and that I, as much as I am aware of it, still succumb to it, I hate that.
[1117] I want us this time to do it differently and not just because it's the next thing, but because it's the last thing, right?
[1118] It is the last and most important space, which is the space for internal rumination.
I hate to be this critical of people, but I think there are so many people I hear shouting about the privacy and the cookies and the being tracked.
[1120] But if you give them the option to pay $19 a month for Google, they're not going to do it.
So really, it's a complaint they have that they'll sell for $19 a month.
[1122] That's what's really fucked up about our relationship with technology is we already formatted it to be free.
[1123] I don't want people to have to pay for it.
[1124] It gets paid for by the marketing machine.
[1125] Agreed.
But I mean, I don't want it to be, you either pay $19 right now and you get freedom and privacy, or you get the commodification. I want to start with the default rules being, sure, maybe it's a little bit more expensive.
[1128] Neurotechnology is a little bit more expensive for all of us, but the default rules are in favor of people.
[1129] I want it to be frictionless for people to have cognitive liberty.
[1130] Yes.
[1131] That's what frustrates me about it.
I don't know if it's national or in California only, but we have this law now, right, where you go to a website and they have to at least give you the option to get rid of cookies. When that was put on the ballot and it passed, I was like, great, I can't wait to not have cookies. But then they got around that. They go, accept cookies, or go to manage, fucking, and then there's 55 switches, and then save. And then every time I go back to the site, it doesn't remember that. Magically, that's the one thing it can't remember, that I've already done this. And you go, oh, they just made it so impossible that effectively it's not even being enforced.
No, I agree. I don't want that to be the case. The examples I give in the book about the number of people who've already been willing to trade their brain data for some fun new gadget, toy, access to something, I think it's really likely we will go in that same direction.
[1133] I just think that the stakes are even higher.
[1134] And then minimally we should have law.
[1135] I mean, that's where we do rely on the government to protect us.
[1136] We as individuals are defenseless against that.
[1137] So we collectively agree the government's going to have to protect us from that.
[1138] This is a right from but also a right to.
So it's a right from surveillance of our brain activity by governments, by corporations, but it's also a right to the things that we've been talking about: the right to enhancement, the right to diminishment, the right to self-determination over your own brain.
And so I think that idea, that we have a positive right too, means making it frictionless for us to be able to access it.
[1141] Because if we don't write this legislation, then all that's going to happen is as these things get challenged, they're going to be flown under the banner of a previous bill of rights thing that just isn't comprehensive enough to possibly deal with this.
[1142] Like if that's what our fallback is to wait that these things get to the court and try to apply freedom of speech to it, it's going to be totally incomplete.
[1143] Yeah, and trying to claw back rights after they've already been given up.
[1144] I mean, how well is that going?
[1145] Oh, Nita, this has been really stimulating and encouraging and fearful.
[1146] Yeah.
[1147] Yeah, mixed messies.
[1148] We love that.
[1149] We love a mixed message.
Well, Nita, this has been so fascinating.
I hope everyone checks out The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology.
Like I said, I think we dip into it a little bit with that cheating thing.
[1154] There's a lot of fun philosophy in there.
[1155] There's a lot of fun legal stuff in there.
[1156] It's not just the tech.
[1157] It's not just about your brain.
[1158] It's a lot of other fun thought -provoking, stimulating examinations of this.
[1159] Thank you.
[1160] It's been a lot of fun having a conversation today with you guys about it.
[1161] Okay.
[1162] We hope to see you again soon.
[1163] Thanks.
[1164] All right.
[1165] Bye -bye.
[1166] Bye.
[1167] Stay tuned for more armchair expert if you dare.
And now my favorite part of the show, the fact check, with my soulmate Monica Padman.
[1169] You received, we received gifts from past guest Jake Gyllenhaal.
[1170] Do you want to tell folks what you got?
[1171] Yeah, I posted it, but if you missed it.
[1172] Yeah, I posted it today because it was.
[1173] Appropriate.
[1174] Yeah, we released him today.
[1175] Oh, don't say that, but yeah.
[1176] It's okay.
[1177] People, do they think we...
[1178] They don't like it.
[1179] Why?
[1180] Because it feels old.
[1181] They think we recorded this the day that we put it out.
[1182] They don't want to think about it.
It's just like on the Tonight Show; they're not trying to tell you they record it at 5 p.m.
They are; people know.
[1184] They avoid reminding you that it's noon in Burbank.
[1185] Okay.
[1186] Yeah, it's part of the, I think it's good to be thinking of.
[1187] Okay.
[1188] Sorry, that's my opinion.
[1189] We have different opinions on that, but that's fine.
Yeah, we don't have to lie, but we don't have to say, like, yesterday or Monday.
[1191] You know what I'm saying?
[1192] Okay, well, I posted a picture of it.
[1193] Okay, yes.
[1194] So in case you missed that, it was a mug.
[1195] Uh -huh.
[1196] Custom mug.
I love a mug.
[1198] Well, it seems custom, but maybe you can just buy it.
[1199] Okay.
[1200] I don't know.
[1201] Explain more.
[1202] Okay.
So the note said something like, The Row was too expensive, but this is close, or something.
[1204] And it's a mug of a lot of pictures of Mary Kate and Ashley.
[1205] Yes.
[1206] It's really exciting.
[1207] I thought it was so thoughtful.
[1208] You know what it said?
[1209] He paid attention.
[1210] He listened.
[1211] He remembered you were obsessed.
[1212] I know.
[1213] It's so nice.
Nice. And he sent me a mug too.
Yeah.
Yes. And mine says, sometimes you forget that you're awesome, so this is your reminder.
Yeah, and you need that.
Yes, but I was suggesting that, really, is that what I need? Or do I need a mug that says, I know you think you're awesome, don't forget you're also a piece of shit, just to keep you at a humble level, I think.
Six, seven times.
Six half dozen?
Is that what you mean?
[1216] Either or.
[1217] Same same.
[1218] Oh, I've never heard that before.
[1219] Six half dozen?
[1220] Yeah.
Six of one, half dozen?
People say, well, it's six of one, half a dozen.
[1223] Those are the same thing, right?
[1224] You don't know that expression?
[1225] I've never heard that.
[1226] Oh, yeah, welcome to Earth.
[1227] Do you like it here?
[1228] Is it?
[1229] Maybe that's regional.
[1230] Do you know it, Rob?
[1231] No, I don't know that.
[1232] Oh, my God.
[1233] Oh, my God.
[1234] Maybe it's an age thing.
[1235] I was going to say that.
[1236] I'm glad you said it, not me. It's in movies you guys have seen.
[1237] Which ones?
[1238] Like Dennis the Menace?
[1239] Citizen Kane, Casablanca, Maltese Falcon, you know, all the movies you guys are.
[1240] The Charlie Chaplin ones.
Rise of a Nation?
No, what was that terrible one?
[1243] Birth of a Nation.
[1244] Oh, yeah.
[1245] That one's racist.
[1246] It was like the first movie.
[1247] Yeah.
He's got Klansmen on horses.
[1249] Yeah, oh, oh.
[1250] You watched that in history class.
Film history or just history?
History history.
History of home?
No, history of America.
Oh, yeah. Or do we, do we tie up?
No, I want to keep talking about it. Yeah, I thought it was so thoughtful, and I also thought, man, it just says he's a good gift giver. And we've talked about this, like, okay, do we want to give him too much credit? Is he just having an assistant do this?
Right, right.
But he can't just have had an assistant do it, because the assistant doesn't know about the details of the conversation.
[1252] That's correct.
So he would have to minimally say, hey, can you get Monica something?
She's 5'1", 35 years old.
She loves The Row, particularly Mary-Kate and Ashley.
[1256] Go!
[1257] Yeah.
[1258] And that is great.
[1259] That's amazing, yeah.
[1260] So to me, that's a gift that, like, one of my best friends would give.
[1261] Yes.
[1262] So thoughtful.
[1263] Wait, Jake's one of your best friends?
[1264] No, as far as the fact that he's exceeding my thoughtfulness.
[1265] Well, he's not.
[1266] No, he is.
[1267] I'm happy.
[1268] Hold on a second.
I don't mind that people are better at things than me. And he is definitely better at that than me. He's exceeding my thoughtfulness for sure.
[1270] Like, I've never left an interview and sent anyone anything.
[1271] Right.
[1272] As you were saying it, I went to a very judgmental memory in my head.
[1273] And then I was debating whether that's something I even share.
[1274] And then if I do share, how do I own my own baggage?
[1275] Okay, let's hear it.
[1276] So we know a couple.
[1277] They were telling us the story of how they got engaged.
And the boy was talking about how he had, like, decorated the whole house with flowers and then put boxes of chocolate leading up the stairs, blah, blah, blah.
[1279] Then some trick at the airport, really an elaborate thing.
[1280] Okay.
[1281] And then as I was drilling into it, because I was just imagining to myself going through all that.
[1282] Like, man, that's like, that was like two days worth of driving around town and buying all this shit.
[1283] There were candles, right?
[1284] Some part of the story had candles.
[1285] And my thought immediately was like, huh, how would he have been at the airport and then arrived home and there were candles lit?
[1286] Either he lit them two and a half hours before he went to the airport.
[1287] The house would have been on fire, right?
[1288] So I just simply, it started with, how did you get the, oh, my assistant lit the candles?
[1289] And I go, oh, did she help you?
And come to find out, the assistant had done every single thing.
[1291] Right.
[1292] And so I didn't like that.
[1293] I'm like, you're like coasting on this notion that you're a huge romantic and you like presented, but you didn't do shit.
[1294] You know what I'm saying?
[1295] I kind of, now here's, I'm going to explore my own insecurities about it.
[1296] Okay.
Which is, I think probably I was on the attack because I've never done anything that thoughtful for somebody. So I was probably feeling defensive.
[1299] And then maybe even thinking, Kristen's listening to this, thinking like, God, well, he didn't.
We were in bed and I was like, hey, hold on a second.
[1301] I came back and I had a ring, which is what happened, right?
[1302] It was like real last minute.
[1303] Yeah.
It was really fly-by-the-seat.
[1305] It could have happened that night, could have not.
[1306] Who knows?
[1307] But you had a ring, so you had thought it out.
[1308] Yeah, I had had it for months.
But the point is, like, only...
Beautiful ring.
I saw it glimmering at the play yesterday.
[1311] Oh, you did?
[1312] From far away.
[1313] Like, it was sparkling from afar.
[1314] Yeah.
[1315] Oh, my God.
[1316] Oh, and I thought, oh, yeah, That's such a nice ring.
[1317] That's nice.
That's an old-ass diamond, too.
[1319] For it to be sparkling this long.
[1320] It's beautiful.
[1321] At any rate.
[1322] So maybe, you know, some defensiveness, but, and then, like, my class warfare stuff.
[1323] Sure.
[1324] I get this.
[1325] Mostly, I really don't like people getting credit for something they didn't do.
[1326] Yeah, same.
[1327] That's my whole thing.
[1328] And I was like, this whole marriage is built on a fraud.
[1329] That's a lot of judgment.
[1330] I know.
[1331] And I'm sure, like, let's put it this way.
[1332] Kristen heard the same story and heard there was a system.
[1333] She didn't think any of those things.
[1334] But I was like, this is doom.
[1335] This relationship's going to fall apart.
[1336] He's a fraud.
[1337] He's lazy.
[1338] He's taking credit for shit he didn't do.
[1339] I was very judgmental.
[1340] Well, I think I am definitely more on your side.
[1341] For sure.
With thoughtfulness, I am really triggered when assistants do things.
Outsourcing thoughtfulness.
[1344] Now, okay, back up.
[1345] First of all, we don't know anything about Jake, whether he even has an assistant.
[1346] Oh, no, no, no. This is incredibly thoughtful.
[1347] So we're just, we're carving him out of this.
[1348] Also, this is such an outlier because he doesn't owe us thoughtfulness at all.
So the fact that, even if he sent us, like, a random See's candy.
Or a Bed Bath and Beyond coupon.
[1351] Like that's super sweet just that he thought, oh, I should send them something.
[1352] That's why it was so thoughtful.
[1353] The fact that it had anything personal at all was very sweet.
[1354] Right.
[1355] He doesn't owe us any thoughtfulness.
But sometimes you owe people thoughtfulness, when you're in a relationship, or when you're best friends, or when you're something.
[1358] You know, I have this, too, trying to think how I can lay it out.
I've just been there a couple times where I'm like, how is this person getting credit when all they had was the idea, but they didn't do any of the execution?
[1360] And the execution is a hard part.
[1361] Let's be real.
[1362] Absolutely.
I tell people this all the time.
[1364] Can I just, this is another, we're on tangents now.
Maybe because we went so early on this one.
[1366] We're full of Vip and Vigger.
Vim and vigor. People think that they have a movie idea and that that's some, like, oh, I had this movie idea.
[1368] One in particular.
Like, I know that there was a lawsuit against the Wedding Crashers team.
And some guy was like, I had an idea about crashing weddings.
I wanted to pull him aside and go, dudes crashing a wedding isn't Wedding Crashers.
[1372] Yeah, exactly.
It almost has nothing to do with it. Wedding Crashers is Vince Vaughn at the pinnacle of his powers, and Owen Wilson at the pinnacle of his powers, and Dobkin directing the best he's ever. Like, the execution is Wedding Crashers.
[1375] Totally.
[1376] The fucking premise is nothing.
It doesn't, it's not worth anything.
[1378] No, I didn't.
[1379] With your hand.
[1380] You made that up.
[1381] You mimed it.
[1382] Don't listen to Monica.
[1383] That's not what I did.
[1384] I was, um, what was I doing?
I was shaking up, um, a tambourine?
I mean, a... No, I was shaking up a face lotion.
[1387] What were they called?
[1388] Maracas.
[1389] You're racist.
[1390] I was shaking up some kind of a face cleanser.
Don't call me racist at 10:30 in the morning.
You're the funnest to call racist.
Oh, we're not allowed to say it's 10:30 in the morning.
[1394] Sorry.
[1395] Oh, my God.
[1396] I knew this would be triggering for you.
Unless they're listening at 10:30 right now.
[1398] Oh, yeah.
[1399] Then it's exactly right.
[1400] On Thursday.
[1401] Okay, go on.
I just know, people, an idea for a movie is really, honestly, it's one one-thousandth of the ingredients.
I just think people think the money is too much of a factor, as opposed to, like, the execution sucks.
[1404] Like, that's why people don't do it.
[1405] If everyone had an assistant, people would be way more thoughtful.
[1406] It's just easier.
[1407] You just say, hey, like, go do something.
[1408] Make that person feel loved.
[1409] Yeah.
[1410] But it's the thinking about it.
[1411] It's the, I mean, you did that for Rob.
[1412] You've done it for me with the car.
[1413] You do do that.
[1414] I do it once every 10 years.
[1415] So, like, you guys are fucked.
[1416] Rob, you've got nine more years.
[1417] Oh, no. You're seven years off from something great again.
[1418] Oh, man. It's a long time.
[1419] I know.
Well, I got that sloth for Kristen like a decade ago, and I haven't done anything since.
[1421] That's not true.
[1422] I think it might be true.
You got her a Vespa.
[1424] Didn't she see it, and then you saw that she liked it, and then you got it for her.
[1425] That's very thoughtful.
And then she rode it and decided that it's not for her.
[1427] That's all that really happened.
[1428] Because she drove it straight into the Hellcat.
[1429] The Hellcat is just.
[1430] But it's back, and we love it.
[1431] Literally the first ride, like, she takes off from those steps.
[1432] Yeah.
And then I kind of think I hear her, like, oh, whoa, I hear some shit.
[1434] And then I, like, run around the corner, and she's, like, she's gunning it.
And she's on the brakes and her feet are down.
[1436] And she's going into the Hellcat.
[1437] Oh.
[1438] And then it all stops, and it's on its side just as it gets to the Hellcat.
[1439] And I was like, oh, my God, she's going to.
[1440] Okay, so she didn't know how to drive it.
[1441] Well.
[1442] Did she think she knew how to drive it?
She has ridden many a motor scooter, but it's been a long time.
[1444] Okay, she needs a refresh.
She did, and probably not in the driveway next to the Hellcat.
[1446] Okay, wow.
[1447] Anyway, it was still very sweet.
[1448] And you used to do sweet things like surprise her at the airplane.
[1449] Yes, yeah, before kids I was, I had a lot of energy.
[1450] Yeah, now you have to be, now I have to channel the energy.
[1451] Now I'm at plays on Sunday.
[1452] Same.
[1453] Yes, you too, you too.
[1454] Speaking of the play.
[1455] Yeah, so there was a big performance on Sunday of Matilda.
[1456] Yes, Lincoln School.
[1457] Yes, performance.
[1458] Musical, yep.
And there was a one o'clock showing and a five o'clock showing.
[1460] Yes.
And I went to the one o'clock and you went to the five o'clock.
[1462] You went to the matinee.
[1463] And I do wish I could have gone to both because I wonder.
If it got better?
[1465] Yeah.
[1466] Right.
[1467] Well.
[1468] I think I can just be honest.
[1469] I mean, it was.
[1470] It was so cute.
[1471] It was so cute and so sweet.
[1472] Incredibly cute.
[1473] And incredibly ambitious.
[1474] Very ambitious.
[1475] I just don't know why they didn't wait.
[1476] Get the mic straightened out.
[1477] Well, okay, so that's a separate.
[1478] I don't know why they didn't wait until the end of the year.
[1479] Like, they had more time.
[1480] Why they have to do it now?
[1481] I don't know.
[1482] It felt rushed.
Felt rushed.
[1484] And then also, I was angry at the adults.
[1485] Oh, tell me. Because an adult is running sound, not a kid.
[1486] and an adult is running the spotlight.
[1487] These adults are not doing a good job.
[1488] Well, hold on, though.
[1489] Here's the thing.
[1490] It's all volunteers.
[1491] It's like no one is professionally in stage management.
[1492] You should still take that seriously.
Maybe they got it figured out by 5 o'clock.
[1494] You could hear everyone talking.
[1495] Backstage, yeah.
[1496] And then breathing.
[1497] Yeah, yeah, a lot of breathing.
[1498] A lot of breathing.
[1499] It did make me...
You were just like, just turn off those mics.
[1501] It answered a great riddle for me, though.
[1502] Like if anything else, well, first of all, I couldn't have loved seeing Lincoln more.
[1503] Me too.
[1504] And I wanted her to be in both halves.
[1505] Yes.
[1506] I was mad she wasn't in the second half.
[1507] Yeah, yeah.
[1508] Because there were too many kids, so they did half.
[1509] A couple hundred kids.
[1510] But it was so cute.
[1511] It was so cute.
[1512] And I loved watching her at the beginning.
And when she got mad and she stomped and screamed.
[1514] She looked so beautiful.
At the 1 p.m.?
[1516] Yeah, no, I watched the video.
[1517] Yeah, no sound.
[1518] No sound.
[1519] Yeah, no mic.
Just her up there.
[1521] Yeah.
So she did have sound in mine, but that's not to say that, you know, whatever.
[1523] Again, volunteers.
[1524] God bless everyone.
[1525] It's hard, hard.
[1526] Okay.
[1527] You're much more lenient than me. I'm going to volunteer next year.
[1528] Okay.
[1529] Actually, I don't have time.
[1530] Yeah.
We'll get Jake Gyllenhaal's assistant to do it.
[1532] It answered a long curiosity of mine, which is when you go to Broadway, they mount the microphones for the performers in their hairline.
[1533] And it always confused me. I'm like, why is the fucking microphone up at the top of their forehead?
[1534] Yeah, that's where they put it.
[1535] They like kind of snake the cord through and it's like sitting at the top of their head.
[1536] Okay.
[1537] I didn't know that.
[1538] That was always perplexing to me. But after yesterday, I now know, yet if you don't mount it there, all you hear is this.
[1539] But no, that's not true.
[1540] The people can, if they're on top of it, they can be turning the mics on and off.
[1541] Well, absolutely.
[1542] Not the actor, the sound guy.
[1543] Of course.
[1544] The sound volunteer.
[1545] But additionally, even if that were the case, you have two people singing a duet.
[1546] These are little kids.
[1547] They're running around.
[1548] Their mics have to stay hot the whole time if they're a duet.
They're not going to be riding it.
[1550] Yes, that acutely.
[1551] Yeah, that's true.
And that kid who just sang is, like, out of breath. So even if it were flawlessly executed, it's still a bad location for a mic when kids are running and dancing.
[1553] That's really true.
[1554] Okay.
[1555] So that I left going like, well, I have an answer for that now.
[1556] Okay, you learn something.
[1557] We'd love to learn.
[1558] And the kids were so cute.
[1559] They were.
[1560] Everyone had an English accent.
[1561] I know.
They went for it.
That's what I'm saying, ambitious.
[1564] Like, I think if I were directing it, I would have said to the kids, all right, it's a play of English origin, but we don't need to do that.
[1565] I agree.
[1566] I wondered why they did that.
[1567] Because the story doesn't need to be in England.
[1568] Yeah.
[1569] Right?
[1570] Let's just play it like it's in America.
[1571] I know.
[1572] It's kind of like, is it appropriation that they're doing these things?
[1573] Thank God they're white.
[1574] Yeah.
[1575] We're all good then.
But yeah, I did think, wow, they asked them to do accents.
[1577] Yes, it's just, it was a lot on top of already, again, it's so ambitious.
[1578] A full play.
[1579] It wasn't just even like half the play.
It's a musical with dance routines.
[1581] I mean, there's a lot.
And they didn't have that long to learn it.
[1583] Right.
[1584] All that to say, yeah, it would have probably been like, guys, we got our hands full with the singing and dancing.
[1585] Because a couple of the kids really nailed it.
[1586] Yeah.
[1587] But for some kids, it gave them what I began getting curious about.
[1588] Do they have a speech impediment or is that, that's their...
[1589] Version of the accent.
[1590] It's getting so, like, hard to stay in the accent that they're...
[1591] That's how adult...
[1592] I wouldn't be able to stay in that accent.
[1593] Yeah, like, it was something like, I can't want my finger.
[1594] Go let's gay.
[1595] And I was like, what?
Is the character like... I'm just, I don't know what, you understand?
Right, yes, I was there. I was like, is that what's happening here? I guess it shouldn't matter, but there were a couple of performances that were just so... I didn't know if that was the accent or there was more going on offstage.
It felt like we were watching, like, a parody of a kids' play, of, like, kids not being ready.
[1597] Honestly, it felt like what my nightmares were before plays.
[1598] Right, yes.
[1599] Like that this, that could happen.
[1600] Kristen had one of the funniest jokes.
[1601] She had gone to the dress rehearsal on Saturday.
[1602] Yeah.
[1603] And she came home and I said, how did it go?
[1604] And she said, oh my God, they worked so hard and they're going to be completely ready in three weeks.
[1605] Yes, she said that to me too.
[1606] In three weeks.
[1607] Yeah.
[1608] Yeah.
[1609] But you know what?
[1610] That's the norm of an elementary school play.
[1611] Because I was actually racking my brain while I was in there.
[1612] I was realizing we didn't have plays in elementary school.
[1613] We didn't even have them in junior high.
[1614] They started in high school.
[1615] Yeah.
[1616] So they're really ahead of schedule.
[1617] Yes.
And, you know, there were, like, first graders in there.
[1619] Couple of those little girls were so cute.
[1620] Oh, my God.
[1621] Where they're like, yeah, they were so cute, tiny.
[1622] And that little boy.
[1623] Oh, the little.
[1624] Who was eating all the cake.
[1625] Yes.
[1626] He was so, he was good.
[1627] He was committed.
[1628] But what about the little girl in the all glittery outfit?
And she sounded like she was in the mafia.
[1630] Yeah, she was kind of going cockney.
[1631] But hers was like, let's get out of here.
[1632] Like she sounded like she was mafia.
And then I saw the woman in front of me was laughing really hard.
[1634] And I was laughing so hard at her.
[1635] Yeah.
And I leaned forward and I touched her shoulder.
[1637] I said, is that little glittery one yours?
[1638] And she said, yes.
[1639] And I go, oh my God.
[1640] I love her.
And she goes, yeah. I think she's the mom.
[1642] Yeah, she played the mom.
[1643] But it was so funny because she was tiny.
[1644] She was so tiny, but the mom in the first act looked so different than the mom in the second act.
[1645] And like all of a sudden you see this tiny girl.
[1646] So you were actually tracking the characters because I wasn't at all.
[1647] I was so lost.
[1648] I don't know who Matilda was.
[1649] Oh, God.
[1650] Yeah.
[1651] I couldn't figure any of it out, Monica.
You know, I have a hard time hearing, like, well-spoken lyrics on an album.
[1654] No, I know, same, I know.
[1655] So I was like, what?
[1656] I know.
[1657] I couldn't understand anything.
[1658] And then also, what scene is up?
Because people in the back had their mics fully on.
[1660] They're like, oh, my God, that was good.
[1661] What's next?
[1662] What scene?
Six C?
[1664] Backstage, you just hear them all fully talking.
[1665] And then the people on stage, you can't hear anything because their mics aren't up.
[1666] Yep.
[1667] And they're singing.
[1668] Someone forgot to turn those mics on.
And then the ones in the back are full-on.
Yeah, and towards the end of the 1 p.m., someone backstage was like, I know, we did horrible.
Oh, no, you could hear that on the phone? Oh, no. But I have to say, I far prefer that this was the outcome.
Like, this was like an episode of Parenthood.
[1673] Like, it's so much more memorable than everyone did a good job.
[1674] It was like, it was a bunch of children.
It was a fucking mess.
[1676] And it was delightful.
[1677] I did think, oh, my God, I wish so badly I was sitting next to Delta.
[1678] Like, I really wanted to know what her reaction would have been to this.
[1679] I can tell you what it was.
[1680] Yeah.
[1681] We hit the snack bar hard before we got inside.
[1682] She got a lot of stuff.
[1683] A bunch of different cheeses, some cupcakes, a bunch of different drinks.
[1684] I was left holding it all.
[1685] Yeah.
[1686] But she didn't sit with me. Oh, no. Yeah, I don't even know who she sat with.
[1687] She sat like six rows ahead.
[1688] Maybe Grandma, maybe Kristen.
I can't keep track.
Kristen was in and out.
There's, like, an area for kids up front.
Students eating.
[1693] Okay, she might have been up there.
All I know is, about every three and a half minutes for two hours, I'd see her walking up the aisle.
[1695] I'd see you're walking up the aisle.
[1696] No. I'd see you're standing at the end of my row.
[1697] She would be going, Daddy, need more snacks.
[1698] No. Yeah, and she wouldn't fucking take all of them.
[1699] Oh, she had seaweed.
[1700] Oh, my God.
[1701] Yeah.
Daddy, want more snack?
[1703] And then I'd pass it down all these people.
[1704] Oh, wow.
[1705] And then seven minutes later, she'd be back.
[1706] Daddy, want my cheese.
[1707] Oh.
[1708] So I don't know how much of the performance she caught.
[1709] Okay.
[1710] Mm -hmm.
[1711] Okay.
[1712] I thought sitting next to her would be quite funny because I had a feeling her reactions might be like, what?
[1713] Why are they talking?
[1714] Yeah, what's going on?
It would have been fun to watch her confusion.
[1716] Right.
[1717] Whose voice is that?
[1718] Yeah.
You have to know what's happening to unravel all the stimulus that was going on.
[1721] But all the kids were so fucking cute.
[1722] They were.
[1723] They were really, really, really cute.
And it was hefty.
[1725] It was two hours.
[1726] Yeah.
[1727] That is like asking a lot.
[1728] That is asking a lot.
I applaud these poor adults.
[1730] There were only three people.
Did you see when they were celebrating?
It seemed like there were only three people that had put that entire thing on.
And there were so many kids in that production, like 50 kids or something.
[1734] I know.
[1735] Costumes.
[1736] So I applaud them.
[1737] What an undertaking.
[1738] Me too.
[1739] So ambitious.
[1740] I thought my little girl was a little star, though.
[1741] I thought she looked so beautiful.
[1742] She did.
[1743] She looked so cute.
[1744] She was bossing people around.
Yeah, she did.
That happened a lot in the 1 p.m. She's telling people what to do in the middle of the play.
[1747] People didn't know.
[1748] Yeah.
People hadn't entered the stage.
[1750] Like some people were in the wings.
[1751] They were supposed to be on stage.
[1752] Yeah.
[1753] That part was so embarrassing for me. Because when I watched Lincoln.
[1754] Yeah.
[1755] It's me, you know?
[1756] It's me. She's up there, like, trying to get everyone to get in.
[1757] Well, I was laughing so hard at that.
[1758] And then I thought, yeah, this is interesting, right?
[1759] Like, I kind of think she thinks like this is her role here.
[1760] Mm -hmm.
[1761] A lot is going on.
She looks like a know-it-all, and there's a little arrogance to it.
[1763] And then so that part kind of, of course, embarrasses me. Sure.
[1764] Because she's an extension.
[1765] But I know.
I know what's happening in her head.
[1768] Yeah.
[1769] Which is like if that little boy doesn't go stand right there, they've already started the song.
[1770] This whole thing's going to collapse.
[1771] Like that feeling of panic, and for whatever reason no one asked you, but you have decided, like, if you don't save this, the whole thing's going to be like four songs off.
[1772] Right.
[1773] I know that feeling of like, oh fuck, this whole thing's going to unravel and no one seems to be helping.
[1774] Exactly.
[1775] But I think what's so funny is, and she'll learn this as plays go on, in an effort to save it, you're actually making it worse, you know, you're a part of the problem.
[1776] Because now everyone's looking at you arranging people.
[1777] You're taking everyone out.
[1778] Yes.
[1779] Yes.
[1780] It's a lack of commitment because now you're the director.
[1781] But you're up there.
[1782] Yes.
[1783] And so I'm having this whole thing.
[1784] And then I'm going, you know.
[1785] It's so sweet.
[1786] You know, she is trying to direct from within the play.
[1787] And then I realized that is what I've done in three movies, which is I'm in everything I direct.
[1788] Wow.
[1789] And I just keep the camera rolling and I'll go like, oh, great, that was great.
[1790] But you said this, and I'm in that action.
[1791] And I'm like, of course.
[1792] Of course, of course she thinks she should be directing from within and also performing.
[1793] And then I just thought, you know, this is who we are.
[1794] And it'll be a whole journey.
[1795] Yeah.
[1796] And she'll watch a video maybe of this.
[1797] And it'll occur to her like, oh, wow, that was a little distracting.
[1798] And I was shouting out all the blocking.
[1799] Yeah.
[1800] Also, because in ours, this boy, that really cute boy, his mic pack, like, dropped out from under his pants.
[1801] Yeah, same in the second.
[1802] He was holding it this time, though, but I heard it hit the ground indoors.
[1803] Yeah.
[1804] And she was like, pick it up.
[1805] Like, she was telling him, but he, and it was, because he was doing what he should have done which is like play it very cool and act like ignore it ignore it and then he had to be like I'm trying oh they got a little fine a little like just for a second he had to be like I'm trying and then like go back to his lines and you know it was it was so funny the whole thing it was like that Christmas pageant book what's it called um what's that hilarious book about a Christmas page yes Bernstein Bears Christmas pageant the best Christmas pageant yes Hmm.
[1806] Okay.
[1807] You don't know that one?
[1808] I don't know that one.
[1809] Oh, my God.
[1810] One half a dozen.
[1811] It's so good.
[1812] And it's like this like rag tag family.
[1813] I don't remember.
[1814] You know, being a parent is really so interesting, because I'm self-aware enough to know a lot of parents are watching this interesting direction.
[1815] And they're so annoyed.
[1816] This what?
[1817] Oh, interesting direction.
[1818] I don't think they are.
[1819] Oh, I think so.
[1820] I think people are like, oh, that little girl's bossy.
[1821] Or that little girl's a know -it -all or whatever.
[1822] Well, there's a billion people doing a billion things.
[1823] I don't think she could have been picked out.
[1824] We are looking at her.
[1825] We have to remember that, too.
[1826] Well, and I'm seeing other little kids who are standing out.
[1827] And they're standing out in different ways.
[1828] Yeah, I guess.
[1829] I mean, some kids are lost.
[1830] Yes.
[1831] That'll catch your eye.
[1832] You know, someone's clearly standing somewhere.
[1833] They were supposed to be off stage a while ago.
[1834] But this mirrors, remember I told the story about Lincoln talking baby talk when we three, when we were in Miami, and too many people were talking, and I was embarrassed, and I realized, oh, that's just my own embarrassment. Yeah. She's free to do whatever. That same situation just comes up over and over again when you have kids, because they're an extension of your identity in this bizarre way. And so I'm foreseeing the worst possible estimation of her, protectively. Yeah. So then me, at least, I can't speak for any other parent, we started thinking, how am I gently going to bring this up over time to her? Yes. It's like, it's really great you were on top of it, but also, when we see you do that, it kind of takes us out of the thing. Right, right. And then I'm like, I'm a little distracted now for the next 15 minutes thinking about how I'm possibly going to bring this up. Right. And then a greater voice in my head at some point goes, you're not going to do that. You're going to let her live her life, and kids will be annoyed with her, or they'll like her, or she'll get feedback from her peers, and she will adjust, just as I've learned to adjust. And that's just how growing up is. And I have to have a ton of tolerance for her to go on her journey through life and not intervene. But my pride, vanity, ego, all those things want to correct her to make her perfect. Yeah. And I have to have faith that, like, no one pulled me aside and corrected my behavior. I still have a level of that. Like, I tip into exactly that, what we saw on stage, as a character trait. I'm going to start directing anything, or calling out, you know. Like, I see it in myself, and I'm sure it's off-putting at times. Globally, I'm getting by just fine. Mm-hmm. Does that make any sense?
[1835] Yeah, yeah, yeah No, that makes sense.
[1836] I'm like, she's smarter than me And she will, by the time she's my age have figured it all out.
[1837] I don't need to, but it's, It's so tempting to want to...
[1838] Totally.
[1839] Yeah.
[1840] I get that.
[1841] I think it's probably different philosophies of parenting, right?
[1842] Like, I think some...
[1843] My parents told me. They pulled you aside and they're like, hey, don't do that.
[1844] You seem like a brat.
[1845] Yeah.
[1846] Like, they would have said, hey, don't do that.
[1847] Don't be bossy on the stage.
[1848] Yeah, like that seemed bossy.
[1849] People don't like that.
[1850] Like, they would have just straight up told me. Yeah.
[1851] And, you know, I think there's a lot of negative to that, but I also think, for me, I'm somewhat appreciative of it.
[1853] Here's my question.
[1854] Did you listen to them or did you actually change your behavior based on what your peers, the feedback your peers gave you?
[1855] Or even the subtext of what they were giving you.
[1856] I think both.
[1857] I mean mainly peers, right?
[1858] That was dictating.
[1859] I think that's the force, the main force in your life.
[1860] I guess it depends on... because it depends.
[1861] It totally depends.
[1862] I mean, like, even, I just remember my dad with school work.
[1863] Like, I just as an example.
[1864] At one point, he was like, Monica, you're smart, but you make a ton of careless mistakes.
[1865] Mm -hmm.
[1866] And you're a piece of shit.
[1867] No, but he was mad.
[1868] Like, he said it like that.
[1869] Like, he was mad that I was, I was just being absent -minded.
[1870] Yeah.
[1871] And he's like, there's no reason for this.
[1872] It's not that you don't understand.
[1873] So fix that.
[1874] Right.
[1875] And I did.
[1876] Yeah.
[1877] Like kind of as soon as you said that, I was like, that's true, I think.
[1878] It's like I'm just not paying attention or I'm not.
[1879] And I should just take a little bit of extra time.
[1880] And that is a huge piece of my personality now.
[1881] Yes.
[1882] Yeah.
[1883] I think for me it's really an issue of patience, which is I need to have my advice planned out, well thought out.
[1884] It's having the patience to, if at some point she's flummoxed by the response of her peers and she asks me, or she's lamenting to me, yeah, I can join her and relate to her. Yeah, yeah. And then at that point, maybe have... But so it's just like, well, if that needs to be talked about and it's an issue in her life, yeah, hopefully it'll come up, and at that time will be my time to give my two cents, when asked. Yeah.
[1885] I don't know.
[1886] Parenting's a new, fun experience because it's ever-evolving.
[1887] There's new dynamics coming up all the time.
[1888] There's new situations and it's not obvious what you should do.
[1889] So I like it.
[1890] Good.
[1891] I like it.
[1892] Okay.
[1893] So this is for Nita.
[1894] So, okay.
[1895] Anne Heche.
[1896] Yes.
[1897] She passed.
[1898] Yeah, and I didn't know that until you said it.
[1899] She died in 2022.
[1900] Right.
[1901] It's very sad.
[1902] Yeah.
[1903] She drove through someone's house?
[1904] Yeah, she crashed her car into a house.
[1905] And then she was in a coma.
[1906] And burned.
[1907] Oh, man. I know.
[1908] But yeah, she was declared brain dead.
[1909] Poor people whose house she crashed into.
[1910] Yeah.
[1911] Oh, scary.
[1912] Okay.
[1913] Has Walt Disney been cryogenically frozen?
[1914] His daughter says no. Really?
[1915] Mm -hmm.
[1916] She says the rumors are false.
[1917] Oh.
[1918] She says, whatever you read on the internet, he wasn't frozen.
[1920] Huh.
[1921] And he's not buried underneath the Pirates of the Caribbean ride at Disneyland.
[1922] I don't believe that.
[1923] I think he would have been way too long ago, right?
[1924] Or maybe not.
[1925] To be cryogenically frozen?
[1926] No, I think Lenin is cryogenically frozen at the Kremlin.
[1927] Will you look that up, Rob?
[1928] Well, I have a list of people.
[1929] Oh, you do.
[1930] Oh, great.
[1931] Famous people.
[1932] Living people who have been, living people who plan to be cryopreserved, and deceased people who have been cryopreserved.
[1933] Great.
[1934] I'll do deceased.
[1935] Okay, correct.
[1936] James Bedford, he's an American psychology professor at UCLA.
[1937] Do you remember him?
[1938] Yeah, took all of his classes.
[1939] I'm the one who said to him, you know what?
[1940] You should get cryogenically frozen when you pass.
[1941] Wow, good job.
[1942] He still is working there because he's frozen?
[1943] Robbie.
[1944] Okay.
[1945] He wrote several books on occupational counseling.
[1946] He's the first person whose body was cryopreserved after legal death and remains preserved at the Alcor Life Extension Foundation.
[1947] And this was in 67.
[1948] Oh, my goodness.
[1949] This is one of these tricky things.
[1950] It's like, try to pick a company from 1967 that's still around.
[1951] I don't know what the attrition rate is, but it's high for companies.
[1952] Right.
[1953] So what are the odds that you're going to, you know, you're going to pay for this cryogenic freezing and that the place is still going to have a refrigerator in 35 years?
[1954] I can just see, at some point, them going, like, well, we filed bankruptcy, we're going to put all these bodies in the trash, we don't know what to do, we're out of business. Someone will buy it. Right, buy the bodies, buy the company. I don't know. Not if there's no business. Oh, God. I don't know. Okay. Dick Clair, not Dick Clark. Oh, yeah. Um, he was an American television producer, actor, writer: It's a Living, The Facts of Life, and Mama's Family.
[1955] Oh, and he's cryogenically frozen.
[1956] Okay, great.
[1957] Robert Ettinger.
[1958] I mean, if I'm going to...
[1959] Are there any famous people?
[1960] They're all famous.
[1961] Oh, okay.
[1962] Any famous people that we've heard of?
[1963] Oh, my God, there's someone named FM 2030.
[1964] He's a Belgian -born, Iranian -American, ding, ding, ding.
[1965] He has numbers in his name?
[1966] Author, teacher, transhumanist philosopher, futurist, consultant, and Olympic athlete.
[1967] Holy schnolly.
[1968] Wow.
[1969] Wow.
[1970] Okay.
[1971] No one I know.
[1972] Ted Williams.
[1973] Do you know Ted Williams?
[1974] He's a professional baseball player.
[1975] Yeah.
[1976] Okay, yeah.
[1977] Okay, great.
[1978] Ted Williams.
[1979] We finally got one that I know.
[1980] Okay.
[1981] I don't think you know.
[1982] Any of them.
[1983] These other people.
[1984] Vladimir Lenin?
[1985] It looks like they...
[1986] I think he's on display, right?
[1987] Yeah, yeah.
[1988] Yeah, you can see his frozen corpse.
[1989] And they have to, like, re-soak him every year.
[1990] Ooh.
[1991] It would be an interesting career to have.
[1992] Yeah, it looks like it's funded by the government.
[1993] James Bedford.
[1994] This other list is the same.
[1995] Thomas Donaldson.
[1996] Discontinued financial support.
[1997] Oh, so now he's warming up.
[1998] But then in 2016, the Russian government reversed its decision.
[1999] And then it planned to spend 13 million rubles to preserve the body.
[2000] That's, like, 80 bucks.
[2001] Weird that he's not on any of these lists.
[2002] Yeah.
[2003] Yeah, maybe because he's not in America.
[2004] No. I don't think that's why, but.
[2005] Someone else is on display there, too?
[2006] I mean, I think now I really might be thinking of science fiction, but I want to say that Stalin was also, but then he obviously didn't age well with time, having killed more people than Hitler.
[2007] So I think they...
[2008] No, Joseph Stalin's embalmed body shared a spot next to Lenin.
[2009] Oh, okay, I didn't.
[2010] From the time of his death in March 53.
[2011] but he was removed as part of the de -Stalinization.
[2012] Talk about the full cancellation.
[2013] Like, when they get rid of your corpse, you've been canceled, canceled.
[2014] When they remove you from the...
[2015] This says 56 days after Lenin's death, the decision was made to preserve Lenin's body permanently so people could continue to visit.
[2016] Initially, the body was going to be preserved by deep freezing.
[2017] However, two eminent chemists, Russian names, suggested preservation by chemical embalming instead.
[2018] So maybe he's not frozen, but just preserved.
[2019] Embalmed.
[2020] Yeah.
[2021] Oh, wow.
[2022] Okay.
[2023] That makes sense why he wouldn't have shown up on these.
[2024] He's not frozen.
[2025] It has to be really cold in there, so he's not decomposing.
[2026] They won't, I don't know.
[2027] We don't know.
[2028] Okay.
[2029] Can some people predict their seizures?
[2030] Oh.
[2031] It's called an aura.
[2032] People with partial seizures may experience the following signs seconds or minutes before the actual seizure.
[2033] Unusual smells, taste, sounds, or sensations.
[2034] Nausea, a deja vu feeling.
[2035] Yeah, and it's called an aura.
[2036] So some people can.
[2037] Predict their seizures.
[2038] Yeah.
[2039] And what will they do? Go lay down?
[2040] I would go lay down in a bed.
[2041] Yeah, or maybe you tell someone.
[2042] Or I'd lay on the floor.
[2043] You're supposed to, I think you're supposed to be on your side.
[2044] Okay, I would lay on my side on the floor.
[2045] Okay.
[2046] Now, if I were you, I would quickly go peepee, then I would lay on the floor.
[2047] Oh, yeah.
[2048] Yeah.
[2049] I don't know about timing.
[2050] It happens so fast.
[2051] How much time do we have?
[2052] Some people have seconds or some minutes.
[2053] Minutes?
[2054] Like a life alert thing.
[2055] I do get, yeah.
[2056] Well, it matters, because some people die, because they'll, like, throw up and stuff.
[2057] So I think it's good to tell someone so that they can make sure.
[2058] And then I think that's also why you should be on your side.
[2059] That's part of it.
[2060] In case you aspirate.
[2061] Yeah.
[2062] Aspirate?
[2063] I think so.
[2064] What a bad way to go.
[2065] It makes me think of the word sputum.
[2066] Hmm.
[2067] Okay.
[2068] You know, when you're in the hospital and they want to check your, your spit.
[2069] They call it your sputum.
[2070] Oh, really?
[2071] I just remember, when I'd be in there with my dad, that occasionally a nurse would come in.
[2072] Okay, Mr. Shepard, time for a sputum sample.
[2073] Oh.
[2074] Oh.
[2075] She had this, like, plastic cylinder tube, kind of like a test tube, but plastic.
[2076] And my father would have to, you know, put his sputum in there.
[2077] Kind of like how one of the COVID tests was spit.
[2078] Sputum.
[2079] Yeah, spot him.
[2080] Yeah, spot him.
[2081] Speaking of Russia, Sputnik.
[2082] Okay.
[2083] Stephen Hawking, 15 words per minute is correct.
[2084] That's what he could articulate.
[2086] Uh -huh.
[2087] And she's saying that now they're working on things that are 60 words per minute.
[2088] It's so incredible.
[2089] That's crazy.
[2090] Mm -hmm.
[2091] She also writes, according to a Digital.com study of 1,250 companies, 60% use bossware to track the activity and productivity of remote workers.
[2092] Now, is that, did you do that?
[2093] And that's how you knew about Xantham?
[2094] Is it all coming out?
[2095] Are you part of the 60?
[2096] I don't know.
[2097] Okay.
[2098] But, you know, you should definitely try to copy people who figure out what works. You know, that's a rule of thumb if you're running a company.
[2100] Separately and in general.
[2101] Yeah.
[2102] Well, I had Wabi Wob install all the stuff.
[2103] You know, I can't operate a computer very well.
[2104] Right.
[2105] Right.
[2106] But then I really paid attention while Wobby was doing it to your computer so that I could then do it to his when he wasn't paying attention.
[2107] Great.
[2108] Then I did it to mine just to make it fair.
[2109] Okay.
[2110] Meth.
[2111] Methamphetamine?
[2112] Yeah.
[2113] Mm -hmm.
[2114] Because you said you thought if you do it once, there's a one in a hundred chance you'll get addicted.
[2116] Is that what I said?
[2117] Yeah, I couldn't really find one in a hundred.
[2118] I can tell you one statistic about cocaine.
[2119] Okay.
[2120] That was in a Gladwell book.
[2121] Okay.
[2122] And it was that a high-90s percentage of people who try cocaine do not get addicted.
[2123] Yeah.
[2124] There's a stat around meth that's like 60 % will relapse in the first year.
[2125] That's what...
[2126] If you're already an addict.
[2127] And I've said this before, but meth has different recovery rates if you're snorting it, smoking it, or shooting it.
[2128] But if you're shooting meth, it has like the lowest recovery rate of any substance.
[2129] Yeah, and it like really changes your brain fast.
[2130] Oh, yeah, yeah, yeah, yeah, yeah, yeah.
[2131] If anyone watched that movie, Beautiful Boy.
[2132] Oh, boy.
[2133] Oh, what a beautiful movie.
[2134] Yeah.
[2135] And you see him in that movie, played by Steve Carell, learning about that.
[2136] Like, how many months in, and then how long it takes to get back to normal, if ever. Yeah. Well, the saying in A.A. is, you know, how long were you walking into the woods? It's going to take a while to walk out. It's going to take more than it took to come in. Well, thank God it takes less. Like, it does. Yes, yes. I've witnessed, really, dozens of times at this point, in 20 years of going to meetings. Yeah. Dozens of people where, when they entered, I was like, this person fried their brain. You know, like, I don't know how this person returns. And I've seen them return. Wow. Yeah, I've seen it many times. Another bad one, and it's so innocuous, we bring it up a lot, but benzo addiction is horrendous on your brain. Coming off that takes a real long time for your brain to come out of. Like, a year sometimes for people's brains to kind of return to normal on that one. All of these drugs have a reputation, like human beings do, and they're not necessarily objectively evaluated. Right. It's like, we have things that are scary. Xanax isn't scary because a doctor gives it to you. Well, it is scary to me. If people are abusing it, that's my point. If you were to measure opiate deaths from prescribed medicine versus cocaine deaths, yeah, it's not even, it's not even a hundred to one. You know, it's thousands to one. Yeah.
[2137] And then you look at the damage of this benzodiazepine addiction, which is epidemic level, and people now getting what they think are benzos, but they're fentanyl, and dying.
[2138] It doesn't have this big, gnarly reputation like crack does, but it should.
[2139] That's my point.
[2140] Taylor Swift, Miley Cyrus.
[2141] Well, let's not drag those two into this.
[2142] I didn't say who was who.
[2143] I'm just saying packaging.
[2144] Okay.
[2145] This, let me see how long it is.
[2147] This is a video.
[2148] Video portion.
[2149] This is an ad.
[2150] In response to what its sponsors claim is an idea whose time has come.
[2151] The first All-Drug Olympics opened today in Bogota.
[2152] This is old.
[2153] Oh, yeah.
[2154] Athletes are allowed to take any substance whatsoever before and after and even during the competition.
[2155] So far, 115 world records have been shattered.
[2156] We go now to correspondent Kevin Nealon, live in Bogota for the weightlifting finals.
[2157] Dennis, getting ready to lift now is Sergei Akhmudov of the Soviet Union.
[2158] His trainer has told me that he's taken anabolic steroids, Novocaine, NyQuil, Darvon, and some sort of fish paralyzer.
[2159] Also, I believe he's had several cocktails within the last hour.
[2160] Sure.
[2161] All this, of course, is perfectly legal at the All-Drug Olympics.
[2162] In fact, it's encouraged.
[2163] Akhmudov is getting set now.
[2164] He's going for a clean and jerk of over 1 ,500 pounds, which would triple the existing world record.
[2165] That's an awful lot of weight, Dennis, and here he goes.
[2167] His arms have just snapped off and remained attached to the bar.
[2168] He's pulled his arms off. That's a tough one for the big Russian.
[2169] That's a towel.
[2170] I think tomorrow he's really going to feel that, Dennis.
[2171] Back to you.
[2172] Thank you, Kevin.
[2173] That's funny.
[2174] Dennis Miller with the fucking sweetest hairdo of all time.
[2175] That hair is crazy.
[2176] Big and large and in charge.
[2177] Big time.
[2178] Yeah.
[2179] Well, was it '89, '90?
[2180] I mean, it's on here nine years ago, so, but it's Soviet Union, so definitely.
[2181] 3.89.
[2182] That was it?
[2183] That was it.
[2184] Way to take us out on a joke.
[2185] That's what you want to do.
[2186] I try.
[2187] Really good job.
[2188] Really good timing.
[2189] You really should get involved in next year's play.
[2190] Yeah.
[2191] All right.
[2192] Musical.
[2193] I mean, the sound, maybe Rob, you can help with that part.
[2194] Well, let me also.
[2195] One more defense of them.
[2196] Okay.
[2197] There were so many microphones.
[2198] So many kids were mic'd.
[2199] I mean, there's probably 25 mics or something.
[2200] I don't know how anyone could have kept track.
[2201] This is why when I was in high school and I did high school theater, we didn't have mics.
[2202] I mean, maybe the stage might have had mics.
[2203] Right.
[2204] That's probably the move is for them to put mics on the front of the stage.
[2205] And then the whole, you're supposed to learn how to project.
[2206] That's like part of theater.
[2207] But again, you're talking about high school, and these are first graders.
[2209] Yeah, but they need to learn.
[2210] If they want a career, this is Los Angeles, okay?
[2211] Okay, I have high standards.
[2212] Okay, I'll tell them.
[2213] I'll tell everybody.
[2214] But I will just say, in the groundlings for our Sunday show, you know, we had a six -hour tech rehearsal before every show.
[2215] Really?
[2216] Yes, because there was all new sketches.
[2217] But you didn't have mics, did you?
[2218] No, but just the lighting.
[2219] Oh, lighting, yeah, yeah.
[2220] Right?
[2221] So, like, we would have six hours for something way less complicated, way less performers, 13 of us, not 60.
[2222] You know, if there was a microphone, there's two handhelds.
[2223] Yeah.
[2224] And even that got a big, long tech rehearsal.
[2225] We ran through every sketch in the thing, and it was six hours.
[2226] No, in college, you have a week or two of tech.
[2227] Right.
[2228] So these people had Saturday.
[2229] And it was everything.
[2230] It was a dress rehearsal.
[2231] It was only a couple hours.
[2232] Yes.
[2233] Yeah, I need them to.
[2234] But I want them to just be a tiny bit more realistic.
[2235] Like either just do condensed.
[2236] They should have done something much simpler.
[2237] Condensed play.
[2238] Condensed milk play.
[2239] Maybe a different mic sitch.
[2240] Just mic the stage.
[2241] Or spend all year and have a two -week tech.
[2242] Like, do it real.
[2243] And they probably couldn't afford that.
[2244] I bet the budget to come in and teach this production was what it was.
[2245] And they probably already didn't make shit.
[2246] They did an outstanding job with three human beings.
[2247] and 50 kids.
[2248] They did.
[2249] Yeah.
[2250] I don't even know what I'm saying.
[2251] What I'm saying is I'm not here to say anyone did a bad job.
[2252] No, everyone did.
[2253] Everyone did a great job.
[2254] Everyone did their best.
[2255] Okay.
[2256] I love you.
[2257] I love you.
[2258] And I loved the play.
[2259] Yes.
[2260] I love seeing Lincoln and Delta up there so much.
[2261] And it was just so cute.
[2262] Kids are so cute.
[2263] Oh, they are.
[2264] But adults, I have problems with.
[2265] Well, sure.
[2266] All right.
[2267] Love you.
[2268] Follow Armchair Expert on the Wondry app, Amazon Music, or wherever you get your podcasts.
[2269] You can listen to every episode of Armchair Expert early and ad free right now by joining Wondry Plus in the Wondry app or on Apple Podcasts.
[2270] Before you go, tell us about yourself by completing a short survey at Wondry .com slash survey.