Armchair Expert with Dax Shepard
[0] Welcome, welcome, welcome, welcome to armchair expert.
[1] That's got to be too dissonant, right?
[2] You're expecting one thing.
[3] Welcome, welcome, welcome.
[4] What do you mean?
[5] When you get welcome to armchair expert, you're not expecting four welcomes.
[6] It's kind of like when you see a band you love and they play the song live, but they don't play it the way they recorded it.
[7] And you're like, I know, I want to hear how you recorded it.
[8] Okay.
[9] You know what I'm saying?
[10] Vaguely.
[11] I said, welcome, welcome, welcome.
[12] I threw an extra welcome in.
[13] I don't like that.
[14] No, it's terrible.
[15] Okay.
[16] Take two.
[17] Welcome, welcome, welcome to Armchair Expert.
[18] I'm Dax Shepard.
[19] I'm joined by Monica Padman.
[20] Much better.
[21] The Padman.
[22] Today we have, I think, a very relevant guest, given that we just watched The Social Dilemma on Netflix, which I urge everyone to watch.
[23] Not a sponsor.
[24] Not a sponsor.
[25] Not receiving a fucking dime from Netflix.
[26] But it's an important watch.
[27] It really is.
[28] It's very disturbing and in a very good way for us to think about.
[29] But this dovetails nicely into Leah Plunkett's work, which is about sharing your children's lives on social media.
[30] She has a great book called Sharenthood: Why We Should Think Before We Talk About Our Kids Online.
[31] In that book, she examines the implications of adults' excessive digital sharing of children's data.
[32] And Leah Plunkett is a faculty associate at the Berkman Klein Center for Internet and Society at Harvard University and an associate professor at the University of New Hampshire School of Law and a mother of two.
[33] And we just loved her.
[34] Yes.
[35] So listen up.
[36] If you've got kids or even if you don't, it still kind of applies to us.
[37] They're gathering that data.
[38] They sure are.
[39] And they're smarter than us.
[40] The special horror version of Armchair Expert.
[41] Wondery Plus subscribers can listen to Armchair Expert early and ad-free right now.
[42] Join Wondery Plus in the Wondery app or on Apple Podcasts.
[43] Or you can listen for free wherever you get your podcasts.
[44] God, Rob's like, what are the people at the Mac store called, the Geniuses?
[45] Leah, do you have any computer questions you want Rob to solve while we're here?
[46] Rob has already been helping me out.
[47] So, no, I think I'm in good shape.
[48] I've got my telemarketer headset.
[49] I'm recording.
[50] I am in love with your headset.
[51] I confess to Rob.
[52] I mean, this is fitting for the topic.
[53] So last night, I went scrambling around to try to find
[54] my really nice home podcasting kit that I had purchased for podcasting, and I found part of it, and the other part of it seems to have been taken by my five-year-old, and I cannot find it anywhere.
[55] So hence my really awesome telemarketing headset.
[56] It harkens back to like a 90s commercial for like AT&T where it would be like, hi, Becky, how can I help you?
[57] It's that it's white too, I think, is part of it.
[58] Yeah, it's really popping.
[59] It's enormous.
[60] You know, what's the principle in tech, right, where everything gets smaller?
[61] There's some term for it.
[62] Yeah, I forget what it is.
[63] You probably know it.
[64] You're smart.
[65] I'm not smart when it actually comes to using technology.
[66] I am smart when it comes to thinking about privacy and how technology is owning us.
[67] But the miniaturization does happen, and I'm totally going very, very old school.
[68] Yeah.
[69] You're bucking the system and we appreciate it.
[70] Well, thank you.
[71] Now, could you explain to me as I read about you, you seem to be working for two different colleges simultaneously.
[72] Is that the truth?
[73] I no longer am.
[74] I was briefly.
[75] So I am on leave from University of New Hampshire, Franklin Pierce School of Law, in Concord, New Hampshire, which is where I'm currently located.
[76] And I am visiting remotely at Harvard Law School for the year.
[77] Okay.
[78] I am the special director for online education, for which I get the snazzy headset.
[79] And actually, you know, this is my headset.
[80] Harvard didn't give this to me. Harvard has real technology.
[81] and I'm the Meyer Research Lecturer on Law.
[82] So I am technically on leave from UNH, but in the nice way of academics, I have my professorship there when this year is over.
[83] Okay, yeah, they seem to be pretty generous in that way, right?
[84] As far as other employers go, there seems to be a little bit of loosey-goosiness.
[85] If people have a cool opportunity, they seem to allow that.
[86] Definitely, definitely.
[87] And UNH, we had the good fortune, slash planning, slash just dumb luck of starting a hybrid online residential law program before the pandemic.
[88] Oh, okay.
[89] We started in 2019 and got it set up to run mostly online.
[90] And so Harvard Law School decided in June of this year in response to the pandemic to be fully online in the fall.
[91] So the dean invited me to spend a year virtually with them.
[92] I am still in the same alcove of my guest room or corner of my porch as I was when I was at UNH.
[93] So I haven't actually gone anywhere, but we are working very, very hard with a wonderful team of technologists and learning designers to have a fully online Harvard Law School curriculum this fall.
[94] And it is awesome.
[95] And are you as flattered as I would be, just ego-wise, to be under the employ of Harvard?
[96] Yes.
[97] Yeah, good, good, good.
[98] You should be.
[99] Totally.
[100] It should last at least, because this was June, right?
[101] So it should last at least three months before you get disillusioned.
[102] I think it's going to last even longer than that.
[103] I have no false modesty about this one.
[104] I am thrilled.
[105] And it really is right now at Harvard, like, watching a bunch of folks who've been doing Broadway hits for a long time, all of a sudden transitioning to the screen.
[106] And they are consummate performers and international experts.
[107] And they know their craft.
[108] And they are right there with all of us in pandemic life of, okay, well, we know our craft and we knew how to do it in this space.
[109] And now we're doing it in a whole different space.
[110] And in some ways, that's the same.
[111] In other ways, it's very different. And it's a humbling and exhilarating experience, and at times very funny, too, in a good way. Well, this is neither here nor there, but you would be shocked to go around Hollywood and, like, pitch projects at the owners of media, right? Like, you'll go to Warner Brothers or you'll go to Netflix, and all you want to do is get something on the screen in the conference room, and it's laughable how few people can get that done, you know, at the epicenter of where that should happen. It's always comical to me how technologically unsavvy the tech companies can be in practice.
[112] Now, total detour, New Hampshire.
[113] As a kid, I went to Laconia.
[114] Yes.
[115] Okay.
[116] Laconia is in New Hampshire.
[117] It's about 45 minutes north of me. What did you go to Laconia for?
[118] Well, my stepdad at the time raced motorcycles, and Laconia has one of the three bike weeks that happen nationally.
[119] And so what is it like when that gang of folks arrives every summer?
[120] Is it noticeable?
[121] It is noticeable and it is intense and I made the mistake.
[122] So I'm not from here.
[123] I grew up in Michigan, born and raised.
[124] Oh, where?
[125] Grew up in Ann Arbor.
[126] Oh, my goodness.
[127] Were your parents associated with U of M in some capacity?
[128] They were.
[129] They went there for their doctorates in psychology, and like many townies in Ann Arbor,
[130] they just never left.
[131] So my dad was clinical faculty and in private practice and my mom was in private practice.
[132] So I grew up there.
[133] Well, I grew up in Milford.
[134] So like 20 minutes down US 23 from you.
[135] Oh my gosh.
[136] Exactly.
[137] Wow.
[138] That's awesome.
[139] It can be misleading for the folks who only went there and went to U of M. When they go up north, they're like, oh, there's a proper amount of hillbillies here.
[140] I used to have a poster in my room when I was in middle school.
[141] That was a picture of Ann Arbor.
[142] And then it said 20 miles surrounded by reality.
[143] And I think that's really accurate.
[144] And so I didn't grow up in New Hampshire.
[145] My husband did.
[146] And so we moved here about, gosh, like 13 years ago from D.C. to get somewhere green and clean and nature and all of that and get out of the city.
[147] So I made a big mistake one of my first years here.
[148] I didn't know that Bike Week was a thing.
[149] It was not a thing in Ann Arbor.
[150] It was not a thing in Cambridge, in New York, or D.C., or anywhere else I'd ever lived.
[151] And I was trying to get to a friend's bridal shower.
[152] And it involved cutting through Bike Week.
[153] Huge, huge.
[154] It's like Ann Arbor, when they're playing Ohio State.
[155] Everything's a parking lot.
[156] You cannot go anywhere.
[157] You should not have left home.
[158] It's a rookie mistake, and I made it.
[159] And so I'm sitting there in the car, and you're just stopped on the road calling.
[160] And no one thought to say anything because everyone else had lived there for a while and thought it went without saying that you'd take a detour.
[161] Yeah.
[162] Oh, no. And now when we would go there, we would rent a little cabin on the lake where they shot On Golden Pond.
[163] Do you know what lake I'm speaking of?
[164] Yeah, I think that's Squam Lake, right?
[165] Okay, because Weirs Beach is also in the mix of that whole thing, right?
[166] Weirs Beach is up there, too.
[167] I think On Golden Pond was shot at Squam Lake.
[168] I think it's Squam, not Winnipesaukee, but...
[169] Winnipesaukee, that's the one that we would...
[170] Yeah, maybe we had, okay, and maybe it wasn't really the site of On Golden Pond, but we were told that, and we got a big thrill out of that.
[171] Now, this moved to New Hampshire for the greenery, was it motivated out of having children?
[172] We hadn't had them yet.
[173] It was motivated out of knowing that we wanted to have children, we wanted to have a dog, and my husband grew up here, and he was at a big law firm in D.C. and really wanted to be in public service.
[174] So he got a job.
[175] He was a homicide prosecutor with the New Hampshire Attorney General's office for about six years.
[176] Wow.
[177] That was interesting.
[178] We'll interview him tomorrow.
[179] Yeah.
[180] He's got the stories that are like, you know, John Grisham comes to life.
[181] I mean, I'd gone to Harvard for undergrad and law school, have family in New England, even though I grew up in Michigan, love New England.
[182] And when he said, let's leave D.C. and go to what for him was home, I said, great.
[183] For the first, maybe two years, not so great.
[184] Uh-huh.
[185] A lot less going on, right?
[186] That was it.
[187] Amazing friends from the beginning, amazing professional stuff.
[188] I was a legal aid lawyer, and I worked with at-risk kids and court-involved kids.
[189] So the kids getting into trouble at school, getting arrested, all of that.
[190] Those are my clients.
[191] They were awesome.
[192] Mm-hmm.
[193] A lot less going on, and it was really disorienting.
[194] Like, everything would close at five.
[195] Yeah.
[196] And on the weekend, it never opened.
[197] And then I remember my very first Christmas.
[198] So I'm not a very observant Jew, but I am Jewish.
[199] And again, you know, Ann Arbor, the Upper West Side, you know, D.C., places that, you know, were pretty well integrated in certain respects.
[200] Jewish friendly, let's call them.
[201] Jewish friendly.
[202] Upper West Side may have even been more than Jewish friendly.
[203] But you go in front of the New Hampshire State Capitol and there is a crèche and a full-out, not just Christmas tree, but a full-out, you know, crèche display with, you know, a nativity scene.
[204] A full nativity scene.
[205] Sure.
[206] And I just had not, and maybe, I don't know, maybe this is something in other parts of Michigan or other places you would have seen in front of a state capitol, had not seen that before.
[207] And I remember just kind of going.
[208] Isn't there a division of church and state?
[209] There you go.
[210] There is.
[211] And they had put up, I actually looked into this because I got worked up about it.
[212] Sure.
[213] This is one of the hazards of getting worked up about something when you're a lawyer.
[214] Well, by the way, you'd moved somewhere boring.
[215] So you're kind of like anything.
[216] Yeah.
[217] Get the motor runnin'.
[218] Yeah.
[219] It didn't take much.
[220] So I did the research, and it turned out that they had a very small plaque that was this, like, very minimal disclaimer that said something about how this area is open for public display and the views don't necessarily reflect the city.
[221] But they had put it right next to the government -sponsored Christmas tree.
[222] So I always felt like it was just a little too close for comfort, and that's how I eventually found my way to being on the board of the ACLU of New Hampshire.
[223] Because you're right, when you get worked up about something, and it's nice and quiet, and you're not sitting in traffic all the time,
[224] you have time for board service.
[225] Sure.
[226] Now, you are a law professor.
[227] I found a little snippet of footage of you teaching, and I dug it.
[228] You seem to have some perverse attraction to improv comedians, because I saw you draw a parallel about using some of the tools of improvisation.
[229] I've been doing it for 25 years.
[230] You can't say the fucking word.
[231] That's what I say, and it's always... Okay, so you found some techniques in improv that you thought would be beneficial in the classroom, and I kind of
[232] dug that?
[233] What brought you to that?
[234] The best thing I did in college, or even in law school, for that matter, was be a member of an improv comedy troupe in college.
[235] I was a member of the Immediate Gratification Players, also known as IGP, which, at the time, was one of two troupes at Harvard.
[236] There may be more now, but it was us and OTI, On Thin Ice, and we were the cool troupe, but cool in sort of that underdog way.
[237] And I loved it.
[238] I absolutely loved it.
[239] And hands down, the best thing that I did for law teaching, for being an associate dean, for all of the above was learning that yes and mentality.
[240] Well, and you were saying that inherent in law is "no, because," that that's kind of where you start off as a lawyer: no, because blank.
[241] And so "yes, and" is very antithetical to that paradigm.
[242] It is.
[243] And that's one of the things that makes it even more subversively
[244] fun to bring the improv mentality into the classroom or into anything else that involves law. It's like, that is what lawyers do.
[245] We get trained how to say no, like, every which way till Sunday, and then we say it sideways and we say it with a smile, and then sometimes we say it with the frown, and then we bill you a lot of money because we said it in six-minute increments.
[246] We're really good at that.
[247] And I love the ability of law to actually solve problems and be creative and iterative.
[248] And it is powerful in that regard.
[249] Iterative.
[250] Yeah, it's a bonus word.
[251] We get a boner every time we hear a new word.
[252] Yeah, we love it.
[253] And another aspect, and again, having practiced improv for so many years, one thing you brought up that had never even occurred to me, I mean, clearly I knew listening is the most valuable tool in improv.
[254] You can't really do it without really listening.
[255] But you kind of drilled into the status aspect that I hadn't thought of.
[256] And what's interesting is, and maybe I misunderstood
[257] you, but if this was your point, I really dug it, which is, if you are high status, it's almost in opposition to listening.
[258] And I was like, oh, that is true.
[259] Like when someone's the highest status, they're in the role of, you know, pontificating and educating, and they're not taking on much info.
[260] And I just thought that was an interesting observation.
[261] We used to do, and I forget the exact name we used, but one of our warm-up games was figuring out how to say, come into my office, let's call him Winston, like, come into my office, Winston, in a
[262] whole variety of different modes, right?
[263] So either you really are owning Winston and he knows you're owning him or actually, you kind of technically own Winston on paper, but everyone knows that Winston kind of owns you.
[264] And we would just do that kind of over and over again with all the different permutations of how you say that.
[265] And you are spot on.
[266] That is exactly where I was going with that, which is that if you really are in a space where you own Winston and Winston knows you own him, then you are losing out on the active listening and yes and opportunities.
[267] And I do think in law classrooms and lawyering in general, those status differentials, which are, they're certainly baked into the courtroom, right?
[268] Because a hearing is really just like a specialized meeting where the judge wears a funny outfit and sits up higher.
[269] Yeah.
[270] But the status is like, I mean, you can't get around that.
[271] That's right in your face.
[272] They even give them a weapon, a hammer.
[273] They do.
[274] Yeah, it's really, really interesting.
[275] You know, my experience with it is I worked for General Motors as a vendor for many, many years and we would go to these dinners.
[276] I've said it on here before, but at that time in the mid-90s, GM had levels, and I think it went to level eight or something.
[277] And generally at a big dinner of like 40 employees, the level eight, generally it was a guy, that guy would talk until he ran out of gas.
[278] And then when he finally had told every single fishing story he had, and then the torch would be passed to a level seven employee.
[279] And then they would, they'd run out of wind, right?
[280] And it never made it to level three.
[281] And these are the youngest, most creative people in the group that probably have their finger on the pulse of what future buyers want, all this.
[282] They never are heard from.
[283] It's a dangerous system, right?
[284] It just reminded me of when we had Atul Gawande on, who's a surgeon, and he's trying to, like, break the status norms in a surgical room, because he wants everyone to feel like they're able to speak up and able to have an equal playing field. If something's going awry, they can say anything, because with all these status barriers built in, yeah, people feel like they can't say anything. Yeah, they'll see someone bleeding out and they're like, well, it's not really my role to point out that this person's bleeding out, and I might get in trouble. Yeah, but it's also, what you just described is like a whole series of blocked offers, right? Like, so the status impedes listening, and then there's all these offers that could have been made by levels three and down that just never happen. They've got to wait four years until they can say something at dinner. They're like, in four years, I'll probably be made level six.
[285] Okay, now let's get into your work, which is on child privacy, data collection.
[286] You've written a book called Sharenthood: Why We Should Think Before We Talk About Our Kids Online.
[287] And, you know, I couldn't be a more perfect person for you to critique as I have hours and hours I fill every week on this podcast, and I have evolving opinions on it.
[288] and I do things that initially I thought I wouldn't.
[289] What even brought you to this level of interest?
[290] It was kind of like eagle arms in yoga, right?
[291] Where one arm is wrapping around the other.
[292] I became a mom.
[293] I now have an almost 10-year-old boy and a five-year-old girl, who's the one I'm convinced stole my podcasting microphone.
[294] And I became a mom around the same time that I started being a legal researcher at the Berkman Klein Center for Internet and Society at Harvard.
[295] So I was dorking out in a fabulous way on student privacy laws.
[296] This was 2013, and there was getting to be a real upsurge of interest in educational technologies, some really big initiatives from Gates Foundation, from Carnegie, and from some tech providers.
[297] And so there was a lot of legal research to do on what the trends were and what the parameters were in schools.
[298] And then at the same time, I was a new parent, and my Facebook feed was filled with ultrasound pictures and toilet training and temper tantrums, and I was living it, and really, at times, like, just needing that momentary fix of, oh yeah, this person in my phone who I haven't seen since I was 16 and living in Ann Arbor agrees with me, and they're in the same boat, and look, I'm not a terrible parent, even though secretly you still think you are. Yeah. But then also having this moment of discomfort of, like, but I'm spending my nerd time looking at how schools are playing maybe a little bit too fast and loose with student data, not because schools are, like, bad actors, but because they don't have enough money and they have too much to do.
[299] And so if someone comes along with a shiny thing that is supposed to accomplish a lot, they tend to do it.
[300] And so I got interested in it as sort of the intertwining of the personal and professional.
[301] And then, I've got to say, it's interesting.
[302] So my oldest kid is now, as I said, almost 10.
[303] And the past couple of years, even since I started this book, have looked very different for me with this.
[304] Like, I'm still pretty buttoned up about what I say about my kids, like on social media or what devices I let in the home.
[305] But it makes my son crazy.
[306] And here I am sharenting about him.
[307] Sure.
[308] It makes him crazy that we don't have an Alexa.
[309] Oh, yeah, yeah.
[310] I've disabled Siri.
[311] And I will not let him have a YouTube channel, not even for his hamster, who is very cute, but I don't think the hamster needs a YouTube channel.
[312] The hamster is very deserving of its own show, but nevertheless.
[313] I'm never going to let my son hear that you said that because then I would have to deliver.
[314] But especially during the pandemic, too, one of the things that has fully changed around here has been my more buttoned-up standards, less about what I say on social media, but more since sharenting is broad, right?
[315] Like, it's what you say about your kids digitally and also other things you do with their private information.
[316] So sharenting is a Facebook post.
[317] It's also saying to your kid, please leave me alone, just take the phone, take the app, do what you're going to do.
[318] I'm running late.
[319] Just please get out of the room while I try to work.
[320] It's the equivalent of our parents saying go outside.
[321] Yes, except that it means that they stay inside, but all of their private information goes everywhere.
[322] And we don't know where it's going.
[323] We don't know what's happening with it.
[324] And right now, in pandemic life, we're living this really bizarre, high-stakes, real-time experiment in what happens when we say to our kids, hey, we have to live most of our lives digitally right now because it's not safe for you to be too far from the house.
[325] That is bizarre.
[326] I think it's really important to say there's nothing more triggering than critiquing parenting, right?
[327] So we all get very, very defensive.
[328] And I've read you say and I can already tell from your vibe that you're not in the shame game at all.
[329] You're in the, hey, let's just try to be mindful of our decisions and let's understand really what's out there.
[330] So that's a disclaimer, right?
[331] We're all fucking it up.
[332] We're all shitty parents, you know, but...
[333] That is the perfect lawyerly disclaimer.
[334] Are you going to charge me for that in six-minute increments?
[335] Yeah, you can Venmo me. Or just give me that hamster.
[336] I want to get that hamster on YouTube.
[337] I think I could make a fortune.
[338] Samster the hamster.
[339] He's really fun.
[340] Okay, now what I was going to say within that is, you know, when we had kids, it's really funny, because the first child, we were like, no screens, uh, Peanuts cartoons on holidays, you know.
[341] And then eventually we loosened that up.
[342] And then her sister, the little sister who's two years younger than her, she's just seen it all from the get-go because the other one was at that age.
[343] And what's really ironic is the first one's completely a TV addict.
[344] The second one could care, I mean, not care less, but is far less interested in it.
[345] So it's kind of an ironic outcome.
[346] But all that to say, we still had some parameters of how many hours a day we would want them facing a screen.
[347] I have my own little things.
[348] I don't like them on phones for whatever reason.
[349] It's just the world's so big and that thing's four by six.
[350] But now it's, Get real.
[351] It's unlimited screen time.
[352] It has to be.
[353] Totally.
[354] And it all just went out the window and it happened like that.
[355] Totally.
[356] And, you know, I have had the experience of doing interviews about the book or topics related to the book since the pandemic where I know that my son is in the room next door doing truly God knows what on his iPad because I just needed him to get out of the room.
[357] And I've also had the experience where, like, my life is now in the DoorDash app or the Amazon Prime app or any
[358] one of a number of other apps that I've gotten to try to make things somewhat easier for me as a parent.
[359] And so it was totally like a switch flipped.
[360] And you sort of say to yourself, well, there are a lot of potential harms in the world for our kids.
[361] Right.
[362] I mean, this is the thing that parents always have that running ticker tape of like what is potentially going to hurt or kill my child in any given situation.
[363] Yeah.
[364] Yeah.
[365] And as between too much data leaving the house through an app, versus everyone losing their minds because we're in a stay-at-home bubble all the time, or, like, really terrible, someone actually getting sick, there's just, in the balancing test, there's just kind of, like, in my mind, at least for me, I should say, no question. But I think then it becomes really something where we need the tech companies. I don't think government's fast enough. But I think the tech companies need to do a little bit of soul-searching and hopefully realize that taking advantage of a captive market of really, like, desperate, at-wits-end kids and parents and grandparents and hamsters is just not okay. Like, it's not cool. Now, I'm guessing a main hurdle for you in getting people to buy into this would be: so much of what's at risk is very far down the road. And it's not just far down the road in terms of when the info will be exploited, but the algorithms that are yet to exist, right? So you can foresee the algorithm that could predict compatibility, whether you have grit, whether, you know, all these different aspects that might affect your getting employed, getting accepted to a college. We just don't know them yet.
[366] And I do foresee a time where not unlike we've gotten comfortable with the notion of a credit score, like how good of a bet you are as a person, surely those kinds of scores are going to be attached to us in the future.
[367] But it's so far down the road that I'd have to imagine there's so many immediate fires in a parent's life that it's really hard to get parents to worry about this.
[368] Totally.
[369] And I will fully own that, like, wearing my research hat, I can tell you all of the, like, well, if I let you have the watch that tracks where you are, it could be hacked into, or the company could figure out when you go outside the boundaries I set for you and therefore make assumptions about you now or in the future that you're a juvenile delinquent.
[370] Or I don't want to have the connected baby monitor because it could be hacked into.
[371] And then you could be traumatized by hearing a spooky voice in your room.
[372] I can go through all of these and personally, you know, have drawn the line at some of them.
[373] But totally, hands down, especially in this moment of like truly existential crisis and like day in, day out, uncertainty and difficulty.
[374] Yeah, I mean, worrying about how much our phone is tracking us versus worrying about like how the hell do we figure out remote learning and also, you know, keep doing what we do professionally and stay somewhat sane.
[375] Like, I personally am coming down on the side of, like, okay, we're just going to let more of this technology in and hope for the best and hope that it's not too terrible down the road.
[376] Stay tuned for more armchair expert, if you dare.
[377] We've all been there.
[378] Turning to the internet to self-diagnose our inexplicable pains, debilitating body aches, sudden fevers and strange rashes.
[379] Though our minds tend to spiral to worst-case scenarios, it's usually nothing, but for an unlucky few, these unsuspecting symptoms can start the clock ticking on a terrifying medical mystery.
[380] Like the unexplainable death of a retired firefighter, whose body was found at home by his son, except it looked like he had been cremated, or the time when an entire town started jumping from buildings and seeing tigers on their ceilings.
[381] Hey listeners, it's Mr. Ballin here, and I'm here to tell you about my podcast.
[382] It's called Mr. Ballin's Medical Mysteries.
[383] Each terrifying true story will be sure to keep you up at night.
[384] Follow Mr. Ballin's Medical Mysteries wherever you get your podcasts.
[385] Prime members can listen early and ad-free on Amazon Music.
[386] What's up, guys?
[387] It's your girl Keke, and my podcast is back with a new season, and let me tell you, it's too good.
[388] And I'm diving into the brains of entertainment's best and brightest, okay?
[389] Every episode, I bring on a friend and have a real conversation.
[390] And I don't mean just friends.
[391] I mean the likes of Amy Poehler.
[392] Kel Mitchell, Vivica Fox, the list goes on.
[393] So follow, watch, and listen to Baby,
[394] This is Keke Palmer on the Wondery app or wherever you get your podcasts.
[395] Well, there's also, there's like a much broader philosophical question, right?
[396] And I ask myself this all the time.
[397] I guess I would compare it to the horseshoe salesman when Henry Ford invents the Model T. It's like, you know, how much of his life is he going to spend trying to prevent this invention from becoming what it's becoming?
[398] I guess it's the serenity prayer, right?
[399] Like, is it worth wasting your time because it's just how they're going to live and it's inevitable?
[400] So I think trying to navigate that, or deciding when to push back and when to accept, is very hard.
[401] So I agree with that.
[402] I think the philosophical question that stresses me out too is, is there a point at which all of this screen time and the sharing?
[403] So both the screen time as a sensory experience.
[404] So just like you become like a glorified telemarketer.
[405] but also the data sharing as both a potential real-time threat for stalking and identity theft and all of that, which is real, but it's not the most salient.
[406] And then, as you said, the risks down the road: at what point is all of that adding up to a place where we're really not giving our kids a chance to figure out who they are free from our surveillance and datafication?
[407] Yeah, every mistake they've made will be recorded.
[408] which is much different from our experience, right?
[409] My college improv troupe pictures exist only in hard copy in a bag in my parents' closet, and that is where they are staying.
[410] Oh, I have sketches that would cancel me if they were on the Internet, for sure.
[411] I know, and definitely some of the stuff we did in practice and even sometimes in shows with the best of intentions.
[412] But that's how you learn, right?
[413] Like, that's how you figure out what is funny and what is not.
[414] Like, what is appropriate play and what is not?
[415] And I sometimes look back on my own improv days, and I think that we were just like a litter of puppies kind of nipping at each other.
[416] But it was safe.
[417] Like no one, except when we were performing, no one was watching us do that.
[418] And certainly we didn't have our parents spying on us, thank God.
[419] And we didn't have some weird company spying on us.
[420] So I worry about that philosophically.
[421] Like, are we just going to produce people who become sort of the next-generation version of, going back to your GM example, of people who are, like, at level zero, but not because they just haven't put in the time, but because they actually don't have generative...
[422] Is generative going to do it for you?
[423] They don't have the generative capacity to kind of push the envelope and figure things out for themselves and fuck up.
[424] And then either learn from that or not, but at least have it be their thing.
[425] Well, yeah, one thing that scares me is there's like no second chance, right?
[426] As your identity evolves... and, you know, today is my 16th sober birthday.
[427] And it's like, congratulations.
[428] Thank you so much.
[429] But, you know, I've had the opportunity.
[430] I was a shitty high school student, and then I went to community college, and I was good at that.
[431] And then I went to UCLA, and I was really good at that.
[432] And I kept redefining myself, and there was no paper trail following me around, because if people were deciding to bet on me with a paper trail, no one would have done that.
[433] Yeah, and I really worry that we are setting our kids up for a paper trail that goes back in some instances, like even before they're born.
[434] I mean, the fertility tracking apps and bracelets, and I don't think I'm a total sci-fi nut or conspiracy theorist to say that there is going to come a time, going back to your example about sort of the social capital version of a credit score, where data from that can get connected.
[435] Did they have that in China or have I bought into some kind of, you know, disinformation about China?
[436] But don't they kind of, aren't they rated in like a cooperative way or something?
[437] There's some social capital scoring that has happened in China and you're absolutely right that there was some reporting on it that was a little bit exaggerated.
[438] Yeah.
[439] But you haven't bought into misinformation or disinformation; it is something that does exist.
[440] And there's no rock -solid limitation on that happening here.
[441] And I really do think, and this is not just, you know, law professor middle-of-the-night thoughts, but we're going to have had at least a year, maybe a couple of years, who the hell knows, of pretty much all of our kids being surveilled and tracked and analyzed digitally in an unprecedented way through this pandemic,
[442] whether it is from remote learning or contact tracing apps or health screening apps or, you know, parents like me who are like, yeah, fine, just turn Siri on, take it, you know, to the other room.
[443] So we're going to have like a really huge data set and it's going to go all sorts of places.
[444] Well, let's go through the stakes, right?
[445] I'm sure I'm ignorant on a lot of the different things.
[446] So could you walk us through some of the most immediate consequences and then some potential, as we said, like algorithms in the future?
[447] So when you think about sharenting, about the digital transmission of kids' private information, mostly by parents, but also by teachers, grandparents, aunts, uncles, coaches, or parents just giving things to kids and being like, you're two.
[448] You can't decide for yourself.
[449] You know, you have at it.
[450] Then there are a number of ways in the present that there can be risks.
[451] The data can be hacked into, so that can make kids vulnerable to stalking or to other forms of bullying or trolling or doxing, either by peers
[452] or sometimes by other adults; there are certainly reported instances of other adults at times harassing children using information they've gotten about them.
[453] There are also risks that, as you said, are a little bit further afield.
[454] And so when you are putting data about your child into an educational app or into an Amazon or into anything that is tracking their personally identifiable information, you are typically just kind of clicking, like, I consent, or it's okay,
[455] or, like, sign me up.
[456] And in the fine print there, there are loopholes miles and miles wide for that data to be used in all sorts of ways.
[457] You're a lawyer and you can't possibly even understand what the fuck those things are, right?
[458] I'm a lawyer and like when I'm asked to sign up my kids for certain things, unless it's really egregious, I'm like, fine, just do it.
[459] And so that data can then be used for marketing, for advertising.
[460] It can be sold to data brokers, this wonderful group, the Center on Law and Information Policy at Fordham did a study a few years ago where it looked at lists of information about kids, ages, I think, like 2 to 18, that was available for purchase from data brokers.
[461] And the information contained lists of names, including 14- and 15-year-old girls who might need family planning services, rich kids of America, funny-looking kids.
[462] It's just really invasive stuff.
[463] You know, Vermont now has a law that regulates data brokers, but it's not well regulated at the federal level.
[464] It's not well regulated in most states.
[465] And so truly, we are creating a path anytime we are selecting, I accept or yes or click here, for data about us and our kids to be used not just by the company that we think we're interacting with, which may target us, they may advertise to us, they may market to us.
[466] It's probably not the worst thing in the world.
[467] But we really don't have a sense of, or a say in, in most states,
[468] California is doing a little better here,
[469] where they are sharing that data and where that party that they're sharing it with is going to have it land and what they're going to do with it.
[470] So those are some of the big risks that could come back to haunt kids.
[471] Some of the areas that you pointed out, tell me how law enforcement could maybe use this info.
[472] So law enforcement can use information to determine people who might be a risk.
[473] They can profile them.
[474] And law enforcement can also, and in some jurisdictions has, entered into contracts with private companies,
[475] private companies taking data that they then have because the user has just said, like, I accept or I agree.
[476] Somewhere in the fine print it says, we can share this for research.
[477] We can share it with third parties.
[478] So even if law enforcement doesn't have like a warrant or a subpoena or anything formal, if that private company is like, oh, cool, we want to help you out or maybe you can help us out, then they can share that information.
[479] The private company doesn't have to adhere to all these different, you know, civil liberties issues, right?
[480] So they are free to screen in any manner they want.
[481] They're not a government actor.
[482] That's right.
[483] So they're not bound by the same constitutional restrictions as the government would be.
[484] And now, of course, if they're acting indirectly under the supervision of the government, it's a little different, but as a general rule, you're absolutely right.
[485] They're a private actor.
[486] They're not the government.
[487] The constitution is not binding on them in the same way.
[488] And so they can effectively offer law enforcement a way to surveil or snoop or pry or analyze.
[489] Yeah.
[490] It's basically like you need a warrant to kick someone's door in, but I as a citizen can kick your door in.
[491] I've got whatever.
[492] I've got a misdemeanor now.
[493] But once the door is open, the cops can go in there if they see something, right?
[494] That's exactly right.
[495] Okay.
[496] And now how could insurers use this tracking?
[497] Insurers can use it to make predictions about your viability in terms of your health, your life expectancy.
[498] Car would be a big one, auto.
[499] We see a lot of insurance companies doing this now with people who are
[500] already drivers, so less with kids and more with, hey, if you put this mechanism in your car and allow us to see what a safe driver you are, then we'll give you a discount on your insurance.
[501] Well, that's just tracking your information and your movements and making predictions about you and your worthiness as an insured driver.
[502] Oh, if they tracked me, I wouldn't be able to get insurance.
[503] You know, they would see that I'm sometimes in triple digits for no reason because that's my hobby.
[504] Yeah, it could be career-changing for me. And then how about employers?
[505] What kind of things could an employer find out about you and then create barriers for you?
[506] So employers already are doing a lot of social media snooping.
[507] And when we see it come up with things that look, you know, like those improv sketches that suddenly resurfaced that we hope weren't recorded.
[508] But, you know, even going one step further than things that a parent may put online about a child that then get misconstrued or used against them down the road, there's already a really robust hiring industry in this country.
[509] I think there was a 2016 estimate that about $500 million a year went into hiring programs, software programs, that would try to assess potential applicants' fit.
[510] And about 60 or 70 percent of adult job applicants, as of 2016, were going through these types of software screening programs.
[511] And, you know, it's not at all a stretch to say, look, let's expect that the next generation of these products will be able to do a much more robust level of screening and prediction if they are able to purchase data sets from a data broker or another company that say, look, you know, this person was on the funny-looking-kids list when they were
[512] two, and we can put that in as a data point.
[513] And so whatever answers to the questions they're giving you now, we can also do an even deeper dive back.
[514] So it's not just that we're looking for criminal records or driving history or credit checks or any of the more obvious things.
[515] We're going to be going back to that whole digital trail.
[516] If you were holding a tiki torch at some point in public, which, by the way, I'd want them to pick up on.
[517] Well, you know, it's really interesting because all these things always ultimately circle back to these fundamental kind of clashes we have as a nation in our Constitution, right?
[518] So, you know, on one hand, you could say this is going to benefit a large section of the population.
[519] When the algorithm's right, it's probably going to benefit a large group, but it becomes the individual versus the masses, right?
[520] Because there are going to be outliers.
[521] There's going to be anomalies.
[522] There's going to be people that the algorithm mispredicts.
[523] Again, for me, trauma, six generations of alcohol abuse,
[524] a bad bet.
[525] But guess what?
[526] I'm one of the anomalies, right?
[527] So you get into this, you know, this liberty issue pretty quickly with this stuff.
[528] You do.
[529] And I think to build on that, you also get into, even in a scenario, so let's like play the law professor hypothetical game, because it's what I do for fun, even a scenario where the algorithm is always right.
[530] It is never wrong.
[531] Isn't there still something that reduces personal liberty and just a sense of freedom and autonomy if there are machines that can predict our future?
[532] Like at a very gut level, even if they're always right, isn't that still somehow wrong?
[533] Yeah, it feels like an absence of free will in some way.
[534] I think we'd all be shocked at how fucking predictable we are.
[535] I mean, it's already been proven, right?
[536] Like, they know exactly when you're going to make an impulse buy.
[537] And I think as we discover more and more that the computers will be able to pretty much predict, especially as you add in now the biometric stuff.
[538] So, oh, your cortisol is this level.
[539] Your blood sugar is this.
[540] I'm going to send you this flashy set of rims, and I know you'll buy them.
[541] You know, it does start challenging the notion of free will.
[542] It does.
[543] And I think that that just really cuts deep to who we are individually and also cuts deep.
[544] You were talking about sort of who we are as a country.
[545] It is not good for a purported democracy to have people who can be reduced and quantified and analyzed quite so easily.
[546] We are more subject to manipulation, to groupthink, to misinformation, to disinformation.
[547] We're already kind of living that in a way that is not turning out well for us, to put it mildly.
[548] And also, there's something pessimistic about it, too, or cynical, in that, like, the algorithms, the prediction, it doesn't account for growth, or it assumes that we can't transcend,
[549] right?
[550] It ends up being limiting as well.
[551] And I think it then folds in on us.
[552] So if like you think about those moments in life, whether it is as a parent or when you're a kid or when you're doing improv or like whatever it is that gets you into kind of a zone or flow or play, right?
[553] Like just fundamentally sort of free play, that's something that really is or I think should be something that resists quantification or mass production.
[554] But if we come to see ourselves as being so predictable, then I think in turn it limits our own internal sense of our horizons and of who we are and who we can become in a way that I just find really creepy.
[555] And I'm a fundamentally optimistic person.
[556] I'm actually not a pessimist at all.
[557] I'm a happy Midwesterner.
[558] But like this stuff is weird.
[559] Yeah.
[560] Now, I want to just push back for the fun of it because that's what we do.
[561] Cool.
[562] Part of me when I hear this stuff, I'll see these congressional hearings and it's like our data and blah, blah.
[563] And part of me is going, oh, big fucking deal.
[564] Now the ad you're going to see is actually something you're probably interested in.
[565] So in some ways, I prefer that.
[566] There's also part of me that's like, you know, get over yourself.
[567] You want everything for free, right?
[568] You want this massive company, Facebook, spending however many billions of dollars they are on this platform.
[569] You want all that for free.
[570] And then you're going to be pissed that they're selling you the lawnmower.
[571] Some of it feels a little like they want
[572] their cake and eat it too, or they're just not grownups.
[573] Like, either you get the shit for free or you pay for it.
[574] What do you think about that?
[575] Like, there's a little bit of like, oh, boo-hoo, you know, this entire thing is free and you're bitching about the ads being specific to you.
[576] So to further play the pushback game, I would say that it's set up in a way that people don't realize just how much they or their kids are the product.
[577] So you hear this saying, if it's free, they're buying
[578] you, or you're selling yourself.
[579] And I think that if people actually got the full terms of the deal in a way that people could understand, so you actually had sort of your disclosure checklist that was kind of like a nutrition label.
[580] So yes, the price is zero, but we are monitoring how much you sleep or how much you eat or how many steps you take or how many times you Google hamster YouTube videos or Richard Gere hamster or, what, can I say that?
[581] Or whatever.
[582] Or whatever.
[583] Or whatever.
[584] it is, and here are the things that we are going to do with that, right?
[585] So people actually got to make an informed bargain.
[586] Yeah, like Tom Brokaw should pop up on your screen and go, here's the worst-case scenario, here's the best-case scenario.
[587] You decide if that's something you want to enter into.
[588] Yeah, so I think that it's really sort of an illusion of actively entering into an agreement or a contract.
[589] I mean, and the law says oftentimes that these click-wrap or click-through agreements are contracts, that when we swipe or click to accept, we are entering into a negotiated contract.
[590] I just call bullshit on that because there's so much fine print.
[591] No one's looking at it.
[592] They know no one's looking at it.
[593] Even nerds like me who look at it are still left with our pens being like, well, what does it mean that you're going to use this data to improve your product or improve your performance?
[594] So I don't think that to the extent the objection is, look, people, you are actually entering into a bargain.
[595] You're selling parts
[596] of yourself in exchange for social media or reduced-cost Siri or Alexa or any other product.
[597] I don't think it's actually a negotiated bargain.
[598] Okay.
[599] So it's really more a call for transparency than, like, a naive, oh, I want all this shit for free and then I expect you to get nothing in return.
[600] It's more just I want you to be very clear about what you're getting in return.
[601] Well, I think that is my positive gloss on it.
[602] And I definitely think there's also a component to it that is having your cake and eating it too.
[603] And we have trouble as a country figuring out that if you want to have nice things like public health or personal freedom, then you also have to take personal responsibility for them.
[604] We're not doing so well with that.
[605] And so I certainly am not, I'm not going to try to argue back against the core of your question, which is do people sometimes whine about things that they either need to accept or not accept?
[606] Yes, 100%.
[607] Yeah.
[608] Okay.
[609] And then another thing that I vacillate on because there was a period of my life where I was a libertarian.
[610] I no longer am, but I do still have some of those residual thoughts.
[611] And now one thing that's frustrating to me is like, we have this obsession in this country with privacy, right?
[612] I would say above many other countries.
[613] I don't know where we rank.
[614] I'm sure there's some ranking, but I have to imagine it's very high.
[615] And I look at situations where, like, these epidemiological studies that can exist in other countries, Denmark, Sweden, right, where the public health data is held in a database.
[616] There's some level of, you know, detaching your name from it, but they at least have access to all of it.
[617] And I think, oh, we're so hung up on this privacy thing that we're denying ourselves like these amazing medical breakthroughs because of what, some 17th century notion of the British spying on us?
[618] Like, what are your thoughts on that?
[619] Like, privacy in general?
[620] I think that privacy can be defined in so many ways.
[621] And the way that I've defined it draws on the work of one of my most treasured mentors and dear friends, Jonathan Zittrain, who talks about privacy as a locus for self-discovery, of having some sort of place that protects your autonomy enough that you can discover who you are and are meant to be.
[622] Ooh, I like that.
[623] I like that.
[624] Yeah, it's good.
[625] It's good stuff.
[626] And so I do think that that gets at the core of our philosophical discussion earlier about how do we make sure that we don't just become automatons.
[627] I agree with you that when privacy is raised to limit pragmatic public health or public safety measures, that's a place we really have to call bullshit right now.
[628] It is not a privacy issue in a global pandemic if you were at the site of an outbreak and contact tracers show up.
[629] It's just not.
[630] I will give you the, again, the lawyerly asterisk because I can't help myself.
[631] We do need to keep an eye on the number of different companies
[632] springing up with health screening apps and contact tracing apps.
[633] And we want to harness the power of private industry because they can move a lot more quickly and often a lot more effectively than government, particularly this federal government.
[634] We also do want to make sure that we don't get ourselves into a situation where we are giving away a lot of private information for sort of unknown ends that might not actually protect public health.
[635] So I'm totally comfortable if you can tell me that this app or this government initiative is well-run, transparent, trustworthy, and is pursuing things like making it possible for us to all go back to school?
[636] Yeah.
[637] For instance, then, like, I'm all there, but I just, gosh, I'm really, I'm coming across as such a pessimist on this.
[638] No, you're not.
[639] You are not.
[640] But, yeah, I think that the false trade-off between privacy and public health, we need to push that aside, but we also do need to be mindful,
[641] like not all contact tracing apps or health screening apps are created equal.
[642] And we need to be kind of looking at those right now, I think.
[643] Yeah, I've been shooting for the last seven or eight weeks on a show.
[644] And part of it is I take a COVID test like every two or three days, which is great.
[645] I've loved being able to do that.
[646] But it's funny because we've used several different test providers.
[647] So, yeah, I'm filling these forms out.
[648] And I'm a little bit like, well, it's none of your fucking business about X, Y, and Z. Like, just tell me whether I have COVID or not.
[649] I'm not asking you to evaluate my other health, you know?
[650] Right.
[651] Okay, here's a question for you.
[652] There is a little bit of this sense where tech is maybe saying, look, if you don't like it, that's fine.
[653] Don't use it.
[654] No one's obligated to use it.
[655] But there is a big leverage discrepancy, right?
[656] There's an illusion of being able to live without it, but we really can't, right?
[657] We're not in a position to negotiate.
[658] Especially not right now.
[659] You know, if we say no to tech right now, then we're saying no to school and work and telehealth and DoorDash.
[660] And DoorDash is very important.
[661] They're a sponsor as well.
[662] Thank you.
[663] Okay.
[664] Yes.
[665] Yes.
[666] Yeah, yeah.
[667] It was an offer.
[668] It was yes and.
[669] I did it.
[670] I did it.
[671] And no one even texted me to tell me to say that.
[672] There is no euphoria that'll match rolling around nude in your Brooklinens with some hot food that DoorDash just delivered.
[673] I'll tell you.
[674] What about MeUndies?
[675] How do you feel about MeUndies?
[676] Have you tried MeUndies?
[677] They're fans.
[678] I haven't.
[679] It sounds like I need to.
[680] We can follow up on that.
[681] We stand by that one.
[682] Yeah.
[683] Yes.
[684] Yes.
[685] And.
[686] You know, it's not, we don't really have a choice.
[687] And the people that don't have a choice in the other direction, right, who don't have a choice in terms of being able to get technology, because they don't have broadband access, because they can't afford to buy devices.
[688] They are, as a whole, in a far worse place than people who have, in many ways, the privilege to worry about the things that I've been discussing
[689] in terms of, you know, was it okay for me to authorize that app on my son's iPad?
[690] And that's a huge problem. The thing that we in many ways need to be more concerned about right now with technology than privacy is equity.
[691] It's the parts of rural New Hampshire that don't have broadband.
[692] It is the stories that, you know, get profiled of people spending stimulus checks or benefits to buy laptops or devices so their kids can go to school.
[693] Yeah.
[694] That's really tragic.
[695] Well, that dovetails beautifully into one thing I was going to say, which is, to me, the solution is... there was a fork in the road, and we chose, as we love to do in this country, free. We're like, yeah, I'll take it. If it's free, I'll let go of anything. Now, to me, the solution is, like, you pay 15 bucks a month or whatever the fee is, and then there's none of that shit, there's no tracking. And then, of course, that immediately brings up the income inequality issue, in that my family, of course, could afford to not be tracked, and then less fortunate people couldn't. So it's like everything's a paradox.
[696] It's like every solution seems like it's going to exacerbate another social issue we have.
[697] So, you know, that would be the solution, right?
[698] If we just paid for our shit, then we wouldn't be tracked.
[699] If we paid for it, and I guess you could also design a payment mechanism where maybe there is a sliding scale or maybe there are some sort of, like, you could imagine a new deal type of program that said the same way we put money into Parks in the original New Deal.
[700] We are going to put money into having a safe and open and playful digital space.
[701] And so you get the government paying for it, not so they can own it, but actually you could imagine them paying the fee for a family that wanted to not be tracked, but couldn't afford it.
[702] It's interesting.
[703] My uncle was a professor at Williams.
And back in the 90s, he and two guys who'd been in his freshman econ class started a company called Tripod that was eventually bought by Lycos, but they were one of the early web-based companies, and their idea was that they were going to be a subscription, like a fee for service, and you could build your own website, your own homepage, and you could also learn about investing and do some investments and give back to the community.
And the subscription model, even back then, this was like mid-90s, it didn't take.
[706] They were very successful in other ways, but that didn't take.
[707] And I think it's built into the DNA of how we've done the internet here.
[708] Well, to build on your New Deal part, you know, there was a huge electrification issue at that time as well with electrifying rural America.
[709] And eventually the government said, and I think rightly so, and this is contrary to my libertarian roots, but hey, you're using all this hydroelectric.
[710] That is driven by the rivers that the citizens of this country own.
[711] So if you want that electricity, you've got to play ball with us.
[712] And similarly, it's all working on an infrastructure.
[713] that we all, in theory, own.
[714] So there is, I think, grounds to say, yeah, if you want to operate here, there's got to be some provisions.
[715] Now, this brings me to the second thing.
[716] Now, let's say they said, Leah, you're in charge now.
[717] You're going to design the legislation that's going to keep every single kid safe.
[718] Here's where I am pessimistic.
[719] It reminds me of, like, Homeland Security.
[720] There was all this, you know, with the Patriot Act and everything, there was all this concern.
Oh, now they're listening to phone calls and they're hacking into AT&T and all that.
[722] And I was on the sidelines going, Big fucking deal.
[723] Let's see them try to analyze that.
They've got a quadrillion terabytes of information and no actual method to sift through it and synthesize it or do anything with it.
[725] It's just a big trash heap of data.
[726] Now, isn't the volume so big with 300 million Americans that even if we had great legislation, is it conceivably enforceable?
[727] It's just so much.
[728] So the enforcement mechanism for any of this is super tricky because if you're thinking about enforcement, one, you are often thinking about government, either the civil regulator or the prosecutor.
[729] No way government can keep up with tech companies.
[730] We see that all the time.
[731] This goes back to our status riff earlier.
[732] Like, you know, the government is Winston.
[733] The tech company is the one owning, owning Winston.
So next you think about private enforcement. You think about, could you empower, like, an army of private attorneys general, so that they could go out and sue the shit out of the tech companies and get attorney's fees, get treble damages, so it's worth their time and puts a dent in what the tech companies are doing.
[735] I think it's hard to scale that.
[736] You can look for certain types of privacy or security by design.
[737] I don't think that we're there yet or that that's scalable.
[738] So the solution that I would design if I got to wave my magic wand would be less at the level of collection and more at the level of use.
So it's kind of like, all right, I'm going to go New Hampshire here for a second and say the horse has left the barn.
The data is getting out.
We're not going to rein it back in.
But what we are going to do, especially in anticipation of the current cohort of kids who are truly, to use a phrase that some colleagues coined, born digital, is that we're going to say to employers, to insurers, to schools, to really every major gatekeeper.
So think about the people who've backed you and bet on you, despite the fact that you're saying that if they had all the data, maybe they wouldn't have. We want to say to those people: you are legally prohibited from using data that was sharented about kids, unless there's a very specific legal rule that says that you can.
Maybe it's a juvenile delinquency history that's part of a record that has to be available for law enforcement, or the child, once they turn 18, says, no, no, no, actually.
[747] Like, super cool.
[748] Go ahead and take, you know, all of my education records because I'm super proud of, like, what I was like when I was in third grade.
[749] Monica stole cookies.
[750] I just want you to know.
Yeah, so I would say no. Yeah, she stole someone's cookies and got them in big, big trouble.
[753] Were they worth it?
[754] Of course.
[755] Cookies are always worth it.
[756] Exactly, exactly.
[757] But now I can't trust her because I know that.
[758] Yeah, see, this is, uh, you sharented me. Sorry, sorry, sorry.
[759] Stay tuned for more armchair expert, if you dare.
[760] But no, guys, we need a new term.
[761] So, like, if it's not sharenting.
[762] Well, I'm her dad.
Yeah, he's effectively my dad.
[765] I'm her brother, her employee.
[766] Her employee.
[767] You're not my employee.
[768] Sometimes I'm your employee.
[769] Well, that's true.
[770] Yeah, yeah, yeah, yeah, yeah.
[771] So, yeah, I think that we regulate at that level and it's not going to be perfect.
[772] That's smart.
[773] I like that.
[774] Yeah.
[775] Wow.
[776] Good job.
[777] I was so pessimistic.
[778] Yeah, you really, that's a really great idea.
[779] Now, let's get into some of the things you would urge parents to maybe not do.
[780] I think somehow in some email I saw that you were aware that I don't show our kids' faces.
[781] I don't want a stranger to be able to identify them and say, hey, I just saw your mom, Kristen, because they know her name.
[782] I don't want them to be tricked.
[783] It feels like a potential safety issue for people to know what they look like.
I think originally I was going to not talk about them on the podcast, but 90% of my life is them, and it's totally unrealistic that I wouldn't.
I also grapple with, okay, I don't talk about them, and maybe they appreciate that privacy-wise.
[786] But then maybe they're 18, and they listen back and they go, oh, my God, my dad couldn't keep his mouth shut about me. He thought about me so much.
[787] And then that's like, it moves me into that direction.
[788] So it's like, it's always evolving.
[789] I don't really know, but I have my own little guidelines.
[790] So what would you urge people to do and to not do or be mindful of?
[791] I would urge people to be mindful of pictures, first and foremost.
[792] I think not showing faces is excellent.
And I would say people should also not show children in any state of undress, even if it's seemingly innocuous.
[795] Like, we don't need the swimsuit picture, even if they're three.
[796] We don't need the bathtub picture.
[797] Just don't put it out there.
[798] We also want to stay away from pictures that could be embarrassing in any way, shape, or form, because that stuff is going to bubble up.
[799] I would urge parents to be mindful in terms of disclosing pictures that reveal addresses or locations.
Because, again, like, the internet's a cesspool, and that stuff gets weird.
[801] I often say to parents, think about using a holiday card rule of thumb.
[802] So, you know, back when I was a kid and we used to get those nice, like, Hanukkah or Christmas newsletters with, like, the updates on everything that's happened.
[803] From your city government.
[804] From our city, from our city government.
Did you get one from the little baby of Nazareth?
[806] Yeah.
[807] Oh, right.
[808] No, they did not send me one.
[809] Yes.
[810] But, you know, the thing, if you would say it to anyone from your boss to your great aunt, then it's probably okay.
Then it's probably in that space of kind of sanitized and clearly loving and not something that could really be weaponized.
And then another rule of thumb I sometimes trot out is try to put yourself in your kid's shoes and think, gosh, you know, if my parent had said this about me, either at the age my kid is now or, if they're too young to know, fast forward five years, how would I have felt about it?
And if the answer is, like, I would have hidden in a bathroom and cried, or refused to go to school, or been ripshit, then maybe just don't post it, because even if it's partially your experience, right?
Because as a parent, like, especially when the kids are little, you're absolutely right, like 90% absolutely.
[816] But is it as much yours as it is theirs to share that they've finally stopped wetting the bed or that they're finally sleeping through the night?
[817] So those are just some kind of common sense guidelines that I put out there and that I more or less follow.
[818] The temptation, at least from our end, and I know we're in a unique situation, but I guess we're in a position to publicly own our failures in a way that I would hope would be comforting to other people that might hold us on a pedestal.
[819] So it's like I feel very obligated to share either, you know, that we go to couples therapy, that we, you know, I don't want to be perpetuating some fairy tale that people can't replicate.
So that's kind of my conundrum at times. Like, my wife, you just said that thing about wetting the bed, so my wife said, oh, our five-year-old still sleeps in diapers at night. Now, that immediately got changed, in headlines I saw, to she always wears a diaper, which is not even the case. And it's interesting depending on what news outlet it was. So, like, on the Fox version of it, we're basically, you know, we're not even parenting our kid, and then maybe on the liberal side of it, it was maybe the outcome we'd hope for. But anyways, that one got tricky.
[821] She wants to be relatable, which she is.
[822] So it is just saying, like, we struggle with this, too.
[823] And by the way, I had like multiple people reach out to be like, oh, I loved hearing that.
[824] Yeah.
[825] So it's, it's so, it is so complicated.
[826] And I think, you know, what you just raised is such a profound point, which is that when the internet is used for good, it is extraordinarily, phenomenally, phenomenally powerful in breaking down silos and barriers and misconceptions.
[827] And, singing to the people in the back, right?
[828] In traditional media, and you just gave some powerful examples of why we would want to get kind of over the media filter and connect directly with each other, that that's a shared humanity moment.
[829] And that's a beautiful thing.
[830] And so to make a judgment as a parent that, look, I'm going for the shared humanity here.
[831] The haters are going to do what they're going to do anyway.
[832] So let's connect to the people who now are able to say to their kids, you know what?
It's not just you. Or to say to themselves, it matters. It's like, the fairy tale is actually more, you know, more beautiful and powerful if it's more real. Yeah, I think that you're right that that's in some ways a very unique responsibility to have. In other ways, it's very much one of the things that I think all users of social media or internet technologies think about, which is, as I am trying to be authentic in the world, and as I am trying to be the kid in the middle of nowhere, Michigan, or the middle of nowhere in New Hampshire, who is finding the other kids like me, across the world even.
[834] Yeah.
[835] Being authentic is ultimately beautiful.
[836] I mean, that's art, that's friendship, that's romance, that's all of the good stuff is authenticity.
It's just sometimes when you try to get to the people in the back and be authentic in the digital age, it can go a little sideways.
[838] Yeah.
[839] Yeah, it really can.
[840] And also, I'll just add, I feel like it's a commitment I have to be part of the antidote to the curated life that social media has created.
[841] So again, I just feel an obligation to go like, oh, no, no, we both look like shit often and we, you know, it's tricky to navigate.
[842] Yeah, I think it's probably just something to think about because.
[843] In terms of them.
[844] Yeah, we think about it in terms of us and other parents and the, and the relatability.
[845] And we're not thinking of the tool we're using to do that.
[846] And the tool we're using is the child.
Or that Delta, if she was aware of the fact there was an article about her wearing a diaper, she'd be totally bummed.
[849] Exactly.
[850] But let's talk about maybe the personal responsibility.
[851] Now, this is the part I think people should own, which is I do think people should be, they should be aware of the fact that their children are extensions of their own ego.
[852] And that whereas we might not feel comfortable posting a picture about ourselves looking for praise, they provide this cover fire where really that's what we're looking for.
If the kid looks cute, I want you to see that my kid's cute.
[854] We would feel a little unethical about, oh, I looked cute.
[855] Let's show everyone I look cute.
[856] So I would urge people just to be honest about that aspect of the process.
[857] I think that's such an important distinction.
[858] Are you doing it for a reason that has to do with them or some bigger purpose around authenticity and creation of meaning?
Or is it really that you could have grabbed a cigarette, a piece of chocolate, more caffeine, but instead you went for the dopamine hit from getting a like on your phone?
[861] Yeah.
[862] And if it's that you're going for the dopamine hit of getting a like on your phone, then when it's about your kid, especially if it's about your kid in a way that they really might not like if they knew about it now or in the future, gosh, I was about to say maybe go for the chocolate, but I'm not sure that's the solution either.
[863] But maybe just don't go for the dopamine hit of posting the picture of the child.
[864] That is where I was going with that.
[865] And is there a difference between moms and dads that you've discovered?
That is a wonderful question.
[868] I do think that there is a difference that I have seen.
[869] And it's, of course, with the caveat that every family and every person is unique, I do think that there are certain gendered aspects to this in terms of the level of scrutiny and performance that women expect of themselves and of each other and that broader society expects.
[870] And the level of mom shaming that happens is vicious.
[871] It is vicious.
[872] It's relentless.
[873] I mean, it starts from when you get pregnant.
[874] Are you going to have a homebirth?
[875] Are you going to do an epidural?
[876] Are you going to fucking do it in a tree?
[877] I saw it immediately.
[878] And again, my wife's in a position where she couldn't be more empowered.
[879] And yet, it's still happening.
[880] And then the example I always give is like, we'll walk down the same red carpet.
[881] And every single question to her is how are you managing being a mother and working?
[882] And no one gives a shit about how I'm managing that.
[883] I would assume that you could probably score some praise more easily too.
[884] I mean, I feel like I can be on the airplane with two kids and everyone's just looking at me like, you know, who's the lunatic traveling by herself and can she keep the kids under control and where's the partner?
[885] And then my husband has them both.
[886] It's like, oh, you're doing such a good job.
[887] Oh, totally.
[888] Oh, I feel like when I fly with them by myself or something, I feel like people are looking at me like I'm juggling chainsaws.
[889] Like, they're so impressed that I, as a man, can possibly do this.
[890] You're right.
[891] It's just all wins for me. The bar is so low to be a good dad.
[892] So I guess that's a great explanation for maybe why there's more incentive for a mom to post a picture of the kid being perfect or seemingly having the perfect childhood because they're in this constant state of defending whether or not they're a terrible mother.
I think there's a lot of that, and that for us as moms, it can go beyond the dopamine hit, right?
[894] It's the sense of inner worth.
[895] It can also be a standing or a sense of relationships.
within a network, that if you are part of an IRL, right, in real life, as the kids like to say, network, where everyone is part of the neighborhood Facebook group or everybody is always using Instagram, then there is a crossover from your digital life into your brick and mortar life, in a way that you kind of avoid being part of it at your own peril, because then it's like, well, she's so standoffish, or what's she hiding, or she thinks she's so much better than everyone.
[897] And that's really hard, too.
[898] And that, by the way, is also a pushback I get sometimes when I talk about the book is people say, well, if you're saying that employers and colleges and insurance companies and future spouses are going to be looking at the digital data trail, shouldn't I give my kid the best possible digital data trail out there?
[899] Like, shouldn't I double down on this?
[900] Then the notion, yeah, that you'd have to be, you know, creating this great digital history.
[901] Okay.
[902] Okay, my last question for you is, you know, what age personally do you think a kid should be allowed to have a social media account?
My 10-year-old thinks that, or sorry, almost 9-year-old thinks that he should have been allowed to do it like two years ago.
[904] And he hates the fact that I am so uptight about this and wishes I would just let him do it.
[905] I personally would say put it off for as long as possible.
I think high school. I know that it creeps down to middle school, but I just think that it so quickly becomes this compulsion for people in terms of their daily life, and the stuff that can inadvertently creep out can be so weaponized, that the longer, the better. And ideally high school. Middle school makes me a little bit uncomfortable.
Having said that, though, like, I already am seeing from our nine-year-old some real pressure of, well, I'm going to be left out.
[908] Other kids are doing this.
[909] And especially at a time when, like, he's, I mean, we're in New Hampshire, so he's, in the forest and stuff, but, like, he's been inside for six months.
[910] I mean, they all have, and, you know, he fully pulled one over on me during, I will confess, during this pandemic, I had been really holding the line at Fortnite, which, are they a sponsor?
[911] No. Okay.
[912] Yeah.
[913] We're hoping, we're hoping.
[914] Well, they won't be now.
[915] Yeah, we were hoping they'd join us, but yeah.
[916] Maybe I'll just keep my mouth shut.
[917] No, no, go ahead and say Fortnite.
[918] We're clear.
[919] But, okay, just, I mean, violent and sketchy and not for kids.
And somehow, at some point in this pandemic moment, he got me to agree one day to let him play a game with his friends, because he was just losing his mind and I was losing my mind and it was just a shit show.
[921] And you're weighing now like he does need to be social.
[922] So like, and boy, have I regretted that decision.
[923] And I think this is one where my husband would say if you were in here that he would have made a different one.
[924] He says it quietly because he knows not to, right?
But like, yeah, I mean, look, I will fully own it.
Like, I, you know, I have my nice, like, Sharenthood book right here about how we need to be mindful and regulate, and there was a day where, like, everything was exploding and I just needed to make it go away.
[927] And now I can't get rid of it.
[928] That's the other part, too, is like, once the horse is out of the barn, whether it is your child's data or a particular product, especially because I will also confess.
[929] I'm confessing a lot of things, but they're just about me. I have forgotten the parental control pin for his iPad.
[930] Oh, boy.
[931] Oh, great.
[932] Great.
[933] Yeah, so I'm locked out.
[934] And I either have to reset it or, like, hope that it comes to me in a moment of inspiration.
[935] It's pretty locked down.
[936] I wanted to adjust some settings last night.
[937] It didn't work.
[938] So, look, we all make the choices we have to make to get through, especially right now, I think.
[939] Well, my take on it is generally, like, have an honest conversation about how in control you are with those apps.
[940] Like, I'm not.
[941] I'm 45.
[942] And I am so susceptible to the reward system and the dopamine.
And so if I can barely do it, how on earth do I think my 12-year-old would be able to do it, you know?
[944] And at the same time, you're right.
[945] If every single kid's doing it, they're all going to be equally fucked up.
[946] Why make them a pariah on top of already being fucked up?
[947] So, man, is it hard?
[948] It's not an easy parenting conundrum.
[949] It isn't.
And I just think it is so much more right now than it was, even if you just go back to the beginning of March, which now feels like forever ago.
[952] But the dial on all this stuff has been turned up so incredibly high.
[953] And sometimes I just throw up my hands and go for low tech or no tech solutions.
[954] Like my solution to the iPad issue that I'm currently having is I've just been taking it and putting it like up at the very top of a shelf that people who are under the age of 10 can't reach and like just letting it be out when I can actually have eyes on it.
[955] And sometimes that can be, I think, the antidote to some of these problems.
[956] It's just like, what can I do that doesn't involve figuring out a tech fix to this?
[957] Let me just do this the old -fashioned way.
[958] I just was wondering if you had any fear that because you're strict about it, that once he does have access, it's going to be it.
[959] Ape shit.
[960] Yeah.
[961] Yeah.
[962] So I do, because that was me with sugar cereal in college.
[963] Right.
[964] Lucky Charms.
[965] I never got sugar cereal except on vacation.
[966] I got to college.
[967] It was like, I can eat this for every single meal.
[968] And I did for a while.
[969] But then I did figure out that that was not a good idea.
[970] And I stopped.
[971] And so I think that on balance, I come out with, yeah, I am probably setting him up for some of that.
But hopefully he'll self-correct.
[973] Exactly.
[974] And I'm buying enough time to, like, protect just enough neurons that there will come a point where he's like, oh, my gosh, like, this is going to make me puke.
[975] Literally or metaphorically, I'm not going to do this anymore.
[976] But yeah, I know I'm totally scared of that because I think it's going to happen.
[977] Well, maybe we'll all get super lucky in the fact that they're all learning now on a computer.
[978] They'll just start associating it with this dreadful thing school and want nothing to do with it.
[979] Maybe they'll be excited to go outside.
[980] Could be unintended.
That is an amazingly optimistic thing to end this giant bummer on.
I hope to God you're right.
[983] I really do.
[984] That would be not a silver lining, but at least a positive little spark.
[985] A bronze line.
Bronze, yeah. Well, Leah, it's been so fun talking to you, and I'm really glad that you're committing time and energy and resources to giving this some thought. I think you're providing a great service, and I do urge people to read Sharenthood: Why We Should Think Before We Talk About Our Kids Online. I think it's an incredibly important thing to be mindful about, and we are incredibly grateful. It's been a pleasure. Thank you so much for having me. All right, thank you. So great talking to you. Please come back on when you write your next book. I will, and when I have an update on it, was it MyUndies, MeUndies?
MeUndies.
[988] You'll love them.
[989] Okay.
[990] We already know the update.
[991] Yeah.
[992] I can't wait.
[993] All right.
[994] Thank you so much.
[995] I look forward to it.
And now my favorite part of the show, the fact check with my soulmate, Monica Padman.
[997] Leah Plunkett.
[998] Miss Plunkett.
[999] I like her last name, Plunkett.
Leah Plunkett sat on a tuffet, eating her curds and whey.
[1001] Is that it?
[1002] What's the real person's...
[1003] Miss Muffet.
[1004] Wait.
[1005] Miss Munkett?
[1006] Muffet.
[1007] Uh, dang.
[1008] You don't have it?
[1009] I thought it was Mrs. Muffet.
[1010] Let me look.
[1011] Mrs. Plunkett sat on her muffins, eating her turds and way.
[1012] Oh, little Miss Muffet.
[1013] Oh, little Miss Muffet.
[1014] Sure, sure, sure, sure, little Miss Muffet.
She sat on a, um, tuffet.
What's a tuffet?
[1017] There was a man from Nantucket whose dick was so long he could suck it.
[1018] He said there's more to that thing and he did something.
Something, something, and da da da da, fuck it. Oh my God. Yeah. Have you ever heard that one? No. There was a man from Nantucket whose dick was so long he could suck it. No, I'm not well-versed in pervy rhymes. Pervy limericks. Yeah. Also, um, I mean, you got to use, uh, Nantucket because it's going to rhyme. Yeah, but also not what I associate with someone from Nantucket. Like, I think Nantucket, like, you're kind of upper class, highbrow. Yeah, but they have a lot of perviness. They have carnal desires. Yeah, I guess you're right. We shouldn't stereotype.
[1020] That's true.
Plunkett sounds like an onomatopoeia.
[1022] It does.
[1023] Did she say anything inaccurate or did I?
[1024] Yeah, there was some stuff to check.
[1025] But first I wanted to start with, she talked about improv a bit.
[1026] Oh, right.
[1027] It was so fun that she was an improv nerd.
[1028] Yes.
[1029] And she talked about a little bit of improv warmups and I wanted to see what your favorite improv warmups were.
[1030] Oh.
[1031] And we should tell people what they are and how to do them.
[1032] Okay.
I think I just liked one word and three-word story.
I thought that was just the cleanest way to get your brain moving and obeying the rules.
[1035] Okay.
And remember we did that with our friends on Logically Irrational?
[1037] We started a story and then we were having them finish it.
[1038] Oh, yes.
[1039] You're right.
[1040] And then that fell apart.
[1041] I think it fell apart.
[1042] Okay.
[1043] It's on them.
[1044] I hope it's on them.
[1045] I'm going to say generically it fell apart.
[1046] Okay, all right.
Not to place blame.
[1048] No reason to look further.
[1049] No. Yeah, just went the way of the dodo.
Do you want to do three-word story?
[1052] Sure.
[1053] There once was...
[1054] Oh.
[1055] Oh, okay.
[1056] See, so hard not to say a man from Nantucket now because we planted a seed.
[1057] I got confused.
[1058] I'm sorry.
I thought we were each doing one word that would add up to a story that is three words.
[1060] No, oh, no, no, no. So you would either do one word story where you can only say one word or you can add three words.
[1061] Okay.
I feel like I like three. It's a little more to play with.
[1063] Okay.
[1064] Okay.
[1065] Blue dresses are...
[1066] So steamy when...
[1067] The temperature.
[1068] rises.
[1069] That's the end of that.
[1070] And we kind of close it up early.
[1071] But yeah, that's fun.
[1072] Do it at your next dinner party.
[1073] Yeah, these are all fun things to do at dinner parties.
[1074] Big time.
[1075] Improv warmups.
[1076] What's your favorite improv warmup game?
[1077] I have a couple.
[1078] Oh.
[1079] There was also some I despised.
[1080] We had one called Hot Spot.
[1081] Every time we had to play fucking hotspot, I wanted to die.
[1082] How's that one work?
[1083] So you're all in a big circle and someone jumps in and starts singing.
[1084] I'm already out.
[1085] No, you like it.
[1086] You like singing.
[1087] No, I like singing in the safety of the attic.
[1088] I don't want to do it in front of anybody.
[1089] Well, you know a lot of people listen to this.
[1090] I'm not aware of that.
[1091] Okay, okay.
[1092] In those moments, I pretend no one listens.
Because my mother doesn't listen to most of them, which I think is very healthy.
[1094] Yeah.
[1095] I think it's really healthy that she doesn't.
But she probably listens to 10% of them.
[1097] Yeah.
[1098] So when you just know your mom's not even listening, you kind of, you can buy in, you know.
[1099] You rule things out.
[1100] My parents don't as well.
[1101] Yeah.
[1102] Well, as they should not.
[1103] I would like them to not.
[1104] Yeah, my mom, there's nothing I would say that my mom hasn't heard.
[1105] Yeah.
[1106] But there would be some revelations for your parents.
[1107] I think so.
[1108] Yeah, yeah, yeah, yeah.
Okay, so someone would jump in and they'd start singing, like, Twinkle, Twinkle, Little Star, we'll say.
[1110] Okay.
And then somebody else would tag out and start singing Star-Spangled Banner.
[1112] Oh, because the word star was in there.
[1113] Exactly.
[1114] Then you tag out and you keep going and going and it keeps like.
[1115] All right.
[1116] Let's do it because we don't want to.
[1117] You ready?
[1118] I'm not playing.
[1119] Just give me the night.
[1120] Uh -huh.
[1121] Uh -huh, uh -huh, just give me the night, uh -huh, uh -huh, and we go out tonight.
[1122] When we meet together, we found them, found you know.
[1123] Okay, stop.
[1124] Let me tell you something.
[1125] Uh -oh.
This is part of why I don't like it.
[1127] Okay.
Because my brain has a really hard time clicking out of that melody.
[1130] Yeah.
[1131] Like I know that the next thing should be about night.
[1132] Yep.
[1133] But I can't hear that sound.
And also think of night stuff. Like, Blackbird, it would be black I would see. Now, yeah, I hate this. You, so you start one. Let me see if I can do it. I don't want to sing. You're a good singer. No, I'm not saying you are. You're not tricking me. No, I just want to see if I can... You're tricking me into singing. Well, I want to know if I have the same block as you or if I can do it. I feel, I just want to test myself. Begrudgingly sing just a little bit of a song. I really don't want to. I know. Most of the things that are good for us, you don't want to do them.
No, that's what they say in NXIVM, okay?
[1137] I'm not falling into that cultish behavior, that cultish mindset.
The notion that we're not in NXIVM is pretty funny.
[1139] They try to trick you into doing things you don't want to do so that your guard is down when your intuition is telling you you don't want to do something.
[1140] That's for real.
Yeah, they install a button so that any time you're questioning something, it just means you're wrong and there's a fear behind it.
[1142] Exactly.
[1143] Which, by the way, is largely true.
[1144] That's why it's such an appealing.
[1145] Of course, we're talking about The Vow.
If anyone wants to watch a great doc on HBO, we're on episode four of The Vow, which is about this self-help group NXIVM that then had a sex cult within it called DOS.
[1147] There's a great podcast about it that we listen to.
[1148] Now there's this doc.
[1149] And so Monica and I are watching it with great interest as...
Cult people?
[1151] We're obsessed with cults.
We are obsessed with cults.
[1153] Just because of the mindset that goes along with it, because half of it is constructive.
[1154] Yo, I'd say more than half.
I'd say like 70% of the stuff they're talking about.
[1156] I'm like, that's awesome.
[1157] Yeah, but they do it in a way where they exploit you and then manipulate you.
[1158] The main problem is it's a multi -level marketing scheme.
[1159] At some point, you've got to recruit people to join the thing.
So right there, you're already going to be on, in my opinion, bad footing.
Yeah. Okay, but the principles, most of them, until they start getting heavily weaponized against the parishioners, yeah, are pretty good. Well, let's just say a personal one that I had to admit to you, that I finally saw your point of view on, is that we've in here many times said, I say, you're responsible for your feelings. Like the world is inert in some way. It's just this place that has all these variables, but ultimately you have to own how you react to it. And that's pretty much the whole foundation of this NXIVM thing. Yeah. But then, the thing you've always said is like, no, no, people can go out of their way to fuck with you and hurt you, and it's not your weakness as a person, because you've been manipulated and fucked with. Yeah. And I've kind of pushed back on that. Yeah. And now I can see that you're right and I'm wrong, and that some percentage of the time, yeah, people are fucking trying to destroy you or control you. And that we don't live in a vacuum.
[1162] Like people's actions affect other people and people's words affect other people.
[1163] They're powerful.
[1164] So, you know, telling yourself it doesn't affect you is not all that helpful because it does.
[1165] Right.
[1166] Yeah.
[1167] And then they get them so bought into that line of thinking, thinking that if, and thinking that if.
[1168] And thickening.
[1169] And a thickening happens, basically.
And a corrosive kind of thickening occurs, and then the structural integrity breaks down and the people snap. And they do... Like, fear, of course, is a huge element, where they're trying to break down your fears and tell you that they're not worthy and that you're living by that. And, you know, it's so interesting. I mean, I do, obviously, not in this case with Hot Spot, though. I am not going to sing for you today. Oh my God. But it is like what you did just then, as someone who loves me and not as a cult leader, but just... That is what they do.
[1172] They tell you, well, if you're not going to get over this fear, you're going to let fear control your life and you're going to be better off if you do it.
[1173] You're never going to get anything you want.
[1174] Exactly.
[1175] If you're controlled by fear.
And then it forces you to ignore your intuition, when your fear can be so good.
[1177] Well, here's the thing.
[1178] Both things are true.
[1179] Generally speaking, people have fears about things that they really shouldn't because either A, they're really low probability or B, they have absolutely zero control over it.
[1180] So it's a complete waste of time to even think about it.
[1181] So that's all true.
[1182] Yeah.
[1183] And then also fears kept us alive.
[1184] It has.
[1185] It's there for a reason.
[1186] Yes.
I, without going into too much detail, have had a couple of incidents recently where I have felt a lot of anxiety and fear.
[1188] Uh -huh.
[1189] And tried to talk myself out of that.
[1190] Mm -hmm.
[1191] And those things that I was fearful of were validated.
[1192] Mm -hmm.
[1193] Mm -hmm.
[1194] Mm -hmm.
[1195] Yeah.
[1196] So a part of me has some additional fear moving forward that I'm not going to be able to talk myself out of these things in the future because I'm like, I'm right.
[1197] Turns out my intuition is exactly right.
[1198] Uh -huh.
[1199] And everything I think that is scary is real and is going to happen and did happen.
[1200] Yeah.
[1201] But I'm also smart enough to.
[1202] Also remember, you were afraid to get kidnapped.
[1203] Yes.
[1204] Which was probably a waste of time.
[1205] Yes, probably.
[1206] As it turns out, it was a waste of time.
[1207] Exactly.
[1208] Do you feel a little sad you didn't get kidnapped?
[1209] Like do you ever, does it make you insecure like?
Like they didn't, they didn't want to kidnap me. Yeah, I wasn't cute enough.
[1211] Yeah.
[1212] Yeah.
[1213] Oh my God.
A buddy of mine, he worked with one of the actresses that was fully in NXIVM.
[1215] And then he, by coincidence, got on a train to go upstate New York where he lives.
[1216] And she was on the train going up to Albany where this is headquartered.
[1217] No. Yes.
[1218] And they talked for hours.
[1219] And he is obsessed now with the documentary.
[1220] And he said to me, you know, my own ego went, why didn't she try to induct me into this?
[1221] And I go, dude, that's exactly what I would think.
[1222] Like, wait a minute.
[1223] I'm an actor and this is the people they're trying to recruit.
[1224] Why the fuck didn't she try to recruit me?
[1225] That is so funny.
[1226] Well, at that point, I think she was more into DOS, which is a female.
[1227] I think she was recruiting females.
[1228] Yeah.
[1229] Yeah, to be slaves.
[1230] But also, you know, they know the type.
Well, what's interesting about NXIVM, I would argue, more than some of the other groups, cults we've found, is that a lot of the cults seem to prey on particularly vulnerable people in society.
[1232] And this one preyed on people that were very smart and articulate and just good personalities.
[1233] Yeah, shiny and empowered.
I still think, though, and I think some of them would argue with me about this, but I still think they're vulnerable, because many of these people have these gigantic goals.
I would say Scientology preys upon this same desire.
[1237] Yes, where you really want to be an actress and that's really hard to do.
And by the way, what they would be right to say is that 80% of the things standing in your way as an actor is fear.
[1239] It's like fear of auditioning, fear of not this, fear of that I'm not enough, fear that I'm not something people want.
So they also, you know, yeah, these people who are, like, trying to be actors, trying to be musicians, trying to do this and that and be seen. All those occupations are, like, ones where you want to be seen really bad, and you deal with a lot of rejection in those pursuits. Yeah. And in those professions. So when they come and they have a community and they're accepted and they have some power, like the rising in the ranks, that feels so antithetical to this bad feeling that's connected with their professional pursuit. So they kind of, like, say, maybe I don't need to pursue singing anymore, because I feel really good here and I'm empowered here. And it's all a ruse.
[1241] Yeah.
[1242] It's very interesting.
[1243] So even though they don't seem vulnerable, they are, I would say.
[1244] Yes.
[1245] And then again, a little bit in their defense, it's also baby steps.
It's like years and years before you're getting to the point where it starts getting alarming.
[1247] Oh, yeah.
[1248] Wait, what?
[1249] That's part of this?
The grooming process is slow and it...
[1251] And your community is all doing it.
[1252] And the power of community is so much stronger than the power of the individual.
[1253] and we're all susceptible.
[1254] And do not get me wrong.
[1255] I watch that and I think every single person is susceptible to this.
[1256] Yeah.
[1257] Myself, everyone I know, even the people who are the most confident, everyone's susceptible.
[1258] Yeah.
[1259] I, of course, think of myself as someone that is completely not susceptible to it.
[1260] Mm -hmm.
[1261] You know, like if there was some kind of spectrum of susceptibility, I'd put myself on this far end of it.
But while we're watching it, I'm recognizing that there are elements of the cult that exist in AA and Groundlings. And Groundlings, where there's, like, a worldview and you snap into it, and an inner circle, like, all of it. I thought, there's status. Yeah. Yeah. And I was like, oh, some of this is familiar. Yeah. And also, even, this happens in AA, which is, there is this, no one has instructed us to do this, but there's a pretty regular declaration of gratitude for what the program's done for you.
[1263] So you're weirdly, you are proselytizing a little bit.
[1264] And you believe it.
[1265] Like I do have deep gratitude for it.
[1266] And it's changed my life immeasurably.
[1267] And yet I do feel obligated to vocalize that quite often.
[1268] So it's like, oh, that's just a weird little pattern that exists in sharing that I picked up on and did.
[1269] Yeah.
[1270] I become a testimonial almost any time I share.
Even if it's, I'm telling on myself for something negative I've done or how I'm dealing with a problem.
[1272] I generally will wrap it up with I'm grateful.
[1273] I have a place like this to come and talk about it.
[1274] You know?
[1275] And you are.
[1276] Yes, it's totally genuine.
[1277] But it's just, I could see these parallels the way they spoke and the way we speak.
[1278] Yeah.
[1279] Yeah.
[1280] It's very fascinating.
[1281] Improv.
[1282] Oh, yeah.
[1283] Improv.
[1284] Oh my God.
[1285] Ding ding dingings.
[1286] Dingles.
[1287] Dingles.
[1288] It goes back.
[1289] Okay.
[1290] So just let me tell you the one I like.
[1291] Okay.
[1292] There's a few, but I'll stick to one.
[1293] Mind Meld.
[1294] Remind me of how this goes?
[1295] So Mind Meld is two people.
[1296] say a word at the exact same time.
[1297] Oh, and you try to figure out the word that would link them.
[1298] Exactly.
[1299] So let's try it.
[1300] One, two, three, Karate.
[1301] Blue karate.
[1302] Well, well, no, then we have to, yeah, blue karate and then we go one, two, three, Sash.
[1303] Dang it.
[1304] That's the same.
[1305] It's the same.
[1306] We did it.
I am not going to accept that sash and belt are different.
[1308] Okay.
[1309] A sash is a belt.
[1310] Okay.
And it's also, ding, ding, ding, ding, a big part of NXIVM.
[1312] Oh, shit.
[1313] Yeah, dingle, dingles, motherfuckers.
I've told you about my favorite sound man, Big Ron from Parenthood.
[1315] Joey and I were obsessed with Big Ron.
[1316] It's Big Ron.
[1317] Big Ron was a sound dude.
[1318] He was a boom operator.
And he was about 6'5", a gorgeous black dude, built.
[1320] And it's that guy's job when they say rolling to let everyone know they're rolling.
[1321] And generally, you would say rolling, but he would say, dingles or ding ding ding.
Because often you go on a bell in a sound stage. So there'd be a... Right. They'd be like, rolling, sound. Dingles. And we, Joey and I, lived for hearing Dingles. Oh, I love dingles. We were so obsessed with Big Ron's private life, because he's so gorgeous and charismatic. And we always, anytime he took a vacation, we were like front row seat. He could have told us, like, every hour of the vacation, and we were riveted. He one time, and I don't think he'd mind me telling people this, he in Cabo played a baby grand piano, which he can play proficiently, on a beach, in a speedo.
[1323] Oh, my goodness.
[1324] That's hard to pass up.
[1325] This guy is fucking awesome.
[1326] I couldn't dream of having that confidence, and he totally deserved to have that confidence.
[1327] Oh, that's great.
[1328] Big Ron.
[1329] Dingles.
[1330] Okay, Leah.
Okay, so you mentioned a term for when things in tech get smaller and smaller.
[1332] Oh, right.
[1333] We should definitely have known it, because you know what it's called?
[1334] It's a name.
[1335] Miniaturization.
[1336] Oh, but there's also someone's name associated with the theory, like the Borg principle or the Big Ron.
[1337] Yeah, it's called the Big Ron.
[1338] Miniaturization is the trend to manufacture ever smaller mechanical, optical, and electronic products and devices.
[1339] Examples include miniaturization of mobile phones, computers, and vehicle engine downsizing.
[1340] If anyone else knows what Dax is talking about, you can put it in the comments.
[1341] Eric Richardson does.
[1342] I always rely on him to say this principle.
[1343] Oh, let me text him.
[1344] Okay.
[1345] Okay, let's hope he responds.
[1346] In a timely fashion.
[1347] That's right.
[1348] Okay, moving on.
[1349] Okay, so you stayed in a house in New Hampshire on the lake where on Golden Pond was filmed.
You thought it was Winnipesaukee and she thought Squam Lake.
[1351] Yeah.
[1352] It is Squam Lake.
[1353] Oh, it is.
[1354] Okay.
[1355] We are misled.
Or maybe we were on Squam Lake and I'm just infusing it with Winnipesaukee.
[1357] New Hampshire is beautiful.
[1358] Have you ever been?
[1359] I've never.
been. You didn't go skiing there? I know you were, you were like a fancy skier. It was like, Olympic-level skier, and also a fancy one, like out west skiing. And yeah, we went to some great places. Beautiful, beautiful place. But they have black flies. What's that mean? They, it's a kind of fly. They try to fly up your nose and in your ears. They're the worst. And people there, um, they call it fly dope. They're like, you got to get fly dope in the summer here. And it's a specific type of bug repellent that is for black flies.
[1361] Oh my God.
[1362] But if you don't have your fly dope, it's rough.
[1363] Is it dope because you're putting it up your nose?
[1364] No, I don't know why they call it that.
[1365] Oh.
[1366] This reminds me of a conversation we had the other day because our area of Los Angeles currently is overrun with mosquitoes.
[1367] It's disgusting.
[1368] It's horrible.
[1369] I know other people from other places are like mosquitoes, get over it.
[1370] We all have mosquitoes.
[1371] Yeah.
[1372] But Los Angeles, you move there with the promise that there's going to be no mosquitoes.
[1373] And then even worse, that that promise was real the first 22 years I lived here.
[1374] Exactly.
[1375] And then all of a sudden, the past few summers, there's been mosquitoes and it's climate change.
[1376] Yeah, humidity.
[1377] The taxes in California are outrageous.
[1378] Sales tax, state tax, you know, whatever.
[1379] But I've always been like, you know what, I'm willing to pay it because there's no mosquitoes.
[1380] That's a big part of my rationale.
And now that there are mosquitoes, I'm like, I don't think I should be paying 16% of my income if there's going to be mosquitoes.
[1382] It's really made me rethink it.
[1383] You know what's gross, though?
[1384] It's mainly our part of town.
[1385] I don't know why.
[1386] Oh, really?
[1387] It's like an east side thing.
[1388] I know.
[1389] Do you know also that our area in particular, especially where you just bought your new house?
[1390] Yeah.
[1391] Used to be called Skunk Hill.
[1392] Ew.
Because that's where all the skunks hang in Los Feliz.
[1394] Oh, great.
[1395] Yeah.
[1396] Yep.
[1397] We're in the mosquito epicenter of Skunk Hill.
[1398] Oh, my God.
[1399] Oh, my God.
Are we crying on the yacht app?
Yeah, we're crying on the app.
[1403] Okay, so my point is we were outside.
[1404] There are mosquitoes.
[1405] I was getting bitten because I always get bitten.
[1406] They love your blood.
[1407] They love my blood.
[1408] And I was swatting, but we were having kind of like a serious conversation.
[1409] Oh my God.
[1410] Yeah.
[1411] And then we were saying it'd be so funny in a movie to have like a really serious scene.
[1412] The seminal scene of the movie.
[1413] Yes.
[1414] And everyone's kind of crying, like maybe a funeral or something.
[1415] Or a big divorce announcement.
[1416] Exactly.
[1417] And everyone's just kind of swatting the whole time.
[1418] Yeah, like in between crying and hugging, swatting mosquitoes.
[1419] And it reminded me because I've been watching old seasons of The Bachelor in my downtime.
[1420] And I was watching Pilot Pete's season and no spoilers in case anyone else wants to watch Old Bachelor.
[1421] But there was a big scene.
[1422] It was unexpected.
[1423] It was very emotional.
And they were outside somewhere where there were so many bugs, and they're, like, crying and, like, hitting their knee.
[1425] And they're in the middle of this really serious thing and just swatting away.
And it was hilarious.
[1427] It's so funny.
Yeah, if some real-life thing is happening that you just can't ignore while at the apex of some emotional disaster.
[1429] That's so funny.
[1430] Okay, I think Eric has just responded.
[1431] He has.
[1432] He was very fast.
[1433] Thank you, Eric.
[1434] Moore's principle?
[1435] The word for the chips getting two times as fast every two years is Moore's Law.
[1436] There we go.
[1437] Moore's Law.
[1438] And I guess that is the same as getting smaller and smaller because the same computing power goes into a smaller area.
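For the curious, the doubling Eric describes is easy to put into numbers. A minimal sketch in Python, assuming the textbook two-year doubling period (the function name and the starting count here are made up purely for illustration):

```python
# Moore's Law as a rule of thumb: the count doubles roughly every two years,
# so count(years) = count(0) * 2 ** (years / doubling_period).

def moores_law(initial_count: float, years: float, doubling_period: float = 2.0) -> float:
    """Project a transistor count forward under an assumed doubling period."""
    return initial_count * 2 ** (years / doubling_period)

# Example: a hypothetical chip starting at 1 billion transistors.
for years in (2, 4, 10):
    print(f"after {years:>2} years: {moores_law(1e9, years):,.0f} transistors")
```

Ten years is five doublings, a 32x increase, which is the same exponential curve that lets the same computing power fit into an ever smaller area.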
[1439] Okay.
[1440] Moore's Law.
[1441] That was so identical to every time I try to think of an actor's name.
[1442] And I go, I'm like, okay, well, I'll go to IMDB and I'll go to a movie I know they're in.
[1443] In the moment I'm clicking on the movie, I remember the name.
[1444] Yeah, that'll happen.
[1445] But it's, we still got to give credit to Eric.
[1446] Absolutely.
[1447] It's still his win.
[1448] His win.
[1449] Okay, so she taught us a new word, iterative.
That means relating to or involving iteration, especially of a mathematical or computational process.
[1452] Iteration is the repetition of a process in order to generate a sequence of outcomes.
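To make that definition concrete, a tiny illustrative Python loop (the numbers are arbitrary): each pass through the loop is one iteration, and repeating the same process generates a sequence of outcomes.

```python
# Each pass through this loop is one iteration: the same process
# (halving the value) repeats to generate a sequence of outcomes.
value = 100.0
sequence = []
for _ in range(5):
    value /= 2
    sequence.append(value)

print(sequence)  # [50.0, 25.0, 12.5, 6.25, 3.125]
```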
[1453] You know, if you'd be like, that's one iteration of this.
[1454] One version.
[1455] Doesn't it mean version as well?
[1456] An iteration is like a version.
[1457] Yeah, sure.
[1458] Sure, sure.
[1459] We obviously don't have that much of a handle of what it means.
[1460] We don't.
[1461] I'm not going to be using that word anytime soon.
No, Moore's Law.
[1463] We're not ready.
[1464] GM's levels, what's the highest level?
[1465] I found differing info.
[1466] on this.
[1467] Okay.
[1468] A few places said seven, then one said eight, one said nine, which didn't seem right.
[1469] I believe it was eight when I was there.
[1470] And they may have added a ninth.
[1471] Or taken one away.
[1472] And then there was some distinction that was above the level system for like the CEO and the CFO and the...
[1473] That makes sense.
[1474] You said Facebook spends billions.
Facebook spent 31 billion in 2018, up from 20.4 billion in 2017.
[1476] To supply the platform.
[1477] Right.
[1478] Okay, great.
[1479] So that's their operating cost, basically?
[1480] I go.
[1481] Okay.
[1482] I'm iteration.
[1483] Okay, we got a couple softballs here.
[1484] Let's just stick with Moore's Law and improv games.
[1485] Yeah, we nailed that.
[1486] And we did great.
[1487] Great.
[1488] And we had some ding, ding, ding.
[1489] We had some major ding, ding, ding.
[1490] Especially with that sash.
[1491] That was great.
[1492] That's crazy.
[1493] Mind meld.
[1494] We want to do one more before we go?
[1495] Yeah.
[1496] Okay.
[1497] One, two, three, Christmas.
[1498] Christmas shark.
[1499] Okay.
[1500] One, two, three.
[1501] Okay.
[1502] One, two, three, hooties.
[1503] Oh.
[1504] Okay.
[1505] One, two, three, clothes.
[1506] Oh.
[1507] Sometimes they try to go broad if we're in the same category.
[1508] Wait, so what were we at?
[1509] Shirt and clothes.
[1510] Okay.
One, two, three, closet.
[1512] Dang it.
[1513] Okay.
[1514] I said closet.
[1515] You said pants.
[1516] One, two, three.
[1517] Fold and dresser.
[1518] Fold and dresser.
[1519] One, two, three, drawer.
Drawer and furniture.
[1521] One, two, three, wood.
[1522] Yes.
[1523] Guys, this wasn't edited.
[1524] Oh, my God.
[1525] I was getting real scared for ourselves for a minute.
[1526] Sometimes that happens.
[1527] You leave the station, but then you come back.
[1528] Yeah, yeah.
[1529] It's a fun game.
[1530] I'll teach everyone about contact next week.
[1531] Wonderful.
[1532] Okay, I love you.
[1533] I love you.
[1534] Bye.
[1535] Follow Armchair Expert on the Wondry app, Amazon Music, or wherever you get your podcasts.
[1536] You can listen to every episode of Armchair Expert early and ad free right now by joining Wondry Plus in the Wondry app or on Apple Podcasts.
[1537] Before you go, tell us about yourself by completing a short survey at Wondry .com slash survey.