Armchair Expert with Dax Shepard
[0] Welcome, welcome, welcome to armchair expert.
[1] I'm Dax Shepard.
[2] I'm joined virtually, or rather I'm joining her virtually, Miniature Mouse.
[3] She has an Emmy nomination.
[4] It's so weird to just see you through the screen.
[5] I don't like it.
[6] I can't smell you.
[7] Well, that's good because I haven't showered in a couple days.
[8] Sure.
[9] I understand.
[10] Today we have a really exciting guest, and I would say one of the reasons it's exciting is that I shit the bed on this one about three or four times, I'd say.
[11] In that I thought I understood how certain aspects of cellular reproduction happened, and I was incorrect, but he encouraged me to keep going ahead. So if you like seeing me eat shit, this is the episode.
[12] I don't think we've ever had somebody with a more interesting trajectory into being a world leader in their field. Eric Lander is currently the president and founding director of the Broad Institute at MIT and Harvard. He's a geneticist, a molecular biologist, and a mathematician, and he has played a pioneering role in all aspects of the reading, understanding, and biomedical application of the human genome.
[13] He was a principal leader of the human genome project, and Lander is a professor of biology at MIT and professor of systems biology at Harvard Medical School.
[14] He's won a bunch of honors and awards, the MacArthur Fellowship, the Breakthrough Prize in Life Sciences, the Albany Prize in Medicine and Biological Research.
[15] But most importantly, this guy was just a brilliant mathematician that decided, I don't want to study math the rest of my life, and he landed in the most incredible place.
[16] Wasn't he fascinating?
[17] He was. And also, listening back, I mean, I thought it at the time, but I forgot, he's so vibrant and fun. Like, you can feel his passion for the subject and just kind of life in general. Incredibly playful. I wouldn't doubt it if the students he teaches refer to him as Professor Playful, and if not, I hope they do going forward. They should, yes. So please enjoy Professor Playful, Eric Lander. Wondery Plus subscribers can listen to Armchair Expert early and ad-free right now.
[18] Join Wondery Plus in the Wondery app or on Apple Podcasts.
[19] Or you can listen for free wherever you get your podcasts.
[20] Welcome to our program.
[21] Well, great to be here, sort of, virtually, kind of.
[22] You know, I got to just, right out of the gates, I got to say, I have to learn about people in a rapid manner, often for the program.
[24] And your story is one of the most complex I've ever had to take on and try to synthesize into some kind of linear story.
[25] Well, if you do it, let me know.
[26] It was actually more complex to try to live it and make sense of it.
[27] So good luck to you.
[28] Well, you had this great quote I read that said you live your life forward thinking, yet you retrospectively make sense of your life.
[29] So neither perspective is maybe objective, right?
[31] So it's kind of like you just add it all up at the end and then try to find the through line.
[32] Well, that's the thing.
[33] You have to treat all biography as suspicious because people try to make some through line as if you were living it in that direction, knowing where you were going.
[34] You know, it's not like that.
[35] No, or that you had set a course for yourself at 18 and now here you are.
[36] God knows, not in my case.
[37] Okay, so I'm going to try to rapidly go through this kaleidoscopic life of yours so that we can get to the stuff that's on the frontier.
[39] But first and foremost, you're a kid from Brooklyn.
[40] Yeah, well, actually, you know, growing up in Brooklyn was interesting.
[41] Yeah, especially at the time that you grew up.
[42] And you were also, you were being raised by a single mother from 11 years old on, which is its own unique experience.
[43] So yeah, what was Brooklyn like back then?
[44] It was lively and filthy and exciting.
[45] Well, Brooklyn was not the Brooklyn of today.
[46] What it wasn't was trendy.
[47] There was nothing trendy about Brooklyn.
[48] You know, I was growing up in Brooklyn in the 1960s.
[49] Actually, my dad had been in the hospital since I was five.
[50] He had multiple sclerosis and he died when I was 11.
[51] So I was raised entirely by my mom, me and my brother.
[52] And you know, we didn't have a lot of money back then.
[53] And so she came up with everything free or cheap you could do to expose a kid.
[54] But I mean, like, it was amazing.
[55] So like, number one, we got to stay home for every NASA rocket launch.
[56] Oh, really?
[57] Oh, God, yeah.
[58] She decided, like, it was so much more educational to stay home and watch NASA launch people into space than to whatever we're going to learn in school that day.
[59] So I saw all those launches and things, yeah.
[60] What gave her that conviction?
[61] That seems like, you know, a daring proposition for especially a single mother raising kids on her own.
[62] I think she had a lot of convictions that, you know, kids could learn a lot if you expose them to things.
[64] So, like, she took us an infinite number of times to the Brooklyn Museum, to the Hall of Egyptology, and we saw archaeology.
[65] And, you know, in 1964, 65, there was the World's Fair in New York City.
[66] Yeah.
[67] And we must have gone 14 times to the World's Fair.
[68] And it cost, like, a buck for a kid's ticket and two bucks for a grown-up ticket.
[69] And I would say, like, the exposure to everything at the World's Fair was formative.
[70] This was the era of incredible optimism about what science and technology could do.
[71] And you had all these pavilions by America's great corporations.
[72] First, there was the big Unisphere sculpture.
[73] Was that the one that's made famous in Men in Black?
[74] It ends up getting destroyed in Men in Black when the spaceship hits it.
[75] But when I was there, when I was a kid, it was still intact, before Men in Black destroyed it.
[77] You know, there was a guy who flew over it in a jet pack.
[78] And they told us, yes, yes, yes.
[79] And they told us we were all going to be going to work in jet packs.
[80] And I'm still waiting for my jet pack.
[81] But, you know, so that one didn't quite deliver.
[82] But like there was the Bell Labs picture phone and there was the GE Carousel of Progress.
[83] And as you moved around, they sang this song, you know, it's a great big, beautiful tomorrow.
[84] And, and things like that.
[85] So it was an era of anything was going to be possible that science could do amazing things.
[86] I mean, for God's sakes, the Mets won the World Series in 1969 after essentially being in last place for every season they'd had.
[87] And so there was an amazing sense of possibility that one had as a kid growing up.
[88] You know, obviously being naive at the time, you know, under 10 years old, to the social tensions, the political tensions that were all brewing and things like that.
[90] But, you know, as a kid, it was great.
[91] And the New York City public school systems, you know, we couldn't afford anything else, were just fantastic.
[92] You know, I ended up going to this high school.
[93] It was one of these specialized math science high schools.
[94] It was free.
[95] You took an exam.
[96] You could get in.
[97] It was Stuyvesant High School.
[98] And it changed my life because it opened me up as a lower middle class kid to all sorts of things I'd have never seen otherwise.
[99] You know, I later went to, you know, fancy school, you know, Princeton and Oxford.
[100] But I think the New York City public school system really opened my eyes to things.
[101] I'm so glad you just pointed out the context because you're right.
[102] To be, I don't know, six or seven or eight or nine in a period where we have for the first time ever left planet Earth, everything beyond that is conceivable.
[103] Because what a paradigm breaker that was.
[104] Now, whether the tech was really that big of an advancement or not, but just the theme of it is incredible.
[105] No, it always left me feeling, and also, of course, you know, Star Trek.
[106] And this was the era of we're all going to get along, and the Russians and the Americans, and every race and gender, we're going to be up in space together on the USS Enterprise, you know, going where nobody has gone before.
[107] It was all of a piece growing up, and I think I've always had in the back of my mind, even as I get more realistic about the challenges and the problems and all that, that there's an element of that possibility that we just can't lose because there's something important that as long as we also mix it with some realism, you know, it keeps us going.
[108] If you weren't born when you were born, the current generation takes all that for granted through no fault of their own.
[109] But it doesn't hold the same, you know, burst of imagination that it did when you were a kid.
[110] It's just, yeah, sometimes we go out into space, we return, big deal.
[111] And I do wonder what kind of overall optimism that the kids now are lacking because they haven't experienced a huge breakthrough or a huge paradigm shift where, you know, there's something so exciting happening.
[112] I do wonder what long -term effects it has.
[113] Well, it certainly seems to me that we got more than enough problems right now that are going to need science to help fix them, not exclusively, but I don't see how we get through a lot of things without science.
[114] And I hope we have a generation of people who have that sense of possibility that even when we face really big adversity, you know, we can somehow get through it.
[115] The thing that needs an injection is romanticism.
[116] That seems to be what's a little bit missing in modern, what do they call it, STEM work, you know. Where's the romanticism?
[117] Well, you know, in my own life, living through the human genome project, I got to live through and be a part of the generation that, you know, sort of first thought about this crazy idea that we were going to read out all the human genetic information.
[118] Three billion letters of genetic code.
[119] You know, students today can't imagine how ludicrous that was, how utterly insane that was to be thinking about it.
[120] But back then, if you did a simple linear extrapolation, you know, you'd figure it would take centuries to do that at the rate we were going in the mid-1980s.
[121] And yet people said, we're going to do it and we're going to do it in 15 years.
[122] I think at that point that you guys were cracking that code, and putting it on paper.
[123] At that point, it was like, okay, great, we know the ingredients.
[124] But I actually think now you're talking about some very out there sci -fi ideas of what we can now do with the list of ingredients.
[125] I mean, I got to say the Human Genome Project was tiny compared to the challenges and the excitement and the inspiration that comes afterwards.
[126] So when you think about young people imagining what's going to be possible, at least in the life sciences, I think eyes have been popping since 1950 when the structure of DNA was discovered through figuring out how to clone genes to figure out how to read the whole human genome.
[127] And then all the things that I'd be happy to talk about that are going on in this century and every generation goes so much further and it's so much cooler.
[128] And you always think, oh my God, it's got to have maxed out by now.
[129] But it keeps getting better.
[130] So, you know, whatever romanticism might have been lost with space, because, you know, we went to the moon, we played a round or two of golf, we came back, and that was pretty much it.
[131] You know, and we don't go back a lot anymore.
[132] And we don't even have our own vehicles to get to the space station.
[133] I think whatever that is has been transferred to other frontiers.
[134] The frontiers of biology are just incredible.
[135] Well, I would argue that what's on the table now is so profound that some of the things that are being discussed actually has me thinking, what are the odds?
[136] On a 5-billion-year-old rock, I mean, I'm a member of a species that's only 150,000 years old, that I am going to witness some of these things.
[137] It actually has me suspicious that I am living in a simulation.
[138] Do you ever get suspicious?
[139] because it just seems impossible that we're alive at a time where some of these things are happening.
[140] Well, I mean, it might seem impossible, but who's to say that 50 years from now, 200 years from now, a thousand years from now, there won't be even more amazing things that are happening.
[141] I think pretty much since about 400 years ago when people sort of came up with the idea of science and the scientific revolution, it's just been one revelation after another.
[142] And you might think you're at peak revelation right now.
[143] But, you know, check back in a couple centuries.
[144] Maybe this is going to seem pedestrian by comparison.
[145] Can I tell you the things I wanted to point out about you?
[146] Yeah, yeah, please.
[147] While you were the head of this math team and competing in the Olympiad, a lot of your teammates described you as very charismatic, and that's very interesting to me. And I think that plays a huge role in your success.
[148] I just think that's really relevant in the ability to motivate people, to bring people together.
[149] One of the things that I was curious about, and this is so trivial, but I just, when I saw it, I was curious.
[150] And that is, when you went, you were in an Olympiad, a math Olympiad, or I don't know what you call those, you guys won silver, and I just wondered if you succumbed to the curse of a silver medal.
[151] There's all this new stuff about how people who win silver often have big bouts of depression, that the people who win bronze are happy and that gold's happy, but silver for some reason comes with some baggage.
[152] Oh, oh, God, no, just the opposite.
[153] Oh, good.
[154] This was 1974.
[155] I was on the Stuyvesant High School math team.
[156] You know, that was a big deal.
[157] Stuyvesant High School didn't have big athletic teams, but we had a great math team.
[158] And so, you know, I'd been on it since I was a freshman.
[159] I ended up being the captain of the math team, my senior year, which was like way cool.
[160] But it was my junior year that was the first year the United States participated in the International Mathematical Olympiad, which was a Soviet bloc competition that the U.S. had never been in before.
[161] And, you know, in the middle of the Cold War, I got to say, the United States was very worried about sending a team to East Germany to compete with the Russians and the, and the Hungarians, who were very good mathematicians.
[162] And they were afraid, you know, we were going to embarrass the country.
[163] Uh -huh.
[164] So, you know, we went to this camp to prepare for the Olympics that was held in New Jersey.
[165] And then we flew over to East Germany, went through Checkpoint Charlie.
[166] We didn't have diplomatic relations with East Germany.
[167] at the time, which made it even more exciting.
[168] And we went down to this city, Erfurt, in East Germany for the competition.
[169] And I'll just digress and say, we hit it off amazingly well with the Russians.
[170] Because they were the other superpower.
[171] Yeah.
[172] And it was East Germany, and they knew that they owned the joint, and they couldn't get in trouble.
[173] So, like, we'd go up on top of buildings with them and throw water balloons down into traffic.
[174] because the Russians told us, you know, no way can you get in trouble.
[175] We're here.
[176] And so it's a great time.
[177] And we ended up finishing second to the Russians and ahead of the Hungarians, who were the legendary, you know, mathematicians of Europe.
[178] So we didn't embarrass the United States.
[179] Oh, good.
[180] And we came home.
[181] So I think the important lesson here is the expectations were very low.
[182] Yeah.
[183] And would I be right in guessing that their math team probably had the resources of Russia behind them, as opposed to you guys, who were living normal lives outside of studying math?
[184] Yeah, apart from having, you know, two weeks in New Jersey to prep.
[185] Yeah.
[186] But I don't think the Russians used steroids on their, on their math teams.
[187] It wasn't like Rocky Three.
[188] No, I don't think so.
[189] I don't think so.
[190] They were very good mathematicians.
[191] But, you know, the Russians legendarily had great mathematicians because all it took was a pencil and they didn't have, like, great computers and other technologies.
[192] Yeah, they're chess and math, huh?
[193] That's, again, two low barriers of entry.
[194] Yeah, yeah.
[195] So anyway, so, I mean, I loved math in high school, and I got to say, you know, curiously, given what I do, I hated biology.
[196] Biology was infinitely boring.
[197] It was just memorizing stuff, cat brains and frog anatomy and things.
[198] And there was, like, nothing deeply beautiful and elegant about it.
[199] It didn't obviously seem like that to me as a high school student.
[200] And happily, that all changed later after I ended up getting my PhD in pure mathematics.
[201] So you go to Princeton and you graduate in two years, valedictorian; you also write for the paper.
[202] You then are a Rhodes Scholar.
[203] You go to Oxford.
[204] You get your PhD in two years, which is a record at that point.
[205] And then you exit and you are now faced with a very monastic life.
[206] Is that true?
[207] That's right.
[208] I mean, good mathematics happens in your own head.
[209] It is really monastic.
[210] And I somehow came to peace with the fact that I wasn't a very good monk.
[211] And this wasn't going to work.
[212] Was there any sadness accompanying that?
[213] Because you had dedicated so much time.
[214] It's like you had an identity as a mathematician that was then being challenged by yourself.
[215] Sadness is the least of it.
[216] It's guilt.
[217] You feel guilty because your professors at Princeton and Oxford, you know, mathematics professors, you think your image is that they want you to be a pure mathematician.
[218] Yeah, they've invested in you.
[219] You're failing them in some way.
[220] Yeah.
[221] And it took me maybe, oh, I don't know, 15 years before I managed to have a conversation with my PhD thesis advisor who kind of let me in on the secret that he just wanted me to be happy.
[222] He really didn't care what I did.
[223] And the guilt and responsibility I felt was mine.
[224] Yeah.
[225] And he ended up being like really proud I went off and did other things.
[226] And I think it's really good for people who carry that kind of feeling of guilt and responsibility to know that, you know, if their mentors or their parents are any good at all, they just want them to be happy and enough with the guilt.
[227] It took me a little while to get over it.
[228] Yeah, I am motivated largely by guilt.
[229] Yeah.
[230] It's a powerful force.
[231] Guilt has powered much progress in the world, but in too large a measure, it can be a problem.
[232] Okay, so when you get to this crossroads where you're deciding, you know, it's just not going to be for me, you end up at Harvard, in the business school, for managerial economics.
[234] Yeah, managerial economics, which, again, I imagine you're probably grateful for the job, but also not finding all that sexy.
[235] Yeah, you know, at Oxford, I'd sat in on a couple of economics lectures.
[236] They sounded more worldly than mathematics.
[237] And, you know, being hopelessly naive as one is in one's early 20s, I said, oh, I should go try that.
[238] And somehow managed to get offered a position teaching managerial economics on the faculty of the Harvard Business School.
[240] And the thing is, I've always been interested in lots of different things.
[241] I think the world is, like, infinitely interesting.
[242] Maybe it was growing up in Brooklyn at that time or whatever.
[243] And so I started doing that, but quickly figured out that, like, it didn't make my heartbeat fast.
[244] Yeah.
[245] It wasn't like a passion.
[246] And so I was, again, casting about for, what should I do that is more in the world that makes more difference, is less monastic?
[247] And my brother, who's younger than me by a year and a half and was an MD-PhD studying neurobiology, said, well, you know, you did your PhD on coding theory.
[248] You should study the brain.
[249] The brain has lots of coding in it.
[250] And being even more naive, I took it seriously and started while teaching the business school, randomly sitting in on biology courses, including, you know, neuroanatomy and other things.
[251] And, you know, happily, I had no idea how ridiculous this all was.
[252] But one thing led to another, and I started moonlighting in biology labs and learning genetics just a couple years before everything started getting really interesting.
[253] And do I have it right?
[254] So you start kind of hanging around a worm laboratory.
[255] And at a certain point, there is someone who's looking for someone with a strong mathematical background because what they're trying to do is start tracking down these things that require multiple genes, right?
[256] So not like the Mendel pea. It's much more complicated.
[257] There's maybe, I don't know how many genes are involved in heart disease, but, you know, a lot.
[258] And then that now ups the level of math required.
[259] Right.
[260] You've got it.
[261] And, of course, you're telling this in retrospect, making it sound very logical.
[262] But, you know, what happened was, you know, I was hanging out in fruit fly labs and nematode worm labs doing basic genetics.
[263] And one day after a seminar, by this point I was hanging out at MIT, David Botstein, a yeast geneticist who'd made some really important discovery in human genetics about how you could find the genes for simple genetic diseases, single-gene diseases.
[265] He buttonholes me after the colloquium and he says, I hear you're a mathematician.
[266] And I hated that introduction because usually biologists, you know, had no time for mathematicians.
[267] But Botstein, he knew what he was talking about.
[268] He wanted to talk to somebody who knew genetics and mathematics to ask, how would it be possible to study not these rare genetic diseases, but the common diseases we all get, you know, heart disease, Alzheimer's, things like that, schizophrenia that are much, much more complicated.
[269] And he started with some really bombastic pronouncement of, it wouldn't be possible to do blah, blah, blah, blah.
[270] And I immediately disagreed.
[271] And we started arguing because he's from the Bronx.
[272] I'm from Brooklyn.
[273] And arguing is the form of discourse.
[274] And so we had such a good time.
[275] We decided to get together to argue the next day and the day after that, and I left my worms growing mold in the cold room and started working on human genetics.
[276] And I continue to think, and this is another life lesson, what in the world would have become of me if he hadn't accosted me that day after the seminar?
[277] And so anybody who thinks your life is charted in advance, rather than being like wonderful, lucky accidents, I beg to differ.
[278] Oh, well, we had the pleasure of interviewing Bill Gates recently.
[279] And it's like he just kind of went through the improbable variables that add up to Bill Gates, right?
[280] It's not an aptitude for coding.
[281] That's one-fifth of it at best, you know.
[282] And then something that we talk about a lot, because even within show business, I think my ability to survive has been largely due to my flexibility.
[283] Like, oh, this thing is embracing me and I have interest in it.
[284] Let me walk towards it.
[285] And I think, yeah, some people get a little too fixated on a tiny point in the three dimensions.
[286] They just can't get off of it.
[287] And almost all the interesting jobs and challenges haven't been invented at the time you're a kid.
[288] So, like, what's the point of planning your destination with any precision?
[289] Because the destination isn't there yet.
[290] Well, and let's even talk about David for one second.
[291] Because if you're looking at perhaps identifying, let's say, I don't know, six loci that are going to create heart disease, there's even no point to it at that moment in time, right?
[292] What, you're going to know that those are the five genes?
[293] Yahoo, there's nothing to do to augment genes, right?
[294] And if you'd thought about it, we'd even go back, there was no way to find those genes, there was no way to interpret those genes.
[295] So, you know, what's so, like, amazing about the story of biology was, when David started thinking about this, just a few years before we met after this seminar, you couldn't do any of this stuff.
[296] The whole idea of cloning genes was just getting invented.
[297] And so you know, for a bunch of years, people were figuring out how to clone individual genes.
[298] Then the year after I met David, this idea of a human genome project started circulating as kind of this crazy idea because it was nuts at the time.
[299] And then by 1990, it gets launched as a project.
[300] And I get drawn into that generation that starts doing it.
[301] And by 2001, 2002, like, it's actually gotten done.
[302] It's amazing.
[303] And then you say, well, that's great.
[304] But all we got are the letters of the genome.
[305] Yeah.
[306] Like, you know, we got the letters of a book written in some ancient language, but we don't speak the language.
[307] Yeah, and can I ask a couple of really stupid fundamental questions?
[308] Yeah, go for it.
[309] One being, at that time when they decide to launch the genome project and you're among them, had they already mapped smaller chunks of it?
[310] Did they know some pieces of it?
[311] And then my further question is, how do you observe that?
[312] Is that something that's seen under a microscope or a radio microscope?
[313] I mean, how do you see it?
[314] Well, the idea of reading out the letters of the DNA of your genes, your genome, which is your whole DNA, 3 billion letters, that gets developed in the late 1970s.
[315] And you'd think you might do it under a super-duper powerful microscope, but it turns out you don't.
[316] You do it by chemistry.
[317] And it's kind of really weird.
[318] you take one strand of DNA that might have back in those days a couple hundred letters and you sort of somehow get a handle on the left side of it, I'm being metaphorical.
[319] Yeah.
[320] And you make random breaks wherever there's an A. And you measure those fragments.
[321] And that tells you the distance to the A's.
[322] Then you do something else and you make breaks where there are C's and breaks where there are G's and T's.
[323] Those are the four letters of DNA.
[324] And by measuring the lengths of those fragments in a certain kind of experiment, you can piece out that it goes A, T, G, A, da, da, da, da, da.
[326] But in those prehistoric times in the 1980s, you might do a week's work and get 200 letters.
[327] And that's why it would take centuries to do the whole thing.
[328] And that's why it was nuts to say, no, no, no, we're going to do three billion letters.
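A rough way for readers to picture the fragment-length trick Lander just described: the measured lengths of the fragments broken at a given letter are exactly the positions of that letter, and merging the four sets of positions reconstructs the sequence. The Python sketch below is a hypothetical toy illustration of that idea only, not the actual lab chemistry, and every name in it is invented.

```python
# Toy illustration of the idea above: breaking a strand at each occurrence of a
# letter and measuring the fragment lengths tells you where that letter sits.
# Conceptual sketch only -- not real sequencing chemistry.

def fragment_lengths(strand: str, base: str) -> set[int]:
    """Lengths measured from the left-end 'handle' to each occurrence of `base`."""
    return {i + 1 for i, letter in enumerate(strand) if letter == base}

def reconstruct(strand_length: int, lengths_by_base: dict[str, set[int]]) -> str:
    """Merge the four sets of fragment lengths back into a sequence."""
    position_to_base = {}
    for base, lengths in lengths_by_base.items():
        for length in lengths:
            position_to_base[length] = base
    return "".join(position_to_base[i + 1] for i in range(strand_length))

original = "ATGACCGT"
measured = {b: fragment_lengths(original, b) for b in "ACGT"}   # what the experiment "measures"
assert reconstruct(len(original), measured) == original          # reads back A, T, G, A, ...
```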
[329] And it wasn't even obvious it would be useful.
[330] In 19, you know, 1986, there was the first meeting where it came out into the open that people were really thinking about this crazy idea, sequencing the human genome. And the distinguished scientific journal Nature wrote about it, and the biology editor of Nature says, it is of dubious value, what possible use could come of this if we can't even interpret, like, the couple of genes we've got? So it was not obvious that this thing made any sense. But the young people saw it, and the older people didn't see it.
[332] And that's the story in every decade is the young people have the vision of where this thing is going.
[333] And to the great credit, some of the older people who organized national committees and things and four years later said, yeah, we probably should do this.
[334] And it got going and it drew in a new generation.
[335] And pretty much every five, ten years, you get the same thing coming up of some new possibility where on average people are saying, oh, that's pretty dubious.
[336] Why do we need that?
[337] But there's young people coming and saying, no, no, no, that's going to break it open.
[338] So, like, when we're done with the human genome project around 2001, 2002, you know, it's becoming clear if we want to find the basis of diseases, we don't just need, like, one human genome.
[339] We got to have genetic variation across lots and lots of people, like almost all the genetic variation in the human population, because we can use them as tracers to trace inheritance and map disease genes.
[340] So that goes on for a while.
[341] Stay tuned for more Armchair Expert, if you dare.
[342] We've all been there.
[343] Turning to the internet to self -diagnose our inexplicable pains, debilitating body aches, sudden fevers, and strange rashes.
[344] Though our minds tend to spiral to worst-case scenarios, it's usually nothing.
[345] But for an unlucky few, these unsuspecting symptoms can start the clock ticking on a terrifying medical mystery.
[346] Like the unexplainable death of a retired firefighter, whose body was found at home by his son, except it looked like he had been cremated, or the time when an entire town started jumping from buildings and seeing tigers on their ceilings.
[347] Hey listeners, it's Mr. Ballin here, and I'm here to tell you about my podcast.
[348] It's called Mr. Ballin's Medical Mysteries.
[349] Each terrifying true story will be sure to keep you up at night.
[350] Follow Mr. Ballin's Medical Mysteries wherever you get your podcasts.
[351] Prime members can listen early and ad-free on Amazon Music.
[352] What's up guys?
[353] This is your girl Keke, and my podcast is back with a new season, and let me tell you, it's too good, and I'm diving into the brains of entertainment's best and brightest.
[354] Okay, every episode I bring on a friend and have a real conversation.
[355] And I don't mean just friends.
[356] I mean the likes of Amy Poehler, Kel Mitchell, Vivica Fox, the list goes on.
[357] So follow, watch, and listen to Baby, This Is Keke Palmer on the Wondery app or wherever you get your podcasts.
[359] Well, and can I ask one silly question, which is, so there were three, as I understand it, three really big organizations kind of helping break the genome?
[360] And whose DNA was it?
[361] It's a single individual's DNA?
[362] It's a guy from Buffalo.
[363] Oh, no shit.
[364] Yeah.
[365] Oh.
[366] I can't decide if you're joking.
[367] So we don't know.
[368] So here's the story.
[369] Here's the story.
[370] I can't see if you.
[371] We didn't want this to be like, you know, Dax's DNA.
[372] And, you know, this would be a celebrity project for us.
[373] So there was a lab in Buffalo that was making what's called the DNA libraries, the fragments we were going to read out.
[374] They put an ad in the Buffalo newspaper.
[375] And they got, like, 30, 40 people to show up and, you know, agree, sign informed consents, give some blood.
[376] And they took them all.
[377] They completely destroyed all identifying information so we would never know who it was.
[378] And they pick one at random.
[379] Wow.
[380] So I can tell you it's a guy from Buffalo, and even 13, 14 years later, as we know more about genetic variation, I can tell you it's an African-American guy from Buffalo who signed up, because there's enough of the genetic variations that are more common in the African-American population.
[381] But I can't tell you anything about them.
[382] There is also, isn't it, in the mitochondrial Eve, there's less mutation, right?
[383] And this is how we know that all of us come from Africa?
[384] Well, no. It kind of turns out there's...
[385] Well, you got it almost entirely right.
[386] Okay.
[387] Except there's more mutation in this region of the mitochondrial DNA.
[388] That's this special DNA you get from your mom.
[389] And the point is, because there's enough mutation going on, we can build really good family trees out of all that DNA.
[390] And we can see that everybody traces back to Africa.
[391] Because, you know, it's there in the DNA.
[392] You know, Africa is the cradle of humanity, and it's got more genetic variation than all the folks living outside Africa.
[393] And you can trace these little genetic changes like breadcrumbs along people's paths of migration out of Africa.
[394] Yeah.
[395] Well, I remember learning in anthropology.
[396] The reason race is a terrible scientific distinction is that so often there are people within Africa that may have more genetic similarity with someone from Ireland than they may have with someone from Southern Africa.
[397] Bingo.
[398] That's exactly right.
[399] So when we try to, like, divide people up into groups, it's kind of crazy.
[400] At this particular gene, I'll pick one on chromosome number one, you might be more related to somebody from Africa than they are to somebody else in Africa.
[401] Right.
[402] But another gene, you're probably more related to Monica.
[403] He wishes.
[404] There might be one.
[405] I don't know.
[406] In this case, hard to say.
[407] Whatever the stubborn loci is, that's what we share. That one's right.
[408] So the point is there's so much variation.
[409] And when we talk about the variation, even within any one group, there's more variation within groups than the difference between the averages of those groups.
[410] So where you want to draw boundaries is pretty arbitrary for most purposes.
[411] And that's really hard for people to think about.
[412] But imagine you tried to take New Yorkers, I grew up in New York, and figure out how to classify New Yorkers.
[415] Oh, easy.
[416] You ask the person if they have more than five minutes of opinion on pizza, and you know you're dealing with someone from New York.
[417] That's pretty good.
[418] Oh, no, no, no. That's easy.
[419] That's how you tell New Yorkers from non -New Yorkers.
[420] I got a lot of those.
[421] Within New York, I got millions of New Yorkers.
[422] Should you divide them up by Yankees versus Mets fans?
[423] Should you divide them by the borough they come from?
[424] Should you divide them up by, you know, what jobs they do, what interests they have?
[425] Nobody would doubt there's so many different ways to classify people based on many, many different characteristics.
[426] Yeah.
[427] And it's no different from DNA.
[428] Yeah.
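To put a number on the point a few lines up, that there's more variation within groups than between the averages of those groups, here is a minimal sketch with entirely made-up values; it's not real genetic data, it just shows what that comparison looks like.

```python
import statistics

# Hypothetical numbers, invented only to illustrate "more variation within groups
# than between the averages of those groups." Not real data of any kind.
group_a = [12, 30, 45, 61, 78]   # some trait measured in one arbitrarily drawn group
group_b = [15, 28, 50, 59, 80]   # the same trait in another arbitrarily drawn group

within_a = statistics.pstdev(group_a)                                # spread inside group A
within_b = statistics.pstdev(group_b)                                # spread inside group B
between = abs(statistics.mean(group_a) - statistics.mean(group_b))   # gap between the averages

print(f"within A: {within_a:.1f}, within B: {within_b:.1f}, between means: {between:.1f}")
# With these made-up values the within-group spreads are about 23 while the gap
# between the group averages is about 1, so a boundary drawn between the two
# groups says very little about any individual.
```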
[429] Okay, now, after being a part of this really momentous event in history for science, you came to really enjoy that collaboration, right?
[430] This had never really been done, or at least to my knowledge, this level of collaboration had not really ever been done, had it?
[431] No, that's exactly right.
[432] I think biology before this era was much more people working alone at the bench and doing their experiments and recording their information, I wouldn't even call it data, in pen in their lab notebook.
[433] And this marks the era of a couple things changing.
[434] First, collaborations, large collaborations, not just a few people in a lab, but the whole world working together to get something done was exhilarating.
[435] It was a time when biology became data that it no longer fit in your lab notebook.
[436] Biology institutes in the 1980s, they didn't even imagine they needed provisions for computers.
[437] But by 2000, people were already doing papers, most of the data of which had been generated by other people.
[438] So suddenly there were like a tsunami of data and it kept growing and growing.
[439] And the idea that you should generate data and make it freely available to anybody, even before, before you published your paper.
[440] That was another, you know, huge thing.
[441] So for me, I wish I could claim credit for going into biology because I anticipated it was going to turn into data and information and mathematics, but I can't claim credit for that.
[442] But it was really lucky because as that's progressed, we've gone from, okay, we can read out three billion letters of your DNA, and then we can read out all the genetic variation in the human population, and then we can figure out which genetic variations are associated with, you were talking about these six genes for heart disease.
[444] Well, we figure them out by just finding some genetic variations are more frequent in these people with heart disease.
[445] And actually, it's up past six now.
[446] It's over 300 of those.
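For readers who want the gist of "finding some genetic variations are more frequent in these people with heart disease" in concrete terms, here is a minimal, hypothetical sketch. The counts, names, and cutoff are invented, and real genome-wide association studies rely on much larger samples and formal statistical thresholds rather than a simple frequency gap.

```python
# A crude, hypothetical sketch of the comparison described above: a variant is a
# candidate for follow-up when it shows up noticeably more often in people with a
# disease than in people without it. All names and counts here are invented.

def frequency(carriers: int, group_size: int) -> float:
    """Fraction of a group that carries the variant."""
    return carriers / group_size

def looks_associated(cases: tuple[int, int], controls: tuple[int, int],
                     min_difference: float = 0.05) -> bool:
    """Flag a variant whose case/control frequency gap exceeds a simple cutoff."""
    return frequency(*cases) - frequency(*controls) >= min_difference

# (carriers, group size) for one invented variant
heart_disease_patients = (410, 1000)
healthy_volunteers = (300, 1000)
print(looks_associated(heart_disease_patients, healthy_volunteers))  # True: 0.41 vs 0.30
```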
[447] Oh, wow.
[448] Wow.
[449] Oh, yeah, and 300 for schizophrenia.
[450] We now understand that all of these traits, height, weight, you know, psychiatric diseases, diabetes, they're all influenced by lots of genetic variations.
[451] So when people talk about the gene for something, it's a real oversimplification.
[452] What's really going on, like there's some process in your body.
[453] And it can be tweaked in a lot of different ways.
[454] So when we find all these things, the next big question is, how do they fit together in some pathway?
[455] Yeah.
[456] And so once all that was, you know, once the genes were found, you know, the next generation comes along and says, oh, very good, elders, you found the genes, but you haven't told us how they fit together.
[458] Yeah.
[459] And so they start doing things like, well, we got to read out the genetic code of every cell type in the body, not the DNA.
[460] That's the same in every cell.
[461] Yeah.
[462] But which genes are turned on and off, like how the program is read out in every cell in your body.
[463] And that's still ongoing.
[464] There's this international human cell atlas.
[465] I just want to put, I think I can put it in layman's terms, but there's this great curiosity that if I take up my hair follicle, my whole DNA exists within this hair follicle, yet clearly the hair follicle is much different than my femur, than my ear, than my eyeball.
[467] So why on earth is my hair, the shape and consistency it is, and my toenail is what it is, yet they're working off the exact same ingredient list.
[468] How's the body know how to make a nail versus a femur versus hair?
[469] And you can ask the same question about your laptop.
[470] Your laptop has the same program when you boot it up, but you could be watching a movie, running a spreadsheet, you could be, you know, doing some drawing, you could be buying something on eBay.
[471] Yeah.
[472] All those things happen, despite the same code sitting on your laptop, and that's because the laptop can run different programs.
[473] Yeah.
[474] So can the cells in your body.
[475] When that's a hair follicle or a bone cell or a blood cell, they're running different programs.
[476] So right now, people are just trying to find out what are all the patterns of stuff that's turned on and off in every cell in your body?
[477] Because even though it's the same code, it runs different programs at different times.
[479] And is it too generic to think of it in terms of, so you have this string of billions of ACGT.
[480] And is it right that just some section of it's creating the hair follicle and some other portion of it is creating the toenail or not?
[481] Is the entire strand required to make each individual thing?
[482] So "not" is the right answer.
[483] Okay.
[484] So it turns out that if you hired like a consulting firm, they would probably decide we'll have this portion of your DNA for hair follicles and this portion of your DNA for toenails, but it would never work.
[485] Uh -huh.
[486] Because it takes, like, hundreds of different things to make your hair follicles and toenails and blood cells.
[487] And then it turns out many of those things get reused for different purposes.
[488] Mm -hmm.
[489] Like, they get, you know, some sensing molecules might get repurposed for 10 different purposes in the body.
[490] Yeah.
[491] And why is that?
[492] Well, evolution.
[493] You know, we all started from some very simple cells.
[495] And then they got more and more complicated.
[496] Then they figured out how to do more and more things.
[497] And there were no consultants around to say, it would be really good for you to divide it up and assign a distinct set of genes to process A and another set of genes to process B. No, no, no. It's hacking it as it goes along.
[498] And evolution's trying stuff.
[499] And natural selection is selecting.
[500] And so it's all kind of intertwined.
[501] Well, and to your point, yeah, knowing a chimpanzee has, whatever it is, 99% the same exact strand as we have, and yet we have such different shapes and different everything, right?
[502] That's fascinating in and of itself.
[503] Yeah, that virtually the same ingredients, 99% of the same ingredients, are making such a different species.
[504] Well, yeah, but you've got to be careful not to compliment yourself too much here, because if somebody were to land from another planet and were to compare species on Earth and look at you and the chimp, Yeah.
[505] They'd say, okay, well, yeah, that's one class there, right?
[506] Okay, that's slightly different.
[507] You know, one of them's hairier than the other.
[508] But, you know, if we look at how our organs are laid out, so again, you look at the chimps and they look at us, and we look at our differences.
[509] We even do that with different human groups.
[510] And we ignore all our similarities, which actually runs so much deeper.
[511] Yeah, in fact, Monica and I's favorite thing to say, I'm glad you like that analogy, is we always say, oh, the aliens are liking this.
[513] We imagine the aliens float above Earth and they're like, oh, these little monkeys drive cars.
[515] That's interesting.
[516] And then we witnessed a wedding one time at a hotel and we were like, you know what, the aliens like this.
[517] The monkeys get together and they cry and they join hands.
[518] And that's kind of a sweet thing the monkeys do.
[519] The aliens, they like that.
[520] They like that, yeah.
[521] I just want to say to the aliens who are listening today, I hope you come and validate what I just said.
[522] I thought of a good analogy, or I think is a good analogy, which is I used to get pulled over in the 80s in my car in Nebraska, and I get a speeding ticket, and I had no concern in the world because I would just mail them the payment, and those points would never transfer to Michigan, because there was not a system that was combining all this data, right?
[523] And now we're at a, and I think it's a good way to think of it, is now we have computer systems that are now compiling, you know, local police force with county police, with state police, with FBI, with CIA, and now we're able to synthesize all this information for the first time ever, and now every state talks to each other.
[524] That's kind of what's happened in your lifetime in science, yeah?
[525] Well, it's certainly what's happened with basic scientific information.
[526] We have these databases for sharing all this information about the DNA and the genetic variations and the cells and the other things.
[527] That's amazing.
[528] And there's a great ethic of it.
[529] I got to say it doesn't really happen in medicine so well because the electronic medical records are kind of almost deliberately written in different systems that don't talk to each other.
[530] Because maybe the folks who write the systems aren't so nuts about the idea that you could transfer things between systems because you could hold your customers captive and all that.
[531] Yeah, yeah, yeah.
[532] So it's a little bit of a sore point that some of us have, that there's no excuse for the fact that you can't get your medical records from five or seven different institutions easily integrated, or that the many, many people who want to share their medical records for science have such a hard time doing it.
[534] Yeah, I agree.
[535] It's like I should be given a thumbnail drive and every time, or a, yeah, thumbnail?
[536] Thumb drive.
[537] Thumb drive.
[538] I'm dated now.
[539] Thumb drive.
[540] Yeah, yeah, thumb drive.
[541] And then every time I go to a doctor, I don't have to download them on every single twist and turn of my life that I surely will forget on one day and remember on another day.
[542] We ought to be able to do that.
[543] Look, we're seeing this right now in the pandemic that, you know, it's really hard.
[544] Somebody comes in.
[545] They want to get a test.
[546] How do you connect to their insurer?
[547] How do you do this?
[548] And the idea that we got to protect people's privacy.
[549] We've got to protect their information.
[550] But I think most people want to be able to share medical information to advance science.
[551] Yes.
[552] And it often is frustrating.
[553] You know, we have a project right now that's, you know, with cancer patients.
[554] It's called Count Me In to allow patients to partner with researchers to share their information.
[555] And we know that people with cancer desperately want to share their information.
[557] So I want to agree with you.
[558] There's so much information that's being shared.
[559] You've got to watch out for the points on your license, Dax.
[560] There's information being shared scientifically, true.
[561] But we still got a ways to go to make sure that our medical data systems live up to what patients think they do for them.
[562] Right.
[563] Yes, because that's the last kind of boundary in synthesizing all this great information.
[564] As you say, like, so I'm always fascinated with epidemiological studies that are done in Scandinavia where, like, we're having a debate in this country for three years about whether this vaccine causes autism.
[565] And it's exactly that.
[566] It's a debate.
[567] Who knows?
[568] You go over there and they literally put a command into the computer.
[569] And in five seconds, they go, no, we had this group that had it and this group that didn't.
[570] There's no correlation.
[571] And then it's over.
[572] Many countries have this ability.
[573] The Scandinavians are world leaders.
[574] The Estonians turn out to be remarkably good at this.
[575] But what they have is trust.
[576] It really takes trust.
[577] Yeah.
[578] And we can't forget that the science takes place in a bigger social context.
[579] And if you don't have the trust, it's going to be hard to implement these things.
[580] Yeah.
[581] Well, we saw that we've seen that with COVID.
[582] You know, I don't think there's a clearer illustration of that.
[583] When you're really getting down to it, I think you're dealing with people's just genuine distrust with authority.
[584] Some people, some people not. We're seeing it lived out in real time.
[585] Now, the Broad Institute, I want to keep this momentum going.
[586] I want to be able to collaborate.
[587] I want to be able to share data, and you get this great endowment from the Broads.
[588] And you become the leader of this incredible now institute that's at both Harvard and MIT.
[589] I don't know how that works.
[590] Well, it's nice the way you tell the story that we get this nice endowment where we're, you know, as usual, this is the looking backward in history and making it all sound like it was destined to happen.
[591] Yeah.
[592] What was really happening was around the year 2000.
[593] We had this amazing collaboration going on with all these people.
[594] around Boston, Harvard and MIT, and five different hospitals happening in the shadow of the human genome project.
[595] And nobody's paying much attention at the institutions.
[596] It's kind of just letting us let this go on.
[597] And I don't know, we did a count, like 65 collaborations with no official agreements or anything like that.
[598] Yeah.
[599] And then we realized this human genome project thing is going to be over soon.
[600] And like, how are we going to keep this going?
[601] And so we spent some time scratching our head and saying we were going to have to become legit and create some kind of an institution that kind of crystallized what was happening organically on the ground.
[602] You know, take these facts on the ground that turned it into something.
[603] And so, you know, we thought, well, all we need to do is find some wonderful, amazing philanthropist who's willing to bet on some extremely vague promise that great things would come of this.
[604] And I doubted it was going to work out, but we tried.
[605] And by utter dumb luck, Eli and Edythe Broad from Los Angeles showed up at our genome center one day, and I think it was October of 2002, because Eli was getting some award in Boston, and they had given a small grant, about $100,000, to support something at our genome center for a disease they cared about.
[606] And they showed up on a Saturday, and the place was buzzing.
[607] And they turned out to be so captivated.
[608] They were going to come by for, like, 30 minutes, and they stayed for like two and a half hours.
[609] And then, you know, I thought I was going to have to go convince somebody.
[610] And what happened was Eli calls up, I don't know, a couple months later.
[611] And he says, I hear you want to start an institute.
[612] Would you be willing to come to Los Angeles and talk to me and Edye about it?
[613] And I said, yes, I would.
[614] And the problem was, he said, well, how about next week?
[615] And I, of course, we didn't have anything resembling a proposal or a PowerPoint.
[616] And I said, well, how about the week after that?
[617] Yeah.
[618] And so I came out and we started talking.
[619] And they were amazing because they did not say, we want you to work on X or Y or Z. They saw that there was something magic happening with young people recognizing that turning biology into information was making them stay, you know, on weekends working and sharing data.
[620] And, you know, Eli always protests he never really understood it.
[622] I'm not sure he really didn't understand it as much as he says.
[623] Yeah.
[624] It sounds like he identified a movement.
[625] It sounds like he witnessed a movement and he wanted to help finance a movement.
[626] And that's what it was.
[627] And they gave an initial gift and they since increased the gift and eventually gave an endowment.
[628] I, of course, thought, why don't you give an endowment to start?
[629] And Eli said, why don't I give a ten-year gift first and see how it works out?
[630] Because, you know, Eli, Eli's like that.
[631] But amazingly, four years later, they gave an endowment, and the Broad went from being a temporary experiment to being a permanent institution.
[632] And it's a bit of an unusual institution.
[633] It's connected to MIT and to Harvard and five teaching hospitals.
[634] And all the scientific faculty who are part of the Broad are part of one of those seven institutions and also part of the Broad.
[635] It's not like they leave their home institution.
[636] So it's this connector and amplifier, and it's grown and grown, and the goal is to put jet rockets under amazing people, mostly amazing young people, and let them, like, try doing ambitious things before they find out they're not supposed to think about that.
[637] You must be assembling patents, no doubt, yeah?
[638] Do you assemble like patents and medicine and how, maybe that's none of our business, but does that then get owned by the foundation?
[639] No, no, no. The Broad Foundation has no ownership whatsoever.
[640] They did this as a totally philanthropic thing.
[641] You know, Eli and Edye get pleasure and pride out of this.
[642] When the Broad is working on a scientific project, we might be working with staff scientists at the Broad and a faculty member at MIT and somebody at one of the hospitals.
[643] And if some important discovery gets made, we'll file a patent on it because if you don't, often people will not develop it as a drug.
[644] Without a patent, you can't run a clinical trial because if you spend $100 million on a clinical trial and you don't have a patent, somebody else will just come in and do it.
[645] Yeah.
[646] So in order to serve patients, you file patents, but it's not what we live for.
[647] Right.
[648] We live for getting information out there.
[649] And, you know, we live for creating new methods that people use, but it's responsible to file patents.
[650] But I got to say, the Broads' involvement is totally philanthropic and totally unrestricted.
[651] And that's so rare.
[652] Yeah.
[653] The ability to have funds where they don't tell you do this or do that, but they say do the most important thing you can think of, like that weighs a lot heavier on your head because you say, oh, damn, I really have to do the most important thing I can think of.
[654] Yeah.
[655] And it lets you take risks on young people.
[656] So now you touched on it a little bit, but now with this knowledge of the human genome, now it's starting to find some applications.
[657] So could you walk us through some of the exciting applications that this has led to?
[658] Oh, yeah.
[659] Well, here's one: figure out a gene that plays an important role in a certain cancer.
[660] Yeah.
[661] Pick a cancer.
[662] Some gene is mutated in it.
[663] Find that by looking at a lot of cancer patients and seeing that their tumor contains genetic changes of that gene, but their regular cells don't.
[664] Information at work for you.
[665] And then you say, oh, maybe we should make a drug that inhibits the protein encoded by that gene.
[666] Right.
[667] Or that activates it or something like that.
[668] Well, you know, in the ancient days of the 20th century, most cancer drugs were just poisons.
[669] Right.
[670] They're trying to kill that cell and not your other cells, right?
[671] Exactly.
[672] Whereas now what happens is people try to, in a really molecular way, inhibit this specific protein.
[673] Ideally, one that's mutated only in the cancer cells.
[674] So you get drugs that have pretty minor side effects.
[675] I never want to say no side effects, but pretty minor side effects.
[676] pretty safe.
[677] They're not often injections.
[678] They're pills that you can take.
[679] And there's more than a thousand clinical trials for cancer drugs that come out of understanding which genes, which proteins, and people around the world are doing that now because we kind of have a lookup table for lots of cancers.
[680] And I don't want to, you know, underestimate it's very hard work.
[681] Yeah.
[682] But it's a lot harder feeling around in the dark when you don't know what you're doing.
[683] Yeah.
[684] Same, you know, at other extremes, you know, heart disease, this is one of the early successes that predates the genome project, but, you know, a lot of people take medicines to lower their cholesterol.
[685] That really came from figuring out a particular gene and a particular rare genetic disease.
[686] And then at the other extreme, you take something like schizophrenia.
[687] Yeah.
[688] For the longest time, people had no clue what the biology was.
[689] Well, can I add one fascinating?
[690] Yeah, please.
[691] I remember learning this in psychology 25 years ago, that it's a genetic trait, and that a stressful event in your life, you have a window, right, between, like, I don't know, 17 and 30 or 27, where some traumatic event can turn on that gene, and that when they've identified people who have it hereditarily, maybe they try to protect them in that window. That in and of itself, to me, was one of the most mysterious, interesting concepts.
[692] Is that all been disproven, or is that, is there a different understanding of it nowadays? Or... Well, I don't quite know what gene that is or what you were told. Okay. Um, Monica's gonna be cutting all that out, I can promise. I'm gonna leave it in. But people can carry it, right, but not express it phenotypically? Oh, well, but that's true for lots of things. Uh-huh. Right. There are a lot of people who inherit a gene for breast cancer, but not all of them will actually develop breast cancer. Uh-huh. So that's the thing about a lot of these diseases, is they're not deterministic.
[693] They can greatly increase your risk.
[694] Sometimes that risk might also depend on environment, diet, or maybe stress.
[695] And so the idea that there's like a unitary cause really only applies to a very limited number of rare single gene diseases.
[696] But what I will say about schizophrenia is we're learning a lot. Well, maybe 10 years ago, there was no gold standard gene that we could associate with schizophrenia whatsoever.
[698] People probably told you about guesses that they had and maybe they were plausible guesses, but we're pretty hard -nosed in the genetics biz.
[699] And we have a standard for like, is this a real associated thing that's not by chance?
[700] And that only started happening in the last decade.
[701] And now there's more than 300 of those.
[702] And like some of them point to amazing things.
[703] Like one is a gene that affects the pruning of synapses in your brain.
[704] And it turns out that that happens a lot in late adolescence, early adulthood.
[705] The teenage brain turns into an adult brain because of a lot of this pruning.
[706] I mean, you think about parenthood.
[707] There's a lot of pruning that goes on at that time.
[708] And a couple of the genes that were found amongst the earliest are ones that affect the pruning of synapses and they're a little overactive.
[709] So does that suggest that maybe overactive pruning and maybe, maybe, maybe, someday medications that turned that down a little bit.
[710] Now again, I want to be clear not to oversimplify.
[711] I never oversimplify this sort of stuff.
[712] But the point is to have an insight like, huh, pruning, wouldn't have thought of that.
[713] And now here's some handles.
[714] And there's like seven or eight such insights that are emerging in each of many diseases.
[715] So it's like somebody went in and turned on the lights.
[716] And we still don't understand it all, but we're beginning to see the connections.
[717] Yeah, and then people say, well, what do we do about it?
[718] Sometimes it could make a drug.
[719] And then, occasionally, people are saying more and more, maybe even tweak the genes.
[720] So that's becoming part of the possibilities for the first time.
[721] I'm going to take a stab again.
[722] I just fumbled one of them, but I'm going to take another stab.
[723] I'm not losing my confidence.
[724] So I'm going to try really quickly to just say in the most layman's terms, and again, this is 25-year-old biology memory.
[725] But you're doing pretty good so far.
[726] Okay, thank you.
[727] So your DNA exists.
[728] And what happens is on top of that DNA, each of those four letters can attract an opposite letter.
[729] And that is your RNA, right?
[730] And your RNA goes on top of your DNA and then it breaks off and it goes to wherever it goes to create the opposite of it, which becomes a new strand of DNA.
[731] Is that semi what happens between RNA and DNA for replication for mitosis?
[732] Not so much.
[733] Oh, fuck!
[734] Okay, all right, hit me. Because all I'm trying to set up is that there's a couple different approaches to this.
[735] You could either try to change that DNA or you could try to change the proteins that attach to it and make more.
[736] Oh, no, no. Look, it's good you're bringing this up.
[737] Okay, good.
[738] And because for the last 25 years, I've taught introductory biology at MIT.
[739] Yes.
[740] Having never taken it yourself, which I love.
[741] Having never taken it.
[742] It was so boring.
[743] I would have never taken it.
[744] But I have this, like, moral obligation to slightly, slightly fix that explanation.
[745] Please do.
[746] Let's do it.
[747] All right.
[748] So the DNA is a double -stranded molecule, this famous double helix.
[749] And you're absolutely right that on one strand, you have a letter, and the opposite strand has the matching letter.
[750] A's match with T's, G's match with C's.
[751] That's cool.
[752] If you want to just copy your DNA, it's dead easy.
[753] Unravel the two strands.
[754] and each serves as the template for the other.
[755] If on one strand there was an A, the cell has an enzyme that puts in the T. If there was a G, it puts in a C. So the DNA, one double helix when you unwind it, easily makes two double helixes.
[756] So that's not bad.
[757] Okay.
[758] That's replication.
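To make the base-pairing and copying rule he just described concrete, here is a toy Python sketch. The six-letter strand and the function names are invented for illustration, and it ignores real-world details like the two strands running in opposite directions and the enzymes that do the work.

```python
# Toy illustration of the pairing rule: A pairs with T, G pairs with C,
# so each strand fully determines its partner.
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complementary_strand(strand: str) -> str:
    """Return the strand that pairs with the given one, letter by letter."""
    return "".join(COMPLEMENT[base] for base in strand.upper())

def replicate(strand: str) -> tuple[tuple[str, str], tuple[str, str]]:
    """'Unzip' a double helix and rebuild each half into a full copy."""
    partner = complementary_strand(strand)
    # Each old strand serves as the template for a brand-new partner strand,
    # so one double helix becomes two identical ones.
    return (strand, complementary_strand(strand)), (partner, complementary_strand(partner))

if __name__ == "__main__":
    copy_one, copy_two = replicate("ATGCGT")   # made-up six-letter strand
    print(copy_one)  # ('ATGCGT', 'TACGCA')
    print(copy_two)  # ('TACGCA', 'ATGCGT')
```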
[759] Okay.
[760] Now what you were talking about with the RNA comes next.
[761] If the DNA is just getting copied and sitting there, it's like your hard drive.
[762] It's not very interesting.
[763] It's not reading out a program.
[764] Certain parts of your DNA encode instructions for making proteins.
[765] Okay.
[766] Those parts are called genes.
[767] So you have about 20,000 genes, and there are regions that encode instructions.
[768] And what happens is your liver cells, maybe separately your immune cells, will turn to certain genes and start what's called transcribing them into RNA.
[769] They unzip the DNA strands.
[770] They pick one of those strands, and they copy it into this temporary messenger molecule, RNA, the little yellow sticky notes of the cell.
[772] Uh -huh.
[773] And they just copy it off.
[774] And they say, okay, here's some instructions for making important protein in your liver.
[775] And it gets shipped out to the factory in the cell.
[776] And that factory turns those instructions into a protein.
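A minimal sketch of the transcription step he walks through, assuming a made-up nine-letter template strand: the cell reads one DNA strand and writes a temporary RNA copy, pairing U opposite the template's A's since RNA has no T. Real genes are far longer, and the step that turns the RNA into protein is not shown.

```python
# Toy sketch of transcription: the template DNA strand is copied into
# messenger RNA, with U pairing opposite A (RNA has no T).
RNA_PAIR = {"A": "U", "T": "A", "G": "C", "C": "G"}

def transcribe(template_strand: str) -> str:
    """Copy a DNA template strand into a messenger RNA string."""
    return "".join(RNA_PAIR[base] for base in template_strand.upper())

if __name__ == "__main__":
    # Hypothetical nine-letter "gene"; real ones run hundreds to thousands of letters.
    print(transcribe("TACCTTGGA"))  # AUGGAACCU
```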
[777] This is great.
[778] I'm so glad you're breaking it down this way.
[779] Yeah.
[780] Yeah, yeah.
[781] Okay.
[782] And it's like, this is the secret of life.
[783] Every organism on this planet does it.
[784] This was worked out like 3.7 billion years ago
[785] and everything past that point are small improvements.
[786] Uh -huh.
[787] And so that's how it happens.
[788] So you've got 20,000 genes.
[789] And when we talk about what cells are doing, it's which ones are turned on and off.
[790] When we talk about proteins, hemoglobin, collagen, the keratin in your hair, every one of them has instructions written down on the hard drive of the DNA.
[791] Okay, great.
[792] That's it.
[793] So when we're trying to impact the outcome of this process, we have a couple options.
[794] Yeah?
[795] Maybe three.
[796] You could affect the DNA, the RNA, or the protein that gets made.
[797] Stay tuned for more armchair expert, if you dare.
[798] The growth in the immunology, immune, immune?
[799] Ah, yes.
[800] Immuno-oncology.
[801] Immuno-oncology, that is specifically kind of trying to tackle the protein side of this equation, yeah?
[802] Yeah, that's right.
[803] Because what's happening is you're getting the immune system often to look for, you know, rogue proteins that shouldn't be there.
[804] In fact, almost all of medicine has been, you know, whether it's aspirin or any other drug you think about, for the longest time, it's mostly been little molecules binding to proteins to change how they function.
[805] That's pretty much how all the business was done pretty much until the last 20 years.
[806] Only in the last 20 years have people first found ways to actually affect the RNA message.
[807] Uh -huh.
[808] RNA interference, and then to go back to the DNA itself and actually change the letters of the DNA.
[809] And now you have this full -service menu.
[810] And you could think about, should I affect the protein?
[811] Should I affect the RNA?
[812] Should I affect the DNA?
[813] Yeah.
[814] And this last one is this whole revolution in genome editing, of which the best exponent is CRISPR.
[815] Right.
[816] And so I would do it, but I failed twice.
[817] Should I go for strike number three?
[818] I did listen.
[819] No, no, no. No, this is so good.
[820] Because, like, you know, when you teach, you realize when you just tell people like the right answer, they don't learn.
[821] Yeah.
[822] When they actually hear somebody trying to explain it and we get to talk about how that's a little different.
[823] Yeah.
[824] So go for it.
[825] Okay.
[826] This is great.
[827] So here's my understanding of CRISPR.
[828] I feel like, if memory serves me, I listened to this podcast literally five or six years ago where they started noticing that there was bacteria or virus, I'm not sure which, that they were finally finding these bacteria that had incorporated pieces of our DNA into them so that they would not be identified as pathogens and then destroyed by our immune system.
[830] And they thought, well, how is it getting chunks of our DNA into itself?
[831] How many errors have I made so far?
[832] So, so.
[833] Oh, yeah, I can tell by his body language.
[834] So many of the correct words are there.
[835] The building blocks, the A's and T's and G's and C's, are there.
[836] So I'm going to slightly rearrange him a little bit.
[837] Okay, please do.
[838] Please do.
[839] So the thing is, actually about 25 years ago now, people began observing these strange systems in bacteria, weird regions of the DNA of bacteria.
[840] They had these funny interspersed repeats with some spacers.
[841] And it took, there's this guy in Spain, brilliant guy, and it took him 10 years, and he figured out those spacers were little pieces of DNA that had been grabbed from elsewhere.
[842] So far, so good.
[843] Yes, I like it.
[844] You're right on, except it wasn't grabbed from humans.
[845] Okay.
[846] And it wasn't grabbed to avoid our immune system.
[847] It turns out it was grabbed from viruses that infect the bacteria.
[848] Oh.
[849] It was right, right, right.
[850] It's the bacteria's own kind of immune system.
[851] So the bacteria, it's a tough job being a bacteria, right?
[852] You're down there, you're competing with other bacteria, that's tough.
[853] You know, it's not just nine to five.
[854] They're at this 24 hours a day.
[855] And there's billions, right?
[856] And you're good.
[857] Billions, it's really tough.
[858] And then you get infected by a virus and maybe you die.
[859] But if you didn't, you grab a piece of that viral DNA.
[860] You stick it in your own kind of immune system, a thing called the CRISPR system, and then you're no fool.
[861] You keep making that instruction into RNA.
[862] You copy your little DNA reminder of that virus into RNA.
[863] And then you hand that RNA to a special surveillance protein.
[864] And the surveillance protein from that day forth is running around saying, do I ever see anything that matches that?
[865] Uh -huh.
[866] That is how the bacteria's immune system works.
[867] And if it ever sees it, it says, actually, bacteria don't say anything, but I anthropomorphize.
[868] Yeah, you must.
[869] You must, because how else can you do it?
[870] Okay, good.
[871] So, you know, it's got this instruction for a virus that infected it two years ago.
[872] And it says, I see the matching sequence.
[873] And it cuts.
[874] And it's a bacterial immune system.
[875] And so it took a very long time for people to discover that it exists, figure out how it works, figure out that it cuts, figure out how it cuts, and then it turns out that all that stuff that has nothing to do with you, you can remarkably port it, with a few tricks, from the bacteria, and if you do the right tricks, make it work in human cells.
[876] And it turns out then that those human cells can be instructed, if you give them little RNA instructions, to go find the matching spot in your own DNA and cut it.
[877] So it turns out you can actually direct a protein to like one specific place in your three billion letters of DNA.
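A rough Python sketch of the surveillance step being described: a stored spacer, the RNA "reminder" of a past virus, is scanned against incoming DNA, and a cut site is reported wherever it matches. The sequences and function name here are hypothetical, and real CRISPR systems also need a short neighboring PAM motif and tolerate a few mismatches, which this toy version ignores.

```python
# Toy model of guided search: report every position where the stored spacer
# exactly matches the DNA being scanned. Real Cas proteins also check for a
# PAM motif next to the match and allow some mismatches; this sketch does not.

def find_cut_sites(dna: str, spacer: str) -> list[int]:
    """Return all positions where the spacer matches the DNA exactly."""
    hits, start = [], dna.find(spacer)
    while start != -1:
        hits.append(start)
        start = dna.find(spacer, start + 1)
    return hits

if __name__ == "__main__":
    viral_reminder = "GATTACAGATTACA"            # hypothetical 14-letter spacer
    incoming_dna = "CCGA" + viral_reminder + "TTGC"
    print(find_cut_sites(incoming_dna, viral_reminder))  # [4] -> cut here
```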
[878] And I got to say, in the 1980s, in the 1990s, and in the early 2000s, when, you know, lay friends would say, oh, why don't you just go in and fix the DNA sequence, you would, you know, you'd try not to be condescending, but you'd pat them on the head and say, oh, you know, it's really not possible to do that.
[879] But it turns out it is possible to do it.
[880] And bacteria figured this out a billion years ago.
[881] And we figured out how to learn from them.
[883] Okay, so if we gave like a hard example, let's say like sickle cell anemia, it's not a super complicated genetic disorder, right?
[884] Well, it's a single letter change.
[885] I mean, medically it's complicated, but from the DNA point of view, it's one letter.
[886] Okay, so you could identify that and then you could send in the CRISPR scissors to potentially cut out that specific gene, and then if memory serves me, sickle cell anemia is like a codominant gene, right?
[887] So some of your blood cells are not sickled and some are?
[888] Well, it turns out, if you have sickle cell anemia, you know, you get two copies of every gene.
[889] Yeah.
[890] One copy might have the mutation.
[891] One copy has the non-mutated version.
[892] Okay.
[893] And so the mutation will give rise to this sickled protein.
[894] Yeah, you could identify it, and then you could go in there, and you could virtually cut out that mutation, and then the body would no longer create sickled cells.
[895] Is that an example of how it could work?
[896] That is an example of how it could work. What you'd really like to do is actually not just cut it, but repair it.
[897] And it turns out you could also send new instructions to be used as a template to fix it up.
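To picture the cut-and-repair idea in the simplest possible terms, here is a toy find-and-replace sketch. The sequences below are invented stand-ins; in the real sickle cell case the change is a single letter in the beta-globin gene, and the actual repair is done by the cell's own DNA-repair machinery guided by a supplied template, not by a string substitution.

```python
# Toy sketch: locate the stretch carrying a one-letter mutation and rewrite it
# from a healthy template. All sequences are made up for illustration.

def repair_with_template(genome: str, mutated_region: str, healthy_region: str) -> str:
    """Replace the first occurrence of the mutated region with the healthy template."""
    if mutated_region not in genome:
        return genome  # nothing to repair
    return genome.replace(mutated_region, healthy_region, 1)

if __name__ == "__main__":
    genome = "TTTGAGGTGAAA"   # hypothetical stretch around the mutation
    mutated = "GAGGTG"        # carries the single wrong letter (T)
    healthy = "GAGGAG"        # template with the correct letter (A)
    print(repair_with_template(genome, mutated, healthy))  # TTTGAGGAGAAA
```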
[898] This is the part of CRISPR that seems impossible to me. So the cutting out, the whole thing seems impossible.
[899] That's true.
[900] But the particular part that seems absolutely impossible is I understand cutting something out.
[901] But then my understanding is you can actually just inject this new protein into the cell or something into the cell.
[902] And that your own DNA will fill in that hole you've just cut out and now incorporate.
[903] it into itself and start replicating throughout your body.
[904] Yeah, so it seems amazing to you that your body has the ability to take a piece of broken DNA and repair it.
[905] Yeah.
[906] But it turns out your body has got broken DNA fragments all the time, from sunlight, other things.
[907] You have evolved, your ancestors, in fact, for hundreds of millions, in fact, billions of years.
[908] Organisms have had to deal with the problem of repairing broken DNA, and it turns out they have systems to do it.
[910] Almost nothing we're talking about was conceived for the first time by human scientists.
[911] We largely sit at the feet of bacteria and learn from them, although for truth in advertising, you know, bacteria don't have feet.
[912] But we learned from, you know, we learned from bacteria.
[913] Yeah.
[914] Okay, so then immediately your imagination can fill in the blanks of all these different things you can conceivably think of, find, and replace.
[915] Now, if I remember correctly, Not only can it do that, but you can also implant a bit of CRISPR that will then, for the future going on, fix all future genes.
[916] Isn't that a part of CRISPR?
[917] It can be a permanent change for the rest of the genealogical tree of that individual.
[918] Yeah.
[919] Okay.
[920] So now it's an important distinction you're making, and here you're spot on.
[921] Okay.
[922] So we got change.
[923] We can ask where do you want to make a change?
[924] Suppose you inherited a dominant form of blindness.
[925] Maybe we want to inject a virus that will cut the gene for the protein that's causing that blindness.
[926] We'd inject that into your eye and it would get into the eye cells, the retinal cells, and it would cut that gene.
[927] That wouldn't get passed on to your kids because it's just in your eye.
[928] Right.
[929] Same would be true if we wanted to, you know, make a change to a liver disease, a familiar, hyper cholesterolemia, or a muscle disease, or you name it.
[930] None of that gets passed on.
[931] The only things that get passed on are in the cells we call the germ line, sperm and eggs.
[932] Uh -huh.
[933] So, if you were to take a newly fertilized embryo and try to do editing on it, those edits would get inherited by all the cells.
[934] Uh -huh.
[935] Uh -huh.
[936] Including the sperm or eggs.
[937] of that person.
[938] And that would be a way to make what we'd call heritable human genome editing, as opposed to most of what we're talking about in medicine, which is non-heritable.
[939] Yeah.
[940] You got it.
[941] That's the distinction.
[942] Now we're into very juicy philosophical terrain because it's one thing for an autonomous person to make the decision, I don't want to be blind anymore, great.
[943] But to make decisions that will ultimately impact every future generation of you, it just becomes an interesting ethical thought.
[944] Also, there are many genes, right, that seem destructive today that perhaps in the past were useful, in the future may be useful, right?
[945] We don't have a full knowledge of when they're useful and when they're not.
[946] We're just seeing that in this day and time, this is very destructive.
[947] Yeah, this is a subject that has been much on my mind for quite some time because many of my colleagues at the Broad Institute have been involved in developing CRISPR.
[948] I should say, CRISPR has been an ensemble act involving many scientists around the world over these 25 years.
[949] But some of the very important demonstrations of editing in cells and things, it's come out of my colleagues at the Broad, and they've done many improvements to it.
[950] So I think a lot about that, and they're very thoughtful about it.
[951] But also, I ended up on this international commission on heritable human genome editing that has been meeting for the last, I don't know, two years, feels like two years, and just issued its report a couple of weeks ago, because we got asked these questions: what should be done?
[953] And, you know, the first thing is, scientifically, we are far from being able to do this accurately enough to make this responsible.
[954] So that was like message number one.
[955] Do not do this right now because it would not be responsible.
[956] We can't accurately control this editing.
[957] We can't do all.
[958] And we laid out what would need to be done.
[959] But then it raises a lot more questions beyond it.
[960] Would we know what the effect is if we make a change?
[961] So, you know, for some very terrible, rare genetic disease, we'll probably know, we do know the answer for some of those.
[962] But I've heard much more ambitious suggestions made by people who kind of want to, quote, I'm using this in air quotes, improve the human race.
[963] Yeah, yeah, a new eugenics.
[964] Well, that's exactly right.
[965] So we have this tension: on the one hand, there are people who really want to do something for a handful of couples who really can't otherwise have a kid who wouldn't have the genetic disease.
[966] That's very rare.
[967] In most cases, where there are couples who are at risk, they can easily prevent transmitting the disease by just doing in vitro fertilization, genotyping the embryo, and making sure that they only implant the ones that don't carry it.
[968] Right.
[969] But there are some who can never, because they have a double dose.
[970] And so on the one hand, you know, you want to take very seriously that there are some couples like that.
[971] On the other hand, there are people who unabashedly have, you know, eugenic ambitions.
[972] Yeah, they want Arnold Schwarzenegger as a son with, yeah, Bill Gates' aptitude for computers or something.
[973] There was a Finnish seven-time Olympic gold medalist for endurance sports who had a certain change to the gene that made erythropoietin, the hormone that stimulates red blood cells.
[974] Oh.
[975] And it's like natural doping.
[976] Right.
[977] That's the EPO doping in cycling, right?
[978] Well, one person got a mutation that naturally turned on his own EPO.
[979] Seven Olympic gold medals, you know, more power to him.
[980] But maybe some parents says, I want to go do that.
[981] And then I had a colleague who actually said, you know, cruise ship viruses are awful.
[982] They're called norovirus.
[983] Either of you ever have a cruise ship virus? But they're horrible.
[984] Well, we know that they use a certain protein to get into cells.
[985] Let's just mutate that protein, and then you'll be immune from cruise ship viruses.
[986] Yeah.
[987] So I went back to the Broad Institute and talked to some of my colleagues who work on all sorts of human genetic things, and we looked it up.
[988] And yeah, that's absolutely right.
[989] It probably would do that.
[990] It would also dramatically increase the number of cases of inflammatory bowel disease and colon cancer.
[991] because there's not so many free lunches in the genome.
[992] You change one thing.
[993] Yeah.
[994] You know, something else goes wrong.
[995] Well, you have an entire system built on the back of all this stuff, right?
[996] Yeah, let's just rip out this transistor.
[997] What could possibly go wrong?
[998] Well, and like, we're just learning what happens in the microbiome.
[999] Like, that's entirely new.
[1000] 25 years ago, you would just think, oh, microbes in your gut, those have got to be bad.
[1001] Now we're learning, oh, you don't have the right one.
[1002] You might have obesity.
[1003] You don't have the right one.
[1004] You might have mental health issues.
[1005] You know, yeah, so much is so unknown still.
[1006] Well, you know, science has to bring humility to things.
[1007] Like, it's amazing.
[1008] We've done all sorts of powerful things, read the genome, all this stuff.
[1009] I love it.
[1010] I'm proud of it.
[1011] Yeah.
[1012] You know, what this generation has done.
[1013] But science has to bring a lot of humility that there's tons of stuff we don't know.
[1014] We are still reading this, you know, three-billion-year-old text.
[1015] And we've been reading it for a decade or two; we're kindergartners reading the text.
[1016] and we think like we understand all its subtle meaning.
[1017] So humility has got to be a part of every bit of science, and how you balance amazing opportunity and the duty to serve patients and the duty to move forward with the right kind of humility is an important thing.
[1018] Scientists have to hold together.
[1019] And now here's where it gets really complex geopolitically, right?
[1020] Because we may have this council that you're a member of that's now issued a statement on where you think we're at with the ethics of CRISPR, but country Y is ambitious, and country Y wants to be a leader in medical technology and attract patients from around the world and build it and invest in it as an industry.
[1021] They're heavily incentivized to fast track this stuff.
[1022] What levers do we have in place that we can prevent that or are there none?
[1023] And then deeper than that, this weirdly now starts getting into kind of a national security element to it.
[1024] It's very complex.
[1025] Right, the many tentacles that can grow out of all this. Well, it is complex. I think it's not like we have no levers. Don't underestimate moral opprobrium, that an international commission comes together, that the whole scientific community comes together and says, no, this is not right at this time, in this way. And the way this report got written, I got to say, they were really thoughtful people on this commission, and we argued and debated and wrote and tried. The way this got written was we said there should be thresholds.
[1026] You can go up to this threshold.
[1027] But before you go to the next threshold, the world has to come together and discuss it again.
[1028] And then before you go to the next one.
[1029] And so we recognize there's no way you can pass laws that restrict countries from doing what they're going to do.
[1030] But there are norms.
[1031] Now, you know, we see norm breaking going on, but there's still norms that govern a lot of things.
[1032] When one scientist in China surreptitiously did apparently edit two baby girls in China, there was a firestorm of criticism, and he ended up serving three, is serving three years in jail.
[1033] Oh, really?
[1034] So, oh yeah.
[1035] Well, it was done in ways that violated all sorts of regulations.
[1036] It was a really dumb gene to edit in the way it was done.
[1037] It was done without proper approvals in China.
[1038] Yeah.
[1039] And so China threw the book at him.
[1040] Can I ask really quickly what happened to the embryos?
[1041] Did they, were they implanted?
[1042] Is this a human that's, on planet Earth?
[1043] He reported that two baby girls were born.
[1044] Oh, really?
[1045] No information beyond that was disclosed other than the supposed genetic sequence.
[1046] What did he change?
[1047] Well, it's really interesting.
[1048] It was a gene that encodes the receptor for the AIDS virus for HIV.
[1049] And his idea was, and there's evidence that certain mutations, if you have them and you have a double dose of them, would help protect you from infection by AIDS.
[1050] Why you would do this in baby girls born now, when there are lots of ways to prevent it, you know, there might be a vaccine by then, whatever, and in what way did he convince the parents to do this?
[1051] It was a very unethical choice.
[1052] So like everything that could be wrong about what was done was wrong, including the choice of the target.
[1053] But I'm not given up yet on norms and on moral opprobrium for doing things wrong.
[1054] And so the commission was very clear about what you shouldn't be doing.
[1055] And we're not going to be able to tell people what they can and can't do, but it was more than a statement.
[1056] It's a lengthy report that really tries to lay this out, both the science, and it makes suggestions for governance.
[1057] And, you know, look, it's part of what we were talking about at the beginning of our conversation.
[1058] I think we got a lot of challenges ahead, health challenges, climate change challenges.
[1059] with energy, you know, running cities, all sorts of things, feeding a planet, a lot of that is going to take science.
[1060] Not the only answer.
[1061] We're going to need a lot of other pieces beyond science, but science is going to be so much a part of it and figuring out how science can play its proper role in the world and the debate over, you know, what that role is.
[1062] But, you know, I go back to, you know, the Matt Damon manifesto.
[1063] Oh, you got Monica's attention.
[1064] He's a homie here in Cambridge.
[1065] Yes.
[1066] He went to Cambridge Rindge and Latin School.
[1067] And when I first moved to Cambridge, our first apartment was facing, you know, his high school.
[1068] Oh, my gosh.
[1069] So, you know, Matt Damon, I think, makes the right observation when he's on Mars and he's left behind on Mars and he has nobody to reach.
[1070] And he's got to survive until they can send somebody back.
[1071] And, you know, it is, it is the observation that I'm going to have to science the shit out of this.
[1073] Best line in the movie.
[1074] Best line in the movie.
[1075] And I think that applies to where we are today.
[1076] It's not the only thing we have to do.
[1077] We got issues about equity and justice.
[1078] We got a lot of things that science alone can't solve.
[1079] But I don't see how we don't take advantage of the opportunities and how we don't solve our problems by bringing science to bear on all these things.
[1080] So this generation that's grown up right now, I hope they get the same inspiration I got from growing up when I did.
[1081] Yeah.
[1082] But the difference is I grew up in a world where, you know, science was an unalloyed joy.
[1083] Everything was great about it.
[1084] It was going to produce nothing but a wonderful future.
[1085] It's got a little more complicated.
[1086] Science really still can produce amazing futures, but it can also produce things that go off the rails if we're not careful.
[1087] And so scientists have to stand their ground on science.
[1088] They have to say, we know how we figure out what's right and wrong, what's true and what's not.
[1089] And that's not up for debate.
[1090] If you want to debate it, you have to bring the goods.
[1091] You've got to bring evidence.
[1092] But then there's a question of how do you make wise decisions about how to use any of these things?
[1093] And I think, like with this commission and many other things, scientists then take their place as parents and as citizens with a whole world.
[1094] And, you know, climate change frustrates me greatly.
[1095] I don't want to hear politicians saying I don't believe in climate change.
[1096] I don't respect that.
[1097] I would respect: of course I believe in climate change, but I don't believe we should do anything yet about it because X, Y, Z. I can disagree violently; I think you're wrong that we don't have to do something about it right now.
[1098] But that's a respectable position.
[1099] If we don't agree on the reality that we're living in, there's really no place to go.
[1100] So we've got to get these both right.
[1101] There's a question of finding and agreeing on reality.
[1102] and there's a set of ground rules that have served us well for 400 years, which is we get evidence, we admit when we're wrong.
[1103] You know, politics, people don't like admitting when they're wrong sometimes.
[1104] But, you know, scientists have to admit where they're wrong.
[1105] And you might think folks in politics say, oh, that's a sign of weakness.
[1106] No, no, that's a sign of strength.
[1107] You only get to truth if you're willing to admit when you're wrong.
[1108] So there's this battle over what's truth.
[1109] And then there's this battle over what we should do.
[1110] And I think we've got them all munged together right now.
[1111] And I think we have to like really vigorously defend that science is going to be really important.
[1112] But as scientists, we also have to bring a humility to how that stuff gets used in society and recognize there we don't necessarily have special insight.
[1113] We might even have kind of sometimes blinders about it.
[1114] And we need like a lot of input into that.
[1115] Yeah, it's easy to get myopic in any of these fields, I think.
[1116] Okay, really quick.
[1117] People are certainly fascinated by you.
[1118] They're certainly starting to cry that this interview is coming to a close.
[1119] So I do want to urge them to listen to Brave New Planet, your podcast coming on October 12, where you can hear the good doctor, Eric, speak at much further length on all these different topics.
[1120] But I just have two little questions for you before we depart.
[1121] One is, could you tell us what's physically happened already in this field?
[1122] Because certainly we've done animal experiments and stuff, right?
[1123] Have we done any, what's like a headlining animal experiment we've done with CRISPR?
[1124] that's impressive.
[1125] Oh, with CRISPR in mice, people have replaced genes for blindness, hearing, all sorts of things.
[1126] People use CRISPR every day on every floor of the Broad Institute and every other institute there is in the world.
[1127] It goes from being shocking, amazing, can't believe it, to being ho -hum, and the students in a few years will say, oh, I assume we always had CRISPR.
[1128] So it's just a routine tool that's used for everything.
[1129] What is interesting is it's now being used for some human clinical trials in cancer to be able to engineer immune cells as part of that immuno-oncology you were talking about.
[1130] It's beginning to be used in some clinical trials for genetic diseases.
[1131] They're still early, but we're going to see that happening, and people are approaching it slowly and carefully, but it's going to become a mode.
[1132] I don't think it'll become, you know, the cure for all diseases, but it's going to go into the toolbox for new therapies.
[1133] It's so exciting.
[1134] Now, this one's more of a, I guess it's mildly a spiritual question, and I'm an atheist, just know where I'm coming from.
[1135] The transition from inorganic material to organic material.
[1136] Am I wrong in that that's still a humongous mystery, yeah?
[1137] I mean, do we know how we go from rock and lava and water to organic?
[1138] Is that still a mystery?
[1139] So that one is not so much of a mystery.
[1140] I think they've had a pretty good idea for a long time that inorganic things, when you put them in certain circumstances, can form organic compounds.
[1141] So you heat them up, you do things.
[1142] You can get building block organic molecules.
[1143] But that's a long way from life.
[1144] I think the thing you might want to be asking is how do you go from molecules, whether they're inorganic or, quote, organic, which is just a name for a certain subset of molecules.
[1145] How do you go from that to living organisms?
[1146] Nobody's managed to pull that off in the laboratory yet, but, you know, some young generation may come along and say, oh, yeah, you know, that's our challenge, but it might be a little while.
[1147] But there are lots of people with ideas on how it happens, but, you know, we want evidence.
[1148] Is it of interest to modern scientists, or is it largely something that's just like, whatever, I don't know, let's go forward?
[1149] No, no, no, it's of amazing interest.
[1150] I have friends here in town who devote their labs to, how is it that organic molecules could somehow come together and make living cells?
[1151] And, you know, some of the parts aren't so hard.
[1152] There are certain kinds of phospholipid molecules that naturally form cell -like boundaries.
[1153] So membranes of cells, they happen spontaneously.
[1154] All right, checkmark.
[1155] What about the stuff in the cell?
[1156] What about the other stuff?
[1157] You know, folks are trying to take that apart.
[1158] When I say we don't know what happens, I don't mean, we don't have ideas, we don't have experiments, we don't have people speculating and testing.
[1159] It's just, we don't have the goods yet to tell you exactly how that happened.
[1160] Yeah.
[1161] And that's why we keep having young scientists coming along.
[1162] Yeah.
[1163] Oh, I have one last question.
[1164] And that is, I want to say that I think the history was when ExxonMobil, in trying to clean up the Valdez spill, they did create an organism that could eat oil and turn it into something that was not toxic or lethal.
[1165] And then in doing that, they copyrighted its genome, this, I don't know what it was that they created that could eat oil.
[1166] But then that started this process of patenting or getting copyrights on DNA sequencing.
[1167] And is that good?
[1168] Is it bad?
[1169] It seems dangerous to me, like that you would have to maybe license if you want to study a chimp.
[1170] You might have to license their DNA or something.
[1171] Is there, Oh, God, all right.
[1172] For starters, when you say that people created an organism that could eat oil, what they did was they took an organism and they gave it a little bit extra DNA that encoded the instructions for eating oil.
[1173] Okay.
[1174] It wasn't really creating the whole organism.
[1175] It would be like, you know, I give you a document and you add a sentence to it.
[1176] Yeah.
[1177] And you wouldn't say, I created this document.
[1178] Or we put a bomb sniffing device on a dog and say we've created a bomb detecting animal.
[1179] No, we put a backpack on a dog.
[1180] Yeah, right.
[1181] So you put a backpack on this bacteria that makes it eat oil.
[1182] And then people did file a patent on this.
[1183] And it actually went to the Supreme Court as to whether or not patents could be made on organisms in a famous case called Diamond v. Chakrabarty.
[1184] And the Supreme Court held that you could issue patents.
[1185] And this led to an amazing set of patents in biotechnology that helped, you know, form the foundation of the field.
[1186] but it also led to a problem.
[1187] People started patenting human DNA, just naturally occurring human DNA, like the DNA for a breast cancer gene and saying not just, I have a method for sequencing your breast cancer gene, but I'm the only person who could sequence your breast cancer gene because I have a patent on your breast cancer gene.
[1188] This got a bunch of people pretty upset because it meant you couldn't get a second opinion, they could charge whatever they wanted.
[1189] And anyway, this breast cancer gene is a product of nature, not a product of humans.
[1190] And they said, well, we broke it apart from the chromosome, so we get credit for it.
[1191] And it eventually went to the Supreme Court.
[1192] And in my extracurricular activities, I sometimes write amicus briefs to the Supreme Court.
[1193] Of course you do.
[1194] Yeah.
[1195] Yeah.
[1196] So.
[1197] It'd be embarrassing if you didn't.
[1198] I like constitutional law.
[1199] Of course you do.
[1200] You know, I met my wife in a constitutional law class and I have always followed law very closely since.
[1201] And I decided that on some topics, the Supreme Court doesn't really get serious friend of the court briefs from scientists.
[1202] Yeah.
[1203] So I wrote them one on this question of could you patent genes?
[1204] And anyway, make a long story short, I took a position.
[1205] and I was then working in the President's Council of Advisors on Science and Technology.
[1206] The Solicitor General took the same position and the Supreme Court unanimously adopted that position.
[1207] Oh.
[1208] The short answer is Supreme Court said that even though you can patent artificial things, you can't patent natural DNA and nobody can own your DNA and I was so thrilled by that decision, glad to have played a little bit of a role in writing a brief on it.
[1209] Oh, my gosh.
[1210] How crazy I ask a question I feel like is completely unrelated to him.
[1211] Yeah, and you're a part of it.
[1212] Yeah, that's pretty wild.
[1213] That did not come up in my research of you.
[1214] I just have been curious.
[1215] Yeah, yeah, yeah.
[1216] That's crazy.
[1217] Some future podcast we can talk about the amicus brief on gerrymandering.
[1218] Oh.
[1219] Very interesting.
[1220] But for another podcast.
[1221] For another podcast.
[1222] Dr. Eric Lander, you're so fascinating.
[1223] I'm so happy that you've taken this crazy twisty, unexplainable, don't try to make sense of it.
[1224] Turn in life and that you're where you're at, and we really appreciate your time.
[1225] And we just hope we get to speak with you again.
[1226] And great luck with your podcast.
[1227] You guys are great.
[1228] Thanks for having me. Okay, wonderful.
[1229] Thank you.
[1230] Have a good one.
[1231] Bye.
[1232] And now my favorite part of the show, the fact check with my soulmate Monica Padman.
[1233] You can't hear me, can you, Monica?
[1234] She's going to be really tall.
[1235] Oh, my gosh.
[1236] You've made a little bird's nest for yourself.
[1237] Are you going to lay some eggs?
[1238] I wasn't ABR.
[1239] Oh, you weren't.
[1240] No, I'm so sorry.
[1241] Oh, it's okay.
[1242] I was ABR.
[1243] Okay.
[1244] How you doing?
[1245] Hi from Los Angeles to Michigan.
[1246] Hi from Ferndale, Michigan, California, USA, Colorado, New Mexico.
[1247] Are you having fun?
[1248] Oh, my gosh, am I having fun?
[1249] I'm on vacation.
[1250] with, well, Aaron was visiting for two weeks.
[1251] And then I followed him home and I'm now visiting him for a week.
[1252] And yesterday we, I don't want to say too much because I recorded a bunch and maybe it'll be its own episode.
[1253] But suffice to say, we did hit four cider mills yesterday.
[1254] It was a 13 hour excursion.
[1255] We left at about 11:30 and we got home at one in the morning.
[1256] And boy, did we go for it.
[1257] And it evolved.
[1258] We set out with one mission, to hit four cider mills, and then steaks came into the mix, and then we had a new objective, and it was just a really big day.
[1260] Yeah.
[1261] Yeah.
[1262] And no one threw up and it was great.
[1263] I was so impressed by that.
[1264] I am too.
[1265] I anticipated an involuntary throw up after the third stop.
[1266] Didn't happen.
[1267] I'm grateful.
[1268] Just kept feeling stronger and stronger as we went along.
[1269] What did you do yesterday?
[1270] Well, I slept till noon on accident.
[1271] It was an accident.
[1272] I woke up at eight, a respectable time.
[1273] Yeah.
[1274] And I decided to lay back in bed just for a few minutes.
[1275] Sure.
[1276] Oof, and then that turned into a few hours.
[1277] Okay.
[1278] And then I woke up at noon.
[1279] I woke up at noon.
[1280] You have a new policy and it's ABS.
[1281] Always be sleeping.
[1282] Yes.
[1283] Always be sleeping.
[1284] Are you having, like, you're getting so many dreams in, I assume.
[1285] Yeah, well, that's part of it.
[1286] Like, if I'm in the middle of a dream and I wake up, I want that dream to continue.
[1287] So I close my eyes again just to see.
[1288] Like, how's this going to end?
[1289] You virtually slept the same length of time it took us to set a world record.
[1290] Well, maybe I set a world record, too.
[1291] Longest sleep.
[1292] Well, I think I've gone past 13.
[1293] You have?
[1294] Have?
[1295] You're always trying to break my records.
[1296] Yeah.
[1297] I'm sorry.
[1298] It did cross our mind yesterday that probably no one's ever tried to go to four cider mills on a day.
[1299] I just don't know why anyone would have.
[1300] Sure.
[1301] And then we were considering the thought that maybe we had the least impressive world record of all time.
[1302] That could be beat by anybody, you know, the next day.
[1303] But that it might be standing right now.
[1304] Yeah.
[1305] What else do you think you have a record in?
[1306] Oh, great question.
[1307] Maybe pairs of meandies or...
[1308] Oh, yeah.
[1309] Like, you probably have the most meandies of anyone in the world.
[1310] I would not rule out people starting to collect old prints like Jordy's.
[1311] Like, there being a secondary market for meundies.
[1312] Oh, my God.
[1313] Because people miss the patterns.
[1314] Vintage underwear.
[1315] Yes, and they might reissue, but until then people might become pantyheads, I guess.
[1316] Because you call them sneaker heads, right?
[1317] Panny heads.
[1318] Panny hats.
[1319] Wow, we just invented something, I think.
[1320] So much proprietary stuff happening.
[1321] But would they want the original pair where it's used?
[1322] Well, no, no, no. They would want it like someone got extras.
[1323] They didn't unwrap it.
[1324] You know, it's the same with Jordy's.
[1325] Like, you want a perfect pair that's not really been worn.
[1326] Oh, that's true.
[1327] I guess you don't want someone's old shoe.
[1328] No. But of course, there are people's old undies we would want.
[1329] I would love to have some of Bill Gates' old me undies.
[1330] Wouldn't you?
[1331] I'd put them on and try to write some software or something.
[1332] Yeah, see if it had some magic.
[1333] What if our undies do carry our own personal magic?
[1334] I wouldn't doubt it because they're right next to our most powerful area.
[1335] Equipment.
[1336] Or genitalia.
[1338] In that case, I would want Dave Chappelle.
[1339] Oh, me too.
[1340] I want his bandies so bad.
[1341] Oh, me too.
[1342] I want to go stalk him and collect some of his undies.
[1343] I want his mey undies, and then I want to try doing stand -up again.
[1344] Exactly.
[1345] Just see if there's a little kick.
[1346] I also might just, when I have Bill Gates' panties, I might put them on to have a Diet Coke like one would put on a smoking jacket to have their pipe.
[1347] Sure.
[1348] That's nice.
[1349] I would keep a Diet Coke wrapped in the Mi -undies like you keep your pipe in your pocket of your smoking jacket.
[1350] Oh, that's going to be a little uncomfortable, but I guess worth it?
[1351] Yeah, worth it.
[1352] Um, guess who's here?
[1353] Who?
[1354] Wobbywob.
[1355] He's joining us today.
[1356] Say hi, Rob.
[1357] Hi, guys.
[1358] Oh, wow.
[1359] Okay.
[1360] Oh, this fucking dog.
[1361] You guys, Aaron's dog.
[1362] Oh, my God, he's going to knock over my soda.
[1363] His dog is the size of a horse, and it's seven months old, and it's so perverted.
[1364] Oh, my God, is this dog perverted?
[1365] He's not scheduled to get his nuts clipped until January, and he's so horny.
[1366] Look at him, you guys.
[1367] Look at the size of that stupid thing.
[1368] I feel bad for him, too, because I'm really cutting into his living space.
[1369] We had to put a gate up so he couldn't come upstairs because he eats Jordy's, and I can't risk that.
[1370] Oh, my God.
[1371] He loves Jordy's for breakfast.
[1372] Well, he loves any footwear to eat.
[1373] Also, and then now he can't be in Aaron's room because I'm recording here.
[1374] So now he's just down to, like, two rooms.
[1375] Oh, he's resenting you.
[1376] He's got to be counting down the minutes until I leave.
[1377] Okay.
[1378] Hi, Rob.
[1379] Hi, Wobby.
[1380] Rob hasn't been with us for fact checks for most of quarantine.
[1381] Most of 2020.
[1382] Most of 2020.
[1383] That's right.
[1384] Tell us about your life, Rob.
[1385] Same old.
[1386] Just hanging out in my house.
[1387] I got some questions for you.
[1388] Do you somehow have a long lens on your fucking computer?
[1389] Because you look amazing and the background is all blurry like you're on a long lens.
[1390] So I'm using my Canon as a webcam right now.
[1391] Oh my God, Monica, doesn't he look gorgeous?
[1392] Wait, he's using, what is he using?
[1393] My canon that I take photos with.
[1394] You look incredible like you're on a real show, right, Monica?
[1395] Yeah, it's gorgeous.
[1396] Dax, is it snowing in your room?
[1397] Yes.
[1398] Okay, there are fuzzies all over the place.
[1399] Because that perverted dog just came in here and kicked up a lot of snow.
[1400] He did.
[1401] Oh, my goodness.
[1402] Oh, my goodness.
[1403] Okay, well, let's get into some Eric Faye.
[1404] facts, shall we?
[1405] He was so interesting.
[1406] Fascinating.
[1407] When I was listening back, I just felt like I was back in school.
[1408] You know, we love school.
[1409] Yeah, he did have a way of like, it sounded like he was teaching us, like 101 something.
[1410] Yeah.
[1411] There was no talking over anyone's head, right?
[1412] He didn't get esoteric.
[1413] He had a really great way of keeping it in layman's terms.
[1414] I love him.
[1415] I love him, too.
[1416] I'm in love with him.
[1417] I'm in love with him.
[1418] Do you think he has?
[1419] Oh, my gosh, one thing real quick.
[1420] Yeah.
[1421] When we were at Lake Arrowhead, Eric told us that a T-Rex skull...
[1422] A whole T-Rex, a skeleton.
[1423] A T-Rex skeleton in its entirety had sold at Christie's auction for $31 million.
[1424] 31 million and the projections were that it was going to like fetch $8 million.
[1425] So it was way over the projection.
[1426] 31 mil, and it was not to Leonardo DiCaprio, unfortunately.
[1427] No, but one thing I pointed out is it seems like a lot of money, but then when you think that these Ferrari GTOs have sold at the Concours d'Elegance for $59 million, you compare a Ferrari to a T-Rex full skeleton.
[1428] It seems like a bargain to me. It sure does.
[1429] I would really like one and starting to get obsessed with it.
[1430] I am too.
[1431] I don't need the whole skeleton because then you need a room that's like 25 feet tall, But the skull, the cranium, I want that.
[1432] Yeah, but the reason it's good to have the full skeleton is when you're having sex in the mouth, you're very high up.
[1433] Oh, and it's dangerous.
[1434] Yeah, that's a little danger.
[1435] Heightens it.
[1436] Yeah, literally.
[1437] Can I make a recommendation?
[1438] Sure.
[1439] Will you start with making love in the rib cage?
[1440] Because that's only probably eight, nine feet off the ground.
[1441] Work my way up.
[1442] Yes, because you're going to need to learn.
[1443] I know you've got great balance from your background.
[1445] Cheerleading.
[1446] But none of your cheers involved coitus, so you don't know how good you are, you know what I'm saying?
[1448] Well, some of them got close, you know, catching by the pee.
[1449] Yeah, sure, sure.
[1450] As you know.
[1451] Yeah.
[1452] As you invented.
[1453] But you're right.
[1454] I'll work my way up.
[1455] I have got progressively more fearful of heights the older I get.
[1456] So I will be a little scared.
[1457] Maybe you have a little net under the rib cage and the mandible.
[1458] In case you do fall out.
[1459] Tip out.
[1460] You fall into like a little, what do they call those people, high flyer?
[1461] Yeah, a trapeze.
[1462] A trapeze net.
[1463] Yeah, you'll fall into a trapeze net.
[1464] Oh, that would be fun.
[1465] And then I bet you could auction off your sex trapeze net for a couple thousand bucks to recoup some of that 30 mil.
[1466] Oh, my God.
[1467] What a plan.
[1468] You could probably rent it out too.
[1469] You know, in Japan and Tokyo, they have these, they're called love hotels.
[1470] And they're like these themed hotels that people can go in almost exclusively to just have sex.
[1471] Like a brothel?
[1472] Well, they're not providing the women to my knowledge.
[1473] It's just, it's a place to rent to have sex, and they have these exotic themes.
[1474] Again, they're called Love Hotels, and you could Google them.
[1475] But you could rent out your T -Rex mouth and ribcage as a love hotel and maybe generate some income there as well.
[1476] You're right.
[1477] I'm pretty, like, stingy, though.
[1478] I think I'd be kind of afraid to loan out my T -Rex to some strangers who are going to have sex.
[1479] They may be really rough, and then they may break some of the bones.
[1480] Okay, that's a good point.
[1481] But look, I think this skeleton could pay for itself in like 18 months.
[1482] All right.
[1483] And then you'd be back in the black.
[1484] And then I get another one.
[1485] Yeah.
[1486] Okay, Eric, so he said that someone flew over the Unisphere sculpture in a jetpack.
[1487] That person was performer Robert Courter.
[1488] C-O-U-R-T-E-R.
[1489] He flies past the Unisphere at the New York World's Fair on May 13th, 1964, wearing a rocket outfit that was originally developed for the U .S. Army.
[1490] The outfit, according to its manufacturer, can fly a distance of 815 feet at speeds of 60 miles an hour.
[1491] Okay, so I have since all that conversation watched some videos of what appears to be maybe the Swedish military or some military organization where the guy has a fully functional jetpack and he's flying all over the ocean.
[1492] He flies from big boat to big boat putting on this show.
[1493] It looks just like Tony Stark and I can't believe it's real and it's real.
[1494] Oh my God.
[1495] He's got like turbines on his back and then each arm has like three turbines around the wrist and that's how he's, I'm assuming, you know, pointing himself in this or that direction.
[1496] And it's, it looks like fake.
[1497] It looks science fiction.
[1498] Okay.
[1499] So if we all had jet packs, though, would it be kind of the same as driving if everyone's just in the air, but you still have to kind of maneuver around each other, there'll still be crashes?
[1501] There's not a chance in hell they will let people operate jet packs en masse. But there is conceivably a computer program that you would type in your address and then the jetpack would take you there.
[1502] Mind you, it's a horrendously inefficient use of fuel to run, you know, six or seven turbines to go up to Gelson's.
[1503] It's probably more likely that you would be in a drone, a battery -powered drone that will take you places that you also will not operate.
[1504] These are my predictions, my Nostradamus predictions.
[1505] How far out do you predict?
[1506] Well, if they're already existing.
[1507] I think in about 15 years they're going to hit the consumer market.
[1508] Fifteen, really?
[1509] I'm worried about perverts, though, because someone could just hover outside your window.
[1510] Well, luckily, we have a ring.
[1511] You know what I just exposed about myself?
[1512] I'm not worried about perverts.
[1513] I think if I had one of those jetpacks, I'd hover outside someone's window.
[1514] That's why I thought of it.
[1515] Well, I just thought of, I mean, why would I think of that if not I was considering, you know?
[1516] Well, no, you have kids and you could be concerned that someone would be hovering around their window, and I'm your kid, so hovering around my window as well.
[1517] I did get nervous for you, and then also I imagine that I could hover outside people's windows.
[1518] So both things were happening.
[1519] Okay.
[1520] Protective and predatorial, I guess.
[1521] Yes.
[1522] Can I remind you that you're famous?
[1523] And so if you're hovering around someone's window, that's a cancelable offense.
[1524] Unless I pick the perfect person who loves let's go to prison or something and they're pumped that I'm hovering outside their window.
[1525] And again, I don't know how I would screen for that, but maybe I open it up to social media and just say, just send me your address if you're cool with me hovering outside your window.
[1526] Social media.
[1527] Day by day I'm getting more and more fearful.
[1528] So we, you started the rabbit hole.
[1529] The New York Times rabbit hole podcast.
[1530] It's phenomenal, right?
[1531] You recommended it to me and I started it.
[1532] I'm on episode five.
[1533] Oh my God.
[1534] I, I think it should be required listening personally.
[1535] That's what I, that's exactly what I wrote on Instagram.
[1536] You did.
[1537] In order to use the internet, you should have to listen to this because don't fool yourself.
[1538] I'm not fooling myself.
[1539] We're all getting led.
[1540] Yeah.
[1541] down a fucking rabbit hole by an algorithm that predicts what we would like.
[1542] And they're right.
[1543] They're right.
[1544] And then they take you somewhere.
[1545] And they're smarter than you.
[1546] Like the way they were talking about how the way the algorithm is making connections, a human brain can't do.
[1547] It's like actually smarter than a human brain.
[1548] So there are things happening that you can't even figure out why you'd want to see that, but you do.
[1549] They said they built this neural network, basically.
[1550] It's mostly about YouTube.
[1551] or at least what I've heard so far, I don't watch it on YouTube, but I'm not stupid enough to think it's only happening on YouTube.
[1552] It's happening on everything that I look at.
[1553] Yeah, it's much smarter than you and it can predict what you're going to find interest in.
[1554] And then based on if you enjoyed that, they're going to predict.
[1555] And then they're just going to lead you somewhere.
[1556] And by the way, I think most people think about this is something that only happens on the right.
[1557] And it happens on the left, too.
[1558] Oh, yeah.
[1559] It's all happening to everybody.
[1560] Yes.
[1561] Well, that is a big part of the first few episodes.
[1562] I guess, I don't want to give it away.
[1563] But it's definitely happening on both sides and it's the exact same thing.
[1564] Well, minimally, we can say that the first few episodes revolve around this guy who was kind enough to let them access five years of his YouTube watching history.
[1565] And they can just chart day by day how he got radicalized.
[1566] And this was a smart guy who was going to major in environmental science and, you know, was into punk rock and this and that.
[1567] And then he ends up a place he could never imagine.
[1568] he would have ended up.
[1569] And then by some luck, broke out of it and started realizing what had happened to him and going down different channels.
[1570] But now he's just down different channels and he kind of admits he's down different channels.
[1571] And it's all cuckoo.
[1572] It's the same thing.
[1573] Yeah, but it is also so funny because it all boils down to this need for community.
[1574] All of this, the whole thing.
[1575] And I hate to say it, it goes back to Jonah Harari.
[1576] It's like, you know, he identifies a lack of community is being the underpinning of addiction going on and on.
[1577] And, you know, it's kind of underpinning this thing.
[1578] And, yeah, we need to be in a group.
[1579] And we'll be in generally what group invites us and accepts us or that we see ourselves in.
[1580] Yeah.
[1581] Yeah.
[1582] It's scary.
[1583] I'm scared.
[1584] Do you want to repeat his name?
[1585] It's Johan Hari.
[1586] Johan Hari.
[1587] Sorry, Johan.
[1588] What did I say? Johan Harari, I think.
[1589] Oh, Jesus.
[1590] You should hear how I'm ordering our salads.
[1591] We're trying to eat healthy here, and we're getting a lot of Greek salads.
[1592] And I want them to add what I've been calling gyro meat, which I think – and gyro meat, I've been saying.
[1593] And it's Euro – maybe.
[1594] Maybe it's Euro.
[1595] It is.
[1596] But Aaron and I were talking about – they don't even blink when you say whatever you want to call it.
[1597] They're so used to hearing people mangle that word.
[1598] Of course, yeah.
[1599] Yeah.
[1600] Okay.
[1601] You said you learned in psychology that people with schizophrenia have a particular genetic marker and that a stressful event during a period from around age 7 to 30 could trigger the schizophrenia.
[1602] He rejected that, and I know I learned that.
[1603] Yeah, he said he wasn't sure if that was true.
[1604] So scientists have not traced the genetic causes of schizophrenia, but more than 80 % of schizophrenia cases are considered to have a hereditary cause.
[1605] In a new report published in Translational Psychiatry, Japanese researchers report that a rare genetic variant of RTN4R may have a fundamental role in the disease.
[1606] RTN4R encodes the receptor for RTN4, which regulates crucial functions for neural circuits, namely axon regeneration and structural plasticity.
[1607] Moreover, RTN4R is a promising candidate gene for schizophrenia because it is located at chromosome, wow, 22q11.2, a hot spot for schizophrenia.
[1608] They have to remember all this stuff, these scientists.
[1609] Yeah, but, you know, that's all they're remembering, probably.
[1610] They have not seen one show on Netflix, you know, that's all they're doing.
[1611] There's nothing about the environmental causes of it becoming active or not?
[1612] I have read that before, too, but it's not proven.
[1613] That's the problem.
[1614] Okay.
[1615] But I have read that.
[1616] I'm going to prove it.
[1617] Okay.
[1618] I'm going to find 100 people with that variant.
[1619] And then I'm going to jump out of a closet with a big butcher knife and a cloak on to 50 of them.
[1620] And then I'm going to send 50 of them to Bora Bora for the seven -year window.
[1621] And then at the end, I'll publish my results.
[1622] And I have a hunch that the people I jumped out at with a butcher knife will have higher rates of active schizophrenia.
[1623] Okay.
[1624] Such an easy study.
[1625] I don't know.
[1626] I just solved it.
[1627] Why haven't they done that?
[1628] Oh, my God.
[1629] I'm glad you're staying humble.
[1630] I know that was a goal of yours post-relapse.
[1631] Yeah, yeah.
[1632] Sounds like it's really working.
[1633] Yeah.
[1634] Okay.
[1635] Who was the scientist in China who gene-edited two baby girls and is serving three years in prison?
[1636] Oh, I'm already excited to hear you pronounce this.
[1637] Because Xi Jinping, when I see Xi Jinping spelled, not a chance would I come to that.
[1638] I mean, might as well not even have it spelled for me. I just have to remember that that iconography
[1639] equals Gigi Ping.
[1640] I don't think they're spelling it for you.
[1641] I think that's just his name and how it's spelled.
[1642] I know, but I feel like it needs to be translated here.
[1643] Those letters don't equal those sounds here.
[1644] Sure.
[1645] To be fair, in this article in Science magazine that I'm looking at right now, it does say pronounced.
[1646] It tells me how to pronounce it.
[1647] That's great.
[1648] Yeah, it's great.
[1649] So it's pronounced, he, H-E-H. No, that's heh.
[1650] Okay, yeah.
[1651] So, He Jiankui.
[1652] Great.
[1653] Can I bolster my argument just with one thing?
[1654] When we learn about King Tutankhamun from Egypt, it does not give us the hieroglyphics of his name.
[1655] They decided to spell out Tutankhamun.
[1656] That's not what they did in Egypt.
[1657] Yeah, but these aren't hieroglyphics.
[1658] They're English letters.
[1659] But am I not right in that in China, in Mandarin and Cantonese, they use symbols.
[1660] They don't use the alphabet.
[1661] They use characters.
[1662] So somebody converted the characters to these alphabetic letters and they fucking screwed up royally.
[1663] Oh, my God.
[1664] I'm filing this under the music thing.
[1665] Oh, my God.
[1666] Music notes.
[1667] I have another cause.
[1668] You feel offended by it, personally offended.
[1669] No, I just think they shit the bed.
[1670] If you're moving from characters to alphabet, let's get it spelled phonetically for us.
[1671] We might need to get a linguist on to explain this to us.
[1672] to explain why they don't make it easy for Americans to understand other people's languages?
[1673] No, with a Latin-based alphabet.
[1674] I'm not just going to include Americans.
[1675] Anyone that's using this alphabet, why are they converting the characters to the letters that don't represent the sounds?
[1676] I think it just might be harder than you'd expect to make those translations.
[1677] Do you think there's an underpinning of xenophobia in my critique of this?
[1678] I don't know.
[1679] I feel like you think there's a little bit of xenophobia.
[1680] No, I don't think there's xenophobia.
[1681] I think there's just a sense of like, it should be like this because Americans need it to be like this.
[1682] Where, you know, like.
[1683] Okay.
[1684] Well, I'm also mad at American companies. Like my medicine, Zaljans.
[1685] No, thank you.
[1686] What do you mean?
[1687] It has everything but numbers in it.
[1688] Like, when I look at the letters on that prescription bottle.
[1689] Yep.
[1690] It certainly does not say Zaljans to me. Does it start with an X?
[1691] Yes.
[1692] Oh, sure.
[1693] Nothing should start with an X. Let's start there.
[1694] God.
[1695] You can't start a word with X. Anyway, He Jiankui. On June 10, 2017, two couples came to the Southern University of Science and Technology to discuss whether they would participate in a medical experiment that no researcher had ever dared to conduct.
[1696] The Chinese couples, who were having fertility problems, gathered around a conference table to meet with He Jiankui, a biophysicist, then 33, my age.
[1697] Wow, what an overachiever.
[1698] He had a growing reputation in China as a scientist entrepreneur.
[1699] Those shouldn't really go together, but that's a red flag.
[1700] But was little known outside the country.
[1701] Can I pause you for a second?
[1702] A scientist entrepreneur is just an inventor.
[1703] They should have called him an inventor.
[1704] But you shouldn't have an inventor doing a medical procedure on you, though.
[1705] Well, the people who invented, like, the stent and the defibrillator, those are inventors, right?
[1706] But I don't think those are the people actually applying it.
[1707] What we really don't want is baby inventors.
[1708] Well, right.
[1709] Except for our invention, the pee baby.
[1710] I guess we don't have a leg to stand on here.
[1711] We're acting, you're trying to, well, we're both trying to claim the moral high ground.
[1712] and we just don't have it.
[1713] What, yes, we do.
[1714] He is doing an invention on other people's babies.
[1715] And we invented this poor pee baby who's stuck in a toilet, so we're living in a glass house throwing rocks.
[1716] I disagree.
[1717] We created life.
[1718] Uh -huh.
[1719] And what he's doing is trying to tinker with an existing life.
[1720] Uh -huh.
[1721] Okay.
[1722] I think it's different.
[1723] We are inventors of a life.
[1724] Well, you invented, you invented two lives also. Real children. That's not a problem. Oh, three! Thrice-decorated life inventor. One is a medical marvel, though, because you do have a vasectomy. True, very true. And we've created a whole new category, because there's asexual reproduction and then sexual reproduction, and this is yet a third thing: urinary reproduction. That's right, that's exactly right. We-production. We-reproduction.
[1725] We reproduction.
[1726] I hope that people don't copy us.
[1727] And then so many toilet bowls in America have a healthy pee baby living in them and then they can't be used.
[1728] And then we have to start going outside a lot like you know where.
[1729] What?
[1730] You know where and you're offended or you don't know where and it didn't make sense.
[1731] Oh my God.
[1732] You've been spending too much time in Michigan.
[1733] That is true.
[1734] I was listening to.
[1735] I can tell.
[1736] Well, I was listening to.
[1737] a sassy, I had a couple of real culture shocks last night.
[1738] A, we went to all these cider mills and there's just hundreds of families out, everywhere you go.
[1739] And you don't see that in L.A. You know, there's really no place where you're just seeing like hundreds of families gathering.
[1740] That was just different.
[1741] I kind of was like, oh, we live in an interesting bubble. I'm reminded that a lot of America doesn't live that way.
[1742] It's much more community kind of centric here, which was one thing I noticed.
[1743] And then, sad.
[1744] Cassie was telling us about kind of recent, what would definitely be categorized as sexual harassment in the workplace.
[1745] And the level of things that have been said to her, that she just kind of shrugged off, was so drastically different from what would pass in our little bubble.
[1746] Yeah.
[1747] I found that to be really fascinating.
[1748] And what was your conclusion?
[1749] You know, you won't like this, but it was just an anthropological conclusion, like a culturally relative conclusion, which is just, oh, that's how it is here.
[1750] I didn't come to some, you know, I don't know what outcome's better for each place.
[1751] Somehow this is the outcome that's fine for them.
[1752] And in L.A. it would have been a much different reaction.
[1753] It's just interesting.
[1754] I find it more interesting than I'm finding myself levying a verdict.
[1755] Yeah.
[1756] Because I'm measuring things by how distraught is the
[1757] person that received it?
[1758] Like, that to me is the metric.
[1759] If someone feels very distraught and troubled and all these things, then that's a big problem for me. It's like minimizing suffering and maximizing flourishing.
[1760] But if the person doesn't feel any suffering, it's hard to say.
[1761] So do you think that the people in The Vow, if they're happy there, that's fine?
[1762] No, because they had tons of suffering, which is why they all defected and took it down.
[1763] No, not those people, the people in it, like India and Allison Mack.
[1764] Like, they were totally happy there.
[1765] And they thought it was a great thing.
[1766] But their happiness relied on victims.
[1767] There was suffering.
[1768] You know what I'm saying?
[1769] The slaves of those people were suffering greatly.
[1770] They couldn't eat the way they wanted to.
[1771] They were dying of malnutrition.
[1772] Their hair was falling out.
[1773] They had been branded.
[1774] Like, there's just all these clear markers of suffering.
[1775] So that to me seems very easy.
[1776] Well, I think it's suffering to be sexually harassed, whether you are aware of it or not.
[1777] I think that has an impact, and I think it's similar to these people who are brainwashed.
[1778] I totally agree, but I guess what I'm arguing is that it is relative to where you're at.
[1779] So something that would be super inappropriate where we live isn't necessarily what's super inappropriate here.
[1780] And you can't tell the people that are experiencing it that it is inappropriate and they just don't realize it or something. Like, there's something a little arrogant about that, right? Like, if someone's saying, oh yeah, we joke about sex all the time at work and I don't give a shit, who am I to say, well, no, you have to give a shit and you're wrong and you should suffer from that? That's a weird declaration to make, a little bit, right? Yeah. Anyways, it was just, it was just a clear, I found it to be a very clear difference. Yeah, agreed, it's different. We're all in our bubbles, that's for sure. We are really in our bubbles. Anyway, Eric went through a little bit of what happened with him, but he used CRISPR to disable this gene that HIV uses to get into cells, and it was just not necessary, and he went to jail, and he's a rogue scientist, a mad scientist.
[1781] Do you think when he gets released, he'll move to an island and become like a Dr. Evil type 007 villain?
[1782] I hope.
[1783] We could use a couple more villains.
[1784] And he could create like super strong humans that are impervious to HIV and gonorrhea.
[1785] This guy must have suffered from some STDs and thought, I got to, I want to live in a world without these.
[1786] Changing the world.
[1787] One gene at a time.
[1788] One gene at a time.
[1789] Was that all the facts?
[1790] That is.
[1791] Oh.
[1792] Well, I miss you.
[1793] I miss you.
[1794] I'm having so much fun, but I also miss you.
[1795] And I think you would be having fun here too.
[1796] There was lots of steak eating yesterday.
[1797] I love a steak.
[1798] Yeah.
[1799] And I love a donut.
[1800] Yeah.
[1801] Woo.
[1802] How many donuts did you eat?
[1803] Just tell us that.
[1804] 28.
[1805] Between the two of us.
[1806] So we each had 14.
[1807] Full size.
[1808] Oh, yeah.
[1809] Full size.
[1810] Coated in cinnamon.
[1811] Woo, woo, woo.
[1812] We got half dozen the first stop, half dozen the second stop.
[1813] A couple dozen the third stop, but still only ate a half dozen.
[1814] And then at the fourth stop, they were even better than we were expecting.
[1815] And we thought we would only have another half dozen.
[1816] But then we ended up eating four there.
[1817] And then when we got home, we split one right before bed at 1 a .m. Wow.
[1818] You guys are really, really virile.
[1819] That's the only virility we have left.
[1820] Okey dokey, love you.
[1821] Love you, Wabiwob.
[1822] Love you, too.
[1823] God, does he look good?
[1824] Look at that, Monica.
[1825] I know.
[1826] It's crazy.
[1827] Will you take a screen grab of yourself right now, please, Rob?
[1828] I mean, what a, what a, what a shot.
[1829] What a shot.
[1830] Oh, God.
[1831] I miss our pee baby so much.
[1832] Will you give her a kiss?
[1833] You know what?
[1834] I'll be honest.
[1835] I went into my house the other day, and I was scared to look.
[1836] Yeah.
[1837] Well, I'm afraid she's going to get hair.
[1838] That's when he's going to get real scary.
[1839] Ew.
[1840] Oh, man. All righty.
[1841] All right.
[1842] Love you.
[1843] Love you.
[1844] Follow Armchair Expert on the Wondry
[1845] app, Amazon Music, or wherever you get your podcasts.
[1846] You can listen to every episode of Armchair Expert early and ad free right now by joining Wondry Plus in the Wondry app or on Apple podcasts.
[1847] Before you go, tell us about yourself by completing a short survey at Wondry.com slash survey.