The Joe Rogan Experience XX
[0] Boom.
[1] And we're live.
[2] Hello, sir.
[3] How are you?
[4] Good to see you, man. Nice to see you.
[5] Thanks.
[6] What?
[7] You're eating chocolate when you got here and you told me that you are a cacao shaman.
[8] And I said those are strong words.
[9] They are.
[10] What does that mean?
[11] So I was in Berlin last year giving a talk at a tech conference, and somebody invited me to a sacred cacao ceremony.
[12] Never heard of it.
[13] I thought, wow, that sounds awesome.
[14] I love chocolate.
[15] I went.
[16] It was so wonderful.
[17] And at the end, they were talking about these people, these great cacao shamans.
[18] And I thought, what is that?
[19] I got to be one of those.
[20] And so I came back.
[21] I looked for certification.
[22] There wasn't certification.
[23] I self-declared.
[24] And then I started doing cacao ceremonies in New York.
[25] I have hundreds of people who come.
[26] It's really wonderful and it's exciting.
[27] Like, if you want to be a doctor, you've got to go to medical school, right?
[28] You want to be a comedian, you've got to become a professional, you've got to put in your time.
[29] Cacao shamans just show up.
[30] It's like putting up a shingle.
[31] Hey, I'm a cacao shaman.
[32] If anybody shows up and they have a good time, then you're real.
[33] Now, how much do you need to know about cacao, like the nutritional properties of it?
[34] Well, cacao is amazing.
[35] It's great stuff.
[36] It's incredible.
[37] And people have definitely been using cacao ceremonially for about 5,000 years.
[38] So it's incredible.
[39] Chocolate makes people happy.
[40] It helps your brain function, your circulation, there's all these kinds of incredible things.
[41] But in the ceremonies that I do, I have two key messages.
[42] One is, you are the drug.
[43] I mean, people take drugs, people take ayahuasca and psilocybin and all these kinds of things.
[44] But for the things that we take drugs for, this kind of release and happiness and joy, we have those things inside of us.
[45] And if we just kind of get out of our own way, we can experience them.
[47] And the second thing, I say this in my ceremonies, is that there's no such thing as sacred cacao or sacred plants or sacred mountains or sacred people if we don't treat life with sacredness.
[48] But if we recognize that everything is sacred, then we infuse life with sacredness and meaning.
[49] And that's, anyway, that's why I do it.
[50] It's a lot of fun.
[51] That's very interesting from a guy who specializes essentially in manipulating life.
[52] Well, you know, we have manipulated life as humans for a very, very long time.
[54] But it's interesting.
[55] You know, the idea of things being sacred, but your specialty is manipulating genetics, right?
[56] Yeah, well, so that is this strange moment that we're in, because for about 3.8 billion years, our species has evolved by this set of principles.
[57] We call it Darwinian evolution: random mutation and natural selection.
[58] It's brought us here.
[59] We used to be single-cell organisms and now look at us.
[60] And there's been a lot of magic in that process, and there still is, but we humans are deciphering some of that magic.
[61] We are like looking under the hood of what it means to be human and we are increasingly going to have the ability to manipulate all of life, including our own.
[62] Yeah, that is very unnerving to a lot of people.
[63] It's uncomfortable and scary.
[64] Yeah, it is.
[65] They like things the way they are.
[66] Jamie, I'd like to stay the way I am.
[67] We always think that.
[68] Why do we always think that?
[69] Because there's a built-in conservatism in our brains.
[70] And yet, we live these lives that are entirely dependent on these radical changes that our ancestors have created.
[71] I mean, we didn't find this building or agriculture or medicine in nature.
[72] We built all those things.
[73] And then everybody gets a new baseline when you're born.
[74] And you think, well, you know, I want organic corn.
[75] I want whatever.
[76] But all these things are creations.
[77] We live an entirely created world and our ability to manipulate and change that world is always growing.
[78] And I think we need to recognize that.
[79] But being afraid is okay and being excited is okay.
[80] And we need to find the right balance between those two emotions.
[81] I think for a lot of people, they feel like so many changes happen, particularly when you're talking about genetically modified foods, so many things happened before they realized they had happened.
[82] So when they're like, hey, man, I don't want to eat any GMO fruit.
[83] Well, then you probably shouldn't eat any fruit.
[84] Yep.
[85] Because everything that you buy has been changed.
[86] Yeah.
[87] Every orange that you buy, that's not what an orange used to be like.
[88] Yeah.
[89] Could buy an apple.
[90] Apples didn't used to be like that.
[91] Tomatoes didn't used to be like that.
[92] No, I know that's, we reset our baseline just from when we're kids.
[93] So if you went back 12,000 years ago to the end of the last Ice Age and you said, all right, find me all these things that we buy at Whole Foods.
[94] Most of them didn't exist.
[95] We've created them.
[96] Sure.
[97] And then in the 1970s, we developed the ability to do what's called recombinant DNA, what people call genetic modification.
[98] And people are afraid because, well, it feels unnatural.
[99] We're applying science to food.
[100] And, you know, that's the issue.
[101] And now we're entering the era of genetically modified humans.
[102] And there's that same level of uncomfortableness.
[103] But what happened, the reason why I've written this book, Hacking Darwin, is that if we approach genetically modified humans in the same way we approach genetically modified foods, which is the scientists say, hey, we've got this, we're going to manage them responsibly, and it just kind of happens to people.
[104] People are going to go nuts.
[105] I mean, imagine how agitated people are about GMO foods.
[106] If they don't have a say in how the human experience of genetic modification plays out, people are going to go berserk.
[107] So we have this window of time where we can start bringing everybody into an inclusive conversation about where we're going, because where we are going is just radically different from where we've been.
[108] I think it's an awareness issue, and I also think it's a perception issue.
[109] I think that everything people do is natural, including cities.
[110] I think cities are natural.
[111] That's why they're all over the world.
[112] I think they're as natural as beehives.
[113] And I think as much as we like to think that technology is not natural, it's clearly something people naturally make.
[114] Of course.
[115] They make it in every culture.
[116] Yeah, it's the history of our species.
[117] And we kind of misuse this word natural.
[118] Because what is natural?
[119] I mean, maybe natural was when we just lived as part of nature.
[120] I always say, it's like, people say, oh, I love nature.
[121] I love, like, going out and hiking in the woods.
[122] The reason you love hiking in the woods is that we've murdered all the predators.
[123] It's like in the old days, you stay in your cave.
[124] You're not going out hiking in the woods.
[125] There's stuff that's going to kill you out there.
[126] I know, that was a massive luxury to go wander through the forest with no weapons.
[127] Yeah.
[128] Like, nobody did that.
[129] Exactly.
[130] No, exactly.
[131] But people go, oh, I want nature.
[132] I want my, you know, my natural corn.
[133] And I want my natural chihuahua, even though, you know, 25,000 years ago, there's no chihuahua.
[134] There's wolves.
[135] Yeah.
[136] And look what we've done to them.
[137] I know.
[138] Well, look what we have done to them.
[139] Yeah.
[140] If you had natural wheat, like, or natural corn in particular, natural corn used to be a tiny little thing.
[141] Yeah, it's a few weeds.
[142] Yeah, weird, gross little grain.
[143] Yep.
[144] And then now we made it this big, juicy, sweet, delicious thing that you put butter on.
[145] Which is great.
[146] That's great.
[147] As long as it doesn't have glyphosate on it.
[148] No, it's true.
[149] We can't fetishize some kind of imaginary world where everybody was wearing Birkenstocks and eating at Whole Foods.
[150] That imaginary world sucked for us.
[151] That's why we left it.
[152] It's true.
[153] But there's some sort of a balance, right?
[154] We do appreciate the nature aspect of our world and eagles and salmon and all these wild things.
[155] And to be able to see them is very cool.
[156] But yeah, you don't want to get eaten by those things.
[157] You don't want them everywhere.
[158] You want to be able to go out and get your newspaper without being worried about getting attacked by a jaguar.
[159] It helps.
[160] Yeah.
[161] When you think about the future, at least for me, let me tell you my concern.
[162] Yeah.
[163] I'm worried that rich people are going to get a hold of this technology quick, and they're going to have massive unfair advantages in terms of intellect, in terms of physical athletic ability.
[164] I mean, we really can have a grossly imbalanced world radically quickly.
[166] If this happens fast, where we don't understand exactly what the consequences of these actions are until it's too late.
[167] And then we try to play catch up with rules and regulations and laws.
[168] Yeah, that's a very, very real danger.
[169] And that's why I've written this book.
[170] That's why I'm out speaking every day about this topic, because we need to recognize that if we approach these revolutionary technologies using the same values that we experience today, where, you know, we're here and very comfortable, but just down the road there are people who are living without many opportunities.
[171] There are people in parts of the world like Central African Republic where there's just a war zone.
[172] Kids are born malnourished.
[173] If those are our values today, we can expect that when these future technologies arrive, we'll use those same values.
[174] So it's real.
[175] And right now we have an opportunity to say, all right, these technologies are coming.
[176] Whatever we do, these technologies are coming.
[177] There's a better possible future and a worse possible future.
[178] And how can we infuse our best values into the process to optimize the good stuff and minimize the bad stuff?
[179] And certainly what you're saying is a real risk.
[180] Think of what happened when European countries had slightly better weapons and slightly better ships than everybody else.
[181] They took over the world and dominated everybody.
[182] And so, yeah, it's very real.
[183] The governments need to play a role in ensuring broad access and regulating these technologies to make sure we don't get to that kind of dystopian scenario that you've laid out.
[185] Well, it's also in terms of governments regulating things.
[186] Like, why are they qualified?
[187] Who are they?
[188] Who are the governments?
[189] They're just people, right?
[190] They're people that are either elected or, you know, part of some sort of a monarchy.
[191] You're dealing with either kings and queens and sheiks or you're dealing with presidents.
[192] And we've seen in this country that sometimes our presidents don't know what the fuck they're talking about, right?
[193] So who are they to disrupt science, to disrupt this natural flow of technology and decide?
[194] Well, we need somebody to do it.
[195] We need some representation of our collective will, just to avoid some of the things like you just mentioned. That's the reason why humans banded together and created governments.
[196] And the reason for democracy, especially if you have more functioning democracies, is that your government in some ways reflects the will of the people.
[197] And the government does things that individuals can't do.
[198] And I know there are libertarian arguments.
[199] Well, everyone should just, like, if you want a little road in front of your house, either go build the road or you pay somebody.
[200] But there are a lot of things, even in that model that won't get done.
[201] There are a lot of kind of big national and even global concerns that need some kind of regulation, because what we're talking about is the future of life and life on earth.
[203] And there have to be some kind of guardrails.
[204] And that's why what I'm arguing for is we really need a bottom-up approach.
[205] I think every person, and that's why I'm so thrilled to be here with you today, Joe, every person needs to really understand these revolutionary technologies like genetics, like AI.
[206] And it's all of our responsibility and opportunity to say, hey, this is really important.
[207] Here are the values that I think that I cherish.
[208] And just like you said, I don't want it to be that the wealthiest people are the ones who have kids with higher IQs and live longer and healthier than everybody else.
[209] And so we have to raise our voices.
[210] And there needs to be a bottom-up process and a top-down process.
[211] And it's really hard.
[212] Many people have a concern that someone else is going to do what we're not willing to do first.
[213] Yes.
[214] They're worried about China and Russia.
[215] Those are the big ones.
[216] China and Russia.
[217] Especially China.
[218] Yeah.
[219] They're very technologically advanced.
[220] And their innovation is off the chain. Huawei just recently surpassed Apple as the number two cell phone manufacturer in the world. Five years ago they had like a single-digit share of the marketplace; now they're number two on planet Earth. I mean, they hustle. And if they just decide to make super people and they do it before we do, that's what people are worried about, right? There's trivial things, or seemingly trivial things,
[221] like athletics, and then there's things that are really big. What's to stop people from just becoming the Hulk? What's to stop people from becoming immortal?
Well, there's two questions. First is China, because I think it's a really big issue. One of the biggest stories of the 21st century will be how the U.S.-China rivalry plays out, and the playing field will be these revolutionary technologies. And China has a national plan to lead the world in these technologies by 2050.
[222] They're putting huge resources.
[223] They have really smart people.
[224] They are really focused on it.
[225] And it's a big deal.
[226] In genetic technologies: last year, my book Hacking Darwin was already in production in November when it was announced that these first genetically engineered babies had been born in China.
[227] And so I called the publisher and said, we need to pull this back out of production because I need to reference this.
[228] But it didn't require much of a change.
[229] I'd already written, this is happening.
[230] We're going to see the world's first gene edited humans.
[231] It's going to happen first in China.
[232] And here's why.
[233] So I had to add a few sentences saying, and it just happened in October of 2018.
[234] So China is on that path.
[235] And we need to recognize that on one hand, the United States needs to be competitive.
[236] On the other hand, we don't want a runaway arms race of the human race.
[237] And that's why we need to find this balance between national ambition.
[238] and some kind of global rules.
[239] That's really hard to do.
[240] And the other thing is that we're competing with them.
[241] Yeah.
[242] And so if they decide to do it first, we're almost compelled to do it second or compelled to try to keep up.
[243] Yeah.
[244] How far away do you think we are from physically manipulating living human beings versus fetuses versus something in the womb?
[245] So physically manipulating living human beings, we're there.
[246] We're there.
[247] Yeah, yeah.
[248] So often it's called gene therapy.
[249] So, for example, there's a whole class of treatments for cancer called CAR-T therapy.
[250] So you have a cancer.
[251] When you're younger, your body is better able to fight cancers.
[252] What you can do with someone with the cancer is take their cells, manipulate their cells to give them cancer-fighting superpowers, and put them back into the person's body.
[254] And now the person's body behaves like you're a younger person.
[255] You have the ability to fight back.
[256] So gene therapies are already happening.
[257] A relatively small number of them have already been approved.
[258] But there are thousands of them in applications to regulators around the world.
[259] So the era of making genetic changes to living humans, that's already here.
[260] Like what can they do with it so far?
[261] So so far, most of it is focused on treating diseases.
[262] But a lot more is coming.
[263] Because think about what the human genome is.
[264] Our genome isn't a disease genome.
[265] It's not a health care genome.
[266] It's a human genome.
[267] And so we are going to be able to do things that feel like crazy things, like changing people's eye color, changing people's skin color to funky things.
[268] I mean, there's a lot of stuff that we're not doing now that we will be able to do.
[269] How far away do you think we are for something like that?
[270] 10 years?
[271] So in 10 years, we're going to have green people.
[272] If someone so chooses.
[273] Yeah, if someone so chooses.
[274] What if it sucks?
[275] Will they be able to go back to normal color?
[276] Well, that's a good question. If it's with this kind of gene therapy and it's a small number of genes, probably. But we are messing with very complex systems that we don't fully understand.
[277] So that's why there's a lot of unknowns that coming back to your point on regulation, that's why we, I don't think we want a total free -for -all where people say, hey, I'm going to edit my own genes.
[278] Yeah, and you don't want some backyard hustler.
[279] Yeah, it's true.
[280] It's true.
[281] It comes back to what you're saying about the Hulk.
[282] I mean, I just think that, you know, we're humans, we're diverse. Any kind of thing that you can think of, there is a range.
[284] And there's crazy on the left and crazy on the right and crazy on the top.
[285] So people are going to want to do things.
[286] And the question is, if we're at any society, what do we think is okay and what do we think is not okay?
[287] And maybe there should be some, I believe there should be some limit to how far people can go with experimenting, certainly possibly likely on themselves, but certainly on their future children.
[288] Certainly on the future children.
[289] Yeah, but once you're 18, I think, do whatever the fuck you want.
[290] Well, maybe 25. We're going to have a lot of 25-year-old girls with gills.
[291] It's like the tattoo, seemed like a good idea.
[292] Yeah, well, we probably will.
[293] Yeah.
[294] So you think we're probably like 50 years away from that being a reality?
[295] So I think that we are, the genetic revolution has already begun.
[296] And it's going to fundamentally change our lives in three big areas.
[297] The first is our health care.
[298] So we're moving from a system of generalized health care based on population averages.
[299] So when you go to your doctor, you're treated because you're a human, just based on average.
[300] And we're moving to a world of personalized medicine, and the foundation of your personalized health care will be your sequenced genome and your electronic health records.
[301] That's how they know you are you.
[302] And that's how they can say, this is a drug.
[303] This is an intervention that will work for you.
[304] When we do that, then we're going to have to sequence everybody.
[305] So we're going to have about 2 billion people who have had their whole genome sequenced within a decade, and then we're going to be able to compare what the genes say to how those genes are expressed.
[306] And then humans become a big data set.
[307] And that's going to move us from precision to predictive healthcare, where you're going to be just born, and you and your parents are going to have all this information about how certain really important aspects of your life are going to play out.
[308] Some of that is going to be disease related.
[309] But some of that's just going to be life related.
[310] Like you have a better than average chance of being really great at math or having a high IQ or low IQ or being a great sprinter.
[311] And how do we think about that?
[312] And then, again, a revolution that's already happening.
[313] We're just going to change the way we make babies.
[314] We're going to get away from sex as the primary mechanism for conceiving our kids.
[315] We'll still have sex for all the great reasons we do.
[316] And that's going to open up a whole new world of applying science to what it means to be a human, with a lot of new possibilities.
[317] That's what's going to be so freaky when people stop having sex to make kids and they make kids in a lab.
[318] Every kid's made in a lab.
[319] Well, not only that, I think we're going to move to an era where people who make babies through sex will be seen as taking a risk, kind of like people who don't vaccinate their kids. It's more natural to not vaccinate your kids than to do it.
[320] But people say, wait a second, you're taking on a risk on behalf of your kids.
[321] About 3% of all kids in the world are born with some kind of harmful genetic abnormality.
[322] Using in vitro fertilization and embryo screening, that 3% can be brought down significantly.
[323] And what happens if you see somebody 20 years from now who has a kid with one of those preventable diseases?
[324] Do you think that's fate?
[325] Or do you think, well, wait a second, those parents, they made an ideological decision about how they wanted to conceive their kids.
[326] So I think we're moving towards some really deep and fundamental changes.
[327] Hmm.
[328] Well, yeah, that's an interesting conversation. I wonder if we're ever going to get to a point where people don't allow it, sort of like people don't allow people to not get vaccinated.
[329] Right.
[330] And like there's a lot of that going on today.
[331] Right.
[332] Right.
[333] Which is great, right.
[334] You don't want diseases floating around.
[335] But what if that gets to the place where we do that with people, with people creating new life forms?
[336] What if you say, hey, you are being irresponsible, you're just having sex and having a kid.
[337] Yeah.
[338] I know that's how your grandma did it.
[339] We don't do it that way in 2009.
[340] Yeah.
[341] I think it's going to be hard to do that in a society like the United States, but in a country like North Korea.
[342] They'll be able to do that.
[343] Or if a country said, look, you can make babies however you want.
[344] But if you make babies the old-fashioned way and your kid has some kind of genetic disorder that was preventable, we're just not going to cover it with insurance.
[346] So you're going to have a $10 million lifetime bill.
[347] You don't have to require anything.
[348] You can create an environment where people's behaviors will change.
[349] And then there will be increasing social pressures.
[350] I mean, right now, you know, somebody sees some little kid riding around on their bicycle without a helmet.
[351] They're kind of looking at the parents like, hey, what are you doing?
[352] How come you don't have a helmet on your kids?
[353] And I just think that we're moving toward this kind of societal change where people will, I believe, see conceiving their kids in the lab as a safer alternative.
[354] And it's not just safety, because once you do that, then that opens you up to the possibility of all other kinds of applications of technology, not just to eliminate risks or prevent disease, but you have a lot more information.
[355] So already it's possible to roughly rank order 15 pre-implanted embryos tallest to shortest, and within a decade, from highest genetic component of IQ to lowest.
[356] I mean, this stuff is very real and it's very personal.
[357] What do you think would be the first thing that people start manipulating?
[358] I think certainly health.
[359] Health will be the primary driver because that's every parent's biggest fear.
[360] And that is what is going to be kind of the entry application, people wanting to make sure that their kids don't suffer from terrible genetic diseases.
[361] And then I think the second will probably be longevity.
[362] I mean, right now there's a lot of work going on sequencing the super-agers, people who live to their late 90s, people who live to 100, to identify what are the genetic patterns that these people have.
[363] So it's like to live to 90, you have to do all the things that you advocate, healthy living and whatever.
[364] But to live into 100, you really need the genetics to make that possible.
[365] So we're going to identify what are some of the genetic patterns that allow you to live those kinds of long lives. But then after that, it's wide open.
[366] I mean, it's higher genetic component of IQ, outgoing personality, faster sprinter.
[367] I mean, we are humans.
[368] We are primarily genetic beings, and we are going to be able to look under the hood of what it means to be human, and we'll have these incredible choices, and it's a huge responsibility.
[369] How long do you think before you have a person with four arms?
[370] I think it's going to take a long time.
[371] A couple hundred years?
[372] Well, the thing is, here's how I see it.
[373] There are two primary drivers. One will be embryo selection. So right now the average woman going through IVF has about 15 eggs extracted, and then in IVF, in vitro fertilization, those eggs are fertilized using the male sperm. In an average male ejaculation there's about a billion sperm, so men are just giving it away; human female mammals are a little bit stingy. But then the next killer application is using a process called induced pluripotent stem cells.
[374] And so Shinya Yamanaka, this great Japanese scientist, won the 2012 Nobel Prize for developing a way to turn any adult cell into a stem cell.
[375] So a stem cell is a kind of cell, it can be anything.
[376] And so you take, let's say, a skin graft that has millions of cells.
[377] You induce those adult skin cells into stem cells.
[378] So you use these four things called Yamanaka factors.
[379] And so now you have, let's call it, 100,000 stem cells.
[380] And then you can induce those cells into egg precursor cells and then eggs.
[381] So all of a sudden, humans are creating eggs like salmon on this huge scale.
[382] So you have 100,000 eggs, and you fertilize them with the male sperm in a machine, an automated process.
[383] You grow them for about five days.
[384] And then you sequence cells extracted from each one of those.
[385] And the cost of genome sequencing: in 2003, it was a billion dollars; now it's $800; it's going to be next to nothing within a decade. And then you have real options, because you get this whole spreadsheet and an algorithm, and then you go to the parents and say, well, what are your priorities?
[386] And maybe they'll say, well, I want health, I want longevity, I want high IQ. When you're choosing from big numbers like that, you have some real options. And then on top of that, there is this precision gene editing, the stuff that happened in China last year.
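To make that "spreadsheet plus algorithm" step concrete, here's a minimal sketch in Python of ranking a large embryo pool by a weighted mix of parental priorities. The traits, weights, and per-embryo scores here are invented placeholders; real polygenic prediction is far noisier and narrower than this.

```python
import random

# Toy model of the embryo-ranking idea described above. Every number
# here is a hypothetical placeholder, not real genetics.
random.seed(42)

TRAITS = ("health", "longevity", "iq")
PARENT_PRIORITIES = {"health": 0.5, "longevity": 0.3, "iq": 0.2}  # sums to 1

def sequence_embryo() -> dict:
    # Stand-in for sequencing plus polygenic scoring: one score per trait in [0, 1].
    return {trait: random.random() for trait in TRAITS}

def composite_score(predictions: dict) -> float:
    # Weighted sum of predicted trait scores by the parents' priorities.
    return sum(PARENT_PRIORITIES[t] * predictions[t] for t in TRAITS)

embryos = [sequence_embryo() for _ in range(100_000)]
ranked = sorted(range(len(embryos)), key=lambda i: composite_score(embryos[i]), reverse=True)
print("top 5 embryo ids:", ranked[:5])
```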
[387] And coming back to your question about four arms, I think it's going to be varied.
[388] People have this idea that tools like CRISPR are going to be used and someone's going to sit at a computer and say like four arms and three heads and wings and whatever.
[389] But it's pretty hard because human biology is incredibly complicated.
[390] And we always know more.
[391] But we're at the very beginning of understanding the full complexity of human biology enough to make these big kind of changes.
[392] But if you're choosing from 100,000 fertilized eggs, those are all your natural kids.
[393] Yeah.
[394] And then you would get the best of that and then work on those.
[395] Exactly.
[396] That's exactly the model.
[397] You get that.
[398] And then you say, all right.
[399] What if you had like 20 sons that were awesome and they didn't tell you about 18 of them?
[400] You kept two of them.
[401] Then 18 of them get shipped off to some military-industrial complex.
[402] Turned them into assassins.
[403] Any kind of crazy thing you can think of?
[404] That's the problem, right?
[405] All this stuff will be possible.
[406] And with a lot of technologies, you can imagine all kinds of crazy stuff.
[407] And that's coming back to your earlier point about regulations.
[408] We want to live in regulated environments.
[409] I mean, so like now, think of the internet.
[410] And you know, in the beginning days of the internet, people thought, oh, just let the internet be.
[411] Let, you know, just let it play out.
[412] It's going to liberate all of us.
[413] And now China is showing how the internet can actually be really actively used to suppress people.
[414] Facebook and Google are taking people's information in a way that's frightening a lot of people.
[415] And it appears like, hey, it shouldn't be that these companies can do whatever they want.
[416] We have to have some way of establishing limits because not every individual is able to entirely protect themselves.
[417] They don't have the power.
[418] They don't have all the information.
[419] We need some representatives helping us with that.
[420] The real concern is the competition, right?
[421] The real concern is whether or not we do something in regards to regulation that somehow or another stifles competition on our end and doesn't allow us to compete with Russia and China, particularly China.
[422] Yeah, that's exactly right.
[423] And so what we need to do is to find that balance.
[424] And one of the big issues for this is privacy.
[425] So if you look around the world at the big countries and groupings of countries, there are three models of privacy.
[426] There's Europe, which has the strongest privacy protections for all kinds of data, including genetic data.
[427] There's China that has the weakest.
[428] And there's the United States that has the middle.
[429] And the paradox is from an individual perspective, we are all thinking, well, we kind of want to be like Europe because I don't want somebody accessing my personal information, especially my genetic information.
[430] This is like my most intimate information.
[431] But genetics is the ultimate big data problem.
[432] And so you need these big data pools, and you need access to these big data pools, in order to unlock the secrets of genetics.
[433] So these three different groupings, everyone's making a huge bet on the future.
[434] And here's how we're going to know who wins.
[435] Like right now in the IT world, we have Amazon and Apple and Google and those big companies.
[436] But whoever gets this bet right, they will be the ones who will be leading the way and making a huge amount of money on these technologies.
[437] What we're talking about is a trillion, multi-trillion dollar industry.
[438] How do you think this is going to affect things like competitive athletics?
[439] Hugely.
[440] So right now, we have this problem.
[441] Someone like Lance Armstrong, he's manipulating his body.
[442] And what he's basically doing is adding more red blood cells so that he can carry more oxygen.
[443] And people feel that that's cheating.
[444] And it's a different topic, but probably everybody in the Tour de France was doing exactly that when he won.
[445] But what if, and this will be the case, we're able to sequence people? Let's say nobody's doing drugs.
[446] And we sequence all these athletes.
[447] Some of them will just have a natural genetic advantage.
[448] Their bodies will naturally be doing what Lance Armstrong had manipulated his body to do.
[449] You know that's happening with a sprinter right now?
[450] Yeah.
[451] That female sprinter that has high levels of testosterone.
[452] Yeah.
[453] No, and I feel really sorry for her.
[454] Yeah.
[455] But we have categories.
[456] I mean, with your world in mixed martial arts.
[457] I mean, I think I remember in the past, there was some person who was kind of borderline between genders and was just kicking the shit out of all of these women in cage fighting.
[458] And it's like we have these categories of man and woman.
[459] We know that gender identities are fluid.
[460] But how do we think about it when these genetic differences confer advantages?
[461] So if your body is primed to do something, maybe you could have like a Plato's Republic world where everybody fulfills a function that you are genetically optimized to do.
[462] You could imagine that being a very competitive kind of environment. But what do you do for now, in something like the Olympics, if somebody has this huge genetic advantage? Should we let somebody else manipulate their body? There's a thing called gene doping, changing the expression of genes so your body acts like you're as naturally genetically enhanced as somebody else. It's complicated.
Are they capable of doing certain physical enhancements through gene doping right now?
Yeah.
[463] Like, what can they do right now?
[464] No, no, so the way it works is your genes instruct your cells to make proteins.
[465] That's how the whole system works.
[466] So you can change genes or you can trigger the expression of proteins.
[467] So you can get people's bodies to behave as if they had the superior genes.
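As a toy illustration of that genes-to-proteins pipeline, here's a sketch that transcribes a made-up DNA snippet into mRNA and translates it codon by codon. The sequence is invented for the example, though the four codon assignments used are the standard ones.

```python
# Genes instruct cells to make proteins: DNA -> mRNA -> amino-acid chain.
# Tiny slice of the standard codon table, just enough for this example.
CODON_TABLE = {"AUG": "Met", "UUU": "Phe", "GGC": "Gly", "UAA": "STOP"}

dna = "ATGTTTGGCTAA"              # hypothetical coding sequence
mrna = dna.replace("T", "U")      # transcription: thymine becomes uracil

protein = []
for i in range(0, len(mrna), 3):  # translation: read three bases at a time
    amino = CODON_TABLE[mrna[i:i + 3]]
    if amino == "STOP":           # stop codon ends the chain
        break
    protein.append(amino)

print("-".join(protein))          # Met-Phe-Gly
```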
[468] Yeah, and so that's why the World Anti-Doping Agency is now starting to look at gene doping, and this is the first time that's even being considered as a category.
Are there people that have done that successfully?
You know, I don't know the answer to that. I know that WADA is looking for it, which makes me assume that it must have been done, but I've looked for it and I haven't seen any reports.
What if China starts winning everything?
So one of my sci-fi novels, Genesis Code, was about this. China, as you know, has their system of Olympic sports schools.
[469] And the way it works is they test kids all around the country.
[470] So let's just say it's diving.
[471] And they identify, what are the core skills of a diver?
[472] What do you need?
[473] And then they go around the country and they test kids and then they bring a bunch of them to their Olympic sports schools.
[474] And then they get them all involved.
[475] And then some kids are the best of those kids and the best of those kids.
[476] And then you end up with these champions.
[477] That's why China advanced so rapidly.
[478] But what happens if they're doing that, but it's at the genetic level.
[479] And there are countries like Kazakhstan that are already announcing that they are going to be screening all of their athletes.
[480] So the science isn't there yet.
[481] So it's really impossible right now to say, well, I'm going to do a genome sequence of somebody and I know this person has the potential to be an Olympic sprinter.
[482] But 10 years from now, that's not going to be the case.
[483] Wow.
[484] Yeah, it's sort of going to throw a monkey wrench in the whole idea of what is fair when it comes to athletics.
[485] Yeah, what is fair?
[486] What is human?
[487] Right.
[488] Right.
[489] What is human.
[490] Yeah.
[491] I mean, look, it's not like people don't already alter their bodies by training, by diet, exercise, all sorts of different recovery modalities, cryotherapy, sauna, all of it.
[492] You know, high elevation training, all these different things that they do that manipulate the human body.
[493] But it's not like, it would be kind of crazy if you had sports, but you couldn't practice and you couldn't work out.
[494] Like, we want to find out what a person's really like.
[496] No practice, no working out.
[497] And that's the thing, it comes back to what we were saying before about nature.
[498] It's like we have this feeling, nature somehow feels comfortable to us.
[499] That's what we're used to.
[500] All this stuff that you're talking about, nobody was doing that 10,000 years ago.
[501] It's like, hey, I'm running after a buffalo.
[502] And so, and so as these boundaries change, as the realm of possibility changes, then we're going to be faced with all of these questions.
[503] Even now, look at a sport like competitive weightlifting.
[504] Or they have, like, the real competitive bodybuilding.
[505] And you see these guys, and they're monsters.
[506] And then they have these drug -free guys, and everybody looks like a yogi.
[507] They still look pretty big.
[508] They look pretty big, but not compared to these other guys.
[509] The only way to get to those freak levels is through steroids.
[510] Yeah.
[511] And so, like, how are we going to police this?
[512] And I think it's going to be very difficult.
[513] And so maybe we can have some kind of natural area of life.
[514] But I think that our model of what's normal is just going to change.
[515] Because like I was saying in the beginning, we set our baseline based on how we grew up.
[516] And that seems about right.
[517] Like it seems about right to us that everybody gets immunizations.
[518] But immunizations are a form of superpower.
[519] Our ancestors couldn't even imagine immunizations.
[520] What an unfair advantage, when you have 100 million people dying of Spanish flu. So all this stuff is scary, and it's going to normalize, but how it normalizes, that's what's at play.
Well, the world has changed so much just in the last 20 years, but it feels like this is just scratching the surface in comparison to what's coming.
People misunderstand and underestimate the rate of change, and the reason they do that is since the beginning of the digital revolution, we have experienced a thing called exponential change.
[522] You've heard of Moore's Law, which is basically computing power roughly doubles every two years.
[523] And we've internalized Moore's Law, and that means that every new iPhone, we expect to be better and stronger and faster and all these kinds of things.
[524] But now we're entering a world where we're going to have exponential change across technology platforms.
[525] And so we think about, well, what does exponential change mean in the context of biology?
[526] Well, at the very beginning, genome sequencing is going to be basically free, but we're also going to be able to change life. And because we're on this J-curve, when you think of a 10-year unit of change looking in the rear-view mirror, that amount of change is only going to take five years going forward, and then two years, and then one year. And so that's the reason why I've written this book: we have to get that this stuff is coming fast, and if we want to be part of it, we have to understand it, and we have to make our voices heard.
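To put numbers on that, here's a small Python sketch of the arithmetic behind the sequencing costs he cites ($1 billion in 2003, roughly $800 about 16 years later); the forward projection is a naive extrapolation for illustration, not a forecast.

```python
import math

# Figures from the conversation: whole-genome sequencing cost about
# $1 billion in 2003 and about $800 by 2019 (~16 years later).
cost_2003, cost_2019, years = 1e9, 800.0, 16

halvings = math.log2(cost_2003 / cost_2019)  # ~20.3 cost halvings
halving_time = years / halvings              # ~0.79 years per halving

print(f"cost halved every {halving_time * 12:.0f} months")  # ~9 months
# Moore's Law halves cost roughly every 24 months; sequencing beat it.

# Naive extrapolation of the same exponential:
for year in (2024, 2029):
    projected = cost_2019 * 0.5 ** ((year - 2019) / halving_time)
    print(year, f"${projected:,.2f}")
```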
[527] What makes you nervous about this?
[528] All right, three big areas.
[529] First, humans and our bodies are incredibly complex.
[530] I mean, I talk about genetic code, which is mind -bogglingly complex.
[531] But our genetics exists within incredibly complex systems biology.
[532] We have all these things like our microbiome, our viome, our proteome, our metabolome.
[533] And then that exists within the context of our environment and everything's always changing and interacting.
[534] And so we are messing, and we have the tools to mess, and we will mess, because we're this hubristic species, with these really complex ecosystems, including ourselves, that we don't fully understand.
[536] That's number one.
[537] Two, and you mentioned it before, this issue of equity.
[538] Every technology has to have first adopters.
[539] If you don't have them, you never get the technology.
[540] But what happens if a group of people move much more quickly than other people?
[542] Whether it's real or not, even if they believe it's real, you could imagine big, dangerous societal changes.
[543] And the third big area is diversity.
[544] When we think about diversity, we think, well, it's great to have diverse workplaces and schools, and we're better people for it and we're more competitive.
[545] But diversity is something much, much, much deeper.
[546] In Darwinian terms, diversity is random mutation.
[547] Like, that's our core survival strategy as a species.
[548] If we didn't have that, you could say we'd still be single-cell organisms, but we wouldn't.
[549] We would have died because the environment would have changed and we wouldn't have had the built -in resilience to adapt.
[550] Yeah, that is really important when you think about diversity, right?
[551] That we need a non-uniformity when it comes to our own biology.
[552] Yeah, because we need a bunch of different kinds of people.
[553] We have to have it, because even if we optimize for this world, the world will change.
[554] There's no good and bad in evolution.
[555] There's just well -suited for a particular environment.
[556] If that environment changes, the best suited person for your old environment may be the least suited person for the new environment.
[557] So even if we have things that seem like really great ideas now, like optimizing health.
[558] So if you have sickle cell disease, you're probably going to die, and you're going to die young, and it's going to be excruciatingly painful.
[559] And so you would say, well, let's just get rid of sickle cell disease, which we can do.
[560] But if you are a recessive carrier of the sickle cell gene, you don't have the disease, you're just carrying it, and you have a pretty significant risk of passing it on to your kids, but you also have an additional resistance to malaria.
[561] And so we are probably, we are almost certainly carrying around all kinds of recessive traits, maybe even ones that we don't like that are harming us now, but that could be some protection against some future danger that we don't yet understand or haven't faced.
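The carrier risk he mentions follows from basic recessive inheritance; here's a toy Punnett-square check in Python, assuming the textbook case of two carrier parents (the standard 25/50/25 split).

```python
from itertools import product

# 'A' = normal allele, 'S' = sickle cell allele.
# Hypothetical case: both parents are carriers (AS).
parent1 = parent2 = ("A", "S")

# Each child gets one allele from each parent; order doesn't matter.
children = [tuple(sorted(pair)) for pair in product(parent1, parent2)]
for genotype, meaning in [
    (("A", "A"), "unaffected non-carrier"),
    (("A", "S"), "carrier: no disease, some malaria resistance"),
    (("S", "S"), "sickle cell disease"),
]:
    share = children.count(genotype) / len(children)
    print(genotype, f"{share:.0%}", meaning)  # 25% / 50% / 25%
```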
[562] And so the challenge is that diversity has just happened to us for four billion years.
[563] Now we're going to have to choose it.
[564] And that's a big, it's a big challenge for us.
[565] So essentially, we're going to have, without doubt, some unintended consequences, some unintended domino effect, things that are going to take place that we really can't predict.
[566] We just have to kind of go along with this technology and see where it leads us as it improves.
[567] If you go back and look at surgeries from the 1950s in comparison to surgery of 2019, giant leaps.
[568] I mean, I would never advise someone to get their knee operated on by a 1950s physician.
[569] That's good advice, yeah.
[570] Right?
[571] But that's kind of, someone's going to have to be an early adopter when it comes to these genetic therapies.
[572] Yeah, no, so I agree with you, but where I would slightly, or I would add to what you're saying is these technologies, they're going to happen, they're going to play out.
[573] What's at play now is not whether these technologies are going to advance.
[574] They will. They will advance in a way that is going to just blow people's minds.
[576] What's at play is, what are the values that we are going to weave into the decision-making process so that we can get a better outcome than we otherwise would have had?
[577] And in my view, that's the real important issue now.
[578] Yeah.
[579] Unintended consequences are something that I've been taking very seriously lately when I'm paying attention to technology as it's used in social media.
[580] Particularly, one of the things that has disturbed me quite a bit over the last few weeks is that there's a model that they use, not intentionally, to get people upset about things: show you things in your feed that you argue against, because that makes you click on them more and engage with them more.
[581] And because we have this advertiser-based model, where people are trying to get clicks because they want to get ads on their page, and the more clicks they get, the more money they get from those ads, they want to incentivize people to go there. And the best way through the algorithm, what they've found without doing it on purpose, is to get people upset.
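As a concrete sketch of that dynamic, here's a deliberately simplified feed ranker that optimizes purely for predicted engagement; the field names, weights, and numbers are hypothetical, not any platform's actual code, but they show how outrage-driving content can win under a pure engagement objective.

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_clicks: float   # model's click-through estimate, 0..1
    predicted_outrage: float  # how strongly it provokes argument, 0..1

def engagement_score(post: Post) -> float:
    # Outrage earns replies, shares, and repeat visits, so a pure
    # engagement objective ends up boosting it (hypothetical weights).
    return post.predicted_clicks * (1.0 + 2.0 * post.predicted_outrage)

feed = [
    Post("Calm explainer", predicted_clicks=0.9, predicted_outrage=0.1),
    Post("Enraging hot take", predicted_clicks=0.6, predicted_outrage=0.9),
]
feed.sort(key=engagement_score, reverse=True)
print([p.title for p in feed])  # the enraging post ranks first
```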
[582] Yeah, and they're pushing people into these little information ghettos that are really dangerous.
[583] Right.
[584] And we've gotten to this point where this is just an accepted part of our lives that you go to check your Google feed or your Facebook feed and oh, but what the fuck are they doing?
[585] Is this real?
[586] They're going to pass this.
[587] God damn it.
[588] And then you get mad and then you engage with people online and then it results in more revenue.
[589] But getting them to stop that, if you had to go to Facebook and say, hey, hey, Mark Zuckerberg, I know you have fucking $100 billion or whatever you got, but you can't make any more money this way.
[590] Right.
[591] Because what you're doing is fucking up society because you're encouraging dissent.
[592] You're encouraging people to be upset and to argue.
[593] And you're doing it at great financial reward but great societal cost.
[594] Yeah.
[595] So stop.
[596] Yeah.
[597] And he's not going to do it, right?
[598] Well, he may not do it.
[599] That comes back to the point about regulation.
[600] The question is how big is your stick?
[601] Well, one of the guys who was a founder of Facebook.
[602] Chris Hughes, yeah.
[603] Yes, is now coming out and saying that Facebook needs to be broken up.
[604] And then he was one of the original founders.
[605] And he's like, it has gotten so far out of hand.
[606] It's so far away from what it was.
[607] It's literally affecting global politics.
[608] Yeah.
[609] Well, it is.
[610] And so one option is to break it up.
[611] It seems to have worked pretty well with AT&T.
[612] Another option is to regulate it, which in my mind would be a better approach.
[613] And that is to say, here's what's okay and here's what's not okay.
[614] And this stuff is really intricate.
[615] You have to really get down beneath these algorithms, which are unbelievably complex, but you're exactly right.
[616] I mean, what we're seeing now is we are being pushed into, I said information ghettos, but it's like information barricades.
[617] And so we're pushed into camps.
[618] Yeah, and so.
[619] And it's so dangerous, because this country is based on not everybody agreeing, but having a process where people come and they work it out, and they say, you know, I'm not perfectly happy with this outcome, but here's a compromise.
[621] And if we can't compromise, then our civic culture is going to break down.
[622] And there's so much.
[623] I mean, people don't see these pillars that are holding up our society.
[624] I lived in Cambodia for two years.
[625] And if you don't have these civic pillars under your society, societies look very, very different.
[626] Everyone's life experiences, we kind of take for granted.
[627] You can go out the door, walk to Starbucks and not get shot. Or something happens, your house gets robbed, you call the police, and the police aren't the ones who've robbed your house. I mean, there's all these kinds of crazy things. If we break down the foundations that underpin our lives, that's really dangerous.
What I was kind of getting at was that through this process, how this algorithm selects things that it shows you in your feed, and how people are getting upset by this, and how this is generating massive amounts of revenue, once it's already happened it's very difficult to stop. And my concern would be that this would be a similar thing when it comes to genetic engineering.
[630] We're saying we need to be able to put regulations on this.
[631] We need to be able to establish it.
[632] But once it gets out of the bag, once it gets rolling.
[633] And, you remember when Mark Zuckerberg sat in front of all those politicians?
[634] They had no fucking idea what they were talking about.
[635] How do you make money?
[636] There was such piss-poor preparation.
[637] And it just shows you like these are the people that are looking out for us.
[638] Good fucking luck.
[639] These are Luddites.
[640] They're dumbasses.
[641] They're fools, right?
[642] and they don't know anything about...
[643] Some are, there's better and worse, but...
[644] But almost everyone was underwhelming and under-impressive.
[645] In that hearing, yeah, absolutely, yeah.
[646] The fact that they're dealing with one of the most important moments of our time, but they didn't bring on some sort of a legitimate technology expert who could explain the pitfalls of this and do so in a way that the rest of the world's going to understand.
[647] So they're not going to protect us from genetic engineering either, right?
[648] Because they're generalists in terms of their education for the most part, and they're not concerned.
[649] They're concerned with raising money for their campaign, they're concerned with getting reelected.
[650] That's what they're concerned with.
[651] Yeah, I totally agree with you that if we wait to focus on this issue until it becomes a crisis, it's going to be too late because all the big decisions will have been made.
[652] The reason why I wrote this book, the reason why I'm on almost week three of this book tour doing events like this every day, is that what I'm saying in every forum that I can is: this is really important.
[653] We were watching the news yesterday.
[654] They had this royal baby in the UK.
[655] Like, I don't give a shit.
[656] It doesn't affect my life in any way. But what's at play now is the future of our entire species and our democracy and our lives.
[657] And we have to be focusing on those things because we have a moment now where we can, to a certain extent, influence how these revolutions play out.
[658] And if we just wait around, if we're distracted and we're focusing on all this stuff that's sucking up our attention, whether it's Trump or Brexit or Mueller and all these things.
[659] I mean, how much of our time are we spending focused on that? It's fine.
[660] Let's pay a little bit of attention.
[661] But there's really big stuff.
[662] 50 years from now, 100 years from now, no one's going to look back and say, oh, that was the age of Trump.
[663] They're going to say that was the age when after almost four billion years of evolution, humans took control of their own evolutionary process.
[664] And it's huge and it's going to change all of life.
[665] And what I'm trying to do is to say everybody has to have a seat at the table, whether you're a conservative Christian, whether you're a biohacking transhumanist, everybody needs to be at the table because we're talking about is the future of our species.
[666] We're talking about the future of our species, but are we even capable of understanding the consequences of these actions, the stuff that we're discussing?
[667] Like right now, I'm not.
[668] Like, I'm talking about it.
[669] Right.
[670] I mean, if someone said, hey, you've got to go speak in front of people about the consequences in a very clear one -hour presentation.
[671] I'd be like, no, I'm not.
[672] I don't know what I'm talking about.
[673] One, we can go together, so you're good.
[674] Thank you.
[675] But two, the reason why I've written this book, Hacking Darwin, is I wanted to say: if you could read just one book, and it's written for everybody in a very clear way, with a lot of jokes that I think are funny, my mother laughed at them as well, then you get it.
[676] And then once you know just the basics, as a human being, anybody has an equal right to be part of this conversation as the top scientist or the leaders of any countries.
I would agree with you there, but I don't think that other people are going to see it that way. I think the people that are in control, they're not going to say, hey, we need to be fair with everyone, all the citizens of the world, how do you feel we should proceed?
But that's why we need this bottom-up groundswell. And we can't have a bottom-up groundswell if general people aren't even aware of what the issues are.
[677] And that's the challenge.
[678] And that's why forums like yours are just so important.
[679] I mean, you have all of these people.
[680] And then, you know, maybe everyone doesn't listen to this podcast and say, all right, I get it.
[681] I can go give that hour long speech.
[682] But you can read a couple books.
[683] And then you can give an hour speech because the issues, like, yes, there are scientific issues.
[684] But this isn't a conversation about science.
[685] This is about values and ethics in our future.
[686] And it has to be a conversation for everybody.
[687] Yeah, it's not just a scientific conversation.
[688] It's a conversation about the future of this species and what the species will become.
[689] And that's something we're wholly unqualified for.
[690] No, it's true.
[691] But here's a little vote for optimism.
[692] Okay.
[693] We have never been this literate as a species.
[694] We've never been this educated.
[695] I don't think we've ever been this nice either.
[696] Well, I hope so.
[697] I really do.
[698] Yeah.
[699] When you look at all the wars and all the murder that used to happen, it's actually, this is the best time ever to be alive.
[700] Still sucks for people that are in bad situations.
[701] For sure.
[702] No, but it's, yes.
[703] On average, it's better.
[704] And we've never been this connected.
[705] So in the book, I call for a species-wide dialogue on the future of human genetic engineering.
[706] You think, well, that's nuts.
[707] Seven billion people on Earth.
[708] How are they going to do that?
[709] But we have the opportunity, and we have to try. Because, you know, with the beginning of the genetically modified crops era, the scientists were actually really responsible, but regular people weren't consulted, and they felt, these guys just did it to me. So you have all the marches against genetically modified organisms. We are entering the era of genetically modified humans, and that's going to scare the shit out of people.
[710] And so we need to start preparing, and we need to make people feel that they're respected and included.
[711] And our government leaders aren't going to do it for us.
[712] So we have to find ways of engaging ourselves.
[713] And that's why with me with the book, I set up a website where people can share their views, debate with other people.
[714] I really want everybody to be part of this conversation.
[715] How do you think it's going to play out in terms of how people, various religions perceive this?
[716] Yeah.
[717] So there's a real variation.
[718] So there are people on one end of the spectrum who believe that this is quote unquote playing God.
[719] And if you believe that the world was created exactly as it is by some kind of divine force, and that it's wrong for humans to change it, to quote unquote play God, it's hard to explain how you could justify everything that we've done.
[721] I mean, we've changed the face of this, of life on this planet Earth.
[722] But I really respect people who say, look, I think that there's a line that, you know, that I believe that life begins at conception and that any kind of manipulation after conception is interfering.
[723] That's going too far.
[724] And I respect that.
[725] And those people need to have a seat at the table.
[726] And there's certainly very strong religious views.
[727] In Judaism, there's an idea called Tikkun Olam, which means that the world was created cracked and broken, and it's the responsibility of each person to try to fix it.
[728] And that's a justification for using science and doing things to try to make the world a better place.
[729] And then there are now these new kind of, I mean, transhumanism.
[730] It's almost like a religion.
[731] It's this religion of science.
[732] And so we're going to have, we're humans, we're so diverse, we are going to have this level of diversity.
[733] And the challenge is how do we have a process that brings everybody in? But it's tough. So when we're talking about any sort of genetic manipulation, we're basically talking about doing stuff to the wetware, doing stuff to the biology, right? What do you think about symbiotic interactions with technology? Because one of the things that I'm concerned with more than anything is this sort of inevitable path of technology getting into our bodies, whether it's through nanobots that fix diseases or through implantation. We were talking yesterday about chips, right? Like, what would they have to do to get you to put a chip in your body? What kind of powers would it have to have before you accepted it? Yeah, well, people are already doing it in Sweden. Sure. And so what are they doing in Sweden? Yeah, they're putting just little chips in their hands, under their skin, and they're using them to open doors and access things. So it's just starting. So I definitely believe, you know, right now we look at photographs of our parents and you say, God, look at your hair, your clothes, that's crazy. I definitely think that 20 years from now, 30 years from now, people are going to look at pictures of us.
[734] And you're going to say, what's that little rectangular thing?
[735] And you're going to say, that was a phone.
[736] And they'll say, what?
[737] It's like, yeah, we used to carry it around in our pocket.
[738] Well, like Michael Douglas, when you watch him in that movie Wall Street, he's got that giant phone on the beach.
[739] That was state of the art. We are all Michael Douglas, because our technology, you're absolutely right, is not going to be something that we carry around.
[740] Technology is coming inside of our bodies.
[741] That is the future of where it's going.
[742] And, you know, people say, well, what does human genetic engineering matter when we know that AI is going to get more and more powerful?
[743] But the future of technology, the future of all of this, it's not human or AI.
[744] It's human plus AI.
[745] And that is what's going to drive us. We're co-evolving with our technology, and that's what's going to drive us forward.
[746] But you're exactly right to be afraid and to be concerned.
[747] And again, everything comes down to:
[748] how are we going to regulate it?
[749] Are we going to have guardrails of how far is too far?
[750] Are we going to let companies just do whatever they want?
[751] Or are we going to put restrictions on what they can do?
[752] I think letting the whole world decide, though, you're going to run into those religious roadblocks.
[753] For sure.
[754] And that's the challenge is that the science is advancing exponentially, whatever we do.
[755] And so our understanding of the science needs to at least try to keep pace.
[756] Our regulations need to keep up.
[757] I'm part of the World Health Organization International Advisory Committee on Human Genome Editing.
[758] So we're meeting six times this year in Geneva.
[759] And the question that we're asking is how do we think about global regulation, at least to try to put limits on the far ends of what's possible?
[760] And it's really, really difficult.
[761] But that's why we need to have this kind of process.
[762] And it seems impossibly ambitious, but every crazy idea has to begin somewhere.
[763] So you're meeting every couple months?
[764] Yeah, yes.
[765] Wow.
[766] Yeah.
[767] Because they'd want to be on top of it as things change.
[768] Well, that's the goal.
[769] It's just, it's so hard because...
[770] Almost impossible, it's impossible.
[771] It's impossible.
[772] And that's why even the World Health Organization, which is the lead health organization of the United Nations, it's not enough.
[773] The task is so much bigger.
[774] And that's why we need to have this kind of bottom-up groundswell that I'm pushing for.
[775] And you're absolutely right, what you said before.
[776] Like, because there's not a crisis, people are focusing on other things.
[777] Open any news site.
[778] Like, what do you see?
[779] It's not like the really important stuff.
[780] It's, you know, Trump did this or Kardashians did that.
[781] And, like, we're in this culture where there are a lot of draws on our attention.
[782] But sometimes there's really important stuff.
[783] And people are afraid of it.
[784] People are afraid of science.
[785] People feel like, yeah, I remember science from high school.
[786] I didn't like it.
[787] I was uncomfortable.
[788] You know, this is for technical people.
[789] And I just feel like we can't, science is so deeply transforming the world, not just around us, but within us.
[790] And so we have to understand it.
[791] And people who are explaining science like me, the onus is on us.
[792] Like if somebody reads my book and says, well, that was really dense.
[793] That was too hard.
[794] Like, that's my failure.
[795] Like, I was giving a talk in New York a couple of weeks ago.
[796] And so I gave my talk and I try to make this really accessible for people.
[797] People were all jazzed up. They got it. And then there was this wonderful guy, a brilliant senior scientist at this major stem cell research center. And so the host said, all right, Jamie just talked, can you give us a little background on the science? This guy knows so much. And he started going, and it was very technical, and I could just see the faces of the people in the audience. It was like, oh God, what's happening here. And their level of excitement just shrunk because they couldn't really process it. They couldn't, yeah.
[798] And scientists, by and large, aren't trained to communicate and to see into the future.
[799] So a little more than a month ago, I was in Kyoto in Japan.
[800] And I went to the laboratory of the world's leading scientist who's doing the process I mentioned earlier of turning adult cells into stem cells into eggs.
[801] And so this will revolutionize the way humans reproduce.
[802] And so I was in a meeting with his top postdoc
[803] students.
[804] So these are like really the cutting edge of these technologies.
[805] And I went around to each of them.
[806] And I said, here's my question.
[807] I have two questions for each of you.
[808] One, tell me what you're doing now.
[809] And two, tell me what are the implications of what you're doing now for 50 years from now.
[810] And for the first question, they'd go, oh, I'm doing this.
[811] And we're doing this with mouse models and people were so animated.
[812] And then 50 years from now, people just froze.
[813] And it was so uncomfortable.
[814] They were like squeezing the table just because that's not what scientists do.
[815] They are trained to say, well, this is the thing just in front of me. So I thought I was writing this book for the general public, but I'm being invited to speak to thousands of doctors and scientists because what they're saying is we get that we're doing this little piece of this and whether it's lab research or fertility doctors or all sorts of things.
[816] But it's really hard to put together the whole story of the genetics revolution and what it means for us and for society.
[817] Yeah, man, that is interesting about scientists, right?
[818] They're just concentrating on the task.
[819] I mean, that was like one of the big concerns about the Manhattan Project, right?
[820] This is the task.
[821] The task is how do you figure out how to do it.
[822] So they figure out how to do it, not the eventual consequences.
[823] So when Robert Oppenheimer, who was the lead of the Manhattan Project, when that first bomb went off, I mean, he has his famous quote.
[824] Yeah, exactly.
[825] I mean, the English common translation was, holy shit, what have we done?
[826] And this science is real.
[827] But it's not going to be, it's not one person doing it.
[828] I mean, that's the whole thing, like, science has been diffused. At least with nuclear power,
[829] it was a relatively small number of people.
[830] And it was one or two states that could do it.
[831] Now with precision gene editing, I mean, you will get the Nobel Prize for figuring out how to do CRISPR gene edits.
[832] But to apply it, once the formula already exists, you need like an A-minus in your high school biology class.
[833] So this technology is out there.
[834] It's cheap.
[835] It's accessible.
[836] Did you go to that 2045 conference in Manhattan a couple of years back?
[837] No. Do you know about all that 2045 stuff?
[838] That's part of the thing with these transhumanist folks.
[839] They believe, based on their own calculations of the exponential increase of technology, that it's somewhere around 2040.
[840] Singularity.
[841] At the very least, we're going to reach this point where you're going to be able to either download consciousness or have some sort of artificially intelligent sentient beings hanging out with you.
[842] Yeah.
[843] So I'm involved.
[844] I'm on faculty for one of the programs of Singularity University called Exponential Medicine.
[845] And so we're thinking a lot about that.
[846] Actually, I had an editorial in the New York Times a few weeks ago imagining a visit to a fertility clinic in the year 2045.
[847] And again, because we're on
[848] this exponential change, it's really hard for people to internalize, to kind of feel how fast these changes are coming.
[849] I do think, though, Ray Kurzweil, who's a really incredible genius, he thinks that we are soon going to get to a point where our artificial intelligence is self-learning.
[850] Because when you think about it, AI, if it gets to the point where it can read something, read and comprehend.
[851] Like, in seconds, it will read every book ever written in human history.
[852] And then when you have all these doublings and all this more knowledge, you can imagine how that would happen pretty quickly.
[853] There's a counterargument against that, though I think that it will happen.
[854] But I don't think that our human brains are, on one hand, they're incredibly complex.
[855] And they're also kind of irrational.
[856] I mean, we have all these different layers.
[857] We have our lizard brain and every decision that we make, there's the rational decision.
[858] But then there's all the other stuff that our brains are processing that doesn't even rise to the level of our awareness.
[859] And right now, we only really have one really effective artificial intelligence algorithm, which is for pattern recognition.
[860] But if you think of pattern recognition as a core skill of what our brains do, our brains probably have 1,000, 2,000 different skills.
[861] But the core thing is whether we reach this singularity moment or not, these technologies are going to become incredibly more powerful.
[862] They're going to become increasingly integrated into our lives and into our beings and part of our evolutionary process.
[863] There's no longer, oh, we just have our biological evolution and our technological evolution and those are separate things.
[864] They're connected.
[865] It's going to be that weird question of whether or not an artificial intelligence is going to be able to absorb all the writing that human beings have ever done and really understand us.
[866] Yeah.
[867] Will they really still be able to understand us just because they get all the writing?
[868] So right now, you would say no. I'd say no, yeah.
[869] But 20 years from now, 50 years from now, 100 years from now?
[870] They could come up with a reasonable facsimile.
[871] I mean, they could figure out a way to get it close enough.
[872] Yeah.
[873] You know, where it's like Her, like that.
[874] Yeah, yeah.
[875] That's an essential point because I think when people imagine this AI future, they're imagining like some intimate relationship with some artificial intelligence that feels just like a human.
[876] I don't think that's going to happen because it's...
[877] You don't?
[878] Well, no, but just because AI, it will be its own form of intelligence.
[879] And it may not be, frankly, we wouldn't want...
[880] AIs with these brains like we have that have all these different impulses that are kind of imagining all this crazy stuff.
[881] We may want them to be more rational than we are.
[882] So like chimpanzees are our close relatives.
[883] They don't think just like us.
[884] We're not, you know, we're not expecting them to think like us.
[885] They're their own thing.
[886] And I think AIs will be their own things.
[887] Will we be interacting with them?
[888] Will we be having sex with them?
[889] Yes.
[890] But it's not going to be that they're just like us.
[891] We're going to, they're going to be these things that live within us, live with us, and together we're going to evolve.
[892] Well, they're certainly already better at doing certain things like playing chess.
[893] Yeah.
[894] I mean, it took a long time for an artificial intelligence to be able to compete against a real chess master, but now they swamp them.
[895] Yeah.
[896] And they learn quickly, like, incredibly quickly.
[897] They teach themselves.
[898] Yeah, so first we had chess, and chess people said, oh, that's what it means to be a human.
[899] The computers will never beat humans
[900] at chess.
[901] Now, it's like everyone says, well, no human could ever compete.
[902] And then they said, well, there's this Chinese game of Go, which, when people look at it, kind of looks like checkers, but it's actually way more sophisticated, way more complicated than chess.
[903] I heard that there are more potential moves in Go than there are stars in the universe.
[904] Yes, yes.
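For scale, a rough aside with figures not taken from the conversation itself: the number of legal positions on a 19x19 Go board was computed (Tromp, 2016) to be about 2.1 x 10^170, while estimates of the number of stars in the observable universe run around 10^22 to 10^24:

$$
\underbrace{\sim 2.1 \times 10^{170}}_{\text{legal Go positions}} \;\gg\; \underbrace{\sim 10^{22}\text{ to }10^{24}}_{\text{stars in the observable universe}}
$$

So the comparison holds with well over a hundred orders of magnitude to spare.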
[905] So then there was AlphaGo. This company DeepMind, which was later acquired by Google, built this algorithm that in 2016 defeated the world champions of Go.
[906] People thought we were decades away from that.
[907] And then DeepMind created this new program called AlphaZero.
[908] With AlphaGo, they gave it access to all of the digitized games of Go.
[909] So it very quickly was able to learn from how everybody else had played Go.
[910] With AlphaZero, they just said, here are the basic rules of Go,
[911] and they let AlphaZero just play against itself, with no other experience other than the rules.
[912] And in four days, AlphaZero destroyed AlphaGo.
[913] And then AlphaZero destroyed the world champions of chess and destroyed every other computer program that had ever played chess.
[914] And this, again, those computer programs had internalized all the chess games of Grandmasters.
[915] AlphaZero had not
[916] internalized any.
[917] It just played against itself for a few days.
[918] And then Shogi, which is a Japanese traditional game, kind of like chess.
[919] It destroyed the grandmasters of that.
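As a rough sketch of the self-play idea described here, and only a sketch: AlphaZero used deep neural networks plus Monte Carlo tree search, while this toy uses a simple value table on the far simpler game of Nim. The point it illustrates is the same, though: an agent given nothing but the rules can teach itself strong play.

```python
# Minimal self-play sketch: the agent knows only the rules of Nim
# (remove 1-3 stones; whoever takes the last stone wins) and improves
# purely by playing against itself -- no human games, no prior strategy.
import random
from collections import defaultdict

HEAP = 15
ACTIONS = (1, 2, 3)
Q = defaultdict(float)        # Q[(stones_left, action)] -> learned value
ALPHA, EPSILON = 0.5, 0.1     # learning rate, exploration rate

def choose(stones, greedy=False):
    legal = [a for a in ACTIONS if a <= stones]
    if not greedy and random.random() < EPSILON:
        return random.choice(legal)                   # explore
    return max(legal, key=lambda a: Q[(stones, a)])   # exploit

def self_play_episode():
    stones, history = HEAP, []
    while stones > 0:
        action = choose(stones)
        history.append((stones, action))
        stones -= action
    # The player who took the last stone won; propagate alternating
    # +1/-1 rewards back through the moves of the two "selves".
    reward = 1.0
    for state, action in reversed(history):
        Q[(state, action)] += ALPHA * (reward - Q[(state, action)])
        reward = -reward

for _ in range(50_000):
    self_play_episode()

# The agent rediscovers the known optimal strategy: always leave the
# opponent a multiple of 4. From 15 stones, the winning move is 3.
print(choose(15, greedy=True))   # expected output: 3
```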
[920] So that's what I'm saying: the world is changing.
[921] It's changing so much faster than we anticipate.
[922] And we have to be as ready for that as we can.
[923] I think we need to come to grips with the fact that we're way stupider than we think we are.
[924] We think we're really intelligent and we are in comparison to everything else on this planet.
[925] Yeah.
[926] But in comparison to what is possible, we are really fucking dumb.
[927] In comparison to what this computer can do, and what the future of that computer is, and whether maybe that computer is going to redesign another computer.
[928] You know, this is good, but I've got some hiccups here.
[929] Yeah.
[930] No, it's true.
[931] And yet the technology is us.
[932] Right.
[933] Like, it's not like this technology is some alien force.
[934] It's like we create art. We create, yeah, like you mentioned cities earlier.
[935] Like, we create these cities, which are these incredible
[936] places where dreams can happen, in cities like here in Los Angeles or New York, where I'm from.
[937] So this technology is us.
[938] And the challenge is how can we make sure that this technology serves our needs rather than undermines our needs?
[939] Yeah.
[940] And whether or not our needs supersede the needs of the human race or supersede the needs of the planet.
[941] Yeah.
[942] It's, we're almost too much chimp, right, to contemplate these critical decisions
[943] in terms of like how it's going to unfold from here on out.
[944] We really might, not we, but the people that are actually at the tip of the spear of this stuff, they really might be affecting the way the planet is shaped 100 years from now.
[945] And we're doing that now.
[946] I mean, we are, there is an article that came out the other day.
[947] There's a million species that are on the verge of extinction.
[948] We are driving all these other species to extinction.
[949] We're warming the planet.
[950] So humans are the determining factor in
[951] many ways for how this planet plays out.
[952] And that's why in my mind, everything comes back to values.
[953] You're right.
[954] We have this lizard nature, this monkey nature.
[955] It's who we are.
[956] And that's, I mean, you wouldn't want to take that away because that's the core of what we are.
[957] And yet we're also a species that has created philosophy.
[958] We've created beautiful religions and traditions and art. And the question is, which version of us is going to lead us into the future?
[959] If it's this tribal primate with these urges, like, that's really frightening.
[960] But we can say, you know, we've done better and worse in history.
[961] We had this terrible Second World War.
[962] And yet at the end of the Second World War with American leadership, the world came together.
[963] We established a United Nations.
[964] We established these concepts of human rights.
[965] Like, you can't just kill everybody in your own country and say, hey, it's just my business.
[966] So we have this capability, but it's always a struggle.
[967] I mean, these forces are always at war with each other in many ways.
[968] It's just too much to think about.
[969] Yeah.
[970] But we have to.
[971] I know.
[972] We do have to.
[973] And one of the things that's always been amusing to me is that we seem to have this insatiable desire to improve things.
[974] And I've always wondered why.
[975] But is that maybe because this is what human beings are here for?
[976] Yeah.
[977] It's what we do.
[978] It's who we are.
[979] Right.
[980] Yeah.
[981] But it's just a product of us being intelligent, trying to survive against nature and predators and weather and all the different issues that we evolved dealing with.
[982] And then now we just want things to be better.
[983] We just want things to be more convenient, faster, with more data.
[984] You're aware of Elon Musk's Neuralink technology.
[985] How much do you know about it?
[986] I know a decent amount.
[987] And my friend Bryan Johnson, he has a company, Kernel.
[988] There's a few different companies that are trying to think about these brain-machine interfaces.
[989] And what are they trying to do?
[990] Basically, what they're trying to do is to find a way to connect our brains to our machines.
[991] And there's a little bit of progress.
[992] And our brains are, they're incredibly complicated and they're messy.
[993] I mean, there's a lot that's happening.
[994] But we are increasingly figuring out how to connect our brains to our technology.
[995] And so people are imagining a time when we can do things like
[996] download memories, download ideas, or upload memories and upload ideas.
[997] And there's some very early science that is suggesting that this will be possible, but it's still the very early days.
[998] The very early days.
[999] But Elon was giving the impression that sometime this year they're going to release something.
[1000] You know, they may release something, but it's not going to be something that's going to change the world.
[1001] Because that technology is way more nascent than even the genetics technology.
[1002] that I've been talking about.
[1003] So it's not at all remotely possible that this year you're going to be able to, like, upload a full memory or download a full memory.
[1004] But there are little things that are happening, but every journey begins with a step.
[1005] But the technology is fairly transparent in terms of like where the state of the art is right now?
[1006] It is in that it's extremely early.
[1007] This stuff is, so when you think about systems that we understand, I mentioned that we understand just a little bit about genomics.
[1008] We know less about the brain.
[1009] The brain is kind of the great unknown of this universe.
[1010] We know more about the oceans than we know about our brain.
[1011] I mean, we know very, very little.
[1012] We understand that if you stick an electric current in somebody's brain, that's going to do something, or if you shoot a spike through somebody's head. But really understanding how the brain functions, we're still in the very, very early days.
[1013] So do you think that Kurzweil's off with this idea that you're going to be able to download your consciousness into a computer, because that's one of the most controversial ideas that he's come up with, right?
[1014] So I think he's off, based on your use of the word "your."
[1015] So I mentioned that a month ago, I was in Kyoto, and I was at this stem cell lab, but I also went to another lab of a guy named Hiroshi Ishiguro, who's the world's leading humanoid roboticist.
[1016] And so he's the guy who was on the cover of Wired, and he's created these robot avatars.
[1017] And like I had a conversation with this robot woman, Erica.
[1018] And it was really interesting because I could see that, like, if I would smile, she'd smile and lean forward.
[1019] And if I had like an, you know, over-exaggerated sad face, she'd, like, change her expression.
[1020] And she can have, like, you know, basic, basic conversations.
[1021] Wow.
[1022] But we're still a long way
[1023] from having full robotic-human interactions. But I had this debate with Ishiguro.
[1024] And he was saying that he thought that the future of humanity was non-biological, that we were going to kind of upload ourselves to these non-biological entities.
[1025] And that is how we would gain our immortality.
[1026] And I argued something very different.
[1027] I feel like we are biological beings.
[1028] I think we'll fully integrate with our technology.
[1029] But if we ever become entirely non-biological, then that's not us.
[1030] Either we will have committed suicide as a species, or these robots or others will have killed us. Because even if, let's just say that I could download my entire consciousness to some kind of robot, and let's just say that was possible, that robot would be me for that first exact moment when the transfer happened.
[1031] But then, beyond that, it wouldn't be me anymore, because there would be a whole other set of experiences.
[1032] But certainly our interaction, our connectivity with this tech is going to be greater.
[1033] And so even if Kurzweil isn't exactly right, he's directionally right. Yeah, the problem would be that you would be locked in. Like, if they downloaded your consciousness into some sort of bank of computers somewhere, right, where are you? If that's your consciousness, your consciousness is in these ones and zeros. Yeah. And, I mean, that's terrifying. What's terrifying is if somebody didn't like you and they said, I'm going to make one version of you suffer for all eternity. Yeah, yeah, yeah. I'm going to just download you while you sleep. It's true, but I have something worse than that.
[1034] Okay.
[1035] And so I think that nobody is going to say, well, I'm going to be Joe living a life or I'm going to like not be Joe and I'll just have my consciousness downloaded someplace else.
[1036] And so if the comparison is, well, I've lived this life and I don't want to die.
[1037] And so I'd rather kind of be here in some kind of version.
[1038] And even if it's not me, it's just something.
[1039] Really?
[1040] I think some people will want that.
[1041] Not everybody.
[1042] I think some people will, but they don't know what they're getting, right?
[1043] in terms of you don't know what that experience is going to be like, nor do you know if there is some sort of a chemical gateway that happens in the mind when you do expire and allows you to pass through to the other dimension that your consciousness and your soul longs to travel to.
[1044] But no, you've downloaded it.
[1045] I hope you're right about that.
[1046] I'm definitely not right.
[1047] I've written about this in my novel.
[1048] It's like, yeah, but I think kind of when you're dead, you're just dead.
[1049] Why do you think that, though?
[1050] Well, just because I think that this kind of immortality comes because time stops.
[1051] Time is this relative concept.
[1052] And so at the moment that you die, that's immortality for you because time stops flowing for you.
[1053] Time is, that's what Einstein taught us.
[1054] Time is this relative concept.
[1055] Other people very legitimately, and there's no way to prove it, feel that we have this soul and this soul can travel to other dimensions.
[1056] I happen to believe that we are biological beings and our experience of the soul, whatever, is connected to our biology.
[1057] When our biology stops functioning, those experiences, whatever they are, stop being accessible, at least to us.
[1058] Have you had any psychedelic experiences?
[1059] I haven't, and I was so tempted.
[1060] We started the interview talking about my chocolate shamanism.
[1061] You haven't had anything?
[1062] I haven't.
[1063] But I was tempted. Did you want to?
[1064] I don't think so, and I'll tell you why.
[1065] So I listened to the Michael Pollan interviews, and he had his great conversation with Sam Harris, and I really think that this psilocybin stuff is real.
[1066] Just got decriminalized in Denver.
[1067] I know, in Denver.
[1068] I was just there the other day.
[1069] But, as I said before, I think that the ultimate drug is us.
[1070] And so for me, I would rather, and I definitely think that our awareness, it doesn't encompass everything that is knowable, everything that we could know, but we hem ourselves in.
[1071] And if we want to get out of those limitations, certainly drugs are ways that people have used for many thousands of years.
[1072] But when you take the drug, you're like taking the drug and then you're not taking the drug.
[1073] Like I would rather...
[1074] What does that mean?
[1075] It's like if you're taking psilocybin, you're having this great experience.
[1076] But then tomorrow you're not taking psilocybin and so kind of your consciousness has narrowed.
[1077] My aspiration would be to recognize that the drug is us, that if we want to expand our consciousness, there are all kinds of ways, whether it's meditation or awareness or just, you know, simple appreciation.
[1078] That's what I do in these cacao ceremonies.
[1079] What I say is, like, you have this cacao in front of you, but it's not just this.
[1080] Like, think of the person in Honduras who planted the seed, the person who watered that seed, the person who took the plant, the person who paved the road to bring the plant.
[1081] And I just think that we can expand our consciousness through our own means.
[1082] And then we always have access.
[1083] I hear what you're saying, but you're saying this from a person that's never had psychedelic experiences.
[1084] I know.
[1085] It's really preposterous.
[1086] If you did experience what psilocybin can do to you, you definitely wouldn't be saying it this way.
[1087] You also wouldn't be thinking, then you take it and then you're not on it anymore, because it's profoundly influential for your perspective in regard to the whole rest of your existence.
[1088] There's many people that have had psychedelic experiences that think about it as a rebirth, that they've gone through this and changed.
[1089] So why would you have this rigid thought process about drugs and not drugs?
[1090] But yet you don't have it about cacao, which is a mild drug.
[1091] Yeah.
[1092] And so you're right that it may not be entirely consistent.
[1093] Some of the people you've described are good friends of mine who've really done it.
[1094] And I've really, I've talked to them about it.
[1095] And I'm endlessly curious.
[1096] So why don't you do it?
[1097] The reason is so far I have been on this journey to see what's possible within myself.
[1098] And I'm still on that journey.
[1099] I'm not, I don't want to close off any possibility for anything.
[1100] You assume that it would close things off.
[1101] That's what's confusing.
[1102] It's just opening you up to a new experience that other people have found to be profoundly influential.
[1103] Yeah.
[1104] So so far.
[1105] You're very resistant to this.
[1106] Like, you know what I'm talking about, you're like, yeah, yeah, yeah.
[1107] Like you can't wait to come back with your own rational perspective from the perspective of a person who hasn't experienced anything.
[1108] You're so right.
[1109] And just I think it's such a great point.
[1110] Because I'm very close with actually the Tibetans.
[1111] One of my closest friends is the prime minister of the Tibetan exile government.
[1112] So I've been many times to Dharamsala in India.
[1113] I've met with His Holiness, the Dalai Lama many times.
[1114] And the most incredible thing about meeting with these guys.
[1115] And they are all people who've found these incredible states of heightened consciousness, so much so that their brains are changed when they go into the fMRI machines.
[1116] But when you have a conversation with them, it's not like what we do.
[1117] Like, exactly. And thank you for calling me out. Like, you say something, and I've already in my head countered what you're saying before you're finished saying it. That's why you kept saying "exactly," which probably means that you made me a little uncomfortable, which is good. That's what we want. And these guys, it's like, you'd talk to them and they would just be so tuned in to what you were saying, and they would just kind of think about it. Then you'd finish, and then they'd kind of look up. And because we're Americans, you know, when somebody stops speaking, you feel like you have to speak right away. And with them, there was, like, a minute.
[1118] And then it's like you're kind of looking around.
[1119] Was it me?
[1120] Did I say?
[1121] And then they would come back with this incredibly thoughtful thing.
[1122] So you're right.
[1123] You know, I don't want to close off any possibility.
[1124] There are many different things that we could do.
[1125] The path that I have been on, and certainly with this cacao, and you're right, cacao is a mild drug, I mean, not a huge one, but a mild drug, and life is a mild drug. So, yeah. Psilocybin's not very mild. Yeah, no. And my friends who have done it have said exactly what you said. I have a good friend who did it, and he said, it showed me a path to a different identity, a different consciousness. And now, he said, I don't do psilocybin, but I do daily meditation, and I can see where I'd like to go, what's possible. So I get that. Yeah, yeah, there's a bunch of different substances out there that have very similar profound effects.
[1126] There's a real thought, and this is something that Terence McKenna described way back in the late 90s, early 2000s. He believed that you're going to be able to recreate a lot of psychedelic states through virtual reality, so that people that don't want to actually do a drug will be able to experience what it's like to be on that drug. And, I mean, it's very theoretical and hypothetical, and, you know, who knows whether or not that's possible.
[1127] But that could be one other way that human beings interface with technology.
[1128] So humans, in my view, are far more hackable than we think.
[1129] We just imagine our biology as being fixed, but our biology is really variable.
[1130] Like, I have a friend who's an anesthesiologist at Stanford, and she's done this experimentation of running very mild electric currents through people's brains, and people have these very real experiences, whether it's arousal or something else. You're entering people's minds through different ways.
[1131] So you talk about virtual reality.
[1132] I mean, we are entering a world where the pixelation of virtual reality will be equal to life.
[1133] And so you're going to be in this VR space, and it will look, it may smell, it could even, with haptic suits, feel just like life. And our brains... I don't know if you've done these things with these VR glasses, where you can see people. You're just in a hall, and you put on the glasses, and now you're in an elevator going to the top, on the outside, like a window cleaner's elevator, to the top of this high-rise. And there's a little rickety board, and then there's this cat at the end of the board, and they're saying, yeah, you have to go save the cat. And you've already seen that you're just in a hall. You know it. In your brain, there's this cat, everybody's looking at you, and you've seen all these other people panic. And it's like, when I'm there, I'm just going to go grab that cat. And you're terrified. You're trying to override your lizard brain, and your lizard brain's saying, no, don't step off this cliff. Do we have that for our HTC Vive?
[1134] Yeah.
[1135] Yeah.
[1136] We need to set that up, Jamie.
[1137] We need a two by four that we put on the ground for that.
[1138] Oh, it's incredible.
[1139] And so I think that this whole concept of reality, like, our technology is going to be changing our sense of reality.
[1140] And then what's real?
[1141] Like if you feel arousal in your brain because of an electric current or you feel it because you're with your girlfriend or wife or reading a magazine or whatever, is that different?
[1142] Is one real and one not real?
[1143] I mean, I think that there's real big issues here.
[1144] Yeah, I mean, that is the matrix, right?
[1145] I mean, the idea that, if it feels better and it's more enjoyable than real life, what is going to stop people from doing the Ray Kurzweil thing and downloading themselves into this dimension?
[1146] Well, I mean, whether it's possible to do a full download or not, I mean, I think that's an open question.
[1147] But whether people are going to be more comfortable living in these alternative worlds and whether we're going to be able to say, oh, no, that is the fake world.
[1148] Like if you're in this virtual world, but you're doing, you have friends in that world, you're interacting in that world, you have experiences that feel every bit as real in that world as in our world and people say, oh no, that's not real.
[1149] Those aren't your friends.
[1150] Like even now, like, you know, we all, people with global lives, you kind of have these friends, like, I have a good friend in Mongolia.
[1151] We talk all the time. Do you ever see them in person?
[1152] Once in a while, like once every year or two, it's great to see them.
[1153] Well, that's a real person, though.
[1154] No, it's real.
[1155] But that's someone you actually know.
[1156] No, but I mean that.
[1157] I mean, you actually do know them.
[1158] No, absolutely.
[1159] But if it was someone that you only talk to online and they live in Mongolia, that's where things get weird.
[1160] It's true, but let's just say, following that hypothetical, you have that person.
[1161] They're part of your whole life.
[1162] And they're with you, they're with you through your life experiences, you call them up when you're sad.
[1163] Like, is it so essential
[1164] that you've met that person physically?
[1165] Like, is that the core of what it means to be someone's friend that you have?
[1166] It's not essential, but it means a lot.
[1167] It does.
[1168] And so I mean, because we are not, like, we are these physical beings and we are these virtual beings, but figuring out what's the balance is going to be really tricky.
[1169] Yeah, what is the balance?
[1170] I'm worried about augmented reality, too.
[1171] Because, you know, when you see people that use Snapchat filters and they give themselves doggy ears and stuff like that, like, how long before that is just something that people choose to turn on or turn off about life itself?
[1172] Yeah.
[1173] Like, you'll be able to see the world through different lenses.
[1174] The sky could be a different color.
[1175] The plants could be a different color.
[1176] Yeah.
[1177] I write about this in one of my novels, Eternal Sonata, where I think we're just going to have these contact lenses, and it'll be different kinds of information based on what people want.
[1178] I'm like, I'll meet with you and it'll say, all right, this is Joe, here's a little bit of background, whatever, and we'll have useful information or you're walking around a city and you'll get little alerts of things that you might do or history.
[1179] And so I think that that's what they were thinking about with Google Glasses, right?
[1180] Yeah, I know, but it just was so annoying that people wanted to kill people.
[1181] It was just too weird.
[1182] Yeah, yeah, yeah.
[1183] Yeah, it was just too.
[1184] Everybody felt like they were getting filmed, too.
[1185] They were.
[1186] Yeah, I mean, when you were walking around with Google Glasses on, you assume that people were recording everything.
[1187] It was very strange.
[1188] They were.
[1189] But I think that's another thing that we're just, all of our lives are going to be recorded.
[1190] Of course.
[1191] Now, do you think that that's going to come in the form of a contact lens or do you think it's going to come in the form of like ski goggles that you're going to put on and see the world?
[1192] Nobody wants to look like an idiot.
[1193] And so in the beginning, you talked about Michael Douglas with his, or like my favorite one is Kurt Russell in Escape from New York.
[1194] What do you have there?
[1195] It's like this really cool tech.
[1196] Like he finally gets out of this Manhattan hell, and he's got this phone, and it's like, you know, this big.
[1197] And it's like, yeah, Mr. President.
[1198] This is pre-cell phone.
[1199] It's like very, very early days.
[1200] And so, you know, now we have these kinds of glasses, and there's like a little bit of cash.
[1201] Palm has a cell phone that's that big.
[1202] Have you seen that?
[1203] No. It's at the Verizon store.
[1204] Yeah.
[1205] I think it's like an attachment or an accessory to a phone.
[1206] Yeah, it's like you can bring it with you and not bring your other phone.
[1207] That's the idea behind it.
[1208] So, like, you could decide, well, I'm going out.
[1209] Let me just bring my tiny phone for essentials.
[1210] But all this is tiny.
[1211] I mean, the quote-unquote phones are just going to get so small.
[1212] That's why I say they're going to come inside of us.
[1213] You'll have, like, a little contact lens, maybe a little thing in your ear, maybe like a little permanent implant in your, behind your tooth.
[1214] Or one of your teeth, they'll replace a tooth with a computer.
[1215] Yeah.
[1216] Pipe it right into your nerves.
[1217] Any kind of crazy stuff you can think about is probably going to happen.
[1218] Probably going to happen.
[1219] Some of it will take, some of it won't.
[1220] Right.
[1221] Right.
[1222] Yeah.
[1223] It seems like that's what we're going to
[1224] have to see, like how it plays out.
[1225] And that's one of the things when you were talking about scientists that are working on these things, they're working on what's right in front of them.
[1226] They're not looking at the greater landscape itself in terms of like what the future holds.
[1227] It's not their job.
[1228] And that's why we need other people.
[1229] I certainly see myself.
[1230] Who are those people?
[1231] You and who else?
[1232] Who would you elect?
[1233] If Trump came to you and said, Jamie, we've got problems.
[1234] We need to figure out the future.
[1235] What should we do?
[1236] So certainly, I mean, we need to have a mix of different kinds of people.
[1237] And so I, certainly people like me who are kind of big picture futurists, we need that.
[1238] We need scientists.
[1239] I work with some really incredible scientists.
[1240] I did an event at Harvard last week with George Church, who's kind of like the living Charles Darwin.
[1241] David Sinclair, who you know has been on this program, is a friend of mine, working on life extension.
[1242] And we need people just from all backgrounds.
[1243] And so, like I would say, like, we need people.
[1244] We need people from the developing world.
[1245] We need all kinds of people, but in terms of the people who are kind of articulating the big picture of the world and what are the challenges that we're facing, I certainly put myself in that category.
[1246] People like Yuval Noah Harari, who are also kind of big thinkers, people like Sid Mukherjee.
[1247] And I just think we have to articulate the big picture.
[1248] And we have to do it in a way so that people can see themselves in this story and then enter into the conversation.
[1249] Are you writing this book just to sort of educate people and let them understand exactly what is going on, and that it is a really volatile and chaotic and amazing time, and that all these things are happening? Or are you doing this book to... What essentially was Hacking Darwin?
[1250] Like, what was the motivation behind it?
[1251] Was it for a person like me or was it for the everyone?
[1252] It's for everyone.
[1253] And so what I really wanted to do, so the background, I can give you just a little bit of background.
[1254] So more than 20 years ago, I was working on the National Security Council.
[1255] And my then boss, Richard Clarke, who was then this obscure White House official, was jumping up and down saying we need to be focusing on terrorism and al-Qaeda and bin Laden.
[1256] And he was trying to tell everybody, and nobody was paying attention.
[1257] He was totally marginalized.
[1258] And when 9/11 happened, Dick's memo was on George Bush's desk saying exactly that.
[1259] We need to focus on al-Qaeda.
[1260] Here's what's going to happen.
[1261] And Dick, even before then, would always tell me that if everyone in Washington was focusing on one thing, you could be sure there was something much more important that was being missed.
[1262] And so more than 20 years ago, I was looking around, I saw these little pieces of disparate information, and I came to the conclusion that the genetics revolution was going to change everything.
[1263] So I educated myself.
[1264] I started writing articles.
[1265] I was invited to testify before Congress.
[1266] And then to try to get that story out, I wrote my two most recent near-term sci-fi novels, Genesis Code and Eternal Sonata.
[1267] And when I was on book tours for those, and I explained the science to people, the way a kind of self-educated citizen scientist and a novelist would explain the science, all of a sudden people got it.
[1268] And that was when I realized I needed to write a book about the genetics revolution that people could absorb, that wouldn't scare people.
[1269] But my mission for the book is that this stuff, as we've talked about, it's
[1270] so important that everybody needs to be part of the conversation.
[1271] We have this brief window.
[1272] And what I'm calling for is this species-wide dialogue on the future of human genetic engineering.
[1273] And I have a whole game plan in the book about what people can do, how they can get involved, people individually, on a national level.
[1274] We have to put a lot of pressure on our elected leaders and say, stop focusing on the crap.
[1275] There is really important stuff that needs to be addressed.
[1276] and we need leadership.
[1277] I'm speaking in Congress a week and a half from now talking about these issues.
[1278] So we need to have.
[1279] And on an international level, we have to have some kind of international system.
[1280] We're so far away from being able to do that.
[1281] We don't even know what the standards are, but we have to be pushing.
[1282] And so I know.
[1283] So you think of it in terms of like the same way we have with like nuclear weapons?
[1284] Yeah.
[1285] In many ways, yeah.
[1286] But the thing is with nuclear weapons, a lot of that happened at the state level, at the country level.
[1287] This needs to happen at a popular level and at a government level.
[1288] So the only way that that's going to happen effectively is we need real comprehensive education on the subject.
[1289] It's not something that people can just guess, right?
[1290] They need to know what the consequences are, where we're at right now.
[1291] Yeah.
[1292] And that's like Sanjay Gupta had a wonderful quote that's actually on the cover of my book, which is if you can read one book on the future of our species, this is it.
[1293] So what I've tried to do is to say, like, if you just want to go to one place to understand what's happening, what's at stake, what it means for you, and what you can do now if you want to get involved, I've tried to do that. But then, ironically, I'm now, as I mentioned, being asked to speak to thousands of doctors and scientists, because they're all reading this book and they're saying, this is positioning my work in a much bigger context. Sanjay Gupta is a very interesting cat because he was very anti-marijuana and then started doing research on it, and then totally flipped 180 degrees, which to me is a great sign of both humility and intelligence.
[1294] He recognized that the data was different than his presuppositions.
[1295] He had these prejudices that were very common.
[1296] Yeah.
[1297] Now, when you speak to Congress, do they brief you in terms of like what they would like specifically for you to address?
[1298] No. So this one, I've been asked to go and speak, and a lot of members of Congress are going to be invited.
[1299] And what I'm going to tell them is, look, this is really important.
[1300] Our Congress is not doing enough.
[1301] And here are the things that we need to do.
[1302] What are you going to say to try to really get it into their head?
[1303] What I'm going to say is that the genetics revolution is here.
[1304] If we don't have a system, if we don't have a rational system to manage it... You talked about public education.
[1305] The challenge that we face in the United States is we traditionally have had a representative democracy.
[1306] And now we're transitioning from
[1307] a representative democracy to a popular democracy.
[1308] So Switzerland has a popular democracy, but they have really well-educated people who have enough information to make smart decisions.
[1309] We haven't educated our public, and yet the public is making big decisions, and a lot of it is happening just on a gut feeling.
[1310] That's what's happening with trade agreements, where people have a feeling it's bad without the ability to really get into the details.
[1311] And so we are having that transition, which means there's a lot of responsibility on us to educate our public.
[1312] And it's just, it's a tragedy.
[1313] We treat people in this country, like you can just throw people away.
[1314] Like, if you're somebody in some crappy school system, your chances of success are so minimized, not because of anything that you've done, just because of your circumstance.
[1315] And it's unacceptable.
[1316] It is unacceptable.
[1317] I mean, equal opportunity is what we really should all strive for.
[1318] And I think some people conflate that with equal
[1319] success.
[1320] Right.
[1321] And you're not going to get the same equality of outcome.
[1322] You're going to get different amounts of effort and different people are qualified or more talented at different things.
[1323] But what I'm worried about is what I said initially that some people are going to get a hold of this stuff quickly and it's going to give them a massive advantage.
[1324] Yes.
[1325] Sort of like, I mean, the first person that has the ability to go forward in time five minutes is going to be able to manipulate the stock market in an unprecedented way.
[1326] I don't think that that's really possible in our lifetime, but that's the kind of thing I'm talking about.
[1327] You could get so far ahead that if we're talking about competition, there will be no catching up.
[1328] But you don't have to travel in time to do that.
[1329] Right.
[1330] But I'm saying that that was a technology.
[1331] But there's real technologies that are likely to happen, which are going to confer billions, tens, hundreds of billions of dollars of benefit.
[1332] Of advantage.
[1333] Yeah, yeah, and that stuff is happening now.
[1334] And this is the concern with getting behind countries like China.
[1335] Because if they get ahead of us in something like that, when they're already moving in this direction in terms of technology.
[1336] Yeah, so I am a proud American.
[1337] My father and grandparents came here as refugees.
[1338] I believe in what this country at our best stands for.
[1339] I want us to continue to be the country that's setting an example for the rest of the world, that is articulating our ideals of responsibility and
[1340] good governance and accountability and all these things that we've championed.
[1341] And because of that, I want us to get our act together politically.
[1342] And I want us to be the leading technological country in the world.
[1343] And so I think that's what's at stake.
[1344] And we're losing so much time because there was a time in the period after the Second World War where we recognized that technological leadership was the foundation for everything else.
[1345] We had recreated the world out of the ashes of the war.
[1346] But we realized that we needed to have the economic growth.
[1347] We needed to have the competition.
[1348] We needed to have these technologies.
[1349] And it was a miracle what we've done.
[1350] And now we've lost our focus and we have to regain it.
[1351] The way I look at humans and the way I look at the human race today in 2019, it's like we're driving very fast through fog.
[1352] And it's very difficult to see what's in front of us.
[1353] When I look back at, I don't know if you ever read any H.G. Wells, but some of his predictions about the future are really interesting.
[1354] Because he was pretty close on quite a few things. But that vision, to be able to sit there and use your imagination, close your eyes and think, what is this going to be like? What we're dealing with now as opposed to what, 2119, right? Which is similar to H.G. Wells versus us, right? What the fuck is that going to be like? Yeah, is there going to be a time where there are no diseases, there is no death, and we just have to regulate population control in some sort of other manner?
[1355] And the only way people die, they're going to die from accidents and things along those lines.
[1356] But mortality in terms of old age.
[1357] I mean, according to David Sinclair, this is a fixable issue.
[1358] It's a matter of when they fix it.
[1359] Do they fix it in 20 years or 30 years or 50 years?
[1360] Maybe.
[1361] David is a friend and I have a whole chapter.
[1362] I don't want to misquote him either.
[1363] I have a whole chapter in the book on the science
[1364] of human life extension.
[1365] So I think definitely it's real that we're going to live healthier longer.
[1366] We're going to harness our technology for that.
[1367] I don't think that immortality, that biological immortality is in the cards for us.
[1368] Maybe not immortality because we'll still be biologically vulnerable.
[1369] We'll have hearts and brains and stuff.
[1370] But aging.
[1371] Yeah, I think we will age slower and we will live healthier longer and I think it's going to be great.
[1372] But back to your core point, I mean, that's the reason why I also write science fiction is that the world of science is changing so fast that we really need to apply a lot of imagination to imagine where it's going.
[1373] Because if you're just looking at what's happening now, it's like this train is going to speed by you.
[1374] We have to kind of imagine.
[1375] It's like Wayne Gretzky.
[1376] We have to imagine where the puck is going to be, not where it is now.
[1377] And I mentioned George Church, who's like, he's at Harvard.
[1378] He's like the living Charles Darwin.
[1379] And I do a lot of speaking alongside George.
[1380] And it's become our little thing: he says that he reads science fiction like mine and then imagines and says, well, that's pretty cool, how can we do that? And what I do is I look at the research coming out of labs like George's, and I say, right, well, that's where we are now. What's that going to mean in 2050, in 100 years? And so science fiction plays a more important role than it ever has in kind of imagining where we're going. And it's that imagining that allows us to try to say, well, if that's one of the options of where we're going, what are the decisions that we need to make now so that we can have a better outcome rather than a worse one?
[1381] If you're a gambling person, I don't know if you are, but if I had to give you a hundred bucks to put on something in 20 years that's going to profoundly change us in a way that we're not really prepared to understand or deal with.
[1382] What do you think that's going to be?
[1383] I think it's going to be predictive genetics. Right now, it's like you go to your doctor when you're sick.
[1384] You could have been, this could have been some genetic disorder that you had from the moment you were conceived.
[1385] And it was ticking.
[1386] And it was ticking.
[1387] And you showed up 50 years later when that's been manifest.
[1388] So it's going to be very different.
[1389] You're taking your kid home from the hospital, your newborn.
[1390] And the doctor says, hey, congratulations, this is really great.
[1391] But just FYI, your kid has a 50% greater-than-average chance of getting early-onset Alzheimer's 50 years from now, and your kid has a really great chance of being phenomenal at abstract math. Like, how are we going to think about what it means to be human when we have all of that information about things that we now call fate? It's just a different model. And once we have that, that's going to change a lot of things. It's going to fundamentally transform our health care.
[1392] What we call health care is really sick care.
[1393] You show up with a symptom.
[1394] And this is going to be predictive.
[1395] And it's going to change the way we make babies because people are going to have real choices about which embryos to implant.
[1396] And we're going to have a lot of information about a lot of really intimate stuff.
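To make "predictive genetics" concrete, here is a minimal sketch of a polygenic risk score, the kind of arithmetic behind a claim like "a 50 percent greater-than-average chance." Every variant, weight, population parameter, and the scaling factor below is a hypothetical illustration, not a value from any real study.

```python
# Toy polygenic risk score: risk as a weighted sum of many small
# per-variant effects, compared against a population distribution.
# All numbers here are invented for illustration.

import math

# hypothetical variants: (risk alleles carried: 0/1/2, effect weight)
genotype = [(1, 0.12), (2, 0.05), (0, 0.30), (1, 0.08)]

def polygenic_score(variants):
    """Sum of (allele count * effect weight) over all variants."""
    return sum(count * weight for count, weight in variants)

def relative_risk(score, population_mean, population_sd):
    """Express the score as a multiple of average risk, assuming a
    simple log-additive model (an illustrative assumption)."""
    z = (score - population_mean) / population_sd
    return math.exp(0.5 * z)   # 0.5 is a made-up scaling factor

score = polygenic_score(genotype)                      # 0.30
print(f"risk vs. average: {relative_risk(score, 0.25, 0.10):.2f}x")
# -> risk vs. average: 1.28x
```

The design point is only that predicted risk is a weighted sum over thousands of small genetic effects, read against a population baseline, which is why it yields probabilities rather than certainties.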
[1397] So you feel like genetic manipulation and genetic engineering, genetic understanding, genetic knowledge, and then applied genetic medicine.
Those are going to be the big changes in the next 20 years, even more so than technology?
Well, it's interconnected, because there's really a super convergence of these technologies.
[1401] So the genetics revolution is the artificial intelligence revolution in the sense that the complexity of genetics is so great.
[1402] It's way beyond what our brains on their own could process.
[1403] And so really, all these technologies are touching each other.
[1404] And so the biological models are now influencing the AI.
So, for example, we are coming to the limits of silicon storage, but DNA has unlimited storage capacity.
So, as I've said before, the boundaries between biology and AI, or genetics and AI, are going to be very blurry.
[1408] Yeah, that is an interesting concept, right?
[1409] The idea of storing information in DNA, and that has been discussed.
[1410] Yeah, DNA is the greatest information storage mechanism ever imagined.
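As a concrete sketch of DNA as a storage medium: with four bases, each base can carry two bits, so one byte maps to four nucleotides. The mapping below is one common convention, not any specific lab's scheme, and real DNA-storage research systems add error correction and avoid long runs of identical bases, which this toy omits.

```python
# Toy DNA data storage: map every 2 bits onto one of the four bases.

BASE_FOR_BITS = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
BITS_FOR_BASE = {v: k for k, v in BASE_FOR_BITS.items()}

def encode(data: bytes) -> str:
    """Turn bytes into a DNA-like string, 4 bases per byte."""
    bases = []
    for byte in data:
        for shift in (6, 4, 2, 0):          # high bits first
            bases.append(BASE_FOR_BITS[(byte >> shift) & 0b11])
    return "".join(bases)

def decode(dna: str) -> bytes:
    """Invert encode(): 4 bases back into one byte."""
    out = bytearray()
    for i in range(0, len(dna), 4):
        byte = 0
        for base in dna[i:i + 4]:
            byte = (byte << 2) | BITS_FOR_BASE[base]
        out.append(byte)
    return bytes(out)

message = b"hello"
strand = encode(message)
assert decode(strand) == message
print(strand)  # -> CGGACGCCCGTACGTACGTT
```

The density argument falls straight out of the arithmetic: two bits per base, at roughly nanometer scale, is orders of magnitude denser than silicon storage.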
But the question is, what happens when you do store things in there, and how does that information interact with all the rest of the stuff that's in your body already?
Well, it doesn't have to be in your body. But just think: your DNA has four billion years of history, and it's done a great job of recording it. It's incredible. My old eight-track tapes haven't lasted.
That is a squirrelly concept, that you have all that data inside your head. I mean, that's also what people reach for when they try to understand instincts, that these are some sort of genetically encoded memories, or some understanding of things that are dangerous, and that they're in there because this is how we've learned over the years without actually having to experience these things personally.
[1413] Yeah, no, so it's baked in.
[1414] Our genetics are baked into us.
And so, you know, I don't know if you've been to Indonesia, but I was in Indonesia, and I went to this place called Komodo Island.
Oh, wow, where all the dragons are?
It's where the Komodo dragons are.
And it was fascinating, because you can tell they don't have plaintiff's attorneys.
[1419] So you're just walking around.
There are all these Komodo dragons, and they say, yeah, these are like the most deadly creatures on earth.
[1421] And there's like some little guy with a little stick.
[1422] And it's like, how effective is that stick?
[1423] But the way it works with...
[1424] So you're just walking around?
[1425] You're just walking around.
Because the Komodos, when they're not killing people or killing animals, they're just sitting there.
[1427] So it's pretty scary.
[1428] Do they ever get jacked?
[1429] Do people ever go there and get bitten?
[1430] Yes.
[1431] And they say, oh, it's only a few times a year.
[1432] It's like, well, a few times a year.
[1433] That seems like a lot.
Anyway, the way it works for a Komodo dragon is the mother lays the egg, and then buries the egg, and then forgets where the egg is.
[1435] And then, let's just say that this egg hatches and this little Komodo dragon comes out.
[1436] And the mother sees her own baby Komodo dragon.
[1437] She'll eat it in a second.
[1438] Oh, Jesus.
And so if you're a Komodo dragon, you better have your entire survival strategy baked into your DNA, because nobody's teaching you anything.
[1440] And so for us, we have this sense that it's like parenting is really important.
[1441] It is.
[1442] Environment is really important.
[1443] It is.
[1444] But so much of who and what we are is baked in to our genetics.
[1445] And I think that's going to be this challenge.
[1446] We're going to see ourselves as increasingly genetic beings.
We can't become genetic determinists and think that we're just genetics.
[1448] But we're going to know a lot more.
[1449] We're going to demystify a lot of what it means to be a human.
[1450] Poof.
[1451] Yeah.
[1452] Poof is right.
[1453] But are we going to lose the romance and just the randomness of life because of that?
[1454] That's what people are concerned with, right?
[1455] Yeah.
Like, if we have some sort of genetic uniformity, especially with things like intelligence and athletic performance, we're not going to appreciate freaks as much.
[1457] Or maybe we'll all want to be freaks because the freaks are the ones who push us.
[1458] And so I think, you're not going to want to be a moron.
[1459] Yeah.
[1460] Well, your question, it's the essential question.
It's like, what makes a human a human? It isn't that some of us have a higher IQ. That doesn't make you a better human. That makes you someone with a higher IQ.
[1465] But how are we going to think about constructing societies when it's up to us?
[1466] Like, if we are going to say we value certain people, certain ideas, I think we're going to need artists.
[1467] Like right now, people like artists are sometimes in the mainstream, sometimes they're on the fringe, but artists are going to be maybe the most important people in this new world.
[1468] And like right now in hospitals, we have kind of a hierarchy.
[1469] like the most technical people are the people who are valued the most.
[1470] And the least technical people, like some of the nurses or nurses' aides, are the people who are often valued and paid the least.
[1471] But when technology can do these technological feats, what's going to be left is how can we be great humans?
[1472] How can we emote?
[1473] How can we connect?
[1474] How can we create art?
And if we get swept away by this tide of science, and you know how excited I am about the science, we could really undermine our humanity. And what humans value is many aspects of that humanity: the art, the creations, the literature.
Yeah. When you read someone's great prose, you're reading an insight into their mind, and that's what's interesting about it, right? You're not going to get that from just ones and zeros.
Yeah. And there will always be this thing we call mystery. Even if we can do a genetic analysis of Shakespeare and Mozart and whoever, it's still miraculous.
[1476] And we need to celebrate that.
And we can't allow ourselves to say that we are just our genetics, or even just our biology. But we also can't say biology has nothing to do with it, especially because we're going to know more about our biology and about our differences.
[1478] And that's, that's normal.
I mean, it used to be in the old days that everyone thought, well, God is the weather.
And now we understand weather pretty well, and nobody's saying, oh, that lightning, that's God delivering a message. It could be, but we still have that mystery. And I think in some ways it's about our orientation: how do we make sure that we keep this view of life, that we have artists and humanists who are at the core of this conversation about where we're going?
What if that mystery ultimately turns out to just be ignorance, and as you develop more and more understanding, there's less and less mystery?
[1481] Would we like to be less smart?
[1482] Would we like to be more overwhelmed by possibility?
[1483] It could be.
[1484] Could that be part of what romance is?
[1485] It could be.
[1486] And certainly, like, the unknown, we wake up every morning.
[1487] Sure.
[1488] And we just don't know the answer.
And there are some people, going back to the issues of life extension, who say that death is essential for appreciating life.
I talk about this stuff all around, and there are people who say, you know, you're talking about eliminating these terrible diseases, but I know somebody who had that terrible disease, and their suffering was a gift to everybody else, because we all had more humanity in response to their suffering.
I'm like, well, that's kind of screwed up. I'd prefer them not to have that suffering.
But those people are thinking wacky.
[1493] It's wacky.
But I totally agree with you: if we allow ourselves to get swept away with this kind of scientific determinism, if we don't say we really value our humanistic traditions, our artists, our cultures, we could get lost.
[1496] We could become obsolete.
[1497] We could become obsolete, but we could also just become less human.
[1498] And there's something wonderful and there's magic in humans.
Do you think that monkeys used to think, man, we can't become a human, we'd become less monkey?
[1501] Do you know what I'm saying?
[1502] But no being looks in the mirror and recognizes that they are evolving.
[1503] You know, we've only been homo sapiens for about 300 ,000 years.
It's hard. We know where we've come from, because you see all those little charts from high school biology.
[1505] But it's really hard for people to imagine being something else in the future.
[1506] It's outside of our consciousness.
[1507] And so.
[1508] It's HGL squared.
[1509] And we are monkeys.
[1510] It's just that we've redefined our monkey thing.
You know, we just do it in a little different way.
That's what I was kind of getting at.
[1513] Are you concerned at all with artificial life?
Are you concerned about the propagation of artificial intelligence?
[1516] Well, there are different kinds of artificial life.
So one is artificial intelligence, and I know people like Elon Musk and the late Stephen Hawking were afraid.
[1518] Terrified.
[1519] Yeah.
And whether it's right or not, I think it's great for us to focus on those risks.
[1521] Because if we just say, oh, that's crazy and we don't focus on it, it increases the likelihood of these bad things happening.
[1522] So kudos to Elon Musk.
[1523] But I also think that we're a long way away from that threat.
[1524] And we are, and we will be enormous beneficiaries of these technologies.
[1525] And that's why, I don't want to sound like a broken record, but that's why I keep saying it's all about values.
[1526] I think we should take those threats very seriously.
And then people say, values are so abstract and we don't agree on them.
[1528] It's true.
But like Elon Musk, I mean, they've set up this institute to say, well, what are the dangers?
[1530] Right.
[1531] And then what are the things that we can do now?
What are standards that we can integrate, for example, into our computer programming?
[1534] And so I mentioned my World Health Organization Committee.
[1535] The question is, well, what are the standards that we can integrate into scientific culture that's not going to cure everything, but it may increase the likelihood of a better rather than worse outcome?
But isn't there an inherent danger in other companies, or other countries rather, not complying with any standards that we set because they would be anti-competitive?
[1537] Yes.
Like they would somehow or another diminish their competitive edge.
[1539] It's true.
And that's the balance that we're going to need to hold.
[1541] And it's really hard.
[1542] But we have a window of opportunity now to try to get ahead of that.
[1543] And like I said, we have chemical weapons, biological weapons, nuclear weapons, where we've had international standards that have roughly held.
[1544] I mean, there was a time when slavery was the norm and there was a movement to say, this is wrong.
[1545] And it was largely successful.
[1546] So we have history of being more successful rather than less.
[1547] And I think that's the goal.
[1548] But you're right.
[1549] I mean, this is a race between the technology and the best values.
[1550] My real concern about artificial intelligence is that this paradigm shifting moment will happen before we recognize it's happening.
[1551] And then it'll be too late.
[1552] Yes.
[1553] That's exactly right.
[1554] And that's, like I was saying, that's why I've written the book.
That's why I'm out on the road so much talking to people, and why it's such an honor and pleasure for me to be here with you talking about it. We have to reach out to people, and people can't be afraid of entering this conversation because it feels too technical or it feels like it's somebody else's business.
[1556] This is all of our business because this is all of our lives and it's all of our futures.
So if you think the thing that's going to really change the most in 20 years is predictive genetics, being able to accurately predict a person's health and life, what do you think is going to be the biggest detriment of all this stuff, the thing that we have to avoid the most?
[1559] Yeah.
So one is, as I mentioned, this determinism. If we take away our sense of wonder about what it means to be a human, that's really going to harm us.
[1561] We talked about equity and access to these technologies.
[1562] And the technologies don't even need to be real in order to have a negative impact.
So in India, there are no significant genetic differences between people in different castes, but the caste system has been maintained for thousands of years because people have just accepted these differences.
[1564] So it's a whole new way of understanding what is a human.
[1565] And it's really going to be complicated.
[1566] And we aren't ready for it.
[1567] We aren't ready for it culturally.
[1568] We aren't ready for it educationally.
[1569] Certainly, our political leaders aren't paying much of any attention to all this.
[1570] We have a huge job.
[1571] Oof.
[1572] Oof.
[1573] So when you sit down, and you give this speech to Congress.
[1574] Yeah.
What are you anticipating from them? Do you think there's anything that they can do now, certain steps they can take?
[1576] Yes.
[1577] So a few things.
[1578] One is we need to have a national education campaign.
[1579] I mean, this is so important.
I would say it's on the future of the genetics revolution and of AI, because it's crazy that we aren't focusing on these.
[1581] Like I learned French in grade school and high school, and I'm happy to speak French.
[1582] But I would rather have people say, this is really important stuff.
[1583] So that's number one.
[1584] Number two is we need to make sure that we have a functioning regulatory system in this country, in every country.
[1585] And I do a lot of comparative work.
[1586] And like the United Kingdom, they're really well organized.
[1587] They have a national health care system, which allows them at a national level to kind of think about long -term care and trade -offs.
In this country, the average person changes health plans every 18 months.
[1589] And I was talking with somebody the other night, and they were working on a predictive health company.
And they said their first idea was they were going to sell this information to health insurers. Because wouldn't this be great: if you're a health insurer and you had somebody who was your client, you could say, hey, here's some information, you can live healthier, and you're not going to have this disease 20 years from now.
And what he found out is the health insurers couldn't have cared less, because people were only going to be part of the plan for a year and a half.
[1592] So we really need to think differently about how do we invest in people over the course of their lives.
[1593] And certainly education is one, but thinking long term about health and well -being is another.
[1594] What do you think is going to be the first technological innovation, like in terms of like what's already on the pipeline right now that's going to radically alter human beings?
I think it's going to be the end of procreative sex. When we stop conceiving our babies through sex and we're selecting our embryos, that's going to open up this massive realm of possibility, certainly when we expand the number of fertilized eggs that we're choosing from. I think that's the killer application of genetics to the future of human life.
Do you see that being attainable to the general population anytime in the near future?
[1596] Like once it starts, once the technology gets established, it seems like it's going to be wealthier people that are going to have access to it first, right?
[1597] Well, it depends.
[1598] Probably, yes.
[1599] But when you think about right now, we have all these people who are born with these terrible, in many cases, deadly genetic diseases and disorders.
[1600] And what is the societal expenditure for lifetime care for all those people?
I mean, these are huge, huge amounts of money. So if we were to eliminate many of, not the people, but prevent those diseases and disorders from taking place in the first place, we could use that money to provide IVF and embryo screening to everybody, using just the economic models we have now. But then there's another issue that we have to talk about, and it's really sensitive. I talk a lot about this, and I talk in the book about people with Down syndrome. And I have a lot of friends who have kids with Down syndrome.
[1602] These are wonderful kids, and they deserve every opportunity to thrive the same as everybody else.
And I'm really sensitive, because people push back when I say that there largely aren't going to be newborns with Down syndrome 10 or 20 years from now.
[1604] Right.
[1605] So people say, well, what are you saying about my kids?
Like, are you saying that if this is going to be eliminated, my kid has less of a right to be here than somebody else?
[1607] And I always say, absolutely not, and we need to be extremely sensitive that we're not dehumanizing people.
But if you have 10 or 15 fertilized eggs in a lab and you have to pick which one gets implanted in the mother, and one of them has a disease like Tay-Sachs or sickle cell disease where they're going to die before they're 10 years old, would you affirmatively choose to implant that embryo versus the 9 or 14 or whatever the number is of other ones?
[1609] And so these are really sensitive things.
[1610] And we can't be blasé about them.
[1611] But we will have these choices and we're going to have to figure out how do we make them.
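Here is a minimal sketch, in code, of the decision logic being described: rule out embryos with a known lethal single-gene finding, then rank the remainder by a predicted risk score. The labels, findings, and numbers are all invented for illustration, and this is a sketch of the choice under those assumptions, not a clinical protocol.

```python
# Toy embryo-selection logic: filter out lethal single-gene findings,
# then pick the lowest hypothetical predicted-risk score.

from dataclasses import dataclass

@dataclass
class Embryo:
    label: str
    lethal_findings: list   # e.g. ["Tay-Sachs"]; empty if none found
    predicted_risk: float   # hypothetical score; lower is better

def choose_embryo(embryos):
    """Return the viable embryo with the lowest predicted risk."""
    viable = [e for e in embryos if not e.lethal_findings]
    if not viable:
        return None
    return min(viable, key=lambda e: e.predicted_risk)

candidates = [
    Embryo("E1", ["Tay-Sachs"], 0.10),  # excluded despite low score
    Embryo("E2", [], 0.42),
    Embryo("E3", [], 0.27),
]
print(choose_embryo(candidates).label)  # -> E3
```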
[1612] Well, I think there's also a real possibility of them being able to fix that.
[1613] Yeah.
[1614] And some things will be fixable and some things won't.
[1615] And so that's why, though, for these single gene mutation disorders, there's a debate.
[1616] I was speaking in Berkeley the other day.
And so I was talking about these two options: one is embryo selection and one is gene editing. And there were different people who got up and said, oh no, we can do embryo selection, that's the way we should prevent these diseases, but gene editing, that's going too far, that's playing God. And so for different things there will be different options.
When you hear that that's going too far, who's usually saying that?
There are two groups. One is certainly the religious community.
They're saying, well, this is playing God.
But there's another group, kind of a progressive community, the kinds of people who are uncomfortable with genetically modified crops, who are saying that there's this slippery slope once we start making what are called germline genetic modifications.
So the germline is our sperm, eggs, and embryos.
[1621] If you make a change to an adult human, it doesn't pass to their kids.
[1622] If you make a change to a sperm, an egg, or an embryo, it will pass on.
[1623] And so there are a lot of people who are saying, well, we don't understand genetics well enough to make these changes that will last forever.
I'm not of that view.
[1625] I just think that we need to be cautious and we need to weigh the risks and the benefits of everything that we do.
[1626] Do you think we do know enough about those changes?
It depends on what we're selecting against.
[1628] Like if the thing we're selecting against is some kind of terrible genetic disease that's going to kill somebody when they're a little kid, we have a lot of latitude because the alternative is death.
And that's why I was so critical of the Chinese biophysicist who genetically engineered those two little girls born in China last year, with gene edits that probably weren't even successful.
[1630] It wasn't to eliminate some disease or disorder.
[1631] He was trying to confer the benefit of increased resistance to HIV.
[1632] And so I think that we need to be very mindful and We need to be doing kind of a cost -benefit analysis of the different things, different interventions.
And there was, they believe, a perceived potential unintended side effect of this.
[1634] And that's increased intelligence.
[1635] Well, it's a possibility.
[1636] Possibility.
[1637] How does that work?
So there's this gene called CCR5.
And when it was disrupted in some mouse studies, those mice became a little bit better able to navigate mazes.
And so that was what led people to believe that this disruption of CCR5 could potentially lead to that kind of change in humans.
[1641] Nobody really knows.
[1642] There's lots of things that happen in mice that don't have analogs in humans.
[1643] And that was why it was so irresponsible is that this scientist in secret made these gene edits.
[1644] He didn't get a proper consent from the parents.
[1645] Oh, really?
[1646] Yeah.
[1647] Yes.
[1648] Because it's China.
[1649] Because it's China.
[1650] You can just let it ride.
I mean, the parents were all manipulated. And so that's the thing. You're exactly right: we are humans, we're nuts as a species, and so we need to try to establish some kind of guardrails about what's okay, what we're comfortable with and what we're not.
Now, this guy is not operating in an isolated incident. There's got to be a shit ton of that going on right now, as we're talking, in China. What do you think is happening over there?
You know, I think China has a lot of money.
[1652] They have brilliant people and they have a government that is hell -bent on leading the world in advanced technology.
[1653] Yeah.
[1654] And the scientific culture in China is just very different than it is here.
[1655] And so we know what we know, but we don't know what we don't know.
[1656] And that's, it's a really, really big deal because China is in many ways a wild west.
And the technology exists to do some really big stuff, and that's why we have to at least try to establish standards. Will we succeed fully? No. But maybe we can do better rather than worse.
Are you anticipating seeing a lot of freaky things come out of China?
Yes.
Whoa, you said that very quick.
Yeah, no, it's true. I've spent a lot of time in China. The thing with China is, China has this great ancient civilization, but they destroyed their own civilization in the Great Leap Forward and the Cultural Revolution.
They burned their books, they smashed their own historic relics.
[1659] And so it's really, it's a society in many ways that's starting from scratch.
And so all of these norms that people inherit through their traditions, China in many ways doesn't have.
And so it's very different.
[1662] And China is growing, I mean, they are increasingly powerful.
And China is going to be a major force defining the world of the 21st century.
[1664] That's why America has to get its act together.
[1665] That's a hard concept for us to grasp when we think about the fact that they had the great wall, they have so much ancient art and architecture.
[1666] We just assume they're a really old culture.
[1667] They are, but they wiped it out.
[1668] That's so crazy.
That's why, if you want to see great Chinese art, you have to go to Taiwan.
[1670] Because when the Chinese nationalists left in 1949, when they lost, as they were losing the Civil War, they took the treasures and they put them in the National Museum of Taiwan.
In the Cultural Revolution and the Great Leap Forward, the Red Guards were just smashing all of their own stuff, their own ancient history.
[1672] And now, now the Chinese Communist Party is saying, oh, no, we're going back and we have this great 5 ,000 -year -old culture.
[1673] In some ways, it's true.
[1674] But in some ways, it's like an adolescent culture without these kinds of restrictions that other societies have.
[1675] That's such a unique perspective that I haven't heard before.
[1676] It makes so much sense in terms of like how frantic they are at restructuring their world.
[1677] Yeah.
[1678] Yeah.
[1679] And they feel that they got screwed over because there is this vague, this sense of Chinese greatness.
[1680] When you hear the word Middle Kingdom, it's like China's the center of the world and everybody else is some kind of tributary.
[1681] And so they're monumentally pissed off that these colonial powers came and overpowered them and they had to make all these concessions.
[1682] They had to give land away.
And they're hell-bent on regaining it.
[1685] They are playing a long game.
[1686] They are playing a long game.
And we have to be mindful of it, and we are not.
That's also something you can do if you have complete control of your population and you don't have to worry about people's opinions. You can just go in the direction that you feel is going to benefit Chinese power, the powers that be.
[1689] This is a country run by engineers.
[1690] We're a country run largely by lawyers and reality TV people, I guess.
[1691] But in China, it's run by engineers.
[1692] So there are all these problems, and the answer is always engineering.
So if you have a population problem, the answer is the one-child policy.
An environmental problem? You have the Three Gorges Dam.
[1696] You don't have water in the north of China.
You build the biggest water project in the world, moving water from south to north.
[1698] You want to win in the Olympics.
[1699] You engineer your population.
[1700] You take kids away from their families and put them in their Olympic sports school.
[1701] So I write about this in Genesis Code.
If you're China and you kind of have this Plato's Republic model of the world, and you're going to identify, or maybe manipulate, these genetic superstars to be your greatest scientists and mathematicians and business leaders and political leaders, there's a model you can imagine for how to do it.
[1703] Wow.
[1704] It makes you really nervous.
[1705] It should.
[1706] Yes.
[1707] That's the thing.
[1708] That's why, like, I just feel like with this country, we don't have time to have all these distractions.
[1709] We're focusing on junk.
[1710] Like what?
It's just, like, all of this. I'm on CNN all the time when I'm home in New York, talking about geopolitical issues, China and North Korea. And what I always say is, you guys recognize this is porn.
Like CNN and MSNBC, that's one kind of porn, and Fox and whoever else, InfoWars, that's another kind of porn.
[1714] But it's all porn.
[1715] And we're kind of, we're drawing people's attention to these few stories.
[1716] But there's these big stories that we have to focus on.
[1717] And certainly the rise of China is such an essential story for the 21st century because China is competing in all of these technologies.
[1718] And China, it's like, go, go, go.
[1719] I mean, people in China who are involved in the tech world, when they go and visit Silicon Valley, uniformly, they say, we cannot believe these people are so lazy.
[1720] Like, why are they not working 24 hours a day?
[1721] Why are they not issuing new products every week?
[1722] And so this is, I mean, they are racing somewhere and it's going to have huge implications for the world.
[1723] And so if we believe in our values, as I believe we should, we have to fight for them.
[1724] And the place that we have to fight for them first is here.
[1725] And we can't, you know, it's just like every day that we're just focusing on this drama, this reality TV drama of our government is another day where we're not focusing on the big things.
[1726] How are we going to get our act together?
[1727] How are we going to lead the world in technology?
I mean, another example is immigration.
[1729] We have this whole fight of how do we keep people out.
[1730] What I'd like to do is to go to the State Department and say, all right, every embassy in the world, you have a new job.
[1731] You have to, we're going to give you whatever number, 500 slots per year.
[1732] You have to, in your country, find the 500 most brilliant, talented, creative, entrepreneurial people and say, we're giving you a green card.
[1733] We're going to give you a little starter money.
[1734] We want you to move to the United States and just start a life and have kids.
And we should be skimming the cream of the rest of the world.
I mean, we could take over, we could revitalize this country. But instead we're having this fight over how to keep a small number of refugees out.
[1737] And it's just, we're not focusing on the right things.
[1738] That's a, that's again another very, very interesting perspective.
We learned about Huawei in this country, well, I learned about it, because they put out some pretty innovative phones and some interesting technology, right?
[1741] But we learned it because the state department was telling people to stop using their phones.
[1742] Do you think that that is trying to stifle the competition?
[1743] Because the market share that they have, if they do really have the number two selling cell phones in the world now, that's not from America.
[1744] America is largely out of that conversation.
[1745] And if they were in America, they would probably dominate in America as well.
[1746] Because they're cheaper.
[1747] Yeah, and they're really good.
[1748] I mean, their phones are insane.
[1749] The cameras in their phones are off the charts.
They put out some video of the zoom capability of their newest phone, and people were calling bullshit.
They're like, there's no way.
[1752] That's not even possible.
[1753] But it turned out it was true.
[1754] It really can zoom like a super expensive telephoto lens.
[1755] Yeah.
[1756] So Huawei, it's a complicated story.
For sure, the founder of Huawei is a former Chinese military officer.
[1759] For sure, in the early stages of their company, they stole, straight out stole lots of source code from companies like Cisco.
[1760] For sure, we should be really worried if Huawei is the sole supplier of the infrastructure that supports 5G all around the world because the Chinese government would have access to everything.
And so that leads us to two questions. One, is there a problem with Huawei itself? I think the answer to that is probably yes. But two, let's just say Huawei is a legit company and they're not intimately connected to the Chinese government.
[1762] Can we trust their relationship with the Chinese government?
And the Chinese government has a rule that every one of these big Chinese national champion companies has a Communist Party cell inside of it.
And so I think that we can't think of big Chinese companies just like we think of companies here.
[1765] We have to think of them as quasi -state actors.
[1766] And that's why this fight that's happening right now is so important.
[1767] And that's why, like, when China is out investing in different parts of the world, including Africa, their companies are kind of acting like arms of the government.
[1768] I mean, they're making all kinds of investments that don't really make sense if you just see, well, this is a company doing something.
[1769] If you say that this is a company with backing by the state that's fulfilling a function that supports the state, it's just a very different model.
[1770] So I am actually quite concerned about Huawei and I'm not a fan of everything that this administration is doing.
[1771] But I think on China, it's important that we need to stand up.
[1772] And I think pushing back on Huawei is the right thing to do.
[1773] I'm uncomfortable about this for two reasons.
[1774] One, I'm uncomfortable about that, about the Chinese government being inexorably connected to this global superpower in technology, but I'm also uncomfortable that it sets a precedent for other nations to follow.
Because they'd look at it as the only way to compete.
Because of what you were talking about: the investments that Huawei or the Chinese government makes in these other countries don't seem to make sense if you're just dealing with a company.
But if you're dealing with someone who's trying to take over the world, it makes a lot of sense.
[1778] Yeah.
And so say you're one of our companies, you're out someplace in Africa, and you're competing with a Chinese company to do something, build a port or whatever.
[1780] And you're competing because you are an American company.
And so you have your calculation: this is the port, what's the income stream going to be from it?
[1784] And you have a certain amount that you can bid because otherwise it becomes a bad investment.
But the Chinese company's calculus is not, is this a good or bad investment?
[1786] It's what is the state interest in controlling or quasi -controlling this asset.
[1787] And so that's why we can't project ourselves onto the Chinese.
[1788] We can't say they're just like us, just different.
[1789] We have different models, and our models are competing.
[1790] Do you think that we should avoid Huawei products, like consumers should?
[1791] I think the government should very tightly regulate products like Huawei products.
Because with some of their network products, like routers, they've shown that they're using them to extract information.
[1793] And so we have a long history of European, Japanese, South Korean companies that have invested very well.
[1794] They've outcompeted us.
[1795] And we've allowed the Japanese companies to outcompete our auto manufacturers.
[1796] And that was, that was fine.
In a sense, in the 1970s, our cars had become shit because we had this monopoly.
[1798] And so I'm all for open competition.
[1799] I'm all for free trade.
[1800] It has to be fair.
[1801] But I think that what China is doing, China recognized as a state that they could use the tools of capitalism to achieve state ends.
[1802] And I think we need to be very cautious about that.
[1803] That's interesting that you compare it to the automotive market because the consequences are so much different, right?
[1804] So much different.
[1805] But we do have a model to go on.
[1806] We could see what happened.
We made shitty cars, and the Japanese took over.
[1808] And then we made better cars.
I have a rental car here in Los Angeles. I went to the rental car place at LAX, and they had all the different cars. There was a Nissan and a Toyota, and there was a Cadillac. And I said, I'm going to go with the Caddy. It's a great car.
[1810] They're incredible.
[1811] Yeah American cars are very good now.
They're great. And so I'm all for competition, but I just feel like what some Chinese companies are doing, it's not competition.
[1813] They have become, not all of them, but quasi -state actors.
[1814] And if that's what they're doing, I think we need to respond to them in that way.
[1815] Interesting.
[1816] What else should we be concerned with?
Should we be concerned with anything that North Korea is doing?
[1818] Oh, absolutely.
[1819] So I have spent a lot of time in North Korea.
[1820] Yeah, that's why I brought it up.
Yeah, so I've advised the North Korean government on the establishment of special economic zones, and I certainly believe that if North Korea could have economic growth and integrate into the rest of the world, that would be great.
And so I, um...
When was this that you went over there?
This was in 2015, but I've been there twice. I crossed the border from China and zigzagged the country by land, visited 10 or 12 different sites.
So I spent almost two weeks by land. It's incredible. I mean, North Korea, one, it's the most organized place I've ever seen. There's not a thing out of place anywhere. On the side of the road, the stones are all raked.
[1830] There's not a stick.
[1831] Every little line is drawn.
[1832] It's like total control.
[1833] In the agricultural areas, there were very few machines and very few farm animals.
[1834] So I saw people pulling plows.
Like, you know, usually you have the animal in front of the plow and the person behind. Here, there were like two people in front of the plow and one person behind.
[1837] The people were the animals.
A lot of the animals got eaten when they had their famine.
[1839] And so we visited these different sites for these special economic zones.
[1840] And they would say like what they had done and what they were thinking about doing.
And I would say, do you know anything about the market? What are you going to sell here?
[1843] And they said, well, we know about clearing land and building a fence.
[1844] And then we went to Pyongyang and I spoke to about 400 economic planners.
[1845] And I said, look, I know you have these plans to do these special economic zones.
[1846] It's totally going to fail.
The way it's going to work is you have to connect to the market economy.
[1849] You have to empower your workers.
[1850] You need information flow.
[1851] How else are you going to learn and adapt?
[1852] So North Korea, it's a really dangerous place.
And now it's even more dangerous, because President Trump attempted this kind of nonsensical Hail Mary with these meetings with Kim Jong-un.
[1854] There was never any indication that the North Koreans were planning on giving up their nuclear weapons.
[1855] They never said they would.
[1856] It's the last thing they would do because their goal is survival.
[1857] And so there was this kind of head fake, which was like a PR stunt, to be able to say, we're having these meetings.
[1858] And of course, the North Koreans weren't ever going to give up their nuclear weapons.
[1859] They're still not.
[1860] So now things are ramping up.
[1861] So North Korea in the last couple of days has started firing missiles again.
The United States today, the U.S. military, seized a North Korean ship.
[1863] So we're going back to this very dangerous place.
And so I think we really need to do a much better job. We need to be much more concerned.
North Korea is really hard, and these guys are really smart. I mean, people say, well, these guys are poor.
[1868] They must not be smart.
[1869] I mean, like, we're playing cards with them.
[1870] We've got the whole deck.
They barely have a single card.
[1872] And yet they're in the game.
[1873] They're in the game.
[1874] They're holding us to a stalemate, and it's really worrying.
[1875] And why did you go over there?
[1876] What were you thinking?
[1877] So I thought a lot about it because I have a background in human rights.
[1878] I was a human rights officer for the United Nations in Cambodia, the child of a refugee.
[1879] I have this very strong belief in human rights and in supporting people.
[1880] In North Korea, they have about 120 ,000 people in the most brutal, brutal, horrific prison camps.
And so when I was asked to be part of this six-person delegation advising them on the establishment of special economic zones, one instinct was, screw them, I don't want to be part of this at all.
But I also felt that if North Korea could have some kind of integrated economic development, that would at least connect them to the world, it would create some kind of leverage, and it would help people.
[1883] So I decided to go.
And I'm glad that I did, but these are really hard issues.
[1885] And it's very unfortunate that in President Trump's negotiations with the North Koreans, human rights was never once mentioned.
[1886] And I think that that's coming back to values.
[1887] Like, we have to be clear about who we are and what we stand for and be consistent in fighting for it.
Do you think that Trump didn't bring that up because he wanted to be able to effectively communicate with them and not put them on their heels?
[1889] Maybe.
But I feel like, I mean, maybe he thought that there was a real chance of progress.
[1891] But the hard thing was he didn't know much about the North Koreans.
[1892] He has people.
[1893] We have brilliant people working in the United States government.
[1894] And all of those people, all of the U .S. intelligence agencies were telling President Trump that the North Koreans have absolutely no intention of giving up their nuclear weapons.
[1895] Right.
And so maybe he did think that he would charm Kim Jong-un.
[1897] or he would say, hey, we're going to give you economic development or whatever.
But I think for most people who were observers of North Korea, who had watched it for a while, that was never realistic.
[1899] So we gave away a lot.
[1900] So we didn't mention human rights.
[1901] We suspended our military exercises.
[1902] We gave them the legitimacy of a presidential meeting, which they've been wanting for 30 years.
[1903] And we didn't get anything back.
So had we gotten something back, then you could say, well, that was a risk worth taking, maybe.
[1905] Yeah.
Yeah, I haven't heard it described that way, but I'm agreeing with what you're saying.
What do you think he could have done differently, though?
I don't think the meeting should have happened with no conditions. If he had said, I'm open to meeting with the North Koreans, which is something the North Koreans have always wanted.
[1910] We could have met with the North Koreans any time immediately for the last 30 years.
[1911] But in order to do it, they need to do this, this, and this.
[1912] And if they do it, we'll meet.
Like, that would have been a legitimate thing. But what happened is the North Korean, I mean, sorry, the South Korean national security adviser just peeked into his office and said, hey, they want to meet, and it was like, sure, that seems like an interesting thing to do. And with this kind of diplomacy, you have to get something. So we gave away so much up front, and the North Koreans didn't have an incentive to do anything in return.
Was his perspective that it would be better to be in communication and to be friends with this guy? Was that what he was thinking?
[1914] It could be, but we have real interests in the sense that we have large military forces in Seoul.
[1915] We have a lot at stake.
[1916] We have our closest ally, Japan, who's had citizens abducted.
[1917] And so I think that was what he thought, is like, let's be friendly.
[1918] And then with the force of personal chemistry, everything will unlock.
[1919] But I think that was always extremely unlikely.
[1920] What do you think is going to happen to that country?
I think eventually, and I've written this, this regime will collapse under its own weight, but it's really held out a long time. Think of the collapse of the Soviet Union: the Soviet Union had enough bullets to survive. If they had said, you know, we're just going to shoot everybody at the Berlin Wall and every dissenter, they would still be around. North Korea has essentially murdered millions of people, with famine and execution and prison camps. So I think they're going to stay for a while, but eventually there will be leaders in North Korea who will come to the conclusion that it's safer to oppose the Kim family than to wait for the Kim family to come and get you.
[1924] And that tends to happen in these kind of totalitarian systems where there's so little trust, there's so little loyalty.
[1925] Jesus.
[1926] Yeah.
[1927] What are their conditions like technologically?
[1928] What is their infrastructure like?
[1929] So the general infrastructure is absolutely terrible.
I mean, they have roads in the big cities that are actually quite nice, because there are no cars. But their infrastructure is terrible. Their power supply, they have brownouts and blackouts all the time. Their manufacturing has all been decimated. So it's terrible. But they have really focused their energy on building these nuclear weapons, because they think these nuclear weapons give them leverage to do things and to extract concessions. But it's terrible infrastructure.
So they don't have the internet, right? But they have something similar that only allows them access to a few state-run websites?
Well, the average person doesn't have access to the internet.
[1934] So the way it works is it's all about loyalty.
[1935] So you need three or so generations of loyalty to the Kim family to even set foot in Pyongyang, the capital.
[1936] So really?
[1937] Yes.
[1938] So it's not like you can kind of move around or whatever.
[1939] It's like just to be in the capital, like you have to have your loyalty proven.
And so the average person out in the country, they don't have access to much of anything.
[1941] They have a little bit more now than they did in the past.
And then there's this relatively small number of elites, who are largely in Pyongyang and in the other cities, which have something like a ring of defense around them.
[1943] And just to enter, you have to have all of these checks.
Some of them have access to limited internet, but it's tightly controlled. It's not like you're going on Google and going wherever you want.
Right. And they probably would get in trouble if they googled the wrong thing.
Yes. And trouble, it's not just you in trouble. If my brother or my uncle does something that gets me in trouble with the regime, the whole extended family is out. And that means you either go to the prison camps or you're kicked out of Pyongyang.
[1945] I mean, it's all about collective punishment.
[1946] People are terrified.
[1947] And by that ruthless punishment structure that they've set up, that's how they've kept control of the country.
[1948] Yes.
[1949] And everybody's forced to rat on each other, right?
[1950] Yeah, absolutely.
[1951] Yeah.
They're actually compelled to tell on each other for anything that you do.
[1953] If you don't, then what?
[1954] If you don't, then you are complicit.
Yes, and there are these horrible stories. I've met a lot of these people who were in the prison camps. I have a friend, she was this 13-year-old girl, and her father was a low-level North Korean official. And then he was accused of something, and this family that had been privileged was all of a sudden out, and just these horrible things, prison and rape. Now she's in the United States, incredibly positive, and it's amazing how resilient she is. But this is like a real hell.
[1957] And it's an issue.
[1958] And I think that for us as Americans, as humans, we're less human when there are people who are suffering like this.
[1959] Yeah, I agree.
[1960] Now, you were traveling all over North Korea.
[1961] What were they having you do while you were out there?
[1962] So what we would do is we would go from one of these special economic zones to the other.
[1963] And in each one, it was kind of the same story.
[1964] You'd get there.
[1965] There'd be like a big field.
[1966] The farmers had been kicked off.
[1967] There was a fence around it.
And then the group of local officials would come, and they'd have a big chart and a plan: here's where we're going to build this building, and here's this and that. And I would always ask the same questions. What are you going to do here? Why do you think you're going to be competitive? How do you know what the market prices are? How are your workers going to be empowered so they can change things? I mean, in the old days you'd just have these automaton workers; now workers are actually making big decisions and fixing things. And they didn't have an answer to any of those questions.
And that's what happens when you have these totalitarian top-down systems: being creative is actually really dangerous.
[1970] So if somebody says do X, you just do X. Wow.
[1971] Yeah.
[1972] No, it's really incredible.
[1973] And it's so sad because I spend a lot of time in South Korea.
[1974] And this is the most dynamic place.
I often go to Seoul just to see what technology is going to show up here a few years in the future.
I mean, Seoul is like the future.
[1977] And then just 35 miles from Seoul is the demilitarized zone.
[1978] And the other side, it's, it's incredible.
And the real problem would be, once they finally did get free of that, I mean, you can call them whatever you want, the dictator and his family, what tools would they have?
[1980] Like, how prepared would they be to be autonomous?
Yeah, well, so here's my thought of what the scenario might look like.
[1982] I think eventually probably there'll be some kind of coup attempt against the Kim family.
[1983] Let's just say it succeeds.
[1984] But that would probably result in another military dictatorship with another group.
[1985] Well, we don't know because then I think immediately, I think the Chinese would invade.
[1986] Really?
[1987] Yeah, because people think of the Korean War from the early 1950s.
[1988] They think, oh, it's the Korean War.
[1989] It must have been the Americans fighting against the Koreans.
[1990] But the Korean War, the two sides were America and the South Koreans fighting against the North Koreans and the Chinese.
The Chinese did most of the fighting.
[1992] And so China, North Korea is the only country in the world that has a treaty alliance with China, kind of like we have with Japan and with South Korea.
[1993] And so China, their biggest fear is having a reunified Korean peninsula allied to the United States.
[1994] So I think if there was a coup, the Chinese would immediately move in militarily.
Then immediately there would be a call for some kind of UN authority, and then I think it would be agreed that the Chinese would stay and they would just put on blue helmets, as a UN force.
[1996] And then we'd have to negotiate what happens next.
And I think what the Chinese would do is say, well, we'll leave when the Americans leave.
[1998] I think that would be what will likely happen.
[1999] But eventually, I think we're going to see a Korean reunification.
[2000] And the good news of these reunified countries like East and West Germany is there's a whole system of law that is just North Korea will be swallowed into South Korea.
[2001] And then you have law.
[2002] You have an infrastructure.
[2003] And it'll take one or two generations.
[2004] But I think that will eventually happen.
[2005] And I'm hoping it can happen without nuclear war, terrible bloodshed.
[2006] But it's going to be a big challenge.
[2007] God damn.
[2008] That sounds insurmountable.
[2009] Just hearing you talk about that, about North Korea getting absorbed by South Korea, I'm like, oh, my God, good luck.
[2010] Just imagine.
[2011] Just imagine the whole thing.
[2012] Well, listen, man, it's been a fascinating conversation.
[2013] I really appreciate it.
[2014] I really appreciate you coming here, and you've certainly sparked a lot of interesting ideas in my head, and I'm sure a lot of other people's heads as well.
And I would like to see down the line where this all goes, and I hope we don't get somehow swallowed up by machines.
[2017] We won't, but it's up to us to fight for what we believe in.
[2018] Well, thank you very much, man. I really appreciate it.
[2019] It was a lot of fun.
[2020] Bye, everybody.