Armchair Expert with Dax Shepard
[0] Welcome, welcome to Armchair Expert, Experts on Expert.
[1] I'm Dax Shepard.
[2] I'm joined by Lily Padman.
[3] Silly Padman.
[4] Lily, Silly Padman.
[5] We have two Brainiacs on today.
[6] Lee Hood and Nathan Price.
[7] Lee Hood is a pioneer of personalized and precision medicine, a National Medal of Science winner, and one of the fathers of the Human Genome Project. Nathan Price is a multi-award-winning systems biologist, researcher, and professor turned chief scientific officer of Thorne HealthTech.
[8] They have a new book out right now called The Age of Scientific Wellness: Why the Future of Medicine Is Personalized, Predictive, Data-Rich, and in Your Hands.
[9] Now, this was kind of fun because we are in such fear of AI.
[10] We've been talking a lot about our AI fears lately.
[11] Yeah.
[12] And this is one of the areas where AI is going to be very, very, very helpful.
[13] Yeah, there's some really cool stuff happening and on the horizon that will change medicine for sure.
[14] Dramatically.
[15] Yeah.
[16] Like younger generations may never even have the illnesses to fight to begin with.
[17] All about prevention.
[18] It's incredible.
[19] Please enjoy Lee Hood and Nathan Price.
[20] Wondery Plus subscribers can listen to Armchair Expert early and ad-free right now.
[21] Join Wondery Plus in the Wondery app or on Apple Podcasts.
[22] Or you can listen for free wherever you get your podcasts.
[23] Gentlemen, welcome. How are you? We're good. You're presumably coming from Seattle, Lee? I'm from Seattle, okay, since '91, '92. But you're close. Okay, great. And I'll even suggest some literature on you that exists online is wrong. And then, Nathan, where are you from? I'm in New York. So you've both traveled here to L.A. Are you staying in the same location? We aren't, because I was going to have dinner with my daughter last night, but she transported kids that got COVID, and we decided we would not have dinner together.
[24] I'm so sorry to hear that.
[25] Yeah.
[26] What does she do down here?
[27] She's a lawyer.
[28] She has a firm on discrimination and employment law.
[29] Oh.
[30] And they're merging with a very big San Francisco firm.
[31] She's going to take over the whole thing.
[32] Oh, my goodness.
[33] You're so proud.
[34] She's amazing.
[35] Yeah.
[36] That's wonderful.
[37] Rob and/or Monica may need her services at some point.
[38] Yeah, I have a bone to pick with my employer.
[39] There'll probably be a wrongful termination.
[40] Well, if you get her on your side, you're one-up on anybody else.
[41] Okay, I'll keep that in my back pocket.
[42] We have a friend in common.
[43] Oh, you do?
[44] Well, we all do.
[45] Bill Gates, yeah.
[46] We've interviewed him a few times and we really adore him.
[47] Isn't he something, though?
[48] He is.
[49] Your work and even the first chapter of your book speaks to this, but I think of him as the second coming of Rockefeller, in the sense that Rockefeller really funded so much of our medical advancement in trying to get a unified curriculum around the country in these different medical schools.
[50] The Flexner Report was funded basically by Rockefeller.
[51] That catalyzed the first big revolution in American medicine.
[52] Yeah, and the things he tackled, too, were not sexy.
[53] Like hookworm: I'm going to eradicate hookworm from the U.S. And he put a ton of money into that, and it's just fascinating.
[54] We didn't live then.
[55] It could have been really sexy.
[56] Hookworm?
[57] Yeah, like, we don't know what the state of things were then.
[58] Well, this may ring a close bell to you.
[59] You know, at the time, it was just rumored.
[60] The stereotype was that Southerners were lazy as hell.
[61] Yeah, I don't like that.
[62] And it's because, like, 30% of them were suffering from hookworm, which made you exhausted because your body was fighting this crazy parasite.
[63] I think I have it.
[64] You might have it, yeah.
[65] It's because you all didn't wear shoes.
[66] So Rockefeller had to teach you all how to wear shoes.
[67] This was one of the big things.
[68] Take some iodine.
[69] And then, Nathan, let's just catch up with you for a second.
[70] Yeah.
[71] You are the chief scientific officer at Thorne.
[72] Yes, that is correct.
[73] I feel like I have some Thorne supplements.
[74] Do they make supplements?
[75] You probably do.
[76] So Thorne makes supplements.
[77] They've been a company in that space for about 10 years.
[78] And then what they really want to do is to expand and become a scientific wellness company focused on healthy aging.
[79] When they reached out to me because of the work that's in the book, it really became about:
[80] All right, how do we take testing, AI, plus the supplements?
[81] And they make about 300 different ones.
[82] So they're not married to any particular compound.
[83] Yes.
[84] But you want to know how do you deploy all the resources in the natural world so that you can make a difference in improving people's health and do that in an intelligent way.
[85] Yeah, and I don't know if Huberman's secretly on the books, but he's the one who recommended three or four of the different Thorne supplements to me.
[86] Yeah, Huberman used us a lot, and we actually supported his podcast in the really early days.
[87] And I remember I was on the phone with Andrew.
[88] At that point to me, it was some professor at Stanford that I'd never heard of saying, I'm going to start this podcast, it's going to be huge and amazing.
[89] and I'm like, good luck, cool.
[90] And like, four years later, I'm like, please let me on your podcast and love him.
[91] Yeah.
[92] Okay, so let's start first with a brief history of medicine in this country, which is where we start in the book.
[93] Talk to us about what was happening at the turn of the century, beginning of the 1900s.
[94] Why were we dying?
[95] What were the causes?
[96] And how did we treat those?
[97] In the 1900s, infectious disease was the big rage.
[98] Children were especially devastated by it.
[100] But the real story at that time is almost all of the medical schools, both in the U.S. and in Canada, were trade schools.
[101] There was no formal education of the MDs.
[102] It was all an apprenticeship kind of thing.
[103] There was no marriage of research or science to health care.
[104] And it was in approximately 1910 that Flexner was asked to do a report on the 155 medical schools, both in Canada and the U.S., and he went to many of them.
[105] One after another, when he went there, he saw there was no training, there was no science.
[106] It was very much a cult of personality around the individuals who ran each medical school.
[107] And he wrote this devastating report that said this is all really nonsense.
[108] And he made two major points.
[109] I mean, he had a lot of specificity in it.
[110] But one was that medicine should be based in science.
[111] And the second was that there should be formal scientific training of medical students.
[112] And, of course, any time you propose a big paradigm change, the question is, how do you get the system to do it?
[113] Yeah.
[114] And that was where Rockefeller came in because they had encouraged him to write this report.
[115] And once he did, and they saw what a devastating challenge it represented,
[116] they started putting substantial money into encouraging selected medical schools to begin the revolution.
[117] And Hopkins, for example, was one that jumped in very early and saw the power of bringing science into medical education.
[118] They had a legitimate curriculum.
[119] They were the only ones doing it.
[120] And then, by weird coincidence, Flexner's brother-in-law was at Johns Hopkins at that time or something.
[121] Yeah.
[122] Yeah.
[123] That's so weird.
[124] And then Rockefeller, right, all these schools, these 150 schools, he basically said, I'll give you endowments if you adopt this curriculum and have these standards.
[125] In the discussion we have today, we're facing exactly the same thing because we'll be talking about a revolution from a disease -focused health care to a wellness and prevention health care.
[126] And that's even a bigger paradigm change in some ways than what went before because the whole system is totally invested in disease.
[127] and profit is only focused on disease.
[128] You don't make money with wellness or prevention.
[129] So back in the 1900s, when Flexner did this report, one in three children are dying, and they're dying of diarrhea or a couple of other things.
[130] Well, infectious disease, yeah, of one sort or another.
[131] Obviously, they made a lot of strides in the 20th century.
[132] We did pretty well at finding and fixing, yeah, identifying pathogens and coming up with treatments for them.
[133] And for a long period, we were making quite a bit of progress in reducing at least the amount of infant mortality.
[134] Absolutely.
[135] Tuberculosis was in the top three or four at that time, yeah.
[136] And it's still pretty high these days.
[137] Is it?
[138] Well, with the drug -resistant form.
[139] Oh.
[140] Yeah.
[141] Okay, so then in the 20th century, the big breakthroughs were antibiotics and vaccines.
[142] And public health.
[143] Good water, chlorination of water made a profound difference.
[144] Right.
[145] We cover some ground.
[146] It proved to be very effective once the system was in place, and we pat ourselves on the back.
[147] Now we're at another crossroads, which is we have been successful at tackling those diseases.
[148] But now we're in an era of chronic diseases.
[149] And just to paint a picture of the amount of money behind all this, we're spending $600 billion on prescriptions in the U.S. And how effective are these prescriptions?
[150] So a little under 10% of the people respond effectively.
[151] Did you hear this?
[152] Oh, my gosh.
[153] No, you got a really honing.
[154] So 90% have all the side effects of the drugs plus the cost, no benefits.
[155] 90%.
[156] They listed the top 10 highest-grossing prescriptions in the U.S. For the best of them, one in four people get relief from the symptoms,
[157] but it's as bad as one in 25.
[158] There's some pretty popular medicines where only one in 25 people are helped from the medicine.
[159] The whole picture is pretty disheartening when you see it mapped out like this.
[160] And as you point out in the first chapter, even the ones that are successful, let's take Humira.
[161] That's a great one.
[162] I was on it for a period.
[163] I have psoriatic arthritis, a chronic illness.
[164] It is pretty effective as compared to a lot of the other options.
[165] Like one in four people will have a reduction.
[166] But the drug itself is only trying to mitigate the...
[167] Inflammation.
[168] Yes.
[169] There's no effort being made really to eradicate the cause.
[170] Fixing the symptom.
[171] Yeah, it's just mitigating symptoms.
[172] Correct.
[173] And if you think about that from an economic point of view, it is the ideal situation, right?
[174] If you are running a company that's maximizing profit, you want something that treats symptoms that you have to take every day in order to stay alive.
[175] That's an incredible business model.
[176] Ideally, a few times a day.
[177] And ideally with a very short shelf life that you have to constantly re-up.
[178] And then also another thing that will be disheartening is the whole system all in, $4 trillion a year.
[179] I saw that.
[180] Four trillion.
[181] That's not even a number I can wrap my head around.
[182] That's not a real number, but it is.
[183] Yeah, what does that account for as a percentage of our GDP?
[184] 18 to 19% of GDP is spent on health care.
[185] Move one step further and know that 86% of that $4 trillion is spent on chronic diseases.
[186] And you know what the focus of the problem is right there.
[187] Okay, so 86% of this enormous pile of money is going to chronic diseases, where we're only looking to mitigate symptoms at this point, or primarily that's where the focus of the expenditure is?
[188] I think that's correct.
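A quick back-of-the-envelope pass over these figures; only the $4 trillion, the 18 to 19% of GDP, and the 86% chronic-disease share come from the conversation, the rest is arithmetic:

```python
# Back-of-the-envelope arithmetic on the U.S. health care figures quoted above.
total_spend = 4.0e12     # ~$4 trillion per year on health care
gdp_share = 0.185        # 18-19% of GDP, taken at the midpoint
chronic_share = 0.86     # 86% of spend goes to chronic disease

implied_gdp = total_spend / gdp_share
chronic_spend = total_spend * chronic_share

print(f"Implied GDP: ~${implied_gdp / 1e12:.1f} trillion")              # ~$21.6 trillion
print(f"Chronic-disease spend: ~${chronic_spend / 1e12:.2f} trillion")  # ~$3.44 trillion
```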
[189] I mean, if you take something like Alzheimer's, which is a classic example: over the last 14, 15 years, there have probably been 500 unsuccessful clinical trials, mostly focused on amyloid or tau proteins,
[190] which the establishment believed were the heart and cause of Alzheimer's, and which are probably a consequence rather than a cause.
[191] But if you think about those 500 trials at a billion to $3 billion per clinical trial, you can see the effort that's been put into chronic disease with no amelioration whatsoever.
[192] And Nathan, do you want to talk about the recent study that's been done and where it stands?
[193] Yeah, I don't know if we want to dive into Alzheimer's at this point.
[194] Of course we do.
[195] So we've been doing a lot of work on the science of Alzheimer's disease, and I really believe that for most of the last several decades, when we focused on amyloid as the cause of Alzheimer's, that has been wrong.
[196] And so what we've done over the last three years is to build what's called a digital twin type model.
[197] So what you're doing is you're trying to represent computationally how the brain maintains itself.
[198] How does it stay alive?
[199] And we've taken data from about 950 papers in order to reconstruct this.
[200] And then we've run simulations for 10 million digital twins of patients.
[201] And what this lets us do is to evaluate different mechanisms that we can look at at the molecular level, but because we can simulate cognition now over the course of the lifetime, we can then look at the population of the digital twins and compare them to the actual population to see if different mechanisms explain what we see in the hospital or not.
[202] And so we've done this, and we can actually reproduce a huge amount of the data out there now around the following mechanism.
[203] So some of the things that we really think about in the brain: your brain consumes 20% of your body's energy, and it's 2% of your biomass.
[204] So it's 10 times more metabolically active than an average part of your body.
[205] Oh, wow.
[206] Let's just think on that for one second.
[207] That's good.
[208] So it's only 2% of your body mass, and it's 20% of your body's energy.
[209] It's an expensive organ.
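The "ten times more metabolically active" line is just the ratio of the two shares quoted, made explicit:

```python
# Relative metabolic intensity of the brain, from the shares quoted above.
energy_share = 0.20  # brain consumes ~20% of the body's energy
mass_share = 0.02    # brain is ~2% of body mass

print(f"~{energy_share / mass_share:.0f}x more metabolically active than average tissue")  # ~10x
```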
[210] And in fact, in studies where they do starvation of mice or animal models, you can watch it shift energy away from the peripheral limbs first.
[211] And the two very last organs that it will save are the heart and the brain.
[212] And if it pushes to the very limit, it prioritizes the brain.
[213] Over the heart.
[214] The brain's in charge, right?
[215] Yeah, yeah, yeah, yeah.
[216] It helps when you're making the decisions.
[217] If you're the one in charge.
[218] If you're picking teams,
[219] you're always on the good team.
[220] Shockingly, the brain presents itself as the priority.
[221] Yeah.
[222] So it's a huge energy hog.
[223] So what that means is that every second of every day, you've got to be feeding energy up to your brain for it to stay alive.
[224] It can be pulling from food you eat or fat stores or wherever, but you have to use energy.
[225] So what happens as you get older is that you lose the ability to perfuse oxygen into your brain.
[226] It goes down with time.
[227] It's why exercise is so protective against Alzheimer's disease and virtually everything else, right?
[228] Yeah.
[229] So as you get older, and this is all measured, that ability to perfuse oxygen throughout your brain goes down, and it's not evenly distributed in your brain.
[230] So certain regions get low.
[231] There's a priority list within the areas of the brain?
[232] Yeah, or just the way the blood vessels are distributed.
[233] So some parts are just harder to push into than others.
[234] Got you.
[235] And then there was a clinical trial that came out last year and they had done eight years of PET scans.
[236] So you can look at metabolism in the brain.
[237] And what they saw was that there were these areas of hypometabolism or low metabolism as it's called, so low energy, that were the first regions where you'd get Alzheimer's.
[238] Is Alzheimer's always in the same location in the brain?
[239] There are certain locations where it's more prevalent.
[240] It's not always the same, but there are regions that it's more likely to happen, like the cortex, for example.
[241] The frontal lobe?
[242] You can get it in the frontal lobe, you can get frontal lobe dementia and the cerebellum.
[243] So your neurons have to make enough energy to stay alive.
[244] So they have a certain demand, and then they have a supply.
[245] Supply's got to be higher than demand.
[246] And so as long as you're in that regime, you're fine.
[247] Now, as you get into lower oxygen, that becomes harder.
[248] So you start having this decrease to where the amount of energy that you can produce is not enough to keep those neurons alive so they start to die.
[249] As your neurons die, as you try to maintain cognition, it pushes the demand up on the remaining neurons.
[250] So their floor is going up, and because of the oxygenation, their ceiling is coming down, and so you get this cascade of changes.
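A toy simulation of the supply-and-demand cascade being described; the decline rate, demand level, and loss rate are invented for illustration and are not parameters from the actual digital twin model:

```python
# Toy model of the neuron energy cascade: supply (oxygen perfusion) falls with
# age while per-neuron demand rises as neurons die. All numbers are illustrative.
def simulate_brain_energy(years=80, perfusion_decline=0.008, loss_rate=0.05):
    neurons = 1.0  # surviving fraction of neurons
    history = []
    for year in range(years):
        supply = 1.0 - perfusion_decline * year  # ceiling falls as perfusion drops
        demand = 0.6 / neurons                   # floor rises: fewer neurons, same cognition
        if supply < demand:
            neurons *= 1.0 - loss_rate           # energy deficit kills neurons -> the cascade
        history.append((year, supply, demand, neurons))
    return history

for year, supply, demand, neurons in simulate_brain_energy()[::10]:
    print(f"year {year:2d}: supply={supply:5.3f} demand={demand:5.3f} neurons={neurons:5.3f}")
```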
[251] Now, as that happens, as you start losing the neurons, you start losing synapses, you go below a threshold where you can't do something that's called Hebbian learning.
[252] This is what fires together, wires together.
[253] It's how your brain learns.
[254] And so as that goes down below a certain threshold, you start losing your cognitive abilities.
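"Fires together, wires together" has a standard minimal form, the Hebbian weight update; the learning rate and activity values here are arbitrary:

```python
# Minimal Hebbian rule: a synapse strengthens in proportion to correlated
# pre- and post-synaptic activity; uncorrelated firing leaves it unchanged.
def hebbian_update(weight, pre, post, learning_rate=0.1):
    return weight + learning_rate * pre * post

w = 0.5
for pre, post in [(1.0, 1.0), (1.0, 1.0), (0.0, 1.0)]:
    w = hebbian_update(w, pre, post)
print(round(w, 2))  # 0.7 -- strengthened only by the two co-firing events
```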
[255] So the brain actually has to recruit additional synapses to keep functioning.
[256] And so it has to secrete a molecule to recruit synapses.
[257] And in recent studies, one of the things that's come out is, you know, what is that molecule?
[258] Amyloid beta.
[259] So amyloid beta, rather than being the cause of Alzheimer's.
[260] Right.
[261] It was correlated, but it was the chicken, not the egg.
[262] Exactly, exactly.
[263] And it's even a little worse than that because it's part of the brain responding.
[264] So if you go into models and you clear out amyloid, what does the brain immediately do?
[265] It immediately starts secreting amyloid again, unless you've solved the underlying issue.
[266] Right.
[267] It was really confusing for a long time because there are genetic signals that point towards amyloid.
[268] And amyloid can have a negative effect.
[269] So one of the things amyloid can do is embed in blood vessels, which constricts your oxygen, which feeds back
[270] into that central mechanism.
[271] And there's all kinds of complexity, of course, we're not diving into.
[272] But what we saw was that you could start hanging all of this data around this central hypothesis, which we turned into a quantitative model.
[273] So it has to simulate all of this at the same time.
[274] And then we simulated brain health over the lifespan for 10 million people.
[275] And what we saw was that that population from this mechanism looked very much like what we could see.
[276] So we put in mechanisms like the genetics.
[277] So if you have APOE4, you're at high risk for late-onset Alzheimer's, and APOE2 protects you.
[278] And we can simulate those mechanisms in the brain, and it immediately shows that people with APOE4 would get Alzheimer's earlier.
[279] So half of those people, if they have two copies, will have Alzheimer's by the time they're 70.
[280] If you have APOE2-2, 80% of people won't have Alzheimer's by the time they're 90. So, big effects.
[281] Whoa.
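For reference, the two genotype figures just quoted, written down as data a simulation might condition on (only these two genotypes were stated; anything beyond them would be an assumption):

```python
# APOE genotype risk figures exactly as quoted in the conversation:
# (age, probability of having Alzheimer's by that age).
apoe_risk = {
    "APOE4/APOE4": (70, 0.50),  # half of two-copy carriers affected by age 70
    "APOE2/APOE2": (90, 0.20),  # 80% of E2/E2 carriers still unaffected at 90
}
for genotype, (age, p) in apoe_risk.items():
    print(f"{genotype}: {p:.0%} chance of Alzheimer's by age {age}")
```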
[282] We had a psychiatrist who's a professor at Harvard on with a book proposing that all mental health disorders are really metabolic disorders.
[283] And this seems to work in concert with what you're saying.
[284] That's very much in line with what we're seeing.
[285] I wouldn't go to all because I just haven't studied all, but is it a massive factor?
[286] The number one fact that I mentioned about the brain is it is a massive energy hog.
[287] So that always comes up.
[288] It has to solve that.
[289] And if you can't solve that, it's going to cause problems in many, many ways.
[290] Okay, now this maybe gets into at least one of the descriptors when I look you up, which is you're a systems biologist.
[291] So would I be right to assume that most people have become specialists? Even if you're a neurologist, you probably specialize in some area of the brain.
[292] And you can drill down deeper and deeper and deeper.
[293] But maybe there are fewer people dedicated to looking at how the whole thing's working together.
[294] And is that what a systems biologist does?
[295] And how rare are you folks?
[296] So systems biology was extremely rare at the beginning.
[297] In fact, many people consider Lee the father of systems biology.
[298] He's probably the first systems biologist by some accounts.
[299] Oh, really?
[300] Started the first institute.
[301] He could tell you all about that.
[302] Partly, if you think about systems biology, which is taking all these pieces together, it is a reaction to molecular biology where you were building up all the pieces.
[303] But now you want to know: how do the pieces fit together, and how do they give rise to all the emergent, what we call, phenotypes, all the things you would care about?
[304] How do you think?
[305] How do you move?
[306] How do you have energy?
[307] Why do you not?
[308] And so forth. It's all a process of how those things interact.
[309] And so I really grew up in this interdisciplinary world where I was studying systems biology from the beginning.
[310] That was really only made possible because of Lee, who had invented that as a field.
[313] He's 40 years older than me. Sometimes people ask, well, where were you when Lee did this?
[314] I'm like, I was in kindergarten.
[315] Right, right, right.
[316] Being frustrated with my phenotypical expression of my athleticism or this or that.
[317] Exactly.
[318] I was in kindergarten building models of the brain.
[319] But, you know, the other thing, right on top of Alzheimer's being a metabolic disease, is that it then gets you to start thinking about therapies that deal with diet, therapies that deal with exercise, therapies that deal with stress. I mean, it's all a part of this system, and those are all major factors, and we're coming to realize how important these things are. In the model that Nathan talked about, one of the things that had the biggest impact on delaying Alzheimer's in these individuals was exercise, and it was strikingly better than any drug. This is also the case with mild to moderate depression: exercise three times a week as a preventative has a much higher success rate.
[320] Yes.
[321] Okay, so knowing that it's a metabolic condition, then we would go further up river, right?
[322] What are the organs that we most want to bolster?
[323] So your heart, obviously, everything that pulls oxygen out of the air, puts it in your bloodstream, and delivers the oxygen to your brain anywhere else, seems like that would be maybe step one.
[324] And the things you do that increase the delivery of oxygen to the brain, like exercise, are really important.
[325] Well, yeah, when we, I am forgetting his name.
[326] I know, he's the most beautiful guy, too.
[327] It's just upsetting.
[328] Yeah, we suck.
[329] But he was a big proponent of the keto diet because of how specifically it relates to, well, epilepsy, which is how it started.
[330] Yeah, these metabolic disorders.
[331] The best pharmacological response to it is like a 50% success rate, but people on a keto diet will have like 87% or something really drastically high,
[332] relief from epilepsy through that diet.
[333] Chris Palmer.
[334] Thanks, Wavi.
[335] Okay, so because you're looking at the whole system and we're entering a new era in which we're going to have great help and assistance in monitoring ourselves.
[336] I think it would be a fun time to let people know that you're one of the fathers of the genome project.
[337] Were you working with Eric Lander by chance?
[338] We worked together on the genome project.
[339] Yes.
[340] We did.
[341] Okay.
[342] Fascinating gentlemen.
[343] But cracking the genome was supposed to be the end of everything.
[344] In essence, right?
[345] We were all so excited that if we knew the exact building blocks and we knew our own individual building blocks, that from that would come all these treatments and all these specialized medicines, perhaps.
[346] But of course, that was going to be kind of unachievable because what would you have?
[347] One human dedicated to another human's health.
[348] It would be too much for a doctor, wouldn't it, to know entirely what's going on with somebody?
[349] Absolutely.
[350] Okay, so what bits of information do we now have?
[351] and what new technology is emerging that can help us get a fuller picture and learn to treat our body as a really complex system and do things upriver?
[352] Well, I think we've gained enormously sophisticated techniques with systems biology in terms of being able to take very different kinds of data and integrate them together.
[353] And in integrating data that tell you about the different organs in your body, the inside and the outside of your body, your brain, and so forth, as you integrate those data sets together, in a sense what you're doing is recreating the organism and setting it up so you can make predictions about the organism and where it's going. Nathan has played a major role in developing some of the computational techniques for doing this kind of thing. But I think the really important point that's become very clear is you cannot understand the subtleties and complexities of a human being if you study populations and average things together.
[354] What you have to do is you have to look at one person at a time.
[355] You have to gather in each of their data and you have to assemble it because that data is a reflection of what you are.
[356] And the extent to which you can interpret and integrate it and understand it, you'll come to understand the person.
[357] And I think one of the most powerful tools in the future for dealing with this complexity is going to be AI and the large language models and hyperscale AI kinds of things.
[358] And there are two visions which I have for the future of AI.
[359] One vision is that we carry out the human phenome initiative: basically the idea that we take a million people and over a 10-year period we analyze their genome and their phenome.
[360] That will do three really important things.
[361] One, it'll increase the quality of health care enormously.
[362] So as a model system to validate that, it's really important.
[363] And two, it's going to show that we can save trillions of dollars in cost for the health care system.
[364] And three, what it's going to generate, because this encompasses your entire body, are little vignettes called actionable possibilities that, if you carry them out, either let you improve your wellness or avoid disease. These actionable possibilities come from statistical correlations in the data, which we can then go to the literature and validate. With this million-person project, we may have 10,000 of those.
[365] And there's no way any physician could begin to understand the diversity of ways that we can improve your health.
[366] So what AI will have to do is, one, send a message for each of these actionable possibilities to a physician that's appropriate for their patient and explain what it is.
[367] So the AI will do some analysis.
[368] It'll make a recommendation.
[369] but that is turned over to an actual physician.
[370] It's turned over to a physician.
[371] But the other thing AI has to do because physicians are appropriately skeptical types is it has to validate why this possibility is real.
[372] So both of those things will change the fundamental role of a physician because you will then become the master of every field of medicine.
[373] You'll have 10,000 possibilities at your
[374] beck and call, where you can do the brain, or you can do diabetes, or you can do aortic stenosis.
[375] You'll be an expert in virtually every aspect of medicine through the AI.
[376] The second thing is we're going to take these large language models, and we want to take one and educate it only about medicine, so it doesn't get into conspiracy theories and all the other things on the internet. And once it's properly trained, and we can put in very large knowledge graphs that relate genes to diseases to drugs to all these different kinds of things, and once we put in the information from digital twins like the one Nathan described for Alzheimer's, and once we put in the entire literature of biology, PubMed it's called, then you'll have an educated system.
[377] And our hope is we'll be able to take your genome and phenome data as complex as it is and put it into that large language model and have it identify the gaps and insufficiencies and new kinds of opportunities for growth and avoidance of disease and come out with a prioritized list of actionable possibilities for you that will transform your life entirely.
[378] I want that.
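As a sketch of what a "prioritized list of actionable possibilities" might look like computationally: score each candidate by statistical strength and keep only those the literature check supported. The fields and scoring are hypothetical, not the actual system described:

```python
# Hypothetical prioritization of "actionable possibilities" derived from
# genome/phenome correlations, each checked against the literature.
from dataclasses import dataclass

@dataclass
class ActionablePossibility:
    name: str
    effect_size: float          # estimated benefit from the correlation
    confidence: float           # strength of statistical evidence, 0..1
    literature_validated: bool  # did the literature support the correlation?

def prioritize(candidates):
    valid = [c for c in candidates if c.literature_validated]
    return sorted(valid, key=lambda c: c.effect_size * c.confidence, reverse=True)

candidates = [
    ActionablePossibility("increase exercise", 0.8, 0.9, True),
    ActionablePossibility("vitamin D megadose", 0.4, 0.7, True),
    ActionablePossibility("unreplicated correlation", 0.9, 0.3, False),  # filtered out
]
for c in prioritize(candidates):
    print(c.name)
```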
[379] What we learned from the first hundred years of consolidated medicine, from the 20th century until now, is that we're going to have to shift focus from treating to preventing, a transition from disease to wellness. Ultimately it's going to be focused on two things: how to optimize your wellness, and how to prevent any transition to disease, to keep you out of ever needing a treatment. And see, that'll be the biggest revolution ever in the history of medicine: to keep everyone healthy as opposed to treating people who are sick.
[380] Well, and the major idea is we want your health span, the number of years you're mentally agile and physically capable, to extend out to your lifespan and both of those to go into your 90s and hundreds.
[381] And suppose you can be fully functional, both physically and mentally, at 90.
[382] I mean, you're not going to want to have retired 40 years earlier.
[383] Yes.
[384] So this gets into the sociology of what do you do with those extra 30 years?
[385] And they're not just an extra 30 years.
[386] They're an extra 30 functional years.
[387] And the question is, how do we deal with things like education and jobs?
[388] Your job may decay away.
[389] So you need to be able to go get another job when you're 60 because we're not going to be retiring you at 65 or 70 when you're going to live to be 100 plus.
[390] Is there an overpopulation concern?
[391] Well, all those things are working in parallel to also the raising of the standard of living, which reduces birth rates.
[392] There's a book called Empty Planet that discusses the fact that all the developed nations are below replacement population,
[393] and the country that in many ways is most strikingly so is China.
[394] The other thing they point out is once you've gotten below that, it is unbelievably difficult to reverse that.
[395] And you know what the most important single factor is in bringing down that birth rate?
[396] It's the emancipation of women.
[397] It's all of a sudden women.
[398] They're educated.
[399] They want to have their own jobs.
[400] They don't want to be subservient.
[401] In Japan, many young professionals, scientists and so forth, don't want to touch marriage or anything like that.
[402] And the prediction overall from the book is within the next 10 years we'll start to turn around and we'll see a decrease in the population.
[403] What's going to happen to China if it doesn't have all these young people that can step into these jobs, or to any of us?
[404] Well, we're used to, most countries are aiming for, 2% growth annually,
[405] and you're going to be seeing a very predictable and precipitous loss every year.
[406] There's no country that's gaining population strikingly.
[407] In a decreasing population, you then have decreasing consumption.
[408] You're going to see this spiral downward in some way.
[409] So having an older, healthy population who still want to consume things and have recreational hobbies will maybe help, because conventionally those people had pretty much left the marketplace.
[410] Stay tuned for more Armchair Expert,
[411] if you dare.
[431] The other bonus with this attitude, if you think about it, is in general, as we grow older, we accumulate wisdom, but that wisdom's usefulness is terminated by our brain deficiencies.
[432] If we can put together experience and wisdom with functionality, then you have the ideal world, where you gain enormously in creativity and productivity, and maybe that's one of the ways you can begin to make up for this decreasing replacement population.
[433] Okay, so I think maybe a very easy example to give of genotypical and phenotypical responses would be if you can imagine there is a fetus, and the fetus is XY. It is genotypically a male, right? Its DNA says it's going to be a male. And then mom, for whatever reason, accidentally starts taking a testosterone blocker. She's supposed to be sending testosterone to the fetus, which activates the gonads descending and becoming testicles. That doesn't happen. The baby comes out, and the baby could come out phenotypically female. We would look at the baby and say there's a vagina; there's no penis, no testicles.
[434] So it could be genotypically male but phenotypically female, just by arresting the mother's delivery of testosterone.
[435] So similarly, we all have, sure, our DNA, we have the recipe, but all these environmental factors happen along the way that prevent some genes from getting expressed or all these different things.
[436] So just having what we're supposed to be isn't really the full picture, is it?
[437] We have all these tremendous environmental things that are interacting with our DNA.
[438] Let me explain what the phenome is because I think you've hit it right on.
[439] So the phenome is how we look all the way across our developing life.
[440] So in a sense, if we take snapshots of you, we have an infinite number of phenomes.
[441] And the way we can assess the phenome is to look at blood components.
[442] We can look at the gut microbiome.
[443] We can look at digital health factors and many other kinds of things.
[444] And they begin to give us explanations for how your phenome has changed.
[445] But when you look at it most deeply at the highest level, the phenome is the responsibility of three factors.
[446] It's your genome.
[447] It's your behavior.
[448] And it's your environment.
[449] And all of those play incredibly important roles.
[450] Yeah.
[451] And the percentages we don't really know yet, right?
[452] Historically, we've had this nature-nurture debate, and more and more everyone's like, no, they're one thing.
[453] And it depends on what your environment is.
[454] That is, your environment under ideal circumstances may play a relatively minor role in your health.
[455] But if you were in Hiroshima when it got bombed, you had massive radiation damage that's strictly environmental and you had that for the rest of your life.
[456] Right.
[457] But knowing that you have your DNA, but then you have all these other factors and variables.
[458] A, it's too much for even the individual to keep track of,
[459] let alone to have in your mind at all times. At this stage, we need some computer assistance.
[460] It would be too much.
[461] Well, with the phenome, that's exactly what we can give you, because we can track the genome, the behavior, and the environment beautifully.
[462] The one we don't measure very well today, because it is so complicated, is the environment, but more and more effort is being put into that. You have to take that complexity, which is massive, right? Once we start talking about genome, environment, all these things, no person can get their mind wrapped around it. No doctor can get their mind wrapped around it, right? There's no chance. So you have to build these things into a system. Just to come back to the example of the digital twin with Alzheimer's, because we talked about all the mechanisms, but it actually tells you things about what you can do: you can put in your genes, the genes that affect Alzheimer's disease.
[463] You could put in different blood measures and things of this nature.
[464] And then it will run a simulation of expected cognitive ability over the course of life.
[465] It will then recommend a series of interventions.
[466] Could be exercises, and certain exercises actually work a little better than others.
[467] There's a recent study that suggested anything where you jump up and down a lot.
[468] You get like the blood kind of pumping that way.
[469] It's particularly good.
[470] I have a trampoline in my living room, so I was excited about that.
[471] You're one step ahead.
[472] Another element: when we run the simulations, as the metabolism goes down, one compound that becomes rate-limiting is phosphatidylcholine.
[473] You can supplement with it, but it's also just in eggs.
[474] It literally runs out.
[475] So when we saw that in the model, we then looked in the nutrition literature and just asked the question: do people who eat diets rich in phosphatidylcholine get Alzheimer's later than people who don't?
[476] And we go look it up, and it turns out, yeah, about three years later on average.
[477] and we think we can push that to about five.
[478] Now, what you can do with the model is you can look at all those different axes, exercise, phosphatidylcholine, vitamin D plays a role, inflammation plays a role, stress plays a role.
[479] So there's all these different axes.
[480] But then what it will do is from your measurements, it will make an estimate, and it will tell you what percentile of a responder we think you are to every single one of them.
[481] And then it can make a prediction that will show if you do these things, what is the expected benefit?
[482] And these are all probabilities.
[483] Nothing's set in stone, but it will predict how far in advance we think you can push off what would be your expected time of dementia, and so forth.
[484] So this is the capability that's just come on board. In the last few weeks we've really put it into a visualization engine so that we can use this.
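A hypothetical sketch of the per-person report just described: each intervention axis carries a population-level effect (years of delayed onset) scaled by the individual's estimated responder percentile. Only the roughly three-year phosphatidylcholine figure was quoted above; the other effect sizes, and the simple additive stacking, are placeholders:

```python
# Placeholder per-person intervention report. Effect sizes are illustrative
# (only the ~3-year phosphatidylcholine figure was quoted); real combination
# effects are non-linear, so simple addition is a simplification.
population_effect_years = {
    "exercise": 4.0,
    "phosphatidylcholine": 3.0,
    "vitamin_d": 1.5,
    "inflammation_control": 2.0,
}

def predicted_delay(responder_percentiles):
    total = 0.0
    for axis, effect in population_effect_years.items():
        pct = responder_percentiles.get(axis, 0.5)  # 50th percentile -> average effect
        total += effect * (pct / 0.5)
    return total

me = {"exercise": 0.8, "phosphatidylcholine": 0.5, "vitamin_d": 0.3, "inflammation_control": 0.6}
print(f"Predicted delay in expected dementia onset: ~{predicted_delay(me):.1f} years")
```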
[485] I'm glad you mentioned vitamin D, because I keep wanting to bring it up. I was thinking, even in the simplest terms: I have psoriatic arthritis, okay?
[486] I have a prescription, originally for Humira. It's the same prescription
[487] everyone would get, no matter where they're at in the world, no matter what their genes are. It's baseline.
[488] But yeah, it doesn't ask me where do I live in America?
[489] What amount of vitamin D am I getting naturally?
[490] This is a huge component that varies so greatly on what latitude you're on, and it's not really being asked by a doctor.
[491] It's not really in the mix when we're coming up with any kind of solutions.
[492] It's one of the many variables that would be helpful for some model to incorporate.
[493] Absolutely.
[494] And so you can do measurements of it, so you can see what your actual level is.
[495] Rather than trying to guess at it, we did measurements in our Pioneer 100 study that Lee and I did, and we found 91% of the people that we looked at were deficient in vitamin D. Now, that was taken in Seattle.
[496] Although when they do geographic analyses, it's a lot more than that because even right now, right, we're sitting indoors.
[497] Our ancient ancestors would have been walking around on the plains.
[498] I don't know if you can tell, but I'm red from about here up, because we went on a hot air balloon ride yesterday.
[499] Wait, you two went on a hot air balloon?
[500] Yeah, Lee and I did yesterday.
[501] This is fucking adorable.
[502] Where did this happen?
[503] And here in Los Alamos?
[504] Los Olivos, which is a little...
[505] Oh, yeah.
[506] Oh, lovely.
[507] We were doing an event out there.
[508] And it is spectacular.
[509] I'd recommend it for anybody who has even an ounce of adventure in them.
[510] Oh, you'd love it.
[511] Was that your first hot air balloon ride?
[512] First one.
[513] Oh, my God, I'm so happy.
[514] And, you know, what was so cool is how important it is to find the various levels of wind that go very different directions when you're coming down.
because you want to land at a certain place, but you have no control, so you have to be able to go up and down and find it, position yourself, and then suddenly go down. You really get a sense of the inversion too, because you're going up, and it's pretty cool, and then you cross a threshold and instantly it's like 85 degrees. It's hot. It goes from 60 to 85 degrees, because it was trapping all the clouds underneath. Yeah, cool. I grew up in an area of Michigan where they were flown a lot, and they would regularly land in our backyard. They would miss their bull's eye.
[516] Yes, and I was asked often as a young kid if I could help ballast it.
[517] Like they couldn't get it to descend.
[518] Yeah, so I've not gone all the way up, but I've drifted across my backyard probably a dozen times.
[519] You just been hanging from it?
[520] Yeah, you just come and jump on the side and help them get it down.
[521] Yes, yes, I did that a bunch of times.
[522] It was slow.
[523] Like you grew up in the 1850s or something.
[524] The Zeppelin era, the dirigible era.
[525] So let me give you my own personal experience with vitamin D to show you it's a little bit more complicated than what we've discussed.
[526] So I was one of the 91% in the study Nathan mentioned, and I was really low.
[527] So I started taking the typical dose to boost things up, a thousand international units, and it did nothing for a period of a month or so.
[528] And then it turned out I had two variant genes, each of which blocked the uptake of vitamin D. And the only way I could get around that is to take megadoses of vitamin D. So I took 15,000 international units, and now I maintain with 5,000 international units.
[529] So not only do you have to measure the level of vitamin D, you have to know something about your genetics as well.
[530] And that's why I'm saying, in this new medicine, we are going to have this integrative systems view that says: you're low, take 15X.
[531] You can't do it with 1,000 international units.
[532] You do it with 15,000.
[533] So the whole question of whether it's just simply sun becomes much more complicated then.
[534] Surely, surely.
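Hood's anecdote amounts to genotype-aware dosing logic; a minimal sketch using only the doses he states (treating the count of uptake-blocking variants as the trigger is an illustrative simplification):

```python
# Sketch of genotype-aware vitamin D dosing from the anecdote above.
# Doses are the ones quoted; the decision rule itself is illustrative.
def suggest_vitamin_d_dose_iu(level_is_low, uptake_blocking_variants):
    if not level_is_low:
        return 0        # measured level is fine; no supplementation needed
    if uptake_blocking_variants >= 2:
        return 15_000   # megadose to overcome blocked uptake (5,000 IU to maintain later)
    return 1_000        # typical corrective dose

print(suggest_vitamin_d_dose_iu(True, 2))  # 15000 -- two blocking variants, as in Lee's case
print(suggest_vitamin_d_dose_iu(True, 0))  # 1000  -- typical case
```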
[535] But I remember, because I have this psoriatic arthritis, people with that track very low on vitamin D. But then it becomes the same thing you were talking about with Alzheimer's.
[536] Is the low vitamin D a response to the inflammation, or does the low vitamin D help create the inflammation? You get into this chicken-or-egg question: which one is a byproduct of the condition and which one is causal to the condition?
[537] You absolutely do.
[538] And that's why when you find correlations, it's a hint, right?
[539] Things that are causal are correlative, but not necessarily vice versa.
[540] The other really interesting thing from these digital twin simulations we've done, we'll often look at, like, what's the effect of taking vitamin D?
[541] And the answer will be, like, virtually nothing.
[542] And what's the effect of taking an anti -inflammatory?
[543] All these different things in singular, for a lot of people, are very small.
[544] But when you start doing combinations of them, you start stacking it.
[545] All of a sudden, the effect size goes to, oh, it's going to delay it for a decade or 15 years. You hit critical mass, because the problem is that everything is a system.
[546] And it's an issue also with how we do clinical trials because we always look at one variable.
[547] The news report will say, none of this stuff really matters.
[548] But at least in this first, and this is a very new capability of the digital twins, I have not actually seen another digital twin version that is operational yet.
[549] When you start looking at those things, what happens is because it's a system's problem, there's all these different issues.
[550] So you solve one, but you just hit the next block.
[551] And then you do that in the next block.
[552] But as soon as we start putting them together, you just see this breakthrough where all of a sudden, at least in the simulated mechanism, it starts making a big difference.
[553] And we'll be doing a prospective clinical trial.
[554] Because each step has an impact on the other step.
[555] So by going up 10% from baseline across the board, all those start functioning 10% better.
[556] Because it's all working together.
[557] Exactly.
[558] Everything's interlocked.
[559] So if you fix one gear, but there are four other gears that are broken, the contraption isn't going to turn.
[560] But if you fix them all, then it does.
[561] I just go straight to mechanical things.
[562] Like, there are so many issues in an engine.
[563] It's exactly the same kind of thing.
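The gear analogy maps onto a weakest-link model: the system runs at the rate of its most limiting factor, so fixing one constraint shows no benefit until the others are fixed too. A toy version with invented factors:

```python
# Toy weakest-link model of why single interventions show little effect
# but combinations break through: the most limiting factor sets performance.
def system_function(factors):
    return min(factors.values())

broken = {"oxygenation": 0.4, "choline": 0.4, "inflammation": 0.4}
print(system_function(broken))                    # 0.4

one_fixed = dict(broken, oxygenation=1.0)
print(system_function(one_fixed))                 # 0.4 -- you just hit the next block

all_fixed = {factor: 1.0 for factor in broken}
print(system_function(all_fixed))                 # 1.0 -- the breakthrough
```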
[564] The really important thing is these chronic diseases that we were talking about, all of them are going to require multiple therapies.
[565] So the whole question is, how can you do clinical trials where you look at five different things?
[566] Yes.
[567] And we've got to start rethinking the classic approaches to clinical trials.
[569] Because if you do the five one at a time, the report will be nothing.
[570] Yeah.
[571] You miss the essential point of you have to add them together.
[572] The other thing that's really important about that is your five isn't my five, isn't Monica's five.
[573] And so one of the things that you can do in these clinical trial settings then is don't test a compound.
[574] Test the algorithm.
[575] So if you take, for example, something like these digital twin models that we have, you're going to test that.
[576] So what you'll do is you'll use that algorithm on a per -person basis, and then you will take what, through the AI systems and the digital twins, it says, is the best thing for that person.
[577] And then you evaluate that group against a control group.
[578] But every single person in that trial might have done something different.
[579] You're not testing a drug.
[580] You're testing a system of intelligence that operates on many things at the same time.
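A schematic of "test the algorithm, not the compound": everyone in the treatment arm receives whatever the system recommends for them individually, and the arms are compared on outcomes. The recommender and the outcome numbers below are placeholders:

```python
# Schematic algorithm-based trial: the unit under test is the recommendation
# system, not any single compound. All numeric details are placeholders.
import random

def recommend(person):
    # Stand-in for a digital-twin/AI recommender: pick the intervention
    # predicted best for this individual; each person's pick can differ.
    return max(person, key=person.get)

def run_trial(cohort):
    treatment, control = cohort[::2], cohort[1::2]
    treated = [person[recommend(person)] for person in treatment]  # personalized picks
    untreated = [0.1] * len(control)                               # placeholder baseline
    return sum(treated) / len(treated), sum(untreated) / len(untreated)

random.seed(0)
cohort = [{"exercise": random.random(), "diet": random.random(), "supplement": random.random()}
          for _ in range(100)]
print(run_trial(cohort))  # (treatment arm mean, control arm mean)
```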
[581] Yeah, so if I go see a doctor here in Los Angeles who's seen, I don't know what the number would be, let's say they see 2,000 patients in their lifetime.
[582] That's kind of their data set that they've had experience with.
[583] Whereas if I'm using AI, I put in all my stuff.
[584] I'm 6'3".
[585] I was asthmatic.
[586] You just add all these things.
[587] And within L.A., there might only be two other people that got the same setup as me. But globally, and with AI, it could be observed.
[588] We could find the 2,500 other people that are.
[589] Yes, my cohort.
[590] And what has worked on my cohort is something AI can do in one second,
[591] eventually.
[592] It can scan the globe and it can put me in a group that would be most predictive of a course of action for me. If you have something that's simulating mechanism, so you think you know the underlying biology, for example the 10 million digital twins, that's a mapping across the space, so that when a person comes in and we say, who is this person like, you map them onto the twins that are closest to them.
[593] And then the more information you get, the more granular you can be, the more accurate you can be.
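The "map them onto the twins closest to them" step is essentially nearest-neighbor search over the simulated population; a minimal sketch with made-up features:

```python
# Minimal nearest-neighbor matching of a new person against simulated twins.
# Features are made up; a real system would use genome/phenome measurements.
import math

def distance(a, b):
    return math.sqrt(sum((a[key] - b[key]) ** 2 for key in a))

def closest_twins(person, twins, k=3):
    return sorted(twins, key=lambda twin: distance(person, twin["features"]))[:k]

twins = [{"id": i, "features": {"age": 40 + i % 30, "vitamin_d": 20 + i % 15}}
         for i in range(1000)]
me = {"age": 48, "vitamin_d": 24}
print([twin["id"] for twin in closest_twins(me, twins)])
```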
[594] Because for a woman who gets the BRCA test, they do 23andMe and they find out they have the BRCA gene, right?
[595] I don't know what percentage of the pie that would be, but minimally it's not more than what, 30% of the pie.
[596] Yeah, I think it's less.
[597] I don't remember the exact number.
[598] Yeah, I just less than that.
[599] It's much less than that.
[600] Right.
[601] But I don't remember what.
[602] So now we need the pie, but the pie of what? Oh, breast cancer?
[603] Yes, like what will ultimately lead to breast cancer.
[604] It's a part of it, but it's maybe one fifth of the part or maybe it's one eighth.
[605] And now we find out four other things about this woman.
[606] Now we have six of the eight data points that we compare to everyone else and now we have a really good idea of what predictions we can make and you know the cool thing as nathan said once you've identified a conceptual twin that's very close to you you can use that twin to make all sorts of experiments to see how you can be optimized because the twin will have all the data that we've gathered from everywhere whereas on you will have a very limited amount of data by comparison yeah so we can use twins to extrapolate into areas we have no knowledge about, I think medicine is going to be profoundly changed by digital twins.
[607] It's fascinating, though, because it goes back to the age-old ethical question: if you could walk into a body scan and it could tell you absolutely everything about yourself, would you want to know that?
[608] We're here.
[609] This is what we're essentially saying.
[610] The bigger question is like asking people, if we could tell you exactly when you were going to die, would you want to know?
[611] That's like...
[612] Right, because there's some element of that, right?
[613] Like, oh, you can get this many more years, or you can't.
[614] There is an element to that, and we've been using Alzheimer's as an example a lot today, because that's one of the issues that came up when genomes first got out to people, and 23andMe and Ancestry and these groups were getting out to consumers.
[615] That was always the line in the press: oh, you don't want to find out about these things, because it will scare you, and there's nothing you can do.
[616] Right.
[617] And the thing that always drove me nuts was that there's not nothing you can do.
[618] There's no drug that you can take that will cure you.
[619] But there's definitely things you can do.
[620] And the other issue I want to come back to, which you pointed out before, the critical difference between thinking about wellness and prevention and disease.
[621] If you think about the problem, the challenge of curing Alzheimer's, your neurons have died, your synapses are gone, all the encoding of memories, which is incredibly complicated.
[622] Your brain doesn't go through mitosis.
[623] It doesn't repair.
[624] Yeah.
[625] How are you bringing that back?
[626] That is sci-fi, you know, the notion that there's a small molecule drug that you're going to take.
[627] No, you need a time machine.
[628] If you had a time machine, that would be ideal.
[629] Yes, it would be perfect in the situation.
[630] Or you need something like, you know, Elon Musk's Neuralink and a brain map and a download.
[631] Like, you could think about it, but it's sci-fi stuff.
[632] Yeah, you'd have to replace the damaged components of your brain with something digital.
[633] Incredibly hard.
[634] Prevention.
[635] Yeah.
[636] Don't let your neurons get into negative energy; keep them alive.
[637] That is a challenge that is way easier.
[638] Yeah.
[639] Well, then we get into the much bigger and harder philosophical question, which is that people don't seem to gravitate toward prevention.
[640] That's the weird human hurdle.
[641] Like things that they know will help.
[642] Preventative, yes, when you tell people if you do this, you know, they're willing to go get a surgery.
[643] They'll have something cut off.
[644] They'll do anything to deal with the problem once it arises.
[645] But the prevention model is really hard to get people to adopt.
[646] But I think if we have the assistance of our phone tracking it, phone reminding us, our phone setting the schedule for us, it would be easier for us to adhere to.
[647] Well, the other thing we can really push is education.
[648] I set up an education group, five people, to do K-12 science education in 2000 when we started ISB.
[649] Just this year, they've finished a four-module course on systems medicine using a textbook we're going to have published next year.
[650] Basically, this is a full, long course.
[651] It represents all the aspects of citizens' medicine that one can have.
[652] And I'll argue that the students that graduate as juniors or seniors with that program will know more about medicine in the future than 95% of the physicians that are out there.
[653] Moreover, they will realize what wellness and prevention are about.
[654] and be very adapted to it.
[655] So I think early education is one really key direction that we can take to start changing the whole culture of exactly what you said.
[656] We don't want to touch prevention.
[657] Yeah, yeah, yeah.
[658] People don't understand it.
[659] But if you really understand it and you get really good examples of the difference it makes, that changes people's attitudes.
[660] And linking it to incentives: you could just introduce people to the cost of not doing that.
[661] People are kind of motivated by the notion that they'll get to be retiring in a beautiful house in some warm area.
[662] So kids are just more adaptable to hearing stuff like that.
[663] They're more malleable, I feel like once you're an adult and you hear.
[664] You're like, I don't have time to eat anything other than McDonald's.
[665] I'll tell you the other thing about Alzheimer's, though, is it takes away one person's life as you go through the disease, right?
[666] It takes away a second person's life.
[667] and that's the caretaker.
[668] Oh, yeah.
[669] Most caretakers don't have the money to do anything but spend their full time taking care of their partner.
[670] And you end up expending your retirement.
[671] It's just the most horrible situation to be in.
[672] Yes, you've been on this journey with your wife for about 17 years now or something.
[673] Right.
[674] In reading the chapter where you tell us what it is like being a caretaker, and even more heartbreaking, the spouse of someone who knew her in all of her glory and all of her prime,
[675] And to watch it slowly disappear and to see the frustration on her face as she realizes what's happening.
[676] The simple thing you wouldn't think of like you loved living on a lake in Washington.
[677] And she got to a point where she's like, this is too much for me. I need to be in a small place that I can manage.
[678] And you're going to have to go along with that.
[679] And then even when we get there, there's going to be now more problems.
[680] And you really get a sense of how impactful it is on, yes, the caretakers.
[681] And often there's many caretakers whose lives are altered.
[682] It's a uniquely cruel disease.
[683] But, you know, the other point I'd make, and it's really an important one, is in our family, my wife had one copy of the APOE4 gene, and I have a copy of the APOE4 gene.
[684] So I have to worry about it.
[685] Is it a recessive gene?
[686] It's not that simple.
[687] It has much larger effect if you have two copies, but if you have one, especially for women who are much more susceptible to Alzheimer's, you have a higher tendency toward it.
[688] But if you find yourself in your family with a member that has Alzheimer's, you ought to all be checked for the genetic propensity because I think they're powerful, preventive measures that have to do with behavior and have to do with drugs that you can take.
Nathan talked about phosphatidylcholine.
If I had known in 2005, when my wife was diagnosed, what I know today, I don't think she'd be where she is.
I think she'd be with us and we could have done a preventive kind of mode.
[693] And I will say my son, for example, has two bad copies of the gene.
[694] And we've put him on a very strict preventive mode.
[695] One of the most effective aspects of it is exercise.
[696] Well, that's the best part is you'll be reaping the benefits of it long before.
[697] Absolutely.
[698] And he's an ultra marathoner.
[699] He's Hans and he lives in Alaska.
[700] You know Alaska.
[701] So he does all these physical things that are probably the best single thing you can do for yourself.
[702] Is there a specific type of exercise or just any exertion?
[703] I think anything that gets oxygen to the brain is really key.
[704] Okay, so you give some examples of where this AI is already assisting doctors.
[705] I think there's a couple of interesting examples.
[706] First of all, there's an alarming statistic in the book.
The Age of Scientific Wellness: Why the Future of Medicine Is Personalized, Data-Rich, and in Your Hands.
You have this statistic in here, which is almost hard for me to believe: medical mistakes account for about a quarter million deaths annually in the U.S. alone, around 250,000 people a year.
[709] That seems...
[710] It's so scary.
Preposterously high.
[712] It sounds preposterous.
[713] It's a huge number.
[714] And if you look at these lists, so you'll see cardiovascular disease, cancers.
[715] Medical errors is one of those top several that are on there.
[716] Now, that is, of course, quite hidden for obvious reasons.
But it is low-hanging fruit for some of the kinds of things you can do with computational systems and AIs to just solve for that.
So certain things, for example: a doctor who's overworked, tired, they've got a lot going on, and they have to click on a drug's name in an EHR.
[720] And some of those drugs, they're very long, complicated names.
[721] The doctor knows them well.
[722] Can I read a few?
[723] I wrote them down from the book.
[724] Yeah, read them.
[725] This doctor would be probably checking between two boxes.
Novolin or Novolog.
Vinblastine or vincristine.
Hydroxyzine or hydralazine.
I mean, the differences are nearly imperceptible when you see them written.
[730] Yeah.
[731] So there's way too many drugs.
[732] They're too closely named.
And doctors' scrawls are just that.
They're terrible, compounding everything.
[736] So they're clicking this on a screen or someone's reading the scrawl, right, or something like that.
[737] So those are all very human kinds of errors.
[738] But to a computer, to a machine, it makes no difference.
[739] Those things aren't close to them.
[740] They have a different unique ID.
[741] And with AI, the really important thing is that if it's fed so it knows what this patient is being treated for, and then they click a very similar name, but a compound that's actually for the treatment of some other disease that the patient doesn't have, it's very easy for them to flag back to the doctor.
[742] Did you actually mean this drug that is for psoriasis?
[743] Or did you mean this drug that is actually for cardiovascular disease?
[744] There's a simple thing to solve.
A heartbreaking example in the book of a young boy who's supposed to get prescribed...
An asthma medication, and he got a blood thinner or something?
He got a blood thinner.
[748] A horrible example.
But so there's this system called MedAware that is a companion for doctors, double-checking, co-piloting all of these prescriptions.
[750] And obviously their success rate has got to be in the high 90s as far as detecting that stuff.
[751] Yes.
[752] If you're using this kind of system, then you'll be able to lower those medical errors dramatically.
Because for those types of errors, again, computers, for lack of a better word, think differently than we do.
They're incredibly good at large-number processing, and it's not that they never make mistakes, but they very, very rarely do.
And so you take the capabilities of humans and the capabilities of computers, which are so radically opposed, so radically different.
Since they're so orthogonal to each other, you can marry those together and you can reduce errors to a huge degree.
[757] And you can see the numbers on that.
[758] What can the person do?
[759] Do you hear that word?
[760] No, I liked it.
[761] Which one did I use?
Orthogonal.
[763] Sorry, that's a very common word in science.
I got to get hip to orthogonal. Right angle, right?
[765] I like it.
Orthogonal, I love that.
[767] But AI, obviously, is a huge conversation right now.
[768] In every industry.
[769] Every industry, it's a big one right now in our industry.
Our WGA writers' strike is going on right now, and AI is a big component.
A lot of AI issues are a topic of conversation.
[772] So I can answer right now for the writers what they can do that the AI can't.
[773] So I just want you to be honest, what can the human do?
[774] Why do we still need the human?
[775] Yeah, we all had that moment.
[776] I think the first time you used chat GPT.
[777] Yeah, scary as hell.
And it just rips out all this text in nothing flat.
[779] I'm so excited about this technology, though, I have to say.
My experience with it is you can get it to write things for you, especially when you get into GPT-4 and the more advanced models, that are pretty accurate.
[781] But what I've never seen it do yet is to really surprise me. So it gives you a very good description of what's out there and it will condense it.
[782] It can write at a very high level.
I very often give it things that I write and ask it to make them a little better, or spruce them up, or critique them, or tell me what it thinks is wrong, or act as an editor that receives this as a letter and respond: would you accept this?
[784] Yeah.
[785] And actually, I love doing that because it'll write me an acceptance or a rejection letter to different things and give me reasons.
[786] That's pretty useful.
[787] Yes, big time.
[788] There's all kinds of stuff you can do.
[789] What I think it puts a premium on now are a couple of different things.
[790] So one is really unique and original thought.
[791] And so we have to be thinking very creatively, humans.
[792] Yes.
[793] Because you have to add value to the corpus of anything that's out there.
[794] But everything that used to be, like I'm just going to write up this blog that's going to say the same kind of thing that everyone always says, but to a different audience, I think that's going to go away.
[795] Yeah.
[796] But in medicine specifically.
One thing in the chapter that's interesting is that the AI, currently at least, has some predictable issues.
[798] One is it has this hallucination phenomenon.
[799] That's a function of what it's trained with.
If you give it the internet, it's going to hallucinate all over the place.
[801] Yeah, yeah.
[802] It sees conspiracy theories.
[803] It has a sweet spot.
[804] When it cycles through the same command, like over 100 times, it starts doing some weird stuff.
[805] But the primary thing that we can still do, which is really astounding, is we expect the unexpected.
[806] We do great with novel information that doesn't exist anywhere else in its previous learning.
[807] We're really adaptable when we see something new and curious.
[808] Is that still our strength that this brain does somehow shockingly well?
[809] Absolutely.
The other element that's really important is that the AI is only as good as the knowledge it's trained on and what exists.
So the big areas of value are, one, to have corpora of text that are really trusted.
You do what's called vectorization, and you vectorize against it, which means that you're letting your AI go, but you're telling it: here's a library of things that we've vetted that are true.
[813] Only answer if you can get the answer from this set.
[814] And that set keeps getting bigger and bigger.
[815] And then the second is to have data types.
So as we get into this kind of information, where we're measuring things out of our microbiomes or our blood, all these elements, you have a lot of information about that person, and you can contextualize that into the system, so that when you're going into a large language model, you give it all this background context about who this person is and what their problem is.
[817] What are they really trying to solve?
[818] So that gets you into a certain location.
You then pull in and load into its short-term memory, right, into its context, pieces of information from the general knowledge in the scientific and medical literature that's relevant to that, and then you let it do the personalization of the dialogue back to the patient, or the individual, or to the doctor if they're still in the loop as an intermediary, depending on how sensitive it is.
[821] That really is a system that you can get a lot of delivery for.
And the really amazing thing, though, is that we can take all this information from genetics, the microbiome, the blood measures, your own individual digital twin, whatever it is, and ChatGPT and so forth can actually give that back to an individual in plain English.
[823] This world just didn't exist a very short time ago.
[824] Yeah, okay, so now a couple of really fascinating things on the horizon with it.
I'll give kind of an inane example.
So they have created algorithms that are specifically excluded from identifying race, let's say to set an insurance rate policy.
[827] But the AI can figure out race, quite simply.
[828] There's enough data points that ultimately it does figure out race without, quote, figuring out race.
So that's a bad and nefarious example.
[830] But I'll say the good part of that, the AI, and this gets into the deep reasoning future of the AI, is it will be able to see patterns that we would never see.
[831] It will be able to link variables that we would never think to.
[832] We get caught in thought paradigms.
[833] We're in one.
[834] We're in the find it and fix it paradigm.
[835] We're trying to get into the next paradigm.
[836] So we have thinking paradigms that the AI doesn't have, and it will ultimately start exposing to us some patterns that we won't understand.
[837] This is where it gets really curious.
You parallel this with aspirin in the book: aspirin works, but really we don't know why.
[840] We just know it works.
And there's a whole host of things that we know work.
We don't really know why they work, and we accept that and run with it.
[843] That's our future.
[844] The AI is going to basically say, eat a grapefruit with two scoops of sugar and an aspirin and jump nine times.
[845] And that's going to work.
[846] And you'll be great.
[847] Yes.
[848] And we're just going to have to do it.
There really is an important point, which is that you need to know the race when you're treating patients.
[850] Surely, surely, yes.
And the most striking example I've seen is looking at four different races, at populations in each that have two bad copies of the Alzheimer's gene.
[853] The Japanese are three and a half times more likely to get Alzheimer's than the Caucasians.
[854] Caucasians are twice as likely to get Alzheimer's as the African Americans, and the Hispanics don't have to worry about being homozygous at all.
[855] Really?
[856] These are really striking differences.
[857] Where it gets complicated is in the future, we're all going to be mixtures.
[858] Truly, yeah.
And then we have to deconvolute the genes that are associated with those kinds of differences for each individual and not for each race.
All these little things, yeah, are medically relevant, but probably not when giving someone a loan.
[861] Yep.
[862] Yeah, absolutely.
[863] Yeah, definitely not.
Or assessing what kind of a risk they are.
But in some ways it can help too, because there's a big issue with black women not getting the care that they need in medicine.
[867] And so in some ways, AI is helpful there because they're not going to have prejudices.
[868] They can have prejudices.
[869] Curiously, it all depends on what the goal of the algorithm is.
[870] Well, I don't think they're going to, like, not listen.
[871] There are two really distinct issues about prejudices.
One is the bias from the data; that is, most of our data is Caucasian.
There is a big bias.
The other is the bias of the algorithm writer.
[875] And those are two different things you have to think about independently.
[876] And it has been interesting.
I think you were pointing towards this too, because initially it was exactly what you said.
[878] We thought, let's get rid of all the racial information.
[879] And then we'll just have an unbiased algorithm.
[880] And it's not true because it learns these things from other factors.
[881] Yeah, it goes through your Amazon music history.
[882] It can do that.
[883] And it goes, oh, this person listens to only R &B.
[884] Okay, that ticks, you know, like it figures it out.
It's like when people try to build political models; it's like, what are your five favorite TV shows?
[886] You list them.
92%, I think it is.
[888] You could predict like how someone votes.
[889] Yeah, absolutely.
[890] And so you want to keep all of that messy information in.
[891] You're not training on it exactly, but it's the kind of thing that clues you into that you're making a mistake.
[892] And these AIs, especially in the early days, can make the kind of mistakes that humans would never make.
[893] One of the really famous early examples was it got into a healthcare system and just said, how do we stop people from getting cardiovascular disease?
And it went through and came up with this 99% accurate way to do it.
[895] Super amazing.
[896] What do you do?
[897] You just give carcinogens to everybody.
[898] Nobody dies of heart disease because they all die of cancer.
[899] Oh my God.
[900] So we never implement it because we understand it.
So the problem is when you get into this area of, okay, AI is going to tell us stuff where we don't understand what its motives are, especially as you look into the future, what is its capacity going to be to explain things to us?
[902] Because you look at things like the large language models.
And it is quite good at that: you can give it a lot of information and tell it to summarize for you.
So I'm a little more optimistic than I used to be that we won't get stuck in only black-box AI, because we may not be able to understand or wrap our minds around the patterns, but the hope is that we can at least probe and have described to us, at least at the level of our intelligence, what's going on.
Once it was pointed out, we would fathom it at least to some degree, you know.
Yeah, but a great example of its success is a study that showed urinary bladder tumor analysis could be performed with AI at an accuracy rate of 93%, which is pretty mind-blowing.
[905] Yeah, these systems, at least in certain areas, are getting to the level of humans and even in some cases better, especially like the medical errors and so forth that we talked about.
Especially in imaging, where they can actually take in far more detail than a human can.
[907] Right.
[908] There's a lot more data in that image than we can detect.
[909] Stay tuned for more armchair expert, if you dare.
[910] One of the limitations to AI has really been that it just doesn't understand anything outside of its very narrow context.
[911] It doesn't understand anything about the world.
[912] The large language models, you know, they simulate text.
[913] That's it.
[914] They don't actually understand anything about what's around.
Now, they do bring in some pretty interesting capabilities, though, because even though they don't understand context, we can feed them all kinds of information that we have derived as humans about context.
[916] And you can start to surround that around things like an imaging model.
So the imaging models in the past would only be able to deal with what they saw.
[918] But you can now feed them a lot of information about what's going on in that patient's life, what are other things that are happening.
And it can start to learn patterns, at least out of the language that it derives, that are related then to that.
[920] So there is an ability for it to start to have access to more and more domains.
[921] Opinions.
[922] Yeah, some of them are just opinions, and we have to guard against that.
[923] I'd like to take just a few minutes and talk about the stories we've received from the data about what wellness and prevention are going to be in the future.
[924] Yeah, tell me. So the big vision for what Nathan and I are really pushing is a very simple idea.
We now have the means to freely follow your health trajectory in terms of data, to assess where you are, and to optimize your health.
[927] That's it.
And the idea is we'll follow for each person their health trajectory all the way across.
[929] And health really means three things.
[930] One, it means wellness.
[931] And the really important thing from this data -rich health we're talking about now is we can optimize enormously your wellness wherever you are.
And we would guess the typical person might be at 20 or 30% of their potential, even if you're pretty healthy.
But you can really go far higher.
The second thing is we can now identify the point of the wellness-to-disease transition, that earliest detectable point.
[935] And the reason that transition is important is we think we can do it for all chronic diseases.
[936] And if we can reverse it when the disease is early and simple, you'll never get a clinically manifest disease.
[937] Wouldn't it be cool if we could block most chronic diseases in the next 10 years or so?
[938] And the third aspect of health is how we treat disease and we can use precision medicine and things like that.
But what Nathan and I did was start, in 2015, a company called Arivale that was to bring scientific wellness to consumers, and we collected 5,000 people over a four-year period.
[941] And the results have been spectacular.
[942] We've probably got 30 papers now that have come out of analyzing that dataset.
In the 5,000 individuals, we saw in 167 of them the transition from wellness to disease.
And the first of the patients to do so, about two years into the study, was someone we'll call Eve, and she was diagnosed with pancreatic cancer, very late stage.
What we did is we went back to bloods that we'd drawn prior to the time of her diagnosis and showed that in all four of those draws, up to two years out, she had five proteins that were expressed at very high levels that the normal population didn't express.
And three of those five mapped into disease-perturbed networks common to late-stage pancreatic cancer.
[947] So these are like precursors of it?
[948] Right.
[949] We had 35 people transition to cancer.
[950] We looked at 10 others, and for each, we saw the same kind of thing.
With the million-person project, we'll get 200,000 transitions.
[952] So the cool thing is we'll have markers identified for the transition to virtually every chronic disease there is.
And we'll begin to learn how to reverse the disease at that early stage.
So this is a big start to saving money, as well as transforming the quality of life for those people.
[956] Okay, so in this utopian system that is coming, what is the current hurdle?
It would seem to me that the AI would need a lot of data from you, biometric data.
[958] Gathering data on well people so we can see the transition.
[959] That's the issue.
[960] That is the real issue because essentially no people out there have the kind of data gathered on them automatically that you need.
[961] Are they wearing devices that help with this?
[962] The devices probably won't help with that.
[963] I mean, Nathan might argue with me. Right now, they have no clue as to early transitions.
Sure, I guess my question to you is, let's say there's a $9.99-a-month app I can get.
[965] That's going to monitor all this stuff.
Do I have to go to a Quest lab and give my blood every three days?
[967] My question is, are we around the corner, or is that still the stumbling block?
[968] We invented a device for this called the Wondra.
[969] It was actually named two weeks ago as MedTech's medical device product of the year, which is pretty cool.
[970] Congratulations.
[971] Thank you.
[972] Did Elizabeth Holmes also win one of those?
[973] Yeah, I really try to follow that.
[974] They've withdrawn.
[975] They took her Heisman back?
[976] She left an opening.
[977] I tried to learn a few lessons for the doctor.
[978] Yeah, yeah, with some don'ts.
[979] So first, do science.
Yeah, make sure it works.
[981] Don't use Siemens products and put your name on them.
[982] Pretend they're not Siemens products.
[983] Pretend they're not.
[984] But this is a device that goes onto your arm, and then you push a button, it creates a vacuum suction, you push another, and then it will pull blood out into a cartridge.
[985] And then you can put that into a sleeve.
[986] Now, once you do that, it's not considered a biohazards.
[987] You can ship this in the mail.
[988] Okay, okay.
And so we just announced a partnership on this, where we'll be generating what's called metabolomics data.
[991] So this is thousands of small molecules out of your blood that you can get off a device that's mailed to your house.
[992] It's essentially painless.
[993] On a scale up to 10, people rate it between a zero and a one.
[994] There's two needles that will get the blood.
[995] But when you pull the cartridge out, they go into the device.
So it becomes its own sharps container, which you can just throw in the garbage.
[997] My Lord.
[998] So it's super easy.
[999] This is approved for direct -to -consumer use already in Japan and Brazil.
[1000] There's a few other countries online.
And then in the U.S. it's approved with a coach supervising.
[1002] And we're expecting and hoping that it will be approved for direct -to -consumer here very shortly.
[1003] Oh, my God.
[1004] Okay, so once people have that...
[1005] That's part of it, because now you can get access to a lot of the data types we're talking about, whether it's small molecule metabolites or proteins, and you can get them out of the blood.
[1006] And since you mentioned Elizabeth Holmes and Theranos, so what Theranos was trying to do is they were trying to take conventional medical tests and then miniaturize them.
[1007] Yeah.
[1008] And people have failed on that trajectory quite a lot.
[1009] What we're doing is something a little bit different.
We're taking what are called omics tests, which we do all the time in science, and they're only ever done on very small amounts of blood.
[1011] And so we're not actually changing that technology at all.
[1012] This is how you always do them, but it becomes an information problem.
[1013] But in the era of AI and where we're at now, I'd much rather have an information problem of how do I read all these small molecules and figure out what they say about health than trying to solve a mechanical issue that people have failed out a bunch of times.
Because to have that thorough of a screening today, conventionally, could a doctor even order that test?
[1015] And does it go to a special lab?
[1016] Do most labs even measure these?
[1017] So these metabolites, these small molecules, are mostly done in the research world.
[1018] These are just now starting to hit into clinical practices.
[1019] And we're going through a variety of different approvals on that as we develop those tests out.
[1020] But this allows you to get access to a much broader range of potential information.
I was in a meeting with someone who had just stepped down as the CEO of one of the largest health companies in the world.
[1022] And we were chatting about this.
[1023] And one of the things that he said, as we were going through this, is the only reason that we do lots of the tests that we do now is only because of historical chance.
If you were creating medicine now, you would take all these omics measures, and that would be the substrate that you would use in order to identify what was going on in your body, because you're going from your doctor measuring 10 things out of your blood to having thousands of measurements, and you can do so for about the same cost.
[1026] And so that is a capability that's really coming.
[1027] Switching gears from the blood, the other really big element that I highly recommend for people's health is to look at the microbiome.
[1028] I don't know if that's a familiar topic.
[1029] We're super interested in it.
[1030] And yeah, to me, I'm like, how are they measuring that?
[1031] How do they get a sample?
[1032] Do we know what the composition is supposed to be?
[1033] How do we alter it?
[1034] It's supposed to be a panacea?
[1035] Nothing is a panacea, for sure.
Sometimes we get excited about things, but no panacea, no. But it is really important.
[1037] So one of the things you mentioned was how do you get a sample?
[1038] One of the elements that we invented over the last year or so was a new way to get access to a microbiome sample.
[1039] So have you done your microbiomes before?
[1040] No, but I can assume.
No, but I got obsessed with doing a fecal transplant for a while.
[1042] We were deciding in our friend group who was going to be the provider of the stool.
[1043] That's right.
[1044] And it caused some interpersonal issues.
[1045] Of course, who we thought was the most healthy.
[1046] You know, that's very complicated.
[1047] You've got to go through a lot of people before you find a good stool source.
It varies state to state, yeah.
[1049] There's some states that are very open to it and others that aren't.
[1050] So most people have not done their own microbiome.
[1051] And one of the hurdles to that is because you can just imagine the logistics of having to do this.
[1052] You get a kit, but you have to poop in a bucket or on a piece of paper.
[1053] You got to get a little shovel, scoop it up.
[1054] You get your hands dirty.
[1055] Some of the tests require freezing.
[1056] And I don't know what you keep in your freezer.
[1057] Not sure.
[1058] I'm probably not sure.
[1059] Yeah, we'll try to avoid it.
[1060] It's probably true.
It's bound to happen, but, yeah.
[1062] So we invented something called the microbiome wipe, which is basically what it sounds like.
[1063] So it's made out of a special polymer.
It's a special toilet paper, space-age toilet paper, and you wipe like you do every day.
Yeah, until there's blood, if you're me. Just a side note.
[1066] Oh, you got to throw that one out.
Yeah, with the blood you'll get too much human DNA.
I'm going to give you an extra sample; you'll get it from me. Anyway, you drop it in a vial, you close it, you shake it for about 10 seconds.
[1069] The wipe will dissolve away.
[1070] Uh -huh.
And then you push a button, it releases a salt solution, and it will preserve the DNA.
[1072] And we show that we could get just as high quality sequencing.
We published this in Frontiers in Immunology last year to show that we could do it.
[1074] And so you can get it.
[1075] Wow.
[1076] I will admit that I have a version of this for both of you if you wanted.
[1077] Oh, my God.
[1078] I'll give it to you after.
[1079] I want all this stuff.
[1080] But I do have, I brought you each a kit just in case.
[1081] Oh, my God.
[1082] In case you were interested in this.
[1083] And then wait, do we mail it in somewhere?
You just mail it off.
[1085] You mail it to each other.
[1086] And identify what's going on.
[1087] It has a package.
[1088] So it just goes in the mail.
[1089] And then it will populate all the data back to you on your computer.
[1090] So I'll just get it back to you.
[1091] So then my next question is, am I wrong in that there's still so little known about the microbiome?
[1092] There's some interesting links to some chronic diseases that stem from there.
[1093] It's really evolved a lot in the past few years.
[1094] So if you go back like five years ago, we were doing microbiomes, but you had to squint a little, right?
[1095] You're like, okay, there's some information.
[1096] But now there's actually quite a lot you can say.
[1097] You can get information about the gene content, whether or not you are making too much ammonia, for example, in your gut.
[1098] And why would you care about that?
[1099] Because if you have certain species in there making ammonia, that will cause your stomach to be less acidic, won't break down food as well.
[1100] And you can see if that's in your gut.
[1101] A bunch of your neurotransmitters are made in the gut.
[1102] Your serotonin's made.
Serotonin, 95% of it is made in the gut.
[1104] That matters a lot.
There's a bunch of vitamins and nutrients that your gut will create, and you can look at the gene content. So the way we do sequencing now, you get all the genes of the different microbes that are in there, and we know which genes encode synthesis pathways. And so if you go in there, like the last one I did, I didn't have any of the synthesis pathways for some of the vitamin Bs, and it turned out I was low on those. My physician had told me that a year before, and she had given me some shots for it to make it better. But after I did the gut health test, I thought, oh, okay, I actually don't have the bacteria in there to do it.
[1106] There's risk factors like TMAO is a risk factor for cardiovascular disease, but you only get it if you have certain bacteria in your gut.
Because if you have certain bacteria in your gut, they will eat L-carnitine, which comes out of red meat, and they'll change it into something called trimethylamine, and then trimethylamine gets converted in your liver to TMAO, which is a risk factor for cardiovascular disease.
[1108] Coming back to one of the compounds I talked about being really important before, which is phosphatidylcholine for your brain.
[1109] Those same bacteria will eat phosphatidylcholine.
They'll turn it into trimethylamine and again into TMAO.
[1111] So if you have these bacteria, you could be trying to help your brain.
[1112] So you could be supplementing, which we think can make a big difference into the health of the brain.
[1113] But if you have bad bacteria, instead, you could be causing problems.
Fueling a different disease, yeah.
[1115] You're going to win the Nobel Prize.
[1116] Oh, my God.
[1117] Our first Nobel Prize?
[1118] No, we've had one, but not pre, I don't think.
Okay, we're pre.
[1120] This is a pre.
[1121] That would be a big of paper.
[1122] I don't even know what that means.
[1123] It's an incredible future that feels right around the corner.
[1124] Are you pissed off, Lee, that this is all happening right now for you?
[1125] Not at all.
[1126] You're not.
[1127] So let me tell you about one more factor, and then I'll tell you why I'm not pissed off.
[1128] Okay, yeah, yeah, yeah.
So in the Arivale population, we had people from 21 to 90-plus.
We were able to develop an algorithm looking at how your ability to control the expression patterns of blood analytes changed.
[1132] The algorithm basically gave you your biological age.
[1133] It's the age your body says you are as opposed to what your birthday says you are.
And the lower the biological age is relative to your birthday age, the better you're aging.
[1135] Okay.
So we looked at the Arivale women and found that they lost a year and a half of biological age per year they stayed in the program.
[1137] So six years lost in four years.
[1138] Really striking.
Men did 0.8 years, about half as much, roughly.
[1140] And Nathan actually had his biological age go down 10 years during that time.
[1141] He was 71 when they did it.
[1142] Yeah.
[1143] Lee and I went to high school together.
[1144] I don't know if you know that.
Biologically, they were cohorts.
[1146] But the other thing we were able to show is people that had diseases like type 2 diabetes always had biological ages higher than their chronological age.
[1147] And what's really cool about this is the algorithm for biological age uses metabolites.
[1148] And for the individual, they give us clues as to how you can optimize aging.
So the reason I'm confident is my biological age is about 15 years below my chronological.
[1150] Congratulations.
So I have plenty of time to see all of these come to fruition.
[1152] I often float this at dinner parties.
[1153] If I happen to be born at the right time to witness us arrest aging and stop it, I will become very, very suspicious that we are in a simulation.
What would the odds be, 300,000 years into our run as a species, that us four were born in a time where we might watch this happen?
[1155] Very suspicious.
[1156] Are you suspicious?
[1157] Nathan.
[1158] You've hit on one of my favorite pet topics.
[1159] Oh, good!
[1160] Good.
[1161] I'm very suspicious about this.
[1162] So, for example, in quantum mechanics, there's these weird things with the effective observer.
And so if you don't watch it, if you look at the slit experiments and so forth, it's this fuzzy wave; only if you watch it does a particle go through one at a time.
[1164] So if you're a programmer and you're hacking up a simulation, you're going to say, I'm not going to bother to calculate every freaking particle unless someone's watching.
[1165] And if they're watching, I'll calculate.
But if not, I'm just going to make it a wave function because it's just simpler.
[1167] Uh -huh.
[1168] So that makes a lot of sense.
And so there's, of course, the famous argument about if you consider any advance at all in the ability to do simulation, then it eventually becomes the same as reality.
[1170] So what's the likelihood that you're in base reality instead of one of the multiple regressions?
[1171] We're in a model right now.
[1172] We're in a model right now.
[1173] It's here to deal with climate change, perhaps.
[1174] So I don't take any of this very seriously, but I do like to think about it because I do find some of the aspects of quantum mechanics really bizarre.
[1175] They kind of do make sense to me a little bit.
[1176] If it's a simulation.
[1177] But would they program it in for us to even be able to have these conversations to question it?
[1178] Yeah, because we have to run free for it to work.
[1179] It probably depends how much you paid for your simulation package.
I always imagine waking up and it's just like, okay, did you enjoy your ride to Earth?
[1181] Yes.
[1182] And you can't afford it again for another 500 million years.
[1183] But obviously, you're paying for a simulation whereby you live for eternity, which is what it's going to give you in about 13 years, you're going to shut all your aging off and you'll be frozen at that age for eternity.
[1184] And at that point, you should go, yeah, duh, I'm in a fucking simulation.
[1185] They have never done this before.
[1186] Why am I here during this time?
[1187] But since we really don't know, I'm trying to work on long jazz.
[1188] I like the ride.
[1189] I would like to take out as long as possible.
[1190] I guess I'm just asking, Lee, you've seen so many insanely revolutionary breakthroughs since you were born in, what, 1939?
[1191] 38.
[1192] We just interviewed Jane Fonda and she was born in 37.
And she, like you, is intimidatingly sharp and cogent and linear.
[1194] What my feeling is about biological age is, how do you feel?
[1195] I look at my 85 -year -old colleagues and I think I'm in better shape.
I do 100-plus push-ups every day and sit-ups and all of those things.
[1197] So those do make a difference.
[1198] There's no question.
Now, making us immortal is quite a different question.
Slowing the aging process down, I think, is much less of an ask.
But if you start with the simple notion of, okay, mitosis makes perfect replicas of each cell, it's conceivable that we could make an identical mirror image of ourselves to infinity.
There's this curious mechanism that makes us not make perfect mirrors.
[1204] Many, many mechanisms.
[1205] Yeah, I'm sorry, yes, yes.
But in concept, if a cell divides and replicates itself perfectly, there's no reason that aging is implicit.
[1207] Well, let me tell you another really interesting aspect about aging.
[1208] We studied maybe 10 years ago 18 people that were 115 or older.
[1209] Oh, my goodness.
[1210] The genetics were absolutely fascinating about genes that might correlate with this longevity, but the population size was too small.
The really important thing I learned, though, is once you get to be 100, most people die very, very rapidly of a complete systems failure.
[1212] And the classic way is you fall, you break a hip, that's it.
[1213] Six weeks later, you're dead.
[1214] So my objective now is I want to slow the aging process so that people's health span can get them into the hundreds, and then they're kind of on their own, right?
[1215] Yeah, yeah, yeah.
The argument that people spend 25% of their health care dollars in the last six months is just terrifying, and this ensures that it's not going to happen, if that's a general rule.
Well, it's an incredible future. I'm very excited to watch it unravel, and a bit terrified for all the reasons everyone else is. But The Age of Scientific Wellness: Why the Future of Medicine Is Personalized, Data-Rich, and in Your Hands is a great starter piece for everyone to read to understand where we're headed, and it is very exciting. Every time we get doomsday about AI getting rid of every job and relegating us to a leisure class, what that looks like, all that's scary.
[1218] But then I always say to Monica, but AI might cure cancer like next week.
Like once it turns on and it starts learning and it has data, like, literally it could be next week.
[1220] And this disease could fall next.
[1221] And I think last time we talked about it, I said, yeah, but I don't care or something.
[1222] And now I do.
[1223] I do care.
[1224] We got you.
[1225] Yeah.
[1226] Yeah.
[1227] If we're going to let you live to be 100 and be productive.
[1228] excited about life.
[1229] Exactly.
[1230] I actually think we're entering by far the most exciting period ever in biology, and it's because of AI.
[1231] Because I don't think we can actually solve the really complicated issues, like can you reduce the error rate of replication across your entire body?
[1232] I don't think we get there without AI.
[1233] So you actually have to have this massive acceleration of AI in order to even start contemplating some of the really big challenges that we're just not even within reach before.
I'm trying to expand my thinking on: can we actually go to places that were impossible before, and how fast, and how do we do it in a way that's safe and ethical? It's incredibly complicated.
[1235] We're not even getting into the geopolitics of it all.
[1236] Like, we have to do it.
[1237] Yeah.
[1238] We can't be third.
[1239] We don't really have.
Or even second.
[1241] You got to be first.
[1242] Yes.
[1243] So it really makes the decision for you.
[1244] And it's here.
So I hope everyone reads The Age of Scientific Wellness: Why the Future of Medicine Is Personalized, Data-Rich, and in Your Hands. Gentlemen, it's been so nice to have both of you. I'm so glad you both came in person. It's such a pleasure to meet you. Lee, incredible work. It's been a wonderful conversation; I really enjoyed it. Okay, wonderful. We will wipe with these products and we'll report back on the fact check. We'll report back on the fact check. Well, we wipe and mail in. Okay, great, great, great. Yeah, don't just flush it down the toilet; don't just wipe. Now off to the fact check. I don't even care about facts, I just want to get into your pants. Did you have F1 this morning?
I did. This is my second shirt.
[1248] I'm embarrassed to tell you.
I had to just now go in and change.
You spittled?
No, I sweat so profusely through my back this morning.
Yes, at 8:30 a.m. Almost to where you might think, is there a metabolic issue going on here? Ding, ding, ding, ding. Tell me. It's for today's guests, Lee Hood and Nathan Price, those medicine guys.
[1252] Yes, metabolic.
[1253] Yeah.
[1254] Yeah, we like that word.
[1255] But anyways, I made a fucking mess out of my first shirt this morning.
[1256] I mean, like someone threw a bucket of water at my back when I was walking as a prank.
[1257] That's so weird.
[1258] I know.
[1259] Because it's early and it's not hot.
[1260] No, I don't know.
[1261] And I feel like you probably weren't like, well, I guess I don't know because I don't know about that far.
[1262] Tell me. But I feel.
[1263] Ask some questions.
[1264] Okay.
I feel like often you'll see sweat if you're, not nervous, but you're interviewing a person; you know, you have to be on in a different way and have questions ready. And I feel like F1 would be different because it's your friends, but I guess you do still... Well, it's my friends, but one of them's calling from France over the internet, another one's in Orlando at a hair show over the internet. So a lot of, you know, internet. Sure. Orchestrating two different Zoom calls, and then I'm you in F1. So it's like, you right now are holding the facts.
[1267] You've got like the game plan in your lap.
[1268] Yeah.
[1269] And so on F1, I'm you, as I just said.
[1270] I go by Monica on F1.
[1271] Oh my God, cool.
[1272] Even though it's called F1 with DRS.
[1273] I think I might need a little bit of a taste of it.
[1274] Yeah, if you're going to be using my name.
[1275] Your likeness.
[1276] Yeah.
[1277] Yeah, I go by Monica Padman.
[1278] But so that there would be no legal issues, I go by Monica Leland Padman.
[1279] Oh, you should have gone by Monica Silly Padman.
[1280] Oh, that would be really good.
[1281] Is that a known joke that?
[1282] No, I just made it up.
[1283] I just made it up.
[1284] Okay.
[1285] Well, yeah, so maybe I was like, you know, I'm trying to run.
[1286] There's a structure that I guess I'm loosely in charge of.
[1287] Okay.
[1288] Well, then that is like armchair.
[1289] Yeah.
[1290] Okay.
[1291] It all makes sense.
[1292] And then even weirder, because I don't, there's no time to edit.
[1293] Oh, so you have to pay.
Another layer is, like, I want it to go perfectly.
[1295] Oh.
[1296] That's probably where the stress comes up.
Sweat comes.
[1298] Yeah, this is a flop sweat.
[1299] How was your sexy weekend?
[1300] It was really nice.
[1301] You had a sexy girl's trip to a sexy hotel.
[1302] We can talk about it.
[1303] First, I want to talk about something else.
[1304] Oh, okay.
[1305] I'm drinking a tea in here.
[1306] Yeah.
[1307] Which I normally arrive with my tea.
[1308] Mm -hmm.
[1309] But today I brought my tea bag.
[1310] I didn't have time to make it.
[1311] Well, it turns out I did, but I read the schedule wrong.
[1312] Okay.
[1313] It updated this morning.
[1314] Oh, okay.
[1315] Well, obviously.
[1316] I'm not.
[1317] Yeah, text would have been nice.
Well, I assumed you saw it because I hung out here till like 10:04.
[1320] No, I literally walked up.
[1321] I checked my phone.
[1322] I walked up the stairs at 10.
[1323] Oh, really?
[1324] Oh, you know what it is?
[1325] My watch is like four minutes ahead.
[1326] Her watch is on Miami time.
[1327] Well, no, no, my watch right now says 23 after the hour.
[1328] Okay.
[1329] Yeah.
[1330] Which I don't really know what it is after the hour.
[1331] 20.
So I'm running three minutes ahead.
[1333] That at least explains it.
[1334] But I didn't think of it when I looked at my watch.
[1335] So I was like, oh, she must have seen the schedule changed.
[1336] I didn't.
[1337] Which is fine.
[1338] but I brought my tea bag instead.
[1339] And then when I got here, I was like, well, it'll be fine.
[1340] Because, you know, I'm in the middle of a whole issue with oat milk.
[1341] Because I'm trying not to drink it as much because of the gums and the oils.
[1342] What?
[1343] Now oat milk's bad for us?
[1344] Right.
[1345] Certain oat milks, I guess, or lots of these products, not just oat.
[1346] They have gums and oils, and I guess it's bad.
[1347] Chewing gum?
Xanthan gum.
It has xanthan gum in it?
[1350] Oh, my gosh.
[1351] And others.
[1352] And I guess it's bad.
[1353] And the one I love the most, obviously, it has it.
[1354] Of course.
[1355] Well, it's probably what makes it taste good.
[1356] It is.
Like, they looked at me so weird.
Like you were doing something out of character.
[1359] Yes.
[1360] They're like, I guess I've never seen you chew gum.
[1361] And I said, I love gum.
[1362] You know what, Monica?
[1363] Same.
[1364] So I love gum.
In high school, I had a Ziploc gallon-sized bag hanging from the hook inside my locker, and it was filled to the very brim with all the different cheaper gums: the Doublemint, the yellow, wintergreen, Big Red.
They'd come in a 10-pack assortment, and I'd get a bunch of them and take it all apart, and it would just be the sticks of gum.
[1367] Oh, wow.
[1368] So you wouldn't know what you were getting.
[1369] No, you would because the, oh.
[1370] Sometimes you don't know.
[1371] You're right.
[1372] It's all aluminum.
[1373] I don't know how I knew that.
[1374] Gum roulette.
Well, at any rate, I chewed gum nonstop from sixth grade until, I don't know, my 20s?
[1377] You were probably chewing it when I was born.
[1378] Yeah, blew a big bubble on your birthday.
[1379] Yeah.
[1380] August 24th, 1987.
[1381] Yeah, I remember that day, actually.
[1382] Summertime.
[1383] Big gum day.
[1384] Extra, I had an extra slice of gum in there.
[1385] It blew a big old bubble.
[1386] Yeah.
[1387] And I had chewed so many different flavors that it turned the gum brown.
[1388] Yeah, you just shoved it all in at once.
[1389] And it turned it brown.
I blew a brown bubble.
[1391] Oh, okay.
[1392] Yeah, I remember it.
[1393] Yes, ding, ding, ding.
[1394] Sweet.
[1395] Yeah.
[1396] Did you ever chew it so much?
[1397] There were a couple gums that if you chewed the same stick so much, it started to disintegrate.
[1398] Fucking hate that kind of gum.
[1399] It was so weird.
[1400] Hate that kind of gum.
It seemed to track with the sugar-free variety that did that.
But there are some sugar-free ones that do. Like, you know what I really like as an adult?
Because those were all Wrigley products, the Wrigley gum that I was doing.
[1404] Trident.
[1405] Oh, I love Trident.
[1406] It's very stable gum.
[1407] And Orbit.
[1408] Orbit's great, too.
[1409] I love the bubble gum flavor.
[1410] And you get the little plastic container.
[1411] It's not good for the environment, but boy, it's fun to rattle.
[1412] But that's a good...
[1413] Trident holds up under pressure.
[1414] Yeah.
[1415] Stable.
[1416] Very stable gum.
[1417] Well, six out of seven dentists recommend Trident.
[1418] Yeah.
[1419] I think even eight out of seven.
[1420] No, it's...
[1421] There's one dentist that's holding out.
[1422] He's holding out.
He's in the pocket of Orbit.
[1424] Okay, I got off track.
[1425] So, oat milk.
[1426] So I'm having trouble.
[1427] I'm in a phase.
[1428] I'm in a new era.
[1429] Can I ask if you read this somewhere or did someone whisper this to you?
[1430] You've got to be careful.
[1431] I know.
[1432] I saw someone on Instagram who I follow, who I trust, who was going through this themselves.
[1433] They were on a war path?
[1434] Well, they were posting about different, like they were trying different oat milks.
[1435] They were in my position.
[1436] They were sad to learn this information.
[1437] So they were testing out different oat milks.
Then I did do some research, and it's like, oh yeah, I guess it's bad.
[1440] I mean, I never really found out why it's that bad.
[1441] See, that's what we got to be careful about.
[1442] These things just travel really quickly.
[1443] You know, did you hear what's bad?
[1444] No, what's bad?
[1445] Apples are bad.
[1446] There you go.
[1447] They taste bad.
[1448] They taste delicious.
[1449] They're so crisp.
[1450] It's nature's toothbrush.
[1451] It hurts your teeth.
[1452] Like if you're in a pinch here somewhere and you're like, fuck, I could really go for a brushing of my teeth, but I don't have any of my equipment.
[1453] The next best thing, in my opinion, is eat a couple apples because it brushes your teeth as you're eating it.
[1454] Yeah, crunching and it's real flat.
[1455] It washes your teeth really well.
[1456] But I think it hurts your gums and stuff.
[1457] No, makes mine feel powerful.
[1458] Okay.
[1459] Energized.
[1460] Anyway, so I bought an oat milk that only had three ingredients.
[1461] Oats, water, nothing.
[1462] You know, like, yeah, probably.
[1463] And it tastes like shit, right?
[1464] Yes.
[1465] Yeah, of course.
[1466] It just tastes like water.
[1467] From an oatmeal.
If you overfill your oatmeal, my Bob's Red Mill in the morning, I've got it dialed because I eat it 100 out of 100 mornings.
I eat Bob's Red Mill.
[1470] But sometimes I pour a little too much water in it from the Insta Hot.
[1471] And I got to like strain it out.
[1472] And sometimes I've sipped it out.
[1473] Is it good?
[1474] No, it's not good.
[1475] I mean, the product's great.
[1476] But that water, just oat water, no. I know.
[1477] It needs that cream.
[1478] He needs the gum and the oils.
[1479] So anyway, so then I switched back to cows milk.
[1480] There we go.
[1481] But I don't like that either.
Poodie booty issue?
[1484] Yes, like not great for my body.
[1485] I'm only using a teeny bit.
[1486] It doesn't feel great in my body and it is too rich.
[1487] Oh, wow.
[1488] Too creamy.
[1489] Yeah.
[1490] My gosh.
[1491] Anyway, so I've decided if I'm out and about and I'm getting a coffee or something, I'm allowed to have gums and oils.
[1492] Oh, cool.
If I'm cooking it, if I'm making it at my house, I have to either do the water oats or the cow's milk, which is my normal morning. But if I happen to be out, or it's like a second treat, yeah, I can have gums and oils, and today. That would be a good clothing line, Gums and Oils. Yeah, right? Yeah, sounds good. Oh, maybe I'll try to get in touch. Well, not like I haven't been trying to get in touch with Mary-Kate and Ashley, but maybe I'll try. Have you actually been trying to get in touch? Well, I've tried. Yeah, you've DMed them? They're not even on... No, they're not on Instagram, but I've tried to get to their people.
[1494] Okay.
[1495] We probably don't even have any people.
[1496] Exactly.
[1497] They're so elusive.
[1498] But you know who's friends with them?
[1499] Who?
[1500] Jennifer Lawrence.
[1501] Oh, okay.
[1502] And I feel like we could get to her.
[1503] Oh, wow.
[1504] This is, you generally want to go to somebody low status as the gatekeeper.
[1505] You don't want to go to someone that's like the highest status.
[1506] Like, step one, we've got to become best friends with Tom Cruise.
[1507] That'll lead us to his stunt double, Terry, but I'm having a hard time getting his number.
[1508] No, how dare you connect Terry?
[1509] Although Terry's great, but he's the best.
He's not the same as Mary-Kate and Ashley.
But, no, I feel like if we had Jennifer Lawrence on and we did a good job, she then maybe recommends us.
[1512] Or, you know, we could do Elizabeth Olson.
[1513] Oh, wow.
Now this is getting very Machiavellian.
[1515] She's very close.
I just started a show that she's in, Love and Death.
[1517] It's supposed to be great.
[1518] It's great.
[1519] It's great.
[1520] Yeah.
[1521] Okay.
[1522] We love you, Elizabeth Olson.
[1523] Feel free to come on and then.
[1524] Also, if it goes well, I don't want to use her for that.
[1525] That's rude.
I think I watched that show with a particular interest in her because I know her sisters, and they're seemingly much different.
[1527] Yeah.
[1528] And that's fascinating.
[1529] Elizabeth is much more, she does interviews and she does stuff.
[1530] Like, she's out there.
[1531] I wonder if she feels weird about the row.
[1532] Like, am I supposed to wear that every day?
[1533] I love it, but am I supposed to wear that every day?
Or am I not ever supposed to wear it, too?
[1535] My guess is the latter.
[1536] Identity.
[1537] Yeah.
[1538] I mean, just imagining growing up in the shadow of those two.
[1539] Also, like, Matt's supportive.
But think about this too.
[1541] And by the way, this is me speculating.
[1542] I have no inside information.
[1543] This never came up when I was dating.
[1544] Okay.
A.O. You called her that?
[1546] Occasionally, yeah.
[1547] You did?
[1548] Yeah.
[1549] It's pretty good, right?
[1550] Wow.
[1551] What does that mean?
[1552] What does that mean to you?
[1553] I don't know.
[1554] It just really took me by surprise.
[1555] Okay.
[1556] So I don't know.
[1557] know this at all.
[1558] But if you're the younger sibling, that's already hard.
[1559] The older siblings, they're using you only when they're bored.
[1560] The older siblings are always the best ones.
[1561] I totally disagree as a middle child, but regardless, they only hang with you when they're bored out of their fucking mind.
[1562] Now add in the older sibling has an identical twin, which no two people can be as close as identical twins.
[1563] Exactly.
[1564] So I imagine that's got to compound the estrangement.
[1565] Yeah.
[1566] But again, I don't know that.
[1567] I don't know anything about her.
[1568] I don't know if we ever even talked about her younger sister.
[1569] That's even worse.
[1570] Well.
[1571] That proves it.
[1572] Because I'm sure she talked about Mary Kate.
[1573] Well, Mary Kate was just around.
[1574] Exactly.
[1575] Oh, wow.
[1576] Okay, no. Now we're into theoretical.
[1577] I want to be very clear that.
[1578] I don't know anything about it.
[1579] No, I know.
[1580] We don't.
[1581] So.
[1582] Okay.
[1583] So your weekend.
[1584] Let's talk about your weekend.
[1585] Oh, wait, but I didn't finish.
[1586] Oh, okay.
[1587] So this morning, when I brought my tea bag, I thought, oh, it'll be okay because I bought oat milk a while back for the attic, the one I like with gums and oils.
[1588] Since I didn't have time to make it at my house, it was a good excuse for me to have gums and oils.
[1589] You know, yeah.
[1590] But then when I got here, it's expired.
[1591] Who gives a fuck?
[1592] No, I'm too scared of that.
[1593] Really?
[1594] Oh, I don't care at all about expiration dates.
[1595] I put it there so I can throw it out.
[1596] I'm going to put it back in the fridge.
[1597] Then I'm going to drink it tomorrow.
[1598] I'm going to bring in some cereal.
[1599] I have a big, big bowl.
[1600] But we had some other thing in here, a nut pod.
[1601] Oh.
[1602] And so I'm trying a nut pod.
[1603] It also has gums and oils.
[1604] Yeah, well, all the good stuff.
[1605] But that's fine.
[1606] But it's vanilla flavored.
[1607] So now I'm drinking a vanilla flavored tea.
[1608] Oh, yeah.
[1609] Which is interesting.
[1610] Mm-hmm.
[1611] That is interesting.
[1612] Why are oils bad for you in milk?
[1614] I don't know.
[1615] It's the added oil.
[1616] It's not like the natural oil from cow's milk.
[1617] He's added oil to his coffee.
[1618] Well, not olive.
[1619] Olive's good.
[1620] Premium oil, right?
[1621] These are like synthetic oils.
[1622] Olive oil's premium oil.
[1623] It is.
[1624] Okay.
[1625] All right.
[1626] Now your weekend, which you're so reluctant to talk about.
[1627] We finally got to my nut pod.
[1628] Yes.
[1629] It was a long walk to the nut pod.
[1630] It was.
[1631] Yeah, we went to Calamigos, friends from home.
[1632] How many girls total?
[1633] Five girls total.
[1634] Okay, FGT.
[1635] Oh, you're really into...
[1636] Something's happening today.
[1637] It's part of the sweating and the metabolism, I think.
[1638] It's like that thing Ricky does.
[1639] Yeah, it's happening to me. Cool.
[1640] Yeah.
[1641] I'd like you to get better at that, actually.
[1642] Me too.
[1643] It's a fun party trick.
[1644] It is.
[1645] But five gals.
[1646] Yeah, and we went to Calamigos Ranch in Malibu.
[1647] Were there any bachelors there on a golf trip or anything?
[1648] No. It wasn't for a wedding.
[1649] I feel like you're thinking of it more as a wedding.
[1650] I'm thinking of Bachelorette party.
[1651] I know, which it's not.
[1652] But it always is.
[1653] If you get five gals together, sure, one of them's out of commission with a child.
[1654] That's clear to everyone.
[1655] At this point, she's close to delivering.
[1656] Yeah.
[1657] But the other four, virtually a Bachelorette weekend.
[1658] Sure.
[1659] That's the vibe.
[1660] When you see five gals at the pool at a vacation destination and you're there on your golf trip, that's going to pique everyone's interest.
[1661] Okay.
[1662] That didn't happen?
[1663] No. No, it didn't.
[1664] Yeah, we were there for kind of a faux baby shower, just an excuse for everyone to be together before Callie has her baby.
[1665] And so it was really nice.
[1666] What are the highlights?
[1667] Anything exciting?
[1668] A lot of pool time, we went to the beach.
[1669] They had a beach club, so we went to the beach.
[1670] We went to Nobu.
[1671] Oh, my God.
[1672] Yeah.
[1673] That's a pricey restaurant.
[1674] Yeah.
[1675] That might be the priciest restaurant.
[1676] No. It's not the priciest restaurant.
[1677] I've seen some bills there that absolutely staggered my imagination.
[1678] Okay.
[1679] I mean, it's not cheap.
[1680] It is expensive.
[1681] There was five of you.
[1682] Uh-huh.
[1683] And how much was the bill?
[1684] $1,600?
[1685] No, no. It was like $500.
[1686] It was.
[1687] Is that possible?
[1688] People were not drinking?
[1689] Only two of us had one drink.
[1690] Oh, interesting.
[1691] Well, how do you explain that?
[1692] Well, we've been drinking all day.
[1693] Yeah, everyone was already shit -faced.
[1694] Well, we just, it wasn't that we were shit -faced.
[1695] You guys were trying to sop up the booze from the day.
[1696] We had been drinking all day.
[1697] It was 8:15.
[1698] Not hard.
[1699] In the morning, you started drinking?
[1700] No, no. It was, we started drinking probably around noon.
[1701] That's pretty late.
[1702] I mean, again, it's not a bachelorette party.
[1703] I think it's a bachelorette party.
[1704] You want it to be, but it wasn't.
[1705] But we were drinking consistently at the beach.
[1706] Yeah, of course.
[1707] Was there service at the beach, or did you have to lug everything?
[1708] There was.
[1709] It was cool.
[1710] You could order drinks right to your chair?
[1712] Oh, my God.
[1713] And they set up the chairs.
[1714] Like you're on vacation at a real place.
[1715] Yeah.
[1716] Oh, my God.
[1717] I think it is.
[1718] It's weird when you go in town.
[1719] Yeah, true.
[1720] I had a similar, out-of-the-ordinary but in-town trip of my own.
[1722] Tell me. So Lincoln and Kristen went to see Lizzo in Palm Springs Friday night.
[1723] Oh.
[1724] And they spent the night there, which left Dee Money and I to have a special date night.
[1725] Oh.
[1726] That we started planning a couple days beforehand.
[1727] And on Friday morning, as I was taking my morning evacuation, she came in and handed me a piece of paper with a list of everything we were going to do.
[1728] Oh, it's an itinerary, right.
[1729] That's what they're called.
[1730] Thank you.
[1731] And we nailed it.
[1732] So she got out of school.
[1733] Oh, she specifically wanted to be in the 454SS.
[1734] Oh.
[1735] Because she really likes going fast in that truck.
[1736] She loves the zero to 60 in it a lot.
[1737] She wants to race.
[1738] She's always telling me to race people and beat this guy.
[1739] And she loves it.
[1740] She's not like that.
[1741] I know.
[1742] And it's that truck in particular.
[1743] She loves that truck.
[1744] Okay.
[1745] Which I love because I also love that truck.
[1746] You do.
[1747] And no one likes it when I fucking gas it except for her.
[1748] So we took that truck to In-N-Out Burger Friday evening.
[1749] Your favorite spot.
[1750] Yes.
[1751] Parked.
[1752] What did you get?
[1753] Two double-doubles protein-style, animal-style fries.
[1754] She got a cheeseburger.
[1755] This is, I'm glad you asked this.
[1756] She got a cheeseburger and fries.
[1757] Yeah.
[1758] and a Coca-Cola, which she's never gotten.
[1759] Oh.
[1760] Yeah, they don't drink soda.
[1761] Right.
[1762] But this was a special date.
[1763] Oh, yeah, yeah.
[1764] So she had a Coca-Cola, but she only had like three sips.
[1765] And then she said, I need to go get a glass of water.
[1766] And I was like, well, she's just a good girl.
[1767] So she tried doing Coke, but she ended up wanting water with it.
[1768] Anyways, we left there.
[1769] We took the scenic route downtown, so through all the city streets, lots of peeling out, lots of flooring it, and lots of racing.
[1770] We were having a great time.
[1771] And then we went, this is fancy.
[1773] People are going to be triggered.
[1774] We went to the Ritz-Carlton in downtown L.A. Because we wanted to have views of a city.
[1775] We wanted to be high up.
[1776] That was the prerequisite.
[1777] We wanted to be high up in downtown L.A. to look out over the city at night.
[1778] So I got a corner room.
[1779] Oh, you got a hotel.
[1780] Yes.
[1781] Yes.
[1782] We went and checked into the Ritz-Carlton.
[1783] Oh, my God.
[1784] Uh-huh.
[1785] And.
[1786] Oh, this is so cute.
[1787] It was so cute.
[1788] And upon check -in, we were told, if you want popcorn, we have free popcorn tickets.
[1789] You just take them across the street to the movie theater.
[1790] And there's free popcorn, yes.
[1791] So we're excited about that, but we get up to the room and we look out the windows for a long time where we see the observatory.
[1792] We're figuring out where our house is compared to that.
[1793] Yeah.
[1794] And then we were also on a club lounge.
[1795] What's that?
[1796] The club floor.
[1797] What's that?
[1798] They had a room.
[1799] It's like the club room on the 23rd floor.
[1800] And if you're on that floor, you get, like, just this big buffet, a bunch of different things.
[1801] In the room?
[1802] No, no, no. It's like.
[1803] Oh, I know what you mean.
[1804] Yes, a club.
[1805] There's a club.
[1806] Like a lounge.
[1807] Up there.
[1808] Only for the people on that floor?
[1809] On the club floor.
[1810] Maybe for everyone.
[1811] I don't know.
[1812] They told us that.
[1813] That happened to me in that horrible hotel in Austin.
[1814] So it's not always good.
[1815] Isn't it?
[1816] It's not always good.
[1817] So don't get that upset.
[1818] Everyone would calm down is what you're saying.
[1819] This one was spectacular.
[1820] Oh, no. I mean, great.
[1821] Mind you, we just had In-N-Out.
[1822] Yeah.
[1823] So we get in there and there's macaroons on display, and it's Delta's dessert time.
[1824] Fired up.
[1825] So she starts pounding some macaroons.
[1826] She finds maybe a cookie she likes.
[1827] We're sitting on this couch.
[1828] We're chatting with other guests that are there.
[1829] Then we go downstairs.
[1830] I'm like, let's go walk around L.A. Live.
[1831] We're like right next to this outdoor area.
[1832] Go down to L.A. Live.
[1833] We walk around.
[1835] We see some guy leaving the bar hammered, which I think is the first time that she's seen that. I called it out.
[1836] I go, oh, my God, look how drunk this guy is.
[1837] And so he comes out of the bar and he's, like, swaying, doing the gator walk.
[1838] And then he finds a little, like, power box that someone's sitting on.
[1839] He immediately leans into that and is touching the other person, like, no boundaries.
[1840] And Del's like, wow, he's really, like, that really got her.
[1841] Then she got scared.
[1842] Yeah.
[1843] Because now we're walking on the street with this guy.
[1844] Yeah.
[1845] And I'm like, oh, no, no, you don't need to be safe.
[1846] This is very benign.
[1847] He's just drunk.
[1848] Watch, this is fun.
[1849] What, just watch him?
[1850] He bumped into a lot of things.
[1851] He then crossed the street, walked on the sidewalk, walked directly into this huge fence, and fell down to the ground.
[1852] And then got up and walked away.
[1853] And at that point, we were both laughing.
[1854] He was far enough away that it was just amusing.
[1855] Look at this adult acting like a baby.
[1856] Yeah.
[1857] So that was educational.
[1858] Then we get our popcorn.
[1859] And we stay in line for a very long time.
[1860] It's Friday night at the Regal cinema, right?
[1861] So we probably dedicate like 35 minutes to these two bags of free popcorn.
[1862] Wow.
[1863] Which I like, I like.
[1864] Oh, wow.
[1865] Because it's already too entitled, right?
[1866] We're at the Ritz.
[1867] Okay.
[1868] But now we had to really earn the free popcorn.
[1869] You're right.
[1870] And then we get the popcorn.
[1871] Then we go back to the club level.
[1872] And then there's more eating and more eating.
[1873] And then we get to the room and she had prepared a surprise for us.
[1874] She unloaded her backpack and she had a fresh tube of Pringles, sour cream and onion.
[1875] Family size.
[1876] This is her whole luggage: a family-size Ruffles sour cream and cheddar, and then two Almond Joys, because she knows she and I both like Almond Joys. And she had secretly gone to Target with TT the day before to prep for our special night.
[1877] Monica, it was heaven. And then we got back to the room with all these snacks. Now we're eating, we're just eating so much, and then we're, like, staring out the window at all the lights. And we spent a long time looking out the windows, and it totally delivered. It was like everything I'd hoped.
[1878] And I felt like we were in a completely different city.
[1879] Yeah.
[1880] It was so fun.
[1881] Wake up in the morning, I had a room credit of $100.
[1882] So we ordered room service.
[1883] And with the tip, it was exactly $100.
[1884] That was perfect.
[1885] That's exciting.
[1886] She had French toast, brioche French toast, something, something, something.
[1887] And I had had some bacon.
[1888] She didn't have that with her French toast.
[1889] I said, oh, I noticed this morning when I got coffee in the club lounge, they had a huge tray of bacon.
[1890] So we finish that thing, then go into the club, have another breakfast.
[1891] Oh, wow.
[1892] Yeah, she has cereal there.
[1893] She has bacon.
[1894] She's already had French toast.
[1895] Oh, my God.
[1896] She'd already had In-N-Out.
[1897] She had the popcorn.
[1898] She had the 95 things.
[1899] And then we were going to go to the Aces' championship baseball game.
[1900] Uh-huh.
[1901] And I said, I'm going to take a poop, you know.
[1902] And I took one.
[1903] Yeah.
[1904] And I was like, you got to take a poop?
[1905] No. And at this point, I'm looking at her little body.
[1906] And I'm thinking how much food is inside of it.
[1907] And I'm starting to get very nervous.
[1908] Yeah.
[1909] That's a lot of food in that three-foot, three-and-a-half-foot body.
[1910] She didn't seem to care.
[1911] And we went to the baseball game.
[1912] Aces won, 13 to 2, won the championship.
[1913] It was wonderful observing that.
[1914] Yeah, it was a hell of a date night.
[1915] I love that.
[1916] It was great.
[1917] She did kick me a ton in the middle of the night.
[1918] She woke me up all night long.
[1919] She kicks like a mule.
[1920] It's like a crocodile.
[1921] That's a very special memory.
[1922] Oh, it's so fun.
[1923] She'll remember that forever.
[1924] I hope.
[1925] Yeah, she will.
[1926] You just don't know these things.
[1927] No, that's a core memory.
[1928] That's very, very sweet.
[1929] You're a good dad.
[1930] Oh, thank you.
[1931] I'm just a selfish dad.
[1932] It was so fun to go out and just, you know, spoil someone you love.
[1933] Yeah.
[1934] That was that.
[1935] Great.
[1936] I love that story.
[1937] All right.
[1938] Well, let's move on to some facts.
[1939] Okay.
[1940] Okay.
[1941] All right.
[1942] This is for Lee Hood.
[1943] Uh-huh.
[1944] who is, you know, one of the fathers of the Human Genome Project, very legendary, honored to have him here.
[1945] Also, I want to say, our second, well, he was 84, so he wasn't quite 85, but our second mid-80s guest in the last month, and they were both sharper than you and I. Definitely.
[1946] Markedly so, yeah.
[1947] Yes, it's incredible.
[1948] It's very encouraging.
[1949] I hope I get more and more sharp.
[1950] I can already feel myself getting less sharp.
[1951] Like that turn has happened.
[1952] No, you can't.
[1953] Yeah, I can.
[1954] You have?
[1955] I think.
[1956] Really?
[1957] Where do you feel it?
[1958] Because I know where mine is.
[1959] Everywhere.
[1960] Okay.
[1961] I guess just communicating has gotten harder.
[1962] Harder.
[1963] You can't find the words you want?
[1964] Not always.
[1965] That's where I feel it.
[1966] But I also wonder if before I just wasn't searching for those types of words, what we're talking about now is more...
[1967] That's possible, but you and I interviewed someone on Friday that we had interviewed three years beforehand.
[1968] So I thought, oh, I'll listen to that previous interview so I don't go over the exact same ground, because it's hard to remember what you talked about three years ago. And I think in listening to that, I was like, yeah, you were much sharper three years ago. I thought I could hear the verbal dexterity was much higher than...
[1969] I don't think so. I don't think so. All right, I'll stop worrying. Okay. But Lee, bona fide genius. Yeah. And Nathan also, I called it, he's going to win the Nobel Prize. Yeah, you made your first Nobel prediction. Um, have you used your wipe yet?
[1970] I haven't.
[1971] I don't even know if I've brought it out of the attic.
[1972] Oh, no, it's sitting right here.
[1973] Okay.
[1974] I could use it right now.
[1975] Oh.
[1976] Do you want me to do that live?
[1977] No, it's got to be after a...
[1978] Yeah, I think you need a fresh poop.
[1979] Yeah, right now it's spick and span back there because I just used the Brondell.
[1980] Yeah, no, you have to do it after a poop.
[1981] A deuce.
[1982] Okay.
[1983] Have you used yours?
[1984] No. And when I was editing this, I thought, oh, I need to do this before the fact check.
[1985] Yeah.
[1986] But then I started my period.
[1987] And I don't think I can do it because what if some period gets on there?
[1988] Cross contamination.
[1989] They might think you're dying.
[1990] There's blood in your stool.
[1991] Yeah, and like eggs.
[1992] They might send an ambulance directly.
[1993] Yeah, so I have to wait.
[1994] Okay.
[1995] I had a foray into eggs this weekend, by the way.
[1996] You know, I don't eat eggs.
[1997] Right.
[1998] I was thinking, you know, I only ate eggs with toast.
[1999] I've never eaten eggs without toast.
[2000] I always make a sandwich and dip it in the yolk.
[2002] That's why I like it.
[2003] So I just, that crossed my mind at the Ritz.
[2004] Yeah.
[2005] So guess what?
[2006] On Saturday morning, I ordered eggs, just no bread.
[2007] Yeah.
[2008] And I ate over-easy eggs.
[2009] And it was fine.
[2010] And it was delicious.
[2011] And I felt nothing.
[2012] And then yesterday morning I made myself another round of over-easy eggs.
[2013] Oh, my God.
[2014] Oh, also I cooked my first tray of cornbread, gluten-free cornbread muffins.
[2015] I made those this weekend.
[2016] Whoa.
[2017] They were delicious, and they're gluten -free.
[2018] Yeah, nice.
[2019] Yeah, and you can make those muffins in 90 seconds.
[2020] Oh, what do you mean?
[2021] You must make muffins and stuff.
[2022] I actually don't make very much.
[2023] I made blueberry ones not that long ago, but I don't make muffins that often.
[2024] Yeah, I got the Bob's Red Mill.
[2025] This sounds like one long ad for Bob's, but it is the truth.
[2026] I got the Bob's packet.
[2027] I dumped that in there.
[2028] Then I got to use a cup and a half of milk, two eggs, and then butter.
[2029] Oh, okay.
[2030] Swirl that around.
[2031] That took me five seconds to swirl that around.
[2032] And then I just dumped it in a fucking pan.
[2033] And I threw it in the damn oven.
[2034] And 23 minutes later, I had these delicious corn muffins.
[2035] So did you put them in muffin tins or you just did it like corn bread?
[2036] A loaf.
[2037] Yeah, a loaf of bread.
[2038] Okay.
[2039] So anyways, then I got to dip my eggs that I made with that.
[2040] And everything's hunky-dory.
[2041] So I'm slowly, I'm dipping my toast back in the egg waters.
[2042] I can't believe you never put two and two together that it was because you ate it with toast.
[2043] Silly, right?
[2044] Yeah, because you've been missing out on so much.
[2045] I love eggs so much.
[2046] I love eggs.
[2047] Oh, they're so good.
[2048] I want some now.
[2049] Now, why don't you get HomeState?
[2050] What happens there?
[2051] The best breakfast tacos.
[2052] Oh, and they get it with a corn tortilla.
[2053] Yes.
[2054] Okay.
[2055] So good.
[2056] And they make their own flour tortillas, too.
[2057] He can't have that.
[2058] I can't have the flour tortillas.
[2059] Have you not been listening to the previous seven minutes of this conversation?
[2060] The flour, though.
[2061] Callie gets it all the time.
[2062] She gets it with corn.
[2063] She does.
[2064] Ooh, I want to try that.
[2065] It's so good.
[2066] You do have a signature egg dish.
[2067] Yeah.
[2068] And it doesn't have toast.
[2069] Which made me realize I should start eating that again.
[2070] And you think you can.
[2071] I think I can, yeah.
[2072] I don't know.
[2073] I'm going to keep pushing and see if I have any issue.
[2074] I love that.
[2075] Yeah.
[2076] My famous egg dish, so people know, is a scramble with shallots and kielbasa.
[2077] Yeah, it's really good.
[2078] And a mix of the Mexican blend cheese.
[2079] Cheese, yeah.
[2080] It's very, very good.
[2081] TBD, but I'm definitely interested and I'm pursuing it.
[2082] That's a big deal.
[2083] Okay.
[2084] Rates of tuberculosis.
[2085] Where were we?
[2086] Oh, yeah.
[2087] Rates of tuberculosis.
[2088] TB is the 13th leading cause of death and the second leading infectious killer after COVID, above HIV and AIDS.
[2089] Wow.
[2090] Mm-hmm.
[2091] In 2021, an estimated 10.6 million people fell ill with tuberculosis worldwide: 6 million men, 3.4 million women, and 1.2 million children.
[2092] These are just all people who have not been vaccinated.
[2093] Well, he said something about a resistant strain.
[2094] Really?
[2095] Yeah.
[2096] My stomach just growled.
[2097] Oh.
[2098] He wants eggs.
[2099] It's ready for eggs, you know.
[2100] In 2021, eight countries accounted for more than two-thirds of global TB cases.
[2102] India, 28%.
[2103] Oh, sorry, India.
[2104] Indonesia, 9.2.
[2105] China, 7.4.
[2106] The Philippines, 7. Pakistan, 5.8.
[2107] Nigeria, 4.4.
[2108] Bangladesh, 3.6, and the Democratic Republic of the Congo, 2.9.
[2109] By the way, huge train accident.
[2110] I know.
[2111] And I thought, of course, I think of you right away.
[2112] You thought I was there?
[2113] No. You should have checked in and made sure I wasn't there.
[2114] Well, I already knew you were on a bachelorette party.
[2115] Sexy bachelorette party.
[2116] But I am curious, do you think your parents are naturally more interested in that than others?
[2117] I would assume that, but then also something happened a few months ago when I was home, a boat situation happened in Kerala, and people died.
[2118] And I said, hey, do you hear about this?
[2119] And they said no. And then they didn't seem to like think about it or care.
[2120] But I think they should care more, because they might know someone. Yes, from Kerala. Exactly, but then they didn't. Maybe they thought, we'd have heard by now, someone, you know, someone they knew. But it's like, if a tornado happens in Georgia, I care more than if it happens in Kansas. Of course!
[2121] You connect with it I just want to know is everyone okay that I know Yeah, yeah And then I get sad for...
[2122] Also, it just means you have more context for it, if it rips through Duluth and you find out the Arby's is gone.
[2123] Your Arby's being gone is much different than Arby's you've never been to.
[2124] Yeah, it's true.
[2125] It almost feels impossible, the number of fatalities.
[2126] I know.
[2127] How many now?
[2128] It was three trains that intersected.
[2129] The last I read was 300, but I think it was growing.
[2130] Yeah, it's so devastating.
[2131] Transition us into another disease.
[2132] That's how you lift us out of this.
[2133] Yeah, 280 people killed, over 1,100 injured.
[2134] 1,100 injured. Well, this is a ding ding ding, because my grandma, when I was in, I guess I must have been sixth grade or seventh grade, I used to take the bus to their house. Okay. To Savannah? No, no, they moved to Atlanta before I was in sixth grade. Okay. And they moved into the neighborhood so they could help with us. So I took the bus to their house, and I was there after school every day.
[2135] Yeah.
[2136] And the school bus?
[2137] Yeah, school bus.
[2138] Yeah, not a city bus.
[2139] Correct.
[2140] My grandma, we thought, had tuberculosis.
[2141] When?
[2142] During, in, like, sixth grade.
[2143] I think maybe they went to India and they came back, and then they thought she had it.
[2144] I don't know if she sort of tested positive or something, or the first round tested positive.
[2145] and we all had to isolate.
[2146] What's ironic, she had to wear masks, and it was so sad, weirdly.
[2147] And then we all had to get tested.
[2148] And then it turns out she didn't have it.
[2149] Good.
[2150] Yeah, I think that's good.
[2151] I've had to take many TB tests because I've taken different immune suppressants.
[2152] And they want to make sure you don't have TB before they put you on one because that's the disease that'll get a stronghold if you weaken your immune system.
[2153] Makes me think, oh, a lot of people carry it and don't know they carry it.
[2156] You know, or there wouldn't be this test.
[2157] Scary.
[2158] Okay, the book, Empty Planet, that he references, it's by Darrell Bricker and John Ibbitson, Empty Planet: The Shock of Global Population Decline.
[2159] Came out in 2019.
[2160] Okay, so he said there's no country that's gaining population in a striking fashion.
[2161] And now I'm forgetting if he said developed country or not, because there are countries that are gaining population, but they're more developing countries. Yeah, he must have meant that. Yeah, or said it, maybe he even said it and I just forgot. Although then we had a guest since then who tried to contradict that. Well, on this, this is Pew. We trust Pew. A very trusted brand. Yeah. By 2100, geez, that's hard. That's only, but it's only 77 years from now.
[2163] I know.
[2164] But it's a weird thing to say.
[2165] What do you say?
[2166] Like, it's 2100.
[2167] Yeah.
[2168] That sounds weird.
[2169] Okay.
[2170] 2100, that doesn't sound much better.
[2171] Yeah.
[2172] In the year 2100.
[2173] In the year 2100.
[2174] I don't like either.
[2175] In the year twenty-one hundred?
[2176] Too much.
[2177] I like 93, 85, 86.
[2178] We just had it down to two numbers.
[2179] That was best case.
[2180] But now it's still okay.
[2181] 2022, 2023.
[2182] Yeah, 2024.
[2183] 2020, but 2100.
[2184] Mm-hmm.
[2185] No. That's bad.
[2186] Twenty-one-aught-aught.
[2187] It's bad.
[2188] That's a little bit away.
[2189] Okay.
[2190] 2100.
[2191] By 2100.
[2192] Five of the world's ten largest countries are projected to be in Africa.
[2193] Five of the ten largest countries are projected to be in Africa?
[2194] Population -wise.
[2195] Wow.
[2196] Well, okay.
[2197] In 1950, let me read the list.
[2198] Okay.
[2199] Yeah, do it.
[2200] China.
[2201] This is in descending order from the most populous to the least.
[2202] Correct.
[2203] Okay.
[2204] I'll do top five.
[2205] China, India, U.S., Russia, Japan.
[2206] 1950.
[2207] 2020, China, India, U.S., Indonesia, Pakistan.
[2208] Okay.
[2209] Bye-bye, Japan.
[2210] And Russia.
[2211] They've had a declining birth rate for quite a while.
[2212] Exactly.
[2213] Mm-hmm.
[2214] 2100 prediction, India.
[2215] Oh, still.
[2216] Rainier.
[2217] No, China was.
[2218] Love China.
[2219] Okay, yeah.
[2220] So India, China, Nigeria, U.S., Pakistan, then the Democratic Republic of the Congo, Indonesia, Ethiopia, Tanzania, Egypt.
[2221] Oh, wow.
[2222] They have U.S. hanging around.
[2223] They're at four, yeah.
[2224] Because we are in a declining population as well.
[2225] I know.
[2226] With our own birth rate.
[2227] When we were on our sexy baby shower, bachelorette party, it was not very crowded there, which was nice.
[2228] But everyone who was there was pregnant.
[2229] Really?
[2230] We saw so many pregnant women.
[2231] Oh.
[2232] And I thought, oh, we had just talked about this.
[2233] And I don't know.
[2234] So.
[2235] Okay.
[2236] That's where we're going to be out.
[2237] Well, the U.S. is still in the mix, though.
[2238] Still in the mix in 2100.
[2239] Okay.
[2240] Now, according to Johns Hopkins, another trusted brand.
[2241] The most trusted.
[2242] What is my cancer risk if I test positive for a BRCA mutation?
[2243] According to Johns Hopkins.
[2244] 12% higher?
[2245] 20% higher.
[2246] It says it is high.
[2247] Having a BRCA mutation means you have a likelihood of 45% to 85% for developing breast cancer in your lifetime, along with a 10% to 46% chance of ovarian cancer.
[2248] Whoa.
[2249] Yeah.
[2250] That's not great.
[2251] No. And that's not what we were saying at all.
[2252] No. Although the only thing that might obscure that is the notion that, like, all men will get prostate cancer.
[2253] Maybe, like, if you live long enough, you're going to have breast cancer.
[2254] Like, that's just a given.
[2255] But not everyone has the BRCA mutation.
[2256] This is just if.
[2257] No, I know.
[2258] I'm just wondering if it's only ticked up from, like, maybe the range is 20 to 60 percent, but with the BRCA variant, it's 40 to 85.
[2259] Does that make sense?
[2260] Like, it's really only taking you up like 30 percent more, but the base number is already so high?
[2261] It is getting higher.
[2262] That's why they've lowered the mammogram age.
[2263] To 13.
[2264] Like 40.
[2265] Oh.
[2266] I think the end of 13.
[2267] Okay.
[2268] It's getting more.
[2269] No, thank you.
[2270] I know.
[2271] I've thought about that because I have a love-hate relationship with my breasts.
[2272] You do?
[2273] What's the hate part?
[2274] Well, they're large, you know, and I don't love the way they look in clothes.
[2275] Oh, okay.
[2276] Sometimes I wish they were smaller for certain outfits.
[2277] Okay.
[2278] Outfit dependent.
[2279] Actually, often I do.
[2280] You know, because I have considered getting them reduced.
[2281] You have not.
[2282] Yeah.
[2283] In my life, I have.
[2285] Just like when I'm laying down and thinking about it sometimes.
[2286] I've never, like, gone to a consultation.
[2287] Okay.
[2288] But I have thought about it.
[2289] Oh.
[2290] And I'm not going to.
[2291] I don't think at this stage I can say. I'm probably not going to.
[2292] But then I thought, if I got breast cancer and they were gone, would anyone even like me?
[2293] Like, I started to get very scared.
[2294] Boy.
[2295] I guess it's a huge part of my identity.
[2296] Surely.
[2297] Which is not good.
[2298] Why?
[2299] Because if it does go away, like, I shouldn't be so attached to something like that physically.
[2300] Right.
[2301] You're going to be fine, though.
[2302] You'll probably keep those all the way to the end.
[2303] I don't know.
[2304] It's just weird to be attached, right?
[2305] I mean, like, men have that with their penises.
[2306] Like, when you had Patriots disease, what was it called?
[2307] Peyronie's.
[2308] Peyronie's.
[2309] Peyronie's.
[2310] Yeah, it was a big hit to my identity.
[2311] Weren't you worried?
[2312] Sure.
[2313] That if my penis didn't work anymore?
[2314] Yeah, that's a big one.
[2315] Yeah, yeah.
[2316] Again, you've got to imagine your vagina sealed up.
[2317] That would be the equivalent, right?
[2318] Yeah.
[2319] Yeah, like just became non -functional.
[2320] Right, okay, but what about if, okay.
[2321] All right.
[2322] So what about if, with the Peyronie's, it still worked, but it looked...
[2323] Crazy?
[2324] Yeah.
[2325] I would embrace that.
[2326] That'd be fine.
[2327] Okay.
[2328] Yeah, it would just need to get firm enough for action.
[2329] Yeah.
[2330] Yeah.
[2331] But they were saying there was a 50% chance it would never get firm.
[2332] Yeah.
[2333] Yeah, that's different, I guess.
[2334] At that point, maybe cut it down a bit.
[2335] Just make everything easier.
[2336] If it's not going to get hard ever again, maybe just trim it, you know?
[2337] Get less equipment down there between your legs.
[2338] It's a cumbersome setup.
[2339] It's an engineering flaw.
[2340] I mean, the most motion on your whole body is right there between your thighs.
[2341] Sure.
[2342] And that's where we put everything.
[2343] Just cram it all right there.
[2344] Yeah, that's true.
[2345] Yeah, if you had, like, hired architects and they put your full kitchen on the porch.
[2347] You're like, what on earth is going on here?
[2348] That's what it feels like.
[2349] But where else would it go?
[2350] Anywhere would be better.
[2351] My stomach.
[2352] Oh, no. We think that would look gross because we're not used to that.
[2353] But at least there's not two big thighs moving around nonstop.
[2354] But like it would.
[2355] Like when I'm running, I have a couple eggs and a sausage.
[2356] You know, like there's stuff in my lap.
[2357] But if it was on your stomach, it would hit stuff when you're walking.
[2358] Great.
[2359] You know, no, you would wear, your underwear would be around your abdomen.
[2360] What?
[2361] Yes.
[2362] To keep everything supported.
[2363] But imagine even if you're just leaning on this desk to write something, it would bump up against it.
[2364] I think the bigger liability would be if someone punched you in the stomach.
[2365] That would be a bummer.
[2366] I think about how many times your stomach hits stuff, like just walking around and like scooting by chairs.
[2367] I don't think more than your hips and your groin.
[2368] Probably more protected, though, between your legs.
[2369] Yeah, I think it is.
[2370] It's like housed in there.
[2371] It's not.
[2372] Well, I don't know where yours is at, Rob, but mine is above my.
[2373] No, it's so exposed.
[2374] When you see a male, a nude male, you go, that's preposterous.
[2375] No other animals' junk is so on display and in such a weird spot.
[2376] If it was on your stomach, it would be way more on display.
[2377] Just coming out.
[2378] Definitely.
[2379] It would.
[2380] It would.
[2381] But it would be easier to run and jump and exercise.
[2382] I don't know.
[2383] We'll have to ask someone who has it on their stomach.
[2384] Okay, we'll find someone.
[2385] Maybe instead of getting a reduction, I'll get a transplant.
[2386] I'll have it moved up to my stomach.
[2387] Okay.
[2388] For science.
[2389] Anyway, but I do think about women who have mastectomies, and I think emotionally that's really hard.
[2390] Absolutely.
[2391] God, yes.
[2392] Because it does become such a part of your identity and.
[2393] Well, and it's a marker of femininity.
[2394] Yes.
[2395] It's very much a marker.
[2396] Yeah.
[2397] Huh.
[2398] Why don't you switch to breast milk?
[2399] Have you thought of that?
[2400] Oh.
[2401] So I bet you wouldn't have any intestinal issues.
[2402] I'll think about trying it if there's extra.
[2403] I bet it would.
[2404] I'll throw it in my tea.
[2405] I'll have to check if it has gums and oils, though.
[2406] I can't have that.
[2407] All right.
[2408] Well, that's all.
[2409] All right.
[2410] Well, I'm glad you had a great weekend.
[2411] You too.
[2412] Love you.
[2413] Love you.
[2414] Follow armchair expert on the Wondry app, Amazon Music, or wherever you get your podcasts.
[2415] You can listen to every episode of armchair expert early and ad free right now by joining Wondry Plus in the Wondry app or on Apple Podcasts.
[2416] Before you go, tell us about yourself by completing a short survey at Wondry.com slash survey.