Armchair Expert with Dax Shepard
[0] Welcome, welcome, welcome to Armchair Expert, Experts on Expert in this edition, Monica.
[1] Dax.
[2] We have a very interesting human being, Dr. Eric Topol, who is going to blow your mind about technology and where we're going with medicine.
[3] It's incredible.
[4] We talked about it all weekend.
[5] Monica and I are so excited to get all these instruments he talked about because we want to give ourselves body scans all the time.
[6] Yeah.
[7] Because we're a little mini hypochondriac.
[8] Yeah.
[9] He has a new book out called Deep Medicine:
[10] How Artificial Intelligence Can Make Healthcare Human Again.
[11] I loved talking to him.
[12] And side note, his cute daughter was here.
[13] Yeah, it was so sweet.
[14] Oh, she brought us mugs.
[15] We'll get into that in the fact check.
[16] But until then, buckle up and enjoy the good Dr. Topol to learn more about AI in your health care.
[17] Wondery Plus subscribers can listen to Armchair Expert early and ad-free right now.
[18] Join Wondery Plus in the Wondery app or on Apple Podcasts,
[19] or you can listen for free wherever you get your podcasts.
[20] Dr. Eric Topol, welcome to Armchair Expert.
[21] I just want to be really clear.
[22] Is it Topple or Topol?
[23] Oh, it's Topol.
[24] It's Topol.
[25] It's rare that I do it right.
[26] It is.
[27] I'm a dyslexic left-hander, and I butcher most names.
[28] But you are, this will be maybe my longest introduction of a guest, which is exciting.
[29] You are a cardiologist, a geneticist, a digital medicine researcher. You started and founded Scripps? Yes. You're the founder and the chairman, do we say director? The director. Okay, that's great. So I had the pleasure of listening to both a TED talk of yours and then a panel you were on at Stanford discussing our future in medicine, and I immediately was interested in what a terrible job we're currently doing.
[30] Can you walk us through just how abysmal it is right now?
[31] Yeah.
[32] And by the way, can I just also add, I do a lot of work with prostate cancer awareness.
[33] So I was a little shocked when I saw some of these, you know, the screenings aren't as effective as I would have hoped as a spokesperson.
[34] So I'm sorry, please tell us.
[35] No, no, you're right on about this.
[37] So the first thing is, you know, what I call shallow medicine, we need to fess up about how bad things are.
[38] We make more than 12 million serious diagnostic errors a year in the U.S. alone.
[39] 12 million.
[40] Right.
[41] And if you're one of those 12 million people, you know, that doesn't feel good, that you have an error, either a missed diagnosis or a mistaken diagnosis.
[42] Then you have, you know, like for radiologists, when they look at films, they have a false negative rate of 30-plus percent for missing important things. Do we know, is that the quality of the imaging, or is that like people who watch security cameras for so long that it just becomes white noise? Like, what's happening there? Yeah, no, I think it's partly the latter, that is, each radiologist is reading like a hundred studies a day, and it's hard to stay focused, and even human eyes are only so good. And that's what we're going to talk about, of course: machine eyes.
[43] Yeah.
[44] But everywhere you look at it, we have problems.
[45] We have very limited time with patients so the diagnosis doesn't come to mind.
[46] Yes.
[47] And if it doesn't come to a doctor's mind within five minutes, then there's at least a 70% error rate.
[48] Oh.
[49] And the time you spend with a doctor has gotten increasingly short as the computer work has been added, basically, right?
[50] I feel like what I've observed anecdotally in my life going to doctors is, I used to go as a kid and the guy looked at me for about 20 minutes and we chatted.
[51] And now the guy looks at me for like four seconds and he's typing the whole time.
[52] Yeah, so you don't even have eye contact.
[53] The time is so limited.
[54] The squeeze on doctors and nurses, clinicians in general has been profound.
[55] And it's really detracted.
[56] And it's set up for a lot of these mistakes and problems.
[57] It's shallow medicine, the lack of being able to get your arms around the data for the person, the lack of the real contact to understand the human story of the person.
[58] And so whatever way you look at it, our screening is so incredibly dumb.
[59] We have the mammography for all women every year or two, whatever.
[60] I like to give them to Monica every 30 days.
[61] Sorry, I sexually harass my eye, I'm just teasing.
[62] 88% of women will never, in their lives, have breast cancer.
[63] And the false positive rate for these scans, whether it's mammography or PSAs, exceeds 60%.
[64] Okay, right.
[65] So I want to make this crystal clear, because as I heard you say it on the panel, I want to say there was a study that followed 10,000 women for 10 years.
[66] And of those 10,000 women, 6% had breast cancer detected from those mammograms.
[67] And yet there was, the error rate was, what was it?
[68] Six thousand of the 10,000 women.
[69] What?
[70] And out of those, you know, so many had unnecessary biopsies, chemo, it could be radiation, could be surgery,
[71] but at the very least, profound anxiety while they were getting further evaluations.
[72] So, I mean, this is the state of the art today, and it's really very shaky.
[73] Yeah.
[74] It's just error-laden.
[75] There's not enough time with patients.
[76] And we really have a broken system because we keep having more people come into the system, and our outcomes in the U.S. are among the worst of anywhere in the developed world.
[77] Right.
[78] Right.
[79] And so, again, just to kind of dig in a little bit on the mammogram thing, what would we attribute, again, is that user error?
[80] Is that poor instruments?
[81] Is it bad imaging?
[82] Is it a combination of all these things?
[83] Well, it is a combination, but the biggest thing goes back to the Reverend Bayes' theorem, which is you don't put people into a test, a screen, unless they have a risk that's increased.
[84] So we are violating Bayes' theorem, which is, you know, the same theorem of how to pick a horse in a horse race by their prior performance.
[86] And so this is that whole Bayesian logic and model.
[87] So you don't take every man to have a PSA or every woman to have a mammogram or all these other things we do for people, to people, when you don't have an established increased risk.
[88] And so we can be much smarter today.
[89] There are so many things that we can do to gauge risk and not treat every person the same.
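A note for readers: the Bayesian point above can be made concrete with a minimal sketch. The prevalence, sensitivity, and specificity numbers below are illustrative assumptions, not figures from the episode or from any particular study; they only show how the chance that a positive screen is a true positive depends on the prior risk of the person being screened.

```python
# Minimal sketch of Bayes' theorem applied to population screening.
# All numbers below are illustrative assumptions.

def positive_predictive_value(prevalence, sensitivity, specificity):
    """Probability that a positive screening result reflects true disease."""
    true_positives = prevalence * sensitivity
    false_positives = (1 - prevalence) * (1 - specificity)
    return true_positives / (true_positives + false_positives)

# Screening everyone, where the prior risk is low (assume ~1% prevalence):
print(positive_predictive_value(prevalence=0.01, sensitivity=0.87, specificity=0.90))
# -> about 0.08: roughly 9 in 10 positive results are false alarms.

# Screening only a higher-risk group (assume ~20% prior risk):
print(positive_predictive_value(prevalence=0.20, sensitivity=0.87, specificity=0.90))
# -> about 0.69: a positive result is now far more informative.
```

With a low prior risk, most positive screens are false alarms, which is the "violating Bayes' theorem" problem described above; applying the same test to a higher-risk group makes a positive result far more meaningful.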
[90] Right.
[91] So in your specialty being cardiology, right, I think as it was explained to me by my cardiologist, there's maybe five really pertinent factors in your potential to have a heart attack or this kind of issue, right?
[92] There's your genetics.
[93] There's your kind of lifestyle, your diet.
[94] Walk me through.
[95] Is it five?
[96] Smoking, diabetes, high blood pressure, obesity,
[97] and a sedentary lifestyle and family history, as you were alluding to.
[98] But those were the old traditional risk factors that were established, you know, 70 years ago.
[99] Yes.
[100] Over time.
[101] But now we can get a genetic risk score for heart disease.
[102] It's so much more precise and adds to those traditional risk factors.
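For readers curious what a genetic risk score actually is, here is a minimal sketch assuming the simple additive polygenic model commonly used; the variant IDs, weights, and genotypes are made up for illustration and are not drawn from the episode or from any real score.

```python
# Minimal sketch of a polygenic (genetic) risk score under an additive model.
# Variant IDs, weights, and genotypes are hypothetical; real scores use very
# many variants with weights estimated from genome-wide association studies.

variant_weights = {          # log-odds effect size per copy of the risk allele
    "rs_example_1": 0.12,    # hypothetical variant IDs, not real SNPs
    "rs_example_2": 0.05,
    "rs_example_3": -0.03,
}

person_genotype = {          # number of risk-allele copies carried (0, 1, or 2)
    "rs_example_1": 2,
    "rs_example_2": 1,
    "rs_example_3": 0,
}

# The score is a weighted sum of risk-allele counts across variants.
risk_score = sum(variant_weights[v] * person_genotype[v] for v in variant_weights)
print(f"polygenic risk score: {risk_score:.2f}")
# In practice the raw score is compared against a reference population
# (e.g. "top 5% of scores") and combined with the traditional risk factors.
```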
[103] Oh, so I could.
[104] So Monica and I both had 23andMe, which we found really fascinating for basically.
[105] Yes.
[106] Just to talk about it.
[107] Once I found out I didn't have BRCA, then it just became about whether I have dry or wet earwax and all these other fun things that we like to talk about, misophonia that they actually have a marker for, which blows my mind.
[108] Yeah, so we could take your 23M