Armchair Expert with Dax Shepard
[0] Welcome, welcome, welcome to Armchair Expert, Experts on Expert.
[1] I'm Monica Padman, and I'm joined by Dax Shepard.
[2] You're joined.
[3] I'm joined by Dax Shepard.
[4] Hi, Purple.
[5] You're joined by Jan Shepard.
[6] I'm joined by Jan Shepard.
[7] We have, I've been saying this about him.
[8] Tell me if you think it's too much.
[9] I would say that Daniel Kahneman is the Sigmund Freud of our generation.
[10] I think he's the most impactful psychologist of the last hundred years.
[11] I think it's accurate.
[12] That's not too much, right?
[13] I mean, I don't want to associate Freud because a lot of Freud's ideas are now proven to be not true.
[14] Right, but you have to still honor his significance to the field of psychology.
[15] In his significance, yes.
[16] Daniel Kahneman is a psychologist.
[17] He is the winner of the Nobel Prize in Economics and a recipient of the Presidential Medal of Freedom.
[18] He had a hugely successful book called Thinking Fast and Slow, and he has a new book called Noise, A Flaw in Human Judgment.
[19] And why Danny Kahneman is so fun, in my opinion, is he takes all these assumptions about how people think they think, and then he demonstrates through experiments that people don't act in the way they would imagine.
[20] We're all pretty irrational.
[21] We think we're rational.
[22] We're quite a bit illogical and irrational in the way we make decisions and judgments.
[24] It is worth saying that this is like one of the most brilliant people on earth.
[25] And it was a huge honor to be able to talk to him.
[26] And he comes up all the time.
[27] Yeah.
[28] When we talk about the narrative self or the experiential self or we talk about bias, he basically cracked bias.
[29] All of our smartest guests have brought him up.
[30] And that just kind of goes to show he's been on our list for so long.
[31] so this was very, very exciting.
[32] All roads lead back to Kahneman.
[33] Also, if you don't like this episode, or you do like it and you want more, I do want to recommend Adam Grant had Daniel Kahneman on his show as well, and it was tremendous.
[34] So you could listen to more Daniel Kahneman on Work Life, Adam Grant's great podcast.
[35] So please enjoy Daniel Kahneman.
[36] Wondery Plus subscribers can listen to Armchair Expert early and ad-free right now.
[37] Join Wondery Plus in the Wondery app or on Apple Podcasts.
[38] or you can listen for free wherever you get your podcasts.
[39] I just want to start by saying we could not be more excited to be interviewing you.
[40] We've not had a guest that more previous guests have talked about than you, period.
[41] Absolutely.
[42] Our favorite people are just obsessed with you.
[43] Let's go through the list.
[44] Sam Harris.
[45] Sam Harris.
[46] Adam Grant.
[47] Yuval Harari.
[48] All of these people, every time I bring up something, they go, oh, yeah, that's Kahneman's.
[49] You're just trying to rip off Daniel Kahneman.
[50] So we're just so thrilled, really, really from the bottom of our heart.
[51] So grateful you're joining us.
[52] Well, it's a pleasure.
[53] Where are we talking to you at?
[54] Where are you currently?
[55] I'm in New York.
[56] And do you have to be at Princeton?
[57] No, I'm retired.
[58] I'm long retired.
[59] You're long retired.
[60] But you're still busy.
[61] You're still writing books.
[62] Yeah, I'm still working. Your career is so immense that I'm having a very hard time putting it in a linear line, but I'm going to attempt to.
[63] Okay.
[64] And what I'd like to start with is just what an incredibly unique personal history you've had.
[65] I'm always most interested, not necessarily in what people study, but maybe more why they study it, why they were interested in it.
[66] And just quickly, you were born in Tel Aviv in 1934 or something?
[67] Yes.
[68] But you grew up in Paris, and this is mind-blowing to me, that you grew up during the Nazi occupation of Paris, as a Jewish family.
[69] And you had this very, very profound experience that, at least according to your lore, got you interested in psychology.
[70] And I was just wondering if you could tell Monica about that.
[71] Well, I had an experience.
[72] I don't think it turned me to psychology, but it certainly exemplified something that I was interested in as a child.
[74] I was interested in the complexity of human nature.
[75] And the example that I'm sure you want me to talk about is from when I was seven years old, living in Paris under German occupation as a Jew, and the Jews had to wear a Star of David on their clothes to be visible and identified.
[76] And there was a strict curfew.
[77] And as a seven-year-old, I was playing at a friend's, and I missed the curfew.
[78] I was too late.
[79] And so I turned my sweater inside out, and I went home.
[80] And it happened in a place that I remember distinctly. Actually, a few years ago, I went and visited, just to make sure that my memories were right, and they were.
[81] I saw a German soldier.
[82] The street was deserted, and I saw a German soldier.
[83] We were walking toward each other.
[84] and he was wearing a black uniform, and the black uniform was the uniform of the SS, and they were the worst of the worst.
[85] If you were a Jew, you knew that.
[86] So we walked towards each other, and then he calls me, he sort of beckons me, and picks me up.
[87] And of course, I'm afraid that he'll see inside my sweater that I'm wearing the yellow star, but he doesn't, and he hugs me really tight.
[88] Oh, wow.
[89] And then he puts me down.
[90] And he opens his wallet, and there's a picture of a little boy. And then he gave me some money, and then we each went our own way. And it's a story that I happened to tell.
[91] I had to write an autobiography for the Nobel.
[92] Everybody who gets a Nobel has to write an autobiography, and I wrote that incident as an example of early exposure to the complexities of human nature. And for some reason, lots of people are very curious about that episode, but it's a true incident.
[93] It happened.
[94] I don't think it's changed my life much, but it's a true story.
[95] It is an indication of how complicated human nature is because this guy, you know, he could have killed me, but he also hugged me, and he had a boy, and I reminded him of his boy.
[96] Complicated.
[97] Well, immediately I'm thinking of the many layers of in-group, out-group. Just knowing that, okay, one major in-group he was in was Nazis.
[98] You were in a very out-group, Jewish, yet his in-group of, I'm a dad and this is a little boy, is so compelling, right, that these conflicting in- and out-groups can exist all at the same time.
[99] I mean, it wasn't even conflicting, you know.
[100] I mean, Hitler liked dogs, and he liked babies, and he liked flowers.
[101] There were so many people that Hitler was very nice to.
[102] Right.
[103] So we're complicated creatures.
[104] We are.
[105] I think I probably first started seeing your work referenced when I fell in love with Malcolm Gladwell's books.
[106] And, of course, his great perversion or proclivity is to take things we think are common knowledge or just assume to be true and point out how wrong our thinking is.
[107] And, of course, this all kind of starts with you in your early work.
[108] Now, what impact do you think growing up in occupied Paris had, and then moving back to Tel Aviv just prior to the Jewish state being formed? These are pretty enormous historic events that happened in a pretty brief period of your life.
[110] Does that somehow give you the courage to think, oh, don't take any of this for granted or don't assume that what we've just inherited is correct, like challenge everything?
[111] I'm curious where you get your curiosity.
[112] Just your rejection like, no, no, we shouldn't assume we think this way or act this way.
[113] I'm wondering.
[114] Well, I mean, one thing that I wrote in my autobiography was that my mother was a gossip and that she was a very interesting gossip.
[115] She was not interested in just stories of who is misbehaving with whom.
[116] But it was about people being complicated, and people who looked intelligent being stupid, and people who looked stupid being in fact intelligent, and complexities of relationships. People were just very interesting.
[117] And as a Jewish boy, and I think, you know, there's, in Jewish tradition, words are very important.
[118] And so I lived in a world of words and people.
[119] Yeah.
[120] And that certainly shaped me. Now, the transition from France to Israel was very important, because in Israel I didn't feel like a Jew anymore.
[122] For me, during the war, the experience of being a Jew was the experience of being a hunted rabbit. I knew that that was the experience.
[123] And of course, in Israel, that's not the case.
[124] So it was a very significant thing for me. We just had a professor on who studies cultural differences in business, and they have this sliding scale on which Israelis are at the very top for being disagreeable and willing to challenge authority.
[125] And I wonder if that's something that evolved later in the Jewish state or when you were there, was that also the culture of the day?
[126] The challenge of authority was definitely part of the culture and the tendency to improvise is part of the culture and being disagreeable is fairly common.
[127] Yeah, I imagine there's a lot of factors for that, but one thing I've been introduced to, which I really like the idea of, is like the rabbinical approach to talking through problems.
[128] Do you think that's somewhere in the stew?
[129] Oh, yes, definitely.
[130] Many Israeli professors, myself included, are descendants of or closely related to significant rabbis.
[131] So that's part of the same.
[132] There is that tradition.
[133] It's a tradition of studying and understanding and of looking at things that could be simple and actually making them quite complicated, which is part of the tradition.
[134] It's seeing the complexities and things that appear to be simple.
[135] At the risk of stereotyping, my own experience with talking to rabbis is I interpret there being a tremendous amount of confidence in the ability to explore all these other avenues of dissent or opposing opinion and feeling quite moored in your position enough so that you're free to wander through all these other angles of a problem.
[136] It feels like a place of confidence.
[137] I think that's true.
[138] There is a lot of confidence.
[139] There is a lot of overconfidence, certainly, in the Israeli tradition.
[140] Part of being Israeli is being overconfident and overly assertive.
[141] I've often been told I should have been born in Israel.
[142] He has all those qualities.
[143] Yes, yes, yes.
[144] Some call it arrogance, but I call it curiosity.
[145] So I guess when you probably started studying psychology, there were some kind of base beliefs about how we thought.
[146] And one of them was this kind of utilitarian belief that we were ultimately rational.
[147] Was that what you inherited or what you were taught?
[148] Well, when I was a student, that's not what I was exposed to.
[149] When I was a student, I was exposed to Freudian ideas.
[150] I mean, that was in the background.
[151] And our thinking, and errors of thinking, were motivated and were dominated by emotions.
[152] Ah.
[153] And so that was actually the background when I was a student.
[154] Then when I started collaborating with my friend Amos Tversky on the study of judgment, and eventually of decision-making, there the background was that there was a theory, or a logic, of how rational people should think about uncertainty, should think about probability, should make decisions.
[155] And what we started studying is how do people deviate from that logic.
[156] And this is just a logic of how to think: statistics is a logic, and decision theory is a logic.
[157] And in economics, economics was based on the idea that people follow that logic, that people are rational agents.
[158] So that gives you a hypothesis that is easily proven false.
[160] And, you know, it's a very large target.
[161] And you can't miss in some ways because people are not like that.
[162] But finding interesting ways in which people deviate from the logic of rational thinking and rational decision making, that is a fun exercise.
[163] And that's the exercise in which we engage.
[164] It is a bit of a paradigm shatterer, or at least an attempt at one.
[165] I think equally impressive is figuring out a way to test it.
[166] I'm so impressed with the way you guys tested how rational people are.
[167] So just, I want to say for a baseline, like I guess we would assume people maximize their rewards and they minimize their risk.
[168] And anytime they're given something that's black and white, like numbers or statistics, that they will always choose to maximize that reward and minimize the risk, but then what really happens?
[169] Let me first say the way that we thought about it, and the way that we tried to demonstrate it, which is what made us successful.
[171] It's almost an accident, because we tried to take those deviations from logic and find crisp examples of them, like one-liners, basically.
[172] And so all our work was creating one-liners.
[173] And I think our most famous one, should I mention one?
[174] Oh, yeah, yeah, I want you to.
[175] It's about judgment.
[176] I'd let you mention all of them. I've got all year.
[177] Well, all of them is a lot, but the most famous one, I think, the one that aroused the most controversy, is about a lady named Linda, who as a student studied philosophy and was very engaged in all sorts of causes, and she marched in many marches and so on.
[178] And that example, we had it a long time ago, and people all assumed that she was from Berkeley.
[179] But then we ask people, Linda is now 31 years old.
[180] Is she now a bank teller?
[181] Or is it more probable that she is both a bank teller and active in the feminist movement?
[182] So those are the two options.
[183] And it turns out that most people who haven't studied statistics have an immediate answer: that she is more likely to be a feminist bank teller than she is to be a bank teller.
[185] Now, that's absurd.
[186] Now, that violates logic because if she's a feminist bank teller, she's a bank teller.
[187] So it can only be more probable that she is a bank teller.
[188] And yet, that mistake is really very common.
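The logical point here, the conjunction rule, can be sketched in a few lines of Python. The probabilities below are made up purely for illustration; they are not from the original study:

```python
# Conjunction rule: P(A and B) can never exceed P(A).
# The probabilities below are made up purely for illustration.
p_bank_teller = 0.05            # assumed P(Linda is a bank teller)
p_feminist_given_teller = 0.90  # assumed P(feminist | bank teller), even if very high

# P(feminist AND bank teller) = P(bank teller) * P(feminist | bank teller)
p_feminist_teller = p_bank_teller * p_feminist_given_teller

# The conjunction is necessarily no more probable than "bank teller" alone.
assert p_feminist_teller <= p_bank_teller
print(round(p_feminist_teller, 3))  # 0.045
```

However vivid the description of Linda makes the feminist detail, multiplying by a probability can only shrink the number, which is exactly what intuition gets wrong.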
[189] And that sort of illustrates the method that Amos and I followed.
[190] And I call that the psychology of single questions.
[191] That is, you find a single question which makes a point.
[192] And it turns out that when you do it that way, it's much easier to convince people that you have a point because they can see it in themselves.
[193] So even someone who has the correct answer to this knows that his or her spouse probably won't have the correct answer, and that he or she is tempted to give the wrong answer.
[195] So you recognize in yourself the incorrect intuitions.
[196] That's what we did.
[197] So we created examples where our readers would recognize themselves and would recognize themselves making a mistake.
[198] And that's where our work was influential.
[199] It was influential because our readers recognized in themselves that their intuitions were not logical, that they were tempted to give illogical answers.
[200] And I'll add, there's a genius to ruling out defensiveness.
[201] So if I'm reading about this third party, right, I can both join them mentally in the decision-making process.
[202] But it's not me being exposed as having an illogical answer.
[203] So I can actually take on that info without getting defensive.
[204] Oh, when you're reading about it.
[205] Yeah, that's the power of reading about it in a third-person story.
[206] But what actually works on you is that you recognize that it works on you.
[207] So it's true that it's being told as a third-party story, but you recognize it in yourself, and that's why you believe it.
[208] If you just had a demonstration that worked on 100 students, then you'd say, oh, those are students, or you'd say something to dismiss the finding.
[209] But when it's something that you recognize in yourself, you're much less likely to dismiss it.
[210] I don't think it's the third-party aspect.
[211] I think the personal recognition is the reason our work was successful.
[212] And I stress this because this is luck.
[213] We happen to use that method, and it was the method more than the message that made the work influential.
[214] Yeah, I think we all think we're really rational.
[215] Like, we just believe that about ourselves.
[216] So it's so easy to say, oh, they're irrational, until you're caught in this thought experiment yourself.
[218] Like, I just was.
[219] I 100% was like, yeah, the feminist bank teller, of course.
[220] And I think I'm really rational.
[221] But, yeah, so you have to almost get caught yourself as opposed to being able to point the finger.
[222] Yeah.
[223] I feel like that's happening so much right now with the vaccine versus COVID.
[224] Like all the numbers and all the statistics around it, it's backwards for both people, right?
[225] Like the person who hears the numbers of COVID says, like, well, I'm not willing to take the risk of not wearing a mask, but I'm 100% willing to take the risk for the vaccine.
[226] And the people who don't want to wear the mask say, like, I don't mind taking the risk of not wearing a mask, but I'm afraid of that vaccine.
[227] Exactly.
[228] Both sides can't possibly be right about fearing the opposite.
[229] And everyone thinks they're on the right side of rationality.
[230] And neither side is.
[231] I mean, the vaccine is really, when you think about it.
[232] The example of the Johnson & Johnson vaccine and people thinking, oh, it's unsafe, is really ridiculous.
[233] Yeah.
[234] The risk is one in a million to get that clot.
[235] And the risk is certainly way more than one in a million for most people to die of COVID.
[236] Yeah.
[237] So it's absurd.
[238] And yet they probably were politically correct in stopping the vaccine because the public needed that.
[239] The public would have thought that it's reckless not to stop the vaccine.
[240] although, in fact, stopping the vaccine is a big mistake.
[241] I totally agree.
[242] And I think they had zero option, because the integrity of vaccines is always so magnified by these passionate groups that we have to do it perfectly, always.
[243] And it's the same thing with self-driving cars.
[244] So we demand more of self-driving cars; it's not enough for them to be as good as human drivers.
[245] They have to be 10 times as good; 10 times won't be enough, 100 times won't be enough.
[246] Well, you probably would know the name of this in psychology, or maybe you even coined it, but I'm always fascinated with the notion that we as Americans do not mind at all that 100,000 people die of heart disease a year.
[247] Yet at the notion that 2,000 died from terrorism, we'll throw trillions at the terrorist threat and zero at the heart disease threat, and there's some huge dissonance for me there. Well, those are studies that were done by psychologists, actually friends of ours: what risks do people worry about? And terrorism is worse than car accidents, although there's no comparison about which is more dangerous, because terrorism is intentional. So it's human, and it's intentional, and it's not a natural thing. So the vaccine is intentional, and it's not a natural thing.
[248] And we're much more afraid and much more responsive to bad things that happen when they're intentional more than when they're natural.
[249] Yeah, and it's ultimately a mental construct we all agreed upon.
[250] That's just the part where I think, yeah, but you're dead, you're dead, you're dead.
[251] You're not going to be on the other side going, this was a much easier death because it was an accident.
[252] I was like, no, you're dead.
[253] Yeah, but that's not how people think, obviously, and numbers don't catch it.
[254] I mean, it's not, the numbers don't do it.
[255] So the idea that there is that risk out there, having your child vaccinated and your child could die, the probability is infinitesimal, but that thought stops you in your tracks.
[256] Well, I do think, too, there's an illusion of control involved in this, right?
[257] So I think I can control whether I catch COVID or not because I can stand far away from people or I can just not go out, blah, blah, blah.
[258] But with the vaccine, they put it in my arm; I'm out of the driver's seat.
[259] It's going to do what it's going to do.
[260] I'm not sure that people have an illusion of control over COVID.
[261] I think my guess would be that when you think of the risks associated with a vaccine, you put yourself in the situation where something bad has happened.
[262] and it was due to your choice.
[263] So it's the anticipation of enormous regret and guilt.
[264] Oh, wow.
[265] Because you will be responsible for what happens.
[266] Yeah.
[267] And this idea that there's somebody responsible, and so in the case of Tesla, somebody is responsible for that death.
[268] It didn't just happen by accident.
[269] And when it's me being responsible, it's paralyzing.
[270] Yeah.
[271] So is that the root of loss aversion, or loss bias, that you uncovered?
[273] What we call loss aversion is an asymmetry between the way that people think of gains and losses.
[274] People are more sensitive to losses.
[275] So if I describe a gamble to you, we're going to toss a coin, and if it's heads, you'll win $200, and if it's tails, you'll lose $100.
[276] What attracts attention is the loss, the possibility of loss.
[277] I mean, we found the gain has to be more than 2 to 1 to compensate for the loss.
[278] So when you offer people a gamble with equal probability to gain 150 or lose 100, people don't like that gamble.
[279] It's not enough.
[280] Right.
[281] It has to be more than 200.
[282] But if you're a casino and I brought you a game that said, for every 100 you lose, you're going to win 150 (statistically, we know this), you would be the most excited casino in the world.
[285] Well, certainly. And beyond that, even if you won't take that gamble on one toss of a coin, you will accept the gamble if I toss it ten times.
[286] So most people can see that ten favorable gambles, when they are packaged together, that's an attractive package.
[287] But what's psychologically interesting is that when I ask you about one toss of a coin, you don't think, well, I have many gambles in my life, I should have a policy of being sort of risk neutral.
[288] It's one of many chances to make money and to take small risk.
[289] That's not how we think.
[290] We think of that problem.
[291] That's the choice that I have now.
[292] And in that choice, I'm loss averse.
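The ten-toss arithmetic can be made concrete. The $150/$100 gamble is from the conversation; the aggregation calculation below is our own illustration of why the package is attractive:

```python
from math import comb

win, lose, n = 150, -100, 10  # the gamble from the conversation, played n times

def net(k):
    """Net dollars after k wins out of n fair coin tosses."""
    return k * win + (n - k) * lose

# Fair coin: P(k wins) = C(n, k) / 2**n
p_loss = sum(comb(n, k) for k in range(n + 1) if net(k) < 0) / 2**n
expected = sum(comb(n, k) * net(k) for k in range(n + 1)) / 2**n

print(f"P(net loss over {n} plays) = {p_loss:.3f}")  # 0.172
print(f"Expected gain = ${expected:.0f}")            # $250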
[293] And we see this extensively in stock trading, right?
[294] Isn't that true?
[295] Where people are so averse to taking a loss.
[296] Well, there is an interesting phenomenon in the stock market for individual investors.
[297] What it is, is that when people have a portfolio that contains some winners and some losers, stocks that have made money and stocks that have lost money, they tend to sell their winners and to hang on to their losers.
[298] It turns out that that's a mistake.
[299] And if a stock has been winning, in the short term you should continue betting on it, and if a stock has been losing, you should bet against it.
[301] People do the opposite.
[302] And that is an aversion to cutting your losses.
[303] It's an aversion to admitting, well, that was a bad choice, that stock.
[304] So people hope that the stock will recover and the quality of their decision will recover, and so they make that mistake.
[305] It's one of the mistakes that cause individual investors to lose a lot of money in the stock market.
[306] Stay tuned for more armchair expert, if you dare.
[318] Okay, now another of yours that has changed my life, truly.
[319] And I became aware of it through reading Yuval Harari's books.
[320] And it was this distinction between the experiential self and the narrative self, which I find just mind-blowing, that someone could realize that.
[321] I don't know how this was discovered, but when it's explained to me, I recognize immediately how relevant it is in my hourly existence.
[322] Oh, I want this thing.
[323] Well, I'd like it now, but tonight I'm going to hate that I did that.
[324] I mean, that's the racket in my head all day long.
[325] So you started studying happiness, and how much in your study of it, did it inform you, or how much did you come into with these notions?
[326] Well, I can tell you the story of how that idea came up.
[327] Please.
[328] And it came up in conversations with Amos Tversky.
[329] I invented a puzzle, and so I can tell you that puzzle.
[331] So you have a disease and they are treating you and you are getting one injection in the butt a day.
[332] And you're not adapting to it.
[333] So every day it's just as painful as the day before and the day after.
[334] And now I ask you how much money would you pay, say if you have a course of 20 such injections, to reduce the number from 20 injections to 10 and how much would you pay if you had 12 injections to reduce it from 12 to 2?
[335] And people say very different numbers.
[336] I mean, it was obvious.
[337] We didn't even run the experiment.
[338] It's so obvious.
[339] You'll pay more to reduce from 12 to 2 than from 20 to 10.
[340] And yet that's absurd because I told you that every day is just as bad as every other.
[341] So you're saving 10 injections, 10 painful injections in both stories.
[342] And yet, that's not how decisions are made.
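The logic of the puzzle is easy to check: if every injection is equally painful, a rational willingness to pay should track only the number of injections avoided, and that number is the same in both versions. A toy calculation, not the original study's method:

```python
# If every injection is equally painful (no adaptation), the experienced
# benefit of a reduction depends only on the number of injections avoided.
def injections_avoided(start, end):
    return start - end

# Both versions of the puzzle save exactly ten injections...
assert injections_avoided(20, 10) == injections_avoided(12, 2) == 10
# ...yet people report very different willingness to pay for the two cases:
# decision utility and experienced utility come apart.
```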
[343] So that was a distinction between what I later called experienced utility and decision utility.
[344] So the utility of that choice, that option of having 20 injections or 10 injections or 12 or 2, when you consider it ahead of time, that's decision utility.
[345] Those are the weights that you put on it.
[346] And when you experience it, it's different.
[347] So clearly, when you experience it in the example that I gave, it's just linear.
[348] It increases linearly; any injection is just as bad as any other.
[349] So the start was: experience and decision are separable.
[350] And the decisions that we make don't maximize our experience.
[351] And that led me to ask, oh, okay, so if we're studying experience, can people predict ahead of time how their experience will change, how they will adapt to things?
[353] So we did studies of the ability of people to predict future experiences.
[354] So we ran an experiment where we gave people the same helping of ice cream, and they also listened to a bit of music while having that helping of ice cream, same ice cream every day for a week, every day at the same hour.
[355] And we asked people ahead of time, what do you think will happen to you?
[356] what will it be like on the seventh day?
[357] And people had no idea.
[358] So some people become addicted to that and some people really are tired of it.
[359] But the people don't know in advance which group they will belong to.
[360] So that was people can't predict their experience.
[361] So that led us to ask, well, can people remember their experience?
[362] Oh, boy, yeah.
[363] So we gave them experiences, and we measured how pleasant or unpleasant every second of the experience was.
[364] And then we asked them, well, how was the total experience that you had?
[365] How bad was it in total?
[366] And there we made discoveries, in that how people evaluated an episode absolutely did not correspond to what actually happened.
[367] And was it consistent? Did they remember it worse, or did they remember it better, or did it vary greatly?
[368] It's not that simple.
[369] Okay.
[370] It turns out that if you have a medical procedure, for example, and so we studied colonoscopies.
[371] It's a medical procedure for adults.
[372] Today, nobody even knows what a colonoscopy feels like, because they put you to sleep.
[373] But that was like 40 years ago, and they didn't put you to sleep.
[374] It was a very painful procedure.
[375] So we had people undergoing that procedure.
[376] And every 60 seconds, we asked them, how much pain are you experiencing now?
[377] And so you get a profile of pain.
[378] I'm so sorry, I have to interrupt you.
[379] It just hit me, a visual of you sitting next to some man getting a colonoscopy.
[380] Like, you're just telling it as it is, but you were next to a...
[381] I didn't do it.
[382] Someone...
[383] It was done in a hospital.
[384] I had a colleague, Don Redelmeier.
[385] But somebody sat with a clipboard and said, how are you doing now, Mr. Smith, every 60 seconds?
[386] Still terrible.
[387] Thanks for asking.
[388] So somebody was actually recording that profile.
[389] And then you ask people when it's all over, how bad was it altogether?
[390] And it turns out that how bad it was depends on the worst moment of pain and the last moment of pain.
[391] And do they average those two numbers?
[392] And they average those two numbers, but the duration of the colonoscopy doesn't matter. It's just the peak and the end.
[393] But five minutes or 20 minutes, people don't discriminate.
[394] They take those two data points.
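The peak-end rule just described can be sketched as a simple average of the worst and final pain ratings, with duration dropping out entirely. A toy model with made-up ratings, not the study's exact scoring:

```python
# Peak-end rule: remembered pain is roughly the average of the worst
# moment and the final moment; total duration is largely ignored.
def remembered_pain(ratings):
    return (max(ratings) + ratings[-1]) / 2

short_procedure = [2, 5, 8]           # ends at its worst moment
long_procedure = [2, 5, 8, 5, 3, 1]   # more total pain, gentler ending

print(remembered_pain(short_procedure))  # 8.0
print(remembered_pain(long_procedure))   # 4.5
# The longer procedure, with strictly more total pain, is remembered as milder.
```

This is the same arithmetic behind the extended-colonoscopy and cold-water experiments described next: add a milder tail and the remembered episode improves, even though total discomfort goes up.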
[395] Yeah, and that led to an experiment in which we actually added to the colonoscopy.
[396] So we ran a special experiment where, without telling people, when the experiment was over, they didn't remove the tube.
[397] Okay.
[398] All right.
[399] They kept it in for several more minutes.
[400] Now, that's really unpleasant.
[401] Nobody would volunteer for that, but it's much less unpleasant and less painful than it was when the tube was moving.
[402] So you improve the experience of the end.
[403] Oh, wow.
[404] Oh, wow.
[405] And we ran that, not with a colonoscopy.
[406] It was run with a colonoscopy as well and with similar results, but we ran it with putting your hand in cold water.
[407] And you can put your hand in cold water for 60 seconds, or you can put your hand in cold water for 90 seconds, where the first 60 are exactly the same, but in the next 30, the water becomes very slightly warmer.
[408] And then you ask people, which are the two experiences you had, the one with the left hand or the one with the right hand, would you rather have?
[409] And they pick the longer one.
[410] Wow.
[411] Because the ending wasn't as miserable.
[412] Because you improve the ending.
[413] Oh, wow.
[414] So that was the background.
[415] That's how I got into well-being, that people seem to have really no idea when you ask them what their life is like.
[416] Yeah.
[417] We know that they're making some kind of computation, but we shouldn't trust that computation.
[418] Right.
[419] Because it may not correspond to their actual experience.
[420] So that's the background.
[421] Do you think that has anything to do with our addiction to relief?
[422] The like 30 seconds of a little bit warmer water feels like a relief from the pain a little bit.
[423] And that maybe we just love that feeling of being relieved so much.
[424] We actually ran experiments.
[425] I mean, we're getting into the weeds here a bit, whether it's the end or whether it's the relief.
[426] And the evidence seemed to suggest that it's the end and not the relief. Those two points, the peak and the end, are very important, and if you improve the end, you're going to improve the whole experience. We did that with several things; we did that with loud sounds and with pain.
Okay, so as you were mid-career gathering these results, you now have some pretty substantial proof that we're quite bad at being rational, we're quite bad at evaluating ourselves.
[427] Were you doing this with the thought?
[428] Well, I'm going to come up with a control or a counterbalance to all this so we can still aim towards being rational and logical creatures.
[429] Or were you just going, oh, my God, we're hopeless.
[430] Because a lot of times when I read your work, I'm like, we are hopeless.
[431] Like we think we understand so much better than we do.
[432] I wasn't really thinking, quite in those terms of being hopeless, but human nature.
[433] I mean, it's clear when you think about those experiments that if you think about evolution, then evolution would do exactly that.
[434] It would make you sensitive to the peak of the pain because that's the peak of the threat.
[435] And it would make you sensitive to how the episode ended because that's an indication of how likely you are to survive.
[436] And really, how long the episode lasted, the encounter with the tiger or whatever, that really doesn't matter.
[437] It's, you know, how bad it was and how well it ended.
[438] So there is an evolutionary logic behind a lot of it.
[439] That's true, because if you're deciding whether you're going to try to kill a badger for food, you have to decide after that encounter whether that's worth pursuing ever again.
[440] So those are good metrics, I guess, for that.
[441] They are.
[442] Now, the point is that when you're facing a colonoscopy or a film, then evolutionary logic is irrelevant.
[443] Adding 30 seconds of pain to a painful episode does not improve things when the alternative is putting your hand, say, in a warm towel instead of keeping it in slightly warmer ice water.
[444] I'll never get a chance to ask this to a more important person.
[445] So I just want to read: as you define it, happiness is what I experience here and now, but in reality, humans pursue life satisfaction, which is connected to a large degree to social yardsticks, achieving goals, meeting expectations.
[446] So when I'm deciding between whether I want to indulge my experiential self or my narrative self or my, as you would say, life satisfaction self, I'm inclined to think that, and Monica and I have had this debate, that the life satisfaction meter is really ego.
[447] Is it not?
[448] It's like the story I'm writing about myself.
[449] I achieved all these things.
[450] I made good use of my life.
[451] Like, those are all really ego -driven, as opposed to, I enjoy this thing here and now.
[452] Could you say which one people should prioritize?
[453] Is it a scale, like they should be 30% here? Do you have an opinion now on that?
[454] Well, I started out with an opinion.
[455] And the opinion was I was following up on our experiments where clearly the memory was a mistake.
[456] I mean, if you remember the 90-second episode as having been better than the 60-second episode.
[458] You're just making a mistake.
[459] So I thought that what matters is the real experience and that what you remember about it, well, that's what you remember about it, but that if you want to make people happier, it's by improving their experience, not what they think about it.
[460] But later, I realized that if one followed the logic of, you know, my thinking, then you would have a definition of well -being, which doesn't correspond to what people want, because what people want is not the experience.
[461] When people think of an experience, they think of the memory of it.
[462] It's very difficult to imagine an experience; what you really imagine is the moment of remembering it, mostly, or it's a peak, or it's an end, it's a moment.
[463] Yeah.
[464] You cannot imagine the film, the continuity, the whole thing.
[465] You cannot do it.
[466] But the thing is that when you check what people want for themselves, they want to be satisfied with their life.
[467] And so having a definition of well -being that focuses on experience doesn't correspond to what people want.
[468] And then having a definition of well -being that doesn't correspond to what people want didn't seem to be the right thing.
[469] So I actually quit.
[470] I gave up.
[471] I did not find a way, I mean, clearly both are important, I have no idea how they should be weighted.
[472] Yeah.
[473] My guess is that people put too much weight on life satisfaction.
[474] Well, we're so lucky that we get to interview people who have won Academy Awards, who have won Nobel Peace Prizes, who have gone to Harvard, who have done all these things, right?
[475] And we've all had a fantasy of how we would feel internally, how the experience would be, based on this imagined state.
[476] And without exception, it never feels like the thing we imagined.
[477] It doesn't cure these internal angsts of ours, but yet the process seems to be worthy and great and worth pursuing, but the achievement, the yardstick, the result seems to disappoint for most people.
[478] I think that if people expect their emotional experience to be changed radically by an achievement, then they will be disappointed.
[479] But what we were thinking about, what I was thinking about, was not big achievements.
[480] It was the problem of how you spend your time.
[481] What followed from our experiments was the focus on time, on sheer duration: that two days of vacation, if you enjoy them equally, are in a way worth twice as much as one day of vacation.
[482] That is how the logic of the experiments would apply.
[483] And this is a calculation that people don't make, but I thought it was important.
[484] And my example would be commuting.
[485] So some people will take on a commute to have a larger house.
[486] But because of their commute, they spend one hour less or two hours less in a day with people that they love.
[487] Yeah.
[488] That's a big sacrifice.
[489] If you did the emotional calculation, if you were thinking in terms of experience, you would stay in a smaller place and have a shorter commute.
[490] So you would actually make different decisions.
[491] Yeah, this is a lesson I've learned in my life, which is I come from Detroit.
[492] I come from a culture in which the point of money is to save it and buy some big object, whether that's a house or a car.
[493] And I thought of expensive vacations as a waste of money.
[494] And then I learned of a study, maybe you were somehow involved in it, where if you do measure happiness, that when you're looking back in your life, this object can give you very little happiness.
[495] There's no real memories attached to it, but the memories of the vacation or the meal, those enhance your overall happiness.
[496] And that, to me, it was like, oh, once there's dad, it liberated me to spend money on that instead of the objects.
[497] For a long time, with a colleague, Norbert Schwarz, we were following that thought of, when do you enjoy your luxury car?
[498] And you enjoy your luxury car primarily when you're thinking about it.
[499] Yeah, yeah, yeah.
[500] But you don't think about it alone.
[501] That is, you think about it immediately after buying it.
[502] You think about it when people compliment you.
[503] You occasionally think about it when you overtake somebody else.
[504] But most of the time, you're not thinking about it.
[505] And when you're not thinking about it, it plays very little role in your happiness.
[506] Well, now I'm an anomaly because I love cars.
[507] The whole time I'm driving it, I'm like, oh, my God, I love this.
[508] But yes, your point remains.
[509] And even you, you don't think of your car all the time.
[510] What you are now remembering is the moments in which you're thinking about the car, because at other times, when you're thinking about other things, the car doesn't exist.
[511] Yes.
[512] So your memory just played the trick on you.
[513] And they're replaceable, right?
[514] Like my trip to Disney World at eight years old is irreplaceable.
[515] But you could switch out any number of cars in my memory of a great car, and they're interchangeable.
[516] But, you know, now you're talking of your memory of Disney.
[518] And it's true that when people go to Disney, in many cases, it's not to have the experience.
[519] It's to have the photograph.
[520] It's to have the picture.
[521] It's to have the memory.
[522] So that gets to be quite complicated.
[523] Thinking about experience and about posterior evaluation gets to be quite complex.
[524] The car, so you enjoy it when you get it, you enjoy it when you're in it.
[525] Operating it.
[526] But the rest of the time, you're actually thinking about the next luxury car you want or the next car you want to give you that same initial satisfaction.
[527] So it's not even the original car that's propelling the excitement.
[528] Yeah.
[529] This led me to a phrase that is actually a phrase I'm proud of, the sentence I'm proud of, which is that nothing in life is as important as you think it is, when you're thinking about it.
[530] That is, when you're thinking about something, you exaggerate its importance.
[531] This just happened to you with respect to a car.
[532] Yeah.
[533] Okay, I really want to talk now about noise because that's why you're giving us your time.
[534] It's valuable.
[535] So if I could just sum up, your book, Thinking Fast and Slow, is just enormously successful, seven and a half million copies, and that really goes into human rationality, and I believe it's with a focus on bias.
[537] Is that an accurate summation?
[538] Okay, and now you've done it again with Noise, and noise is a flaw in human judgment.
[539] So now you're looking at another area by which people's judgment becomes flawed, and it is independent of bias.
[540] So how would you describe noise?
[541] If you're looking at errors, say you're measuring a line, and you're measuring it many times, and you're measuring it in very fine units.
[542] So you're not going to be getting the same measurement every time.
[543] You are going to have variability.
[544] Now, the average error, the systematic error, that's a bias.
[545] So you can have a ruler that overestimates the length or underestimates the length, and the bias is the average of your errors.
[546] But the variability of your errors, the fact that you don't come up with the same number every time, that turns out to impair accuracy. Because, you can see, suppose the bias is zero, so that on average you're exactly on target, but actually you're all over the place. Sometimes you underestimate, sometimes you overestimate. You're not being accurate; you are making errors.
Right, so with bias you could perhaps adjust: you could establish what your deviation is and then adjust. But with this, it's just sporadic, and there would be no way to adjust for it.
There is no adjustment that you can do, and it's a form of error.
[547] And actually, in the theory of error, bias and noise are equivalent.
[548] That is, the mean of the error and the standard deviation of the error, which is a measure of variability, play equivalent roles in the equation for what error is that is used by all scientists and biostatisticians, there is a famous formula.
[549] And in that formula, bias and noise play the same role.
[550] And that's remarkable because in our thinking about human judgment, we think almost exclusively in terms of biases and noise doesn't figure.
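The "famous formula" alluded to here is the standard mean-squared-error decomposition, in which bias and noise enter symmetrically. A minimal numerical sketch, with made-up judgment errors:

```python
# A sketch of the error decomposition: overall error (MSE) splits into
# a bias term and a noise (variance) term that play equal roles.
# The error values below are made up for illustration.

def mse_decomposition(errors):
    """Return (bias, noise, mse) for a list of judgment errors."""
    n = len(errors)
    bias = sum(errors) / n                            # systematic error
    noise = sum((e - bias) ** 2 for e in errors) / n  # variability
    mse = sum(e ** 2 for e in errors) / n             # overall error
    return bias, noise, mse

bias, noise, mse = mse_decomposition([2.0, -1.0, 3.0, 0.0, 1.0])
print(bias, noise, mse)                        # 1.0 2.0 3.0
assert abs(mse - (bias ** 2 + noise)) < 1e-9   # MSE = bias^2 + noise
```

Because bias enters squared and noise enters as the variance, shrinking either one by the same amount buys you exactly the same gain in accuracy, which is the point Kahneman makes next.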
[551] Now, the kind of noise that is the most interesting to us, it's not within a person, it's within an organization.
[552] So you have an organization of, say, underwriters.
[553] And you would want your underwriters, all of them, when given the same problem, to come up with the same number.
[554] You would want all your federal judges, when exposed to the same case to assign the same sentence.
[555] You would want physicians in the emergency room to reach the same decision.
[556] You would want patent officers to be the same.
[557] In many, many systems, variability is undesirable, is unwanted.
[558] That variability is noise, and it's widely underestimated in terms of the damage that it causes.
[559] So that's why we wrote the book.
[560] Well, and the example with the insurance underwriters, I believe that you guys suggested, like, you would expect perhaps two different underwriters to have some variation of, say, 10 percent, right?
[562] Like, I don't know.
[563] Let's say they price a policy at $100, and you'd think the other person would do it at $110 at the worst.
[564] But once you study it, you find out there's actually above a 50 % variation, right?
[565] I mean, it's quite remarkable, but when you ask people in a well-run organization, when two people are estimating the same number, you know they won't come up with the same number if it's a matter of judgment.
[566] It's a definition of judgment that people do not perfectly agree.
[567] But how much variation do you tolerate?
[568] And people come up with that number almost universally.
[569] It's actually quite remarkable.
[570] 10% is the difference between judges that people think is tolerable.
[571] And 50 % is more realistic.
[572] So 50 % is qualitatively different.
[573] There's just a lot more noise than people expect.
[574] So what you're calling noise is actually that variation?
[575] Or are you talking specifically about distractions?
[576] It is unwanted variability in judgments or decisions about the same thing.
[577] So when you have judgments that in principle should be identical and they vary, that variability is what we call noise.
[579] Okay.
[580] And even if you've like controlled for bias and everything else, it's still just that.
[581] It doesn't matter.
[582] And that's the striking thing that reducing noise is always good.
[583] If your aim is accuracy, then you can improve your accuracy equivalently by reducing the variability of judgments or by reducing the average error.
[584] And this is really non-intuitive, but it's a mathematical fact.
[585] Wow.
[586] So I have to imagine that these industries and the government pay a big price for this. Have they just been ignoring it?
[587] Yes.
[588] And the insurance company, they were unaware that they had that problem.
[589] So we studied an insurance company.
[590] That's where the project began.
[591] I was consulting with the insurance company, and we decided to run that experiment, that is to take 50 underwriters, and show them the same cases and see by how much they differ.
[592] And they differed by about 50, 55 % for underwriters.
[593] Other claims adjusters were 42 % or something.
[594] And the organization was completely unaware that it had that problem.
[595] That's what made it interesting.
[596] It's the unawareness.
[597] How does an organization solve that problem?
[598] Because it just seems like individual differences or uniqueness or perspective.
[599] I know you have a concept, decision hygiene. How does that function? Like, let's take the insurance company. What would they do about this?
Well, let me first explain where the term decision hygiene comes from.
Please.
A bias is like a disease, and how do you deal with a particular disease? Well, you have a vaccine or you have medication, but they're specific. The vaccine is against COVID; it's not against shingles. So it's specific.
[600] When you wash your hands, that's completely different.
[601] You are killing germs, but you have no idea which germs you're killing.
[602] And furthermore, you will never know, because if you're successful, you just won't get the disease.
[603] So hygiene and medication of vaccination are really different.
[604] And they're different in a way that corresponds to the concepts of bias and noise.
[605] Yeah.
[606] The steps that you would take that would reduce variability, we call them hygiene.
[607] Uh -huh.
[608] Now, many of these steps reduce biases as well, but they're conceived of as steps that are more likely to get people to uniform conclusion.
[609] Well, one suggestion I read that you proposed is kind of controlling for the groupthink aspect, right?
[610] Getting everyone's opinion independently before anyone's talked about anything, and then attempting maybe to aggregate that.
[611] Is that part of it?
[612] Yes, absolutely.
[613] I mean, the general principle of independence is really essential to reducing noise.
[614] And one way of reducing noise that is impractical in most organizations is simply to take many people and take the average of their judgments.
[615] And that's perfectly straightforward.
[616] You reduce the standard deviation of errors by the square root of the number of observations.
[617] So there is a mathematical formula.
[618] And if you have enough judgments, noise becomes unimportant.
[619] You are not touching bias when you're doing that.
[620] That is, if there is a shared bias, it will remain regardless of how many observations.
[621] But noise will vanish.
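The aggregation principle he's describing, noise shrinking with the square root of the number of independent judgments while a shared bias survives, can be shown with a small simulation. The true value, bias, and noise levels below are all made-up illustration numbers.

```python
# A sketch of the aggregation principle: averaging n independent
# judgments shrinks the noise (the spread of the panel's estimate)
# by a factor of sqrt(n), while any shared bias is untouched.
import random
import statistics

random.seed(0)
TRUE_VALUE = 100.0
SHARED_BIAS = 5.0   # every judge overestimates by 5 on average
NOISE_SD = 20.0     # judge-to-judge variability

def panel_estimate(n_judges):
    """Average of n independent, noisy, equally biased judgments."""
    judgments = [TRUE_VALUE + SHARED_BIAS + random.gauss(0, NOISE_SD)
                 for _ in range(n_judges)]
    return statistics.mean(judgments)

for n in (1, 4, 16, 64):
    estimates = [panel_estimate(n) for _ in range(5000)]
    print(n, round(statistics.mean(estimates), 1),
          round(statistics.stdev(estimates), 1))
# The spread falls roughly as 20 / sqrt(n) (about 20, 10, 5, 2.5),
# but every panel's mean stays near 105: the shared bias remains.
```

The simulation also shows why the observations must be independent, as the next exchange stresses: if the judges talked and converged first, the extra opinions would add no new information to average away.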
[622] Now, an essential condition for that is that the observations be independent, like witnesses.
[624] So if you have multiple witnesses to a crime, you don't want them to talk to each other before you question them.
[625] You want them to come in with their independent testimonies.
[626] And you will get more inconsistency, and that's good.
[627] That is, you want to have the noise that actually exists.
[628] You don't want to hide from it.
[629] You want to see it, and you want to reduce it.
[630] So that's the approach.
[631] And so how would they do that, say, in the judicial process, like a judge?
[632] How would they aggregate judgments?
[633] I don't think that judges could aggregate judgment, but they do.
[634] They have panels, but it's not a practical thing for organizations.
[635] For example, it would be good to have four patent officers independently judging every application, but that would increase the budget by a factor of four.
[636] You can't have it.
[637] Yeah.
[638] So you need to do other things.
[639] In the case of judges, they try to have guidelines that would actually restrict the range of sentences that they could impose.
[640] Judges hated that.
[641] Judges prefer to be noisy rather than to be told what to do.
[642] And they feel that that serves justice, which is interesting.
[643] Stay tuned for more armchair expert, if you dare.
[644] What are your personal feelings?
[645] Because just as an idiot on the outside, it seems like a potential correction for all this is AI.
[646] I remember seeing a 60 minutes about AI diagnosing different forms of cancer.
[647] And they're just, as you say, this great variability on the prognosis that these different oncologists give.
[648] Yet the AI is consistently always at like 78 % or something.
[649] That is true of any rule or algorithm.
[650] What defines a rule or an algorithm is that it will give you the same answer when you give it the same problem twice, which people will not do.
[651] And one of the main advantages of algorithms is that they are noise -free.
[652] That's why they do better than people.
[653] So you can have the performance of bail judges, human bail judges, who decide whether to keep somebody in jail before trial or to release them.
[654] And there are different considerations.
[655] If you release them, they could commit a crime, but keeping them in prison when they are not going to commit a crime is really a horrible waste and painful to them.
[656] Turns out that you can have an algorithm that given the same data will just do better than judges.
[657] And it turns out that one of the things is that the algorithm is noise -free and judges are all over the place.
[658] And they're all over the place in different ways.
[659] Some judges are just more severe than others, so they keep more people in prison.
[660] Others are more lenient, and they release more people.
[661] But that's not what accounts for most of the noise.
[662] What accounts for most of the noise is tastes.
[663] Judges have different tastes in criminals.
[664] Uh -huh.
[665] And so a criminal that would be released by one will be kept in jail by the other.
[666] So it's not as if the criminals are ranked alike, and you just have some judges being more severe.
[667] They have different tastes.
[668] They have different rankings.
[669] Well, I have to imagine a lot of it's informed by who they've been burned on in the past.
[670] Like they gambled on this one type of criminal, and they got egg on their face, and never again.
[671] And some people just hate repeat offenders, and some people just hate youthful offenders.
[672] Some people are affected by the victim.
[673] And then you just may be affected by somebody who reminds you of your daughter.
[674] Right.
[675] Yeah, like the SS officer.
[676] I mean, that's a great example of...
[677] Yeah, he definitely went rogue with that decision to hug you.
[678] Yeah, it was very unpredictable decision.
[679] That was in the 90-90% noise there.
[680] I mean, it was a sensible decision.
[681] He just took the risk of my being Jewish.
[682] Now, I think we're all very familiar with the bias that exists in hiring practices, but even if those are corrected for, you're going to see this enormous 50% variation in who would get hired, yeah?
[683] You are going to see enormous variation.
[684] But hiring is a problem where you can reduce noise.
[685] So some big companies, for example, Google, I would say they have state-of-the-art procedures for hiring, and they put a lot of effort into it.
[686] And in particular, they don't allow a single person to make their judgment.
[687] They have several people, and they keep them independent.
[688] And they arrange their thinking in a way, and that's an essential part of decision hygiene, that you delay intuition.
[689] That's a very important principle of decision hygiene.
[690] Normally, an interviewer forms an impression of the interviewee within seconds or a minute or two.
[691] And the rest of the interview is largely spent by the interviewer confirming that their initial impressions are right.
[692] What you want is you want to avoid that by structuring the way information is collected, by asking a series of specific questions that should be answered in a factual way.
[693] That's called a structured interview.
[694] And at the end of the interview, when the various properties or characteristics or attributes have been scored separately, then you can have an intuition, a global intuition, and it will be more accurate than your intuition that you would have had immediately.
[695] So delaying intuition, breaking up problems, having independent judgments, there are a few principles of decision hygiene.
[696] Yeah, is it that our intuition is good, but assuming we have the actual facts that are relevant?
[698] Like, I can just imagine meeting somebody and I'm like, oh, this coat, he's wearing a fur coat.
[699] I hate fur.
[700] This guy doesn't like animals, whatever the thing is.
[701] But before I see the fur coat, I find out where he graduated in his class, the charity work he does, you know, the actual substance of what the person is.
[702] Well, and for that, breaking up the problem so that if you're going to hire somebody and you say, what are the characteristics that I want from that person?
[703] And you try to assess those characteristics one at a time by asking questions about their experience, how responsible they are, how sociable they are, how well they get along with authority and so on.
[704] If you have a multiple set of criteria, you evaluate each of them by itself.
[705] You don't think of whether you want to hire that person until the end.
[706] Uh -huh, yeah, delaying that.
[707] That's so interesting.
[708] Do you think if this approach is implemented, in all the many ways it could be implemented, that for the average person it'll feel less human?
[709] I think that there will be a lot of resistance to anything that constrains intuition.
[710] I mean, people like the opportunity to make intuitive judgments, and they trust their intuition.
[711] And what happens when you reduce noise is that people will be exposed to noise.
[713] So I'll give you an example.
[714] As a professor when I was reading exams, so you have essays, a person writes three essays.
[715] And the normal procedure is you'll take a booklet and you'll read a person's booklet.
[716] One essay, then the second, then the third.
[717] And typically, when you do it that way, there are few surprises.
[718] The second essay is pretty much like the first and the third is pretty much like the first two.
[719] But suppose you do something else.
[720] You read the same essay for all students, and for each one you write the grade at the back of the booklet so that you don't know when you come to the second one what you gave to the first.
[721] What will happen, you'll find many surprises.
[722] Wait, I didn't understand that, did you?
[723] Okay, let me repeat that.
[724] Yeah.
[725] When I read multiple essays of the same person, when I read the second essay, I'm not independent.
[726] So if I like the first essay and I see something which is a stupid mistake, I would say, oh, that was just confusion.
[727] Right.
[728] That student is just too bright to have made that mistake.
[729] So I'll give the student the benefit of the doubt.
[730] And if you didn't like the first answer, you'll say, oh, I must have been right all along.
[731] Yes.
[732] This is an idiot.
[733] That's what happens when people allow their intuitions to govern.
[735] When you make the judgments independent of each other, you don't make those mistakes.
[736] And the result is, and that's what you were saying earlier, people won't like that procedure.
[737] They don't like the procedure because then they realize, well, here is that person, brilliant first essay, lousy second essay, what am I going to do?
[738] Life becomes more complicated.
[739] It's harder.
[740] You realize how much noise there is.
[741] Well, I think that's why people are so drawn to binary solutions for everything.
[742] It's like, yeah, no, it's going to be a lot of work.
[743] And there won't be a definitive answer at the end.
[744] That's the other great news.
[745] We will live in some gray area forever.
[746] Absolutely.
[747] I'm just so delighted we got to talk to you.
[748] I know.
[749] Can I ask you one completely off-topic question that, because you're an economist, I think you might have an opinion on? And I'm just -
[750] I'm a psychologist.
[751] Okay, but how come they call you an economist?
[752] Because I got the Nobel Prize in economics.
[753] I never had a course in economics.
[754] Okay, okay.
[755] Well, I want your opinion anyways.
[756] Knowing what you know about the way we think, and knowing you have some understanding of the economy, this is what I'm constantly asking myself.
[757] As the stock market just goes up indefinitely, the rational side of me says, well, this has to be tethered to earnings and production and blah, blah, blah.
[758] Yet there's another side of my brain that says, no, money's a story, and it is whatever story we want to tell.
[759] So if we all believe it is infinite and abundant, can it self -perpetuate on our belief in the story, or does it need to be anchored to real things?
[760] I feel like we're at a moment in our history where that decision's unfolding.
[761] What do you think about that?
[762] Well, I mean, when you're in the stock market, the value of stocks, there is that objective value, the earnings of the company.
[764] But what really determines the value of the stock in the market is what other people think it is.
[765] So when you're buying a stock, you are actually betting on what other people will think that stock is worth in the future.
[766] And that's what people do, mostly.
[767] Now, ultimately, I think values are tethered to reality.
[768] So ultimately, there are corrections.
[769] Uh -huh.
[770] But so it's not that you can have a self-perpetuating story forever, because there are alternatives.
[771] There are alternative ways to invest money.
[772] The stocks are not alone.
[773] It's not only a story about stocks.
[774] And ultimately, reality will have its turn, I suppose.
[775] Yeah, okay, I wonder.
[776] Because our government seems to have reacted now in a way where we'll just invent more money.
[777] We invent it a lot.
[778] And then a lot of other countries are just inventing more money.
[779] And again, the logical side of my brain says, well, you can't just indefinitely invent money.
[782] The bill will come due.
[783] But at the same time, I think, well, if everyone on the globe agrees to this fantasy, then maybe you can.
[784] Does that make any sense what I'm saying?
[785] Yeah, yeah.
[786] That is real economics, and that's too complicated for me.
It's complicated.
[787] It's complicated.
[788] Well, if it makes you feel any better, when we interviewed Bill Gates, I said, is there any discipline you have a really hard time understanding?
[789] And he said, yeah, economics is the one that I have to call a lot of experts on when I'm trying to understand it.
[791] That's nice.
[792] So you're in great company.
[793] Good.
[794] This was lovely.
[795] Yeah, this was so great.
[796] Yeah.
[797] Yeah.
[798] In my life satisfaction story, this will make the highlight reel.
[799] Thank you.
[800] You are very good interviewers.
[801] It's a pleasure to talk with you.
[802] Thank you.
[803] Everyone should buy Noise: A Flaw in Human Judgment.
[804] It comes out May 18th.
[805] We're both so excited to read it and we thank you so much for your time.
[806] Bye -bye.
[807] Okay.
[808] Bye -bye.
[809] And now my favorite part of the show, the fact check with my soulmate Monica Padman.
[810] Hello.
[811] Hello.
[812] Good morning.
[813] Hello, purple people eater.
[814] I am a purple girl today.
[815] Head-to-toe purple.
[816] That's right.
[817] Sweatsuit.
[818] I guess the socks aren't purple.
[819] But they complement my purple nicely.
[820] I'm wearing very loud socks today.
[821] You are?
[822] Bright red.
[823] Yeah, bright red.
[824] With some snowflakes and pine trees.
[825] Well, they're cute.
[826] You love a good sock.
[827] I like to draw attention down to my ankle.
[828] I have very elegant ankles.
[829] It's a fun pop.
[830] Yeah.
[831] It's one of the parts of my body that I'm like, boom, nailed it.
[832] Elegant ankles.
[833] But I got to be careful what I say, as you know.
[834] Why?
[835] Because I jinxed my feet.
[836] Remember I used to brag about the beauty of my feet.
[837] You're right.
[838] Showed them off on talk shows, really arrogant about them.
[839] And then God smited me. Yep.
[840] And made one of my knuckles freeze, which resulted in surgery, which resulted now in a miniature toe.
[841] Yep.
[842] Which then resulted in one making a very sharp right turn.
[843] Yeah, it looks like an evil claw now, my foot. Yeah. And your ankles might start getting, um... what's that? There's a thing that happens, um, an ankle condition. My Papa Bob had horrendous ankles, just terrible. They were this big, and I'm holding up what would be two grapefruits to Monica right now. And he had to have special shoes, and the ankle kind of hung over the side of the shoe. Walking was so difficult for him. I think it's called... something that happens to ankles in older age.
[844] His was like elephantiasis of the ankles.
[845] Okay.
[846] What's gout?
[847] Gout is when you have these shards coming through the ends of your toes.
[848] These crystals come out.
[849] And it's diet derived.
[850] You have too much tannins, for lack of the right word.
[851] Okay.
[852] Okay.
[853] It's usually like rich foods.
[854] It's alcohol, brown booze, lots of meat.
[855] And folks will get these crystals that form in the ends of their toes.
[856] And then as their toes try to reject it and get it out of the body, it kills.
[857] Ooh.
[858] Yeah.
[859] Okay.
[860] I think it somehow can affect your testicles.
[861] My old boss, Tommy Sapp, I can say this out loud because he's passed, rest in peace.
[862] Uh -huh.
[863] He had gout, and occasionally I would go into his office and he would have his testicles in a mason jar full of ice.
[864] Oh, my God.
[865] And he would just carry on a casual conversation.
[866] Yeah.
[867] After you get done sweeping up the thing, make sure you pull my truck around and get it washed.
[868] At California Pizza Kitchen?
[869] No, at a race.
[870] No, this one was 15.
[871] It was a race team.
[872] Oh, wow.
[873] Yeah, yeah.
[874] And they fabricated one -off cars for GM.
[875] Oh.
[876] Yeah, and he was from the South.
[877] Before he got into racing, he would extract alligators.
[878] He's from Georgia.
[879] So people would get alligators in their, like, residential area in their backyard.
[880] And he would get in a tree and drop out on top of them.
[881] Wow.
[882] Yeah.
[883] What a guy.
[884] Yeah, he was impressive.
[885] Oh, my goodness.
[886] So as you can imagine, there are no facts for Danny Kahneman because he doesn't make mistakes.
[887] And you didn't make any because you were mainly asking questions.
[888] Oh, I was scared of him the whole time.
[889] Yeah.
[890] I will say, I did make a note that we talked about Israel a fair amount at the beginning.
[891] And it seems a little tone-deaf, in light of what's going on, that we didn't address that there's a war happening right now involving Israel, but we recorded this before that.
[893] Yes.
[894] So I just wanted to make that kind of clear.
[895] Yeah, I'm glad you said that.
[896] It's tragic, all the stuff that's going on.
[897] Now, back to, I didn't make any errors, any blunders.
[898] You didn't.
[899] Because I was very intimidated by him.
[900] So what was fun, though, is I listened to Adam Grant, interview him.
[901] And Adam Grant is an expert as well in that field.
[902] Uh -huh.
[903] And so I listened to that and I thought, oh, yeah, even with all the info, it would be scary.
[904] Of course.
[905] He's a legend.
[906] He's the originator of all the info, virtually.
[907] And he won the Nobel Prize.
[908] He's got a Nobel.
[909] My God.
[910] You just totally set me up for a huge ding, ding, ding.
[911] Oh, wonderful.
[912] Because I think what I want to do, since there are no facts for Danny, Adam has a new personality test that he created.
[913] And I thought it would be a fun quiz.
[914] Oh, okay.
[915] We haven't done a quiz in a while.
[916] He created it with Brian Little, a new personality assessment combining the scientific rigor of the Big Five with the practicality of the MBTI and the Enneagram.
[917] Okay, so it combines a bunch of stuff.
[918] Synthesizes.
[919] That's right.
[920] I'm going to tell you the types before we do it, okay?
[921] Okay, great.
[922] And he has this cute picture on here.
[923] It's like a little archipelago.
[924] Oh, yeah.
[925] It's called the archetype archipelago.
[926] And you see how they all kind of connect.
[927] Okay, there's an individualist.
[928] There's a creator section.
[929] Within creators, there's artisans, inventors, adventurers.
[930] Then there's enthusiasts within that.
[931] Entertainer, promoter, impresario.
[932] Advocates, within that.
[933] Inspirer, coach, campaigner.
[934] Givers, within that.
[935] Helper, problem solver, peacekeeper.
[936] Producers, which have technicians, implementers, and investigators; architects, that have orchestrators, strategists, and planners; fighters, that have protectors, enforcers, critics; leaders, that have shapers, quiet leader, commander; and then seekers, which are explorers, thinkers, growth seekers.
[937] I'm going to guess what mine is right out of the gates.
[938] Okay.
[939] Because of one thing.
[940] Okay.
[941] I was listening to it and I was already getting offended.
[942] Oh, okay.
[943] Because I was like, you can have all these things.
[944] I don't like that it's broken into.
[945] Well, that's how all of these are.
[946] Which tells me I'm probably an individualist.
[947] Like the fact that I'm balking at the notion that I'm going to fit into a box, I'm guessing, probably has me in that individualist category.
[948] But you know what I'm saying?
[949] Like I'm hearing some of those things.
[950] Well, that's not me. That's not me. That's me. So there's... that's me in several of those bubbles.
[951] Sure.
[952] I like that guess.
[953] We'll see.
[954] Let's do it.
[955] Okay.
[956] I am sensitive to others' emotions.
[957] Now, this is a scale from disagree strongly to agree strongly and it's bubbles.
[958] So we'll say disagree strongly is one and seven is agree strongly.
[959] So here's what's tricky because is it someone I love and care about or is it strangers?
[960] I am sensitive to others' emotions.
[961] So the people in my bubble, I'm super sensitive to their emotions.
[962] Right.
[963] And I don't really care if a stranger is having emotions.
[964] But I think it's actually, it's more than caring.
[965] Are you sensitive?
[966] Like, can you pick up on?
[967] So even if it's a stranger?
[968] Yeah, then I would say yes.
[969] I'm very.
[970] So do you think you're all the way?
[971] Seven.
[972] Yeah.
[973] When engaged in projects, I want to get everything just right.
[974] Two or three, I guess.
[975] Three.
[976] Okay.
[977] I stay calm under pressure.
[978] All the way.
[979] I get stressed out easily.
[980] No. But not perfect.
[981] So a two or a six, whatever side that would be.
[982] Okay.
[983] I can usually achieve what I want if I work hard for it.
[984] Five.
[985] I always follow through on my commitments.
[986] Five.
[987] I am able to recover quickly from most setbacks.
[988] Four.
[989] Four.
[990] That's right in the middle.
[991] But closer to agree.
[992] No, it's literally right in the middle.
[993] Okay.
[994] But do you want me to do one more closer?
[995] Okay.
[996] When others around me feel sad, it makes me feel sad as well.
[997] Makes me feel angry.
[998] Yeah.
[999] So I'm going to disagree.
[1000] Okay, disagree.
[1001] The most meaningful way to succeed is to help others succeed.
[1002] I'll go four on that.
[1003] I love to get critical feedback.
[1004] I strongly disagree.
[1005] I hold people accountable for results.
[1006] I'd say four.
[1007] I make decisions based on facts, not feelings.
[1008] Six.
[1009] I have been told I'm a good motivator.
[1010] No, I don't think I've been told that.
[1011] Okay.
[1012] That's a very specific sentence.
[1013] It is.
[1014] Yeah, you're a great motivator.
[1015] I'll give myself a two on that.
[1016] Okay.
[1017] I excel at identifying opportunities.
[1018] All right, seven.
[1019] Once you become an adult, it is extremely difficult to change your basic nature.
[1020] Boy, this is tricky, because, like, is that my opinion or have I experienced?
[1021] I have changed my...
[1022] I think it's your opinion.
[1023] I do think it's really, really hard.
[1024] I've done it, I think.
[1025] Yeah.
[1026] So I guess I would give that a six.
[1027] Okay.
[1028] I like to be told what to do.
[1029] Oh, my God.
[1030] Zero or one.
[1031] I'm likely to show off if I get the chance.
[1032] Oh, yeah.
[1033] Seven.
[1034] People describe me as an inspiring leader.
[1035] Hmm.
[1036] Well, they describe me as a leader sometimes, but not inspiring.
[1037] So four?
[1038] I think people, a lot of people describe you as inspiring.
[1039] But this is for you to...
[1040] I know.
[1041] That's a hard one for me to think about objectively.
[1042] I guess, five.
[1043] Okay.
[1044] I don't hesitate to assume leadership in groups.
[1045] Seven.
[1046] I adapt easily to new situations.
[1047] No, terribly. I seek adventure. Oh, seven. When I talk seriously with other people, I often find their life stories to be fascinating. Seven. I am highly organized in my work. Um, four. I am highly systematic in my work. Two. I would probably make a good actor. Four. I'm teasing, I'm teasing, I'm teasing, I'm teasing. This is interesting, though, because, like, are your qualities ones that would make you a good actor?
[1048] Like, you are a good actor.
[1049] Now we know, but you have to kind of think of it as, like, if you weren't, are your qualities ones that you think would lead?
[1050] I think anyone who managed an addiction for a decade has to consider themselves a good actor if they largely got away with it.
[1051] Okay.
[1052] You know?
[1053] Like, if none of my teachers or my friends were concerned about me.
[1054] True, true, true, true.
[1055] I think that's probably some good acting.
[1056] True.
[1057] Okay, so you want me to do it?
[1058] Seven.
[1059] Seven.
[1060] Okay.
[1061] I am ready and willing to tell people what they did wrong.
[1062] Ooh, I do not like telling people what they did wrong.
[1063] I'd say a four.
[1064] I'll do it when it has to be done, but I hate doing it.
[1065] Others seek me out for advice on personal problems.
[1066] Yes.
[1067] Six.
[1068] I am easily discouraged.
[1069] No. Two.
[1070] I tend to avoid interpersonal conflict.
[1071] Two.
[1072] I want to find out how good I really can be at my work.
[1073] Four.
[1074] I occasionally hold back my opinions.
[1075] Oh, very occasionally.
[1076] How do I answer that one, Monica?
[1077] I definitely choose to not give my opinion sometimes.
[1078] Rarely.
[1079] But very rarely.
[1080] Yeah.
[1081] Yeah.
[1082] There has to be threat of emotional damage on the table.
[1083] So probably disagree, yeah, okay.
[1084] Okay, I get angry easily.
[1085] I disagree.
[1086] Even though you get mad when people are sad.
[1087] If you see me move through life, I'm not angry very often.
[1088] Okay.
[1089] I mean, real low percentage, especially from at least my experience with other males growing up.
[1090] I'd say it's low.
[1091] By learning from my mistakes, I have significantly changed the kind of person I am.
[1092] Seven.
[1093] When something confuses me, I often find it fascinating.
[1094] Mm -hmm.
[1095] Yeah.
[1096] Six.
[1097] I set specific goals for people to achieve.
[1098] Three.
[1099] Oh, three.
[1100] I don't think we've had a three.
[1101] Oh, yeah.
[1102] I am basically a rebel.
[1103] Seven.
[1104] I love a good fight.
[1105] Six.
[1106] Well, is it with a guy or a girl?
[1107] That's the problem.
[1108] I don't like fighting with women.
[1109] I love fighting with men.
[1110] Okay.
[1111] Okay, I'll leave it at six.
[1112] Six.
[1113] I'm often the life of the party.
[1114] Most of my life, yeah.
[1115] I'm retiring from that role.
[1116] So I would say six.
[1117] Okay.
[1118] I am able to keep my cool when others are losing theirs.
[1119] Seven.
[1120] I get upset easily.
[1121] Two.
[1122] I assume personal accountability for successes and failures in life.
[1123] Seven.
[1124] A core value for me is creating something new.
[1125] Hmm.
[1126] Five.
[1127] Six.
[1128] Six.
[1129] Okay.
[1130] Yeah, I'm trying to be unique and original and all that crap.
[1131] I want every detail taken care of.
[1132] Does that mean so I don't have to?
[1133] Because that I desire, but I'm not a, like, micromanager.
[1134] I'm not someone who.
[1135] I want every detail taken care of.
[1136] I don't know.
[1137] That's tricky.
[1138] Let's go in the middle.
[1139] Okay.
[1140] So it doesn't tip it.
[1141] I sometimes find it difficult to get down to work.
[1142] I go neutral on that.
[1143] Okay.
[1144] I'm always looking for experiences that challenge how I think about myself and the world.
[1145] Yeah, I go six.
[1146] I'm more a creature of habit than a bold adventurer.
[1147] Oh, man, I'm both.
[1148] Hmm.
[1149] You know?
[1150] Yeah.
[1151] You are, too.
[1152] Yeah.
[1153] I'm not a bold adventurer.
[1154] But you love going to Europe and you love going to New York.
[1155] But is that a bold adventure?
[1156] I don't know.
[1157] I mean, maybe it is.
[1158] I guess it's like it's all relative.
[1159] I'm not a creature of habit.
[1160] I'm home for seven days and I want to go somewhere.
[1161] So I'm going to go, whatever.
[1162] I want adventure, yeah.
[1163] Okay.
[1164] I was just thinking of my eating habits; they're very habitual.
[1165] Yeah, yeah. I like to keep a strict schedule instead of winging things. Disagree. It's important to me to avoid wasting time. That's weird, because, like, I don't have a schedule and I hate wasting time. I think you don't like wasting time. Yeah, so I agree. I hate wasting time. I usually go with my gut rather than spending a lot of time analyzing decisions. Well, in my older age... that's not true. I went with my gut. Now I fucking think about everything ad nauseam. Okay, so disagree. Disagree. I prefer to hide my failures rather than to have them exposed. Yeah, I prefer that. Yeah, agree. But you expose your failures a lot. Yeah, as a method of staying sober. Yeah, but I... I would, like, you know, I would hide things. Most of my life I would.
[1166] Yeah.
[1167] I don't know.
[1168] What should I do?
[1169] I don't know because how about we go neither, neither agree nor disagree.
[1170] Okay.
[1171] Okay.
[1172] I think you agree slightly.
[1173] Okay.
[1174] I'll... I trust you more than I trust me. I mean... I mean, you disagree slightly.
[1175] You prefer... not prefer, but, you know.
[1176] Yeah, I'm going to say, I'm going to decide.
[1177] Okay.
[1178] In different roles and with different people, I often act like very different persons.
[1179] It's slightly disagree.
[1180] I like having a boss who tells me what I need to do.
[1181] No, strongly disagree, yeah.
[1182] I'm not shy about telling people how good I am at something.
[1183] Oh, boy.
[1184] I kind of think he wrote some of these specifically for you.
[1185] Oh, fuck.
[1186] I mean, yeah, I do that.
[1187] I do it.
[1188] I strive to be as unpopular.
[1189] What?
[1190] No, fuck.
[1191] What?
[1192] The next level of my evolution would be to not ask for recognition for this.
[1193] But I just... I got step one done today, so now I'm going to ask for recognition. Can you believe I made it through that whole interview without saying I have an anthropology degree? Good. Yeah, great job. I wanted to. Oh, good. 25 times. I'm so glad. He kept bringing up anthropology and how he likes anthropology, and I never did it. I'm glad. Oh, it was painful. I was hoping you would observe that. What's so funny is I obviously didn't, but I would have.
[1194] Yeah, if I had said it, it would have driven you crazy.
[1195] I just would have been like, oh, boy, okay.
[1196] Yeah.
[1197] But also, because everyone already knows.
[1198] It's not like.
[1199] Well, he didn't.
[1200] That was obvious.
[1201] Yeah, I see what you mean.
[1202] Yeah.
[1203] Okay.
[1204] And I thought, wow, this guy likes anthropology.
[1205] He's going to like me more knowing that that's what my degree is in.
[1206] Motivation was there for him.
[1207] I wanted him to like me more.
[1208] Of course.
[1209] I strive to be as impartial as I can be in dealing with contentious issues.
[1210] Strongly agree.
[1211] I love to think of new ways of doing things.
[1212] Six.
[1213] I avoid drawing attention to myself.
[1214] One.
[1215] I tend to talk a lot in social situations.
[1216] Seven.
[1217] Achieving excellence is a driving principle in my life.
[1218] One.
[1219] One.
[1220] Really?
[1221] I am not a perfectionist.
[1222] I'm not someone who's trying to...
[1223] Like when you're on top, you're like, you want to win, and you want to win at games.
[1224] And you want to, like... I think that's not true. Well, I can tell you my thing with driving, which is very interesting. So in the world of driving, what I am uniquely great at is car control: getting the car out of control and then getting it back into control. That's, like, my specialty. Racing is very tedious and very technical. It is braking at this exact brake marker, turning in at this exact moment. That part I'm not great at. I'm good at it, but I'm way better... uh, my gift in that arena is I can go beyond the braking point and then recover the car. So I'm not... the excellence portion is what Daniel Ricciardo can do, like, the tedium of excellence. Yeah, I don't even pursue it, because at some point in my life I decided that's not what I can do. I can't focus on tedium and repetition. And it makes... like, that's why I don't want to golf. Yeah, that whole thing is, like, perfecting the swing.
[1225] Yeah, but you've, like, text me like, oh, I beat Jethro.
[1226] I want to win really bad.
[1227] Yeah, that is what this question is.
[1228] Like, I don't think you're fine with mediocrity.
[1229] You want the things you do to be good, and you want them to get some...
[1230] Yeah, I guess I just think of, like, Ken Kennedy.
[1231] Ken Kennedy has this.
[1232] And since I grew up as best friends with him, I'm so further down the spectrum than he is.
[1233] And then Aaron is beyond me. But I'm not, like... Yeah, when he and I would take an engine out of a car, he would label every hose.
[1234] Like the amount of time, the meticulousness he had because excellence was the bar.
[1235] And mine was like, if the car starts at the end of this, that's my bar.
[1236] Attention to detail is not the same thing as excellence.
[1237] Excellence is being the best.
[1238] Oh, okay.
[1239] Okay, then let's go five.
[1240] Okay.
[1241] That was like a negotiation when he settled on five.
[1242] Well, you had it at one, which is absurd.
[1243] That's just like so not right.
[1244] I guess you'd have to be in my mind.
[1245] Like my whole, the whole way I start writing is that I say I'm not going to try to write something excellent.
[1246] Like it's my whole.
[1247] The reason you are saying that to yourself is because you do want something excellent.
[1248] And you know that going in with that method is bad.
[1249] So you're trying to talk yourself out of that.
[1250] That's a good way.
[1251] That's a good angle.
[1252] Good angle.
[1253] I want my work to provide me with opportunities for increasing my knowledge and skills.
[1254] Five.
[1255] I just want to have fun at work.
[1256] That's my main motivation.
[1257] I like being the center of attention.
[1258] Can we go six on there because it's decreasingly so?
[1259] It's too late.
[1260] Okay.
[1261] No problem.
[1262] I can live with that because I don't care about excellence.
[1263] I love to break with convention.
[1264] Oh, yeah.
[1265] People think of me as a take charge individual.
[1266] Yeah, I strongly agree.
[1267] I never hesitate to call BS when necessary.
[1268] Strongly agree.
[1269] I like hearing other points of view that are different than mine.
[1271] Yeah, I do.
[1272] Six.
[1273] Helping others is a core value for me. Ugh.
[1274] Okay.
[1275] It's forced on to me. I mean, I...
[1276] Yeah, so it's not a core value.
[1277] Yeah, it's not a core value.
[1278] I had to adopt it.
[1279] I tell people the truth even when it hurts.
[1280] Six?
[1281] I like turning problems into opportunities.
[1282] Six.
[1283] I don't always finish what I start.
[1284] Hmm.
[1285] Two.
[1286] People describe me as an extremely dependable person.
[1287] I don't know. Yeah.
[1288] Logical reasoning is really overrated.
[1289] Intuition and imagination are more important.
[1290] Two.
[1291] I tend to make decisions deliberately rather than spontaneously.
[1292] Five.
[1293] Or four, whatever.
[1294] Middle.
[1295] Yeah.
[1296] I love a disagreement in which I learn and change my mind.
[1297] Five?
[1298] I love a disagreement where I change someone else's mind.
[1299] Oof.
[1300] I'm being honest.
[1301] I know, I know.
[1302] You have to.
[1303] This is a personality test.
[1304] Oh, my.
[1305] Oh.
[1306] You've made great progress.
[1307] What we've learned so far is you're insufferable.
[1308] Halfway through? We really are. How... how you prefer to think: creative. You're high on the creative so far, medium on deliberative, and low on detailed and reliable. Oh, my God, this is interesting. Oh. How you engage with others: high extroverted, high tough. You love that. Low nurturing. How you apply yourself: high composed, low flexible, high determined. When interacting with others, you have an outgoing nature and originality that can help you stand apart from the crowd. You're likely to state your thoughts candidly and directly. You may be less sensitive to and aware of others' needs or feelings, even despite your intent at times. When solving problems, you may be fascinated when solutions aren't obvious.
[1309] You are comfortable exploring new ways of doing things.
[1310] You can balance exploring new possibilities with the need to take decisive action.
[1311] When planning, you may favor stability and consistency, though may lean on others to provide the planning and structure to enable it at times.
[1312] You can have a strong desire to push things through that may compensate for less organization and orderliness at times.
[1313] You tend to be less caught up in operational details associated with the creating and executing plans.
[1314] That's kind of what you said about the Ken Kennedy.
[1315] Okay.
[1316] Continue answering.
[1317] That probably means halfway.
[1318] No, it's at 29%.
[1319] Okay.
[1320] Should we do a part two later?
[1321] Yeah, yeah, yeah, yeah.
[1322] And three.
[1323] Yeah, we'll do this in parts.
[1324] Okay, so now we are at the one third point.
[1325] Yeah, yeah.
[1326] And we'll resume next time.
[1327] Yeah, we'll get a real detailed conclusion about...
[1328] So far, I think that was pretty accurate.
[1329] From, I mean, mostly, mostly.
[1330] But I'm excited to know your final, like, what you are.
[1331] Do you think a personality test is just navel gazing?
[1332] Like, why do you need to find out what box you fit in?
[1333] Yeah.
[1334] I mean, are you just you?
[1335] You know what I'm saying?
[1336] Does it seem a little egomaniacal or self-centered?
[1337] Well, anytime you're thinking about yourself, it could be thought of that way.
[1338] I mean, I think it's interesting to know your strengths and weaknesses.
[1340] Yeah.
[1341] And know that there are some general, I mean, we all come in and out of all these categories, I think, obviously.
[1342] But there are people who are a lot like you and people who aren't.
[1343] And people who you are probably better off connecting with and people you aren't.
[1344] Yeah.
[1345] You're more compatible with.
[1346] Yeah, better pairing.
[1347] Yeah.
[1348] But I guess I'm imagining someone taking, like, here's what seems like a paradox about it.
[1349] Like, I'm aware of my character defects and my shortcomings.
[1350] I know I suck at a lot of things.
[1351] And I feel like if people were taking this test to discover what they were bad at, it would have required them to admit they were bad at it to begin with.
[1352] So they already know it.
[1353] I don't know.
[1354] Or someone probably is like, oh, I bet people say I'm a very inspiring leader.
[1355] And they're just out to lunch.
[1356] People don't say that about them.
[1357] Well, I guess you're hoping that people are thinking critically about the questions.
[1358] But, you know what I'm saying?
[1359] And it's a little bit of a like catch 22 because if you're aware enough of your your hiccups, then you're aware of them.
[1360] And if you're not, you're answering like, I strongly agree people think I'm an inspirational leader, but you're not.
[1361] And no one's saying that about you.
[1362] Yeah.
[1363] That's the catch 22 of it a little bit to me. Yeah.
[1364] You don't see that?
[1365] I do.
[1366] I just, I don't know.
[1367] I don't know what to say.
[1368] I think it's, I think they're interesting.
[1369] Oh, yeah, yeah.
[1370] Yeah, I was just having a kind of a global thought about them.
[1371] Like, I guess what I've experienced in the Enneagram feedback, and Kristen got really into it on this show she's doing, everyone was talking about it.
[1372] People will say, like, I'm a blank, and they declare it with this excitement that they've figured out what they are.
[1373] Yeah.
[1374] You know, the cynic in me is like, and now what?
[1375] Like, oh, okay, you're a 13 or you're a 1.
[1376] what does this mean?
[1377] How are you going to alter your life based on this information?
[1378] What is its purpose to have given yourself this label?
[1379] I question those things because I'm cynical by nature.
[1380] Yeah, and like maybe there doesn't have to be a purpose.
[1381] Maybe it's just for amusement.
[1382] Is it amusement?
[1383] I mean, I think it's to know, people like to know where they are in different categories and different types.
[1384] I mean, I agree with you and that was our whole Hufflepuff thing.
[1385] Like, I don't like that people are married to this idea of themselves.
[1386] Right.
[1387] I don't like that.
[1388] Yeah.
[1389] But I do think it's fun to be like, oh, there are these general traits.
[1390] Yeah.
[1391] And people do fall into these categories.
[1392] Like, it is interesting.
[1393] You know, we all think we're so unique.
[1394] Yeah.
[1395] Yeah.
[1396] But do you find that I think people are universally excited with the category they fell into, which is also just very interesting.
[1397] I think they are.
[1398] They are.
[1399] Yeah.
[1400] Yeah.
[1401] Which is interesting because it just, to me, says the categories have no value.
[1402] There's ones you don't want to be in and ones you want to be in, which doesn't exist.
[1403] Well, except I was... my Enneagram, and I forgot, I didn't love the one I got.
[1404] Okay.
[1405] I was like, oh, that wouldn't have been.
[1406] Hi.
[1407] Hi.
[1408] Sorry.
[1409] Danny wants to know how much for you.
[1410] Okay.
[1411] But you don't have to.
[1412] Two minutes.
[1413] Oh, great.
[1414] Fine.
[1415] Also, you're a one.
[1416] Oh, Kristen has come in and said, I'm a one.
[1417] What does that mean again?
[1418] Perfectionist or something bad.
[1419] Putrid.
[1420] What if it says number one?
[1421] Putrid.
[1422] Stinky.
[1423] Number one, brown skin.
[1424] One is conscientious and ethical, strong sense of right and wrong, teachers, crusaders, advocates for change.
[1425] Basic fear being corrupt or evil and defective.
[1426] Basic desire to be good, have integrity, and be balanced.
[1427] Oh, those are your...
[1428] That's pretty accurate.
[1429] Those are your principles.
[1430] Key motivations: want to be right, strive higher, improve everything, be consistent with your ideals, justify yourself, and be beyond criticism so as not to be condemned by anyone.
[1432] Examples, Confucius, Plato, Joan of Arc. Monica Padman.
[1433] Oh, Gandhi.
[1434] Oh, my God.
[1435] I like these people.
[1436] Beautiful. And Jerry Seinfeld.
[1437] Oh, my God.
[1438] Oh, shit.
[1439] Okay, I love it now.
[1440] You love it now.
[1441] Well, what is it called again, though?
[1442] The Enneagram.
[1443] The Reformer.
[1444] The reformer.
[1445] Yeah.
[1446] I could see it.
[1447] And I like it, but I also don't.
[1448] Like there are, but obviously I don't like the parts that I know about myself that are bad.
[1449] That are putrid.
[1450] Yeah.
[1451] All right.
[1452] Well, we're going to finish this personality test next time.
[1453] All right.
[1454] I love you.
[1455] I love you too.
[1456] Goodbye.
[1457] Follow Armchair Expert on the Wondery app, Amazon Music, or wherever you get your podcasts.
[1458] You can listen to every episode of Armchair Expert early and ad-free right now by joining Wondery Plus in the Wondery app or on Apple Podcasts.
[1460] Before you go, tell us about yourself by completing a short survey at Wondery.com slash survey.