Armchair Expert with Dax Shepard
[0] Welcome, welcome, welcome to Armchair Expert, Experts on Expert.
[1] I'm Dan Ariely, and I'm joined by Monica Padman.
[2] Hi.
[3] I always say Dax Shepard, and our guest is Dan Ariely.
[4] Yeah.
[5] So it just felt right to do it immediately.
[6] Dan is a social scientist, best-selling author, and professor of psychology and behavioral economics at Duke University.
[7] He's written a ton of wonderful books: Predictably Irrational, The Upside of Irrationality, Dollars and Sense, Payoff, and The (Honest) Truth About Dishonesty.
[8] He has a new book called Misbelief: What Makes Rational People Believe Irrational Things.
[10] So we get into the nitty-gritty on conspiracy theories, which is really fun.
[11] It was.
[12] And the funnel of misbelief.
[13] He really breaks that down, which is really interesting.
[14] Yes.
[15] And important, I think.
[16] Really enjoyed this interview so much.
[17] Me too.
[18] Yeah.
[19] Also, we're sick.
[20] Not in the interview.
[21] No, we weren't in the interview, but we are in the intro.
[22] And in the fact check.
[23] But don't worry, we weren't on the day.
[24] We're not contagious to you.
[25] Yeah, you can't catch it right now.
[26] But we're all dying inside this attic.
[27] That's okay.
[28] Please enjoy Dan Ariely.
[29] Dan, welcome.
[30] How are you doing?
[31] I'm very well.
[32] Oh, fun.
[33] We have a visitor.
[34] How exciting.
[35] Oh, I thought you were going to interview me.
[36] Thank you, buddy.
[37] You got this for me?
[38] No, I'm sorry.
[39] Oh, my God.
[40] Thank you.
[41] Good luck.
[42] I'll come to see right after, okay?
[43] That's my best friend.
[44] So she had to...
[45] I met her when they came earlier.
[46] Oh, you did, yes.
[47] She's a special one.
[48] Okay, where were we?
[49] Pleasantries.
[50] Pleasantries.
[51] I'm going to ask a really embarrassing question, but what state is Duke in?
[52] You don't know?
[53] No. North Carolina.
[54] Oh, that sounds completely right now that I've heard it and I'm embarrassed.
[55] Well, that's fascinating then, because you went to undergrad at Chapel Hill?
[57] No. So first of all, you asked me, how am I doing?
[58] Yes, how are you doing?
[59] Let's go on order.
[60] Okay, so let's start with that because yesterday was an unbelievable day for me. During COVID, I helped a little bit to write a drama series loosely based on my life about the university professor who also helps the FBI solve problems using social science.
[61] And yesterday, the first episode aired.
[62] Oh, get out of here.
[63] Oh, wow.
[64] So did you go to like, um, yeah, yeah.
[65] Viewing party or?
[66] We had a party and all the writers and some of the actors and all the people working behind the scene and it was very exciting.
[67] And to see, you know, it's not me, of course, it's a made-up character, but to see somebody playing a version of me on stage with some of my sentences and some of my preferences and so on was delightful.
[68] Is that your first taste with show business?
[69] Yeah.
[70] I participated in a couple of documentaries.
[71] This is like real, real show business.
[72] Where does it air?
[73] It's on NBC.
[74] Oh, okay, so one of the three big networks.
[75] And the show is called The Irrational.
[76] The Irrational.
[77] Oh, fun.
[78] Okay, so now back to, of course, undergrad is at University of Tel Aviv.
[79] My first X years, I go to graduate degree at UNC Chapel Hill.
[80] Then I also start studying at Duke.
[81] Then I graduate from UNC.
[82] Then I graduate from Duke.
[83] I go for about a decade to MIT.
[84] 98 to 2008?
[85] A decade.
[86] A decade, yes.
[87] Yeah.
[88] And then I moved back to the United States, to Duke, and I've been there since.
[90] Okay, now let's go back to first PhD is in psychology.
[91] Second one is in business at the urging of Daniel Kahneman.
[92] Yes.
[93] Okay, we had him on.
[94] How on earth did Daniel Kahneman encourage you to go learn about business?
[95] So it's not so much learning about business, but we are dealing with decision making.
[96] And decision making, I think, is a really important field, but the theory is not really very deep.
[97] We're more interested in changing human behavior than understanding why people behave.
[98] If you look at the world around and you say, you know, we have poverty and we have procrastination and we have violence and we have all kinds of terrible things, my main objective is to change behavior, to see what's wrong and what can we fix.
[99] And psychology is much more interested in the underlying processes.
[100] Sometimes it's important and sometimes there's a divergence.
[101] And at the time, Danny thought that my interests, and I think those of some other people in the field as well, would be better served in an applied discipline where we could tackle the problem.
[102] You know, if you think about this applied field of behavioral science, behavioral economics, behavior change, I'm happy to borrow theories from everywhere.
[103] Psychology, sociology, anthropology, philosophy, bring it on.
[104] If the goal is to change behavior, you're not married to one discipline.
[105] So from that perspective, I think he gave me very solid advice.
[106] I'm not sure it's true for everybody.
[107] But for me, I look at social science as a tool to understand and to fix something.
[108] But I'm not happy with just understanding.
[109] You know, I had the same frustration at the conclusion of my anthropology degree, which is I loved it, I loved learning it, and then I thought, well, what do we do with this?
[110] And in 2000, when I graduated, there weren't even really any anthropologists working for companies even advising.
[111] That's increased, I guess.
[112] But I remember thinking, well, great, now we have this great understanding of everything.
[113] Can we advise anything based on that?
[114] And really, no, was the answer.
[115] and I was frustrated by that.
[116] And, you know, for a long time, I think academics had this view that we write a paper and we have the conclusion and sometimes we say possible applications.
[117] And then we said, oh, somebody would read this paper, somebody would look at the possible application and I would say, oh, let's do that.
[118] Someone else would try it, not the actual scientist.
[119] But I think that over the years we've realized that there's a big gap between the possible application and actually being able to use it.
[120] And the question is, who is responsible for fixing that gap?
[121] Or even agreeing on what direction we should be, to use your term, nudging all this behavior.
[122] Who's to say?
[123] Yeah.
[124] Before we talk about the moral issue of who's to say, once you finish a paper, there's still a big gap until it becomes applicable.
[125] Should academics worry about this gap or should practitioners worry about that gap?
[126] And I think more and more that there's a place for some academics to worry about that gap and to try and make our findings more easily applicable for the people who want to actually go and do something.
[127] Yeah.
[128] Who would you think would ultimately be using what was discovered?
[129] Business?
[130] Would it be governmental?
[131] Everywhere.
[132] So every time people behave the right way, we leave them alone.
[133] And every time people misbehave, now we want to change something.
[134] And it's true for government.
[135] Think about something like texting and driving.
[136] If we try to do something about it without a lot of understanding about human nature, we might miss the mark.
[137] So for example, there was some data showing that states that created laws against texting and driving, with fines, actually had higher rates of accidents and mortality because of texting and driving.
[138] Why?
[139] Why?
[140] Because people, instead of texting over the wheel, they started texting under the wheel.
[141] Ah, of course.
[142] Yeah, that makes sense.
[143] So the issue is that governments set up regulations to basically get us to behave in a better way.
[144] And if they don't understand the mechanisms or what's causing us to behave, they might miss the mark.
[146] Right.
[147] The same thing is true for civic organizations.
[148] So I was just in Brazil the week before last, and I went to visit an organization that is trying to help people get out of poverty in the favelas.
[149] So favelas are these very poor, illegal living quarters, where they build these structures between a road and a factory.
[150] And there's a wonderful organization that is asking the question of, how do we help these people get out of poverty?
[151] When you look at the conditions, there's just a lot to improve.
[152] There's violence.
[153] There's domestic violence.
[154] There's poverty.
[155] There's low education.
[156] There's running sewage.
[157] They do have electricity.
[158] But outside of that, they're missing everything.
[159] And without an infinite amount of money, you say, what is the first step?
[160] What should we do?
[161] To get the biggest impact.
[162] That's right.
[163] If you have, let's say, $1,000 per family, where would the impact be?
[164] Not a simple question.
[165] You probably need a set of different people.
[166] You give $1,000 in many different ways and monitor the outcome?
[167] That's right.
[168] You don't want to do it randomly.
[169] You want to have some initial thoughts about what would be helpful.
[170] And then you want to try different things and see which approaches seem to have the biggest delta and which ones depend more on each other.
[171] So we said, you know, the government needs to think about it.
[172] Civic organizations also need to think about it because there's lots of things to fix and it's not clear where we need to fix things.
[173] And the same is also true for businesses.
[174] Well, some of our best psychological data sets come from like IBM trying to figure out what their employees do and what motivates them across the world, right?
[175] I see this referenced all the time.
[176] I actually have a lot to say about that topic.
[177] Oh, you do?
[178] Yeah, yeah, yeah.
[179] Let's do it.
[180] Let's go tangential.
[181] Because so much is based on that.
[182] The high disagreeability chart, right?
[183] That stuff comes from the IBM data.
[184] Let me just say one more thing before that and then we'll come back to this.
[185] But just to think about businesses, so one of my startups, and I love startups because it's a way to take ideas and implement them, we're trying to figure out how to improve the last chapter of people's lives.
[186] So think about the chapter between when people get a terminal illness diagnosis and the end of life.
[187] On average, it's slightly longer than five years.
[188] And now you say, how good are these five years and what's the potential?
[189] The reality is that we as a society are really missing the opportunity.
[190] When people get a terminal illness diagnosis, they often become patients.
[191] They stop being people.
[192] Right.
[193] Stop pursuing goals, stop setting appointments.
[194] And we did some research with 30-some death doulas.
[195] Doulas help usher kids into life.
[196] Death doulas help usher people out.
[197] And we asked them, basically, what's a good end of life?
[198] So not the five years, but the real end.
[199] The summary of what they said is no pain, no loss in the dignity of the body, feeling surrounded by love and having a legacy.
[200] Uh-huh.
[201] Kind of achievable.
[202] And I'd say, okay, so if you told somebody they have five years.
[203] You're reverse engineering.
[204] You need a network of people that will surround you with love.
[205] That's a quick timeline to get a network of people who will love you.
[206] Well, I've moved to a new city and had friends by the end of the year.
[207] Let's say the following.
[208] It's certainly longer than discovering in the last six weeks and trying to fix it.
[209] Yeah, that is true.
[210] But when I talk to palliative care experts, they say that they think that for the majority of the people, if done correctly, the last chapter of their life could be the best one.
[211] Like, imagine this impossible thought experiment: somebody gets a terminal diagnosis, you do everything you can for them for the next five years, they die, and then you wake them up. That's the impossible part. And then you say, pick one chapter of your life to repeat. How many of them would, under the right circumstances, repeat that last chapter? And they think the majority could, under the right circumstances. And the reason is that we often live an unexamined life; we pursue all kinds of goals and so on.
[212] And the conditions of a terminal diagnosis actually can, in principle, get people to reevaluate where they're going, fix relationships, think about legacy, do all kinds of things that are actually very, very good for them.
[213] I often say this is the gift of cancer as opposed to heart disease.
[214] My father died of cancer.
[215] We knew it was coming.
[216] All kinds of time to repair and dedicate that section of time to that.
[217] Had he gone out in a heart attack, he'd be like, well, that would have just happened.
[218] So in a bizarre way, there is some, some silver lining to that.
[219] It's complex, but, you know, the question is, could we do it without it?
[220] Like, imagine everybody who became 60 would say, okay, let me...
[221] Yeah, do your five-year end-of-life plan.
[222] That's right, but we don't.
[223] You've read Atul Gawande's book.
[224] Of course, a beautiful, similar exploration.
[225] Yeah.
[226] So anyway, that was a detour, but now should we talk about human motivation?
[227] Yeah, yeah, yeah.
[228] Okay, so I'm not going to talk directly about IBM, but I have been researching human motivation for a while.
[229] I had a chapter of my life where I would do lab experiments: bring people to the lab, do different things, and study motivation.
[231] Then I started working with companies one at a time.
[232] I would go to a company.
[233] I would change incentives, do all kinds of things.
[234] And then in this last chapter, about seven years ago, one of my friends came to me and he says, is it possible that some companies understand human capital, understand human motivation, better than others?
[235] So absolutely.
[236] And he said, is it possible that the companies who understand it better will have higher returns in the stock market.
[237] Absolutely.
[238] So we went on a search for data about how companies treat their employees and how the employees feel.
[239] Because you only need to put in an incentive for the company to have growth in the stock market to get them to implement some good culture strategies, maybe.
[240] That's right.
[241] So that's kind of the objective of that.
[242] We basically had this observation that every CEO goes on a stage and says, the quality of my people is the best thing I have, and so on.
[243] But then do they actually act accordingly?
[244] But we said if we find the connection between human motivation and stock market performance, maybe they will not just say it, they will start caring to a high degree.
[245] So we went on a long search for data.
[246] And we got data from some proprietary sources like this employee engagement surveys and some public sources like Glassdoor.
[247] And we had data going back to 2006.
[248] And we started thinking about what it is about the data from 2006 that could predict the stock market returns of 2007, and from 2007 to 2008, and so on.
[249] And the results were wonderful.
[250] We could see what kinds of things mattered in terms of employee human capital.
[251] Can I ask what the metrics are for the companies that would report?
[252] For example, we find that salary doesn't matter, but fairness in salary matters a lot.
[253] That makes sense.
[254] This is The Broken Ladder.
[255] We find that feeling appreciated is one of the biggest predictors.
[256] We find that psychological safety is a big predictor.
[257] We find that bureaucracy kills motivation.
[258] Meaning you can't implement an idea because there's so many hurdles.
[259] Bureaucracy does a few things.
[260] First of all, it says to the employees, we don't trust you.
[261] You want to go to lunch, that's fine, but we need 16 receipts and a video of the whole thing.
[262] Yeah, yeah.
[263] Then it basically communicates to the employee.
[264] We don't care about your time.
[265] The last thing is what you mentioned, which is to say, we really don't want to do anything different.
[266] We figured it out.
[267] We put it in writing.
[268] Here's the book.
[269] Read it.
[270] Everything that you try to deviate will be really, really difficult and complex to do.
[271] So we've proven that sentiment.
[272] And the people from J.P. Morgan, they have a very good group that does quant research.
[273] We shared with them our data, and they verified our results.
[274] But they said not only that, but it's a separate factor.
[275] So, you know, in investment, there's all these factors: momentum and large assets and debt.
[277] And they basically said this is not captured by these other things.
[278] Understanding human capital is a separate thing and you need to think about it differently.
[279] And I'll just tell you two more fun anecdotes about this.
[280] One is what do you think happened during COVID to the role that human capital plays in stock market return?
[281] Did it go up, down, stay the same?
[282] And the answer is it became more important.
[283] Think about the kid in seventh grade.
[284] If the kid is in the classroom, the teacher can have some control.
[285] Sit straight, don't talk to Johnny, put your phone down.
[286] When the kid is at home, the power dynamic changes.
[287] The kid can turn off the teacher.
[288] The same thing is true for adults.
[289] When the cats away, the mice will play.
[290] So how many of us during COVID have said in a meeting, oh, my internet connection is not that good.
[291] I'm turning off my video.
[292] So the workplace has a lot of implicit ways to try and get us to be excited and motivated.
[293] And at home, people need to have more intrinsic motivation when they work at home.
[295] If you think that the struggle is between extrinsic motivation and intrinsic, COVID made the intrinsic motivation stronger.
[296] And there's another interesting anecdote that came from this.
[297] There's an index called the She Index.
[298] And that index basically says, let's take the companies with the highest proportion of women on the board and on the executive suite.
[299] And let's invest only in those companies.
[300] And now the question is, how does that index do financially?
[301] Is this a good approach for investment or not?
[302] So what do you think?
[303] I don't think you'd be bringing it up if it wasn't positive.
[304] It'd better be.
[305] I can't imagine you would be out here saying that you shouldn't invest.
[306] Those stocks collapsed.
[307] Bye.
[308] So that measurement is not a good investment strategy.
[309] Oh, it's not.
[310] Oh, you are saying that.
[311] Okay.
[312] So I was wrong.
[313] That's not a good investment strategy.
[314] The real question is why.
[315] So I looked at my data.
[316] And in my data, it's not based on counting women.
[317] It's based on how people feel.
[318] Is the promotion fair, and is the salary fair, and all of those things.
[319] And when I do a delta for each company, how well do the men feel in this company versus how do the women feel, that difference is a good predictor of stock market return.
[320] By the way, it doesn't matter so much if everybody is happy or unhappy.
[321] It's the equality that matters.
[322] Now, why is their data not indicative of good stock market performance and mine is?
[323] And the reason, and that's the important thing, is that sometimes we measure silly things instead of the important things.
[324] And sometimes we measure silly things just because it's easy.
[325] So let's think about counting the number of women.
[326] Yeah, this is the easiest metric.
[327] Like, are they happy and satisfied out of 10?
[328] What the fuck does that mean?
[329] It's so arbitrary.
[330] Are they male or females generally pretty easy?
[331] That's right.
[332] If you say, let's count the numbers, it's easy to count and it gives us a feeling that it's objective.
[333] But we can have professions where it's mostly women, like teaching, nursing, and we treat all of them badly.
[334] It's not as if as the proportion of women increases, all of a sudden we treat them better.
[335] These are two separate issues.
[336] Proportion is not the same as treating well.
[337] And the second thing, of course, is that treating women at the top of the organization well doesn't mean that we treat all women well.
[338] So my data shows that treating women well is incredibly important.
[339] Treating everybody fairly is incredibly important.
[340] And the problem is that measuring something that is not capturing that can backfire.
[341] And my concern is that a lot of times we just do measurements that sound good.
[342] I didn't give you five minutes to think about this measure.
[343] If you thought about it for five minutes, you would recognize the shortcomings.
[344] But initial thoughts said, hey, sound reasonable, let's do it.
[345] But too many of our measurements are like that.
[346] And they don't actually go a little bit deeper and try to make sure that you tap the psychology of what would actually drive improvement.
[347] So women have to feel that they are being treated fairly, if we want them to be motivated.
[348] Numbers are not fairness.
[349] I would also argue you have to be inherently suspicious of any very simple approach to understanding a very complex system.
[350] The outcome of a stock market is one of the more complex issues to look at.
[351] There are millions of variables.
[352] There are supply chain issues.
[353] You know, you can go through the list.
[354] So the notion that we would quite simply boil it down to that, you should just innately kind of go, that feels a little too simple.
[355] Let's think about something that we find like feeling appreciated.
[356] If you say, I take this conclusion that feeling appreciated matters, I can draw for you a mechanism of how feeling appreciated gets people to produce much more at the workplace.
[357] And statistically, it would yield better results.
[358] Not for every company, not always, not immediately, but statistically it's the right approach.
[359] Being treated fairly, absolutely.
[360] So the real question is, what's the mechanism?
[361] How is this actually coming to play in this situation?
[362] Yeah, how do you implement that?
[363] Do you have the answer?
[364] How to implement?
[365] Yeah.
[366] I actually don't want to give an answer for that.
[367] And I'll tell you why.
[368] I think that every company should find its own answer, for example, to how to get people to feel appreciated.
[369] Let's say we came up with a recipe that says, oh, twice a week, say thank you to somebody.
[370] Right.
[371] First of all, it would not feel genuine.
[372] In the same way that every couple is going to find their own way to express romantic love and gratitude and so on, and you don't want a recipe for that, I think that we have this tendency to want to standardize everything, to have a checklist: here's the legal requirement for how to get people to feel appreciated.
[373] I don't think it's the right approach.
[374] If it's obligatory, it's innately...
[375] It becomes an obligation.
[376] It makes it feel non-genuine.
[377] And companies should figure out what's right for their culture.
[378] I would imagine, too, in the soul-searching, they're putting skin in the game.
[379] Taking the exploration may make you land on a principle or a value that you had maybe missed.
[380] And then the other thing is you would hope that companies will keep exploring it and will not stop.
[381] The moment you have a checklist, we've arrived.
[382] We've said twice a week, thank you.
[383] No more need to do anything else.
[384] Buckle up for this stock market surge, everybody.
[385] Go airplane shopping.
[386] It's also personal.
[387] Each individual person feels appreciated differently.
[388] Oh, totally.
[389] Also, if you think about this approach in general, it means that we need less command and control.
[390] So imagine a big company.
[391] And then you have direct managers with five people.
[392] You want to give each one of those direct managers freedom and autonomy to do the things that are right for them and for their particular employees.
[393] But of course, we are running against a very, very strong, I think, societal pressure to try and standardize everything.
[394] There's some legal reasons to do it.
[395] For sure.
[396] We complied.
[397] We gave the sexual harassment video.
[398] So we did everything.
[399] We can't be liable.
[400] We have legal.
[401] We have bureaucracy.
[402] Like, it's more convenient when everybody has the same.
[403] But we're actually paying a cost for that.
[404] So if you think about a company, just for simplicity, you have a hundred managers, each of them managing five people.
[405] If you give a standard procedure for all of them, it's very unlikely to be the right one for any of them.
[406] You might hit some average.
[407] Yeah.
[408] This is the myth of average.
[409] Yeah.
[410] But if you give freedom and flexibility and so on, you basically unleash human creativity and motivation, but at the cost of bureaucracy and maybe legal compliance.
[412] It's funny, we've definitely accepted, at least in the Western world, that that was the flaw of communism: you can't have centralized decision-making address all the concerns of a market over the span of a country. Yet we don't think that applies to an organization.
[413] Yeah.
[414] Okay, so your new book, Misbelief, What Makes Rational People Believe Irrational Things, is of great interest to me because we, I don't think to the degree you have, although I don't know, we have some protesters occasionally.
[415] Conspiracy theories, which we do a side show, debunking or exploring conspiracy theories, we're greatly interested in.
[416] We've also had guests on over the years that bring out all the conspiracy theorists.
[417] Prime example: Bill Gates. When he was on, the comment section was tens of thousands of people accusing him of having children in his basement.
[418] To get a firsthand experience with it, it's one of the more, for me, demoralizing feelings.
[419] It's actually, for someone who's not terribly fearful, the notion that the people around me might not be living in the same reality as me is deeply concerning and threatening.
[420] And above all other things, I'm actually more comfortable with psychopaths on the street than I am this other thing.
[421] And so you begin this book with a personal experience, which is fucking unbelievable.
[422] And I can't wait to have you roll it out.
[423] But it starts in COVID, yeah, in July 2020?
[424] So I think I'll comfort you a little bit and I'll worry you a little bit.
[425] Okay, great, great.
[426] So COVID starts and I feel I'm at the height of my academic career, because people are calling me from everywhere, governments, companies, asking what to do.
[427] You know, it's obviously a question of a virus, but there's also a lot of social science.
[428] What do we do with distance education?
[429] What do we do with paying people on furlough?
[430] What do we do with domestic violence?
[431] How do we motivate people to comply with these?
[432] Everything, right?
[433] And I feel amazing.
[434] I'm with two phones and my computer.
[435] Social science, I've said for a long time, is unbelievably important.
[436] And now it's your time to shine.
[437] It's amazing.
[438] And I feel useful.
[439] So we have a few months of this when I feel at the height of my career.
[440] And can I quickly ask?
[441] Because I feel like I was following this closely.
[442] Were you out in front of cameras and stuff?
[443] No, I'm just answering questions as fast as I can and talking to everybody in the world who wants to talk to me.
[444] And then at some point in July, I get an email from somebody I once helped.
[445] And she says, Dan, how have you changed?
[446] How did you become this person?
[447] I answered quickly, what do you mean?
[448] And I get a list of links.
[449] And I'll describe just one of them.
[450] In that link, it has a beautiful video, starting with me in hospital.
[451] When you're 17?
[452] Yeah, I was badly burned, 70% of my body, three years in hospital.
[453] Three years.
[454] Almost three years, yeah.
[455] Unreal.
[456] Burns are really, really tough, very tough.
[457] Yeah.
[458] Yeah, you're so, what, susceptible to infection?
[459] First of all, the body can heal itself better when there's good blood supply.
[460] But when you get burned and there's no good blood supply and it's such an extensive part, takes forever.
[461] And by the way, this happened many years ago.
[462] This happened when I was 17 and a half.
[463] I'm 56 now.
[464] I still, of course, have pain and difficulty and so on.
[465] But I still had surgery a few years ago.
[466] It's a tough issue.
[467] But anyway, this 90-second video shows pictures of me in hospital, tough pictures.
[468] And then it described how because of that, I started hating healthy people.
[469] I joined the cabal, Bill Gates, and the Illuminati in an effort to kill as many healthy people as possible.
[470] And that's why COVID is there.
[471] There was another one where they thought I was the consciousness architect of the COVID plandemic.
[472] Oh, they called it a plandemic.
[473] Yeah, wonderful.
[474] They really gave me a good role, right?
[475] Yeah, mastermind.
[476] Mastermind.
[477] Forgetting you're not a biologist.
[478] Yes.
[479] And then I try to argue, and I try to explain myself. Can I ask quickly, how ubiquitous was this?
[480] Like, if we had to put it on a scale, QAnon, we'll say, was a 10. That reached a 10. Where is this?
[481] How many people are engaged in thinking you were this?
[482] I know they put Fauci at a 6, right?
[483] Or a 7?
[484] Yeah, very hard for me to tell.
[485] Yeah, because you're inside of it.
[486] When all this is happening, I'm saying to myself, these people are clearly wrong and if they only knew they will change their opinions.
[487] And probably the illusion that you could correct them.
[488] That's right.
[489] And to my benefit, I would say I did call some experts to ask them, and they all said don't engage.
[490] I engaged.
[491] I couldn't.
[492] I couldn't.
[493] It was too hard.
[494] I was wrong to engage.
[495] I engaged for about a month.
[496] I called people.
[497] I invited a couple of people to my house.
[498] I talked to many online.
[499] It was terrible.
[500] I'm a slow learner.
[501] It took me a long time to figure out it was a bad strategy.
[502] But then what I did is I said, okay, this is a terrible, terrible, terrible situation.
[503] Something very strange is happening here.
[504] I don't understand this.
[505] And, you know, it's one thing to say people believe in X, but when they believe in X about me and I can provide them with evidence and nothing changes, that was a very intense realization of the complexity of the situation.
[506] So for the next two years, I spent time in some of the darkest corners of the Internet, and I talked to about 20 misbelievers regularly, and I tried to understand the psychology.
[507] You know, what was very clear from the beginning was that these people should not be discounted.
[508] It's not that they are different or less something or another.
[509] No, it's just people that somehow became like this.
[510] And that was the mystery of what got them.
[511] And let's say you said, oh, they are less X or less Y. That's a cheap answer.
[512] And it doesn't help us.
[513] It doesn't respect the complexity of it.
[514] Also, it denies that we ourselves are also in this.
[515] Exactly.
[516] That we're somehow different.
[517] And it doesn't help us fix it.
[518] So this book has three components.
[519] It has a little bit of my story.
[520] Most of the book is about this funnel of misbelief.
[521] It's about the way I characterize this machine that takes people and changes them.
[522] And as an analogy, if you think about the food machinery in the U.S., there's too much salt, too much sugar, too much fat, and you say, how is it changing us?
[523] In the same way, the funnel of misbelief is the machinery that comes from lots of different directions, that attacks our psychology in a way that gets people to change their beliefs in fundamental ways.
[524] And then the last chapter I talk about trust and what it means in a broader way.
[525] So if you say, we all have these people in our lives, that five years ago you said, we're the same.
[526] We believe in the same thing.
[527] We think in the same way and so on.
[528] And now you look at them and you say, are we both human?
[529] How can it be that we both look at the world and come up to such different conclusions?
[530] So you are both human.
[531] They just went through the funnel of misbelief.
[532] and it's important to understand what they went through.
[533] And the thing is, it's not just respecting the people, it's realizing that these misbeliefs that people adopt fulfill a real function in their lives.
[534] It's not that somebody wakes up one day, say, oh, I really want to believe that, you know, Bill Gates is a...
[535] Yeah, I need a new hobby.
[536] I'm going to go shopping for a conspiracy theory.
[537] Yeah.
[538] It's worse than that, because it's a trivial observation, but took me a while to get to it, is that people who believe in God, believe the world is basically a good place, with a little bit of devil from time to time.
[539] But the people who believe in conspiracies or misbeliefs, their lives are actually quite difficult.
[540] So misbelief is not just about believing in something wrong.
[541] It's about adopting a worldview that the world is out there to get us in some important ways.
[542] If you wake up in the morning and you feel that there's an initiative to put 5G into your kids, name it, whatever, that's a terrible way to live.
[543] I was on some radio show the day before yesterday and somebody after that wrote to me about some salmonella poisoning.
[544] You and I, we hear about some salmonella poisoning.
[545] We said, somebody made a mistake.
[546] Where did this process break down?
[547] What needs to be improved?
[548] That person read it as a signal of intentional poisoning.
[549] Well, they're looking for proof of evil all around them, and they're finding it.
[550] That's right.
[551] Just think about how tough it is to live like this.
[552] It's not a choice.
[553] Well, if you look at some of the definitions of depression, one of them is having a pessimistic view of the future as opposed to an optimistic one, which most people have, and they're stuck in a very pessimistic one.
[554] Worse than pessimistic.
[555] It's evil intention everywhere, and powerful in ways we don't see.
[556] And nobody else sees it either.
[557] It's a very tough thing.
[558] And I think they need a lot of empathy.
[559] You know, when you think about it, it's very tough to give it to them.
[560] It is.
[561] They are the people that need it the most, and they make it the most difficult to embrace them.
[562] Stay tuned for more armchair expert, if you dare.
[563] So you break it down, falling into the funnel, into kind of four categories.
[564] We have the emotional, the cognitive, the social, and the personality.
[565] So let's start with the emotional, and I want to give you one fun anecdote that I saw on 60 minutes, maybe 14 years ago.
[566] Presumably it was in the wake of the 2008 financial collapse.
[567] It was a story on the sovereign citizens.
[568] Have you ever heard of the sovereign citizens?
[569] No. They're a group of people who deny the role of the government.
[570] They won't carry licenses.
[571] They've been in tons of shootouts with law enforcement.
[572] They sue representatives.
[573] They have these frivolous lawsuits.
[574] They have a whole game plan.
[575] There's a lot of them.
[576] And as they investigated them, what they found that was almost universal among them, which was crazy, is that almost every member had lost a job in the last two years.
[577] And not just a job, a career.
[578] And I think that plays beautifully into what we're about to talk about.
[579] I don't use the word beautifully.
[580] Sadly.
[581] Okay, sadly and conveniently.
[582] Yes.
[583] That's right.
[584] But that principle, I've witnessed it in friends.
[585] when they've at least minimally felt disenfranchised or excluded.
[586] Now, if there's a system that you're excluded from and everyone else seems to be included, of course that system has to be broken.
[587] How could it be just?
[588] That's exactly the starting point.
[589] So the starting point is stress.
[590] We're not talking about stress.
[591] Oh, I'm so busy.
[592] I don't know how I'll manage to do this.
[593] We're talking about stress where you say, I don't understand the world and I'm not getting my share.
[594] Something is very broken and I'm suffering because of that.
[595] And now you need a story.
[596] And even cognitively, when we deal with uncertainty, we want a story.
[597] There's a need to explain things.
[598] But not only do you want a story, but you want a story with the villain.
[599] Of course.
[600] That will be somebody else's fault.
[601] You don't want a story where it's your fault.
[602] You want a story with somebody else's fault.
[603] And the final component, which took me a while to figure out, is that you want a complex story.
[604] Now, usually we think that people like simple stories.
[605] Why do you want a complex story?
[606] Because that gives you a sense of control.
[607] Usually the people who are in that situation feel like underdogs.
[608] Other people are more successful and so on.
[609] Now you say, it's not me. I want a story.
[610] It needs to be somebody else.
[611] And I want a story that would make me feel superior.
[612] I understand something that you don't.
[613] I'm not the inferior one here.
[614] I'm actually the only one that knows the truth.
[615] I'm the only one.
[616] And they use words like, I'm the sighted and you're the sheep.
[617] And it's an interesting thing to think about, okay, so we have this need for a complex story with a villain that would give us relief.
[618] Now, here's the interesting thing.
[619] So you have this state of stress.
[620] You have an urge to find a story with a villain that is complex.
[621] You find that story, and it's the moment you feel a relief.
[622] You deflected blame.
[623] The world is more understandable.
[624] You feel in control.
[625] But that relief is not long-lived.
[626] Because now, you basically said the world is a worse place.
[627] There's evil people out there.
[628] They're a little bit like a mosquito bite.
[629] You scratch it, you get initial relief, but it makes the problem worse.
[630] Drug addiction.
[631] Yeah, and then it gets worse and worse and worse.
[632] So that's the initial process.
[633] By the way, if you see people at that stage, that's a good place to try and stop it.
[634] Because if you wait too long, it will be very, very tough.
[635] Yes.
[636] I did just want to say, that part breaks my heart.
[637] Someone, anyone, would be feeling that everyone thinks they're stupid, and the notion of, no, I'm actually smart, is so appealing.
[638] That part is very heartbreaking to me. The whole thing is heartbreaking, right?
[640] The fact that we are creating the stressful environment that is creating the breeding ground for those beliefs.
[641] Yeah.
[642] When we leave people behind, this is what happens.
[643] And some of it was done in COVID without paying attention.
[644] We didn't say enough, look, we don't really know, but here is what we think right now.
[645] Right.
[646] Think about what it means to come up.
[647] One day you say, oh, surfaces are infected, and the next day you say they're not.
[648] It was a fertile ground.
[649] And we were all stressed, even if you had the best situation.
[650] And then the best inoculation against stress is resilience.
[651] And mostly we get our resilience from social support.
[652] We get it from people who love us.
[653] And again, COVID made it very, very tough to get any kind of deep social support.
[654] So we created these conditions.
[655] And if you go and look at some of the TV shows and programs that talk to people who are becoming misbelievers, you can see how they're trying to both create stress and then capitalize on that.
[656] I make the analogy to obsessive-compulsive disorder.
[657] Let's say I worry about how I look before I go out.
[658] I don't know, but I used to, and I wash my hands.
[659] It doesn't solve the problem, but it gives me a sense of control, some activity, and that reinforces itself, and we have obsessive-compulsive disorder.
[660] In the way in which people are looking at videos, let's say, of a villain, it's slightly similar.
[661] It's the same activity every time, but it's also different.
[662] And it's different because at some point I'll stop washing my hands.
[663] But with watching videos, there's no end.
[664] They lead to one another.
[665] I also don't watch the same video every time.
[666] The way that it works is that my cognitive framework for understanding the evilness of the world is expanding and expanding and expanding, and therefore it becomes much worse.
[667] And there's even algorithms to assist in that.
[668] You're getting more and more fundamentalist with each recommendation.
[669] That's right.
[670] So now we understand the starting point.
[671] and we have to admit that we as a society contribute to it, and sometimes we as individuals do too.
[672] Yes.
[673] And so a normal person is listening to that and they're thinking, well, this couldn't happen to me because I am more logical than that.
[674] Yes.
[675] Right?
[676] Well, we haven't touched logic.
[677] We've just touched emotion right now.
[678] Yes, yes, just emotion.
[679] But I think a lot of people might go like, oh, I respond to those feelings or I can imagine having those feelings I've had them in a different way, maybe.
[680] I think I have the type of brain that would inoculate me from this.
[681] Yes.
[682] So first of all, if you're that kind of person, I hope you'll never find out.
[683] Right, the hard way.
[684] There's a concept called scarcity mindset.
[685] The scarcity mindset is usually thought about with poverty.
[686] When you say people who are very poor and worry about where the next dollar would come from, they behave as if they have a diminished cognitive capacity, as if they lose IQ.
[687] And it's because part of the mind is always occupied.
[688] Will I be able to pay my bills?
[689] And will I be able to pay for dinner?
[690] And what can I do?
[691] And during that initial period when I was attacked, I got death threats almost daily for the first two years.
[692] And the last one I got three days ago.
[693] But still. Oh, I admire their stick-to-itiveness.
[694] I didn't do a test to see how many IQ points I lost.
[695] But during the day, I would basically manage to work.
[696] But part of my mind was occupied the whole time.
[697] Well, we know from neurology, your amygdala is probably far more online than your prefrontal cortex is normally.
[698] So you're having some actual biometric physiological changes upstairs as a result.
[699] Every ping, there's a little bit of worry.
[700] Maybe it's the next death threat.
[701] It's there all the time.
[702] And some people experience when they worry for a loved one or health or finances.
[703] But it felt like I lost a big component of my intellectual capacity.
[704] Stress changes us.
[705] And now comes the cognitive component.
[706] And broadly speaking, and there's lots of examples, the cognitive component is broken into confirmation bias, which basically says there's a huge range of information out there.
[707] And we choose to look at the part that conforms to our hypothesis.
[708] Anyone who's in a relationship, just take two seconds and think about the story you have about your partner.
[709] They're messy.
[710] And how all you see is the drawers opened up or the fridge left open or the container on the ground.
[711] But certainly the partner also put away about a thousand things, and you just don't pay attention to those things, because it goes against your story that they're messy.
[712] By the way, you could, in principle, change that in the relationship case.
[713] Yeah, you tell a different story.
[714] You could basically say, I'm only looking at the positive.
[715] Yeah.
[716] My partner's conscientious, and you'll start seeing examples of conscientiousness.
[717] And to say, the things that they don't do are charming.
[718] Right, right, right.
[719] Anyway, but we have confirmation bias, which is the world of information has a huge range, and we look at a small part of it.
[720] But the second component, which is more worrisome, is that even when we look at information that we disagree with, we have a way to bend it to our will.
[721] I would imagine when you were responding, this is what was happening immediately.
[722] That's right.
[723] They were exposed to information they didn't want, and they were just able to either discount it or ignore it.
[724] Build a new thing off of it.
[725] You almost give them fuel.
[726] Exactly.
[727] And then there's a third component, the relationship between our confidence and our knowledge.
[728] This is a Dunning Kruger effect.
[729] That's part of it, exactly.
[730] There's another frame of it called The Illusion of Explanatory Depth.
[731] Oh.
[732] Remember we had the woman on and it was like illusions of competency?
[733] Oh.
[734] Remember that?
[735] We talked about it a lot.
[736] Vaguely.
[737] Okay.
[738] So I'll give you an example of how I illustrated it.
[739] So imagine we ask people about a flush toilet.
[740] And we say, do you understand how a flush toilet works?
[741] Right.
[742] Yeah, yeah, yeah.
[743] And people say, of course I do.
[744] It's a flush toilet.
[745] I pull the handle, the poop goes away.
[746] The example she gave too is a helicopter.
[747] You asked people how helicopter works.
[748] And then I said to people, okay, wonderful that you understand how it works.
[749] Luckily, I have all the pieces here.
[750] Could you please assemble a flush toilet?
[751] Nobody in my sample could assemble a flush toilet.
[752] And you go back to them and you say, so how much do you understand the flush toilet?
[753] And they say, not so much.
[754] Yes.
[755] What's interesting about this, and it's true for helicopters and locks and zippers and you name it.
[756] It's true for things that have moving parts like locks.
[757] and also even more true for things that are invisible, like viruses or how democracy works or voting.
[758] And the interesting thing about this approach is that we don't tell people anything.
[759] I don't give you any new information.
[760] I basically say, explain to me what you think you know already.
[761] And all of a sudden, just by the need to provide a detailed explanation, people say, now I admit I don't understand it as well.
[762] As people get ready for Thanksgiving dinner and Christmas and New Year's Eve, we're going to meet people that have very different opinions than us.
[763] One approach to have a discussion is not to attack them, because the moment you attack somebody, they basically defend themselves.
[764] And in fact, they defend themselves so well that they don't hear your argument, they just mount the defense.
[765] Again, a different part of their brain is now working.
[766] Completely not paying attention to this, just thinking about what's my defense strategy.
[767] Now, if I don't attack you, I say, I'm with you.
[768] Just help me understand how does this exactly work.
[769] Give me the details.
[770] By the way, one nice other strategy is to say, what would it take to change your mind?
[771] That's a very lovely phrase that people use.
[772] You would know about these people.
[773] So I interviewed Robert Sapolsky, very intimidating man to interview.
[774] If you have a differing opinion, I certainly don't know what he knows.
[775] And I asked another smart friend, Adam Grant, I'm like, I don't really know what to do in this situation.
[776] I know intuitively I disagree with him, but I don't want to get in a battle.
[777] And he said, why don't you ask him if there's any evidence that might change his mind?
[778] It's a frightening thing to ask ourselves.
[779] Oh, big time.
[780] So that's the cognitive component.
[781] And then the other component is basically personality.
[782] And some people are more likely to go down the funnel of misbelief and some people are less.
[783] But just because somebody doesn't have the personality traits we'll talk about, doesn't make them immune.
[784] Right.
[785] It just makes it slightly more difficult to go down that path.
[786] So if you're creative doesn't mean you'll get there.
[787] And just because you're not creative doesn't mean you're immune.
[788] but those things do make a difference.
[789] Let's think about the process of adopting a misbelief.
[790] So the first component is basically trusting our intuition to a very high degree.
[791] This is any belief, right?
[792] Not just a misbelief.
[793] Any belief?
[794] Yeah.
[795] There's a very beautiful question called the bat and the ball that I'm sure you've heard about.
[796] No, you should explain it.
[797] Okay.
[798] So the bat and the ball.
[799] So here's a simple math problem.
[800] I'll say it, but don't answer.
[801] Okay.
[802] A bat and a ball cost together $1.10.
[803] The bat costs a dollar more than the ball.
[804] And now we ask people, how much is the ball?
[805] And there are basically two approaches.
[806] Most people, the first answer that comes to their mind is 10 cents.
[807] Type number two checks the answer.
[808] They say, well, it sounds like 10 cents, but let me check it out.
[809] If it was 10 cents, the ball would be 10 cents, and the bat would be a dollar more.
[810] That's $1.10.
[811] Together they're $1.20.
[812] Oh, but he said together they're $1.10, so it has to be 10 cents less.
[813] Each of them is five cents less.
[814] Maybe it's five cents and a dollar five.
[815] Oh, yeah, that works.
[816] Now, we're not talking high math.
[817] Anybody could do this math.
[818] The question is whether people check themselves or not.
[819] And there's some people who just, 10 cents sounds good.
[820] Let's go with it.
[821] Impulse.
[822] And some people say, let me check myself.
[823] When it turns out that the people who trust their intuition and follow it are more likely to go down the funnel of misbelief than are the people who question themselves.
[824] And you can imagine how it would work.
[825] You see a video describing X, Y, or Z. It sounds right.
[826] People who say sounds right and I'm adopting that answer are different.
[827] And the people who say, let me hold that belief for a while.
[828] Wait, we got to close that loop.
[829] What would the price of the ball be?
[830] Five cents and a dollar five.
[831] Okay, thank you.
[832] It's a dollar difference.
[833] Right.
[834] Hoof.
[835] Yeah.
[836] You got it.
[837] No, no. He explained it.
[838] I explained a little fast.
[839] I was the 10 cent person.
[840] My first thought was that.
[841] And then I was like, no, that's $1.20.
[842] And then I started going backwards into 90 cents.
[843] I'm like, well, hold on.
[844] Well, I was like, oh, it's 10 cents, but we're being tricked.
[845] So I know it's not 10 cents, but I don't know what it is.
[846] So 10 cents?
[847] That's right.
[848] That's great.
[849] And, you know, this is not the right condition to be asked math problems.
[850] The second attribute comes from a really interesting research.
[851] So do you know that some people believe that they were abducted by aliens?
[852] I love this example.
[853] This is my favorite story about misbelief.
[854] It turns out, like many misbeliefs, that there's some grain of truth in it.
[856] And the issue with people who feel like they were abducted by aliens, it turns out that when we have REM sleep, our mind transmits instructions to the body, save the princess, jump over the thing, slay the dragon, all kinds of things, but we don't move.
[857] Because the mind also paralyzes the body.
[858] The brain thinks it's sending instructions, but they don't get to fruition.
[859] This is called sleep paralysis.
[860] Sometimes some people wake up before...
[861] It's happened to me. It happened to you?
[862] You're one of the 8%.
[863] Oh, scary.
[864] Do you want to describe how it feels to you?
[865] Yeah, I mean, your brain is awake.
[866] You're cognizant of what's going, well, this was my experience, cognizant of what's going on around you, right, but you can't move.
[867] And you think you've become paralyzed somehow, I'd imagine, right?
[868] Yeah, you're stuck in your body, and you can't say anything.
[869] You can't wake your body up.
[870] But you know, like, I was in college, and so my roommate was at her computer, and I knew she was at the computer, but I couldn't say, hey, help, or do something.
[871] Yeah, it was scary.
[872] Yeah.
[873] So let's say it happens to about 8 % of the people.
[874] Not all of them say, oh, I was abducted by alien.
[875] That's the reason.
[876] What separates the people who, like you, say, something really strange happened to my body from the people who say, oh, I was abducted by aliens.
[877] I've just been returned.
[878] I wish I had thought of that.
[879] Thought of that.
[880] Yeah.
[881] It's way cooler.
[882] It is cooler.
[883] No. And one of the differences is that they connect the dots in ways that are not supported by the evidence.
[884] So imagine, for example, I read to you.
[885] a list of 10 types of fruit.
[886] And then I say, hey, let me read to you another list of fruit and tell me if they were in the first list or not.
[887] If you confuse new fruit with old fruit.
[888] Your example is if you said apples, bananas, oranges, and then in 10 minutes you said, hey, was apricots on that list?
[889] There is some group of people that would insert apricots into the list they remember.
[890] That's right.
[891] Do we know a percentage of people that think that way, or does everyone do that?
[892] Everybody does it a little bit.
[893] But the more you do it, so if you have sleep paralysis, you might become an alien abductee.
[894] Right.
[895] Right?
[896] Because what happened is there is a question of interpretation.
[897] So you wake up, you're paralyzed.
[898] What do you remember?
[899] How do you connect the dots?
[900] Are you adding things that were not there in the picture, but make the picture more complete?
[901] Well, you're looking for causality.
[902] So you'd likely start with, or me, I think of myself as highly rational.
[903] I would go like, did I eat something different last night, right?
[904] Like, first thing to be like, what anomaly occurred in my life that has resulted in this?
[905] You know, it's funny.
[906] That's your own experience because you have autoimmune.
[907] Yeah, and you understand that your body's really sensitive to food.
[908] So that's where your brain goes.
[909] Everyone's brain just goes by their own history.
[910] I wouldn't have thought that.
[911] Well, also if you are into sci -fi, I mean, I wouldn't have even thought about alien abduction.
[912] By the way, we didn't go into details of your description, but many people describe lights.
[913] I mean, there's other things that happened as the system changes from REM sleep to awakening.
[914] And if it happens in the middle of a dream, Life is much more confusing.
[915] So anyway, this ability to tell a story, thinking that more things happened than really did, to complete the story, by the way, could be a really good attribute for an artist.
[916] Creative, seeing dots.
[917] Oh, we should have said it much earlier when we were talking about stress.
[918] But let's just tell Monica about when people are under high stress and you show them a completely unpatterned, random assortment of dots, what happens.
[919] Because I think that's really relevant at this moment.
[920] Yeah.
[921] So we talked about what stress does.
[922] Our system wants to have an answer.
[923] So one way to do it is to show people what we call white noise.
[924] So white noise is sometimes noise like shh, but we can also have a picture of white noise.
[925] Think about a page full of black, white, and gray dots, just randomly scattered.
[926] And the question is, do you see patterns?
[927] And it turns out that the more stressed people are, the more they see patterns.
[928] Oh, interesting.
[929] Including people who go skydiving and they get extra stressed and they see more patterns just before jumping.
[930] And there are other things like that.
[931] Like if you think about what kind of superstitions people adopt.
[932] There was a study of some tribes that showed that tribes that fish in deep sea, more uncertainty and so on, adopt more superstitions than tribes that fish in the lake.
[933] Why?
[934] Life is more random, more chaotic and so on.
[935] We need a sense of control to overcome it.
[936] So we said stress is the breeding ground.
[937] The cognitive component helps a lot.
[938] The personality component, some people are more or less able to do it.
[939] Oh, one other personality trait that surprised me a lot is narcissism.
[940] It turns out that narcissists are more likely to go down the funnel of misbelief.
[941] And we're not talking about extreme people.
[942] Just think about your friends and think about the top 20 % of them who are slightly more narcissists.
[943] Narcissists love to get positive reinforcement from the world.
[944] And when they don't, they feel deprived.
[945] Now, think about what happened in a period like COVID, or when something bad is happening, they say, what's wrong?
[946] If we think about stress, narcissism basically amplifies stress.
[947] You are more expecting good things from the world than other people.
[948] So when it's not happening, you're much more likely to look for the villain.
[949] Yeah.
[950] By the way, one other random point is that there's a beautiful study that looked at many, many countries and looked at the percentage of people who believed in conspiracy theories around COVID, and they found that it's highly correlated with violence in that country.
[951] Again, why?
[952] What does violence create?
[953] Stress.
[954] And it's not as if we can separate stress.
[955] It's not as if we can say, oh, this is stress from my kids and this is stress from work.
[956] No, we get this overall stress.
[957] It could come from violence in the country.
[958] It could come from all kinds of things.
[959] There's a cortisol dump.
[960] It doesn't really matter why it dumped.
[961] The more stress you have, now the more the system needs a story.
[962] By the way, if you look at the correlation between violence and conspiracy theories, the U.S. is kind of above the line, in the sense that there's more conspiracy theories than would be explained by the amount of violence in the country.
[963] Oh, I have an armchair theory on that, which is I also think our individualistic capitalist society prizes those individual achievements more than other countries.
[964] So when you're not achieving and accomplishing the dream, it is more painful than it is in, say, a collectivist society.
[966] We can speculate, but I'll give you two other possibilities.
[967] Okay.
[968] Well, you've got to vote on that one.
[969] Is that okay?
[970] Yeah, yeah.
[971] Okay, okay.
[972] One is that as a country gets a higher rate of inequality, resilience goes down.
[973] So if you think that resilience is an antidote to stress, even at the level of a neighborhood, as inequality goes up, people are less likely to go to the neighbor and ask for help.
[974] So something about resilience changes.
[975] Did you read The Broken Ladder?
[976] That's a great book about income inequality.
[977] Yeah, and it talks about how just the perception of being poor, as opposed to being quantifiably poor, has more disastrous outcomes on educational achievement, health, everything.
[978] Yeah.
[979] And then the second explanation could be that the U.S. has a lot of violence that is not physical.
[980] So think about our partisan system and the aggression that we have on the left and right that wouldn't be quantified as violence, but could have related effects.
[981] So we said stress, cognitive personality, and then the thing that kind of seals the deal is the social element.
[982] And there's kind of three subcomponents in the social element.
[983] And the first one is ostracism.
[984] The first guy that started research on ostracism has this beautiful story.
[985] He says he goes to the park with his dog and all of a sudden he sees two people playing frisbee.
[986] One of them, by mistake, throws the frisbee and it falls next to his feet.
[987] He picks it up, throws it to one of them.
[988] And to his surprise, they throw it back.
[989] And for a few minutes the three of them play, and he feels quite happy playing with them. And then they stop throwing the frisbee to him, and he describes how he felt rejected. And then he goes and says, let me recreate that experience. So let's invite people to the experiment, and let's have two collaborators, two people who look like participants but they're not, and let's ask everybody to wait. And have two conditions. Condition one: the three people, the participant and the two collaborators, wait outside. They have a ball and they start tossing it around between the three of them.
[990] And they play for 10 minutes and then they bring them in and do the experiment.
[991] In the second condition, they start playing the three of them.
[992] But after five minutes, they do what they did to him in the park.
[993] They stop tossing it to the participant.
[994] They just, the two collaborators play between themselves.
[995] So now we have a control condition, three people played.
[996] An ostracism condition, three people played for five minutes, and one person, they stopped throwing it to him.
[997] How does that person feel?
[998] And the results are amazing.
[999] I mean, amazingly bad.
[1000] Yeah, yeah.
[1001] It affects well -being.
[1002] It affects optimism.
[1003] It affects willingness to help other people.
[1004] It affects dishonesty.
[1005] It affects donation to charity.
[1006] It really changes people.
[1007] And we're talking five minutes of being ostracized.
[1008] Now, this is a part where, as I realized that, I have to say that I have unintentionally ostracized some people.
[1009] When we are exposed to somebody that's just starting to be a misbeliever, I think the tendency is to crush their misbeliefs.
[1011] But of course, when we try to take this approach, we ostracize them.
[1012] We think, oh, this is just a small comment, and I'm saying it partially joking and so on, but they view it as a really big comment.
[1013] And if we think about how do we help our friends and family, so we said, you know, we want to give the feeling of resilience, but we want to stop ostracizing.
[1014] We've all done it, and it's very tempting to do, but we need to hold back on that.
[1015] I would say from the anthropological lens, what you're immediately doing for that person is creating us and them.
[1016] And once you are them, we all know we're all capable of all manner of atrocities against them.
[1017] And we have no compassion for them.
[1018] It's one of the strongest things.
And then ostracizing for 280,000 years we were here meant literal death.
[1020] And we know it.
[1021] We need a tribe.
[1022] If we don't have a tribe, we're not eating, we have no water, we have nothing.
We know in ourselves that this is life or death.
[1024] Very, very strong.
[1025] And now it leads to the next component.
So imagine somebody who's ostracized.
[1027] And exactly what you said, what would they do now?
[1028] They would go and look for a group that gives them the social support that they need.
They need community so badly at this point.
So there was one post. The misbelievers in COVID thought one day there will be Nuremberg 2.0 trials for the crimes against humanity during COVID.
And this guy wrote a very long post about my specific crimes against humanity, and he ended the post questioning whether I should get life in prison or public hanging.
[1032] Those are two options on the table.
[1033] And there were about a thousand people who responded.
[1034] Oh, wow.
[1035] If you just read the tone of their responses, it would look like these are people that are gathered together to solve poverty and to do something wonderful or maybe plan a party.
It was all so positive.
[1037] There were so many loves and people appreciated his writing stuff.
[1038] and the wisdom, if you just look at the amount of support, it was amazing.
[1039] Yeah.
Now, if you look at whatever discussion group of academics, there's no love, and nobody ever deserves to win anything. In those groups, once a day somebody deserves the Nobel Peace Prize; people get all kinds of accolades.
[1041] And again, it's not for nothing.
[1042] They need that, right?
[1043] You see a behavior.
[1044] You don't want to say, it's for nothing.
[1045] You say, was it fulfilling?
[1046] And it's fulfilling a deep need to feel connected and, loved and appreciated and so on.
[1047] And these groups are presenting it in a very extreme way.
[1048] So the social element starts with some exploration of misbelief.
[1049] Then ostracism.
[1050] Then people go to these groups.
[1051] They get support.
[1052] But now it goes a step further.
[1053] Now you want to basically display your loyalty.
[1054] You want to be above the fold.
You want to prove it.
[1056] You're serious.
[1057] So what do you do?
[1058] You want to express a more extreme opinion.
The word we use for this is shibboleth.
The story is a story from the Bible. There were two tribes that had a serious fight, and at the end of the fight, one settled on one side of the river, one on the other. But there were still people roaming around after the fight, and when they were trying to figure out who was whom, they asked them how they pronounced the word for this plant, because one tribe said shibboleth and one tribe said sibboleth. So now I show you the plant and I say, how do you say this? If you say it the right way, my way, I say, welcome home. If you say it the wrong way, I maybe try to chase you or kill you or do something else.
When you think about this term, the term is not about whether I care what this plant is called.
[1062] I care about your identity.
So we use the term shibboleth now to indicate that a person is basically signaling their identity.
[1064] They are proclaiming their identity with the shared knowledge.
[1065] And it's not about the truth.
[1066] Am I tribe A or tribe B?
[1067] Am I on your team?
[1068] That's right.
And now think a little bit about this in our current political discourse.
[1071] Sometimes people say crazy things.
And when they say a crazy thing, you say, do they really mean that?
Or is this a shibboleth?
[1074] Oh, this is the great question of the last 10 years.
[1075] Are they in on it or not?
[1076] Especially in the beginning.
[1077] Because later on, I think something changed.
[1078] But let's think about the first time somebody says something.
[1079] Are they talking about the truth here?
[1080] And I'm telling you a piece of information.
[1081] Or they're saying, you think it's about a piece of information.
[1082] But I'm saying something ridiculous on purpose.
[1083] Because if I say, something standard, no issue.
[1084] I need to say something extreme in order to proclaim my identity.
[1085] Look how serious I am.
[1086] By the way, it's not political in a sense that only Democrats do it or only Republicans do it.
[1087] It's really across the aisle where we say things that are proclaiming our identity.
[1088] And of course, once people say it once, and maybe they have to defend themselves, and then they say it again.
[1089] And all of a sudden it could become the accepted truth, even though it was never meant to be that way.
And now we can go more and more.
[1092] So another very important part of the social process is cognitive dissonance.
[1093] The original story from Festinger is that there was this woman who many years ago proclaimed that the earth would be destroyed.
Only she and her followers would be saved, because aliens would come and take only them, and that's it.
And Festinger was very curious to see how her followers would feel the day after, when...
[1096] The proclamation didn't come true.
[1097] That was his assumption.
[1098] Yeah, very arrogant.
And pre-Festinger, you would say the die-hard believers, the ones who gave up their homes and sold their properties and gave everything to charity and said goodbye to everybody, they would be the most disappointed.
And the next day when they wake up and the earth is still there and no alien came, they would just say, I am leaving.
[1101] And the people that were on the fence, you would say, they never believed it so much.
[1102] They wouldn't be too disappointed.
[1103] Maybe they would even stay.
But Festinger said the thing that would happen would be the opposite.
He said the die-hard ones, the ones that were supposedly the most disappointed, they are the ones that gave up too much.
[1107] And because they gave up too much, they can't confront this dissonance.
[1108] They can't say, I gave up so much for a fake guru.
[1109] Yeah, and I was wrong.
[1110] And he said that they would actually strengthen their belief in her.
[1111] And that's what he saw.
He saw that the people who were on the fence said, not a true guru, and went home.
[1113] But the die -hard believers, actually went out to recruit more people after that.
[1114] Now, if you think about what it means psychologically, it's basically that our actions drive our beliefs.
[1115] We usually think that our beliefs drive our action.
[1116] I'm a vegetarian.
[1117] Here is what I do.
[1118] But the reality is that it goes both ways.
[1119] And sometimes our actions drive our preferences.
[1120] One of our favorite sayings is it's easier to act your way into thinking different than think your way into acting different.
[1121] Yeah.
[1122] That's the essence of cognitive dissonance.
[1123] And the bigger, the action is the more powerful it is.
[1124] So now think about what it means in terms of our story about misbelievers.
[1125] It means that once people have invested in the social media and demonstrations, whatever it is, how can they go back?
[1126] And indeed, that's what we see.
[1127] We see now COVID is basically over.
[1128] But what we see is that the action is not diminishing.
[1129] It's moving to questions about the environment.
[1130] It's moving to questions about central currency.
[1131] Yeah.
[1132] Yeah.
Stay tuned for more Armchair Expert, if you dare.
So there's the New York Times Rabbit Hole podcast series. I don't know if you've listened to it, but it's fantastic. You see how many people that were QAnon believers started as Occupy Wall Street, and then when that ends, they have to land somewhere. And so the great question now is, where'd all the QAnon people land? Because they didn't, quote, come to their senses after the collapse of that. They're just somewhere else, and likely probably more extreme.
[1134] Yeah.
[1135] I've looked very hard to find people who came back.
[1136] I found very, very few.
[1137] I'm hoping that we'll be able to understand it better one day.
[1138] But statistically, when you look at the numbers of people who say, oh, it turns out we were wrong.
[1139] Yeah.
[1140] Let's go back.
[1141] It's a very, very hard journey.
[1142] So they adopt all kinds of things.
[1143] And the last part of this whole thing is about trust.
Really, that's the right perspective to view the whole thing.
Because when you say, oh, let's say somebody believes the earth is flat, I don't care so much.
[1146] Let's say you dated somebody who believes there's no COVID and you invited them to meet your parents.
[1147] Maybe there's some risk there.
[1148] If somebody believes the earth is flat, they're not going to change the shape of the earth.
[1149] But of course, it's not just about that belief.
[1150] It's the belief that there's something that's being hidden, then NASA is in on it, that every school is in on it.
[1151] The story is deeper than just the earth is flat versus not.
[1152] And once people adopt the perspective of misbelief and start looking at everything in a negative way and as a confirmation to the fact that they think that the world is an evil place out to get them, it doesn't go away.
[1153] It expands.
[1154] And that's how you can move from one misbelief to another misbelief, why they expand to be more and more.
[1155] And we're just starting a new election season.
The last election season was not so good in terms of misbelief.
[1157] And the one that we're starting now, we're not starting from zero.
[1158] We're starting from whatever we left last time, maybe even worse.
[1159] Well, let me ask you this.
[1160] This is what I sometimes ruminate on, and I guess it's the only optimism I have.
[1161] Let's take the phenomena that you just laid out, that you have to get more and more extreme.
[1162] It's almost like entropy, right?
[1163] It's going to enter a phase where there's nowhere else to go.
[1164] So kill everybody.
[1165] Everyone sucks, but we get to whatever the nth degree of this is, hang you in public.
[1166] There's not much more to say after that.
[1167] We're going to run out of extreme, right?
[1168] So is there any hope in the notion that people also fatigue?
[1169] When I look at the polarization that has existed over the last 15 years, I guess the only thing that makes me optimistic is I think people are fatiguing of it.
[1170] Just like every other thing that's cyclical and we're passionate about and then it dissipates.
[1171] My only hope is that we do fatigue out of these things.
[1172] I don't see us fatiguing out of this because I think the cognitive emotional shortcut is not to get out of it, but to stay in it.
Let's say if you look at the percentage of Americans who are okay with their kids marrying somebody from the other side of the political spectrum, these numbers are changing, and not for the better.
[1174] Do you remember a ballpark of what they are?
[1175] How many people who are opposed to that?
[1176] I think it's slightly less than 50%, but close to 50%.
Now, if you think about that and you say somebody gets fatigued, I think they would just stay with their opinion rather than be open to changing it.
So I don't think fatigue is a good mechanism to bring change.
I'll tell you what I do think.
[1181] So we started with whether this conversation is going to be optimistic or pessimistic.
[1182] So I hope it gave you some optimism in a sense that when you look now at some of your friends or other people, you say they're not different.
[1183] They just had a very unfortunate set of events and I understand now better how they got there.
[1184] I think the problem is bigger and more complex and more painful than what I thought when I started the book.
[1185] The book proposal had a chapter on solutions.
[1186] I don't have that chapter.
I have these things called hopefully helpfuls throughout the book, but I don't have a solution.
[1188] I think there's lots of things we need to do, but there's lots of them and there's not that much time.
[1189] Minimally, I would say we would have to agree that upriver is the solution, which is as we track, let's say, meth addiction, your odds of recovery if you're snorting meth is X percent.
[1190] If you're smoking it, if you're shooting it, you're done.
[1191] It's like a 5 percent, you know, it's so low that you can't really spend your energy trying to intervene with someone that's shooting it.
[1192] You've got to start with someone who's snorting it.
[1193] And I think that has to be the case here is like figuring how to identify when people feel ostracized, having some kind of system in place.
[1194] Yeah.
[1195] And that's why for me, the metaphor of the funnel is very good, because you say, okay, catch people early and figure out what we need to do.
[1196] So there are things we could do as individuals, social support, what would it take to change your mind, the illusion of explanatory depth, not ostracize.
[1197] There are things we have to do on the social networks side.
[1198] The fact is that I don't know about you, but when social networks started, I thought this was an amazing innovation that would propel humanity forward.
[1199] I think I was deeply wrong, and maybe there was some turns in the middle that could have been different, but we're in a very different world.
[1200] We need to fix it.
[1201] So I look at this and I say, the optimistic side is if we recognize the size and the complexity and the urgency, we can act.
[1202] We just need to decide that this is a priority.
[1203] And I think it is because if you look at every problem in society, I think every one of them is worse because of misbelief.
[1204] Imagine there was COVID -23 that started the day after tomorrow.
[1205] How well are we set to deal with it together as a country?
[1206] Worse.
[1207] Yeah, yeah, yeah, yeah, yeah.
[1208] When COVID started, I called some people that I know in pharmaceutical industry.
And I said, why don't you start showing the scientists who are working day and night?
[1210] Why don't you start documenting this?
[1211] I said, I hope at some point this will be kind of the shining moment where the pharmaceutical industry could save the world.
Could be a complete rebranding of Big Pharma.
[1213] That's right.
[1214] And of course, everybody was too busy to do anything and the priorities were different and so on.
[1215] But I think it's a very missed opportunity.
[1216] I think if people had a live camera in many of the labs and they would see the way that people were working tirelessly to try and come up with things and fail.
[1217] And, I mean, it's just a heroic story, but it's not the story that is being told.
The story that is eventually being told has none of the tears and sweat and effort and ingenuity.
[1220] And eventually the story has been taken from them and played out in a very different way.
[1221] Yeah, in a very, very political way.
[1222] Well, it's incredible.
[1223] Now, I need your help.
[1224] I have an inherent problem.
[1225] I love your book.
[1226] I think you're brilliant.
[1227] Anyone who listens, who happens to know the history is going to ask why I would believe your book because obviously you've had some controversy and some past work.
And it is my goal to acknowledge that and try to alleviate and or justify why we would take this book seriously.
[1229] Yeah.
[1230] I can't not bring that up.
[1231] I am happy to talk about anything.
[1232] Okay.
[1233] So you did a bunch of work on nudging.
I wouldn't call it nudging, but I did a bunch of work on changing behavior.
[1235] Okay.
One of them being the line you sign at the end of a questionnaire, the one that says, I promise these were all truthful.
[1237] If you move that to the top of the questionnaire, you'll see that people are more truthful.
[1238] And then a lot of stuff was built on the back of that.
[1239] That's right.
[1240] And so one of the three experiments in particular, and correct me as I go, I'm just learning of this.
[1241] I was bummed to learn this.
[1242] That's not what I want to learn.
[1243] Was this data that came from a health provider?
[1244] It came from a car insurance company.
[1245] And the thought was they have to report their odometer readings at some point.
[1246] And so this is a great opportunity.
There's going to be 13,000 people asked to do this.
With half of them, we'll tell them to be truthful at the top, and with the other half, to be truthful at the bottom.
[1250] It was reported that it was very effective.
[1251] Yep.
[1252] And the insurance provider has come out since and said that wasn't our data.
We only turned over 3,700 examples, not 13,000.
[1254] The paper was since removed.
[1255] A lot of critics came out.
[1256] So I think, A, I'm not a shame monster.
I'm a big addict who's relapsed in public and came out and said it.
[1258] So I'm not interested in shame at all.
[1259] I am interested in what the experience is of losing credibility.
And I'm also curious if you think even understanding a lot of these psychological principles that we both understand inoculates you from doing them as well.
[1261] So first of all, let me set the record straight in some important ways.
[1262] Okay.
There was a paper that I got the data for somewhere around 2007, 2008, that we found out later was falsified.
[1264] We actually found out about it two years ago.
[1265] I acted all along in a very trusting way with the data.
[1266] I shared it with people.
[1267] I noted possible mistakes that happened with it.
[1268] I shared it publicly.
[1269] I did everything a scientist would do, believing that the data was real data.
[1270] And it turns out it wasn't.
[1271] Just to be clear, I have never thought, imagined, or acted in any way that is dishonest.
[1272] Just not my interest in any of this.
[1273] But a big, terrible mistake happened.
[1274] I certainly think that I have some responsibility.
[1275] Very sad that it happens, but it happened.
[1276] Hold on, though.
[1277] I just got to push back a little bit.
A mistake is, I accidentally read that wrong.
[1280] I was given this and I should have sussed out.
Claiming you had 13,000 bits of data when there was only 3,000, that doesn't feel like a mistake.
[1282] My mistake was that I trusted this data.
[1283] But are you claiming you trusted the data that came from the insurance provider or from your team who was analyzing the data?
[1284] This is a very strange story and it's a story about a very long time ago.
[1285] And it's very hard for me, I've been trying for two years to understand what happened.
[1286] But somewhere along the way, by the time I had confidence in this data, some very bad things happened to the data.
[1287] We would agree someone manipulated it.
[1288] Yeah, absolutely.
[1289] I'm saying it was falsified.
[1290] Yes.
[1291] Okay, great.
[1292] The data was falsified.
[1293] There's no question about it.
[1294] There's also, in my mind, no question that I didn't do it.
[1295] Right.
[1296] But it did happen.
[1297] In retrospect, I can say, oh, you know, I could have been thoughtful here.
[1298] I could have run this analysis there and so on.
[1299] But the reality is I trust lots of people.
[1300] I work with lots of people.
[1301] Also, because of my injury, I can't type.
[1302] There's all kinds of things that's very hard for me to do.
[1303] There's lots of things I don't do myself.
[1304] That you outsource.
[1305] Something very bad happened under my watch.
[1306] I take responsibility.
[1307] So the data was falsified.
[1308] The finding about signature upfront and signature on the bottom holds.
Since then, for example, we re-ran all kinds of experiments.
Anyway, so that's about the finding.
[1311] Something very bad happened.
[1312] I deeply regret it.
[1313] I go back and forth of what could I have done differently.
[1314] The one thing I could do is replicate the study.
[1315] Which I read was not achievable.
[1316] No, no, no, very much achievable.
[1317] I can say.
[1318] There's the example of this scientist that worked with you guys went to Guatemala to help with the taxes.
[1319] Yes.
[1320] So I don't know how much you want to go into this, but that study in Guatemala used a very different approach and a very different method.
[1321] And it's true, it didn't find it.
[1322] But there's lots of nuances about this.
[1323] But if you look at what the British government did, if you look at what the U .S. government did, there's lots of other things.
[1324] And I've also replicated it.
[1325] The results do work.
[1326] It's more nuanced as a story.
[1327] But signing on the top does change honesty.
[1328] Right.
[1329] You ask me also, how do I feel about this?
[1330] Yeah, yeah, like let's just say that, again, I've done this.
[1331] I've erred enormously and come out and dealt with it.
[1332] And that's a very specific experience.
[1333] It's also a specific experience to, and this happens a lot, I think, for people at the top, the boss, right?
[1334] You are expected to sometimes remove yourself from the job, to take full responsibility.
I don't know all the details, but this happens a lot, where you have to leave because of something that happened underneath you, because it's expected.
[1337] Like the admiral of his ship, you're saying.
[1338] Yeah, and that is ultimately tricky.
[1339] That one doesn't interest me, if I'm being honest, because then you're not culpable.
[1340] My hunch is you're culpable in some way, and I want to know what it's like to publicly confront being culpable in something.
And if you think, in retrospect, oh, I think I know what I succumbed to.
[1342] I understand how the mind works.
[1343] I've been studying it forever.
[1344] What did I do that mirrors what I know?
[1345] I had the experience of being accused by the misbelievers for a while before this happened.
[1346] So I had some kind of both extra sensitivity and some inoculations, because death threats are a different category than angry professors.
[1347] Yeah.
[1348] But of course, one of them is my reference group, the people that I feel belong to.
[1349] So I would say when I found out about the data falsification before it became public, I was distraught.
[1350] It was terrible to know that something like this could happen under my watch.
[1351] I tried as much as I could to understand how it happened, and I basically did not sleep.
But my expectation in that period, and I called many of my friends at the time, and I said, look, when you get data from postdocs or PhD students or other people that you work with, how often do you run tests on it?
How often do you check whether the data was falsified?
You know, we run all kinds of tests on the data, but checking whether it was falsified requires very different tests.
[1355] And everybody said, you know, we actually don't do it very much.
[1356] The level of trust is incredible.
[1357] Like if you think about how much trust is running society, it's everywhere.
[1358] Oh, sure.
[1359] Absolutely.
[1360] Yeah.
And I actually, in my naivete, thought that when this would become public, the field would take a deep look inside and say, this is something we all do.
[1362] There's a level of trust that we have.
[1363] And what are some of the mechanisms that we should do?
[1364] And I prepared for that because I talked with my lab.
[1365] I talked with some people and I started creating algorithms that would help us do it in a standardized way.
Not everybody has to reinvent the wheel, but we say, okay, if we're now going to basically run some algorithmic checks, let's do it together.
[1367] But when it came out, that was not the response.
[1368] The response was very personal and very, very different.
[1369] I realized that academics are people.
Do you think the reaction would have been softer if you hadn't been so celebrated as a behavioral scientist?
[1371] Of course.
I think I saw that last year there were about 2,000 papers that got retracted.
[1373] It's not that it never happens and it shouldn't happen.
[1374] I mean, we should work very hard to do it, but there are also lots of car accidents.
[1375] Bad things happened, but the personal attacks were very hurtful.
[1376] I don't know if you realize, but most academics are on the left side of the...
[1377] Oh yeah.
[1378] Of the map.
And there's something very cancel-culture-like, that the left celebrates eating our own.
[1380] We love to eat our own.
[1381] But then in the same way that when I look at the misbelievers, I don't try to blame them.
[1382] I look at the system and I look at human nature.
[1383] I still try to look at it in the same approach.
[1384] And I say, here is the world they live in.
[1385] Yeah.
[1386] I guess it would be really tempting for you because in many ways you've been ostracized in that moment.
[1387] And we know what happens to people that are ostracized.
[1388] So how do you combat that?
[1389] You can think about that this book is my attempt to find a villain.
[1390] Lovely.
[1391] It took me a long time to get to that.
But here I was, villainized. Stress, scarcity mindset, death threats.
[1393] And like everybody else, I need a story.
[1394] And I need a villain.
[1395] This book is really my attempt to find a villain.
[1396] And the villain I found is a combination of human nature and media and some other things in society.
The villain is vestigial evolution, I think, at all times.
It's like we're designed to live in a way we don't live anymore.
So this book really is my complex story about what is happening in the world and why I got to be where I was.
[1401] And when I look at whatever academics and so on, I look at it in the same way.
I basically distance myself a little bit and I try to understand how this process is working.
[1403] So it's not that I don't take things personally, but most of the time I look from the side with kind of an amazement and interest.
[1404] Do you worry?
[1405] Because when we talked at the very beginning about the end of life and the things, that you...
[1406] Legacy was the fifth thing.
[1407] That's what I was about to say, that that has had an impact on your legacy?
[1408] No, I'm not worried.
[1409] I'll tell you why.
[1410] I think that it's very easy to make very loud accusations in the short term, but I still believe that in the long term, the right things would come out.
[1411] So if I was going to end my life in the next week, I would say, okay, maybe things are not balanced the way I want.
But if you say I have at least five more years, things have a way to crystallize over the long term.
And I feel good about myself, because as much as I look back and I try to find out, you know, do I have any memory of ever doing something that I think was ill-intentioned?
[1414] I have zero memories of things like that.
[1415] I feel very confident as a positive contributing human being.
[1416] I wake up every day with a desire to help.
[1417] I don't doubt myself.
And there is this thing that is happening, but I think it has a cycle.
It has social media, it has some other things, but I'm quite confident that reality will eventually win.
[1420] This has been lovely.
I don't know if you would enjoy this, but when I was going through that, a saying I heard that gave me great comfort was, I did the thing they said I did.
[1422] I'm not the thing they say I am.
[1423] I really liked that.
I have a weird one, though. Okay, so I don't want to make this equivalency, and I don't even know if we're allowed to talk about this, but we know someone who's just recently, quote, in trouble, a comedian, who it's come out that part of his special is made up.
[1425] Like the stories are made up.
[1426] But the truths that he's getting at are real.
[1427] And there's a question of is it okay for him to have fabricated some stories or pieced together stories in order for a real truth to be exposed.
[1428] I think there are people on both sides of that argument, especially in comedy, obviously everything's exaggerated, so it's a weirdly specific niche, but I am curious.
[1429] I think about my role, I'm kind of a strange social scientist.
[1430] I publish paper.
[1431] I do everything that other people do, but I've set my own path.
[1432] I think part of it is because of my injury and my limitation.
[1433] I can't do what other people do sometimes, so I do what's easier for me. For example, I have a lot of pain in my hands.
[1434] And when I travel, I type less.
[1435] So it's easier for me to go and give lectures than to type.
I re-engineer my life in a way that works for me. And it's different than what works for other people.
[1437] So I've created my own map of what I think I can do in this world in a way that works for me. It's a new path.
[1438] I don't have examples of other people who've done it.
And maybe I've made some choices that I haven't thought about and are mistakes.
[1440] What is the role of a social scientist?
How much of it is to be an activist, and how much is to publish papers, and are general-audience books like Misbelief part of what we do or not?
[1442] What about all the people who write me and ask me for personal advice?
[1443] Should I answer them all and say, look, this is not my field.
[1444] I don't know what to do or do I have a responsibility?
[1445] For me, academics, what exactly do we get paid for?
[1446] What exactly is our profession?
[1447] I think our profession is not limited to teaching in class and is not limited to writing papers.
[1448] I think, for example, I feel an obligation.
[1449] Somebody wrote to me today and asked me a question about their father and hearing aids.
[1450] Not my area.
[1451] I know something about it.
[1452] I thought you were a cochlear implant expert.
[1453] Not my area, but I did read a couple of papers on this topic a few years ago.
[1454] Do I ignore this email?
[1455] Do I respond?
[1456] Where do I stand?
[1457] And I feel that we have some kind of social obligation.
[1458] We have some kind of social contract with society.
[1459] I think academia is a very, very special place.
[1460] And we get this incredibly interesting job with a lot of freedom and tenure and all kinds of things.
[1461] But I don't think it's one -sided.
[1462] I think we owe something to society.
[1463] So I say yes to every government that wants me to talk.
[1464] Because I'm Jewish and Israeli, I say yes to every Muslim country, with the exception of Iran.
[1465] But I say yes, as much as I can to hospitals because of my history.
[1466] And I answer people.
[1467] I don't know what are the rules of engagement.
[1468] What's the boundary?
[1469] Nobody's giving me examples.
[1470] And I try to make it up as I go.
[1471] And among those are things that annoy other people.
[1472] But it's okay.
[1473] Yeah, it's okay.
[1474] Well, Dr. Ariely, this has been so fun.
I hope everyone gets Misbelief: What Makes Rational People Believe Irrational Things.
[1477] And then I want to close with this.
Sometimes it takes me the whole time, two hours, to figure it out, but I just got it.
[1480] Oh, wow.
[1481] You ready?
[1482] I'm going to tell you what celebrity you look like.
[1483] Okay.
[1484] I'm hoping it's been said to you before.
No, nobody has ever said I look like a celebrity.
[1486] Paul McCartney.
[1487] Wow.
[1488] Oh, I could see it.
[1489] Your fucking eyes are identical.
[1490] I can see it.
[1491] It was driving me nuts for two hours.
[1492] You know what?
When I was very young, somebody said it.
[1494] Okay.
[1495] You did it.
[1496] Impressive.
[1497] It's a big victory here.
[1498] I'm going to take this.
[1499] It's the high point of this.
[1500] Dan, this was really a pleasure, and I thank you for also adjusting your time.
I was touring a school with my daughter, and I knew I wouldn't be back in time.
[1502] No problem.
[1503] All right.
[1504] So thank you so much for joining us.
[1505] Thank you very much.
[1506] Stay tuned for the fact check so you can hear all the facts that were wrong.
You're wearing another Burberry piece, for the audience.
[1508] Yes, the second one I got.
[1509] That's very nice.
[1510] And it's very autumnal.
[1511] Very intumnal.
[1512] We're both sick.
[1513] Yeah, I finally got my mom's cold.
[1514] Mm -hmm, mm -hmm.
[1515] I got your mom's cold, too.
[1516] Yeah, somehow I got your mom's cold.
[1517] Or I got what the girls had while I was out of town, and there was some residual.
[1518] Yeah, I could have also got that.
[1519] So last night, whiskey, you know, he's so needy whiskey.
Our three-legged dog.
[1521] Yeah, yeah.
[1522] He just, I think he just wants to pass.
[1523] Like, I think he wants to be lying next to me and then just gently pass.
[1524] Yes.
[1525] You do?
[1526] I do.
[1527] So young.
[1528] Well, he is young and I don't want him to pass.
Let me be very clear that that's not my agenda for him.
[1530] Well, let's just say this.
[1531] He's got the worst depression of any creature I've ever seen.
[1532] He spends the entire day in bed.
[1533] And then at night after he's got to get his TV time with me, like laying in bed next to me while I watch TV.
[1534] And then I take him out to go pee -pee.
[1535] Then I put him in his house downstairs in the first floor.
[1536] So did all that last night.
[1537] But I hadn't been home very long and he didn't like that.
[1538] So I took him out to pee, put him in his house.
And then he started barking in his crate.
[1541] He didn't get enough TV time.
[1542] He did not get enough, nor did I even turn it on.
[1543] Okay.
[1544] He didn't get any TV time.
[1545] None of the cues were there.
[1546] So anyways, I let him sleep in my bed last night.
[1547] He was sleeping on my neck, on my face on my shoulder.
[1548] But you let him sleep in the bed?
[1549] Yes, that's what I ended up doing.
[1550] Because I was exhausted because I was cold.
[1551] I have a cold.
[1552] I wasn't chilly.
[1553] And he was all over the place.
[1554] But all that was manageable.
[1555] And then at 5 a .m., he decided to give himself a full bath.
[1556] I just wake up to him licking his arms and every part of his body.
[1557] He's just cleaning soup to nuts, nose to tail.
[1558] He's cleaning himself.
[1559] And it wakes me up.
[1560] And I can't.
[1561] There's no way to get him to stop.
[1562] Yeah.
[1563] So I take him finally at 5 a .m. down to his house, put him in there, go back upstairs, try to go back to sleep.
[1564] It's not going to happen.
And decided to go hiking.
So I went hiking at 6:45, and I had to really boogie so that I'd be back in time to take the kids to school.
[1567] And then I really was out of time to choose this outfit.
[1568] That was a long -winded way to tell you that I think I probably have a pair of pants that matches better than this.
[1569] So stay tuned for this piece to get a little more amplified.
[1570] Do you have black jeans?
[1571] I just ordered black.
[1572] Wow.
[1573] Because of Beckham.
[1574] Black denim.
[1575] Well, it would go good with that piece.
[1576] Oh, I can't wait to pair it.
[1577] Yeah.
[1578] Great tip.
[1579] Thanks for the pro tip.
[1580] You're welcome.
[1581] So, that story is one of those stories.
[1582] Do I have any bugs on my face?
[1583] No bugs.
[1584] Okay.
[1585] Yeah.
Have you had a grasshopper on your cheek?
[1587] I've been great luck.
[1588] When I walked here, a bug flew in my face.
[1589] Oh, really?
[1590] And it was on my face.
[1591] So sometimes you don't know when there are bugs on your face.
[1592] That's true.
[1593] That's true.
[1594] I just had to check.
[1595] And now I feel like it's covered in bugs.
[1596] No, no bugs.
[1597] The story you told about whiskey is one of those stories that I just find so nice, but I find it unsettling.
[1598] So nice, but what?
[1599] Are you going to go nude?
[1600] No, no, that was an accident.
[1601] My piece is too hot.
[1602] As much as I want to wear my piece, it's too hot with the cold.
[1603] Okay, you were saying the story about whiskey.
[1604] Yeah, it's really sweet.
[1605] It's such a sweet story.
[1606] But I hate it, yeah, because it's unsettling because I feel, I feel like I don't know you.
[1607] Oh, really, when I do something out of character like that.
[1608] It's not terribly out of character, though.
[1609] You know that when Kristen was out of town, I would be very nice to all the dogs.
[1610] If someone's around being nice.
[1611] But not sleep in your bed.
[1612] You never do that.
[1613] No, I don't.
[1614] But he's, well, he's like negotiating with a terrorist because he'll bark.
[1615] He'll just sit down there and bark.
[1616] If I had all night to play that game, but I wanted to go to bed, I was very, very tired from my cold.
[1617] Sure.
[1618] And too busy of a weekend.
[1619] It was out of my own laziness, I think.
[1620] Okay, so you're still you, I guess.
[1621] Oh, yeah.
[1622] That sucks, too, that I'm still me?
[1623] No, I want you to be you.
[1624] I'm still very much me. I do have an issue with people making changes.
[1625] You do.
[1626] It scares you because you'll think that you won't be a part of whatever new version of it.
[1627] I guess so.
[1628] Yeah, you will be.
Even if I start sleeping with dogs and stuff, we'll still be best friends.
[1630] Are you sure?
[1631] Yeah, positive.
[1632] Okay.
[1633] I think how much changing I've done since I left Michigan and Aaron and I are still best friends.
[1634] Yeah, that's true.
[1635] I mean, I got sober.
[1636] That was a big oopsies.
That was a big divergence from the path.
[1638] But that did cause a bit of a rift for a little bit.
[1639] Well, it really, well, it didn't.
[1640] It didn't.
[1641] Like when I went home, I still hung out with him every time I went home.
It was harder for him to hang out with me because he had to keep his shit semi under control while I was there, and that was challenging for him.
But I wasn't really privy to that.
Like, yeah, I would see him and we'd go out to lunch, and it was lovely.
Now, had he been sober, he would have wanted to come visit more.
And so, sure, it drove a wedge in that way.
Yeah.
Um, but I don't know if I, I wouldn't blame it on that.
I just, I don't think anyone had much time with Aaron.
When I was walking here, yeah, I was listening to your favorite podcast.
No. New song.
[1643] New old song.
Well, 1989 (Taylor's Version) came out.
[1645] Yeah.
[1646] And there's a re -record of Bad Blood with Kendrick Lamar.
[1647] No way.
[1648] Yes.
[1649] I love that he did that for her.
[1650] So cool.
[1651] Yeah.
[1652] Yeah.
[1653] And also shows how much respect she has.
[1654] Uh -huh.
[1655] For sure.
[1656] In the music industry.
[1657] Yeah.
[1658] It's very cool.
[1659] And I love him so much.
[1660] Me too.
[1661] Ugh.
[1662] And so when we were walking, I mean, when I was walking, when me and the bugs were walking.
[1663] Mm -hmm.
You, Taylor, and the bugs?
[1665] I was listening to Bad Blood, and it reminded me that I just learned my blood type.
[1666] Oh, wonderful.
[1667] What is it?
[1668] O positive.
[1669] Ooh.
[1670] How does that work?
[1671] You can receive O negative.
[1672] Uh -huh.
[1673] And O positive.
That's it?
[1675] That's it.
Now, if you're AB positive, is it universal acceptor?
One of them is. One of the ABs is a universal acceptor.
I bet it's AB positive.
[1679] Is that what you are?
I don't know what I am.
[1681] Oh, that's what I wanted to know.
[1682] Do you have any depression attached with the cold?
[1683] Because I do.
[1684] I mean, it's low energy.
[1685] Everything's just a bit low energy.
[1686] Is that what you mean?
[1687] Well, tipping into depression.
[1688] Oh, no. Yeah, yeah.
[1689] Because you can't like, well, you hiked.
[1690] I did.
[1691] I forced myself.
[1692] I'll keep doing stuff.
[1693] Huh, that's interesting.
[1694] Yeah.
[1695] Do you think it's because you like lay a lot, you know, when you're sick, you have to lay.
[1696] I'm pretty exhausted and I don't seem to be able to catch up on my sleep all that well.
[1697] Yeah.
[1698] And then I don't know, just a bit of depression.
[1699] I'm sorry.
[1700] But you're not having that with your cold.
[1701] I'm just wondering if the cold, the symptoms are depression.
[1702] No, I don't think so.
[1703] I don't have depression.
[1704] I just have exhaustion, but that doesn't mean depression for me. Right.
[1705] What are your other symptoms?
[1706] This is very private.
[1707] Oh.
[1708] Remember when I was saying, I've now told the story 25 times, but it was definitely on the episode with Jada Pinkett Smith, which is the time I got in a huge fight with somebody and then hung up the phone.
I was very horny.
[1710] Oh.
[1711] Remember that story?
[1712] I remember that story.
[1713] Yeah, everyone does.
[1714] You've told it a lot, actually.
[1715] I know.
[1716] I admitted to that.
[1717] Yeah, yeah, yeah, yeah.
[1718] Everyone will be familiar with that story.
[1719] So I think when I'm so uncomfortable and depressed, I have had a lot of horniness.
[1720] And it's like a swirling with grumpiness and agitatedness.
[1721] Oh, my God.
[1722] Can anyone relate to that?
[1723] Rob, do you get horny when you're sick?
[1724] No, not really.
[1725] Okay.
[1726] Yeah, my brain is just like you need relief.
[1727] I also thought, I mean, I resist saying this because I don't want people to be scared.
But I also even thought like, oh, I need opiates to get through this, you know, which of course, I'm not in search of.
[1729] But it crossed my mind.
[1730] Like, I need relief.
[1731] Wow.
[1732] Yeah.
So the horniness, I think, is just one of these other techniques my brain uses to get me out of the discomfort.
[1735] If I'm going to go to the doctor and I'm going to say, I think I have a cold and they'll say, what's your symptoms?
[1736] I'll say depressed and horny.
[1737] And he'll probably or she'll probably say, that doesn't sound like a cold.
[1738] Yeah, that's right.
Yeah, it definitely doesn't.
[1740] I was going to say something important.
[1741] Oh, no. Yeah, it looks like you got.
[1742] Yeah.
[1743] Matthew Perry.
[1744] Exactly.
Oh, because I said opiates. Yes.
[1746] Yeah, yeah, yeah.
[1747] It's really sad.
[1748] It is.
[1749] It is.
[1750] It just feels like this mortal life is too hard for some people.
[1751] And that's so heartbreaking.
[1752] It is, yeah.
[1753] I think if people have sober people in their life, they'll have some experience with this.
[1754] I think us folks in recovery can seem a little crass during these moments or?
[1755] insensitive.
[1756] Insensitive.
[1757] There's something that I know from all of my sober friends, the news carrying through that circle and then my other non -sober friends.
[1758] And on the surface, it's really ironic because we have been the recipients of great compassion and forgiveness.
[1759] And you would think we would be that much more poised.
[1760] But there is something weird that happens when you're trying your hardest not to do something.
And then when someone goes and does the thing we're not supposed to do and they die, in some bizarre way, it recommits you.
[1762] And so, like, you immediately use it to fuel your own recovery.
[1763] And so I just noticed we seem to have a different reaction.
[1764] Well, of course.
[1765] And I would, I guess I would, I would just, I would really parallel it with, like, the dudes that are in Afghanistan.
[1766] Like, they have a certain reaction to their fellows getting killed in that situation.
[1767] And there's, it's not the same reaction that people that are.
[1768] Yeah.
[1769] Because it happens all the time, you know?
[1770] Yeah.
[1771] It happens all the time.
[1772] And it's, it is such a bummer.
[1773] But you know what it is?
[1774] For the civilian, it goes straight to just heartbreak, which is super appropriate.
[1775] It is heartbreaking.
[1776] But I think for us, just because this is how we got to talk to ourselves.
[1777] Like, you got to fucking show up and you got to do the shit or that's what happens to you.
[1778] You know, it's just, it's, I don't know.
[1779] But it's still heartbreaking that not everyone's in that position.
[1780] Not everyone is struggling minute by minute to not do something.
[1781] Right.
[1782] Oh, no, I agree.
[1783] And so it's so sad that some people are afflicted in that way.
[1784] I think what's interesting is having sober people in my life, you and others, has done the opposite.
[1785] It's given me so much more.
[1786] compassion and so much more understanding that that wasn't a choice.
[1787] Let me ask you this.
[1788] This might be, I'm almost so curious about this, when you saw him out about promoting the book recently, what feelings did you have during that?
[1789] Because my feelings were he's in very bad shape and he's not going to make it.
Well, my feelings were, he's in bad shape, and I really hope, I really hope he can get it together.
Yeah, yeah.
As I, too, really desperately wanted that for him. I knew him in his sober times. That's the other thing that's pretty maddening. It's all anger at yourself, if you're an addict. It's like, I knew him in his sober periods.
Yeah.
And he was a happy person. He was like a thriving, happy person with lots of friends, and it worked.
[1791] So, right, he worked big time.
[1792] So, yeah, I think when that happens, there's like this, maybe just this projection of anger, like, reverse projection.
[1793] Like, fuck, yeah, God damn it.
[1794] What the fuck?
[1795] Like, I don't know.
[1796] It's very curious.
[1797] But this isn't the first time I've noticed it.
[1798] It probably makes me feel safer to think I know that it was heading in that direction.
[1799] Yeah, yeah, yeah.
[1800] You know?
Well, yeah, it protects yourself from when it happens. It's like, oh, yeah, yep, I knew that was coming.
[1803] And I'm not in that position, so it won't happen to me. Look, in AA, we say, you know, regularly, but for the grace of God, there go I. Yeah.
[1804] So, like, in these situations, I think there's a lot of, but for the grace of God, there go I moment.
[1805] So it's like, for us, it's a tragedy yet it's also met with, like, gratitude that it's not you.
[1806] But for the grace of God, there go I. Yeah, I feel, I feel that.
[1807] Who was saying, someone was saying this about war.
[1808] Oh, this is so weird.
[1809] I can't even believe I remembered it.
[1810] Paul Shear on the episode he did about Platoon.
[1811] He interviewed the guy who trains all the actors for war.
[1812] And he himself had been many times in war.
[1813] And he does this kind of workshop at the beginning.
And he says, what do you assume your feeling is when the guy next to you gets shot?
[1815] And everyone had a guess and blah, blah, blah.
[1816] And he said, well, what I felt and what many other soldiers I'm friends with have felt is immediate gratitude.
[1817] Like, oh, my God.
[1818] Oh, my God.
[1819] That wasn't me. Like elation.
[1820] Holy fuck, that could have been me. Yeah.
[1821] I feel guilty admitting all this, but this is kind of.
[1822] It's good to say it.
[1823] Yeah, this is, it's always a weird thing.
[1824] It's like I'm very sad and also wasn't me. That could be me. That's so selfish to say.
Well, no. I mean, I think it's good if you have reminders. You're seeing the reality of the danger.
Yeah, I agree.
I mean, a lot, which, this was very cute, I got so many texts, right, because, yeah, of my connection.
Uh-huh.
I don't know him for real, right.
Yeah.
But for me, the first reaction I had, we were all together, we were at a party.
Yeah, we were at a pumpkin carving.
Yeah, we were at a pumpkin carving.
Yeah.
And it just, like, all of a sudden, people were like, oh my God, oh my God, Matthew Perry died.
[1827] Yeah.
[1828] My first thought was also some sort of mixture of first just like shock.
[1829] It's just crazy when you hear someone has died.
[1830] A young successful person.
[1831] Exactly, a 54 -year -old person, followed very quickly with just so much fear around addiction.
[1832] The disease, yeah.
[1833] Yes.
[1834] Also relief.
[1835] Like you and another sober friend was there.
[1836] Yeah.
[1837] And it's a constant, you guys are a constant, bummer.
[1838] Yeah.
[1839] You make me scared a lot.
That's my work, but still, it... Are you mad at me for a second?
[1841] No. Okay.
[1842] It just, it makes me so scared.
[1843] Right.
[1844] For me, like for me losing you.
[1845] Yeah, yeah, yeah.
[1846] It could be.
It could be at any minute, one of these people who I love.
[1849] Yeah.
[1850] I just hate it.
[1851] Yeah.
[1852] I remember one time you said this was a long time ago, really long time ago, we were at the old house.
[1853] I forget what happens.
[1854] Maybe I said something like, don't relapse, like as a joke.
[1855] Yeah, yeah.
[1856] And you were like, I will.
[1857] Uh -huh.
[1858] And I was like, and you're like, no, I will.
[1859] In life?
[1860] Yeah.
[1861] And I hated that.
[1862] I was like, no, you won't.
So were people listening, would they hear you and go, that's why you relapsed?
[1865] No. Or maybe they would, but it's also an acknowledgement that it's not just like, I'm making this decision today and now it's over.
[1866] It's over for good.
[1867] It's not.
[1868] Right.
[1869] And I didn't believe you, but then when it happened, it was like, oh God.
[1870] Well, it might have been foreshadowing, yeah.
[1871] Yeah, I think, look, I think I've had a similar opinion about sobriety that I had when Brie and I fell in love at 20 and 19 years old, and we met cheating on our partners.
I thought the odds of us getting long term through all this and never cheating again seemed low to me. I want to be realistic about that.
[1873] And so I think some part of me is like, I don't know, like, let's get realistic about that this isn't, that you can't fix it and I'm going to work at it all the time, but also life's long and I don't know.
[1874] I don't know.
[1875] I just, I don't know.
[1876] That's probably kind of, I'm, I mean, really worried I'm triggering a bunch of sober people who are, you're pretty much not allowed to think like that.
[1877] That would be like where the group think would be, would be mad at me. Yeah.
Well, first of all, I shouldn't even be evaluating whether I'm going to do it for the rest of my life.
[1879] I am only supposed to be evaluating today.
[1880] Yeah.
[1881] So it's like rule number one violated.
[1882] Yeah.
[1883] And then rule number two would just be, they would say, well, you were pre -planning a relapse if you were talking like that, which maybe they would be right.
[1884] I don't know.
[1885] I just also think if you have people in your life who are sober or in recovery, that it's also really important for you to remember that that's not a one and done thing.
[1886] And to be cognizant of that when you're moving through life and I don't know.
[1887] I think it was my attempt to admit like I'm always vulnerable to it.
[1888] Yeah.
[1889] But I didn't believe you.
[1890] You had been sober for a long time.
[1891] Yeah.
[1892] As someone who doesn't have that, to me, it was like, that was old.
[1893] Yeah.
[1894] You know the one that did break my heart, the one that pierced that kind of recovery bubble was AM, DJ AM, because that one felt like kind of not his fault.
[1895] And that one really stung.
[1896] Was it anyone's fault?
[1897] It's not.
[1898] No, no. I know.
[1899] But what I'm saying is, here's a. dude who loved the program, sponsored so many people.
[1900] It was such a huge, wonderful part of his life.
[1901] He was such an example.
[1902] And then he gets in this fiery plane accident.
[1903] That just circumstance set him on a course.
[1904] And I hate that.
[1905] You know, yeah.
[1906] It's so tenuous.
[1907] That's what's sad.
It's such a tenuous grasp.
[1909] The whole thing is.
That's, whoever relapses or dies from this, it's always something like that.
[1911] Maybe it's not as profound as a fiery plane crash or it's not as dramatic as that.
But something small, normally. I think it's almost sadder when it's something small, like that shows how fragile it is.
[1913] Yeah, I just feel like some people are just too sensitive for this world.
[1914] Yeah.
[1915] I very much felt that way about Heath.
[1916] There's been a few people I felt that way about.
[1917] This really, really sad and scary.
[1918] You're not ready for anything remotely funny about it, are you?
[1919] Probably not.
[1920] This isn't really funny.
[1921] It's just...
The game last night?
Yeah, so Rob and I were at the football game last night with Charlie and Ryan and Damar.
[1924] Some lawyer's commercial came on mid -game.
[1925] It's a personal injury attorney.
[1926] And clearly this had been planned a year ago.
And he had licensed the Friends theme song.
[1928] Oh, my God.
[1929] And it was blasting in this arena with 75 ,000 people.
[1930] We were like, oh, my God, this is the worst possible timing for this song to be played.
[1931] Because this song's so happy.
[1932] And you could feel the wave, like 75 ,000 people all realizing, like, oh, this is not.
And it was, there's something about the juxtaposition of the happiness of that song versus the tragedy that just happened, and almost like confronting reality.
[1935] Like, yeah, that song doesn't really exist.
[1936] I know.
[1937] And that, you know, it's like it's like a burst the bubble of the whole thing.
[1938] Yeah.
[1939] Oh.
Also, that lawyer, I mean, it seemed like such a good idea months ago.
[1941] Yeah, it just somehow slipped through or like the programming couldn't.
[1942] God.
Or they underestimated the reaction. I wonder if that was on the... no, no, it wouldn't have been on the telecast.
There would have been a commercial for that.
[1945] Yeah.
[1946] It was just a collective, like, grimace as it was happening.
[1947] Sucks.
[1948] Yeah.
[1949] I feel like I want to apologize on his behalf.
[1950] It was an attic present.
[1951] I'm sorry.
[1952] That's not how that works.
[1953] I'm sorry we put everyone through all this.
It's not your fault.
[1955] It's no one, that's the saddest part.
[1956] Not my fault, but my responsibility.
[1957] Yeah.
[1958] I will say the theme of this show is extremely highlighted with him that money and fame do not do it.
[1959] No, no. He had friends money.
[1960] There's really no beating that.
[1961] No. Yeah.
[1962] Yeah.
Um, okie doke.
[1964] Well, this is for Dan Ariely.
[1965] Okay, so the woman we had on who talked about illusions of competition.
[1966] Yes.
That was Woo-kyoung Ahn.
[1968] Oh.
[1969] Remember how much we loved her?
[1970] We loved her.
[1971] And I love, I don't know why I love finding out ways that we don't think logically because we think we're so logical.
[1972] Yes.
[1973] I really like that.
Woo-kyoung Ahn.
[1975] She is a professor at Yale.
[1976] Oh, very trusted.
[1977] Let's go back and listen to that episode if you haven't listened.
[1978] How many sovereign citizens are there?
[1979] I have two different reports.
[1980] In 2010, the Southern Poverty Law Center estimated approximately 100 ,000 Americans were, quote, hardcore sovereign believers, with another 200 ,000 just starting out by testing sovereign techniques for resisting everything from speeding tickets to drug charges.
[1981] And then the other one is from the Dallas News .com, Dallas Morning News.
[1982] Okay.
[1983] And that number is all speculative, of course.
[1984] Well, think about it.
They're sovereign citizens filling out a survey?
[1987] They won't even carry a license.
[1988] I know.
[1989] At first, I really couldn't find anything, and I was getting annoyed.
[1990] But then I found these.
[1991] Okay, estimates put their numbers at between 300 ,000 and 500 ,000.
[1992] Aye.
[1993] But the movement has no leadership hierarchy or organizational structure.
[1994] Good luck.
[1995] So.
[1996] Okay.
[1997] Percentage of people who are okay with kids marrying on the other side of the political spectrum.
In 2014, it said most Americans are comfortable, but this is early days.
[2000] Most Americans are comfortable with political diversity in their households.
[2001] Just 9 % of the public say they would be unhappy if an immediate family member were to marry a Republican.
[2002] And about the same percentage, eight, would be unhappy about the prospect of a Democrat marrying into their immediate family.
[2003] Roughly equal percentages of Democrats, 15 % and Republicans 17 % say they would be unhappy welcoming someone from the other party into their family.
[2004] Okay.
[2005] Now, another site.
[2006] I'm doing multiple sites.
[2007] Snap.
[2008] Yeah, I like this.
[2009] I like this approach.
[2010] Trends going back to earlier years are not available, but other sources indicate that we have become less tolerant toward interpolitical marriages.
[2011] When Gallup asked in 1958, if you had a daughter of marriageable age, would you prefer she marry a Democrat or Republican?
[2012] All other things being equal.
[2013] The results, 18 % of Americans said they would prefer their daughter to marry a Democrat, 10 % preferred a Republican, and the majority didn't care.
However, in 2016, 28 % of respondents said they prefer their child to marry a Democrat and 27 % a Republican, and the share who didn't care has shrunk significantly, according to Lynn Vavreck, a political scientist.
At UCLA, ding, ding, ding.
[2016] Yeah, that went from 72 % approval to 40 -something percent.
[2017] Which is what he said.
[2018] Yeah.
[2019] I was just thinking about this lying in bed last night, weirdly.
[2020] I was thinking that for many people I know to say they're a Republican is an insult.
[2021] It's like saying they're an asshole.
[2022] Yeah, he's a Republican.
[2023] Yeah, it's tricky.
[2024] Yeah.
[2025] I can't speak for the right.
[2026] I don't know what they're saying.
[2027] But the left in general, Republican, it means you don't like that.
[2028] Like, you immediately don't like them.
[2029] If you're on the right, being a liberal is an insult too.
[2030] Sure.
[2031] Yeah.
[2032] So it's the same.
[2033] It's just what you're not over there.
[2034] I'm curious, like, when they say he's, yeah, I guess.
[2035] He's a liberal, yeah.
[2036] I mean, you, he, I've heard it a lot.
[2037] Right.
I mean, certainly when you add those things like libtard, that they like to call us, and snowflake and stuff.
[2039] It's very clear.
[2040] But I wonder if they're just saying, she's a Democrat.
Does that mean an asshole?
It means they have crazy socialist ideas.
[2043] Mm -hmm.
I also just, again, not to paint with such a wide brush, but we're mostly not going to be using names.
[2046] Like, we're just saying the thing and we don't like it, but we're not going to make it like some sort of play on words.
[2047] Oh.
[2048] You know what I mean?
[2049] Like lib -tard.
[2050] Yeah.
[2051] I'm trying to think if we have any that we say.
[2052] I mean, the deplorables thing was the thing that, like, imploded.
[2053] Backfired greatly.
[2054] So we don't really do that.
[2055] We learned our lesson.
[2056] Yeah.
[2057] But when I was laying in bed, I was thinking, you know, for so many people just say Republican, that'd be a pass, they're out.
[2058] And then I was just thinking of Ted Olson.
[2059] I was thinking of this, like, legendary Republican lawyer who proved DOMA was unconstitutional.
[2060] So then there's that.
[2061] I mean, when I hear it about people, I need more information.
[2062] Okay.
[2063] I'm not like, oh, that's great.
[2064] Yeah.
Because to me, that comes with a lot of problematic stuff.
[2067] Yeah.
[2068] But I want to know more.
[2069] Okay, so do they vote for Trump?
[2070] Are they fiscally conservative?
[2071] I want more information.
[2072] Yeah.
[2073] And I think that's where the benefit of the doubt has disappeared.
So I think if you're on the right and you hear Democrat, you go, they're a socialist.
[2075] Well, certainly there are socialists in the Democratic Party, but it's got to be under 10%.
[2076] Yeah.
[2077] I don't know any Democrats that are actually socialist.
[2078] Yeah.
[2079] Agreed.
And then likewise, you know, are they into putting kids in cages and anti-abortion?
[2081] Well, the national data would say they're not anti -abortion in majority.
[2082] They're not.
[2083] So I think there used to be a benefit of the doubt where you would just go, oh, they're, they're fiscally conservative.
[2084] I get that.
[2085] They want a small government that doesn't spend a lot of money.
[2086] And then the other side would say, like, they think the government can solve everything.
Very benefit-of-the-doubty.
[2088] You're not going to fight over that.
[2089] Yeah.
[2090] And I think that's what's eroded.
[2091] And I don't know that it's all our fault just because the knuckleheads are the ones getting all the goddamn attention.
[2092] Yeah, exactly.
[2093] I mean, especially in the Republican Party, and the Tea Party movement fucked that party in such a major way, creating a crazy extreme that now some people are aligning themselves with.
[2094] And then in order to get that base, they have to do things like, say, they're against abortion, even though even a lot of the policy makers aren't, but they have to say it in order to get those votes.
[2095] Like, it's crazy over there.
[2096] Yeah.
[2097] Yeah.
But I think on our side, the equivalent of that is that every one of our debates centers around generally the lowest percentage numbers of people in the country.
[2099] Sort of.
[2100] Like, I mean, a lot of what Biden's done and talks about is jobs and infrastructure and things like that.
[2101] Like, that affects most people.
But what we will see in, well, there won't be any Democratic debates.
But in the next cycle, it'll be, well, at least as it was last time.
[2104] These are the issues on the left that somehow take up the most of it as opposed to policy on jobs or foreign policy.
[2105] And I think the people on the right watch that and they're like they're not really concerned about the country per se.
[2106] They're concerned about these handful of groups of people.
That's where all of their energy and resources go.
[2108] But I don't think that's true about Democrats.
[2109] I think Democrats, they seem very unpragmatic.
[2110] That's what it is.
[2111] And I do think Democrats are ultimately very pragmatic.
[2112] And I think it's a very small percentage of the Democratic Party that is completely not pragmatic, you know, like no cops.
[2113] Let's get rid of cops.
[2114] Like, it is insanity.
[2115] And I think that takes up a lot of space in the debate.
[2116] Conversation.
[2117] But part of that, when I was home, like we were watching Fox News, my parents watch all of it.
[2118] They try to hear all of the stuff.
[2119] And a lot of it is Republicans are watching not all.
[2120] Republicans, but definitely the base is watching a program that is feeding that and saying this is all they care about.
[2121] This is what they're doing.
[2122] They want every city to be full of homeless people.
[2123] Yeah.
[2124] And it was funny because my dad was saying, so one of the Republicans running is this Indian man, Vivek Ramaswamy.
[2125] Yeah, yeah.
[2126] And he's an idiot.
[2127] He's a full -blown idiot.
[2128] He's not even liked in the Republican Party, right?
[2129] No, he's like third right now.
[2130] Oh, he is.
[2131] Yeah.
[2132] And that's a problem, right?
[2133] Because then also there's this like, yeah, we do like minorities, but because he's in white face.
[2134] I can say that as someone who tried to live like that for a long time.
[2135] Oh, that's a new expression.
[2136] I like that.
[2137] I made it up.
[2138] White face.
[2139] But yeah, so he was on Bill Maher, I guess, or on his podcast or something, something my dad was listening to.
[2140] Yeah.
[2141] My dad said that it was crazy because he straight up lied about nuclear energy.
[2143] And he was saying, we don't, we haven't had any nuclear advancements or buildings or anything since the 80s.
[2144] And my dad was like, that's literally not true because his company just built one.
[2145] Yes.
[2146] And Bill Maher was like, oh, yeah.
[2147] Like people just believe it.
[2148] And my dad was saying, this is the model minority.
[2149] We think Indians are so smart.
[2150] I know.
[2151] Like, yeah, whatever he said must be the truth.
[2152] Exactly.
[2153] And my dad made a good point.
[2154] He was like, he just says it with such confidence that if you don't know, you just assume he's right.
[2156] I just happen to know this one thing.
[2157] Right.
[2158] But then he makes you realize, oh, probably everything he's saying is just that exact same thing with this confidence, but it's all just made up.
[2159] Right.
[2160] And we only know, we can only get glimpses if we have the truth and the facts, what barely any of us do.
[2161] Anyway, but it was a good reminder that we just listen and, like, nod along because we don't really know for ourselves.
[2162] Yeah.
[2163] Okay, back to this.
[2164] Oh, IBM.
[2165] You mentioned the disagreeability chart comes from IBM.
[2166] They have done like a lot of personality insights and stuff.
[2167] Uh -huh.
[2168] So there is a lot on that.
[2169] But I just also remember we had the woman who wrote The Culture Map.
[2170] Yes.
[2171] Erin Meyer.
[2172] And that was a great episode.
[2173] Really good.
[2174] I love all those cross -cultural comparisons.
[2175] They're so fascinating.
[2176] Yeah, that is it.
[2177] I don't like that it's it.
[2178] I know.
[2179] Just do this all day.
[2180] It is fun.
[2181] It's fun.
[2182] I hope it got you out of your cold depression.
[2183] It did.
[2184] Good.
[2185] All right.
[2186] Okay.
[2187] I love you.
[2188] Love you.