Armchair Expert with Dax Shepard
[0] Welcome, welcome, welcome to Armchair Expert, Experts on Expert.
[1] I'm Yuval Harari.
[2] You wish.
[3] I'm joined by Noam Chomsky.
[4] I wish.
[5] Hi, Noam.
[6] Hi.
[7] I'm Dax Shepard, and of course, you're Maximus Mousimus.
[8] That makes me think of the photo we took in our Halloween garb.
[9] That was fun.
[10] Your hair was very tall.
[11] Thank you.
[12] Your spider web was very spooky.
[13] Yeah, can I tell you a...
[14] Secret? Yeah. So when I arrived, I was just wearing that black dress and a spider headband. I didn't look like a spider. Oh. And I was embarrassed. So what did you do? How did you course correct? I walked through the house, and our friend Laura had the idea of putting that, like, cotton that stretches. It's like a Halloween decor. Sure, sure. And taking that and putting that on myself, and I was like, oh, great idea. So we're walking around the house looking for that, and when we did, we found two table runners that were spider web.
[15] Oh, my goodness.
[16] Wow, this was a real improv.
[17] It was.
[18] And I mean, the idea of me going through the whole Halloween night just in my black dress, that would have been so embarrassing.
[19] Well, you would have found yourself in the position I'm generally in.
[20] Yeah.
[21] And truth be told, I didn't really have an outfit.
[22] I had a sleeveless t-shirt and some combat boots, but the hair really sold it.
[23] It came together nicely.
[24] Yeah, no one complained.
[25] Oh, we should say something really important.
[26] Oh, okay.
[27] We're recording this on Tuesday at 12 p.m. This is coming out on Thursday.
[28] So an election will have come and gone.
[29] Right.
[30] Thank you for that.
[31] And if you didn't hear the first Yuval Harari episode, it was one of our favorites, and in not enough time by our greedy estimation. This time we had some time.
[32] And boy, was it fun.
[33] Yuval Harari is a historian with a Ph.D. from the University of Oxford.
[34] He lectures at the Department of History, the Hebrew University of Jerusalem, and he specializes in world history.
[35] Yuval and his husband have co-founded Sapienship, a social impact company with projects in the fields of entertainment and education.
[36] Their main goal is to focus the public conversation on the most important global challenges facing the world today.
[37] He has the best-selling books Sapiens, Homo Deus, and 21 Lessons for the 21st Century.
[38] He has a new book out now called Sapiens: A Graphic History, which is an incredibly unique approach to helping the reader understand the material, and we will get into it at length with one of our star guests, Yuval Harari.
[39] Wondery Plus subscribers can listen to Armchair Expert early and ad-free right now.
[40] Join Wondery Plus in the Wondery app or on Apple Podcasts, or you can listen for free wherever you get your podcasts.
[41] Hello!
[42] Hey, nice to meet you again.
[43] Are you in Israel, I assume?
[44] Yes, I mean, we are in Israel.
[45] It's quite difficult to get in and out of the country these days.
[46] We're in Tel Aviv at our office.
[47] And is it hard because people don't want to let you in or you're not allowed to leave?
[48] You are allowed to leave, but most countries are not very keen on having Israelis at the moment because we are a red country.
[49] and also when you come back, you often have to be in quarantine.
[50] You can fly to the U.S., no problem.
[51] I mean, they don't care about anything.
[52] But most countries are being a bit more careful than the Americans.
[53] You know, I just read an article that you had written, and we'll explore it in detail, but you were a little critical of Netanyahu's policy about surveillance during COVID.
[54] I immediately got curious: if you could give countries a score out of 10 for how free they are to criticize the current regime, what would you give Israel and what would you give us?
[55] So I have some sense of how safe you are.
[56] I mean, it depends what kind of criticism you level at the government.
[57] I mean, there are some things that are kind of almost taboo, but it's a social taboo, not a political taboo.
[58] With regard to saying things about Netanyahu and his government, you can basically say anything you want.
[59] Oh, okay.
[60] That's nice.
[61] So you'd give yourself like a nine or ten?
[62] Again, in terms of criticism of calling the prime minister corrupt and a criminal and whatever, you can say that at least for now, nobody will arrest you in the middle of the night.
[63] Well, I think I read, or we had a guest on who explained, this ranking, which I think IBM created because they had satellite businesses all over the world.
[64] It was kind of detailed in one of the Malcolm Gladwell books: that Israel is one of the only countries that has less fear of authority than Americans.
[65] You guys are like the apex of that, right?
[66] This is part of Israeli culture.
[67] I see it in the university.
[68] There are no students like Israeli students for good and for bad.
[69] Like I would give them an assignment to read for next week.
[70] They will come to the next class.
[71] They will openly say, I didn't read the book or the article, but I think they are wrong.
[72] If you say something and they disagree, they have no respect for your authority as a professor whatsoever, which, you know, it sometimes makes life a bit difficult, but for me it was a great learning experience because if you say nonsense, you will immediately be told.
[73] So it really makes you kind of check yourself.
[74] And also, there are nevertheless things in Israel, certain things related to Israeli-Palestinian relations, to the occupation, to the army, that, you know, if you say them, you won't go to jail, perhaps, but the social reaction would be very, very severe.
[75] Uh -huh.
[76] You know, I guess almost every society has these kinds of red lines somewhere.
[77] And in Israel, there are certain things that are taboo.
[78] Okay, so I get immediately curious, because as I try to assess our response to COVID, and I think it's probably well known around the world, we have a pretty significant faction of people that are against the mask and what it represents to them: a lack of freedom and choice and these principles that they value.
[79] And so at first, I look at the data, and we're one of the worst in the world.
[80] I'm embarrassed first, like, oh, my God, you know, of all the places, we have access to the most education, technology, everything.
[81] So at first, I'm embarrassed and disappointed.
[82] And then another part of me thinks, well, you know, all this stuff is a spectrum.
[83] And I guess one of the upsides is it's also that same arrogance that leads to technological breakthroughs and this and that.
[84] Do you think it's always a balance between these things?
[85] Are they related?
[86] And I'm curious what the Israeli response has been.
[87] Have people been like, no, I know better than science?
[88] I think, you know, that can be dangerous; we do need a healthcare system.
[89] And I don't think people should have the freedom to just spread an epidemic or to ignore basic regulations.
[90] You know, you don't have the freedom to go in a red light.
[91] Hey, it's a free country.
[92] I can go in a red light.
[93] Why not?
[94] And I think we need to have a balanced attitude and a deep understanding of what freedom means.
[95] Freedom doesn't mean I can do whatever I want and just ignore the consequences for other people.
[96] Well, yes, when your freedom's limiting other people's freedoms, we would agree that that's not a freedom we should have.
[97] But so you're making a moral and a philosophical assessment, which I agree with personally.
[98] I think I'm on the exact same page as you.
[99] I'm trying to do a dispassionate charting of which countries have the most innovation and which countries obeyed the least.
[100] And I'm just wondering if there's correlation there, not to make a moral case just to state maybe a pattern that's interesting.
[101] I'm not sure.
[102] I mean, I haven't done the research, so I don't want to commit to anything.
[103] But in terms of, you know, kind of independence and not taking shit from anybody, you look at a country like Afghanistan, you know, there is almost no law and order.
[105] Everybody has a Kalashnikov at home and you can't tell anybody what to do.
[106] And I don't think they are very high on the innovation and technological side.
[107] I mean, they are innovative in some areas.
[108] Sure, sure.
[109] But I'm not sure that just having no willingness to have these kinds of orderly systems is necessarily an advantage; invention and innovation also demand some kind of basis.
[110] To make progress in science, you often need to study in a critical way, but still to study the findings and the theories of those who came before you.
[111] You know, as a scientist, I couldn't write any of my books if I just completely discounted all the findings of people in other disciplines.
[112] And no, I know better than everybody.
[113] You know, I'm a historian, but I'm not an archaeologist.
[114] It has to hold up to scrutiny and peer review, and obviously the opinions of the non-mask wearers just don't hold up, unfortunately for them.
[115] It's not a defensible position, really.
[116] As far as I know, no. So, again, I mean, also in terms of science and innovation, you need to find a balance.
[117] You need to have a critical attitude towards what you read and what other people say.
[118] But at the same time, you also need to respect the authority of institutions like universities and scientific journals.
[120] Because if you simply disrespect all these institutions, no progress is possible because you can't research everything by yourself.
[121] You have to trust some other people.
[122] I mean, you know, I can't go and do research in physics and chemistry and biology and climate science.
[123] Most of the time I read the studies of other scientists; if they are published in a respected peer-reviewed journal, then I accept it, because, you know, this is not my field.
[124] I always sympathize with pediatricians who must regularly be told that one of their patients has a syndrome that the mother or father figured out in 90 minutes of research when this person has dedicated the last 12 years of their life.
[125] It must be like a new phenomenon they have to wade through, of like, I understand you read that.
[126] And the internet does make us feel smarter and like we have more access to things than maybe we actually do.
[127] They must blow a lot of their time and capital just talking people out of what they read in 90 minutes of research.
[128] Yeah.
[129] Okay.
[130] So in this article I read, you make this really compelling case.
[131] And so when you look at how the world has responded to COVID and what techniques and instruments are available, there is a strong incentive to go with a state-monitored approach.
[132] And could you detail a little bit like what that is?
[133] Yeah, I mean, the most extreme version is to establish a biometric surveillance system, a mass surveillance of basically everybody in the country, monitoring not just where they go and who they meet and what shows they watch on television, but actually going under the skin and monitoring their medical situation, their temperature, their heartbeat, their blood pressure.
[134] Such a system, you know, on the plus side, it can eliminate the COVID epidemic within a few weeks.
[135] You know, if every person on the planet now wears a biometric bracelet which constantly monitors your body temperature, your blood pressure, your heart rate, COVID is over within a week or two.
[136] Because, you know, very often you don't know you're sick.
[137] But the biological signs are there, and the system can pick it up. And if you have such a system, it not only stops COVID; it's the last pandemic in history, because there won't be flu, there won't be any cholera. You can stop all the pandemics. And not just infectious diseases: you can discover cancer when it's just beginning, and it's still very easy and cheap to take care of it. You don't have to wait until it spreads and you feel something and you go to the doctor and they tell you, oh, you have cancer.
[138] So that's the plus side.
[139] Beyond that, that sounds like utopia.
[140] That sounds like utopia, but it's also the prescription for dystopia.
[141] Because exactly this system, if it is used by some, you know, 21st century Stalin, this is the basis for the worst totalitarian regime in human history.
[142] I love your current example of North Korea.
[143] So, yeah.
[144] North Korea, you know...
[145] Dictators always wanted to follow everybody all the time, but they couldn't because they didn't have the technology.
[146] In the Soviet Union, you cannot have a KGB agent following every Soviet citizen 24 hours a day because you don't have enough agents.
[147] Now you can do it because you have the technology.
[148] You don't need people to follow people around.
[149] You have the microphones and the cameras and the smartphones and the biometric bracelets maybe.
[150] And more importantly, it's not just, again, where you go and who you meet.
[151] It's actually going under the skin.
[152] Today in North Korea, if the big leader, Kim Jong-un, gives a speech, everybody, of course, has to listen, to turn on the radios or whatever and listen.
[153] And even if you don't like Kim very much and you think he's a complete idiot and you hate him, you would never say something like that.
[154] You would clap your hands and smile and look like you admire him.
[155] But if you wear a biometric bracelet that monitors what's happening inside your body, it won't help you.
[156] Yeah, your cortisol is spiking and yeah.
[157] Yeah.
[158] I mean, your cortisol is rising, your blood pressure is rising, your heartbeat.
[159] And anger is different from joy.
[160] Anger is a biological process in the body.
[161] It's not some spiritual whatever.
[162] It's a process in the body.
[163] The same system that can tell you, you have COVID, can also tell you're angry.
[164] So imagine North Korea in 10 or 20 years when the regime knows what you feel every moment of the day.
[165] About the leader.
[166] About anything.
[167] And as you point out, when dictators know that, they get rid of people that don't agree with them.
[168] Yes, but now it's difficult for the dictator to know what you really think.
[169] dictators are usually surrounded by yes men even if you hate the dictator you are a very good actor you act as if you really like him or her usually it's him with this technology the dictator can go under your skin and you know this is worse than anything we have seen in human history so far and it's not just north korea it can happen in other countries it can also happen you know even with corporations It's not just governments.
[170] If you take, let's say, the entertainment industry, the number one thing that Netflix or Apple TV or everybody wants to know is not only what you watch.
[171] They want to know how you feel about what you watch.
[172] And let's say, for example, that you are watching a new show.
[173] And whenever the lead character appears, your interest goes down.
[174] But whenever some minor character appears, you suddenly become very engaged.
[175] Today, the producers have no way of knowing that.
[176] But in 10 years, if you wear this biometric bracelet while watching the television, or maybe just the television is watching you, analyzing your facial expression, they know, oh, next morning they pick up the phone, get rid of the lead character.
[177] People don't like him or her very much.
[178] And let's move the show to focus on this minor character that everybody is really keen on.
[180] Yeah.
[181] You just spelled the end of my acting career, I think.
[182] Not necessarily.
[183] This kind of ability to go under the skin of people and know what they actually feel, this is like the Holy Grail.
[184] Everybody wants it.
[185] The dictators, the democratic leaders, the corporations.
[186] And now for the first time in history, there is the technology to actually do it.
[187] You know, you have these conspiracy theories.
[188] that somebody wants to implant chips inside our bodies to monitor us.
[189] The funny thing is, you're way late.
[190] You don't need to implant people with chips any longer.
[191] You're holding the chip.
[192] It can tell you everywhere you've been in the last 10 years.
[193] Yeah, and even if you get rid of your smartphone, now you can just analyze people's facial expression.
[194] So, you know, how do I know what you feel now?
[195] I'm looking at you and I analyze the tiny changes in your facial expression, in your eyes, in your mouth.
[197] I also listen to your tone of voice.
[198] And I know from experience the difference between how a bored person looks and how somebody who is very engaged looks.
[199] Now, today, computers are learning to do that better than humans.
[200] So you don't even need to go inside the body.
[201] You know, the television can be watching you while you are watching it.
[202] Wow.
[203] So we're going to probably need to dump our smartphones and wear a bag over our heads.
[205] But before we get to that point, I just want to wrap up this article notion.
[206] So the other option, instead of mass surveillance and early detection, please give the example of what being informed can do for us.
[207] Yeah, I mean, you know, let's take a simple example.
[208] You want to make everybody in your country wash their hands three times a day.
[209] Now, one way to do it is to put a policeman or a camera in every toilet, and if you don't wash your hands, you get punished.
[210] That's the authoritarian way.
[211] There is another way.
[212] You can just educate people, give them basic scientific education in school or in the media, explain that, look, there are viruses and bacteria in the world.
[213] They are the cause of diseases.
[214] People don't get sick because of black magic or voodoo or a punishment from some god; they get sick because of these tiny biological entities.
[216] And if you wash your hands with soap, this can remove or kill these pathogens.
[217] And then you just leave it to the people.
[218] If you give them a good scientific education, you can rely on their own initiative and best interest, that they will wash their hands even if there is no policeman watching them.
[219] And I think in many cases, the second option is the better one.
[220] It's more efficient, and it's far better for our freedoms and liberties.
[221] Yeah, so, you know, this comes up a lot.
[222] I think there's a fantasy or an illusion that we will have 100% solutions to things, and we're uncomfortable if we have a 70% solution to something.
[223] But then you're constantly weighing this against something else, right?
[224] So in your case of educating people, we're going to have to accept that, even with great education and some campaigns, probably 25% of people aren't going to wash their hands.
[225] This is, you know, the best case scenario: we can get, you know, seven out of 10 people to do it.
[226] And so you're not going to get 100%, but this 25% that's not doing it is ultimately going to be worth not having a methodology for a dictator to control us entirely.
[227] So it's like all these truths are at best high percentage choices.
[228] And we're going to have to live with some fallout.
[229] And we have to look at the whole thing in totality, which is very hard and it's challenging.
[230] But this is a beautiful segue into, I think, Sapiens, which is there is such great value in understanding the full picture.
[231] By my estimation, and I've already interviewed you about Sapiens, I love it so much, and one of the things I love is just how many different disciplines you synthesize into this one snapshot of us on planet Earth.
[232] So my first question to you is, and I know for me personally, what is the value of understanding how we got right here?
[233] I think the most important thing about history is not to learn from the past, but to be liberated from the past.
[234] In a way, we are all living inside the dreams of dead people.
[235] Our institutions, our beliefs, our thoughts, even our fantasies.
[236] We don't know that, but very often they are the dreams of people who died centuries and thousands of years ago, and they created these stories and mythologies and institutions that we take for granted.
[238] And by understanding how our world was created, the human world, how nations and religions and economies were created, the process by which it happened, it liberates you, to some extent, from their control.
[239] You realize, hey, this is not the natural way of things.
[240] It doesn't have to be like that.
[241] It's just some people in the past who thought about it.
[242] If you take, for example, let's say the situation of women throughout history.
[243] For centuries, women were dominated by men in most societies.
[244] They had far fewer political rights, economic rights.
[245] You know, when the U.S. became independent, they gave the vote to men, but not to women.
[246] Took more than a century and a half.
[247] And in most societies, women were simply the property of men.
[248] And when people asked about it, religious authorities or politicians would say, you know, this is just the natural order of things.
[249] It was always like this.
[250] It will always be like that.
[251] And can I add, we hung it on a Darwinian concept of the strongest shall prevail.
[253] So, right, we kind of conflated something.
[254] Yeah, they had all kinds of explanations and stories, but the basic idea was this is the natural way of the world.
[255] And by studying the past, you realize this is not the case.
[256] I mean, all these stories about women being less smart than men or women being impure, somebody invented them at a certain point and spread them.
[258] And you can follow the history of how these ideas develop in ancient Middle Eastern religions and Judaism and Christianity.
[259] And he said this and somebody else said that.
[260] And once you realize that, you're liberated from the power because you understand this is just a story people invented and we can change the world.
[261] And today, it's obvious that it wasn't the reality.
[262] Now we know that women are as capable as men.
[263] They can be politicians, they can be judges, they can be professors.
[264] And we look back and we are amazed.
[265] How could people believe such nonsense for centuries without questioning it?
[266] So this is the most important value of history: when you understand how the stories that rule our lives were created, you suddenly become, to some extent, free from them.
[267] Yeah, and you know, I'm going to try to make this as apolitical as possible as a statement, but that's exactly what interests me, which is, oh, we inherited a very, very relatively short experiment.
[268] And we'll get into your timeline.
[269] I mean, we've been here for a fraction of the amount of time that's been here.
[270] And so the notion that we have figured it out is kind of, first of all, arrogant.
[271] And then maybe just lazy, because you're just inheriting it.
[272] But it's why I don't understand the notion of being conservative, per se, and protecting a system from the past.
[273] I believe in progressing, because this is such a new experiment, and we keep bettering it in different ways, and erring, and then trying to acknowledge our error.
[274] But the notion that we should be preserving 1950, I think, is a little crazy when you think about how new the experiment is.
[275] We're not even close to having figured it out.
[276] Yeah, I mean, you know, conservatives usually just defend the revolution of a century or two centuries ago.
[277] If you think that the old ways were always better, then we should all go back to the African savannah and learn to hunt zebras, because this is what our ancestors did.
[278] I mean, you know.
[279] Yeah, yeah.
[280] Yeah, the people in the 50s wanted to go back to the 30s.
[281] The 30s wanted to go back to the tens.
[282] You just follow it all the way down the rabbit hole.
[283] Yeah, and you're back in the Rift Valley.
[284] Exactly.
[285] Now, I would say that there is still a lot of sense also in conservatism.
[286] I think that a good society needs both.
[287] It needs both some progressive people that push forward and also conservatives.
[288] Because these experiments in building human societies, they very often fail.
[289] I think, you know, look at conservative philosophers like Edmund Burke in the 18th century, observing the French Revolution.
[290] And they made some very, very good points.
[291] When you try to go all the way, like building a new society from scratch, it's very often an arrogant enterprise, because you think that you completely understand the world and how to build a perfect society.
[292] And it never works.
[293] Even revolutions need to be gradual.
[294] You know, you compare, let's say, the Russian Revolution or the French Revolution, where they tried to build an entire society: let's just throw away everything that was until now and start from scratch.
[297] And what you get is Stalin and the gulags, or in the case of the French Revolution, you get the guillotine and then Napoleon.
[298] And the American Revolution was far more conservative and mild.
[299] Okay, let's take it slowly, step by step.
[300] Don't change everything at one time.
[301] And it had its downsides, but also humans just don't have the ability to predict the outcome of everything they do.
[302] So when you really try to change everything at once, there is a big danger there.
[303] Yeah, so those are fantastic points, and I totally agree with you, and it is why I'm probably more of a centrist, which is, yes, there are systems we have created in the last 250 years that have proved to be pretty great.
[304] There's many, many things that we're doing quite well, and perhaps even as good as they can be done, and those should be isolated and protected and walled off.
[305] But I guess what I'm saying, more broadly, is the notion that we'll ever be done is a little naive.
[306] I think some people have an expectation that we'll preserve all this stuff and not an acceptance that, no, in fact, we're going to have to keep addressing these social problems that pop up and examine the system, because the system is predictably making that outcome.
[307] So if we don't like the outcome, we have to acknowledge that we'll always be tinkering with these systems.
[308] Especially as, you know, the pace of technological change is accelerating, and a social or political system that worked well when the technology was radio and trains and cars may be completely obsolete and irrelevant when you have artificial intelligence and bioengineering and things like that.
[309] So it's always a work in progress.
[310] I think the big advantage of democracy as a system is that democracy is more open to change than dictatorships and authoritarian regimes.
[311] And it tries to kind of manage change, to make it not too fast, but also not to try to freeze things because it won't work.
[312] The world is changing.
[313] And to say one more thing about the conservative role in society, I think their most important role is to preserve the key institutions.
[314] Because even to change, you need the kind of basis of an institution.
[315] And one of my concerns when I look at politics today in the world, in the US, in Brazil, in the UK, in many countries, is that what used to be conservative parties have become extremely unconservative parties because they no longer protect many of the basic institutions of society.
[316] Instead, they attack them.
[317] And I think that the progressive parties, they are doing their thing.
[318] But what worries me is that in many countries, there are actually no longer conservative parties.
[319] You have parties that call themselves conservative, but they are actually busy undermining and even destroying the basic traditions and institutions of society.
[320] Stay tuned for more Armchair Expert, if you dare.
[332] Yes, and this is where it ties back into your article, which is: in order to have the citizens of any state trust that washing your hands is effective, you must first have that trust, so that when we say bacteria live on your hands, you believe that.
[333] And that certainly has eroded greatly.
[334] And I think, yes, with our current administration, you know, some institutions that have really been a part of the best safety mechanisms we have, the press, the fourth estate. If you just look at every scandal, everything that we would all agree on, that we don't like corruption, we don't like backroom deals.
[335] We don't like underground sex rings.
[336] All of these things have not been discovered by judiciary committees.
[337] They've been discovered by journalists.
[338] So this great failsafe that has been around for the entirety of this experiment in the U.S., the notion that you would try to delegitimize it is very scary.
[339] And we see it in many places around the world.
[340] And this leads to a situation of increasing social polarization.
[341] Because there is no institution that everybody respects.
[342] Only one part of the political spectrum respects this institution, and only the other part respects that institution.
[343] There is no common ground.
[344] And, you know, I look at the U.S. right now in the middle of this election campaign, and what really strikes me as an outside observer, I'm not American, I'm not an expert in America, but what really strikes me is that today, Americans hate and fear each other far more than they hate and fear anybody else.
[345] You're so right.
[346] Not the Chinese, not the Russians.
[347] You know, 50 years ago, Republicans and Democrats didn't agree about many things, but they both feared the Russians more than the other party.
[348] The Russians will come and change our way of life.
[349] Now, the Democrats are afraid that if the Republicans take power, that's it.
[350] Our way of life is over.
[351] And the Republicans are the same thing.
[352] If the Democrats win, it's over for us.
[353] And you can't have a democracy when you think the other people in your country are your enemies, that they are there to get you.
[354] You can have a civil war.
[355] You can have a dictatorship in such a situation.
[356] But over the long term, you can't have a democracy when the people in the country hate and fear each other.
[357] And you know, it's such a parallel as well that the success of vaccines becomes their biggest enemy, right?
[358] So if you don't see tons of people with polio, you're no longer too worried about polio and you're inclined to not want this vaccine for polio.
[359] And similarly, having gone 10 years without a substantial terrorist attack, unfortunately, like one of the downsides of not being attacked is not having this common enemy that bonds and unites everyone.
[360] It's a very bizarre relationship almost.
[361] Yeah, but I think it's also the fault of, not just in the U.S., in many countries, you see these kinds of powerful leaders who build their political career on deliberately inciting and creating divisions within society.
[362] They treat their political rivals not as legitimate rivals.
[363] You know, in democracy, rivalry is okay.
[364] People don't have to agree.
[365] But the basic attitude of a political leader in the democracy should be, you didn't vote for me, but I'm still your prime minister.
[366] I'm still your president too.
[367] I work also for you.
[368] Yeah.
[369] And my political rivals, they may be wrong.
[370] Maybe they're even stupid.
[371] It's okay to say that they are stupid.
[372] But they are not traitors.
[373] They are not evil.
[374] They are not un -American.
[375] That's such a big tactic: they're not patriotic.
[376] They're un -American.
[377] They're not American.
[378] They want to be socialist, this and that.
[379] And Hillary was guilty of it too.
[380] And she referred to them as a basket of deplorables.
[381] You know, that's a real us and them statement.
[382] And now if I'm those people, yeah, that woman can't possibly be working for me. She thinks I'm a basket of deplorables.
[383] And obviously, Trump, we don't have to list his division or divisive talk.
[384] But yeah, it's really troubling.
[385] You're right.
[386] The people aren't even Americans.
[387] If you're on the left, the people on the right aren't even Americans and vice versa.
[388] And it's very troubling.
[389] Now, again, back to your article, one thing I would say is, you know, over the last eight years, I guess it seems like there's been a kind of global shift to nationalist movements, whether you look at Brexit or you look at the rise of some more insular leaders around the world.
[390] Certainly we've become more nationalist.
[391] It's kind of all fine and dandy until you recognize that a pandemic is global.
[392] There is no such thing.
[393] It would be naive to imagine we could exist independently and that we're not so interconnected.
[394] And I think this is a great lesson in that.
[395] I think that's a little bit going over people's heads.
[396] I mean, can you point out what we really need right now in this pandemic?
[397] I think the most important problem is that people have this mistaken notion that national loyalty and global cooperation are somehow contradictory.
[398] You have leaders saying specifically, you have to choose nationalism or globalism.
[399] And that's just a mistake because there is no contradiction.
[400] Nationalism is not about hating foreigners.
[401] You know, if a key thing about being a patriot is that you hate foreigners, then yes, there is a contradiction.
[402] But nationalism is not about hating foreigners, it's about loving your compatriots, taking care of them.
[403] And in many cases, like in a pandemic, in order to take care of your compatriots, you need to cooperate with foreigners.
[404] You need to exchange information about viruses, about diseases.
[405] You need to build common defenses against pandemics, against ecological disasters.
[406] So there is no contradiction there.
[407] And also people, when they talk about global cooperation, they have this frightening scenario that there will be a global government that tells everybody what to do.
[408] We'll have to abandon our culture.
[409] We'll have to accept an unlimited number of immigrants.
[410] This is not globalism.
[411] Right.
[412] That's a great distinction.
[413] Yeah, globalism just means that humanity has some common problems and interests, and different nations,
[414] they retain their independent governments, their independent traditions and cultures, but they work together on these common problems.
[415] Because otherwise we can't solve them.
[416] Yeah, and minimally, we have to acknowledge that a virus doesn't know borders.
It will not obey borders, the economy increasingly will not obey borders, and the environment's not going to obey borders.
[418] So if we all acknowledge we have minimally these three things that are going to not care where we draw the line on the map, then we've got to have some policies and some instruments in place to deal with those common threats.
[419] I would add one other major global problem that we need to cooperate on, which is the rise of new disruptive technologies, which we discussed earlier, like artificial intelligence and surveillance. You know, the big problem with these technologies is that unless you have a global agreement on regulating them, nobody can do much, because nobody wants to stay behind.
[420] If you think, for example, about creating autonomous weapon systems, what is commonly known as killer robots.
[421] Now, it doesn't take much genius to realize that creating killer robots is a very dangerous development.
[422] The problem is, you cannot regulate it just on a national level.
[423] Let's say the U .S. bans the development of killer robots.
[424] But if China produces them or Russia produces them, then the U.S. won't want to stay behind.
[425] The U .S. will say, yes, we don't want to do it.
[426] It's dangerous, but we can't stay behind in this arms race.
[427] And it's the same with things like genetic engineering.
[428] You know, you can have a ban in the U .S. on genetic engineering.
[429] No, no, no, no, you shouldn't meddle with human DNA, engineering super babies.
[430] But if the Chinese are doing it and they are getting results, then very soon the Americans will feel we have to do it also.
[431] We don't want to, but we have to do it, otherwise we will stay behind.
[432] And maybe the Chinese are feeling the same thing.
[433] We also don't want to do it, but we can't stay behind if the Russians or if the Americans are doing it.
[434] So the only way to effectively regulate these disruptive technologies, is by having some kind of global cooperation.
[435] And, you know, it's not impossible.
[436] People can agree.
[437] You know, you take, for instance, the Olympics, if you think about sports.
[438] Then on the one hand, the Olympics is a nationalist competition.
[439] Everybody goes and waves their flags, and how many medals we have and how many medals the Russians have, and you cheer for your national team.
[440] But at the same time, it's all based on global cooperation because if you want to compete against the Russians in swimming or in whatever, you first have to agree on the same rules for the game.
[441] And the amazing thing is, you manage to do it.
[442] You know, athletes from the US, from Russia, from all over the world, can come together in the same place, agree on exactly the same rules.
[443] And this is a kind of model that you still have your national loyalties.
[444] You don't cheer for the Russian athletes.
[445] But you nevertheless have an agreement about the common basic rules.
[446] And again, back to your previous point, we've got to have a couple of institutions that we trust, so that when there is a negative or a positive drug test, that institution is trusted and believed, you know, because people do cheat.
[447] But again, they get caught by these systems.
[448] They get caught by journalists.
[449] They get caught by committees that test that we trust.
[450] Yeah, exactly.
[451] If we don't have these institutions that everybody respects, then very, very soon, the Olympics, and actually every sport, would become a competition between biochemists and between geneticists, not a competition between football players or swimmers.
[452] Because if you don't have an institution that you can trust and regulation, then everybody will just do whatever biological enhancements they can, and there will be no longer any sport.
It will just be a competition between biochemists.
[454] Yeah.
[455] Okay, so we've kind of outlined the value of understanding why we got here so that we can choose where we want to go with more information.
[456] I think that dovetails beautifully on why you'd want to do a graphic novel version of sapiens.
[457] Am I right to think that this is to just make it kind of more broadly appealing and perhaps more broadly digestible?
[458] Exactly.
[459] I mean, there are many people who don't want to read a traditional science book, 400 pages of text with lots of footnotes, but it's important that science reach these people too.
[460] Why not?
[461] And so I worked with a team with two very gifted artists, Daniel and David, on how to create a graphic novel version, a comic version of Sapiens.
[462] And you know, it's still very serious stuff.
[463] It aims at adults, not at young kids.
[464] Although, can I interrupt you for one second to say I was reading it last night, I thought, oh, my seven -year -old's ready for quite a bit of this, and I think the pictures would really aid in her comprehension of it.
[465] That's true, but it depends on the kid, but, you know, teenagers, definitely.
[466] But it's also aimed at adults, and it's not just sapiens with images, with illustrations.
[467] It's a completely fresh approach to history.
[468] We kind of took all the academic conventions and threw them in the garbage can and said, okay, let's start from zero.
[469] Let's think how you tell history.
[470] And we experiment with many different approaches.
[471] And we got a lot of inspirations also from Hollywood and from different genres.
[472] So, like, one chapter discusses human evolution as a reality TV show.
[473] That kind of, you know, you have the different human species, you have homo sapiens, you have Neanderthals, and they are competing in a kind of survival reality TV show.
[474] Yeah, they're getting voted off, I guess.
[475] That is what evolution does is vote you off.
[476] Yeah, exactly.
[477] So another chapter is like a detective story.
[478] So we built it like, you know, this NYPD crime TV show.
[479] So we created this fictional detective, Detective Lopez, and she goes around the world to investigate the disappearance of the large animals of the planet.
[480] More than 10,000 years ago, you see that many of the large animals died.
[481] All the big mammoths and mastodons and cave bears.
[482] What's happening to them?
[483] So she's on the trail of the worst ecological serial killers in the history of the planet.
[484] And of course it's not a big spoiler that in the end she discovers it's Homo sapiens that spreads around the world, and wherever humans go, the large animals become extinct.
[485] So, you know, it's still science.
[486] We hope that we got all the facts right, but it's told like an interesting and funny detective story, and not with all the usual statistics and graphs and scientific models.
[487] Yeah, and I guess I just feel compelled to, even though we covered it in the last time, just to give people like a snapshot.
[488] So, you know, the universe is roughly 14 billion years old.
[489] Planet Earth is 5 billion years old.
[490] Life presents itself three and a half billion years ago.
[491] We don't get to mammals till 65 million years ago.
[492] We don't get to hominids till 5 million years ago.
[493] We don't get to Homo sapiens until 200,000 years ago.
[494] We don't get to communal living, agriculture, dedicated specific jobs until what, 20 ,000 years ago, not even?
[495] Yeah, maybe 10 ,000 years ago.
[496] 10 ,000 years, yeah.
[497] And then we're not writing and having the intellectual revolution until but a few thousand years ago, right?
[498] You know, everything we think about as ancient, you think about, you know, the big religious traditions of humanity, Judaism, Christianity, Hinduism, Buddhism, Islam.
[499] It's all just the last 3 ,000 years.
[500] They are the new kids on the block when humans have been around for, you know, more than 2 million years.
[501] I mean, humans had art and religion and politics tens of thousands of years ago.
[502] So it gives you really a different perspective.
[503] And anyway, it goes back to what we talked about earlier, what it means to be conservative.
[504] You think you're being conservative.
[505] Yeah.
[506] If you're following a religion, which is 2 ,000 years old, 2 ,000 years is nothing.
[507] Yeah, it's a grain of sand on a beach when you look at 14 billion years.
[508] Or when they do the geological calendar, right?
[509] Humans show up on, like, December 31st at 11:59 p.m. It's like, when you think about it that way, you're like, oh, we should have some humility.
[510] How brief and recent this experiment is.
[511] Now, the book, which I'm enjoying immensely, and again, all I can think about is how excited I am to read it to my kids because I really am dying to pass on to them what it took me, I don't know, four years of college to kind of start comprehending and countless conversations with a ton of great people like you.
[512] And I think this book does an incredible job at pointing out our speck on the timeline, how new this is, all that stuff.
[513] I think it's really empowering.
[514] But you may envision even a more digestible version for kids.
[515] Is that possible?
[516] Yes, we are working on that for next year.
[517] I mean, as I said, the graphic novel is aimed at teenagers and adults.
[518] We are working on a kids' book, which is aimed at kids aged 10, 11, 12, something like that, which will, again, tell the history of humankind in a new way, in a fresh way.
You know, it's not a bunch of dates and kings and battles and names.
[520] That's boring.
[521] It will try to explain, you know, the juicy stuff of history, of where we came from, and things like what is religion and how did money appear and these kinds of things, but in a way which will not just be accessible to kids, it will be fun.
[522] We hope so, at least.
[523] I mean, for us, it's fun.
[524] I think the most fun projects that I ever worked on
[525] are this graphic novel and the kids' book, because for years, as a professor at university, I learned to write in a certain way.
[526] And suddenly, you can go wild.
[527] You can experiment with so many different things.
[528] You can invent characters and plots.
[529] And again, you still have to stick with the basic scientific facts.
[530] Otherwise, what's the point?
[531] But in order to tell the story in an interesting way, we allow
[532] ourselves a lot of artistic creativity.
[533] Well, and I'd imagine, too, that your previous four books, or however many you've written at this point, is a very solitary endeavor.
[534] I have to imagine it was quite fun for you just personally to be collaborating with people and having them add to this and create something wholly original.
[535] Exactly.
[536] If you really want to do something fresh, you need help from other people.
[537] Because otherwise, you get stuck inside the patterns of your own mind.
[538] And certainly moving to a different medium.
[539] I mean, this is mostly images.
[540] It's not text.
[541] I never worked with images before.
[542] I don't know how to draw.
[543] I draw like a five -year -old kid.
[544] If we had to rely on me to draw the images, we wouldn't get very far.
[545] And it raises a lot of new questions.
[546] You know, when I write just the text, so let's say we discuss sex in the Stone Age.
[547] We now know that Homo sapiens and Neanderthals had sex and even had children together.
[548] When you write it, you can ignore many questions.
[549] For instance, was it a male sapiens and a female Neanderthal, or was it vice versa?
[550] And what was their skin color?
[551] What was their hairstyle?
[552] In a text, you can ignore these questions because texts are abstract.
[553] But drawings are always concrete.
[554] If you want to show the first interspecies couple, a sapiens and a Neanderthal, they can't just be generic humans.
[555] You have to decide who is the man, who is the woman, or maybe there are two men or two women also.
[556] And what do they look like?
[557] Do they have black skin, white skin?
[558] What kind of hairstyle?
[559] So you have to go back to the scientific literature and research these questions.
[560] And also you have to take into account current issues of race and gender.
[561] And there were a lot of discussions between Danielle, the artist, the one who does the painting, David, who is the writer, and me, about how to present these scenes.
[562] Well, I was kind of embarrassed for myself last night as an anthropology major who's supposed to really know all this inside and out.
[563] And I guess I don't think I realized that Neanderthals had appeared 500,000 years ago and really didn't get absorbed into Homo sapiens sapiens until 70,000 years ago.
[564] So in fact, Neanderthals had a longer reign than Homo sapiens has had.
[565] Many human species were around far longer than us.
[566] I mean, Homo erectus is estimated to be even a million years on the planet.
[567] And I knew that.
[568] I guess I didn't realize Neanderthal was so much quicker to evolve than Homo sapiens.
[569] You know, it's very difficult to say when a species evolves, because they all evolve from a previous species, and it all depends on the latest bones that you find, the latest fossils that you find.
[570] So the date and the place where a species first emerges tend to change over time.
[571] For instance, lately, very old Homo sapiens bones were found in Greece and in the Middle East.
[572] So now some scholars say, actually, they didn't evolve in Africa first.
[573] Maybe they evolved first in Greece and the Middle East and then spread to Africa.
[574] It changes all the time.
[575] So I wouldn't give too much weight to the exact
[576] dates and locations. But the big picture is important: to realize that there were many human species living side by side on the planet. For example, many people, even when they know about human evolution, have this notion that at any point in the evolution of humans, there was just one human species that evolved into a better and better species.
[577] And this is understandable because today there is just one human species.
[578] So we think this is the normal situation.
[579] But actually, it's quite strange.
[580] You look at other animals, there are many species of birds living side by side.
[581] You have grizzly bears and polar bears and black bears and so forth.
[582] So why not have many human species?
[583] And for most of human existence, there were many human species living side by side.
[584] Only in the last 30,000 years, there is just one species, our species.
[585] And this is probably because we exterminated, we drove to extinction, all the other human species.
[586] This extinction is kind of the earliest ethnic cleansing campaign in history.
[587] Stay tuned for more armchair expert, if you dare.
[588] Just imagine the world today if, in addition to all the other divisions, Christians and Muslims, Americans, Chinese, Republicans, Democrats, you would also have sapiens and Neanderthals.
[589] Yeah.
[590] I would really vote for Australopithecine giganticus to still be here.
[591] I want to see the seven and a half foot tall hominid.
[592] That would be my preference.
[593] Now, I want to just say that one of the most intriguing concepts that you laid down in Homo Deus, which I find myself repeating all the time in interviews, is back to the smartphone, back to the biometrics, back to the being able to evaluate what's inside the body is that you paint a picture of the future where you could set a goal for yourself.
[594] I want to get promoted.
[595] I want to lose weight, right?
[596] And that this device would be so good as you were walking into a meeting at work, it might pause you and say, you know what, your blood sugar is really low.
[597] You didn't sleep well last night.
[598] The last time you were in this situation and you spoke up, you had a shitty idea and everyone lost faith in you.
[599] So, Go into this meeting and shut the fuck up.
[600] That should be your marching orders.
[601] And so on the surface, you're like, that's an amazing bit of tech that could help us.
[602] But then you're very good, and I think this comes from your forays into meditation and your retreats, which is, what self is this device going to service?
[603] Is it going to service the experiential self, the one that enjoys eating candy bars?
[604] Or is it going to service the narrative self, the one that wants to go to bed at night saying, I'm a controlled person who doesn't eat too much?
[605] And so right there, there's a big issue, which I thought was so fascinating.
[606] Even since that book was written, a new proposition has emerged, and we're starting to really uncover it, at least in the popular culture, which is the rabbit holes of YouTube, the rabbit holes of Instagram, the rabbit holes of all these different platforms.
[607] There's a third option, which is even more scary, which is that these platforms took us to goals that none of us had, right?
[608] Not the experiential self, not the narrative self.
[609] It took us to a place that was more extreme, more fringe, more militant.
[610] It's been documented very well now, as people look at their five-year history on YouTube.
[611] They started as maybe a centrist environmental major and became white nationalists, incrementally, slowly.
[612] So that's kind of new.
[613] Had you already foreseen that, or did that become yet another scary option, something well-intentioned that went wrong?
[614] Yeah, the thing is that once you can hack human beings, all these scary scenarios become possible.
[615] To hack a human being means to understand that human better than he or she understands themselves.
[616] And this is now becoming increasingly possible.
[617] And then when you have an algorithm that knows you so well, it can manipulate you for whatever purpose.
[618] Now, the really scary thing, in a way, about all these algorithms is that they were given a very kind of simple task.
[619] Companies like Facebook or YouTube, they're not interested in a particular political position.
[620] They didn't come to their algorithm and tell the algorithm, okay, I want you to radicalize society.
[621] Right.
[622] No. They gave the algorithm a very simple task.
[623] I want you to make people spend more time on my platform.
[624] That's it.
[625] Easily measured: last year they spent an average of half an hour on our platform every day.
[626] This year it should be 40 minutes.
[627] Go ahead, do it.
[628] And you had the smartest people in the world designing these algorithms.
[629] And the algorithms discovered that the easiest way to grab people's attention and keep them on the platform, seeing more videos, seeing more content, is to make them angry or make them afraid, or press the emotional buttons of fear and hatred and anger and greed and things like that.
[630] They didn't even realize what the political result would be; it was just, you know, something completely unexpected that this would be the result.
[631] Now think what would happen when such an algorithm is in the hands of a kind of Stalin or a Kim Jong-un, who gives much more direct political instructions to the algorithms.
[632] We need to understand that we are now hackable animals.
[633] Yeah.
[634] And when you wrote Homo Deus, it was largely theoretical.
[635] And now it's not theoretical.
[636] People talk about, like, well, in the era of AI. Well, no, no, we're already in it and it already happened.
[637] We have measurable polarization as a result of this.
[638] It's not a theory.
[639] It happened.
[640] It got away from us already.
[641] Yeah.
[642] And again, it's not even belonging to a particular side of the political spectrum.
[643] You know, if you have a platform and you want to keep people on the platform, and the algorithm discovers that whenever you see a headline about a political leader doing something crazy, you have an irresistible urge, I must click on it.
[644] I must see what he did, what he said today.
[645] This keeps you on the platform.
[646] So the algorithm shows you more and more of that.
[647] You know, like my husband is on TikTok.
[648] Okay.
[649] And it took TikTok something like, I don't know, 20 minutes to realize that if it shows him videos with sexy guys without a shirt, he tends to stay longer.
[650] It's not that he entered TikTok and had to mark a box, I'm gay.
[651] Please show me more pictures.
[652] No, the algorithm discovered it very quickly by itself. You know, it throws at you all kinds of images and videos, sees whether you stay or not, calculates things, and reaches conclusions about what you like and dislike, and that's it.
[653] You know, we are now in a global battle for attention.
[654] The most important resource on the planet is human attention.
[655] And unfortunately, extreme views are much better at grabbing human attention than moderate middle -of -the -way views.
[656] And this is something that may have served us well in the African savannah: when there is a lion coming and a zebra coming, better pay attention to the lion.
[657] When you live in the 21st century, and you have these algorithms that manage to hack us, this is now extremely dangerous.
[658] It is.
[659] And even when I think about it: if I'm an environmentalist and I do care about the planet, and I can either watch a video of the letter-writing campaign to the senator or that guy on the Zodiac boat attacking a huge whaling vessel in the South China Sea, you know, which one am I going to watch?
[660] Clearly I'm going to watch that.
[661] You know, I'm going to watch the most extreme version of this thing I care about.
[662] I'm defenseless.
[663] Again, even if you don't care about it, even if the algorithm, you know, like, I don't know, shows you videos of car crashes.
[664] Now, no sane person would wake up in the morning and say, okay, today I want to watch 40 minutes of videos of cars crashing.
[665] But the algorithm discovers that they can keep you on the platform with this.
[666] So, okay, you're flooded, and you can't help yourself.
[667] It's stronger than you.
[668] So, you know, it goes back to what you started saying that I want to go on a diet and you tell the algorithm, I want to go on a diet.
[669] And at least this kind of obeys some kind of a goal that you set.
[670] But we are now in a situation where we are losing control of our lives, because the algorithms are so good at hacking us,
[671] manipulating our emotions and using it against us.
[672] Again, it's still being done for relatively inconsequential purposes, like keeping you on the platform longer to show you a few more commercials to make a few more billion dollars.
[673] But in five or ten years, this technology can be the basis for the worst totalitarian regimes in history.
[674] It won't be some corporation trying to sell you advertisements.
[675] It will be a 21st century Stalin.
[676] So we have to be extremely careful about what's happening here.
[677] And even in the innocuous search for a diet, we just had an expert on talking about how that then becomes a gateway through the algorithm to anorexia, to thinspiration videos.
[678] And you can just track the person from a very simple and benign caloric video to these deep, deep anorexia videos.
[679] And again, it's just taking you
[680] incrementally, slowly, down this path.
[681] And I think, you know, it goes back to very deep philosophical questions about free will and human agency.
[682] The problem with the naive belief in free will is that it makes you very uncurious about the reasons why you make decisions.
[683] You think, well, I chose this.
[684] There is nothing to explain.
[685] I chose this car.
[686] I chose this politician.
[687] I chose to watch this video because this is my free will.
[688] End of story.
[689] You need to question free will in order to realize that maybe at least some of my choices don't really reflect my own free decision.
[690] They reflect some external manipulation.
[691] And that's very difficult for humans to acknowledge.
[692] The easiest people to manipulate are the people who believe in free will,
[693] that everything they do is just their choice.
[694] And, you know, also many corporations and political leaders are using it as a defense.
[695] If you ask them, look what you're doing to people, they will say, you know, but people have a choice.
[696] We don't force them to watch these videos.
[697] They click on it.
[698] We don't take them and, you know, bind their hands while a robot clicks for them. No, they do it with their finger.
[699] And their finger is controlled by their mind.
[700] So it's their free will.
[701] What do you want from us?
[702] The customer is always right.
[703] And so we need to realize that, you know, this is not the 18th century.
[704] This is the 21st century.
[705] And, you know, freedom is not something you just have.
[706] Freedom is something you need to struggle for.
[707] If you just assume that everything I do, everything I choose is my free will, then you are the easiest person to manipulate.
[708] Yeah.
[709] All the more reason to have trust
[710] in some entity that is going to monitor this, that's going to enforce this, that's going to limit this.
[711] If we don't all agree in this country or in the world, that there is some force out there that is evaluating this objectively with many, many experts to help us navigate this thing.
[712] If we don't have that, then the war is over, because the technology is not stopping.
[713] I don't think anyone's naive enough to think that the technology is stopping.
[714] The train is hurtling down the tracks.
[715] It will not slow down.
[716] It will just keep accelerating.
[717] And if we don't all agree and trust on some shared reality, we're completely fucked.
[718] There's just no way, right?
[719] We'll be defenseless.
[720] Yeah.
[721] And we need to realize that, you know, in a previous century, we regulated things like air pollution or water pollution.
[722] You know, if you pollute the river, then you can be taken to court.
[723] There are regulations about it.
[724] We need to regulate the pollution of our information flow, which is now happening all around us.
[725] And again, we need to have, you know, like you have regulations of who controls the land and who controls the factories and so forth, regulations about who controls the data, our data, and what can be done about it.
[726] And, you know, I've been watching, for example, now all the process of the nomination for the Supreme Court.
[727] And people are, you know, very, very sharply checking: what does she think about abortion?
[728] What does she think about gay marriage?
[729] What does she think about gun control?
[730] And all these are important.
[731] I don't deny it.
[732] But hardly anybody asks, okay, what does she think about AI?
[733] What does she think about ownership of data?
[734] Maybe in her career, in five years or 10 years or 20 years, the most important legal cases she will have to decide will not be about abortion or gun control.
[735] It will be about artificial intelligence and data ownership.
[736] And the thing is that it's hardly on the political radar.
[737] I mean, what's the difference between Democrats and Republicans in their policy on AI?
[738] I don't know.
[739] Who knows?
[740] Yeah, who knows?
[741] Well, and again, it seems at least that the pattern in this country is we do trust certain government agencies when our mortality is at risk.
[742] So I don't think there's a left or right person that questions the FAA when they're circling the airport and trusting that that system will get them on the ground safely.
[743] We all concede to that because I think our lives are directly threatened by that and we have to believe in that system.
[744] And I guess I don't think people evaluate the threat of this stuff.
[745] If they evaluated the threat of this accurately and felt like it was existential, I do think we would then believe in the regulatory force to protect us.
[746] Yeah, I mean, there is no way to survive in the 21st century, certainly as a free society, without a new regime of regulation for all these new powerful technologies like artificial intelligence, like big data algorithms and so forth.
[747] And I think that this should be a kind of big bipartisan affair.
[748] That no matter what your thoughts are about other matters, at least there should be an agreement.
[749] You know, we can discuss exactly what are the regulations and which authorities should be in charge of enforcing them.
[750] But it should be obvious that we need them and we need them very, very soon.
[751] I agree.
[752] I think we should label this our new Russia.
[753] Yeah.
[754] Russia 2 .0.
[755] Well, you've all, I mean, I just cherished our last conversation, and this one as well.
[756] I hope you'll continue to write things, and I hope you'll come back and talk about that with us.
[757] You're a global treasure.
[758] Oh, thank you, making me a bit embarrassed, but thank you very much.
[759] That's part of the reason you're a global treasure is that that embarrasses you.
[760] That's good.
[761] Be safe in these difficult days, and hopefully once this is over, it will be over. I mean, pandemics last for a certain while; they don't stay forever. And hopefully, when it's possible, then we can meet in person, either in California or here. I would love it. I'd love either option. You're so fascinating and so fun to talk to, so I just look forward to it. So take care, be well, bye-bye. And now my favorite part of the show, the fact check with my soulmate Monica Padman.
[762] This was a funny one.
[763] Tell me. Well, because remember how much anxiety I had about, I'm like, I don't know how to talk to him about Sapiens.
[764] We've already talked about Sapiens.
[765] I had all this anxiety that I wasn't going to be able to engage him.
[766] Yeah, get it up.
[767] Engage him for an hour.
[768] Yeah.
[769] And then that was not the case.
[770] Not at all.
[771] It went fast.
[772] It flew again.
[773] Well, because we always referenced the first interview with him as being the weirdest experience with time space.
[774] That's right.
[775] And it kind of happened again, not as extreme.
[776] Yeah.
[777] But still extreme.
[778] Probably just only not as extreme because it was via Zoom.
[779] I bet if he was in person, it would have been.
[780] Two things.
[781] Yeah, I totally agree with that assessment.
[782] And then also the thrill of getting to meet him the first time is so heightened.
[783] Sure.
[784] And now he's just an old friend of ours.
[785] I was just like a boring old normal friend.
[786] Do you think we'll feel that about Bill?
[787] Oh, gosh.
[788] I'd like to find out.
[789] I'm willing to risk it.
[790] Me too.
[791] I'm willing to brisk it.
[792] By the way, someone sent me a really cool gray diet coat sweatshirt with a hood.
[793] Cute.
[794] It's really cute.
[795] It's like a nice sweatsher.
[796] Dang.
[797] Mm -hmm.
[798] I love nice sweatshers.
[799] Me too.
[800] They're the nicest.
[801] So we should say, when you're listening to this episode, the election will have happened.
[802] Yeah, yeah, yeah.
[803] And we are recording this pre -election.
[804] So we have nothing to say about that today.
[805] We don't.
[806] But congratulations, whoever won.
[807] Although I doubt, oh, this could be an Ostromacy moment.
[808] It probably won't be decided either.
[809] You think?
[810] I mean, I think it'll seem obvious where it's going, but I think there'll be a ton of uncounted ballots still.
[811] Perhaps.
[812] There was so much early voting that I have hope there's more of a definitive answer.
[814] Yeah.
[815] It's very interesting.
[816] People are listening to this and they're going to be annoyed.
[817] Some people, I don't know.
[818] I hope not.
[819] But some people are going to be very happy and some people are going to be very upset today.
[820] That's right.
[821] Yeah.
[822] I guess if you're listening to this, the good news is you're not actively shooting a gun at your neighbor.
[823] God, I hope.
[824] I hope that's not where we are on this day.
[825] Yeah, I don't think so.
[826] Should we talk about veneers?
[827] Oh, yeah, let's do it.
[828] Because I just saw you flash your teeth.
[829] Okay.
[830] Very nice teeth.
[831] Thank you.
[832] I feel the same way about your teeth.
[833] Thank you.
[834] Yeah.
[835] And I was really upset at you last week and when I learned that you had gone to a dentist appointment.
[836] By the way, an amazing dentist.
[837] Sure.
[838] And then you then told me that you guys had a conversation about veneers.
[839] I don't think you should get veneers.
[840] Do I say it weird, verneers?
[841] You do say it kind of.
[842] You add an R. I do.
[843] Vernors is what I want to say.
[844] That's a very popular pop in Michigan.
[845] They make a ginger ale that is very effervescent.
[846] Oh.
[847] And it's so effervescent that generally when you pop the cap off of the glass bottle, you sneeze pretty shortly thereafter.
[848] Oh, my God.
[849] And in Michigan, when people are sick, they say, "Have a Vernors."
[850] It's strongly recommended that people drink Vernors when they're sick.
[851] Back to your warner's.
[852] Right, but first real quick, I need to know why it cures sickness.
[853] Like if your tummy hurts?
[854] It's like chicken soup.
[855] I don't know.
[856] It's just, it's what you drink when you're ill. Although ginger ale in general is something people will tell people to drink.
[857] Yeah.
[858] But this is a cure -all in Michigan.
[859] You could have gonorrhea, have a Vernors.
[860] Cut off your hand, have a Vernors ASAP.
[861] Heart palpitations, get a Vernors down the... I've got to get my hands on some Vernors.
[862] You do.
[863] It's a cure -all without the opium.
[864] Don't worry.
[865] It doesn't have the opium as most cure-alls had.
[866] True.
[867] Anyways, back to your Roneers, your Keith Ranieres.
[868] I haven't decided yet to say that.
[869] It's so much deeper for me than an aesthetic point of view, right?
[870] Right.
[871] Although I have an aesthetic argument I would make.
[872] Sure.
[873] And then I have a, it hurts my heart in the deepest way to think that you would think you don't have the most beautiful smile.
[874] And that's what really upsets me about it, that you could look in the mirror at your sparkly white teeth that are beautiful, straight teeth, and think you want a different set.
[875] Well, that's nice of you to say.
[876] But that's like, you know, what dads say to their babies.
[877] It's what you're going to say to the pee baby, too.
[878] No, no, no, no. This isn't a, no, you look great in that dress moment.
[879] Well.
[880] This is, I, there's many episodes where I've talked about how nice your teeth are.
[881] This isn't new.
[882] Well, so I have a little gap in between my two front teeth.
[883] Don't shake your head.
[884] This is real.
[885] Okay.
[886] I have a gap, and I have filled that gap with bonding multiple times in my life.
[887] It chips and chips and chips.
[888] It does not stay.
[889] It's not a good permanent option.
[890] Okay.
[891] So if I want a permanent option, that is a veneer.
[892] But then that led to the real -time conversation because Erica was there, and my favorite part of Erica's smile is her gap.
[893] I know, but see, you're tricked a little bit with mine because I have half the bonding is still in there.
[894] The bottom half is what chipped.
[895] So you can't see the gap for really what it is.
[896] So maybe what I should do is get that bonding out, and then I'll walk around with a big gap.
[897] That's not a big gap because I can already see.
[898] I've looked at where the little piece of bonding is.
[899] It's very small gap.
[900] It's noticeable.
[901] It's a noticeable gap.
[902] I hope so.
[903] And I'm not saying, by the way, gaps are very cute.
[904] Erica's is so cute.
[905] Oh, my God.
[906] It's like, I think, you know, one of the most appealing parts of her.
[907] I know.
[908] But it's hard to just get a gap all of a sudden when no one's used to a gap as opposed to like.
[909] You think it'll be a pop out.
[910] Yeah.
[911] It's going to be a Halloween.
[912] I don't think so.
[913] And then the point I made aesthetically is when everyone gets verneers, then everyone has the same smile.
[914] And who cares then?
[915] If there's five women in your social circle, and when they all smile, it's the exact same smile, great.
[916] I'm so not interested in anyone's smile now.
[917] I really do understand that argument.
[918] I do.
[919] And I might still, I'm still, I'm on the fence.
[920] I'm deciding.
[921] I think you're going to do it.
[922] Yeah.
[923] Now, look, if you had brown teeth, I get it.
[924] I don't think anyone's going to say like brown teeth are charactery and interesting and intriguing.
[925] That's rough, brown, because it signals a lot of things that probably aren't even true, like a lack of cleaning your teeth, maybe fungus.
[926] And they might be true.
[927] It's the color of poop.
[928] You know, there's a lot of things that I can understand objectively.
[929] If I had brown teeth, everything would be brown.
[930] Yeah, yeah.
[931] I'd be monochrome.
[932] Might be cool.
[933] Well, no, no, your eyes, whites of your eyes.
[934] Oh, then I'd have one pop -out eye.
[935] Oh, eyes.
[936] Oh, I, our eyes.
[937] So brown, yeah, yeah.
[938] But if you got sparkly white straight teeth as you do, then I think it's all about like, well, thank God there's some uniqueness to it.
[939] And then I can remember your smile.
[940] Then I know what Monica's unique smile is.
[941] I hear you.
[942] Heard.
[943] Heard.
[944] Message heard.
[945] But I'll see you in a couple of weeks with a big grill.
[946] Well, listen, I can't not get them because you don't want me to.
[947] Yeah, of course not.
[948] But if what I'm saying, you find merit in the argument.
[949] Of course.
[950] And if you recognize that if I was making the same argument about eight of our friends and you agreed, and yet you didn't agree on yourself, that might be, you know, something to look at.
[951] Yes.
[952] You know, I can't be peer pressured.
[953] When we talked about the veneers.
[954] Well, I would argue that you're, this is you succumbing to peer pressure.
[955] This is what you don't understand.
[956] So I've been wanting veneers since I was ten years old.
[958] And it's been a conversation I've had with my friends when we were younger.
[959] Like, if you could change one thing, what would you do?
[960] What would you change?
[961] And I would always say I wanted veneers.
[962] So cut to now, I have the opportunity to do that.
[963] It's obviously hard for me to be like, actually, nah.
[964] When I've been telling myself, I want that.
[965] But also wouldn't you say when you declared that that you didn't like how you looked at a 10 out of 10.
[966] And I hope that you don't like how you look at a 5 out of 10 now.
[967] I hope there's been progress in liking how you look.
[968] Well, the progress is that it's not that I like how I look more.
[969] It's that I care less.
[970] It just feels like a step in the wrong direction to that self -actualization where you realize that you're so uniquely beautiful and wonderful.
[971] Well, I said I need to think.
[972] So I'm not just jumping in.
[973] Right.
[974] I'm saying the notion of you looking in the mirror and not thinking you have perfect teeth, I don't like that.
[975] I know.
[976] Yeah.
[977] And I like that you feel that.
[978] And you want me to get on the train.
[979] No, I don't want you to get on the train at all.
[980] And I think you should step into my shoes a little bit and think, like, you wanted things when you were eight years old that you can now have, and you have them.
[981] Uh -huh.
[982] A lot of cars and motorcycles.
[984] Yep.
[985] Yeah.
[986] And I have muscles now.
[987] Exactly.
[988] Um, okay, Yuval.
[989] All right.
[990] You've all.
[991] Sorry, Yuval.
[992] Sorry, Yvall.
[993] Okay, so you mentioned a ranking system for countries in one of Malcolm Gladwell's books.
[994] Brought it up a bunch of times.
[995] Are you talking about the pilot?
[996] It's in the chapter about Korean air.
[997] And what he talks about is that IBM, because they had all these satellite offices all over the world, they had to send people to study what their power dynamic with authority was.
[998] And that has a name and it was given a number.
[999] And so they, and this was just for internal use.
[1000] Right.
[1001] But then when that internal ranking was graphed against these pilot errors, I guess you could call them, it's a perfect correlation.
[1002] Yeah.
[1003] It's from outliers.
[1004] I just Googled.
[1005] Oh, okay.
[1006] I just Googied.
[1007] Okay.
[1008] Okay.
[1009] Also, I just learned about a really cool book that I think we should have the author on.
[1010] Which one?
[1011] It's called The Culture Gap.
[1012] Okay.
[1013] And it's very similar to what you're saying, that, you know, different cultures.
[1014] Have different fears of authority and...
[1015] Yeah, there's all these different levels of things and you can monitor and where each country sort of ranks on the scale.
[1016] Well, the really sad case in the book to demonstrate this is that, um, I want to say they were Brazilian pilots, and they were flying into New York, and they were asked to go into a holding pattern because air traffic control asked, and they were running out of gas, and they kept saying, but very gently and in a very mitigated way, we're running pretty low.
[1017] And the guy was like, that bossy, not afraid of authority, New York traffic controller is like, well, tough shit, you got to circle.
[1018] And they just kept circling until they ran completely out of fuel and crashed into Long Island and died.
[1019] And an American, or especially an Israeli, would have been like, fuck you, I'm landing the plane right now.
[1020] We're out of fuel.
[1021] Yeah, exactly.
[1022] And then also if you look at the black box voice recorders of this one flight that there was ice on the wings, similar situation where the co -pilot had this huge fear of authority culturally.
[1023] And he kept saying like, I think we got a fair bit of ice on the wings.
[1024] Like, not stop, there's ice on the wings.
[1025] It was all these mitigated sentences trying to work towards, hey, I don't want to be annoying, but, you know.
[1026] Okay, and can I fact check my fact check?
[1027] Yeah.
[1028] It's not called the culture gap.
[1029] It's called the culture map.
[1030] Oh.
[1031] It's by Erin Meyer, and I'll tell you the reason I know about it is because Callie works at Netflix, and it's a very popular book at Netflix.
[1032] Oh, and they'll probably talk.
[1033] turn it into a documentary or something.
[1034] Well, she can come here to promote the book and then she can go to her show on Netflix.
[1035] Great.
[1036] Good for that.
[1037] I'd love to have her on.
[1038] Let's have her on.
[1039] Okay, we're going to have her on.
[1040] Call her right now.
[1041] I'm going to call her right now.
[1042] Decoding how people think, lead, and get things done across cultures.
[1043] But it's in the same vein.
[1044] And I think you can just learn so much.
[1045] Also, one takeaway, because Callie was telling me about the book, I was like, I wish Americans, you know, we get so wrapped up and how different we are.
[1046] Uh -huh.
[1047] And, like, Republicans, Democrats, and within America, we're so concerned with how different we are.
[1048] But, like, when you read this book, it's just so clear, we're all the same.
[1049] We have a lot of...
[1050] You mean Americans.
[1051] Americans.
[1052] Yeah, yeah, yeah, yeah.
[1053] Yeah, we're way more the same than us and Brazilians.
[1054] Exactly.
[1055] And we have a lot of the same wiring.
[1056] And it'd be nice if we could remember, like, oh, we're actually all the same compared to these other cultures, and it might help us get on the same page.
[1057] Any Canadian looking at the left and right, they're like, oh yeah, you guys are all blowhards and know-it-alls, so fucking convicted about your opinion.
[1058] It's like all like apex of conviction on both sides.
[1059] It's, yeah, I think that's a good way to come together, realizing our culture is actually way more American than it is like Los Angeles or the South or, you know.
[1060] Big time.
[1061] How many books has Yuval written?
[1062] I'm going to guess.
[1063] Okay.
[1064] I'm going to say four.
[1065] Okay.
[1066] On his website, he has four listed.
[1067] Oh, okay.
[1068] The ones you just list.
[1069] Sapiens, Homo Deus, 21 Lessons for the 21st Century?
[1070] Yes.
[1071] And now the graphic novel.
[1072] Yeah.
[1073] That's what he has listed.
[1074] It's crazy his first book was Sapiens.
[1075] I think that thing sold like 18 million copies worldwide or something.
[1076] But that's why I'm a little confused because when I just type in his name, not on his website, but his name, it says in 2007 there's a book called Special Operations in the Age of Chivalry.
[1078] And I wonder if he wrote, like, academic stuff prior to Sapiens, since he was a professor and it was more academic.
[1079] That's true.
[1080] Maybe it was just like a published paper or something.
[1081] But that was in 2007 and it is written by him.
[1082] Okay.
[1083] It does say book.
[1084] Okay.
[1085] So he's got five then.
[1086] But he's not listing that on his website.
[1087] He doesn't.
[1088] He wants everyone to forget about that.
[1089] Yeah.
[1090] All right. And, you know, he's too smart to have too many facts, so that's all. Similarly, I don't really... I talk about, um, Hit and Run and CHiPs a lot, but I did make a movie called Brother's Justice. I wish you would talk about it more. You should be proud of that one. I love that one. Well, I think because I made it for five thousand dollars, and I know it looks and sounds terrible, that I forget to ever mention it. So funny. Well, thank you so much. I enjoy it a lotty lot. Okay. Okay, love you. Love you, bye.
[1091] Bye you, all.
[1092] Follow Armchair Expert on the Wondery app, Amazon Music, or wherever you get your podcasts.
[1093] You can listen to every episode of Armchair Expert early and ad-free right now by joining Wondery Plus in the Wondery app or on Apple Podcasts.
[1094] Before you go, tell us about yourself by completing a short survey at Wondery.com/survey.