The Joe Rogan Experience XX
[0] Joe Rogan podcast, check it out.
[1] The Joe Rogan Experience.
[2] Train by day, Joe Rogan podcast by night, all day.
[3] So I was saying, we were talking about phones and cameras and, in fact, the compact camera is essentially dead.
[4] I had an Apple camera.
[5] I don't know if you remember them.
[6] No. But it was a, I think it was one megapixel.
[7] And it was about the size of this book, the Count of Monte Cristo book.
[8] That's it right there.
[9] Oh, okay.
[10] How many megapixels?
[11] There's two of them there.
[12] One's flat like that.
[13] I didn't have that one.
[14] I had the one on the right.
[15] Yeah, that's the one I had.
[16] A QuickTake 200.
[17] Yeah.
[18] I think it used floppy disks, if I remember.
[19] I'm trying to remember what you put in there.
[20] This was in the 90s, I want to say.
[21] Yeah.
[22] And that was a big deal.
[23] I mean, I don't know what the megapixels were, but I seemed to remember it was like one.
[24] Could have been one.
[25] Yeah.
[26] It was a big deal.
[27] Like you could take some good-ass pictures with that one.
[28] Okay, so it is some sort of an SD card.
[29] So, are you an amateur photographer?
[30] I am, yes.
[31] Do you use actual film?
[32] Do you develop your own photographs?
[33] No longer.
[34] So I am digital, as most photographers are these days, except for people into nostalgia and retro and hipster stuff.
[36] But I do have a manual focus camera, so I am old school in that way.
[37] And I set my own aperture.
[38] I use a tripod when I can.
[39] And I do take it seriously.
[40] I love the gadgets, but I also love thinking about visual experience.
[41] I started off as a psychologist studying visual cognition.
[42] And so I'm interested in how the brain perceives color and shape, what makes for an aesthetically pleasing image, what makes for a nice landscape, what makes for a nice portrait.
[43] So it combines my love of gadgets with my love of visual cognition.
[44] There really is an art to it, too.
[45] You know, the idea of just pressing a button and aiming a camera, people go, well, anyone can do that.
[46] But anyone doesn't have the eye, the ability to, like, frame it properly and figure out what angle to take and how to focus things.
Well, yeah, because you're taking a three-dimensional scene, a three-dimensional scene that changes as you move around, even as you shift your head from side to side, as you look at the scene through two eyes, so you get depth information from stereoscopic vision.
[47] You've got 180 degrees of visual angle, so you're always looking at a panorama.
[48] Then you're converting that into a two-dimensional rectangle.
[49] It's a restricted frame of what the world is.
[50] It's flat, unless you're into stereo photography, which I sometimes do as well.
[51] But generally, it is flat.
[52] It doesn't change when you move your head.
[53] So it's a two-dimensional object.
[54] And so I think the art of photography is combining an appreciation of that part of the world that you are capturing with an aesthetically pleasing rectangle that has colors and shapes, shapes that would have to work even if it was just an abstract rectangle of blobs of color; it's got to work at that level.
[56] At the same time, it's a picture of something in the world.
[57] And combining those two different mindsets, like it's reality, but it's a flat rectangle.
[58] That's what the art of photography is.
[59] What is stereo photography?
[60] That's when you have two lenses, like we have two eyes.
[61] And so you take two photographs from slightly different vantage points, and then you view the pair of images through a viewer that allows each eye to focus on its own image.
[63] Oh, so it's like a 3D movie type deal?
[64] Well, 3D movies, which were a fad, they were a fad first in the 50s when Hollywood had to compete with TV.
[65] And of course, now they've been revived with IMAX, and they were huge in the 19th century.
[66] That was their equivalent of television.
[67] People would sit around with these wooden stereograph viewers, and they'd see side-by-side pictures that would pop into depth when they saw them through the viewer, of the Eiffel Tower and the pyramids and the American West, the Grand Canyon.
[68] And that was kind of the way the whole family or a bunch of friends coming over would amuse themselves in the 19th century.
[69] Is this supposed to be a visual, like a video representation of what that would look like, Jamie?
[70] Yeah, it's like taking a GIF and bouncing back and forth between the left and right photo, so you can sort of see it without the viewer.
[71] You know, you can do it with, they used to sell stereo cameras, and I think there's still one or two that you can get.
[72] They were big in the 50s.
[73] You can get the equivalent by doing what they call the astronaut two-step, because the astronauts who walked on the moon would take stereo photos with a regular camera.
[74] Just you put your weight on your left leg, take a picture, put your weight on your right leg, take a picture.
[75] Naturally, it's going to shift the camera over a couple of inches.
[76] So you have two images that were taken kind of like from the perspective of your left eye and your right eye. The trick then is you can't just take a pair of pictures, put them side by side, and have your left eye look at the left picture and the right eye look at the right picture.
[78] I mean, you can if you really train yourself, and I've trained myself to do this, and a lot of perception psychologists have.
[79] But ordinarily, because your eyes both converge and diverge in and out, you go cross-eyed or wall-eyed, and the lens in each eye has to focus what you're looking at, those two reflexes are coupled, so that if you have each eye looking at a picture, your brain thinks it's infinitely far away, and so you focus for infinity, so each picture is blurry.
[80] Then when you try to get it into focus, now your brain's thinking, well, something is nearby.
[81] I've got to make my eyes a little more cross-eyed so I don't get a double image, and you lose each image going to a separate eye.
[82] So that's why you have these viewers, kind of like the View-Masters they sold at tourist traps, those plastic contraptions with a ring of photos, where they just have two lenses, one for each eye, and that spares your eye from having to focus.
[83] You can just focus at infinity, and the lens makes the picture sharp.
[84] When you focus at infinity, your eyes are parallel.
[85] They're both looking out into the distance, and each eye can see its own photo, and then it snaps into a 3D illusion.
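The geometry being described here, that two photos taken a couple of inches apart encode depth as horizontal shift, can be sketched in a few lines of Python. This is a toy model with made-up numbers, not anything from the conversation:

```python
# Toy sketch of stereo parallax: an object's position differs between the
# left and right photos by an amount inversely proportional to its distance,
# which the visual system reads as depth. All numbers are illustrative.

EYE_SEPARATION = 2.5  # inches between the two camera positions (the "two-step")

def parallax_shift(distance):
    """Horizontal disparity between the left and right images, in arbitrary
    image units; nearer objects shift more between the two photos."""
    return EYE_SEPARATION / distance

near = parallax_shift(10.0)    # an object 10 units away
far = parallax_shift(1000.0)   # an object 1000 units away
print(near > far)  # True: the near object shifts more between the pair
```

That larger shift for nearby objects is exactly the cue the viewer reunites into a single scene with depth.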
[86] If I remember correctly, a few years ago, there was a camera on a phone that was taking three-dimensional images.
[87] Do you remember this, Jamie?
[88] It was one of the Android phones.
[89] You know, like Android, you know, because it's such an open source thing, they kind of have freedom to do wacky things to try to attract attention and try to get people to buy them.
[90] And so they had developed this camera on it.
[91] It was like an enormous camera apparatus on the back of a phone that took three-dimensional images.
[92] Do you remember this?
[93] I'm not making this up, am I?
[94] I vaguely remember something like that.
[95] I want to say it was like eight or nine years ago.
[96] It was quite a while ago.
[97] There are a number of ways of having a picture pop into depth using stereo vision.
[98] One of them is the technique that goes back to the Victorians.
[99] You put two lenses in front of the eyes, and then the eyes can both look straight ahead.
[100] Each one can see its own image, and they're both sharp.
[101] You can also, in virtual reality, what you often have is you wear goggles that are effectively shutters for the left eye and the right eye.
[102] So you block the left eye, and then the screen shows the image that goes to the right eye.
[103] Then you block the right eye, and the screen shows the image that goes to the left eye.
[104] But it happens faster than the eye can resolve, so it doesn't even look like it's flickering.
[105] And that's how the old 3D TVs used to work.
[106] That was a fad in the late 90s, early 2000s, never really caught on.
[107] but for a while they were selling 3D TVs.
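The alternating-shutter scheme just described can be sketched as a toy frame schedule. This is hypothetical Python for illustration, not any real display API:

```python
# Sketch of active-shutter 3D: the screen alternates left- and right-eye
# frames while the glasses black out the opposite eye, fast enough that
# neither eye perceives flicker.

def shutter_schedule(num_frames):
    """Return (image_shown, eye_blocked) pairs for an alternating sequence."""
    schedule = []
    for i in range(num_frames):
        if i % 2 == 0:
            schedule.append(("left", "right"))   # show left image, block right eye
        else:
            schedule.append(("right", "left"))   # show right image, block left eye
    return schedule

print(shutter_schedule(4))
# [('left', 'right'), ('right', 'left'), ('left', 'right'), ('right', 'left')]
```

The same alternation underlies both the shutter-glasses VR goggles and the old 3D TVs mentioned above.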
[108] That was even more recent.
[109] There was something, I believe, in the 2000s.
[110] What is this?
[111] 2007, it was 2009.
[112] It's a Samsung phone. It doesn't show a picture of it.
[113] There it is.
[114] Is that it?
[115] Well, then I also had that one phone that had the display that was 3D, and that never took off either.
[116] Oh, that's what I'm thinking of.
[117] Where you'd look at the thing correctly and, like, it was supposed to show things.
[118] Right.
[119] I think it was even the RED phone, maybe, and they just bailed on the project.
[120] But I believe this.
[121] It had a camera that took specific types of...
[122] Yeah.
[123] So you could use all that stuff.
[124] Yeah.
[125] But it didn't take off.
[126] I still have it.
[127] I don't even know where it is.
[128] You bought that big clunky thing.
[129] I remember that.
[130] And it took like a year to get it, didn't it?
[131] Yeah, a lot of things.
[132] Yeah.
[133] Yeah.
[134] Then there's the lenticular photos, which is like the winking Jesus, where when you tilt it, Jesus, you know, winks at you or waves.
[136] And that can be used for stereo too.
[137] And there you have the two pictures.
[138] Again, always taken one where the left eye is, one where the right eye is.
[139] Then the trick is how do you get each photo to the appropriate eye?
[140] And with the lenticular photos, it's as if each photo is cut into teensy-weensy little vertical strips and they're kind of interdigitated. Then on top of it you have a bunch of tiny half cylinders aligned with the images, so that the left eye and the right eye, which are, you know, a couple of inches apart, are looking at the pictures through slightly different angles, and these cylindrical lenses make sure that each eye can only see the half image at the appropriate angle.
[141] It's gotten much better, and now they're artists who actually use it as an art form, where as you move, the image moves like it's a real scene in the world.
[142] But I remember them as, when I was a kid, often with cheesy cartoons or Jesus or Santa Claus. Yet another way of just basically getting two images, taken from different vantage points, each to the appropriate eye.
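The strip-interleaving idea behind lenticular photos, each image sliced into vertical strips and interdigitated column by column, can be sketched in plain Python. The helper below is hypothetical, with the images represented as simple 2D lists of pixels:

```python
# Sketch of lenticular interleaving: the output image takes vertical strips
# alternately from the left-eye and right-eye images, mimicking the strips
# that sit under the half-cylinder lens sheet.

def interleave_stereo_pair(left, right):
    """Interdigitate two same-sized images column by column: even-numbered
    columns come from the left image, odd-numbered columns from the right."""
    height = len(left)
    width = len(left[0])
    out = []
    for y in range(height):
        row = []
        for x in range(width):
            src = left if x % 2 == 0 else right
            row.append(src[y][x])
        out.append(row)
    return out

# Tiny 2x4 example: 'L' pixels from the left image, 'R' from the right.
left = [["L"] * 4 for _ in range(2)]
right = [["R"] * 4 for _ in range(2)]
print(interleave_stereo_pair(left, right)[0])  # ['L', 'R', 'L', 'R']
```

The lens sheet then ensures each eye sees only its own set of strips, which is what makes the image pop into depth or change as you tilt it.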
[143] Photography is fascinating.
[144] The technology around photography is fascinating.
[145] But do you have an optimistic view of our relationship with technology?
[146] On the whole, yes, but of course it crucially depends on what the technology is used for.
[147] If it's making better bio-weapons or better nuclear weapons, it's not necessarily such a good thing.
[148] Do we really need better nuclear weapons?
[149] Don't we have enough nuclear power to nuke the entire world multiple times over?
[150] We sure do.
[151] I think we need better nuclear energy sources.
[152] Yes.
[153] Like fourth-generation nuclear.
[154] That's probably the only way we're going to get out of the climate crisis.
[156] Well, fourth-generation nuclear. The problem with nuclear is this relatively small number of accidents that have happened, you know, Fukushima, Three Mile Island, there's a few of them.
[157] There haven't been that many, but everybody associates Chernobyl with nuclear power.
[158] Like, oh my God, it's going to melt down.
[159] It's going to kill everybody.
[160] And if you really pay attention to what nuclear power is capable of, it's really capable of zero emission energy.
[161] It's really capable of, I mean, there are some problems with the waste, but those problems can be resolved with better versions of nuclear power. Whereas Fukushima is a perfect example: they were working on very old technology where they had one backup, and the backup went down too because of the tidal waves, and then that was it. The tsunami knocked out the whole system, and now they're in this perpetual state of meltdown. They don't know what to do with all the waste, they don't know what to do with all the water they have. They're developing these trenches, if you've paid attention to what they're doing?
[164] Yeah.
[165] Well, they're freezing it and...
[166] No, and that was spectacularly bad design.
[167] People don't know that there was another nuclear power plant in Japan, that during that same tsunami, people actually went into that nuclear power plant for safety because it was so well built.
[168] It was so removed from the reach of the ocean, even during the worst conceivable tsunami.
[169] It was better designed, better situated, and people used it as a refuge.
[170] I know it sounds like an episode out of The Simpsons, like, let's get into the nuclear power plant to be safe.
[172] Right.
[173] But that really happened.
[174] So I've written about this a lot.
[175] I wrote an op-ed in the New York Times called Nuclear Power Can Save the World.
[176] And indeed, one of the big impediments is a feature of psychology that I also write about in my book, Rationality, namely the availability bias.
[177] Namely, when people assess risk, they don't look up data; they don't count up the number of accidents compared to the number of years that nuclear power plants have been in operation and how many there are.
[179] You remember examples.
[180] And we estimate risk, we meaning the human race, by how easily we can dredge up examples from memory.
[181] We use our brain search engine as a way of calculating probability.
[182] And indeed, some of these flamboyant accidents like Three Mile Island and Chernobyl are what people associate nuclear power with.
[183] And so the hurdles to building new power plants have just been crippling, even though it's the biggest, most scalable form of energy and the safest form of energy: if you think of it in terms of number of deaths per amount of energy made available, it's probably the safest form of energy ever developed.
[184] Yeah.
[185] But our sense of danger comes from remembering these examples.
[186] I know someone who blames the climate crisis on the Doobie Brothers and Bonnie Raitt and Bruce Springsteen, because their 1979 film No Nukes, a benefit concert coming around the time of Three Mile Island, kind of poisoned an entire generation, maybe two generations, against nuclear power.
[187] And, you know, the world needs energy.
[188] We're not, people aren't going to give it up.
[189] We saw that at the Glasgow meeting where, you know, India and China and Indonesia, they're saying, sorry, we're not going to do without the energy that lifted you guys out of poverty.
[190] Right.
[191] So we're going to get it one way or another.
[192] And they're not going to get it completely from sun and wind, because that depends on the weather; the sun doesn't shine at night or when it's cloudy, and the wind doesn't blow 24-7.
[193] They're going to get it some way. We, not just they, need energy; we're going to get it one form or another, and nuclear is the way to deliver abundant amounts of energy with pretty much no emissions.
[194] But if you ask people, the average person who hasn't really looked into this would say, oh, you know, wind and solar.
[195] But I flew into Hawaii last week when I went on vacation with family.
[196] And when we flew into Hawaii, they have those wind turbines that are sitting on the island of Maui.
[197] And they weren't moving.
[198] I go, see that, kids? I was pointing to it out the window.
[200] I go, that's a spectacular failure.
[201] I go, it doesn't generate that much energy anyway.
[202] They kill a lot of birds.
[203] These poor birds don't know what's going on.
[204] They fly right into those propellers and get chopped up.
[205] Yeah, I mean, I think wind energy is going to be part of the mix, but you just can't power an entire city.
[207] Well, it's also, they're ugly.
[208] There's so many of them.
[209] They litter the side of a mountain or a countryside.
[210] It's just, I think it's gross.
[211] I actually find them beautiful.
[212] Really?
[213] Yeah, I like them.
[214] Oh, my God, there's one in California.
[215] There's like a wind, see if you can find this, because it looks so gross.
[216] It's a, what do they call them, wind turbines?
[217] It's like a wind turbine field.
[218] Yeah, right.
[219] Have you seen it?
[220] It's enormous.
[221] Yeah, there it is.
[222] You don't think that looks gross?
[223] Oh, that looks pretty gross.
[224] But on the other hand, that is pretty...
[225] That's disgusting.
[226] That, to me, looks like a funeral, like a graveyard, rather.
[227] It's like a horrific graveyard of good ideas.
[228] But, you know, if it would prevent a climate catastrophe, I could live with it.
[229] Right, but it's not going to.
[230] It's not going to.
[231] No, not by itself.
[232] Not by itself.
[233] I mean, we need everything we can get.
[234] I don't think it's going to do anything.
[235] Like, the amount of energy you get out of those is relatively small in terms of, like, the amount of space they occupy.
[236] Well, isn't Texas generating a reasonable amount of electricity from wind these days?
[237] I don't think I'd use this place as a great example of how electricity is made.
[238] The whole thing almost went down when it got cold out last year.
[239] When I was living here, there was the first year we were here.
[240] And the winter, it wasn't even a bad winter.
[241] You live in Boston.
[242] Oh, you lived at Cape Cod for a while.
[243] I grew up in Newton, Massachusetts.
[244] Oh, okay.
[245] Yeah, so I'm used to winter, like an actual winter.
[246] When winter, that kind of winter hit here, everything was fucked.
[247] Like, everything got shut down and they almost lost the entire, I mean, obviously, you know, Texas has its own grid.
[248] Yeah.
[249] You know, and it almost lost the entire grid.
[250] We were like four minutes from the grid going down.
[251] Like, hey, guys, whatever you're doing, it's not good enough.
[252] Yeah, I think we need a mixture and, you know, we're going to need better battery storage.
[253] Yes.
[254] Your neighbor Elon is working on that, but still we don't have a battery that can power Chicago for a month.
[255] No, we don't.
[256] We don't have enough, like, the idea of getting rid of nuclear waste is still problematic.
[257] It's a problem, you know, I say, but it's not a problem the way climate change could be a problem.
[258] And I say, first, let's save the planet, then we can figure out what to do with the nuclear waste.
[259] It's not that big a problem.
[260] I mean, you could fit, you know, I think a single person's equivalent of nuclear waste in a lifetime is about a Coke can. And, you know, you could put the country's nuclear waste in a couple of Walmarts and seal it up.
[262] Seal it up.
[263] And then, you know, then temporarily guard it the way right now most of the nuclear waste is kept on site in concrete casks.
[264] You can stand right next to the cask.
[265] There's zero radiation that escapes.
[266] At some point, we can bury it underground.
[267] We can recycle some of it in the next generation plants.
[268] It's a problem, but climate change is a bigger problem.
[269] So let's save the planet first, and then we can worry about what to do with the nuclear waste.
[270] It's funny, but in comic books, nuclear power always leads to a great result.
[271] Like the nuclear energy, the radiation, always leads to superheroes.
[272] I don't think we should count on that.
[273] You always get like Spider -Man or the Fantastic Four.
[274] You'd get awesome stuff.
[275] Well, I remember as a child when everything nuclear was good.
[276] You'd see these pictures of, you know, super zucchinis and massive turnips from seeds that had been irradiated, and the mutants that produced supercharged vegetables would be featured in county fairs.
[277] But that was just after the era where people thought that nuclear bombs could excavate harbors and dig canals.
[278] I mean, there was a period of, let's say, irrational exuberance about nuclear power in this country.
[279] Are you familiar?
[280] I'm sure you are, but with the issues that they had with that radioactive paint and the women who worked with it.
[281] Oh, yeah, the glow-in-the-dark watch dials, which had radium in them, yeah.
[282] Yeah, and these poor women that worked in these factories painting these things, they would touch the brush to their tongue.
[283] Yeah, to draw it into a fine point.
[284] Yeah, the same way an artist would, or, you know, when you have a shoelace where the aglet has fallen off, you put it into your mouth so that the little threads make a fine point.
[285] You do that with the brush, and it would be coated in radium paint.
[286] And these poor women developed horrific radiation poisoning and developed holes in their faces, and it's horrible.
[287] Yeah, you've got to be careful.
[288] I mean, radiation is a big deal.
[289] But the thing is, the understanding of radiation is much better than it used to be.
[290] For sure.
[291] There can be ways of really poisoning people, as in the case of the watch-dial painters, but it's not true that there's a significant risk no matter how small the radiation dose.
[292] There are levels of radiation that are perfectly safe, and we live with them all the time because there are rocks that naturally emit radiation.
[293] Yeah, that's interesting, right?
[294] Like the idea of radiation, people always assume that that's a negative thing.
[295] A lot of people are very concerned with the radiation that emits from their cell phone, right?
[296] But it's a very minute amount of radiation.
[297] And again, like you were saying, radiation emits from everything, like rocks, like the sun.
[298] That's right.
[299] I mean, there's even a theory that a small amount of radiation might even be healthful, but whether or not it is, it's not necessarily harmful.
[300] But going back to what I write about, namely psychology: together with the availability bias, namely that people base their sense of risk on how easily they can think of examples, especially catastrophic examples, there's also a psychology of contamination, where there is no safe dose, where one drop can contaminate a substance of any size.
[302] I mean, just think of, say, a big container of water: if someone spits in it or pees in it, you wouldn't be reassured by someone saying, oh, it's just, you know, one part in a million.
[303] Right.
[304] It's like it's contaminated.
[305] Sorry, I'm not going anywhere near it.
[307] And that does infect our psychology of risk where we intuitively feel that any amount of radiation is too much.
[308] Yeah, radiation is, it's a big one.
[309] It's one of those ones that carried over from the 1940s, after the drop of the atomic bombs and all those Godzilla movies and all these different things. There was so much talk about radiation in popular fiction and films that it became like a thing where people immediately associate it with danger.
[311] They do.
[312] Also, there is a confusion in people's minds sometimes between nuclear power and nuclear weapons.
[313] People imagine that if a plant melts down, it could blow up like an atom bomb or a hydrogen bomb, which is physically impossible.
[314] Or they imagine that any country that has nuclear power will have an easy pathway to nuclear weapons.
[315] But there are lots of countries with nuclear power plants without nuclear weapons, and there are countries with nuclear weapons without much nuclear power.
[316] So they really are separate technologies.
[317] Your book, the new book, is Rationality.
[318] Rationality.
[319] What it is, why it seems scarce, why it matters.
[320] It is kind of scarce.
[321] It certainly seems scarce.
[322] What has happened to us?
[323] Well, it's...
[324] I think that there's a lot of rationality inequality.
[325] Yeah.
[326] Because, you know, at the top end, we've never been more rational.
[327] Right.
[328] Not only do we have just mind-boggling technology in terms of mRNA vaccines and smartphones and 3D printing and artificial intelligence.
[329] But we have rationality applied to areas of life that formerly were just a matter of, you know, seat-of-the-pants guesswork and hunches and relying on experts.
[330] So you have things like Moneyball, where some genius thought, well, if you make decisions in sports like drafting and strategy based on data, instead of the hunches of some old general manager, you could actually have an advantage.
[331] And so the Oakland A's went all the way with a fairly small budget for players because they applied data.
[332] Now every team has a statistician.
[333] Data -driven policing.
[334] One of the reasons that the crime rate in the U.S. fell by more than half in the 1990s.
[335] It wasn't that all of a sudden guns were taken off the street.
[336] It wasn't that, you know, racism vanished or inequality vanished.
[337] Part of it was that police got smarter since a lot of violence happens in a few small areas of a city, often by a few hotheads, a few perpetrators.
[338] If you concentrate police on where the crime hotspots are, you can control a lot of crime without that much manpower.
[340] We were just having a conversation.
[341] Medicine, evidence-based policy and governance.
[342] So there are areas in which we've applied rationality in areas that formerly were just gut feelings and hunches.
[343] I'll give you one other example: effective altruism, a movement that I'm kind of loosely connected with.
[344] Where do your charitable dollars save the most lives?
[345] Should you buy malaria bed nets?
[346] Should you buy seeing eye dogs for blind people?
[347] It makes a big difference, and so charity now is becoming more rational.
[348] So all of these areas, policing and sports and charity and government are becoming more rational.
[349] But, of course, at the same time, you've got, you know, chemtrail conspiracy theorists.
[350] You've got the idea that COVID vaccines are a way for Bill Gates to implant microchips in people's... They're not?
[351] Don't they make you magnetic?
[352] Yeah, right.
[353] What's interesting to me is that there's a thing that goes along with irrational thought, where you have irrational thought that is confined to your party lines, right?
[355] Oh, yes.
[356] If you are, I mean, this is just a blanket statement, but if you are right wing, you are more likely to dismiss the worries of climate change.
[357] Yeah, absolutely.
[358] In fact, that is the main predictor of whether you dismiss climate change.
[359] It's nothing to do with scientific literacy.
[360] You know, a lot of my fellow scientists say, oh, the fact that there's so much denial of man-made climate change means we need, you know, science education in the schools.
[361] We need scientists becoming more popular and making the climate science more accessible.
[362] Turns out that whether you accept human-made climate change or not has nothing to do with how much science you know.
[363] And a lot of the people who do accept it know diddly about the science.
[364] They often will think, well, yeah, climate change, isn't that because of the hole in the ozone layer and toxic waste dumps and plastic straws in the oceans?
[365] A lot of them are out to lunch.
[366] Whereas some of the climate deniers, they're like well -prepared litigators.
[367] They can find every loophole.
[368] They know every study.
[369] A good lawyer can argue against anything.
[370] What does predict your acceptance of climate change is just where you are on the political spectrum.
[371] The farther to the right, the more denial.
[372] So that's what you said is absolutely right.
[373] It's not just more denial, but this willingness to instantaneously argue it.
[374] Like the subject came up, oddly enough, in jiu-jitsu. We were after class, just getting dressed and putting stuff away.
[376] And someone said, man, it's just a fact of life that it's getting hotter every year.
[377] And this guy jumped in immediately with like this defense of this idea that climate change is nonsense.
[378] And it's like, listen, it's a cycle.
[379] It's always been going on like this.
[380] I'm like, how much research have you done?
[381] Like, do you think people affect it at all?
[382] I'm like, yeah, it is a cycle, right?
[383] If you go back and look at core samples and you look at the ice ages and you look at all the various times and the history of the earth, the climate has moved.
[384] We were actually just talking about this yesterday.
[385] I had someone on who was an expert in ancient civilizations and all these archaeological mysteries that they've found.
[386] And one of the things we were talking about was the Sahara Desert, that the Sahara Desert goes through this period of every 20,000 years or so where it's green, and then it becomes a desert again, and then it becomes green again.
[387] And it goes back and forth.
[388] And 5,000 plus years ago, it was very green.
[389] And now it's an inhospitable desert.
[390] And, you know, this guy just, like, had this, like, right-wing talking point.
[391] Instead of arguing with him, I said, how much research have you done?
[392] I'm like, what do you think is happening?
[393] Like, well, it's just a cycle.
[394] Like, it's definitely a cycle, but don't you think it's extraordinary the amount of CO2 we put in the atmosphere?
[395] Yeah, there can be superimposed trends.
[396] There can be cycling on top of the cycle, and on top of that, there can be the forcing that we're doing.
[398] It's both things.
[399] Clearly, the earth varies.
[400] It has varied forever in terms of like the climate shifting back and forth.
[401] But clearly, like if you look at any like Mexico City is a great example.
[402] I flew into Mexico City once and I took photos because I couldn't believe there wasn't a fire.
[403] I'm like, I can't believe this.
[404] Like when you live in Los Angeles, you're used to smoky skies when there's forest fires and wildfires.
[405] But Mexico City, there was no fire.
[406] It was just pollution.
[407] I'm like, these poor fucking people.
[408] They have to live in this shit every day.
[409] This is crazy.
[410] No, that's true.
[411] Actually, what surprised me the most of all the places I've been is Africa, Tanzania and Uganda, where there is a brown haze everywhere, and that's because people burn wood and charcoal.
[412] Oh, that's the thing that people need to realize, too.
[413] Well, they heat up wood to make the charcoal, and they bring the charcoal, and there is a haze over the landscape like nothing I've ever seen.
[415] A lot of the worst, unhealthiest air pollution is people cooking with open fires in their own houses.
[416] Isn't that wild?
[417] Which is wild.
[418] Yeah.
[419] You would never think that burning wood outside would have a significant impact on the environment, but it really does.
[420] Yeah.
[421] I mean, it isn't forcing climate change.
[422] It's a different kind of problem.
[423] Right.
[424] It's another problem.
[425] It's a different kind of pollution.
[426] But going back to what we were talking about, one of the big conclusions of Rationality is that a lot of what we deplore now as just crazy stuff, the conspiracy theories and the fake news, comes from one bias, the myside bias, which you kind of already alluded to.
[427] Namely, you believe in the sacred beliefs of your own clique, your own political party, your own coalition, your own tribe, your own sect, and you paint the other side as not just stupid but evil for having different beliefs.
[428] And there's a perverse kind of rationality in being a champion for your cause because you get brownie points from all your friends.
[429] And if you were to defy them, if you were to accept climate change in a hardcore right -wing circle, you'd be a pariah.
[430] You'd be social death.
[431] So there's a perverse kind of rationality in championing the beliefs of your side.
[432] It isn't so good for democracy as a whole when everyone is just promoting the beliefs that get them prestige within their own coalition rather than the beliefs that are actually true.
[433] Yeah, it's strangely prevalent, right?
[434] Like, it's so common that people have this ideology.
[435] They subscribe to the belief system that is attached to that ideology, whether it's left wing or right wing, and they just adopt a conglomeration of ideas.
[436] Instead of thinking these things through rationally and really looking at them, they just have an ideology, whether it's left or right wing.
[437] And it seems to me, it's a real shame that we only have two choices in this country politically.
[438] Like if you look at Holland, there are a lot of countries that have many, many choices.
[439] And I think if we had many, many choices, you'd still have tribalism, but at least you'd probably have a more varied set of ideas.
[440] We have very polarizing perspectives.
[441] We have a left and a right.
[442] And each side thinks the other side are morons and are ruining the country.
[443] Absolutely.
[444] There has been a rise in negative polarization that is in the sense that the people you disagree with aren't just mistaken.
[445] They don't just have a different opinion, but they are evil and stupid.
[446] So that has risen, especially at the extremes.
[447] It's still true that a majority of Americans call themselves moderate, but the extremes hate each other more.
[448] And it's interesting why that's happened.
[449] It's, you know, the common explanation is you blame it on social media and people being in filter bubbles.
[450] You know, that might be part of it, but part of it may also be people segregate themselves in terms of where they live more so that you get kind of educated, you know, hipsters and knowledge workers in cities, and you get less educated people moving out to the outer suburbs and staying in rural areas.
[451] So people just don't meet people who disagree with them anymore, who come from different backgrounds, or less than they used to.
[453] And then some of the organizations and institutions that used to bring people from different walks of life together, churches, service organizations like, you know, the Elks, the Rotary Club and so on are declining.
[454] So we tend to hang out more with people like ourselves.
[455] Well, even in universities, which used to be a place where people on the left and people on the right could debate, I mean, in high schools even. Like, I remember when I was in high school, Barney Frank debated someone from the Moral Majority.
[456] And I remember watching it, I think I was 16, and we went and sat and watched this debate and watched Barney Frank trounce this guy and mock him.
[457] And it was pretty fun.
[458] But it was interesting because we got to hear two very different perspectives.
[459] But one, at the time, Barney Frank, who was just better at it and had better points, was more articulate and had a better argument, and we walked away from that, having heard both sides, but having heard one side argued more effectively.
[461] And you don't get that anymore.
[462] Instead, now, you get one side, and when someone tries to bring in someone of a differing opinion to debate this person, that person gets silenced, they try pulling alarms in buildings, and they shout and blow horns and call everyone a Nazi.
[463] And it's unfortunate because you miss the opportunity like I got to see when I was 16, where I got to see a more articulate person with better points of view, better perspectives, argue more effectively that their perspective was more rational.
[464] Yeah, no, I think that that's vital.
[465] Now, there is some evidence that people who have really hardcore beliefs can't be talked out of them with any amount of evidence.
[466] Some people.
[467] Yeah, some people, but exactly.
[468] So people ask me, it's a question I get asked a lot when I talk about rationality.
[469] I say, well, how do you convince a real QAnon believer that there isn't a cabal of Satan-worshipping cannibalistic pedophiles in the Democratic Party and Hollywood?
[470] In the basement of a pizza house?
[471] In the basement of the Comet Ping Pong pizzeria.
[472] You know, the answer might be for some of them, you can't.
[473] It's kind of like the question, how do you convince the Pope that Jesus was not the son of God?
[474] Well, you can't.
[475] I mean, some people will go to their grave believing what they believe, but you don't have to convince everyone.
[476] There are people who are not so committed.
[477] They might find it a little plausible, but their identity isn't fused with it.
[478] It doesn't define who they are, and they might be open to argument.
[479] And, of course, new babies are being born all the time, and they aren't born believing in QAnon or chemtrails or 9/11 truther theories, and some of them can be peeled off by rational arguments.
[480] I think it's not tried enough, including in issues that I strongly believe in, like human -made climate change, like safety and efficacy of vaccines, and on and on.
[481] I think that scientists and public health officials have not been willing enough to show their work.
[482] They've just set themselves up as priests and said, believe me, I'm a scientist.
[484] Well, that could not be farther from the way science works.
[485] The whole point of science is there is no authority.
[486] You've got to show the goods.
[487] It's data.
[488] You always might be wrong.
[489] And what kind of increased my confidence, say, in human -made climate change, 25 years ago, I was a little bit open to it.
[490] But then seeing the objections, like it's all just cycles, or it's just an artifact of the temperature measurements: the cities grew, so the weather monitoring stations that used to be out in the country are now in the city, and cities are hotter.
[491] That's, you know, I actually heard this from a Nobel Prize winner.
[492] But seeing the counter arguments where there's a site called Skeptical Science, where they take on every one of the objections to human -made climate change, and they explain why it's unlikely to be true.
[493] I find that fantastically convincing, and I think that we should not give up on people's ability to take evidence seriously.
[494] Granted, there are some people who won't. We all, to some extent, act like lawyers; we argue a cause.
[495] If there's a counterargument, we rack our brains to figure out how we can, you know, refute the counterargument.
[496] So there is that, especially when the belief is close to your personal identity.
[497] That's not true of all beliefs for everyone.
[498] And public health officials, government officials, scientists should be prepared to show their work.
[499] This is why I believe it, not just this is the truth.
[500] The personal identity issue is a huge factor, isn't it?
[501] I mean, it seems that if you, did you watch that four-part documentary series on HBO about QAnon?
[502] It was called Into the Storm.
[503] I did not watch that.
[504] It's excellent.
[505] Yeah.
[506] And it's by, um, Cullen, how do you say his name, Hoback?
[507] We had him on, Hoback.
[508] I forget how to say his name.
[509] But, uh, we had him on the podcast, and I watched the documentary series with my mouth open the whole time.
[510] I'm like, oh, it's amazing.
[511] It's really good.
[512] But you get a sense of what's driving these people, and the personal identity aspect of it is a big factor.
[513] A big factor is the tribe, that they're all this one group of patriots, and they're all in it together.
[514] Like even the way they have their little logo, where we go one, we go all.
[515] Wow, that says it all.
[516] Yeah, they've developed this sort of tribal climate where they really believed that they were going to stop this evil takeover of the government and, you know, supplanting the Constitution and, you know, killing our freedoms.
[517] And they thought they were going to do it through this one person who was a leaker.
[518] This is Q. Yeah, Q. Well, this documentary shows that most likely, again, most likely, I don't know.
[519] Most likely Q was this guy who was running 4chan, who was fucking with people. And he actually took it over from another guy on 4chan who started fucking with people, and then the style of the drops changed. And, you know, there's real evidence that this one guy would be the only person that would have access to be able to post things at certain times, and during these certain times Q got to post when other people couldn't post. And it seems pretty clear that there was some fuckery afoot.
[520] Right.
[521] But the people that were all in on the QAnon theory, these people had President Trump photographs on the walls of their houses, and they believed in things so wholeheartedly, the way they communicated online.
[522] It was a part of their tribal identity.
[523] And it was also a tremendously entertaining, you know, multi -actor online game too.
[524] You look for cues, clues, you share them.
[525] If you find something that no one else has missed, then you're kind of a local hero.
[526] You get a lot of credit.
[527] Also, so this is a problem that I had to think about a lot when I wrote Rationality.
[528] Because, you know, I'm a cognitive psychologist, and I, you know, like everyone in my profession, I teach students about, you know, the gambler's fallacy.
[529] Like, you know, if there's a run of reds on a roulette wheel, people mistakenly think that a black is more likely, whereas, of course, the roulette wheel has no memory, and each spin is independent.
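The independence Pinker describes is easy to check empirically. Below is a minimal sketch, not part of the conversation, that simulates a simplified red/black wheel and measures how often black follows a run of three reds; the streak length and spin count are arbitrary choices for illustration:

```python
import random

random.seed(42)

# Simplified fair roulette wheel (no green zero): each spin is
# independently red or black with probability 1/2.
spins = [random.choice(["red", "black"]) for _ in range(200_000)]

# Collect the outcome that follows every run of three reds in a row.
after_streak = [
    spins[i + 3]
    for i in range(len(spins) - 3)
    if spins[i] == spins[i + 1] == spins[i + 2] == "red"
]

# The gambler's fallacy predicts black is "due" here; independence
# predicts the fraction stays near 1/2 regardless of the streak.
frac_black = after_streak.count("black") / len(after_streak)
print(f"P(black after three reds) ~ {frac_black:.3f}")
```

Running it shows the fraction hovering near 0.5: the wheel has no memory, exactly as stated.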
[530] So anyway, I have a list of those fallacies.
[531] Then someone says, okay, well, now explain QAnon.
[532] And, you know, the standard cognitive psychology textbook is not much use in explaining such a, you know, well -developed but bizarre set of beliefs.
[533] So part of what I came to in trying to make sense of this is, as you said, part of it is just building up a tribal identity, a set of sacred beliefs that just define your tribe.
[534] If you believe it, you're a member in good standing.
[535] If you doubt it, then you're some sort of traitor or heretic.
[536] So there is that.
[537] But another part of it is it kind of depends what you mean by belief.
[538] I think there are two kinds of belief.
[539] There's the kind of belief about the physical environment, you know, the world that we live in, where you've got to be in touch with reality because reality gets the last word.
[541] You know, reality is what doesn't go away when you stop believing in it.
[542] That's a quote from Philip K. Dick, the science fiction writer.
[543] And even the people who believe in the weirdest conspiracy theories, you know, a lot of them hold a job, and they, you know, pay their taxes and they get the kids clothed and fed and off to school on time.
[544] They're not like, you know, psychotic.
[545] They obviously are in touch with cause and effect in the real world.
[546] Like if your car is out of gas, it's not going to move.
[547] You've got to understand that if you want to drive, and people do understand it.
[548] And wishful thinking is not going to make your car go if there's no gas in the tank.
[549] And, you know, the vast majority of people know that.
[550] They're not going to say, well, I'd be really upset if the car didn't go when there's no gas in the tank.
[551] So therefore, I'm going to believe that it will go.
[552] I mean, people don't believe that.
[553] Reality really is a pretty good check in the world of day-to-day cause and effect.
[555] But then there's this whole other realm of things that we'll never get to know, like what goes on in the White House and in corporate boardrooms, what happened at the origin of the universe, and why bad things happen to good people and vice versa, these kinds of cosmic questions where I think the sense of most people is that's a different kind of belief.
[556] You know, you can't find out.
[557] No one knows.
[558] So I'm going to believe things that are interesting, that are morally uplifting, that convey the right message, whether they're, you know, true or false, because you can't find out.
[559] No one knows, so what difference does it make?
[560] And there's a whole set of beliefs that fall into that category of kind of more mythology than reality.
[561] You know, religious beliefs.
[562] That's why we say people hold things on faith, because you don't demand evidence that Jesus was the son of God or that God created heaven and earth; you take it on faith.
[564] Sometimes, some of our national founding myths, you know, every country believes that it was founded by a generation of heroes, of martyrs, of great men.
[565] And then, you know, the annoying historians come along and they say, well, yeah, but, you know, Jefferson kept slaves and, you know, the greatest generation in World War II.
[566] They were kind of, you know, racist and they, you know, killed civilians and, you know, bombed innocent people too.
[567] And we don't want to hear that.
[568] We want to think that our side, our heroes are really heroic.
[569] And when it comes to some of these conspiracy theories, I suspect that a lot of them fall into this category of mythological beliefs rather than reality -based beliefs.
[570] So if someone says, I believe that Hillary Clinton ran a child sex ring out of the basement of a pizzeria, what they're really saying is, I believe that Hillary Clinton is so depraved that that's the kind of thing she could have done.
[572] Or even just, you know, Hillary, boo, that's kind of what the belief amounts to.
[573] And it's not clear how factually committed most of them are.
[574] I took an example from another cognitive psychologist, Hugo Mercier, who noted that one reaction of a Pizzagate believer was to leave a one-star Google review for the Comet Ping Pong pizzeria.
[575] He said the pizza was incredibly underbaked.
[576] And there were some men looking suspiciously at my son.
[577] Now, that's not the kind of thing you'd do if you literally thought that children were being raped in the basement.
[578] You know, you'd call the police.
[579] You know, there was one guy, Edgar Welch, who did, you know, come into the pizzeria with his guns blazing in a heroic attempt to rescue the children.
[580] Yeah.
[581] You know, at least he really did believe it.
[582] He then, you know, recanted, he realized that he had been duped.
[583] But most people who say they believe it, they believe it in a kind of different sense than we believe that there's coffee in that cup or that it's going to rain today.
[584] It's a whole different mindset of belief.
[585] It's belief for the purpose of expressing the right moral values.
[586] Did it happen?
[587] Did it not happen?
[588] Yeah, what, you know, no one knows.
[589] Expressing the beliefs of the tribe.
[590] Yeah, and expressing the moral values of the tribe.
[591] Yeah.
[592] And that tribe in particular, it's extremely exciting to a lot of the people that are a part of it.
[594] And one of the things that you see in the documentary is like people got their families involved, like one couple, their child was like chanting, like build that wall.
[595] Like they had this idea that what they were doing was the right thing.
[596] And what they were doing was, you know, really going to save the world.
[597] And it was very exciting.
[598] It's the same thing that I always say about UFOs and Bigfoot.
[599] It's that like the belief that it's real is so interesting.
[600] It's so fun.
[601] It's so fun to believe that it's real.
[602] Yeah.
[603] And what you said about QAnon is that it's kind of an online game.
[604] And I think that was part of the fun of it.
[605] It's like if Q really had information, well, fucking say it, man. Tell us what's going on?
[606] Why, why you, what's with these cryptic drops?
[607] Well, just like those Hollywood movies where you've got the serial killer who deliberately leaves tantalizing clues for the police.
[608] But what about real killers?
[609] The Zodiac killer did that for real.
[610] I guess they never caught him.
[611] That's true.
[612] But most of them don't deliberately leave little clues.
[613] Quite a few of them do.
[614] Quite a few serial killers do deliberately leave clues.
[615] Deliberately leave misleading clues?
[616] Well, there's a psychological aspect of it that I'm surprised you don't know about.
[617] They think that a lot of them somehow or another want to be caught.
[618] And the more time goes on and the more they don't get caught, the more risky they become.
[619] And they make more and more mistakes.
[620] Well, that I do know, yes.
[621] They do, they, at first it's like a real, you know, kind of scary thrill.
[622] Then when they get away with it, they, they want an even bigger thrill next time.
[623] They want to cut it as close to the edge as possible.
[624] Yeah, exactly.
[625] And they think there's also an aspect of it where they want to get caught, for some of them, right?
[626] Some of them.
[627] Okay, that I wasn't aware of.
[628] But it's speculative, right?
[629] I mean, some of them do say when they got caught, I was wondering what took you so long.
[630] Well, they do take bigger and bigger chances to see what they can get away with.
[631] Well, the Unabomber is a great example of that, right?
[632] Like the Unabomber, he left very odd clues to what he did and even wrote this very bizarre manifesto that eventually was his downfall.
[633] Right.
[634] Because his brother, yeah, recognized the ranting.
[635] He was like, I know that kind of crazy.
[636] But, I mean, he put it out there.
[637] I don't know if he wanted to get caught.
[638] No, maybe not.
[639] Yeah, maybe not.
[640] You know, I think he really had, you know, this important message that the world had to understand.
[641] Do you, you ever pay attention to the story behind him?
[642] You know, a little bit.
[643] He was a Harvard graduate.
[644] He was also part of the Harvard LSD studies.
[645] Oh, that's right.
[646] Timothy Leary and Richard Alpert, Ram Dass.
[647] They cooked that fellow's brain.
[648] Oh, maybe.
[649] I think they did.
[650] Well, I also was particularly attentive to it because I could have been a target.
[651] He was sending, you know, letter bombs to popular science writers.
[652] And at one point, I actually got a package that looked suspicious. I was at MIT at the time, so I called in the MIT police, as I was instructed to do, when everyone was afraid of who would be the next target of the Unabomber. It was, you know, all plastered with postage stamps and tied with string, and, you know, from someone I didn't recognize. And we got stern warnings from the MIT police, you know, do not open a suspicious package, call the police. So I did.
[653] And, you know, I expected, you know, the bomb squad to come with, like, plexiglass shields and pincers.
[654] And so the police took the package out of my office.
[655] And then, like, three seconds later, the door knocked.
[656] I opened it up.
[657] They just opened the package.
[658] Maybe they sent some cop in that nobody liked.
[659] Hey, Mike, go open that package.
[660] That's hilarious.
[661] There's a great documentary about Ted Kaczynski that's on Netflix.
[662] and it goes into detail.
[663] His brother goes into detail about his childhood, and one of the aspects, besides the Harvard LSD study that he was involved in, is that he had a disease when he was young, and they separated him from his family as an infant for long periods of time, where they put him in this infirmary, in this hospital setup; no one touched him, no one held him, and he cried and screamed, and it went on for months. And they think that it really negatively affected his psychological development, and that the lack of empathy and the lack of connection that he had to other people was a direct result of his experiences as an infant hospitalized, because it was like for months and months at a time his family didn't get to see him, no one touched him. So it's like the orphans in the Romanian orphanages during the Ceaușescu era. Yes, yes. Um, and he was, he's very different than his brother.
[664] His brother...
[665] Oh, he's a normal guy.
[666] Right.
[667] And his brother who turned him in was detailing all the instances in Ted Kaczynski's life where he realized, like, he's really a problem.
[668] Like if a woman rejected his advances, he would write horrible evil letters to her and do things to sabotage her and just go out of his way to try to attack her.
[669] And he realized, like, Jesus Christ, my brother is a real fucking psycho.
[670] Literally, yeah.
[671] Literally.
[672] And then once the bombing started happening, I think he probably had a notion like, geez, this could be my brother.
[673] And then when he read the manifesto, he was like, I think it's him.
[674] Right.
[675] Yeah.
[676] Crazy, right?
[677] Amazing story.
[678] Yeah.
[679] Psychology is so fascinating because of the way the mind works and the way the mind can be manipulated, with cults and with religions and with ideologies and beliefs. My friend Bridget Phetasy, who has a great podcast, she's got a few great podcasts, but she was interviewing this guy who became a jihadist, this blonde-haired, blue-eyed white guy who got sucked into the ideology of these jihadists and became one of them.
[680] And now he, you know, unradicalized himself, got free of it all, got, I don't know exactly how he did it.
[681] She was just explaining it to me on the phone last night, but now he works to try to help people escape these situations.
[682] But it's almost like a disease of the mind where people get trapped into this certain, very rigid way of thinking, and they refuse to believe that what they're thinking is incorrect.
[683] Yeah, it is, or it's kind of like a matrix, where often cults and terrorist cells will befriend a lonely person.
[684] They'll simulate all the experiences of a family.
[685] They'll often say, you know, we're your family now.
[686] We're a band of brothers and use kinship metaphors.
[687] They'll, and religious cults do this too.
[688] There are a lot of lonely, alienated people out there, and then you suddenly provide them with a warm, loving family and a sense of purpose.
[689] Then you combine that with our own rationalization powers, and I distinguish rationalization from rationality.
[690] But, you know, we're all, to some extent, intuitive lawyers.
[691] That is, we can use our brain power to find the best possible argument for some position that we're committed to.
[692] It's called motivated reasoning.
[693] It's another big source of irrationality, even among smart people, sometimes especially among smart people.
[694] Yeah, it's very disturbing when smart people get locked into these rigid ideologies where they won't examine new evidence.
[695] It drives me crazy because I have very intelligent friends that hit these sort of roadblocks.
[696] And you want to say, hey, man, this is not, you're looking out the wrong way.
[697] It's so true.
[698] So that's another frequently asked question that I get: is rationality the same thing as intelligence?
[699] And to the extent we can measure them separately, they correlate.
[700] On average, smarter people are more rational.
[701] When I say more rational, I mean less vulnerable to standard cognitive fallacies, like the gambler's fallacy, like the sunk cost fallacy, you know, better able to estimate risk and probability and chance, and to avoid logical fallacies.
[703] So they kind of correlate, but not perfectly.
[704] There's an awful lot of smart, irrational people out there.
[705] And especially when you have a smart person who gets locked on to a belief that's close to his or her identity, then, of course, you can muster the best lawyerly skills to defend it.
[706] So it shows that what makes us rational as a species and as a country isn't so much that we've got some hyper-rational geniuses.
[707] It's that we form these communities where different people can check each other's irrationality, in the same way that the whole basis of democratic government, and this goes back to the framers, the founding fathers, is that, yeah, everyone wants power and everyone has too much ambition.
[708] If you let someone get too much power, for sure they'll abuse it.
[709] So the trick is you have one person's power that checks another person's power.
[710] Ambition counters ambition.
[711] You have checks and balances and branches of government.
[712] So likewise, when it comes to ideas, what's to prevent someone with their brilliant theory of the whole that explains everything, but they're really out to lunch, and they're very capable of defending it?
[713] Well, you throw them in a community where they've got to defend it to other people, and other people get to pick holes in it.
[714] That's what makes us rational.
[715] So in science, you've got peer review.
[716] In journalism, at least in theory, you have freedom of the press.
[717] You've got editing.
[718] You've got fact -checking.
[719] You can't just say anything you want.
[720] You've got a whole community of people that try to keep you in touch with reality so that one person's cleverness doesn't get out of hand.
[721] Well, I think one of the things is contributing to today's infatuation with conspiracy theories is that some of them are real.
[722] That's one of the things that drives these QAnon people crazy.
[723] Like, for instance, when the Hunter Biden laptop story gets censored by Twitter and they won't allow the New York Post's link to the article to be posted on Twitter, because there's a concern that people, because this was right before the election.
[724] Yes, right.
[725] People would read into this and decide to reelect Trump.
[726] And that's a real conspiracy.
[727] That's real.
[728] I guess.
[729] I mean, it was a, you know, it was a ham -fisted move, but it wasn't the kind of diabolical conspiracy involving hundreds or thousands of people keeping an amazing secret, like, for example, if the Twin Towers were demolished by implosion.
[730] I mean, it's a different order of conspiracy.
[731] It is, but it is a conspiracy.
[732] There's clearly a bunch of people that conspired to keep evidence from the general public.
[733] They wanted to make sure that evidence wasn't easily distributed, because of this idea that you could put something up on Facebook or on Twitter, and then it catches fire, and then it gets shared by millions of people. And we know that some of the information that gets shared like that is incorrect and is done by foreign entities in order to sow distrust in our political system. That's what the whole Internet Research Agency in Russia is, that Renée DiResta has written about. And, you know, one of the things that they encountered recently is that 19 of the top 20 Christian sites on Facebook were run by a troll farm in Macedonia.
[734] That's amazing.
[735] Isn't that wild?
[736] So these are real conspiracies.
[737] Some conspiracies are real.
[738] Conspiracies do exist.
[739] And in fact, there's a sense in which, going back in our evolutionary history, the biggest threat of aggression was in the form of conspiracies rather than full frontal attacks.
[741] Because if you look at tribal warfare, it's not two sides like on a football field chucking spears at each other.
[742] I mean, they do that, but not that many people get killed.
[743] It's almost more for show.
[744] But where the body counts really get racked up are in pre -dawn raids and in stealthy ambushes.
[745] And so I give an example from the Yanomami, one of the indigenous peoples of the Amazon rainforest,
[746] where the villages are often at war with each other.
[747] One village invited another one over for a feast.
[748] And at the end of the feast, everyone was kind of full and kind of drunk.
[749] Then, on cue, the hosts pulled out their bows and arrows and battle axes and killed all the guests. It's like the Red Wedding in Game of Thrones.
[750] Yeah, right.
[751] So that exists, and that's what people were vulnerable to.
[752] So I suspect you're right in that a certain openness to the possibility of conspiracies came about because there really were conspiracies in our history.
[753] Not just our history, right?
[754] Don't you think they're happening right now?
[755] Not on the scale of chemtrails or QAnon.
[756] I mean, if you think of the number of people that would have to successfully cooperate.
[757] Well, the chemtrail one is just a total misunderstanding, a lack of understanding of what happens when a jet engine encounters condensation.
[759] Exactly.
[760] But, you know, the 9/11 truther conspiracy theory.
[761] Yeah.
[762] The number of people that would have to maintain, like, the equivalent of a non-disclosure agreement for decades, with no one leaking it and everyone being perfectly silent, perfectly coordinated, you know, that kind of defies common sense.
[763] Well, there's also a lack of understanding of eyewitness accounts of things, too.
[764] Like people say, eyewitnesses said that they saw this and heard that.
[765] The problem is that with any traumatic experience, eyewitness accounts are often highly inaccurate.
[766] Oh, you tell me about it.
[767] I mean, that's one of the main findings in cognitive psychology from Elizabeth Loftus at UC Irvine, who has shown in experiments that people confidently remember seeing things that never happened.
[768] Yes.
[769] Well, they're very easily convinced.
[770] Like, it's one of the problems with eyewitness testimony.
[771] That is her discovery, absolutely, that a lot of innocent people have been convicted based on eyewitness testimony, not only when they're coached, but especially after the fact: the more often they're asked to affirm what they saw, the more confident they get, whether it was true or not.
[772] Yeah.
[773] And the fact that we distinctly remember this, I saw it with my own eyes, means nothing in terms of whether it really happened, because we can confidently remember things that never took place.
[774] That's the thing to the coaching.
[775] The coaching aspect of it is very disturbing because you can plant memories in a person's head that were not real.
[776] Oh, easily.
[777] Yes.
[778] So the mind is so fucked.
[779] Like, I talked to people about this before.
[780] I've said, like, how much of your childhood?
[781] How good is your memory?
[782] And people go, oh, my memory's great.
[784] And I go, my memory's pretty good.
[785] It's really good when it comes to, like, I can say things that I remember and quote things and remember numbers.
[786] and stuff like that, but if you ask me, could I give you a detailed account of yesterday?
[787] Yesterday is a blurry slide show to me. Like I kind of like, I've got a few images.
[788] I think I remember where I parked my car.
[789] I think I went in that door.
[790] I think my dog was there.
[791] I remember petting him kind of.
[792] I remember a few things, but I don't have, like, an HD video of my entire day that I can roll and back up. But some people like to pretend that they do.
[793] Right.
[794] And we know that they don't.
[795] And there are certain tricks that we know our memory plays on us, such as we tend to kind of retrospectively edit our memories to make a good story, often one that puts us at the center of historic events.
[796] Yes.
[797] So, you know, a lot of people remember, people of my generation, remember seeing John F. Kennedy assassinated on live TV.
[798] Now, it wasn't on live TV.
[799] Right, ever.
[800] People remember the Abraham Zapruder
[801] 8mm home movie, which was only made public weeks later.
[802] No, no, no, no. It wasn't even released to the public until 13 years later.
[803] Oh, well, in frames, maybe it was just frames.
[804] Well, it was just photographs; stills from the film were released on Geraldo Rivera's television show by Dick Gregory.
[805] Dick Gregory brought the film to Geraldo Rivera's show.
[806] The comedian and activist?
[807] Yes.
[808] Oh.
[809] He's the reason why the whole back and to the left thing came about and people started questioning the official story of Lee Harvey Oswald acting alone.
[810] They released it on Geraldo Rivera's television show in 75.
[811] So it was 13 years after the assassination.
[812] Well, since then, people who've seen the film merge it with their memory of that day, and they think that they actually saw it in real time, which was absolutely impossible.
[813] There are lots of examples like this.
[814] And I know this personally. Like everybody, I think I have a great memory, until I have to fact-check my books,
[815] or until readers point out things that I never bothered to fact-check because they were so obvious.
[816] And then I realized, oh, my God, I have a clear memory of, you know, Ronald Reagan saying that, and he never said it.
[817] Right.
[818] And it is really a sobering experience to do serious fact-checking on your own writing.
[819] You realize how many of your memories were just made up, or not made up out of whole cloth, but polished over time into a more coherent or more satisfying story.
[820] That's the thing we do with stand-up comedy.
[821] We take a story and we kind of trim it and edit it and move it.
[822] Some stories have a kernel of truth and reality to them, but for comedic effect we'll twist it around.
[823] Sure, reality's never that funny.
[824] Yeah.
[825] Sometimes it is.
[826] Sometimes you get lucky.
[827] You know, sometimes you get lucky.
[828] But the memory thing is unfortunate.
[829] It's unfortunate that people don't have good memories and that their memories can be manipulated.
[830] And I was reading an article that someone sent to me today about a terrible case, where a woman, the author of that book, I believe it's called The Lovely Bones, was raped when she was 18, and a man was convicted who was innocent, and he was just released after many, many years of being incarcerated.
[831] and she detailed how she was kind of coached into believing that he was the one.
[832] And she actually picked the wrong person out of a lineup.
[833] And she was told by the prosecutor that that wrong person that she picked out was there as a trick to throw her off because he looked like the other guy and they were friends.
[834] They did it on purpose.
[835] They kind of lied to her and coached her.
[836] And she was 18 and traumatized.
[837] She had just been raped, and she was convinced by these people that she had got the right guy.
[838] They used the junk science of microscopic hair samples, which has since been discredited.
[839] And this poor guy was in jail for more than a decade.
[840] I'm not sure exactly.
[841] No, that's right.
[842] And to her credit, she recanted and she expressed remorse for her role in falsely convicting him.
[843] It's a heartbreaking story.
[844] I mean, all around.
[845] It's horrible, right?
[846] A horrible story.
[847] She's 18, right?
[848] When you're 18, how, you know, I mean, we were talking about someone planting memories in your head, especially when you've had this horrific, traumatic event happen to you.
[849] You've been raped and you think they got the guy.
[850] And they're convincing, these are adults, they're convincing you that this is the guy.
[851] Yeah, no, and it's bad if you're, if you've been traumatized, it's bad if you're 18, but it happens even when you're not traumatized and even when you're, you know, 35, 45, 55.
[852] Yes, yeah.
[853] You know, it's another issue that I talk about in Rationality, where I have a chapter on what psychologists call signal detection theory, also called statistical decision theory. The idea is that none of us is infallible; we all have to rely on noisy signals from the world, and we're never completely sure whether they indicate reality or whether they're, as we say, in the noise.
[854] So you've got to have a cutoff.
[855] You have to say, well, if my confidence is above a certain level, I'll make one decision, like convict in a criminal trial.
[856] If it's below it, I'll acquit.
[857] And so there are two things going on in this kind of decision.
[858] One is, how good is the signal?
[859] That is, how good are your forensics, how reliable are your instruments?
[860] And the other is, where do you put your cut off?
[861] Are you going to be trigger -happy?
[862] Are you going to be gun -shy?
[863] Are you going to say yes a lot of the time?
[864] Or are you going to say no a lot of the time?
[865] And that isn't a question of fact.
[866] That's a question of value.
[867] How bad is it if I get it wrong in either direction?
[868] That is, how bad is it if I miss something that's really out there?
[869] How bad is it if I have a false alarm to something that isn't out there?
[870] And those are two parts of the decision process, and people sometimes confuse them.
[871] So you have people saying, well, the way to deal with crime on the streets is to put more bad people behind bars; the way to deal with terrorism,
[872] We've got to monitor social media and arrest people before they can commit rampage shootings or acts of terrorism.
[873] We've got to believe more accusers.
[874] The thing is, if all you're doing is you're changing your cutoff, saying I'm going to be satisfied with less evidence before I pull the trigger and say guilty, yeah, you're going to convict more guilty people and you're also going to convict more innocent people.
[875] That's just mathematics.
[876] The way to satisfy the ideal of convicting more guilty people, but not falsely convicting innocent people, is your forensics have to get better.
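The cutoff-versus-accuracy point can be sketched numerically. In the standard textbook signal detection model (an assumption added here, not anything stated in the conversation), evidence strength follows two overlapping bell curves, one for innocent and one for guilty defendants, and "convict" means the evidence clears a cutoff. Lowering the cutoff raises both kinds of convictions; only widening the separation between the curves, i.e. better forensics, raises correct convictions without raising false ones:

```python
import math

def phi(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2)))

def rates(threshold, d_prime):
    """Hit rate (guilty convicted) and false-alarm rate (innocent convicted),
    assuming evidence strength is N(0,1) for innocent defendants and
    N(d_prime,1) for guilty ones; d_prime measures forensic quality."""
    hit = 1.0 - phi(threshold - d_prime)
    false_alarm = 1.0 - phi(threshold)
    return hit, false_alarm

# Lowering the cutoff ("satisfied with less evidence") raises BOTH rates:
hi_hit, hi_fa = rates(threshold=2.0, d_prime=1.0)
lo_hit, lo_fa = rates(threshold=1.0, d_prime=1.0)
# lo_hit > hi_hit and lo_fa > hi_fa: more guilty AND more innocent convicted.

# Better forensics = larger separation (d'), which raises hits while the
# false-alarm rate at the same cutoff stays unchanged:
better_hit, better_fa = rates(threshold=2.0, d_prime=2.0)
# better_hit > hi_hit while better_fa == hi_fa.
```

The specific thresholds and d' values are arbitrary illustrations; the inequalities they produce are what the "that's just mathematics" remark refers to.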
[877] And our forensics, a lot of our forensics, you mentioned hair analysis, contrary to what you see on CSI,
[878] a lot of the forensic techniques are close to worthless.
[879] Yeah.
[880] And people get falsely convicted.
[881] I mean, DNA is the most reliable, and that's shown that a lot of people are in jail for crimes they didn't commit.
[882] Yeah, my friend Josh Dubin, a good friend of mine who works for the Innocence Project, I've had him on a few times and recently had him on with a gentleman who was falsely accused and spent a long time in jail for that.
[883] And he has a podcast that he runs on junk science, things like bite marks.
[884] Yes, right.
[885] It's just not reliable at all.
[886] Ballistics, cut marks from tools, you know, which bolt cutters actually went through this chain.
[887] The thing is that, you know, I kind of rediscovered this myself when I went to a talk at Harvard by the FBI's expert in linguistic analysis because I study language.
[888] How do you tell, you know, who wrote the ransom note based on choice of words?
[889] And I realized, this is the top FBI guy,
[890] and it was based on kind of hunches and folklore.
[891] They did not do any statistical analysis to prove that it actually worked.
[892] And it's kind of shocking when you look at the science behind a lot of our forensics.
[893] Handwriting analysis is fairly accurate, though, right?
[894] Well, if it comes to identifying who wrote a note, maybe; if it comes to divining someone's character from the way they write...
[895] Oh, that's nonsense.
[896] Yeah, right.
[897] And that's astrology.
[898] That's astrology, right?
[899] But to be able to tell whether or not someone wrote a ransom note, like, they're pretty good at that, right?
[900] Good question.
[901] I don't want to say not knowing the answer.
[902] I don't want to say it either.
[903] Yeah.
[904] But it's a lot of them aren't so good.
[905] You know, so in the criminal justice system, we have what's sometimes called Blackstone's ratio.
[906] After an 18th-century British jurist who said, better that ten guilty people go free than one innocent person be convicted.
[907] Now that's a moral decision.
[908] That's not a matter of accuracy or knowledge.
[909] And I think it's a good, not a bad rule.
[910] But that's signal detection theory: combining that kind of thinking, like how bad is the error of a false acquittal or a false conviction, which is a moral question,
[911] with how good we are at telling them apart,
[912] which is a question of how good our forensics are.
[913] My concern, when it comes to this sort of inevitable connection of human beings to technology, is that I think it's just a matter of time before we become symbiotic. We're kind of already connected at the hip to our cell phones, but I think it's a matter of time before we get something that is more reliable than human memory.
[914] And a real concern I have is that one day we're all going to be required to be chipped, because that's the only way to get a full HD version of what you did during the day.
[915] And it shouldn't bother you if you're innocent.
[916] It's the same idea of like the NSA spying with what Edward Snowden revealed and some of the people were horrified by it and some of the other people are like, what difference does it make if you're not doing anything wrong?
[917] Like, well, you're missing the point.
[918] Because human beings having that kind of power to look into other human beings' lives are almost always going to abuse it.
[919] And if we do come to a point in time someday where we say, listen, there are, you know, thousands of innocent people convicted every year, probably more than that, sent to jail for crimes they didn't commit, we can stop all of that.
[920] We can stop all that through these chips.
[921] Just, like, chips, do you mean like everyone wears Google Glass?
[922] I mean, like, Neuralink.
[923] Oh, yeah.
[924] Like that kind of deal.
[925] I tend to be more skeptical of that, but the same problem arises if, you know, everyone's wearing Google Glass and has a 24/7 video record of everything they do.
[926] Yeah, but that could, you could take it off.
[927] You know, the chip is there.
[928] Well, you know, in 1984 with a telescreen in every room, it was a crime to turn off the telescreen.
[929] Wow.
[930] We might be headed in that direction.
[931] Yeah, I don't know.
[932] But I tend to be more skeptical than you are. Not that it's, you know, physically impossible.
[933] But, you know, brains are pretty complex.
[934] Yeah.
[935] We're nowhere near that level of specificity.
[936] And I suspect we never will be.
[937] Really?
[938] Yeah.
[939] I think.
[940] So you have no faith whatsoever in AI being sentient?
[941] Oh, so this is separate from, say, neural implants.
[942] Right.
[943] Interfaces with your brain tissue.
[944] Do you have the faith that we will be able to recreate what a brain does?
[945] Down to the last synapse?
[946] I doubt it.
[947] I mean, you know, it's not that it's impossible, but it is kind of gargantuan on a scale that we can barely imagine.
[948] Have you ever talked to Kurzweil about this?
[949] Oh, yeah.
[950] So obviously he has a different opinion.
[951] He thinks that we are going to be able to, by...
[952] I believe his guesstimate is 2045.
[953] Yeah, I know, but he keeps postdating it.
[954] I wonder why.
[955] Yeah, right.
[956] I had a really interesting conversation with him for this sci-fi show. We talked for over an hour.
[957] And at the end of it, one of the weirder parts about it is that what he is trying to do is to get to a point where he can have a conversation with his father.
[958] His dead father.
[959] Yeah.
[960] He thinks that through memories and through whatever recordings and photographs he has, he will be able to replicate his father's personality to a significant or sufficient extent, where he can actually have a conversation with him.
[961] He'd talk to his dead father.
[962] Yeah.
[963] You know, I think he could, if the father wrote a lot, like wrote a lot of correspondence. You know, we already have AI that can kind of fake new text in the style of existing text.
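As a toy illustration of "faking new text in the style of existing text": even a crude bigram Markov chain, far simpler than the neural language models being alluded to, produces new word sequences that only ever follow transitions seen in the source. The corpus below is invented for the example:

```python
import random

def train_bigrams(text):
    """Map each word to the list of words that follow it in the source text."""
    words = text.split()
    model = {}
    for a, b in zip(words, words[1:]):
        model.setdefault(a, []).append(b)
    return model

def generate(model, start, n_words, seed=0):
    """Walk the bigram table to produce new text in the source's style."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(n_words - 1):
        followers = model.get(out[-1])
        if not followers:  # dead end: no observed continuation
            break
        out.append(rng.choice(followers))
    return " ".join(out)

# Hypothetical stand-in for a body of correspondence:
corpus = ("dear son I hope you are well I hope the work goes well "
          "and I hope you write soon dear son the weather is fine")
model = train_bigrams(corpus)
text = generate(model, "dear", 8)
```

Every adjacent word pair in the output is a pair that occurred in the corpus, which is exactly why the result reads as pastiche rather than conversation, the point made in the next few turns.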
[964] That's weird, man. Although, you know, I think it's pretty unsatisfying.
[965] I would not consider that to be having a conversation
[966] with my dead father.
[967] No, I mean, I wouldn't want to, for one thing.
[968] I wouldn't want to be fooled in that way.
[969] No, it's just like that I would find no comfort in that.
[970] Like, I have very good friends that have died.
[971] And if I had the ability to email a fake version of my very good friend that died and get like a response that's very similar to what they would say, that would mean nothing to me. Exactly.
[972] No, that's really right.
[973] See, it's another interesting part of our psychology.
[974] We have this sense of, you know, is something real or not. Is it really connected to the person that we know and love?
[975] And it makes a big psychological difference, even if you can't tell the difference.
[976] It's like, there are lots of examples, and this is from my former collaborator, Paul Bloom.
[977] Someone paid a lot of money for John F. Kennedy's golf clubs.
[978] Now, if it turned out that they weren't John F. Kennedy's golf clubs, if they were just some other guy's golf clubs from the late 50s and early 60s, it would be worth a fraction of the amount and it would be emotionally much less satisfying even if they're the same golf clubs.
[979] But just knowing that there is that personal connection makes a big psychological difference.
[980] Yeah, it means a lot to people.
[981] If you could get some sort of an artifact from a historic figure, you know, I've got a letter from Hunter S. Thompson that he wrote to someone that I have framed my office in L .A. And it's just like, I look at it every now and then I'm like, huh, he really wrote that.
[982] Like his fucking hands wrote on that piece of paper, and it's right there.
[983] And it's very valuable to me because of that weird reason.
[984] Absolutely.
[985] And even though you could not tell the difference.
[986] And the same with great paintings versus excellent forgeries.
[987] We do have a sense of real stuff that really was in contact with other people, with real events.
[988] Yeah.
[989] The real paintings versus forgeries things has always freaked me out.
[990] There was a documentary about this one gentleman who was a master.
[991] at recreating the style of the masters.
[992] Like, he was really good at, like, he could make a fake Picasso.
[993] He could make a fake Rembrandt, and he was making these paintings and selling them for spectacular amounts of money.
[994] And they were really good paintings, but they were bullshit.
[995] Right.
[996] But it's not bullshit.
[997] It's like, it's so weird.
[998] It's because, like, the painting is a real painting, and it's really good.
[999] And it was in the style of these people, And people love the paintings, but they loved it because it was a Rembrandt or because it was a Picasso.
[1000] There's sometimes a fancy word for it: haecceity, H-A-E-C-C-E-I-T-Y.
[1001] It's from Latin, meaning kind of thisness.
[1002] That is, the sense, above and beyond what it looks like, whether it can fool you, just the knowledge that this is the thing.
[1003] Yeah.
[1004] And we're sensitive to it.
[1005] That's the way our minds work.
[1006] Well, it is very strange.
[1007] And sometimes we like it.
[1008] Like, I went to see the Rolling Stones two weeks ago.
[1009] It was amazing.
[1010] Oh, minus Charlie Watts.
[1011] Yeah, minus Charlie Watts, unfortunately.
[1012] But it was amazing.
[1013] Just to, I felt like I was, I was stone sober for the first 20 minutes until I started drinking.
[1014] But I was, I felt like I was on a drug looking at Mick Jagger on stage, like dancing around and singing.
[1015] And I was like, I can't believe he's real.
[1016] I can't believe it's really him.
[1017] But there's something about that real experience of being in the presence of that real guy.
[1018] That's incredible.
[1019] But then there's some people that are really good at cover bands.
[1020] Right, sure.
[1021] All these impersonators that do those cover bands.
[1022] Well, how about the guy who plays lead singer for Journey now?
[1023] Oh, yes, right.
[1024] Do you know the deal?
[1025] Like, Steve Perry retired, but Journey still tours
[1026] with this other fella.
[1027] With an impersonator?
[1028] Yes, but he doesn't look anything like him.
[1029] But he sounds amazingly like him.
[1030] He's really good.
[1031] This is the fella.
[1032] He's, uh, is he from Vietnam or Filipino, right?
[1033] Filipino?
[1034] Yeah.
[1035] Just play a version of his song, because it's so good.
[1036] You have to hear it.
[1037] Because it's so good, it'll freak you out.
[1038] Here it goes.
[1039] So this guy was apparently, I mean, it's dead on.
[1040] I mean, dead on.
[1041] I saw the Wailers a couple of summers ago with a Bob Marley impersonator.
[1042] Whoa.
[1043] And he was pretty good.
[1044] That one I had a problem with.
[1045] Yeah.
[1046] You know, like not taking anything away from Steve Perry, but Bob Marley was a cultural enigma.
[1047] I mean, he was something bigger than just a musician.
[1048] Yeah, that's true.
[1049] And he was, you know, a beautiful man. Yeah.
[1050] I mean, it wasn't the same as seeing Bob Marley.
[1051] Right, of course.
[1052] It was a different kind of experience, but still enjoy it.
[1053] And also, it's, you know, there probably was so much turnover in the band itself.
[1054] I doubt that any original Wailer was still playing.
[1055] Right.
[1056] It's what they call in philosophy the ship of Theseus, after the ship where every plank gets replaced.
[1057] Yes.
[1058] And so after a few years.
[1059] Not a single atom remains of the original.
[1060] Is it the same ship or not?
[1061] Right.
[1062] That's a real thing, right?
[1063] Because they've done recreations or reconstructions of certain boats, and that's how they've done it.
[1064] Yeah.
[1065] And, of course, each one of us is a ship of Theseus, because our atoms turn over.
[1066] Yeah.
[1067] And we like to think we're the same people that we were seven or eight years ago.
[1068] That's pretty crazy.
[1069] Yeah.
[1070] You're literally not the same human you were seven years ago.
[1071] Well, you're not the – well, you are the same human, but you're not the same hunk of matter.
[1072] Well, okay, so where are your memory stored then?
[1073] So your memories, so the thing is that your synapses don't turn over.
[1074] That is your, you know, the actual connections between brain cells in which memories are stored and the patterns of interconnections.
[1075] So even as the molecules turn over, they're in the same, they have the same connectivity pattern.
[1076] Well, we already know that those memories are terrible anyway.
[1077] Well, there is some, yes, they can be kind of edited after the fact.
[1078] They are kind of like Wikipedia stories where anyone can edit them.
[1079] Well, the real problem with memory sometimes is when you repeat a story so many times you don't even remember the real story.
[1080] Oh, absolutely.
[1081] And that's what happens with a lot of these false convictions because the prosecutor keeps asking the witness, you know, are you sure and the more they ask them, are you sure, the more sure they are.
[1082] Right.
[1083] Whether they originally were or not.
[1084] Yeah.
[1085] But absolutely, yeah.
[1086] Elizabeth Loftus, the pioneer in this research, has compared our memories to Wikipedia entries.
[1087] That is, anyone can edit them after the fact.
[1088] Very few people even know how vulnerable people's memories are to being manipulated that way, that someone can easily introduce a false memory into your mind.
[1089] Yeah.
[1090] And they're often, they often increase coherence because, you know, a lot of our lives are more random than we like to think.
[1091] They're not like scripted plots where we actually pursue a goal.
[1092] If you actually had a record of what you did in any given day, a lot of things you forgot what you were doing, you went here, and it wasn't really a very smart thing to do.
[1093] But then you like to present yourself to the world as someone who lives his life with a plan, who could be trusted, whose word is good.
[1094] And so we retrospectively edit our lives to make our lives more coherent, to make them seem more like a scripted story with a protagonist and a goal and a climax and a denouement.
[1095] And so our memories are always much more satisfying as stories than the reality.
[1096] That's why, by the way, I once read an interview with Susan Estrich, who was a political consultant, but before that she was a criminal defense lawyer.
[1097] She said the reason that we're all familiar with the courtroom shows where the first thing the defense lawyer tells their client is, you know, shut up, don't say anything to the police, don't say anything until you're under oath, or let me answer for you.
[1098] The reason isn't that people lie.
[1099] The reason is that people will retrospectively make their memories more coherent than they really were.
[1100] And then that means that they'll talk themselves into a lie, which then the prosecutor can use to impugn their credibility.
[1101] Well, you said the following seven things, you know, under oath, and we can show that number two and number five never happened.
[1102] So anything you say we're going to treat now as a lie.
[1103] And it wasn't that they were trying to fool anyone.
[1104] They were just trying to make themselves seem like sensible humans who did things for a good reason.
[1105] Well, there's also situations where you have a prosecutor that's unscrupulous and all they want to do is catch you lying, and then they could prosecute you for perjury.
[1106] Exactly.
[1107] That's happened to many people that were actually innocent, the initial crime they were trying to be convicted of.
[1108] But then they made false statements to the FBI.
[1109] Oh, absolutely.
[1110] We're sitting ducks for that because we all lie.
[1111] I mean, I think the estimate is every person tells two lies a day on average.
[1112] Jamie tells three.
[1113] That's what I heard.
[1114] But, you know, we do and not necessarily for nefarious reasons.
[1115] It may not be to pull the wool over someone's eyes.
[1116] It's just that we want to seem like sensible humans and none of us are as sensible as we'd like to be.
[1117] And sometimes we just want to, like, get to the point, you know. We just want to, like, cut out some nonsense, so we'll just make something up.
[1118] Oh, that's, I was, I was late, sorry.
[1119] You know, instead of, like, saying what actually happened.
[1120] Come up with some cockamamie story.
[1121] Yeah, it's, the thing about the idea of a digital memory is that it's, it's something that they're working on.
[1122] It's something they do believe that, within our lifetimes, they're going to be able to achieve some sort of
[1123] visual interpretation of what you're experiencing that can be either downloaded or shared or at least examined for veracity.
[1124] Like say if you have an idea of what happened and, you know, there's some sort of a criminal situation, there will be a time in our lifetimes where they'll be able to tell.
[1125] Well, you know, we already have in many cities, and this was true in England for a while where there's basically closed circuit cameras on every street corner.
[1126] You can't.
[1127] Camden, New Jersey.
[1128] It's, like, filled with it.
[1129] Yeah.
[1130] You know, unfortunately, it hasn't led to 1984 in England.
[1131] Fortunately.
[1132] Right.
[1133] For now.
[1134] So far.
[1135] Yeah.
[1136] And, you know, there is, there could be an argument that, you know, it does lead to safer streets and fewer false convictions.
[1137] Yeah, that's the thing.
[1138] It's like there's a trade-off.
[1139] There's a trade-off, yeah.
[1140] Yeah, there's pros and cons.
[1141] You have a lack of privacy, an erosion of your privacy, but you also have the possibility of exonerating someone who could be wrongly convicted. And that has happened, right? Look at this Kyle Rittenhouse verdict that came down. There were many people that had a very distorted idea of what actually took place that day, and then through the trial we got to see what actually happened. Many people did not know that someone had actually pulled a gun on him, and that they had attacked him and knocked him to the ground. When you get to see it, and see when he actually shot them, it changes the whole narrative.
[1142] Yeah, in other cases like the, well, some of the police shootings we never would have known about if they weren't cell phone cameras.
[1143] The Boston Marathon bombers, the Tsarnaev brothers, caught on camera.
[1144] So, yeah, there is a tradeoff.
[1145] You know, the fears that people have after reading 1984, that if you have better technology, that is the slippery slope toward totalitarianism, my own feeling is that that tends to be overblown. You can have some technologically pretty crude countries that are just horrible places to live, because the government can always tap into ordinary conversations, gossip networks.
[1146] If people are really are planning something, they talk to other people.
[1147] They can use friends and relatives against each other.
[1148] And there's some tin pot third world dictatorships that are pretty terrifying places with really crude technology.
[1149] And then there are places like Scandinavia and England where the technology is pretty advanced, but they haven't turned into 1984.
[1150] But then you have places like China.
[1151] Well, that's true, yes.
[1152] Where the technology is very advanced, and they've done some very disturbing things, like this social credit score system. People in America that are staunch advocates for personal liberty are very concerned that something like that is going to make its way here in some sort of innocuous form, that some social credit score thing will be something that we implement, and then before you know it... Like, there was an article really recently where they were saying that your actual credit score, your credit score in terms of ability to borrow money, could be affected by your browser history. Your browser history? Yeah. Did you see that? I didn't. It's not implausible, unless there's some regulation. You can find the article; it was on Yahoo. And the way they were framing it was so insidious, because they were framing it essentially like, you could be able to borrow more money if we can look at your browser history.
[1153] So they were saying that essentially if you're a good guy, we could check your browser history and maybe you'd be eligible for more money.
[1154] Or maybe if you're some guy who wants to Google some naughty things about Joe Biden or some naughty things about Kamala Harris or how did Nancy Pelosi get all that money?
[1155] Hey, maybe you're a fucking problem and you don't really need a house, buddy.
[1156] Well, yeah, and it wouldn't take much artificial intelligence to find patterns in people's browsing history that would predict all kinds of stuff.
[1157] Well, those poor QAnon folks, one of the QAnon folks that was in that documentary, they were staunch Obama supporters.
[1158] They were staunch Democrats.
[1159] And then something happened.
[1160] And, you know, they got radicalized and they start, I mean, that happens all the time to people, right?
[1161] Like, people that were hardcore left-wing switch over to the right, or hardcore right-wing people switch over to the left, and then they find this new ideology, and it's kind of exciting. It's like breaking up with your wife: you get this new wife, and all of a sudden, you know, you've got new problems and everything, but everything's new and exciting. Big smile. What's going on, George? Mom remarried, got a new wife, new life, everything's great. And that's kind of what they're doing. They switch ideologies, and this new ideology becomes exciting.
[1162] Well, and it is true that there's certain patterns of thinking like conspiratorial things that you can find on the left and on the right.
[1163] Oh, yeah.
[1164] There was a Washington Post survey recently showing that nine out of ten Americans believe in at least one conspiracy theory.
[1165] Do you believe in any?
[1166] I hope not.
[1167] I don't think so.
[1168] You don't believe in any conspiracy theories?
[1169] What about Enron?
[1170] That's a conspiracy that's real.
[1171] I guess it depends on what you call a conspiracy theory.
[1172] Well, a bunch of people conspiring to do something I do believe people act in concert, sometimes in secret.
[1173] So in that sense, yes.
[1174] But in terms of ones that kind of defy conventional understanding and involve considerable amounts of cooperation and conspiracy across a wide range in opposition to constituencies that have an interest in maintaining the truth, which is what we tend to call conspiracy theories.
[1175] then when I took the Washington Post survey, I didn't believe any of them.
[1176] People that have an interest in maintaining the truth?
[1177] Or discovering the truth, yeah.
[1178] So saying that conspiracies that are maintained by people that actually have an interest in maintaining and discovering the truth.
[1179] You know, when you've got some means, when you've got kind of checks and balances, where there are people who have a vested interest in
[1180] advancing some goal.
[1181] And there are also people who have a vested interest in stopping them, where they don't have completely free rein, where there are journalists, where there are members of the other political party, where there are people just doing their jobs, where just so many people would have to be acting together who ordinarily would have conflicting interests.
[1182] Give me an example of one of those kind of theories.
[1183] Well, let's say, did the CIA deliberately release the HIV virus in order to sterilize African Americans?
[1184] That's a conspiracy.
[1185] Is that a real one?
[1186] I've never even heard that one.
[1187] Oh, yeah.
[1188] How did it?
[1189] HIV, it doesn't sterilize people, though.
[1190] No, nor did it, nor was it, you know, deliberately engineered.
[1191] But that's a conspiracy theory on the left.
[1192] Right.
[1193] So, you know, so many people would have to be in cahoots.
[1194] Yeah.
[1195] Without anyone actually, you know, blowing the cover.
[1196] That's a very, that's not a very popular one.
[1197] I've never heard that one before.
It's actually widespread in some African-American communities.
[1199] Have you heard of that, Jimmy?
[1200] Yeah.
[1201] Yeah.
[1202] Wow.
[1203] You know, Jeffrey Epstein was murdered.
[1204] You don't think he was murdered?
[1205] I don't think he was murdered.
[1206] Really?
[1207] Why not?
[1208] Well, it's just a simpler theory that he would, he had every reason to kill himself.
[1209] He had the means.
I think it's much easier to believe that a bunch of prison staff were incompetent than that they actually were willing to risk imprisonment for a goal that probably couldn't have meant a whole lot to them.
[1211] But not just incompetent, but the cameras were shut off.
[1212] The cameras that were supposed to be monitoring his area were shut off.
I mean, you know, low-paid civil servants can do all kinds of incompetent things.
Low-paid civil servants can also be bribed.
[1215] They can, but it would have to be an awful lot of people who are bribed and, you know, bribed by who?
[1216] And could that secret really have been kept by so many people?
But you know that forensic scientists have studied his autopsy and concluded that the ligature marks around his neck, the placement of them, and the damage to his vertebrae, the bones that are in his neck, are not consistent with hanging, but are consistent with being strangled, because of the positioning of where the choke marks were.
[1218] When you hang, usually it's higher up because the force of gravity, when you have something tighter on your neck, the force of gravity raises it up to where your neck is meeting your chin.
[1219] He was strangled down low, which is usually what happens as someone gets behind you and chokes you to death.
[1220] I mean, it wasn't what, okay, so I mean, there's a fracture of the bones in his neck that's consistent with strangulation.
Dr. Michael Baden, who's a leading forensic scientist, the guy who used to be on that HBO show Autopsy, who's worked on thousands of criminal cases.
His conclusion was that it was a homicide.
The last one I read seemed to suggest it was perfectly compatible with him killing himself.
[1224] Was that the Hillary Clinton News?
[1225] Yeah, no, no. Who printed that one?
This was, I think, in the Washington Post. Yeah. Of course, it wasn't like he was on the gallows, where there was a floor that fell away and he suddenly had his neck snapped. Right. But still, to be strangled by a rope, you need gravity. If you're going to have gravity, and it's going to be around your neck, that means your body's going to sink down. If that happens, the rope usually goes up to where your chin meets your neck.
[1227] It doesn't stay down here.
[1228] If it stays down here, that's more likely someone gets behind you and chokes you to death.
Well, this is, I mean, it wasn't consistent with the most recent forensic report, which just came out a couple of weeks ago, I think. I'd like to see that.
[1230] Yeah, I think it's the post.
I want to see if someone had a legitimate criticism of Dr. Michael Baden's view of the autopsy.
[1232] But listen, that's an interesting conspiracy, the Jeffrey Epstein one, because, you know, I was told about that by Alex Jones, of all people, more than a decade ago.
[1233] He told me there was a group of people that would compromise very wealthy and powerful folks by bringing them to this place and introducing them to underage girls and filming them.
[1234] I go, that is ridiculous.
[1235] That sounds so crazy.
[1236] But it's not.
It's... Well, there were people who visited, certainly a lot of people who visited him on his island.
[1238] Yeah.
[1239] And there were accounts by women who were brought there who were underage.
[1240] There were such accounts, yes.
[1241] So that's, not all of them credible, but.
[1242] Not all of them credible.
And Ghislaine Maxwell, who's on trial right now, isn't her father some sort of an intelligence agency person?
[1244] No, no, he was a media mogul.
[1245] A media mogul.
[1246] Or her sister, who is, someone's involved in...
[1247] And that's the rumor, yeah.
[1248] He was like Mossad.
That he was Mossad.
So he's a media mogul, and the idea that he was...
[1251] It's just a rumor, so...
[1252] Well, there are a lot of conspiracy theories about the death of Robert Maxwell because he fell off his yacht and drowned.
[1253] Naked.
[1254] Oh, that's not good.
[1255] But then again, if you're drunk and you're partying, that's what happens.
[1256] Well, you know, I think my default assumption always is that the truth is...
[1257] kind of boring and complicated and random.
[1258] It's not always.
[1259] Not always.
[1260] Sometimes people have people killed.
[1261] Sometimes they do.
[1262] So what makes you think of all the powerful people that he compromised?
[1263] So if Jeffrey Epstein really compromised all these powerful people, you know about the painting that was in the lobby of his home?
[1264] Oh, yes.
[1265] It's a wild painting.
[1266] I want to get a copy of it.
[1267] It's Bill Clinton in a blue dress.
[1268] And Bill Clinton in that dress, the way it looks to me, I mean, when you find out later, that Bill Clinton had been invited to Jeffrey Epstein's plane and flew with him more than 26 times over a period of just a couple of years, which is pretty fucking crazy.
[1269] And we know that Bill Clinton's kind of a pig, right?
[1270] That guy having that painting of Bill Clinton in a dress pointing at the viewer, that to me, if I was going to put that in my house, that's a way to say, I own you, bitch.
That's what I would be saying to Bill Clinton. Like, look at that. The Jeffrey Epstein story is a real conspiracy. The death? Yeah, everything, the whole thing. The fact that he really did do that. The fact that he really did get let out of jail, like, he was a convicted pedophile, and he got out very easily. He had a very light sentence. And one of the sheriffs that was trying to prosecute the case was told that it was above his pay grade.
[1272] And, you know, when he talked about it, he said that he was told that Jeffrey Epstein was intelligence.
[1273] And that he had to let him, that he had to, like, not follow up on this.
[1274] The whole case is crazy.
[1275] When you see how many people he flew to these islands.
[1276] Yeah.
[1277] I mean, I had the tremendous misfortune of knowing Jeffrey Epstein because I knew so many people at Harvard, at MIT.
at Arizona State, who got really in tight with scientists. He got really in tight with scientists. What do you think the, what was the motivation of that?
Oh, I think he, you know, he wasn't as smart as a lot of his pals made him out to be, but he wasn't stupid. And there are a lot of hedge fund guys who had an interest in science and liked to indulge it by hanging out with smart people. Yeah. And he was collecting celebrities, scientific celebrities, and he liked to kibitz and schmooze about these ideas. I mean, again, I had that tremendous misfortune of knowing some of the people he was tight with. Did they fly you out to the island? Did they fly me out to the island? Oh, I wouldn't have set foot on his island in a million years. But before any of this stuff came out, I did fly on his plane once, to TED in Monterey, with my literary agent, John Brockman. You just thought you were hanging out with a rich guy.
[1280] Yeah, a rich guy.
[1281] And even then, I thought this guy is a fraud.
[1282] Really?
[1283] Yeah.
[1284] Why did you think he was a fraud?
Because he pretended to be an intellectual peer of the people whose company he was buying, but he was kind of a kibitzer.
[1286] He liked to fool around.
[1287] He had ADD.
[1288] He couldn't keep on track with a conversation.
And I think, because he sloshed money around so freely, a lot of people, including good friends of mine, thought, oh, he's as smart as my academic and scientific colleagues, which he was not.
[1291] He was a faker.
[1292] Eric Weinstein got the same impression when he met him.
[1293] He said, because Eric is a mathematician.
[1294] And when he met him, they were discussing something that had to do with finances.
[1295] And Eric, like his immediate reaction to it was this guy's full of shit.
[1296] That was my reaction.
[1297] I couldn't, I could understand him.
[1298] I couldn't understand why.
[1299] I had five different colleagues who were in tight with him.
[1300] To my tremendous disadvantage because it meant that people would snap pictures and there I would be in a crowd with this sex criminal.
[1301] One of the worst things that has ever happened to me. Was this post his being convicted?
[1302] There was one that was post.
[1303] Most of them were pre.
But one of them was, I was at a scientific conference that he paid for.
And the organizer said, oh, you know, can we put him at the same table? And then someone snapped a picture.
[1306] Oh, criming.
That's, you know, all over the Internet, and it's one of the banes of my existence.
[1308] See, but he is a real conspiracy.
[1309] Like that guy, him, what he, what is said about him is true, that he was compromising all these very rich and powerful people by introducing them to underage girls.
[1310] If it was an attempt to compromise them as opposed to just kind of sharing the favors and befriending people by offering them what he thought was a perk that he got to enjoy.
And then the other conspiracy is, where's this guy getting this money?
[1312] Well, yeah, that is something that we don't know.
[1313] Right.
[1314] There's a lot of money.
[1315] I mean, from Leslie Wexner, the Victoria's Secret guy.
[1316] Well, how about all the people that gave him money?
[1317] Like hundreds of millions of dollars.
[1318] To invest, yeah.
[1319] Allegedly.
[1320] Allegedly.
[1321] Just cut him giant checks, and then they had to resign as CEOs.
[1322] Right.
[1323] Well, there was an era in which if you had something on the ball, if you had a little bit of math, and if you were lucky, you could make a lot of money on Wall Street in hedge funds.
And I think he was in that generation, capitalizing on some opportunities to multiply money. But I've talked to people that understand money that way, though, specifically Eric Weinstein, and he doesn't think he was nearly sophisticated enough to do that. He didn't think he had an understanding of it. He thought he was full of shit. I believe that. He's like, I think this guy's playing a role. He goes, I don't think he's a financial expert at all. And the idea of all these people giving him money to invest, he's like, this is nonsense. And Eric is one of the smartest people I know, and he was talking about his particular field of interest, right?
Like, he's, he runs Thiel Capital.
[1326] Yeah, right.
It's not trivial.
[1328] Yeah, he understands what the fuck he's talking about.
[1329] Yeah.
So when he's talking to someone, it's like, if I'm talking to someone who pretends they're a stand-up comedian, right?
[1331] And I'm like, where do you play?
[1332] Yeah.
[1333] How long have you been doing it?
[1334] Where was your first open mic?
Who are your contemporaries?
[1336] Who do you hang out with?
[1337] Like, what clubs do you work at?
[1338] And they give me some bullshit answer.
[1339] I'll go to my friend.
[1340] Like, that guy's not a comic.
[1341] Like, what is going on here?
[1342] Yeah, right.
[1343] No, I mean, that's the thing that, you know, unless you're a really, really good liar.
[1344] Yeah.
You get exposed.
[1346] No one's that good, though.
[1347] When it comes to something that's so nuanced and so specific, like understanding finances.
[1348] Yeah.
Well, we don't know the answer, because it is true that there's a lot that's still shrouded in mystery.
[1350] But we do know that intelligence agencies do try to compromise people.
[1351] Well, that's what they're in the business of doing.
[1352] That's a conspiracy, isn't it?
[1353] Yeah.
[1354] So I believe in conspiracies, too.
[1355] So do you.
[1356] Yeah.
[1357] No, I believe, no, I can believe conspiracies exist.
The question is, it doesn't mean that every conspiracy theory is true.
[1359] No, most of them are not.
[1360] Yeah.
[1361] That's what's interesting.
[1362] What's interesting is it's so easy to dismiss the idea of a conspiracy theory because if you believe in them, you're a silly person and you can't be taken seriously.
[1363] Like, if you say...
[1364] Well, the reason I think that you've got to start off with an attitude of skepticism toward conspiracy theories is that they are so resistant to falsification.
[1365] Namely, the fact that there is no evidence for the conspiracy is proof of what a diabolical conspiracy it is.
[1366] And so whenever you have an idea that kind of resists falsification by its very nature, it's not that it's necessarily false, but still there should be a really high burden of proof.
[1367] I give the example, I got this for my philosophy professor a long time ago.
Let's say you ask, why does a watch go tick, tick, tick, tick, tick?
[1369] And a guy says, well, I have a theory.
[1370] There's a little man inside the watch with a hammer.
[1371] He's going, wap, wap, wop, wap, wap, wap, on the inside of the watch.
[1372] And he says, okay, well, let's test the theory.
[1373] I'll take a screwdriver, pull off the back of the watch.
And we say, hey, there's no little man inside.
[1375] Just a bunch of gears and springs.
[1376] And the guy says, no, no, there really is a little man, but he turns into gears and springs whenever you look at him.
[1377] Now, you know, that could be true.
[1378] But the fact that the theory is so crafted so that it resists being falsified just should make you very suspicious.
[1379] You need an awful lot of evidence to be convinced of that kind of thing.
[1380] And that's why conspiracy theories are so easy to spin out and often so hard to definitively refute.
[1381] You can't prove that they're not true, but you should have a, greet them with a lot of skepticism.
[1382] Yeah, I've ramped up my skepticism lately.
[1383] On one of the subjects is the UFO subject.
[1384] And the reason why I've ramped my skepticism is because of the transparency of the federal government.
[1385] When they started talking about how UFOs are real, and they started talking about how these things are unexplainable, we don't know what they are, we're trying to monitor them.
And someone who worked for the Pentagon said that there's the reality of off-world vehicles not made on this earth.
[1387] And then I'm like, why are they telling us that?
[1388] And I start looking at this.
I go, this might be a complete cover-up for some new project, for some new propulsion system, some new weapons project, something that's very high-tech and super sophisticated, and they're trying to pass these drones off as something that's extraterrestrial.
I'm like, why are they telling us this?
[1392] I'm not buying this.
[1393] So like the more they give me evidence that they are going to release information about UFOs, the more I'm like, they're full of shit.
These things aren't even real.
[1395] These things are probably some kind of a drone.
[1396] They're probably not really from another planet.
This is a cover story, because of the way they operate. Like, there was a recent press release that just came out, and Jeremy Corbell was talking about it. Jeremy Corbell is the guy who produced this documentary about Bob Lazar, who's the most controversial of all characters when it comes to the UFO world, because he is a propulsion expert who claimed to have worked at Area 51's Site 4, which is a place where they say they have these engineered, or back-engineered, UFOs they're working on.
[1398] They're trying to figure out how to, yeah, you're laughing, see?
[1399] I am laughing, yes.
[1400] But we have spaceships, right, don't we?
[1401] We have spaceships here that we make, right?
[1402] And we live on a planet, right?
[1403] And we're trying to go to other planets.
[1404] So why is it so crazy to think that some person or some thing, some creature from another planet that also has a spaceship would come here?
[1405] Well, because there's a simpler explanation, namely that they are unidentified flying objects.
[1406] Namely, there's an object.
[1407] They fly.
We haven't been able to identify them, because we can't identify every last thing that happens on the planet.
[1409] There are a lot of things where they're going to be distant, they're going to be poorly spotted, and we just don't know what they are.
[1410] And the simplest explanation is something perfectly ordinary, but we just don't know what they are.
[1411] You sound skeptical.
[1412] I am highly skeptical.
And in the most recent videos that were released, there was a guy whose name I'm forgetting now, I could look it up, an expert in optics who had perfectly mundane explanations for these, in terms of image tracking, in terms of perspective.
[1414] Are you talking about Mick West?
[1415] Yes.
[1416] Yeah, Mick West doesn't believe shit.
[1417] That's right.
[1418] He believes that he doesn't believe.
[1419] He's one of those guys.
[1420] Yeah, but he's been...
[1421] His cases...
[1422] He's not correct on a lot of his assumptions as well.
And one of the things, the problem with what he's saying, is he disregards one of the most credible of all the sightings, by this guy named Commander David Fravor.
[1424] And Commander David Fravor was a fighter jet pilot.
And off the coast of San Diego in 2004, they tracked some object on radar that went from 50,000 feet above sea level to 50 feet in less than a second.
[1426] They got visual confirmation of this thing.
[1427] They saw it.
They said it looked like a Tic Tac.
They got video evidence of this thing, and this thing moved off at insane rates of speed, and then went to their CAP point.
[1430] It also blocked their radar.
[1431] It was also actively jamming their radar.
[1432] So as they were trying to track it, like when the jets pulled up to try to track it, it was actively blocking their radar, which is technically an act of war.
But this thing was super sophisticated and moved at such insane rates of speed that if you put a human being inside of it, he said, you would literally be turned into jello from the G-force.
[1434] There's no way you would be able to tolerate it.
And this thing went from where they had found it to their CAP point, which means their predetermined point of destination, where the fighter jets were supposed to go.
[1436] This thing went there and appeared there on radar again.
[1437] So they have visual confirmation from more than two jets.
[1438] They have video evidence of this thing, and then they have radar tracking that shows extraordinary speeds that defy our understanding of physics and propulsion.
[1439] It also showed no heat signature.
[1440] So whatever this thing is, it's not operating the way a jet would work, where you push things out the back to make something go fast forward, the way a rocket works.
[1441] It's operating on some completely different way.
[1442] My suspicion, especially because of all this government release of this UFO stuff, is that they've figured out how to use some sort of gravity propulsion system on a drone and that that's what that thing is.
And I think it's probably because it's off the coast of San Diego, which is a very military-dense area, right?
[1444] There's all these military bases and there's so much military activity going there.
[1445] It just makes sense that that would be a place where they would practice using some sort of drone.
I'm highly skeptical, and the thing is that it would be pretty straightforward, if these things did exist, that we would have high-quality photography, we'd actually find the thing, find traces of them.
I suspect that there are complicated, boring explanations for them, such as the fact that the speed of a flying object, of course, depends on the distance that you think it's at.
If it's much closer than you think it is, you could attribute fantastic speeds to it, simply because the visual angle that it covers might be large.
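The geometry behind this point can be sketched with a little arithmetic. In the small-angle approximation, the transverse speed implied by an observed angular rate scales linearly with the assumed distance, so the same blip crossing a camera's field of view can mean a slow nearby object or an impossibly fast distant one. The numbers below are purely hypothetical, just to illustrate the scaling:

```python
import math

def apparent_speed(distance_m: float, angular_rate_deg_per_s: float) -> float:
    """Transverse speed implied by an angular rate at an assumed distance
    (small-angle approximation: v = d * omega, with omega in radians/s)."""
    return distance_m * math.radians(angular_rate_deg_per_s)

# Hypothetical object crossing 5 degrees of sky per second, at three
# different assumed distances: the implied speed grows 100x as the
# assumed distance grows 100x.
for d in (500, 5_000, 50_000):  # assumed distance in meters
    v = apparent_speed(d, 5.0)
    print(f"assumed distance {d:>6} m -> implied speed {v:8.1f} m/s")
```

A nearby drone and a distant hypersonic object can produce exactly the same track on a screen, which is why the distance estimate is the weakest link in these speed claims.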
[1449] Sure, but we're talking about highly sophisticated United States military tracking systems that are designed to protect the United States from being attacked by other countries and their sophisticated weapons.
So these are the most accurate weapons detection systems that we have.
So when they detect something and they see it at 60,000 feet, and then they see it again at 50 feet above sea level, and it happens in less than a second, it gives one pause.
[1452] I don't know what it is, you know, but whatever it is.
[1453] Commander David Fravor was absolutely convinced that he had never seen anything like this before.
He knows that it had been aware of him, because he said it changed its plane and it moved towards them.
[1455] It wasn't far away.
[1456] He said he got to see the size of it.
[1457] He said it was approximately 40 feet long, I think, is what he said it was.
[1458] And he said it looked like a tic -tac.
It was a smooth, white-looking object.
And he said the thing recognized them and then lifted up and took off so fast you couldn't track it with your eyes.
[1461] Well, I have not spent much time in investigating these, so I can't really argue against it.
[1462] But let me just say that I'm very, very skeptical.
[1463] Of course you are.
You should have Mick West in to give more detailed analyses of how these things go.
[1465] Mick West is a video game maker.
[1466] That's what he made.
[1467] He made video games.
[1468] And now he runs a debunking site.
[1469] But he doesn't have any understanding of these tracking systems.
[1470] And that's where he made the critical errors.
And there's been more than one fighter pilot and more than one expert in these tracking systems that's debunked Mick West's debunking, Commander David Fravor being one of them.
[1473] There's another guy that's on YouTube that has a very long and detailed analysis of why Mick West is incorrect.
[1474] I don't know who's right or wrong because I don't know jack shit about these military systems.
[1475] But I find it fascinating that there's this guy who's an incredibly credible human being who is a fighter pilot, who's a guy who's the best and the brightest amongst our...
[1476] I mean, fighter pilots aren't the people who would be best equipped to answer these questions.
[1477] I mean, that's not what they're selected for.
[1478] That's not what they're trained for.
[1479] They're not kind of...
[1480] Answer what questions.
[1481] Well, questions of whether something that appears to be superhumanly fast might instead be produced by some artifact.
[1482] I mean, that's just not what they do.
[1483] Well, they got it on video.
[1484] When you say it artifact, they saw it visually.
[1485] They had visual confirmation, and then they have it on video, and they watched it jet off.
We're susceptible to visual illusions, the foremost being that the perceived speed of something depends critically on its distance, which can fool us.
[1487] But if anybody is going to understand these things, it's someone who operates these jets in war.
[1488] I'm not sure that's true.
[1489] You don't think so?
[1490] You don't think that someone who's accustomed to tracking, flying, moving objects with a jet plane in the heat of combat, and understands how all these tracking systems on these jets work, that that person wouldn't be very highly qualified when it comes to registering what a flying object is and how fast it's moving and how big it is?
[1491] I suspect not for the same reason that a pilot is not the kind of engineer that you'd bring in to, say, analyze the records from a plane crash to figure out what caused it.
[1492] It's just a different skill set.
[1493] But a different skill set, that's a different thing.
[1494] You're talking about wrecks.
[1495] This is not wrecks.
[1496] This is someone recognizing something taking off at the same rates of speed.
The thing is that the possible causes of highly unusual observations are not the kind of thing that a habitual pilot would be equipped to discern.
In the same way that when there were claims of telekinesis and psychics in the 70s, and they brought in physicists to, say, examine Uri Geller, it turned out that they were fooled.
The people they should have brought in were stage magicians,
who are experts in how appearances can deceive us in terms of the underlying reality.
[1501] But this is a very different type of situation.
[1502] You're talking about more than one jet, more than one person in each jet, visual confirmation of this thing by these people.
[1503] Like, are you seeing this?
[1504] What the fuck is this?
And this thing lifting off the water, recognizing them, jamming their radar, and then moving off at an insane rate of speed, and then flying to and being recognized at their CAP point, which shows some sort of intelligent control of it.
Well, that's if the rates of speed really are insane. And they may not be.
[1507] It may be that the perception of insane rates of speed is mistaken.
If you're tracking something with the most sophisticated radar that we have, and it goes from 50,000 feet above sea level to 50 feet above sea level in less than a second, that's pretty fucking insane.
[1509] If you know for sure that it's done that.
[1510] Well, that's what they said.
[1511] That's what they said, yeah.
[1512] I mean, I have not spent much time investigating.
[1513] But your initial instinct is to debunk it.
[1514] Well, it's to be skeptical.
It's to demand a high burden of proof, such as actually having an irrefutable, high-quality photograph of it.
And it's been observed by Elon Musk and others that the quality of photographic evidence for UFOs over the last 50 years has been pretty much constant, even though the technology of photography and sensing has increased by orders of magnitude.
[1517] So shouldn't we have much more convincing evidence now that we're so much better able to take high quality photographs of everything?
[1518] We still have these blurry splotches that...
[1519] Okay, we're talking about a different thing then.
[1520] Well, that's, yes.
And also, if you're talking about something that can literally move at a rate of speed that we can't perceive with our human eyes, like this thing.
[1522] How are you going to take a picture of that?
[1523] You know, so we're talking about anomalies, things that rarely occur if they occur at all.
[1524] And I'm skeptical that they're from another world.
[1525] The more time goes on, the more I'm skeptical of it.
[1526] And I tend to think, because I know that there's been some work in magnetic propulsion systems and some sort of a gravity -defying propulsion system.
[1527] There's been all sorts of work in these things.
Maybe there's some breakthrough that we're not being let in on, and this has military applications, and this is what all this work with these drones is.
So when these fighter pilots, and there's been multiple fighter pilots that have seen these anomalous objects moving at the same rates of speed, maybe that's what we're seeing.
[1531] Maybe we're seeing some kind of drone system.
[1532] Maybe I have reached the limits of my expertise, but I do have some skepticism.
[1533] And perhaps if we could decide what would be convincing evidence one way or another.
You're also a public intellectual, so you have to maintain a level of credibility that I don't. So you can't entertain some dumb shit ideas that I can, like, go all the way into. And you know more about it than I do. Well, it's just, I've become obsessed with it. I've watched many documentaries, and I've read analyses of these things by experts, and I actually had Commander Fravor in here, and I spoke to him in person for a long period of time. And he's very convincing that what he saw was extraordinary, and it doesn't make any sense.
[1536] He's never seen anything like it since.
[1537] And he also said that the folks that were communicating with him from whatever ship, I think they were on the Nimitz, he was saying that they had seen multiple ones of those and that they had happened multiple times over the past few months while he was there.
[1538] So again, not being an expert in UFOs, let me bring it back to reasoning and rationality and why I'm skeptical.
[1539] You can have a kind of a Bayesian analysis of how we should adjust our belief.
[1540] What is Bayesian mean?
So Bayesian refers to the formula from the Reverend Thomas Bayes from the 18th century on the optimal way to calibrate your belief to evidence.
And so you've heard the expression priors, it depends what your priors are.
[1543] That's from, that's Bayesian reasoning.
Namely, you start off with, based on everything you know so far, everything you've already observed, what credence would you give to an idea, you know, on a scale from zero to one?
[1545] Right.
[1546] Then you consider if the idea is true, what are the chances that you would observe what you're observing?
So you multiply those two together, and you divide by how common that evidence is across the board.
[1548] So a classic example would be how do we interpret a medical test?
[1549] So, you know, let's say there's a test for cancer.
We know that the base rate for the cancer is 1% of the population.
[1551] The test is not bad.
It picks up 90% of the cancers that are there, but it also has a false alarm rate, so that, say, 9% of the time, it picks up a signal that is not really cancer, a false positive rate, like a lot of medical tests.
[1553] So if you have a positive test result, how do you interpret it in terms of the probability that the person really has cancer?
And the famous finding from psychology is that people are often not very good at it, including doctors.
So with the numbers that I just gave you, most people and most doctors would say, oh, positive test result, 90% chance you have cancer.
The correct answer, according to the formula of Thomas Bayes, is 9%.
[1557] Why?
Well, if the test picks up 90% of cancers, then if you test positive, isn't that a death sentence?
Well, no, if it's only 1% of the population that even has the cancer, most of the positives are going to be false positives.
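The numbers in this example can be checked directly with Bayes' rule. A minimal sketch in Python, using exactly the figures from the conversation (1% base rate, 90% sensitivity, 9% false positive rate):

```python
def posterior(prior: float, sensitivity: float, false_positive_rate: float) -> float:
    """P(condition | positive test) by Bayes' rule."""
    true_positives = prior * sensitivity                 # sick and flagged
    false_positives = (1 - prior) * false_positive_rate  # healthy but flagged
    return true_positives / (true_positives + false_positives)

# The example from the conversation: 1% base rate, 90% sensitivity,
# 9% false alarm rate.
p = posterior(prior=0.01, sensitivity=0.90, false_positive_rate=0.09)
print(f"P(cancer | positive test) = {p:.1%}")  # prints "P(cancer | positive test) = 9.2%"
```

Because 99% of people are healthy, the 9% false alarm rate generates far more positives than the 1% of real cases does, which is why the answer lands near 9% rather than 90%.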
[1560] So what does this have to do with UFOs?
[1561] Well, before you even look at this particular evidence, given how many claims of UFOs there have been, which turned out to be bogus, namely pretty much all of them so far, that sets a pretty low prior so that even if you can't be certain that this is a false observation, it's an optical artifact, it's an artifact of your tracking system, it's people believing what they want to believe.
Let's say you can't prove that.
[1563] But still, your priors, before you even look at the quality of that evidence, would be chances are it's going to be like all the other UFO reports.
[1564] Namely, we may not even be able to explain it just because we can't track down every last minute fact of that situation that took place three years ago.
[1565] You know, we didn't have cameras from every angle.
[1566] But chances are, for something that's unexplained,
[1567] for something that, based on all observations so far, is unlikely to be true, even if the evidence was pretty good, you'd be rational not to believe it.
[1568] But isn't that biased?
[1569] Well, it is biased, but it's the right kind of bias.
[1570] It's the right kind of bias.
[1571] But isn't it better when you're dealing with extraordinary and unique circumstances to look at it entirely based on the facts that are at hand?
[1572] So Bayes's theorem would say no. But what about things like what we were talking about before the podcast started, when we were doing our little COVID test?
[1573] We were talking about the Hobbit Man from the island of Flores.
[1574] Like, that was, they were very skeptical that there was a complete new branch of the human species that we weren't aware of, that existed, that coexisted with human beings as recently as 10,000 years ago.
[1575] I mean, if you told that to people 20 years ago, they would have laughed in your face.
[1576] I don't know if they would have laughed in your face, but they, you know, they would have demanded a high quality of evidence.
[1577] and there was a high quality of evidence.
[1578] There's just no reason to be skeptical whatsoever.
[1579] Initially, there was a lot of skepticism.
[1580] Yeah, but now that it has been.
[1581] I know there was a theory that these were actually, you know, stunted from disease, from malnutrition, but those have been ruled out pretty, you know, pretty well.
[1582] So if this UFO evidence, if this evidence of this Tic Tac, if there's more concrete, conclusive evidence that shows that something can defy our understanding of physics and use some sort of propulsion system that's not indicative of pushing something out the back like fire and shooting forward like most rockets do.
[1583] Would you be willing to entertain the idea that something's going on?
[1584] Oh, sure.
[1585] I mean, the evidence would have to be pretty good.
[1586] So, for example, dark energy, the as yet unknown force that is propelling the accelerating expansion of the universe.
[1587] Yeah.
[1588] So I'm willing to credit the physicists who have measured the acceleration of the expansion of the universe, that there is something going on there that we don't understand.
[1589] But there, the evidence is pretty good, and there's no way to dismiss it.
[1590] It's not a one-off unique event that happened somewhere a few years ago that we'll never be able to recreate.
[1591] It's just much better evidence than that.
[1592] And so, yeah, you've got to be prepared to be surprised.
[1593] You've got to revise your posterior, as they say, that is, how much you believe something after you've looked at the evidence from your priors, namely how much credence did you give to it before looking at the evidence, if the evidence is really, really strong.
[1594] That's what Bayes' rule is all about.
[1595] It's trading off based on everything you know so far, how credible is it with how strong is the evidence and how common is the evidence across the board, those three things.
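The three quantities being traded off here are just the terms of Bayes' rule; as a sketch in standard notation, with H the hypothesis and E the evidence:

```latex
\underbrace{P(H \mid E)}_{\text{posterior}} \;=\;
\frac{\overbrace{P(E \mid H)}^{\text{strength of the evidence}} \;\times\; \overbrace{P(H)}^{\text{prior credence}}}
     {\underbrace{P(E)}_{\text{how common the evidence is overall}}}
```

The posterior is the revised credence after looking at the evidence, which is what "revise your posterior" refers to.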
[1596] What's the difference between dark energy and dark matter?
[1597] So dark energy is the hypothetical, as yet poorly understood source of the fact that the expansion from the Big Bang seems to be getting faster and faster, which no one had predicted.
[1598] And dark matter, my understanding is that it's meant to explain a different phenomenon, namely why there's a kind of clumping among galaxies more
[1599] than we would expect based on the mass of the stars making up those galaxies, suggesting that there is some source of gravitational attraction that we can't see that's forming those clumps.
[1600] What gets confusing to me is when I read this article that was talking about a galaxy that they've discovered that is made entirely of black matter, or dark matter rather, and they're like, well, what are you talking about?
[1601] Like, what is it?
[1602] Yeah.
[1603] Well, I think they don't know, but I suppose they're sort of guessing that they can
[1604] detect its presence from its gravitational effect on other celestial bodies.
[1605] Yeah.
[1606] So most of the universe, like it's a large percentage, is dark matter, correct?
[1607] I think that's right.
[1608] Yeah.
[1609] And we don't know what it is.
[1610] Astronomy.
[1611] Oh, they claimed it was 98% dark matter.
[1612] They were wrong.
[1613] Ah, okay.
[1614] There we go.
[1615] LiveScience.com.
[1616] So back in 2016, researchers claimed they found a galaxy made almost completely of dark matter and almost no stars. Now, on closer examination, that claim has fallen apart. Oh, that was so 2016. Oh, back in the day. The dark ages. Dark matter in the dark ages. Okay, so, all right, so there was some sort of a mistake there. All right, now we know. Here we go. Okay, but it could have been true. I am absolutely fascinated by conspiracy theories and the psychology behind them, because they are so fun. They're so intriguing. If you get into UFOs, for instance, because we're talking about that, and you start watching documentaries and reading personal accounts of, you know, abductions. And abductions are another good one, right?
[1617] Oh, yes, alien abductions?
[1618] Yeah, because a lot of them are through hypnotic regression.
[1619] And John Mack, who was from your university, from Harvard, who was a famous proponent of UFOs.
[1620] and wrote books, two, I believe, about this sort of phenomenon, this hypnotic regression where people would have these stories of being abducted by UFOs, but very highly criticized.
[1621] Like his methods in particular. Have you read any John Mack?
[1622] I have, yeah.
[1623] What's your perception on that?
[1624] So I think he, by the way, I think this is a great example of the distinction that we were talking about.
[1625] between beliefs that you really hold because they affect your life and you have to act on them, like, you know, is there food in the fridge?
[1626] What's my bank balance?
[1627] And ones that are, that you believe because they're part of a story that's just, you know, too good not to believe.
[1628] Yeah.
[1629] And you probably wouldn't put much money on them.
[1630] You wouldn't bet your life on them, but you believe them because they form a satisfying narrative.
[1631] In his case, he had patients who, he was a psychiatrist.
[1632] He was, like a lot of people with a Harvard affiliation, there are a lot of people in Harvard teaching hospitals that if you're a doctor in the hospital, then it's easy to get a Harvard affiliation.
[1633] You're not really hired as a Harvard professor.
[1634] You're a doctor at one of these hospitals, in his case, the Cambridge Hospital, the Mass General Hospital, the Beth Israel Hospital, where a lot of the doctors can put "Harvard professor" after their name, but they're not really hired on the basis of their research.
[1635] Oh, so that's him?
[1636] That's him, yeah.
[1637] So he wasn't lying when he said that he had a Harvard affiliation, but it didn't mean all that much.
[1638] In his case, so he had these patients who were convinced that they had been abducted by aliens, their genitals had been examined, they were part of experiments.
[1639] It's always that, right?
[1640] It's always that.
[1641] Usually your butt.
[1642] So his, and I think what was going on there is that he was a kind of psychiatrist who believes that we should take our patient's testimony seriously, that if it was their reality, we should treat it as reality.
[1643] Now, that's kind of different from, say, calling up the Harvard Astronomy Department and saying, hey, you're going to get a Nobel Prize based on something that I'm going to tip you off to, like that we've been visited by aliens.
[1644] You know, just like if you really believed in Pizzagate, you'd probably call the police if you really thought that there were children being raped in the basement.
[1645] If he really, really thought that he had evidence of alien visitation, you'd think he'd call some astronomers, some astrophysicists.
[1646] He didn't because that wasn't the way he believed it.
[1647] He believed it in the sense that, well, it's important to take my patient's testimony seriously.
[1648] That's respectful.
[1649] That's necessary.
[1650] Can I argue a point here?
[1651] If you were dealing with someone that had some sort of abduction experience, where they had been visited by beings that have technology and a capability beyond our understanding, and these beings can appear and disappear at will, can paralyze people, can do things to people and perform medical exams and then return them and reduce their memory to mere splinters, where they have to be hypnotized in order to have this hypnotic regression to get this memory back,
[1652] what is calling an astronomer going to do for you?
[1653] Why would you think about an astrophysicist?
[1654] That doesn't even make any sense.
[1655] Well, if you think you had evidence that there were advanced...
[1656] But there's no evidence other than memories.
[1657] Right.
[1658] Like what he's doing as a psychiatrist, and I'm not saying he's...
[1659] I think what he did was highly...
[1660] I think he led those people.
[1661] Yeah.
[1662] And I think he would suggest things to them and in a hypnotic state.
[1663] You know, there was kind of joint storytelling.
[1664] But there was a problem with the way he was asking the questions, apparently.
[1665] I don't doubt it.
[1666] Yeah.
[1667] The main criticism is that he was introducing ideas into these people's heads and confirming them.
[1668] And actually, we do know the neurological phenomenon that can lead to some of these memories.
[1669] There's a phenomenon of partial awakening, partial sleep where your body is still paralyzed as it is during deep sleep or during REM sleep.
[1670] but you're conscious and you're experiencing your surroundings, but your body is paralyzed, and there are states like that where you can misinterpret that constellation of experience as being passively carried, as seeing bulbous-headed apparitions.
[1671] And also believing that you are being manipulated into that state by some nefarious creatures that want to examine you.
[1672] Exactly.
[1673] So now we're kind of doing Bayesian reasoning again, saying, given that, you know, our memories are really not always that accurate, given that when we're in various states of, you know, exhaustion and delirium and half sleep, half wake, we can hallucinate all kinds of weird stuff.
[1674] What's more likely that some psychiatrist at Cambridge Hospital has made the most important discovery in thousands of years, or that he's taking some patients' hallucinations a little bit too seriously?
[1675] Well, all in all, we'd say, chances are the memories were not veridical, just based on everything else we understand.
[1676] Now, he did not engage in that kind of Bayesian reasoning.
[1677] Right.
[1678] He was a true believer.
[1679] Well, here's the thing, I don't know if he was a true believer.
[1680] I think it was in this zone.
[1681] He was like the Pizzagate believers who left the one-star Google review.
[1682] It's like, yeah, is it really true?
[1683] Is it really false?
[1684] Well, wasn't he more committed to that?
[1685] Because he was actually writing books and highly profitable books.
[1686] Well, that's true.
[1687] Yes.
[1688] That's true.
[1689] So he's committed to a narrative and the narrative was that these...
[1690] Exactly.
[1691] So, well, that, no, I think you put your finger on it.
[1692] The question is, how committed are you to a narrative being true in the same sense that "there's gas in the car" is true or false?
[1693] And I think that people, when it comes to stirring, interesting, meaning-giving narratives, they don't insist on that kind of proof.
[1694] I think there is an attitude that some people have, probably a tiny minority of humanity, that you should only believe things for which there's good evidence.
[1695] So I have a quote from Bertrand Russell.
[1696] It's undesirable to believe a proposition when there are no grounds whatsoever for believing it is true.
[1697] Now you might think...
[1698] I love that guy.
[1699] I love that guy, too.
[1700] You might think, oh, isn't that obvious?
[1701] I mean, couldn't your grandmother have told you that?
[1702] Isn't that?
[1703] But now, the thing is, is it?
[1704] It's a revolutionary manifesto.
[1705] Yeah.
[1706] That's not the way people believe things.
[1707] They believe things for all kinds of reasons.
[1708] And I consider it a gift of the Enlightenment, where we have this strange new mindset: only believe things that someone can show to be true.
[1709] That's deeply weird.
[1710] I think it's a good belief, but I don't think that's the way the human mind naturally works.
[1711] No, I don't think so either.
[1712] And I think it's a fundamental flaw in maybe it's our education system, or maybe it's just the collective way that people look at things, that they attach themselves to an idea and then defend it as if it's a part of them.
[1713] Yeah, and even if they don't defend it, they can sometimes even just believe it.
[1714] Rationalize it, find the way around, and then find like -minded people that support that idea, get themselves in an echo chamber, and bounce around QAnon theories.
[1715] I mean, that's really kind of the same thing, right?
[1716] So is that a failure of our education system?
[1717] Is that a failure of the way we're raising our children?
[1718] Like, what is it that's causing this?
[1719] lack of understanding of how the mind works and how we form ideas and opinions, and how not to cling to ones that might not be true at all, or might be highly suspect? Well, it is, although I would kind of turn the question upside down. It's not that it's this strange, inexplicable anomaly that people believe weird stuff. The natural state of humanity is to believe weird stuff.
[1720] Why is that?
[1721] Oh, just because I think there is a reason, namely for most of our evolutionary history, most of our history, you couldn't find out anyway.
[1722] You can't really find out.
[1723] You know, until we had modern science and record keeping and archives and, you know, presidents having tape recorders going in the Oval Office, you just can't know.
[1724] You couldn't know.
[1725] Even now we can't know for sure, but we know a lot better than we could.
[1726] We can go to a lot of records.
[1727] We could do the forensics.
[1728] We have the technology and the equipment to answer questions that formerly were unanswerable.
[1729] Like, why do plagues happen?
[1730] Well, before it was divine punishment.
[1731] And who's to say it wasn't divine punishment?
[1732] Well, now we can identify the virus.
[1733] Right.
[1734] We don't have to...
[1735] Like lightning.
[1736] I mean, for thousands of years, lightning was the gods punishing us.
[1737] It was magic.
[1738] And, you know, who could tell otherwise?
[1739] Right.
[1740] Now we can tell otherwise.
[1741] That's really strange.
[1742] in human history that we can get answers to questions like that.
[1743] So our mind evolved in a case where, in a circumstance in which a lot of questions that are really, really interesting were unanswerable in terms of the factual basis.
[1744] So given that they're unanswerable, no one could prove you wrong.
[1745] There's still reason to believe things if they could increase your status and prestige and expertise, if they gave the group a moral purpose.
[1746] if they led people to do heroic moral things and inhibited them from doing evil bad things, those are all reasons to believe something separate from, is it actually true?
[1747] Or can you actually show that it's true?
[1748] Right.
[1749] And the idea that you should only believe things that are factually true, that's weird in human history.
[1750] I think it's good, and I think our educational systems should get kids to think that way, but it's not the natural way for anyone to think.
[1751] So we're always pushing against the part of human nature that is happy to believe things because it's uplifting, edifying, a good story, a satisfying myth.
[1752] And for those of us who say, no, that's really not a good reason to believe something.
[1753] You should only believe it if it's true.
[1754] It's always an uphill battle.
[1755] It's a battle worth fighting.
[1756] But our schools and our journalistic practices and our everyday conversation should be steered toward the kind of skeptical attitude of, I'm not going to believe it until there's good evidence for it.
[1757] You have faced, in my opinion, some of the most irrational criticism that I think is based on ideological narratives that people want to follow.
[1758] When it comes to the progression of safety, you've said that if you follow history, this is like one of the safest times to be alive. There's less murder, there's less rape, there's less racism, there's less violence, and medical science is at its peak.
[1759] All these things factor into the idea that this is a really amazing time to be alive.
[1760] And, I believe, because of people's ideological biases or these narratives that they want to stick to, like, things are terrible today.
[1761] When you say, things are actually less terrible than ever before, people get angry at you.
[1762] It's weird.
[1763] Like, what is it like to face that kind of criticism
[1764] when you are talking about some hard statistics in science, it's very easily trackable.
[1765] Yeah, I think there's several things going on.
[1766] So one of them, indeed, is the ideology.
[1767] And there's an ideological resistance from the right and from the left.
[1768] They're very different.
[1769] From the right, it's nostalgia for the good old days, that everything has gone downhill since we abandoned the church and kings and had this weird democracy stuff.
[1770] We were better off when we had legitimate authority.
[1771] and we all conformed to rules, let's, you know, make America great again, let's look back, you know, good old days before the kids today screwed everything up.
[1772] So there's that attitude, and you say, well, actually, you know, things, there are bad things happening today, but there were worse things that happened in the past.
[1773] The best explanation for the good old days is a bad memory.
[1774] So there's that, there's the kind of the reactionary resistance, the people who want to look backward to a golden age.
[1775] Then there's, from the left, there's the idea that our current society is so corrupt, so rotten, so evil that we'd be better off just burning the whole thing down.
[1776] Yeah.
[1777] And anything that rises out of the rubble is going to be better than what we have now.
[1778] And when you say, well, yeah, we've got an awful lot of problems now, but things could be worse, things were worse.
[1779] Yeah.
[1780] And so let's not tear it down because it's much easier to make the, things worse than to make things better.
[1781] So that goes against that kind of radical left-wing ideology, kind of not exactly a mirror image of the reactionary right-wing ideology, but both sides are opposed to claims that there has been progress.
[1782] But on top of that, so that's the ideological resistance, but I think there's also some cognitive resistance, and that comes from the, we talked before about the availability bias, namely you base your sense of probability on how easily you can remember examples.
[1783] And the news is about stuff that happens, not stuff that doesn't happen.
[1784] And it's usually about bad stuff that happens.
[1785] Because bad things can happen quickly.
[1786] You can suddenly have a rampage shooting, a terrorist attack, a financial collapse, a war.
[1787] Those are all news.
[1788] Good things are often things that don't happen, or things that happen gradually, like every year, several hundred thousand people escape from extreme poverty.
[1789] Actually, every week, several hundred thousand people escape from extreme poverty.
[1790] But it's not something that all of a sudden happened on a Thursday.
[1791] It just happens in the background creeping up on us, so you never read about it.
[1792] It's only when you look at the graphs and you see, oh, my goodness, there's still wars.
[1793] But now the rate of death from war is about 1 per 100,000 per year.
[1794] Not long ago it was 10 per 100,000 per year.
[1795] And before that it was 30 per 100,000 a year.
[1796] It's when you actually plot the graphs that you see the progress, which you can never tell from headlines.
[1797] So there's also a kind of an illusion from the experience of news as opposed to data.
[1798] That's all well and good, but how do we fix this?
[1799] How do we change the way people look at the reality of progress, and instead of just dismissing it because it doesn't fit their narrative?
[1800] How do we convince people like, yes, it doesn't mean that there aren't real problems in the world.
[1801] There are real problems in the world, but we are collectively moving in the right general direction.
[1802] Yeah, several things.
[1803] One is I do think journalism should be more data-oriented and less anecdote- and incident-oriented.
[1804] Especially editorials. But if there is a police shooting, a rampage shooting, a terrorist attack, it should be put in the perspective of how many murders there are a year in all.
[1805] So we'd realize that say, for example, terrorist attacks, they are terrifying.
[1806] Of course, that's why we call it terrorism.
[1807] But really, if you're going to get murdered, it's much more likely to be in an argument over a parking spot or a barroom brawl or a jealous spouse; that's, you know, hundreds of times more likely.
[1808] So stories in the papers should put things into statistical context.
[1809] We should have more of a dashboard of the world, that the news should be a little bit more like the sports page and the business section, where you see constantly updated numbers and not just the eyeball-grabbing sensational event.
[1810] We should also have an understanding of what progress is, because it's easy to misunderstand it in the other direction and to think, oh, things just get better and better all by themselves.
[1811] You know, we just, progress is just part of the universe.
[1812] And, you know, that's clearly wrong.
[1813] The universe doesn't care about us.
[1814] At all.
[1815] If anything, it seems to conspire against us.
[1816] Certainly germs conspire against us.
[1817] They're constantly evolving to be more deadly, as we're seeing this very week.
[1818] So nothing by itself makes life better for us.
[1819] It only comes from human ingenuity being applied to making people better off.
[1820] That is, if we decide, well, what can we do to make people live longer or be less likely to be in a famine or less likely to go to war or less likely to commit crime and apply our brain power to try to reduce those problems, there's no guarantee that we'll succeed.
[1821] Every once in a while we'll come across something that works.
[1822] If we keep it, if we don't repeat our mistakes.
[1823] That's what can lead to intermittent progress.
[1824] And it can accumulate.
[1825] That's all the progress is, but it means that there's always a chance that things can go backwards.
[1826] And they do go backwards.
[1827] COVID meant that a lot of these data showing human improvement have gone into reverse, we hope temporarily.
[1828] But that's the way the world works.
[1829] There are a lot of ways for things to go wrong.
[1830] We can apply our brain power to make things go better, let's try to do that more.
[1831] There's also the issue where some people want to move things collectively in a better place for the human race and other people want to profit wildly.
[1832] Well, that's also true.
[1833] Yeah, they want to take advantage of these situations where people are trying to move things in the right direction and they hijack these movements and they instead attach themselves or their corporation or whatever their causes to these movements in order for them to profit.
[1834] This is the problem we have with politicians, right?
[1835] This is a problem we have when politicians are corrupt and making a lot of money while also espousing woke ideals that seem great to young people, and they hijack these ideas.
[1836] And, you know, we have to figure out a way to stop that from happening.
[1837] Collectively, to get people to move in the right direction, the general direction of progress, it's on paper it seems like a great thing for almost everybody.
[1838] Everybody is like, yeah, I want the world to be a better place.
[1839] Everybody wants a world to be a better place.
[1840] But I like coal.
[1841] My family's in the coal business.
[1842] I don't know what to tell you.
[1843] Yeah, it is a problem, and we do need institutional changes that make that less likely to happen.
[1844] And we don't have nearly enough guardrails in terms of just disclosure of campaign contributions.
[1845] Right, finances.
[1846] Dependence of politicians on money to get reelected, these are systematic things that get in the way.
[1847] Absolutely.
[1848] When people get hijacked, like when pharmaceutical, have you seen Dopesick?
[1849] Have you seen the series Dopesick?
[1850] No, but I've read about the problem.
[1851] You mean the Sacklers and the opioid crisis?
[1852] Oh, my God.
[1853] And that's just one, right?
[1854] There's been many of those situations where the pharmaceutical companies who have extraordinary power and influence have manipulated reality so that they can sell their drugs.
[1855] And in this case, selling opioids, which are highly, highly, highly addictive.
[1856] and destroy people's lives, and wreck people's health.
[1857] And they tried to convince people otherwise.
[1858] And they tried to say for a long time, despite the evidence they're aware of, that they weren't addictive.
[1859] They're fine.
[1860] Just take them.
[1861] Go ahead.
[1862] And part of the progress has to be to change our laws and institutions to make that less likely to happen.
[1863] And sometimes people call me an optimist, just because I present data on things that have gotten better.
[1864] I don't consider myself an optimist.
[1865] I consider myself someone who just presents data that most people are unaware of, many of which show progress, but not all of them.
[1866] And we don't seem to be on track to reducing the influence of vested interests in American politics.
[1867] Is that reversible?
[1868] It seems like we've almost come to this point of a crossroads where the influence that a lot of the special interest
[1869] groups have, like whether it's pharmaceutical companies or big oil or whatever it is, they have so much influence that to try to get that out of politics, to try to get that out of the way we govern, it seems like we'd almost have to revamp the entire system.
[1870] And this is where all these crazy burn-it-all-to-the-ground kids come into play, right?
[1871] This is where all the crazy communists and the "Marxism just hasn't been done right" crowd come in.
[1872] And I just want to come in.
[1873] And I mean, that's their argument.
[1874] It's like, these motherfuckers are just never going to let go of profit and profit at the cost of destroying the Amazon or at the cost of whatever it is.
[1875] Yeah, I mean, it's, you know, what one could imagine, you know, a boot stamping on a face forever, to use one of the last lines of 1984, where you can't reform the system because the system is unreformable, precisely because the vested interests won't let it be reformed.
[1876] Right.
[1877] But, you know, it's not always so. There have been reforms many times in American history that go against the interests of corporations.
[1878] They don't always win.
[1879] Well, the Clean Air Act and Clean Water Act in the 1970s passed, you know, in the teeth of opposition from many, many corporations, and with, ironically, the support of Richard Nixon at the time.
[1880] But wasn't there a different level of
[1881] influence that corporations had on campaign contributions, on the amount of money they could donate, the amount of influence they had.
[1882] It seems like it was a different level back then.
[1883] I mean, it could be, but there's lots of cases in which environmental regulations have gotten more stringent, where countries have introduced carbon taxes, which the fossil fuel companies don't like, and safety standards, which the car companies don't like.
[1884] And the overall tendency is that in many regards, the environment has gotten cleaner because of these innovations.
[1885] The people have gotten safer.
[1886] Fewer people die in car crashes.
[1887] None of it's inevitable.
[1888] It always involves pushing against the interests, the vested interests of corporations.
[1889] But often they come around when they realize that the regulations are going to penalize their competitors.
[1890] as much as they themselves.
[1891] And it'll be a level playing field.
[1892] And so it'll be a level playing field.
[1893] That's the argument that a lot of right -wing people use against doing things for the environment today because of the competition with China and the competition with other countries that are not doing things to regulate and so that they can't compete with these other countries because their governments don't give a shit.
[1894] Right.
[1895] And in fact, it is another theme that I explore in rationality in the chapter on game theory, where what's rational for an individual, for every individual, might be irrational for everyone put together.
[1896] The classic case is the tragedy of the commons, the hypothetical scenario where each shepherd has an interest in grazing his sheep on the town commons, because his sheep isn't going to make the difference between the grass being grazed faster than it can grow back.
[1897] It's always to his advantage.
[1898] They all think that.
[1899] Then you've got so many sheep that the grass can't grow back and all the sheep starve.
[1900] So that's called the tragedy of the commons also called negative externality, also called public goods problem.
[1901] But it's a case in which everyone doing what's in their interests actually works against their interests in the long run, unless you change the rules of the game, such as you've got permits or you've got to pay for the privilege, or some way
[1902] of aligning individuals' incentives with everyone's incentives.
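The shepherd scenario can be sketched as a toy payoff model. The numbers and the linear grazing function here are invented purely for illustration; the point is the structure: each shepherd does better by adding a sheep, whatever the others do, yet universal defection leaves everyone with nothing.

```python
def shepherd_payoff(my_sheep: int, total_sheep: int, capacity: int = 100) -> float:
    """Value one shepherd gets: per-sheep value falls as the commons is overgrazed."""
    per_sheep_value = max(0.0, 1.0 - total_sheep / capacity)
    return my_sheep * per_sheep_value

shepherds = 10

# If everyone restrains themselves to 5 sheep (total 50), each shepherd earns:
restrained = shepherd_payoff(5, 50)                     # 5 * 0.5 = 2.5

# One defector adding a 6th sheep while the others hold at 5 does better:
defector = shepherd_payoff(6, 51)                       # about 2.94

# But if all 10 shepherds reason the same way and graze 10 sheep each:
collapse = shepherd_payoff(10, shepherds * 10)          # 0.0 - the grass is gone

print(restrained, defector, collapse)
```

Changing the rules of the game, permits or grazing fees, amounts to changing the payoff function so that defecting no longer beats restraining.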
[1903] And that's true of carbon, exactly as you said, if we forego burning coal and oil, but China and India keep doing it, then we're just going to suffer the economic costs without saving the planet.
[1904] So you do need that kind of international pressure.
[1905] You need an international community that makes it just deeply uncool to be the bad guy who's boiling the planet.
[1906] You've got to dangle other incentives so that if you want our cooperation on one thing, you've got to cooperate on this.
[1907] You need changes in the technology so that the cheaper form of energy isn't the form that is most polluting.
[1908] And this goes back to our conversation on new generations of, say, nuclear technology.
[1909] If the cleanest energy is the cheapest kind of energy, that kind of solves the problem because no one has to sacrifice.
[1910] They just do what's in their own interest.
[1911] You change the rules of the game, so it's no longer a tragedy of the commons.
[1912] That's another way out.
[1913] But what's crucial is you do have to think about it in those terms.
[1914] You can't just think about, you know, what can I do so that my virtue will save the planet?
[1915] It won't, unless everyone else is virtuous at the same time, and that's not so easy to engineer.
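The commons dynamic described above can be illustrated with a toy simulation. This is just an illustrative sketch: the pasture parameters, regrowth model, and numbers are invented for the example, not anything from the conversation.

```python
# Toy model of the tragedy of the commons: a shared pasture where each
# sheep eats one unit of grass per year and the grass regrows at a fixed
# rate, up to a maximum capacity. All numbers here are made up.

def simulate(shepherds, sheep_each, capacity, regrowth, years):
    """Return the grass level at the end of each year."""
    grass = capacity
    history = []
    for _ in range(years):
        demand = shepherds * sheep_each          # total grazing pressure
        grass = max(0.0, grass - demand)         # sheep graze first
        grass = min(capacity, grass + regrowth)  # then the grass regrows
        history.append(grass)
    return history

# Restraint: total demand (5 sheep) is below the regrowth rate, so the
# pasture holds steady at capacity.
sustainable = simulate(shepherds=5, sheep_each=1,
                       capacity=100, regrowth=6, years=50)

# Each shepherd adds "just one more" sheep -- individually rational, but
# total demand (10) now exceeds regrowth, and the commons collapses.
overgrazed = simulate(shepherds=5, sheep_each=2,
                      capacity=100, regrowth=6, years=50)

print(sustainable[-1])  # pasture stays near capacity
print(overgrazed[-1])   # pasture driven down to a bare remnant
```

The point of the toy model is the one made above: no single extra sheep ruins the pasture, but when every shepherd reasons that way, the aggregate demand crosses the regrowth threshold and everyone loses, which is why changing the rules of the game (permits, prices, technology) matters more than individual virtue.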
[1916] Well, there's also a problem if we are competing with these other countries that aren't following the same rules.
[1917] We're buying those countries' things and goods.
[1918] And even though we know that they're committing human rights abuses, I mean, no one even thinks about it.
[1919] You just buy their stuff.
[1920] Yeah.
[1921] And especially if their stuff is made by an enormous corporation.
[1922] And the enormous corporation, if it's a very popular corporation like Apple, they don't suffer at all for the fact that their stuff is being made by essentially slave labor in China.
[1923] You know, it's a real conundrum
[1924] that we all use cell phones, but that none of them are made here.
[1925] Yeah.
[1926] So part of it is that the moral shaming campaigns can change corporate practices when they care about their image, which they do, because they depend on public favor for all kinds of perks and privileges and goodies that they may want.
[1927] But the pushback against that is so minimal.
[1928] Yeah.
[1929] Well, they have been; in each one of those cases, you do see companies changing their policies because they don't want to look like bad guys.
[1930] So it's not totally ineffective, although not as effective as we would like it to be.
[1931] And also, going back to, say, China and India burning coal, there is also a built -in incentive for them not to go all out on it, namely that their skies are so polluted with particulate matter and poison gases that people start to drop like flies from respiratory diseases, you can't see the sky, your metal is corroding.
[1932] So for the same reason that when you have the choice of some source of energy other than coal, you go for it, that itself is also going to be a partial pushback, and indeed there has been a slowdown in the rate of building coal plants in both India and China just because it's choking their own populations.
[1933] And there has been some work done on innovating some kind of a device to suck all the particulate matter out of the sky, like some sort of enormous air filter.
[1934] I know that there was some, there was a concept that they had developed.
[1935] It was essentially a skyscraper.
[1936] There was a giant air filter.
[1937] It was like sucking all the pollution out of the sky and cleaning the air.
[1938] Yeah, better.
[1939] I mean, kind of like one of those air purifiers behind your house.
[1940] But an enormous skyscraper one.
[1941] Well, the thing about that, together with carbon capture, especially direct air carbon capture as opposed to, say, smokestack carbon capture, is that those things are going to require an awful lot of energy.
[1942] And if you get that energy by, you know, burning coal, then you're right back where you started.
[1943] Right, exactly.
[1944] All the more reason why we really need massive amounts of clean energy.
[1945] Yeah.
[1946] Because every other way in which we reduce the cost of the climate crisis is going to depend on having that energy available.
[1947] It would be full circle if a nuclear -powered air filter is what cleans out the world.
[1948] Well, it could happen.
[1949] It could, right?
[1950] If we had scalable fourth -generation nuclear, yeah.
[1951] When you sit down to write something like Rationality, when you write your book, and you've written many great books, do you have a goal in mind other than putting together these ideas?
[1952] Are you trying to get something out there?
[1953] Yeah.
[1954] I sometimes quote Anton Chekhov.
[1955] Mankind will be better when you show him what he is like.
[1956] So the idea is that if we understood what makes us tick better, then we'd be better equipped to solve a number of our problems, which, after all, are human problems.
[1957] I think we can end it right there.
[1958] Thanks so much, Joe.
[1959] Thank you.
[1960] It's always a pleasure.
[1961] It's amazing to pick your brain.
[1962] I really appreciate you very much.
[1963] and I really appreciate your work, and I enjoy your books, and I just started this one, so I'm enjoying it very much as well.
[1964] Thanks for having me.
[1965] Thank you.
[1966] My pleasure.
[1967] Bye, everybody.