#2201 - Robert Epstein

The Joe Rogan Experience

Full Transcription:

[0] Joe Rogan podcast, check it out.

[1] The Joe Rogan Experience.

[2] Train by day, Joe Rogan podcast by night, all day.

[3] And we're up.

[4] Hello, Robert.

[5] Good to see you.

[6] Hello, Joe.

[7] You look a little stressed out.

[8] I am stressed out.

[9] In fact, are we recording?

[10] Yes.

[11] Okay, then I want to make a special request.

[12] Okay.

[13] You can kick me out, if you like.

[14] Why would I do that?

[15] Well, because I need to have a meltdown.

[16] I would like to have a meltdown right now on your show.

[17] You want to have a personal meltdown?

[18] Yes.

[19] Okay, go ahead.

[20] Okay.

[21] I've never heard anybody plan for a meltdown before.

[22] Well, I need to do this, and I think this is the right opportunity.

[23] Okay.

[24] And I don't know what I'm going to say.

[25] Okay.

[26] But I am definitely going to meltdown.

[27] Okay.

[28] So I am completely fed up.

[29] I have worked day and night.

[30] I work about 80 hours a week.

[31] I'm directing almost 40 research projects.

[32] I've been working really hard for maybe 45 years.

[33] And the last 12 years, where I've turned my eye to Google and other tech companies, have turned into, for me personally, a disaster.

[34] So before I started studying Google, I had published 15 books with major publishers.

[35] Since I've started studying Google and other companies, I can't publish anymore.

[36] I used to write for and actually work for mainstream news organizations and media organizations.

[37] I was editor-in-chief of Psychology Today for four years.

[38] I was an editor for Scientific American.

[39] I wrote for USA Today and U.S. News and World Report and Time magazine.

[40] But in 2019, after I testified before Congress about some of my research on Google, President Trump tweeted to his whatever, millions of gazillions of followers, basically some praise for my research.

[41] He got the details wrong.

[42] But then Hillary Clinton, whom I had always admired, chose to tweet back to her 80 million Twitter followers, and she tweeted that my work had been completely debunked and was

[43] based on data from 21 undecided voters. I still have no idea where any of that came from, probably someone from Google, because Google was her biggest supporter in 2016.

[44] And this was 2019.

[45] And then that got picked up by this machine.

[46] I'm told it's called the Clinton machine.

[47] And the New York Times picked that up without fact-checking, and then 100 other places did.

[48] And I got squashed like a bug, squashed.

[49] I had a flawless reputation as a researcher.

[50] My research reputation was gone.

[51] I was now a fraud, a fraud, even though I've always published in peer-reviewed journals, which is really hard to do.

[52] And there was nothing I could do about it.

[53] And all of a sudden I found that the only places I could publish were what I call right-wing, conservative, nutcase outlets, where I've actually made friends over the years.

[54] I've made friends with them, but that's beside the point.

[55] I was crushed.

[56] And not only that, I've been discovering things.

[57] I've made at least 10 major discoveries about new forms of influence that the Internet has made possible.

[58] These are controlled almost entirely by a couple of big tech companies,

[59] affecting more than 5 billion people around the world every single day.

[60] And I've discovered them, I've named them, I've quantified them, I've published randomized controlled studies to show how they work, published them in peer-reviewed journals.

[61] We just had another paper accepted yesterday.

[62] And I've built systems to do to them what they do to us and our kids.

[63] They surveil us and our kids, 24 hours a day.

[64] Google alone does that over more than 200 different platforms, most of which no one's ever heard of.

[65] People have no idea of the extent they're being monitored.

[66] They're being monitored when they're... if they have Android phones, they're being monitored even when the phone is off.

[67] Even when the power is off, you're still being monitored.

[68] How do they do that?

[69] Well, because remember when we could take the batteries out?

[70] Yeah.

[71] And then at some point they soldered them in.

[72] Yeah.

[73] Because they soldered the batteries in, even when you turn the phone off, it's not off.

[74] It's easy to demonstrate it.

[75] It's still transmitting.

[76] Or it'll transmit the moment the power comes back on.

[77] It's still collecting data.

[78] So what am I trying to say here?

[79] Then my wife was killed.

[80] in a suspicious car accident.

[81] This was also shortly after I testified before Congress in 2019.

[82] Right before she was killed, I did a private briefing for Ken Paxton, the AG of Texas, and other AGs at Stanford University.

[83] And one of those guys came out afterwards and he said, well, based on what you told us, Dr. Epstein, he said, I don't mean to scare you, but he said, I predict you're going to be killed in some sort of accident

[84] in the next few months.

[85] So I told you this before when I was on before.

[86] And obviously I wasn't killed, but my beautiful wife was killed.

[87] And, you know, her vehicle was never inspected forensically.

[88] And then it disappeared from the impound lot.

[89] I was told it was sold to some junk company in Mexico.

[90] And that is one of now six, six incidents,

[91] six incidents of violence against people who are associated with me over the past few years.

[92] The last just happened a couple of weeks ago.

[93] What was that one?

[94] Well, this last one was kind of weird and creepy.

[95] And then I was at a meeting in Dallas, I think it was.

[96] Oh, no, no, no. It was up in Monterey.

[97] And it was with General Paxton, and then with some of my staff.

[98] And one of my staff members sitting next to me, she all of a sudden just brushed her hand against my computer case, which is, that's my computer case.

[99] Okay.

[100] And she screamed.

[101] And we all went, what, what happened?

[102] And she goes, look.

[103] And there was a needle sticking out of the computer case, sticking out of the computer case, which is impossible.

[104] And it was going a half inch into her thumb; it had gone through the end. And of course I'm thinking, oh, that's awful, but maybe you just saved my life. Maybe it's got some sort of weird poison on it, or it's like a Putin thing and it's got radioactive substances or something. I'm trying to joke around, but meanwhile she was terrified. How did a pin... and by the way, I have a picture of the pin, it's really creepy. So when you're saying a needle, you're not saying like a syringe, you're saying a needle like a sewing needle?

[105] No, that's what I'm saying.

[106] None of us has ever seen a needle like this needle.

[107] It's about two inches long.

[108] The end is like it's been sharpened.

[109] Okay.

[110] You can see it sharpened.

[111] And at the end where there should be a hole for thread, there's no hole.

[112] Okay.

[113] I don't know what it is.

[114] But we've had worse incidents, too.

[115] I'm just saying this happened to be the latest.

[116] But that's in your computer bag?

[117] Was your computer bag ever out of your care?

[118] Well, not that I noticed, but...

[119] But I mean, if somebody wanted to harm you, a little needle, it's not really...

[120] Oh, I don't think that's someone wanting to harm me. What do you think that is?

[121] Well, if it's anything, it's someone wanting to scare me. And the fact is, I have been scared, and so have a lot of my staff.

[122] This summer, we've had 26 interns.

[123] They come from all over the country.

[124] 23 of these people are volunteers.

[125] And fantastic young people, extremely smart, you know, helping me run almost 40 research projects.

[126] And there is, you know, we take precautions and there is some fear.

[127] And one of these young men who's done superb work, he asked that we take

[128] his name off of everything.

[129] You know, he didn't quit, but I'm just saying, he just... because there's...

[130] Sounds reasonable?

[131] Yeah, yeah, because, you know, there have been a number of incidents.

[132] And if I were... did you ever hear of John Catsimatidis?

[133] No. Okay, he owns a ton of supermarkets in New York.

[134] He also owns WABC New York.

[135] But I was at a luncheon with him.

[136] I shouldn't do this on the air.

[137] I shouldn't do this.

[138] But he actually said, to make a long story short, that if he were Google, he would kill me. He said it straight out.

[139] But yet you're still alive.

[140] Well, I'm alive, but I am in rough shape because, you know, when push comes to shove here, I have been making discoveries that

[141] are really startling, and they've gotten worse and worse and worse.

[142] And since I was last with you, which was two and a half years ago, we've made probably five or six, seven more discoveries.

[143] They get worse each time.

[144] And we've done something that I was speculating about doing when I was here, which was building a nationwide monitoring system to surveil them the way they surveil us and see what content they're actually sending to real voters and to real kids.

[145] So let's break this down because I think we're getting a little in the weeds here.

[146] Let's explain to people that don't know what you're talking about, what your research is about.

[147] Because most people are not aware.

[148] And one of the major issues that you have discovered is the curation and the purposeful curation of information through search engines.

[149] So most people that are unaware think that when you do a Google search on something, say if you want to find out about a Kamala Harris rally or a Trump rally, that you are just going to get the most pertinent information in the order in which it's most applicable to your search.

[150] But that's not the case.

[151] The case is everything is curated, and if you want to find positive things about someone who they deem to be negative to whatever ideology they're promoting,

[152] it will be very difficult to find that information.

[153] If you want to find positive things about someone they support, they will be right up front.

[154] If you want to find negative things about someone they support, they will be very difficult to find and you will be inundated with positive things.

[155] And what you have found is that this curation of information from searches has a profound effect, especially on the casual voter, on the low information voter, a profound effect on who gets elected, and it's tantamount to election interference.

[156] Is that fair to say?

[157] It's fair to say that's where I was two and a half years ago.

[158] We have gone so far beyond that because it's not just search results.

[159] It's search suggestions, which we're capturing now by the millions.

[160] So it was in the news recently.

[161] Then when people were typing in Trump assassination, they were getting crazy stuff like the Lincoln assassination.

[162] They were getting crazy stuff and they were not getting information about the Trump attempted assassination.

[163] And, you know, I looked at that and I said, oh, isn't that nice?

[164] There's an anecdote about how they may be abusing search suggestions.

[165] We don't have anecdotal data anymore.

[166] We have hardcore, large -scale scientific data on all of these issues.

[167] We know what's actually going on, and we've quantified the impact.

[168] See, it's one thing to say, oh, look what they're doing.

[169] It's quite another to say, what impact does that have on people?

[170] Right.

[171] Let's talk about the Trump assassination one in particular.

[172] What did you find about that?

[173] Well, frankly, we couldn't care less about that because that's one anecdote.

[174] We're collecting these by the millions and what we know, we know a couple of things.

[175] We know that, first of all, they're not.

[176] You know, it started out as one thing.

[177] And it's turned into something else.

[178] And so what they do is they use search suggestions to shift people's thinking about anything.

[179] It's not just about candidates either.

[180] It's about anything.

[181] And we've shown in controlled experiments that by manipulating search suggestions, you can turn a 50-50 split among undecided voters into a 90-10 split, with no one having the slightest idea that they have been manipulated.

[182] Wow.

[183] And this always goes a very specific way.

[184] It always goes.

[185] It always goes a specific way, but I'm going to show you maybe a little later if I haven't put you to sleep or if my meltdown hasn't gotten too bad because I'm not quite finished with my meltdown yet.

[186] I'll show you content, data, large scale that we're collecting now 24 hours a day and I'll show you what they're actually doing.

[187] An anecdote, those don't hold up in court.

[188] You know, they grab headlines for a couple of days, but that's about it.

[189] They don't do anything.

[190] But we're actually collecting evidence that's court admissible.

[191] So we're collecting data now in all 50 states, but we actually have court admissible data now in 20 states already.

[192] And we keep building bigger and bigger every day.

[193] And what is this data about?

[194] Well, it's any data that's going to real people.

[195] So we're collecting data with their permission from the computers of a politically balanced group of more than 15,000 registered voters in all 50 states and from many of their children and teens as well.

[196] And so when they're doing anything on their computers, they've given us the right to collect it, grab it, zap it over to our own computers, aggregate the data, and analyze it.

[197] I want to point out that when we do this, we do this without transmitting any identifying information.

[198] We protect people's privacy, but we are getting these increasingly accurate pictures of what Google and other companies are sending to real people.

[199] Why do you have to do it this way?

[200] Because all the data they send is personalized.

[201] You will never know what they're sending to people unless you look over the shoulder

[202] of real people and see the personalized content.

[203] And what have you found?

[204] Well, as it happens, I've just summarized our findings over the last 12 years, and you get the first advanced copy of a monograph that's called The Evidence.

[205] And because we're so desperate, we need help, we need money, we need emails, we're so desperate for that.

[206] That we have set up, kind of did this last time too, but we have set up a link.

[207] If people go to that link and they're willing to give us their email, we will give them a free advance copy of this monograph.

[208] And it goes through the whole thing.

[209] It shows all the effects we've discovered, but it also shows the monitoring we're doing and what we're finding out from this monitoring.

[210] One of the things that I noticed since the last time you were here was I used to use DuckDuckGo.

[211] And one of the reasons why I started using DuckDuckGo is there was a story about a physician in Florida that took the mRNA vaccine and had a stroke shortly afterwards.

[212] It was very early on in the pandemic.

[213] And they were beginning to speculate that some of the side effects of the vaccine are being hidden.

[214] And I could not find this story on Google.

[215] I could not find it.

[216] I kept looking and looking and looking.

[217] I entered in the information on DuckDuckGo.

[218] It was one of the first articles, instantaneously.

[219] I was like, this is crazy.

[220] Since then, something's happened.

[221] And I think they became aware that DuckDuckGo was a problem spot for the dissemination of information.

[222] And now it appears to mirror Google.

[223] Well, the same has happened with Bing, and the same has happened with Yahoo. What about Brave?

[224] No, Brave is still independent.

[225] I know Brendan Eich; you should have him on if you haven't.

[226] And he's the guy who wrote Brave.

[227] Before that, he wrote Firefox for Mozilla.

[228] He left because Google had its tentacles into Firefox.

[229] Yeah.

[230] I'm afraid to talk about Brave, for fear of them being compromised.

[231] Because, like, we were talking about DuckDuckGo.

[232] And I was telling everybody, go to DuckDuckGo.

[233] And now I'm like, Jesus, it's the same thing as Google.

[234] Like, something happened.

[235] Do you know what happened?

[236] Well, we know in some cases with some of these companies what happened.

[237] I don't know the particulars with Duck