#266 – Nicole Perlroth: Cybersecurity and the Weapons of Cyberwar

Lex Fridman Podcast

Full Transcription:

[0] The following is a conversation with Nicole Perlroth, cybersecurity journalist and author of This Is How They Tell Me the World Ends: The Cyberweapons Arms Race.

[1] And now, a quick few seconds mention of each sponsor.

[2] Check them out in the description.

[3] It's the best way to support this podcast.

[4] First is Linode, Linux virtual machines.

[5] Second is Inside Tracker, a service I used to track my biological data.

[6] Third is Onnit, a nutrition, supplement, and fitness company.

[7] Fourth is Roka, my favorite sunglasses and prescription glasses. And fifth is Indeed, a hiring website. So the choices: computation, health, style, or building an amazing team. Choose wisely, my friends. And now onto the full ad reads. As always, there's no ads in the middle. I try to make this interesting, but if you skip them, please still check out the sponsors. I enjoy their stuff; maybe you will too. This episode is sponsored by Linode, Linux virtual machines. It's an awesome compute infrastructure that lets you develop, deploy, and scale whatever applications you build faster and easier.

[8] This is both for small personal projects and huge systems.

[9] Lower cost than AWS, but more important to me is the simplicity and quality of their customer service, with real humans, 24/7/365.

[10] Their motto is, if it runs on Linux, it runs on Linode.

[11] I have not shut up about how much I love Linux, every single distribution: Gentoo, Arch Linux, everything based on Arch Linux, Ubuntu, everything based on Debian.

[12] I used to use Gentoo. Did I really mention Gentoo?

[13] There's Linux Mint, all the sexy new flavors based on the old school.

[14] Of course, there's Red Hat and Fedora and all that.

[15] Anyway, I love Linux.

[16] I love computer infrastructure.

[17] What else can you say?

[18] Linode is awesome.

[19] Visit linode.com/lex and click on the Create Free Account button to get started with $100 in free credit.

[20] This show is also brought to you by Inside Tracker, a service I use to track my biological data.

[21] They have a bunch of plans, most of which include a blood test that gives you a lot of information that you can then make decisions based on.

[22] They use machine learning algorithms to then analyze that data, which comes from blood data, DNA data, fitness tracker data, all of that, to provide you with a clear picture of what's going on inside your body and to offer you science-backed recommendations for positive diet and lifestyle changes.

[23] This to me feels like the future.

[24] It's obvious that medicine, lifestyle, all your decisions should come from your own personal data.

[25] By the way, this is something that comes up more and more.

[26] The privacy of that data, the control over that data is really important.

[27] Getting that right in the future will be a very difficult problem.

[28] But what isn't difficult is to understand that decisions should be made based on information coming from your own body.

[29] And that's what InsideTracker is about.

[30] For a limited time, you get 25% off the entire InsideTracker store.

[31] If you go to insidetracker.com/lex.

[32] That's insidetracker.com/lex.

[33] This episode is also brought to you by Onnit, a nutrition, supplement, and fitness company.

[34] They make AlphaBrain, which is a nootropic that helps support memory, mental speed, and focus.

[35] I use it, not every day, but when I want to boost my thinking, the clarity, the focus, the mental speed in a deep work session, a particularly difficult deep work session when I have to struggle through a hard problem, that's when I will take an AlphaBrain.

[36] It's a boost, a rocket launcher for my mind.

[37] One thing I should mention is the thing that makes a work session hard is when there's a lot of tangents, a lot of side roads that lead to dead ends and force me to backtrack.

[38] That is the most exhausting thing.

[39] So, yeah, in those cases, I'll sometimes take an AlphaBrain and it really helps.

[40] Anyway, go to lexfridman.com/onnit to get up to 10% off AlphaBrain.

[41] That's lexfridman.com/onnit.

[42] This show is also brought to you by Roka, the makers of glasses and sunglasses that I love wearing for their design, feel, and innovation on materials, optics, and grip.

[43] Roka was started by two All-American swimmers from Stanford, and it was born out of an obsession with performance.

[44] I met one of said founders.

[45] They have a place here in Austin.

[46] I love it when not only the product is great, but the people that make the product are great.

[47] And obviously the product I love, it's designed to be active in, extremely lightweight.

[48] The grip is comfortable but strong, and the style is just like I like it.

[49] It's classy, it's minimalist, holds up in all conditions.

[50] I'll wear it while wearing a suit.

[51] I'll wear it while running, running gear, freezing weather, super hot.

[52] You know, I wore it through the Austin 100-degree summer.

[53] Check them out for both prescription glasses and sunglasses at roka.com and enter code LEX

[54] to save 20% off on your first order.

[55] That's roka.com and enter code LEX.

[56] This show is also brought to you by Indeed, a hiring website.

[57] I've used them as part of many hiring efforts I've done for the teams I've led in the past.

[58] They have tools like Indeed Instant Match that gives you quality candidates whose resumes on Indeed fit your job description immediately.

[59] I am one of those people that believe that a great team is not just about productivity. A great team is a source of happiness.

[60] It's a source of meaning.

[61] It's a source of personal growth.

[62] Hiring is the most important thing for those of us that spend a significant percentage of our life at work or working on something we love.

[63] It doesn't feel like work.

[64] And one of the reasons it might not feel like work is because you're working with people.

[65] You love, you respect, that challenge you, that excite you, all those kinds of things.

[66] And that's all about hiring.

[67] So you should be using the best tools for the job.

[68] Anyway, right now you'll get a free $75 sponsored job credit to upgrade your job post at indeed.com/lex.

[69] Terms and conditions apply.

[70] Go to indeed.com/lex.

[71] This is the Lex Fridman Podcast, and here is my conversation with Nicole Perlroth.

[72] You've interviewed hundreds of cybersecurity hackers, activists, dissidents, computer scientists, government officials, forensic investigators, and mercenaries.

[73] So let's talk about cybersecurity and cyber war.

[74] Start with the basics.

[75] What is a zero-day vulnerability, and then a zero-day exploit or attack?

[76] So at the most basic level, let's say I'm a hacker and I find a bug in your iPhone's iOS software that no one else knows about, especially Apple.

[77] That's called a zero day, because the minute it's discovered, engineers have had zero days to fix it.

[78] If I can study that zero day, I could potentially write a program to exploit it.

[79] And that program would be called a zero-day exploit.

[80] And for iOS, the dream is that you craft a zero-day exploit that can remotely exploit someone else's iPhone without them ever knowing about it.

[81] And you can capture their location.

[82] You can capture their contacts, record their telephone calls, record their camera without them knowing about it.

[83] Basically, you can put an invisible ankle bracelet on someone without them knowing.

[84] And you can see why that capability, that zero-day exploit, would have immense value for a spy agency or a government that wants to monitor its critics or dissidents.

[85] And so there's a very lucrative market now for zero-day exploits.

[86] So you said a few things there.

[87] One is iOS, why iOS, which operating system, which one is the sexier thing to try to get to or the most impactful thing?

[88] And the other thing you mentioned is remote versus like having to actually come in physical contact with it.

[89] Is that the distinction?

[90] So iPhone exploits have just been a government's number one priority.

[91] Recently, actually, the price of an Android remote zero-day exploit, something that can get you into Android phones, is actually higher.

[92] The value of that is now higher on this underground market for zero-day exploits than an iPhone iOS exploit.

[93] So things are changing.

[94] So there's probably more Android devices.

[95] So that's why it's better.

[96] But then the iPhone side, if I, so I'm an Android person, because I'm a man of the people.

[97] But it seems like all the elites use iPhone, all the people at nice dinner parties.

[98] So is that, is that the reason that, like, the more powerful people use iPhones?

[99] Is that why?

[100] I don't think so.

[101] I actually, so it was about two years ago that the prices flipped.

[102] It used to be that if you could craft a remote zero-click

[103] exploit for iOS, then that was about as good as it gets.

[104] You could sell that to a zero-day broker for $2 million.

[105] The caveat is you can never tell anyone about it, because the minute you tell someone about it, Apple learns about it, they patch it, and that $2.5 million investment that that zero-day broker just made goes to dust.

[106] So a couple years ago, and don't quote me on prices, but an Android zero-click remote exploit for the first time topped the iOS price.

[107] And actually, a lot of people's read on that was that it might be a sign that Apple security was falling and that it might actually be easier to find an iOS zero-day exploit than to find an Android zero-day exploit.

[108] The other thing is market share.

[109] There are just more people around the world that use Android.

[110] And a lot of the governments that are paying top dollar for zero-day exploits these days are deep-pocketed governments in the Gulf that want to use these exploits to monitor their own citizens, monitor their critics.

[111] And so it's not necessarily that they're trying to find elites.

[112] It's that they want to find out who these people are that are criticizing them or perhaps planning the next Arab Spring.

[113] So in your experience, are most of these attacks targeted to cover a large population, or are there attacks that are targeted at specific individuals?

[114] So I think it's both.

[115] Some of the zero-day exploits that have fetched top dollar that I've heard of in my reporting in the United States were highly targeted.

[116] You know, there was a potential terrorist attack.

[117] They wanted to get into this person's phone.

[118] It had to be done in the next 24 hours.

[119] They approached hackers and said, we'll pay you, you know, X millions of dollars if you can do this.

[120] But then you look at when we've discovered iOS zero-day exploits in the wild, some of them have been targeting large populations, like Uyghurs.

[121] So a couple years ago, there was a watering hole attack.

[122] Okay, what's a watering hole attack?

[123] There's a website.

[124] It actually had information aimed at Uyghurs.

[125] And you could access it all over the world.

[126] And if you visited this website, it would drop an iOS zero-day exploit onto your phone.

[127] And so anyone that visited this website that was about Uyghurs anywhere.

[128] I mean, Uyghurs, Uyghurs living abroad, basically the Uyghur diaspora, would have gotten infected with this zero-day exploit.

[129] So in that case, you know, they were targeting huge swaths of this one population or people interested in this one population basically in real time.

[130] Who are these attackers?

[131] From the individual level to the group level, psychologically speaking, what's their motivation?

[132] Is it purely money?

[133] Is it the challenge?

[134] Are they malevolent?

[135] Is it power?

[136] These are big philosophical human questions, I guess.

[137] So these are the questions I set out to answer for my book.

[138] I wanted to know are these people that are just after money?

[139] If they're just after money, how do they sleep at night?

[140] Not knowing whether that zero-day exploit they just sold to a broker is being used to basically make someone's life a living hell.

[141] And what I found was there's kind of this long, sordid history to this question.

[142] You know, it started out

[143] in the 80s and 90s, when hackers were just finding holes and bugs in software for curiosity's sake, really as a hobby.

[144] And some of them would go to the tech companies like Microsoft or Sun Microsystems at the time or Oracle.

[145] And they'd say, hey, I just found this zero day in your software and I can use it to break into NASA.

[146] And the general response at the time wasn't, thank you so much for pointing out this flaw in our software, we'll get it fixed as soon as possible.

[147] It was, don't ever poke around our software ever again or we'll stick our general counsel on you.

[148] And that was really sort of the common thread for years.

[149] And so hackers who set out to do the right thing were basically told to shut up and stop doing what you're doing.

[150] And what happened next was, they basically started trading this information online.

[151] Now, when you go back and interview people from those early days, they all tell a very similar story, which is they're curious, they're tinkerers.

[152] You know, they remind me of like the kid down the block that was constantly poking around the hood of his dad's car.

[153] You know, they just couldn't help themselves.

[154] They wanted to figure out how a system is designed and how they could potentially exploit it for some other purpose.

[155] It doesn't have to be good or bad.

[156] But they were basically kind of beat down for so long by these big tech companies that they started just silently trading them with other hackers.

[157] And that's how you got these really heated debates in the 90s about disclosure.

[158] Should you just dump these things online?

[159] Because any script kiddie can pick them up and use it for all kinds of mischief.

[160] But, you know, don't you want to just stick a middle finger to all these companies that are basically threatening you all the time?

[161] So there was this really interesting dynamic at play and what I learned in the course of doing my book was that government agencies and their contractors sort of tapped into that frustration and that resentment.

[162] And they started quietly reaching out to hackers on these forums.

[163] And they said, hey, you know, that zero day you just dropped online?

[164] Could you come up with something custom for me?

[165] And I'll pay you six figures for it so long as you shut up and never tell anyone that I paid you for this.

[166] And that's what happened.

[167] So throughout the 90s, there was a bunch of boutique contractors that started reaching out to hackers on these forums and saying, hey, I'll pay you six figures for that bug you were trying to get Microsoft to fix for free.

[168] And sort of so began, or so catalyzed, this market where governments and their intermediaries started reaching out to these hackers and buying their bugs.

[169] And in those early days, I think a lot of it was just for quiet counterintelligence, traditional espionage.

[170] But as we started baking the software, Windows software, Schneider Electric, Siemens industrial software, into our nuclear plants and our factories and our power grid and our petrochemical facilities and our pipelines, those same zero days came to be just as valuable for sabotage and war planning.

[171] Does the fact that the market sprung up and you can now make a lot of money change the nature of the attackers that came to the table, or grow the number of attackers?

[172] I mean, I guess you told us about the psychology of the hackers in the 90s. What is the culture today?

[173] And where is it heading?

[174] So I think there are people who will tell you they would never sell a zero day to a zero-day broker or a government.

[175] One, because they don't know how it's going to get used when they throw it over the fence.

[176] You know, most of these get rolled into classified programs and you don't know how they get used.

[177] If you sell it to a zero-day broker, you don't even know which nation state might use it.

[178] Or potentially which criminal group might use it if you sell it on the dark web.

[179] The other thing that they say is that they want to be able to sleep at night.

[180] And they lose a lot of sleep if they found out their zero day was being used to make a dissident's life a living hell.

[181] But there are a lot of people, good people, who also say, no, this is not my problem.

[182] This is the technology company's problem.

[183] If they weren't writing new bugs into their software every day, then there wouldn't be a market.

[184] you know, then there wouldn't be a problem.

[185] But they continue to write bugs into their software all the time and they continue to profit off that software.

[186] So why shouldn't I profit off my labor too?

[187] And one of the things that has happened, which is I think a positive development over the last 10 years, are bug bounty programs.

[188] You know, companies like Google and Facebook and then Microsoft and finally Apple, which resisted it for a really long time.

[189] Have said, okay, we are going to shift our perspective about hackers.

[190] We're no longer going to treat them as the enemy here.

[191] We're going to start paying them for what is essentially free quality assurance.

[192] And we're going to pay them good money in some cases, you know, six figures in some cases.

[193] We're never going to be able to bid against a zero-day broker who sells to government agencies.

[194] But we can reward them and hopefully get to that bug earlier, where we can neutralize it so that they don't have to spend another year developing the zero-day exploit.

[195] And in that way, we can keep our software more secure.

[196] But every week I get messages from some hacker that says, you know, you see this zero-day exploit that was just found in the wild, you know, being used by this nation state?

[197] I tried to tell Microsoft about this two years ago and they were going to pay me peanuts.

[198] So it never got fixed.

[199] You know, there are all sorts of those stories that continue on.

[200] And, you know, I think just generally, hackers are not very good at diplomacy.

[201] You know, they tend to be pretty snipey, technical crowd.

[202] And very philosophical in my experience, but, you know, diplomacy is not their strong suit.

[203] Oh, there almost has to be a broker between companies and hackers.

[204] Who can translate effectively, just like you have a zero-day broker between governments and hackers.

[205] Yeah.

[206] Because you have to speak their language.

[207] Yeah.

[208] And there have been some of those companies who've risen up to meet that demand.

[209] And HackerOne is one of them.

[210] Bugcrowd is another.

[211] Synack has an interesting model.

[212] So that's a company that you pay for a private bug bounty program, essentially.

[213] So you pay this company.

[214] They tap hackers all over the world to come hack your software, hack your system.

[215] and then they'll quietly tell you what they found.

[216] And I think that's a really positive development.

[217] And actually, the Department of Defense hired all three of those companies I just mentioned to help secure their systems.

[218] Now, I think they're still a little timid in terms of letting those hackers into the really sensitive, high side classified stuff.

[219] But, you know, baby steps.

[220] Just to understand what you were saying, you think it's impossible for companies to financially compete with the zero-day brokers, with governments.

[221] So, like, the defense can't outpay the hackers.

[222] It's interesting. You know, they shouldn't outpay them, because what would happen if Apple started offering $2.5 million for any, you know, zero-day exploit that governments would pay that much for, is their own engineers would say, why the hell am I working for less than that and doing my nine to five every day?

[223] So you would create a perverse incentive.

[224] And I didn't think about that until I started this research and I realized, okay, yeah, that makes sense.

[225] You don't want to incentivize offense so much that it's to your own detriment.

[226] And so I think what they have, though, what the companies have on government agencies is if they pay you, you get to talk about it.

[227] You know, you get street cred, you get to brag about the fact you just found that $2.5 million iOS zero day that no one else did.

[228] And if you sell it to a broker, you never got to talk about it.

[229] And I think that really does eat at people.

[230] Can I ask you a big philosophical question about human nature here?

[231] So if you have, in what you've seen, if a human being has a zero day, they found a zero day vulnerability.

[232] that can hack into, I don't know, what's the worst thing you can hack into?

[233] Something that could launch nuclear weapons.

[234] Which percentage of the people in the world that have the skill would not share that with anyone, with any bad party?

[235] I guess how many people are completely devoid of ethical concerns in your sense?

[236] So my belief is all the ultra-competent people, or a very, very high percentage of ultra-competent people, are also ethical people.

[237] That's been my experience, but then, again, my experience is narrow.

[238] What's your experience been like?

[239] So this was another question I wanted to answer.

[240] You know, who are these people who would sell a zero-day exploit that would neutralize a Schneider Electric safety lock at a petrochemical plant?

[241] Basically, the last thing you would need to neutralize before you trigger some kind of explosion?

[242] Who would sell that?

[243] And I got my answer.

[244] Well, the answer was different.

[245] A lot of people said, I would never even look there because I don't even want to know.

[246] I don't even want to have that capability.

[247] I don't, like, I don't even want to have to make that decision about whether I'm going to profit off of that knowledge.

[248] I went down to Argentina and this whole kind of moral calculus I had in my head was completely flipped around.

[249] So just to back up for a moment.

[250] So Argentina actually is a real hacker's paradise.

[251] People grew up in Argentina and, you know, I went down there.

[252] I guess I was there around 2015, 2016, but you still couldn't get an iPhone.

[253] You know, they didn't have Amazon Prime.

[254] You couldn't get access to any of the apps we all take

[255] for granted.

[256] To get those things in Argentina as a kid, you have to find a way to hack them.

[257] You know, and it's, the whole culture is really like a hacker culture.

[258] They say like, it's really like a MacGyver culture.

[259] You know, you have to figure out how to break into something with wire and tape.

[260] And that means that there are a lot of really good hackers in Argentina who specialize in developing zero-day exploits.

[261] And I went down to this Argentine conference called Ekoparty.

[262] And I asked the organizer, okay, can you introduce me to someone who's selling zero-day exploits to governments?

[263] And he was like, just throw a stone.

[264] Throw a stone anywhere and you're going to hit someone.

[265] And all over this conference, you saw these guys who were clearly from these Gulf states who only spoke Arabic.

[266] You know, what are they doing at a young hacking conference in Buenos Aires?

[267] Oh, boy.

[268] And so I went out to lunch with kind of this godfather of the hacking scene there, and I asked this really dumb question, and I'm still embarrassed about how I phrased it.

[269] But I said, so, you know, well, these guys only sell these zero-day exploits to good Western governments.

[270] And he said, Nicole, last time I checked, the United States wasn't a good Western government.

[271] you know, the last country that bombed another country into oblivion wasn't China or Iran.

[272] It was the United States.

[273] So if we're going to go by your whole moral calculus, you know, just know that we have a very different calculus down here.

[274] And we'd actually rather sell to Iran or Russia or China maybe than the United States.

[275] And that just blew me away.

[276] Like, wow.

[277] You know, he's like, we'll just sell to whoever brings us the biggest bag of cash.

[278] Have you checked into our inflation situation recently?

[279] So, you know, I had some of those, like, reality checks along the way.

[280] You know, we tend to think of things as, is this moral, you know, is this ethical, especially as journalists.

[281] You know, we kind of sit on our high horse sometimes and write about a lot of things that seem to push the moral bounds.

[282] But in this market, which is essentially an underground market, you know, the one rule is like Fight Club.

[283] You know, no one talks about Fight Club.

[284] First rule of the zero-day market: nobody talks about the zero-day market, on both sides.

[285] Because the hacker doesn't want to lose their $2.5 million bounty.

[286] And governments roll these into classified programs, and they don't want anyone to know what they have.

[287] So no one talks about this thing.

[288] And when you're operating in the dark like that, it's really easy to put aside your morals sometimes.

[289] Can I, in a small tangent, ask you by way of advice? You must have done some incredible interviews, and you've also spoken about how seriously you take protecting your sources. If you were to give me advice for interviewing, when you're recording on mic with a video camera, how is it possible to get into this world? Like, is it basically impossible? So you've spoken with a few people, what is it, like the godfather of cyber war, cyber security? So people that are out.

[290] And they still have to be pretty brave to speak publicly.

[291] But is it virtually impossible to really talk to anybody who's a current hacker?

[292] You're always like 10, 20 years behind.

[293] It's a good question.

[294] And this is why I'm a print journalist.

[295] But, you know, when I've seen people do it, it's always the guy who's in the shadows, whose voice has been altered.

[296] you know, when they've gotten someone on camera, that's usually how they do it.

[297] You know, very, very few people talk in this space.

[298] And there's actually a pretty well -known case study in why you don't talk publicly in the space and you don't get photographed.

[299] And that's the Grugq.

[300] So, you know, the Grugq is, or was, this zero-day broker, a South African guy who lives in Thailand.

[301] And right when I was starting on this subject at the New York Times, he'd given an interview to Forbes.

[302] And he talked about being a zero -day broker.

[303] And he even posed next to this giant duffel bag filled with cash, ostensibly.

[304] And later he would say he was speaking off the record.

[305] He didn't understand the rules of the game.

[306] But what I heard from people who did business with him was that the minute that story came out, he became PNG'd.

[307] No one did business with him.

[308] You know, his business plummeted by at least half.

[309] No one wants

[310] to do business with anyone who's going to get on camera and talk about how they're selling zero days to governments. You know, it puts you in danger.

[311] And I did hear that he got some visits from some security folks.

[312] And, you know, that's another thing for these people to consider.

[313] You know, if they have those zero-day exploits at their disposal, they become a huge target for nation states all over the world.

[314] You know, talk about having perfect opsec.

[315] You know, you better have some perfect opsec if people know that you have access to those zero-day exploits.

[316] Which sucks because, I mean, transparency here would be really powerful for educating the world.

[317] And also inspiring other engineers to do good.

[318] It just feels like when you're operating in the shadows, it doesn't help us move in the positive direction in terms of, like, getting more people on the defense side versus on the attack side.

[319] But of course, what can you do?

[320] I mean, the best you can possibly do is have great journalists, just like you did interview and write books about it and integrate the information you get while hiding the sources.

[321] Yeah, and I think, you know, what HackerOne has told me was, okay, let's just put aside the people that are finding and developing zero-day exploits all day long.

[322] Let's put that aside.

[323] What about the, you know, however many millions of programmers all over

[324] the world who've never even heard of a zero-day exploit? Why not tap into them and say, hey, we'll start paying you if you can find a bug in United Airlines software or in Schneider Electric or in Ford or Tesla.

[325] And I think that is a really smart approach.

[326] Let's go find this untapped army of programmers to neutralize these bugs before the people who will continue to sell these to governments can find them and exploit them.

[327] Okay, I have to ask you about this.

[328] From a personal side, it's funny enough, after we agreed to talk, I, for the first time in my life, was a victim of a cyber attack.

[329] So this is ransomware.

[330] It's called Deadbolt.

[331] People can look it up.

[332] I have a QNAP device for basically kind of cold-ish storage.

[333] So it's about 60 terabytes, with 50 terabytes of data on it, in RAID 5.

[334] And apparently about 4,000 to 5,000 QNAP devices were hacked and taken over with this ransomware.

[335] And what the ransomware does there is it goes file by file through almost all the files on the QNAP storage device and encrypts them.

[336] And then there's this very eloquently and politely written page that pops up.

[337] You know, it describes what happened.

[338] All your files have been encrypted.

[339] This includes, but is not limited to photos, documents, and spreadsheets.

[340] Why me?

[341] A lot of people commented about how friendly and eloquent this is.

[342] And I have to commend them.

[343] It is, and it's pretty user-friendly.

[344] Why me?

[345] This is not a personal attack.

[346] You have been targeted because of the inadequate security provided by your vendor, QNAP.

[347] What now? You can make a payment of exactly 0.03 bitcoin, which is about a thousand dollars, to the following address. Once the payment has been made, we'll follow up with a transaction to the same address, blah, blah, blah. They give you instructions of what happens next, and they'll give you a decryption key that you can then use. And then there's another message for QNAP that says: all your affected customers have been targeted using a zero-day vulnerability in your product. We offer you two options to mitigate this and future damage. One, make a bitcoin payment of five bitcoin to the following address, and that will reveal to QNAP (I'm summarizing things here) what the actual vulnerability is. Or you can make a bitcoin payment of 50 bitcoin to get a master decryption key for all your customers. 50 bitcoin is about 1.8 million dollars. Okay, so first of all, on a personal level, this one hurt for me. I mean, I learned a lot, because I wasn't, for the most part, backing up much of that data, because I thought I can afford to lose that data.

[348] It's not, like, horrible.

[349] I mean, I think you've spoken about the crown jewels, like, making sure there's things you really protect, and I have, you know, I'm very conscious, security-wise, on the crown jewels.

[350] But there's a bunch of stuff, like, you know, personal videos. They're not, like, I don't have anything creepy, but just, like, fun things I did that, because they were very large or 4K or something like that, I kept on there, thinking RAID 5 will protect it.

[351] You know, just I lost a bunch of stuff, including raw footage from interviews and all that kind of stuff.

[352] So it's painful.

[353] And I'm sure there's a lot of painful stuff like that for

[354] the 4,000 to 5,000 people that use QNAP.

[355] And there's a lot of interesting ethical questions here.

[356] Do you pay them?

[357] Does QNAP pay them?

[358] Do the individuals pay them?

[359] Especially when you don't know if it's going to work or not.

[360] Do you wait?

[361] So QNAP said that, please don't pay them.

[362] We're working very hard day and night to solve this.

[363] It's so philosophically interesting to me, because I also project onto them, thinking, what is their motivation? Because the way they phrased it, on purpose perhaps, but I'm not sure if that actually reflects their real motivation, is maybe they're trying to help themselves sleep at night, basically saying, this is not about you, this is about the company with the vulnerabilities. Just like you mentioned, this is the justification they have, but they're hurting real people. They hurt me, and I'm sure there's a few others that are really hurt. And the zero-day factor is a big one. You know, QNAP right now is trying to figure out what the hell is wrong with their system that would let this in. And even if they pay, if they still don't know where the zero-day is, what's to say that they won't just hit them again and hit you again? So that really complicates things. And that is a huge advancement for ransomware.

[364] It's really only been, I think, in the last 18 months that we've ever really seen ransomware exploit zero days to pull these off.

[365] Usually, I think the data shows, 80% of them come down to a lack of two-factor authentication.

[366] You know, so when someone gets hit by a ransomware attack, they don't have two-factor authentication on, or, you know, their employees were using stupid passwords. Like, you can mitigate that in the future. This one, they don't know. They probably don't know. Yeah, and it was, I guess, zero-click, because I didn't have to do anything. The only thing I, well, you know, here's the thing. I did, you know, the basics. You know, I put it behind a firewall, I followed instructions, but, like, I wasn't, I didn't really pay attention. So maybe there's, like, maybe there's a misconfiguration of some sort that's easy to make.

[367] It's difficult.

[368] We have a personal NAS.

[369] So I don't, I'm not willing to sort of say that I did everything I possibly could.

[370] But I did a lot of reasonable stuff and they still hit it with zero clicks.

[371] I didn't have to do anything.

[372] Yeah, well, it's like a zero day and it's a supply chain attack.

[373] You know, you're getting hit from your supplier.

[374] You're getting hit because of your vendor.

[375] And it's also a new thing for ransomware groups to go to the individuals to pressure them to pay.

[376] There was this really interesting case.

[377] I think it was in Norway where there was a mental health clinic that got hit.

[378] And the cyber criminals were going to the patients themselves to say, pay this, or we're going to release your psychiatric records.

[379] I mean, talk about hell.

[380] In terms of whether to pay, you know, that is on the

[381] cheaper end of the spectrum.

[382] From the individual or from the company?

[383] Both.

[384] You know, we've seen, for instance, there was an Apple supplier in Taiwan.

[385] They got hit and the ransom demand was 50 million.

[386] You know, I'm surprised it's only 1.8 million.

[387] I'm sure it's going to go up.

[388] And it's hard, you know, there's obviously governments and maybe in this case the company are going to tell you, we recommend you don't pay or please don't pay.

[389] But the reality on the ground is that some businesses can't operate, some countries can't function.

[390] I mean, the underreported storyline of Colonial Pipeline was, after the company got hit and took the preemptive step of shutting down the pipeline because their billing systems were frozen, they couldn't charge customers downstream.

[391] My colleague David Sanger and I got our hands on a classified assessment that said that, as a country, we could have only afforded two to three more days of Colonial Pipeline being down.

[392] And it was really interesting.

[393] I thought it was the gas and the jet fuel, but it wasn't.

[394] You know, we were sort of prepared for that.

[395] It was the diesel.

[396] Without the diesel, the refineries couldn't function.

[397] And it would have totally screwed up the economy.

[398] And so there was almost this, like, national security economic impetus for them to pay this ransom.

[399] And the other one I always think about is Baltimore.

[400] You know, when the city of Baltimore got hit, I think the initial ransom demand was something around $76,000.

[401] It may have even started smaller than that.

[402] And Baltimore stood its ground and didn't pay, but ultimately the cost to remediate was $18 million.

[403] It's a lot for the city of Baltimore.

[404] That's money that could have gone to public school education and roads and, you know, public health.

[405] And instead, it just went to rebuilding these systems from scratch.

[406] And so a lot of residents in Baltimore were like, why the hell didn't you pay the $76,000?

[407] So it's not obvious.

[408] You know, it's easy to say don't pay, because, why, you're funding their R&D for the next go-round.

[409] But, too

[410] often, it's too complicated.

[411] So on the individual level, just like, you know, the way I feel personally from this attack, have you talked to people that were kind of victims in the same way I was, but maybe more dramatic ways or so on, you know, in the same way that violence hurts people?

[412] Yeah.

[413] How much is this hurt people in your sense and the way you researched it?

[414] The worst ransomware attack I've covered on a personal level was an attack.

[415] on a hospital in Vermont. And, you know, you think of this as, like, okay, it's hitting their IT networks, they should still be able to treat patients. But it turns out that cancer patients couldn't get their chemo anymore, because the protocol of who gets what is very complicated, and without it, nurses and doctors couldn't access it. So they were turning chemo patients away, cancer patients away.

[416] One nurse told us, I don't know why people aren't screaming about this, that the only thing I've seen that even compares to what we're seeing at this hospital right now was when I worked in the burn unit after the Boston Marathon bombing.

[417] You know, they really put it in these super dramatic terms.

[418] And last year, there was a report in the Wall Street Journal where they attributed an infant death to a ransomware attack because a mom came in and whatever device they were using to monitor the fetus wasn't working because of the ransomware attack.

[419] And so they attributed this infant death to the ransomware attack.

[420] Now, on a bigger scale but less personal, when there was the NotPetya attack, so this was an attack by Russia on Ukraine.

[421] that came at them through a supplier, a tax software company in that case, that didn't just hit any government agency or business in Ukraine that used this tax software.

[422] It actually hit any business all over the world that had even a single employee working remotely in Ukraine.

[423] So it hit Maersk, the shipping company, it hit Pfizer, it hit FedEx, but the one I will never forget is Merck.

[424] It paralyzed Merck's factories.

[425] I mean, it really created an existential crisis for the company.

[426] Merck had to tap into the CDC's emergency supplies of the Gardasil vaccine that year because their whole vaccine production line had been paralyzed in that attack.

[427] Imagine if that was going to happen right now to Pfizer or Moderna or Johnson and Johnson.

[428] You know, imagine.

[429] I mean, that would really create

[430] a global cyber terrorist attack, essentially.

[431] And that's almost unintentional.

[432] I thought for a long time, I always labeled it as collateral damage.

[433] But actually just today, there was a really impressive threat researcher at Cisco, which has this threat intelligence division called Talos, who said, stop calling it collateral damage.

[434] They could see who was going to get hit before they deployed that malware.

[435] It wasn't collateral damage.

[436] It was intentional.

[437] They meant to hit any business that did business with Ukraine.

[438] It was to send a message to them too.

[439] So I don't know if that's accurate.

[440] I always thought of it as sort of this sloppy collateral damage, but it definitely made me think.

[441] So how much of this, between states, is going to be a part of war, these kinds of attacks? On Ukraine, between Russia and the U.S., Russia and China, China and the U.S. Let's look at China and the U.S. Do you think China and the U.S. are going to escalate something that would be called a war, purely in the space of cyber?

[442] I believe any geopolitical conflict from now on is guaranteed to have some cyber element to it.

[443] The Department of Justice recently declassified a report that said China's been hacking into our pipelines and it's not for intellectual property theft.

[444] It's to get a foothold so that if things escalate in Taiwan, for example, they are where they need to be to shut our pipeline

[445] down.

[446] And we just got a little glimpse of what that looked like with Colonial Pipeline and the panic buying and the jet fuel shortages and that assessment I just mentioned about the diesel.

[447] So they're there.

[448] You know, they've got in there.

[449] Anytime I read a report about new aggression from fighter jets, Chinese fighter jets in Taiwan, or what's happening right now with Russia's build up on the Ukraine border or India, Pakistan.

[450] I'm always looking at it through a cyber lens, and it really bothers me that other people aren't because there is no way that these governments and these nation states are not going to use their access to gain some advantage in those conflicts.

[451] And I'm now in a position where I'm an advisor to the Cybersecurity and

[452] Infrastructure Security Agency at DHS.

[453] So I'm not saying anything classified here, but I just think that it's really important to understand just generally what the collateral damage could be for American businesses and critical infrastructure in any of these escalated conflicts around the world.

[454] Because just generally, our adversaries have learned that they might never be able to match us in terms of our traditional military spending on traditional weapons and fighter jets.

[455] But we have a very soft underbelly when it comes to cyber.

[456] 80% or more of America's critical infrastructure,

[457] so pipelines, power grid, nuclear plants, water systems, is owned and operated by the private sector.

[458] And for the most part, there is nothing out there legislating that those companies share the fact they've been breached.

[459] They don't even have to tell the government they've been hit.

[460] There's nothing mandating that they even meet a bare minimum standard of cybersecurity.

[461] And that's it.

[462] So even when there are these attacks, most of the time, we don't even know about it.

[463] So that is, you know, if you were going to design a system to be as blind and vulnerable as possible,

[464] that's pretty good.

[465] That's what it looks like. That's what we have here in the United States.

[466] And everyone here is just operating like, let's just keep hooking up everything for convenience.

[467] You know, software eats the world.

[468] Let's just keep going for cost, for convenience sake, just because we can.

[469] And when you study these issues and you study these attacks, and you study the advancement and the uptick in frequency and the lower barrier to entry that we see every single year, you realize just how dumb "software eats the world" is.

[470] And no one has ever stopped to pause and think, should we be hooking up these systems to the internet?

[471] They've just been saying, can we?

[472] Let's do it.

[473] And that's a real problem.

[474] And just in the last year, you know, we've seen a record number of zero -day attacks.

[475] I think there were 80 last year, which is probably more than double what it was in 2019.

[476] A lot of those were nation states.

[477] You know, we live in a world with a lot of geopolitical hot points right now.

[478] And where those geopolitical hot points are, are places where countries have been investing heavily in offensive cyber tools.

[479] If you're a nation state, the goal would be to maximize the footprint of zero-days, like super-secret zero-days that nobody's aware of.

[480] And whenever war is initiated, the huge negative effect of shutting down infrastructure with any kind of zero-day is the chaos it creates.

[481] So there's a certain threshold: when you create the chaos, the markets plummet, just everything goes to hell.

[482] It's not just zero days.

[483] It's, you know, we make it so easy for threat actors.

[484] I mean, we're not using two -factor authentication.

[485] We're not patching.

[486] There was the Shellshock vulnerability that was discovered a couple years ago.

[487] It's still being exploited, because so many people haven't fixed it.

[488] So, you know, the zero days are really the sexy stuff.

[489] And what really drew me to the zero-day market was the moral calculus we talked about,

[490] particularly from, you know, the U.S. government's point of view: how do they justify leaving these systems so vulnerable when we use them here, and we're baking more of our critical infrastructure with this vulnerable software?

[491] You know, it's not like we're using one set of technology and Russia is using another and China's using this.

[492] We're all using the same technology.

[493] So when you find a zero day in Windows, you know, you're not just leaving it open so you can spy on Russia or implant yourself in the Russian grid, you're leaving Americans vulnerable too.

[494] But zero days are like that is the secret sauce.

[495] That's the superpower.

[496] And I always say, like, every country now, with the exception of Antarctica, someone added the Vatican to my list, is trying to find offensive hacking tools and zero-days to make them work.

[497] And those that don't have the skills now have this market that they can tap into, where, you know, $2.5 million, that's chump change for a lot of these nation states. It's a hell of a lot less than trying to build the next fighter jet. But yeah, the goal is chaos. I mean, why did Russia turn off the lights twice in Ukraine? You know, I think part of it is chaos. I think part of it is to sow the seeds of doubt in their current government. Your government can't even keep your lights on. Why are you sticking with them?

[498] You know, come over here and we'll keep your lights on at least.

[499] You know, there's like a little bit of that.

[500] Nuclear weapons seems to have helped prevent nuclear war.

[501] Is it possible that we have so many vulnerabilities and so many attack vectors on each other that it will kind of achieve the same kind of equilibrium, like mutually assured destruction?

[502] Yeah.

[503] That's one hopeful solution

[504] to this.

[505] Do you have any hope for this particular solution?

[506] You know, nuclear analogies always tend to fall apart when it comes to cyber, mainly because you don't need fissile material.

[507] You know, you just need a laptop and the skills and you're in the game.

[508] So it's a really low barrier to entry.

[509] The other thing is attribution's harder.

[510] And we've seen countries muck around with attribution.

[511] We've seen, you know, nation states piggyback on other countries' spy operations and just sit there and siphon out whatever they're getting.

[512] We learned some of that from the Snowden documents.

[513] We've seen Russia hack into Iran's command and control attack servers.

[514] We've seen them hit a Saudi petrochemical plant, where they did neutralize the safety locks at the plant, and everyone assumed that it was Iran, given Iran had been targeting Saudi oil companies forever.

[515] But nope, it turned out that it was a graduate research institute outside Moscow.

[516] So you see countries kind of playing around with attribution.

[517] Why?

[518] I think because they think, okay, if I do this, like, how am I going to cover up that it came from me because I don't want to risk the response.

[519] So people are sort of dancing around this.

[520] It's just in a very different way.

[521] And, you know, at The Times, I'd covered the Chinese hacks of infrastructure companies like pipelines.

[522] I'd covered the Russian probes of nuclear plants.

[523] I'd covered the Russian attacks on the Ukraine grid.

[524] And then in 2018, my colleague David Sanger and I covered the fact that U.S. Cyber Command had been hacking into the Russian grid and making a pretty loud show of it.

[525] And when we went to the National Security Council, because that's what journalists do before they publish a story, they give the other side a chance to respond, I assumed we would be in for that really awkward, painful conversation where they would say, you will have blood on your hands if you publish this story.

[526] And instead, they gave us the opposite answer.

[527] They said, we have no problem with you publishing this story.

[528] Why?

[529] Well, they didn't say it out loud, but it was pretty obvious they wanted Russia to know that we're hacking into their power grid, too, and they better think twice before they do to us what they had done to Ukraine.

[530] So, yeah, you know, we have stumbled into this new era of mutually assured digital destruction.

[531] I think another sort of quasi -norm we've stumbled into is proportional responses.

[532] You know, there's this idea that if you get hit, you're allowed to respond proportionally at a time and place of your choosing.

[533] You know, that is how the language always goes.

[534] That's what Obama said after North Korea hit Sony.

[535] We will respond at a time and place of our choosing.

[536] But no one really knows what that response looks like. And so what you see a lot of the time are just these, like, just-short-of-war attacks. You know, Russia turned off the power in Ukraine, but it wasn't like it stayed off for a week. You know, it stayed off for a number of hours. You know, NotPetya hit those companies pretty hard, but no one died. You know, and the question is, what's going to happen when someone dies? And can a nation state masquerade as a cybercriminal group, as a ransomware group?

[537] And that's what really complicates coming to some sort of digital Geneva convention.

[538] Like there's been a push from Brad Smith at Microsoft.

[539] We need a digital Geneva convention.

[540] And on its face, it sounds like a no -brainer.

[541] Yeah.

[542] Why wouldn't we all agree to stop hacking into each other's civilian hospital systems, elections, power grids,

[543] pipelines?

[544] But when you talk to people in the West, officials in the West, they'll say we would never, we'd love to agree to it, but we'd never do it when you're dealing with Xi or Putin or Kim Jong -un because a lot of times they outsource these operations to cyber criminals.

[545] In China, we see a lot of these attacks come from this loose satellite network of private citizens that work at the behest of the Ministry of State Security.

[546] So how do you come to some sort of state to state agreement when you're dealing with transnational actors and cybercriminals where it's really hard to pin down whether that person was acting alone or whether they were acting at the behest of the MSS or the FSB?

[547] And a couple years ago, I remember, I can't remember if it was before or after NotPetya, but Putin said, hackers are like artists who wake up in the morning in a good mood and start painting.

[548] In other words, I have no say over what they do or don't do.

[549] So how do you come to some kind of norm when that's how he's talking about these issues?

[550] And he's just decimated Merck and, you know, Pfizer and another, you know, however many thousand companies.

[551] That is the fundamental difference between nuclear weapons and cyber attacks is the attribution, or one of the fundamental differences.

[552] If you can fix one thing in the world in terms of the cybersecurity that would make the world a better place, what would you fix?

[553] So you're not allowed to fix, like, authoritarian regimes. You can't.

[554] Right.

[555] You have to keep that.

[556] You have to keep human nature as it is.

[557] In terms of on the security side, technologically speaking, you mentioned there's no regulation on companies in the United States.

[558] What if you could just fix, with a snap of a finger, what would you fix? Two-factor authentication. Multi-factor authentication. It's ridiculous how many of these attacks come in because someone didn't turn on multi-factor authentication. I mean, Colonial Pipeline, okay, they took down the biggest conduit for gas, jet fuel, and diesel to the East Coast of the United States of America.

[559] How?

[560] Because they forgot to deactivate an old employee account whose password had been traded on the dark web, and they'd never turned on two-factor authentication.

[561] This water treatment facility outside Florida was hacked last year.

[562] How did it happen?

[563] They were using Windows XP from, like, a decade ago, that can't even get patches if you wanted it to, and they didn't have two-factor authentication.

[564] Time and time again, if they just switched on two-factor authentication, some of these attacks wouldn't have been possible.

[565] Now, if I could snap my fingers, that's the thing I would do right now.

[566] But of course, you know, this is a cat and mouse game, and then the attackers are on to the next thing.

[567] But I think right now, that is like bar none.

[568] That is just, that is the easiest, simplest way to deflect the most attacks.

[569] And, you know, the name of the game right now isn't perfect security.

[570] Perfect security is impossible.

[571] They will always find a way in.

[572] The name of the game right now is make yourself a little bit harder to attack than your competitor than anyone else out there so that they just give up and move along.

[573] And maybe if you are a target for an advanced nation state or the SVR, you know, you're going to get hacked no matter what.

[574] But for cybercriminal groups, you can deadbolt the door.

[575] You can make their jobs a lot harder simply by doing the bare basics.

[576] And the other thing is, stop reusing your passwords.

[577] But if I only get one, then two-factor authentication.

[578] So what is two -factor authentication?

[579] Factor one is what, logging in with a password, and factor two is like have another device or another channel through which you can confirm, yeah, that's me. Yes.

[580] You know, usually this happens through some kind of text.

[581] You know, you get your one -time code from Bank of America or from Google.

[582] The better way to do it is spend $20 buying yourself a FIDO key on Amazon.

[583] That's a hardware device.

[584] and if you don't have that hardware device with you, then you're not going to get in.
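
The one-time codes mentioned a moment ago are typically generated with the standard TOTP algorithm (RFC 6238), the scheme behind apps like Google Authenticator. Here's a minimal stdlib-only sketch; the secret in the example is the RFC's published test key, not a real credential:

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, when=None, digits=6, step=30):
    """Time-based one-time password (RFC 6238) over HMAC-SHA1."""
    key = base64.b32decode(secret_b32, casefold=True)
    # Counter = number of 30-second steps since the Unix epoch.
    counter = int((time.time() if when is None else when) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = digest[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: the ASCII key "12345678901234567890", base32-encoded.
SECRET = "GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ"
print(totp(SECRET, when=59))  # "287082", per the RFC's SHA-1 test table
```

Both sides derive the same code from a shared secret and the current time, which is why a stolen password alone isn't enough; a FIDO key goes further still, since the secret never leaves the hardware.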

[585] And the whole goal is, I mean, basically, you know, the first half of my decade at The Times was spent covering, like, the cop beat.

[586] It was like Home Depot got breached, News at 11, you know, Target, Neiman Marcus, like who wasn't hacked over the course of those five years?

[587] And a lot of those companies that got hacked, what did hackers take?

[588] They took the credentials.

[589] They took the passwords.

[590] They can make a pretty penny selling them on the dark web, and people reuse their passwords.

[591] So you get one from, you know, God knows who, I don't know, LastPass, the worst-case example, actually, LastPass.

[592] But you get one, and then you go test it on their email account, and you go test it on their brokerage account, and you test it on their cold storage account.

[593] You know, that's how it works.

[594] But if you have multi -factor authentication, then they can't get in.

[595] because they might have your password, but they don't have your phone.

[596] They don't have your FIDO key, you know, and so you keep them out.

[597] And, you know, I get a lot of alerts that tell me someone is trying to get into your Instagram account or your Twitter account or your email account.

[598] And I don't worry because I use multi -factor authentication.

[599] They can try all day.

[600] Okay, I worry a little bit.

[601] But, you know, It's the simplest thing to do, and we don't even do it.

[602] Well, there's an interface aspect to it, because it's pretty annoying if it's implemented poorly.

[603] Yeah, true.

[604] So actually, a bad implementation of two-factor authentication, not just bad, but just something that adds friction, is a security vulnerability, I guess, because it's really annoying.

[605] Like, I think MIT for a while had two-factor authentication.

[606] It was really annoying.

[607] It just, like, the number of times it pings you, like, it asks to re -authenticate across multiple subdomains.

[608] Like, it just feels like a pain.

[609] I don't know what the right balance is there.

[610] Yeah.

[611] It feels like friction in our frictionless society.

[612] It feels like friction.

[613] It's annoying.

[614] That's security's biggest problem.

[615] It's annoying.

[616] You know, we need the Steve Jobs of security to come along, and we need to

[617] make it painless.

[618] And actually, you know, on that point, Apple has probably done more for security than anyone else simply by introducing biometric authentication first with the fingerprint and then with face ID.

[619] And it's not perfect.

[620] But, you know, if you think, just eight years ago, everyone was running around with either no passcode, an optional passcode, or a four-digit passcode on their phone. You know, think of what you can get when you get someone's iPhone, if you steal someone's iPhone.

[621] And, you know, props to them for introducing the fingerprint and face ID.

[622] And again, it wasn't perfect, but it was a huge step forward.

[623] Now it's time to make another huge step forward.

[624] I want to see the password die.

[625] I mean, it's gotten us as far as it was ever going to get us.

[626] And I hope whatever we come up with next is not going to be annoying.

[627] It's going to be seamless.

[628] When I was at Google, that's what we worked on. And there's a lot of ways to call it: active authentication, passive authentication.

[629] So basically use biometric data, not just like a fingerprint, but everything from your body to identify who you are, like movement patterns.

[630] So you basically create a lot of layers of protection where it's very difficult to fake, including, like, face unlock, checking that it's your actual face, like the liveness tests.

[631] So like from video.

[632] So unlocking it with video, voice, the way you move the phone, the way you take it out of the pocket, that kind of thing, all of those factors.

[633] It's a really hard problem, though.

[634] And ultimately, it's very difficult to beat the password in terms of security.
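
The passive-authentication idea above can be caricatured in a few lines: compare a vector of behavioral measurements against an enrolled profile and accept only if they are close enough. Every feature name, number, and threshold here is invented purely for illustration:

```python
import math

def cosine_similarity(a, b):
    # Cosine of the angle between two feature vectors: 1.0 means identical direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical behavioral features: typing cadence, swipe speed,
# phone pickup angle, walking gait -- all normalized to [0, 1].
enrolled_profile = [0.82, 0.40, 0.35, 0.61]
current_session  = [0.80, 0.43, 0.33, 0.58]   # the real owner, roughly matching
impostor_session = [0.20, 0.95, 0.70, 0.10]   # someone else entirely

THRESHOLD = 0.98  # would be tuned per deployment; purely illustrative

def passively_authenticated(profile, session):
    return cosine_similarity(profile, session) >= THRESHOLD

print(passively_authenticated(enrolled_profile, current_session))   # True
print(passively_authenticated(enrolled_profile, impostor_session))  # False
```

A real system would fuse many more signals continuously and fall back to an explicit factor on a failed check rather than lock the user out, which is part of why this remains hard to ship.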

[635] Well, there's a company that I actually will call out, and that's abnormal security.

[636] So they work on email attacks.

[637] And it was started by a couple guys.

[638] who were doing, I think, ad tech at Twitter.

[639] So, you know, ad technology now, like, it's a joke how much they know about us.

[640] You know, you always hear the conspiracy theories that, you know, you saw someone's shoes, and next thing you know, it's on your phone.

[641] It's amazing what they know about you.

[642] And they're basically taking that and they're applying it to attacks.

[643] So they're saying, okay, you know, if you're, this is what your email patterns are.

[644] And might be different for you and me because we're emailing strangers all the time.

[645] But for most people, their email patterns are pretty predictable.

[646] And if something strays from that pattern, that's abnormal.

[647] And they'll block it.

[648] They'll investigate it.

[649] And that's great.

[650] You know, let's start using that kind of targeted ad technology to protect people.
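
A cartoon version of the pattern-based email defense being described: learn which senders are normal for a mailbox, then flag messages that pair an unfamiliar sender with pressure language. The senders, keywords, and threshold are all made up for this sketch; real products model far richer behavioral signals:

```python
from collections import Counter

# Toy sending history for one mailbox.
history = [
    "alice@corp.com", "bob@corp.com", "alice@corp.com",
    "carol@corp.com", "bob@corp.com", "alice@corp.com",
]
baseline = Counter(history)

PRESSURE_WORDS = ("urgent", "wire transfer", "gift card", "password")

def is_abnormal(sender, subject, min_seen=2):
    # Strays from the pattern if the sender is (nearly) unknown AND the
    # subject uses the kind of urgency common in phishing and BEC scams.
    unfamiliar = baseline[sender] < min_seen
    pressured = any(word in subject.lower() for word in PRESSURE_WORDS)
    return unfamiliar and pressured

print(is_abnormal("alice@corp.com", "Q3 planning notes"))        # False
print(is_abnormal("ceo@c0rp-pay.com", "URGENT wire transfer"))   # True
```

The design choice is the one from ad tech she points to: don't look for known-bad signatures, model what normal looks like for this user and investigate the deviations.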

[651] And yeah, I mean, it's not going to get us away from the password and using multifactor authentication.

[652] But, you know, the technology is out there.

[653] And we just have to figure out how to use it in a really seamless way because it doesn't matter if you have the perfect security solution if no one uses it.

[654] I mean, when I started at the times when I was trying to be really good about protecting sources, I was trying to use PGP encryption.

[655] And it's like, it didn't work.

[656] You know, the number of mistakes I would put.

[657] probably make just trying to email someone with PGP just wasn't worth it.

[658] And then Signal came along.

[659] And Signal, and Wickr too.

[660] They made it a lot easier to send someone an encrypted text message.

[661] So we have to start investing in creative minds in good security design.

[662] You know, I really think that's the hack that's going to get us out of where we are today.

[663] What about social engineering? Do you worry about this, sort of hacking people? Yes. I mean, this is the worst nightmare of every chief information security officer out there. You know, social engineering, we work from home now. I saw this woman posted online about how her husband, it went viral today, but it was, her husband had this problem at work.

[664] They hired a guy named John, and now the guy that shows up for work every day doesn't act like John.

[665] I mean, think about that.

[666] Like, think about the potential for social engineering in that context.

[667] You know, you apply for a job and you put on a pretty face, you hire an actor or something, and then you just get inside the organization and get access to all that organization's data.

[668] You know, a couple years ago, Saudi Arabia planted spies inside Twitter.

[669] You know, why?

[670] Probably because they were trying to figure out who these people were who were criticizing the regime on Twitter.

[671] You know, they couldn't do it with a hack from the outside.

[672] So why not plant people on the inside?

[673] And that's like the worst nightmare.

[674] And it also, unfortunately, creates all kinds of xenophobia at a lot of these organizations.

[675] I mean, if you're going to have to take that into consideration, then organizations are going to start looking really skeptically and suspiciously at someone who applies for that job from China.

[676] And we've seen that go really badly at places like the Department of Commerce where they basically accuse people of being spies that aren't spies.

[677] So it is the hardest problem to solve.

[678] And it's never been harder to solve than right at this very moment when there's so much pressure for companies to let people work remotely.

[679] That's actually why I'm single. I'm suspicious that China and Russia, every time I meet somebody, are trying to plant someone and get insider information.

[680] So I'm very, very suspicious.

[681] I keep putting the Turing test in front of them.

[682] No. No, I have a friend who worked inside NSA and was one of their top hackers.

[683] And he's like, every time I go to Russia, I get hit on by these tens.

[684] And I come home, my friends are like, I'm sorry, you're not a 10.

[685] Like, it's a common story.

[686] I mean, it's difficult to trust humans in this day and age online, you know, because so we're working remotely, that's one thing, but just interacting with people on the internet.

[687] It sounds ridiculous, but, you know, because of this podcast in part, I've gotten to meet some incredible people, but, you know, it makes you nervous to trust folks.

[688] and I don't know how to solve that problem.

[689] So I'm talking with Mark Zuckerberg, who dreams about creating the metaverse. What do you do about that world where more and more of our lives are in the digital sphere?

[690] Like, one way to phrase it is most of our meaningful experiences at some point will be online.

[691] Like falling in love, getting a job or experiencing a moment of happiness with a friend, with a new friend made online, all of those things.

[692] Like more and more, the fun we have, the things that make us love life, will happen online.

[693] And if those things have an avatar that's digital, that's like a way to hack into people's minds, whether it's with AI or kind of troll farms or something like that.

[694] I don't know if there's a way to protect against that.

[695] that might fundamentally rely on our faith in how good human nature is.

[696] So if most people are good, we're going to be okay.

[697] But if people tend towards manipulation and malevolent behavior in search of power, then we're screwed.

[698] So I don't know if you can comment on how to keep the metaverse secure.

[699] Yeah, I mean, all I thought about when you were talking just now is my three-year-old son.

[700] Yeah.

[701] He asked me the other day, what's the internet mom?

[702] And I just almost wanted to cry.

[703] You know, I don't want that for him.

[704] I don't want all of his most meaningful experiences to be online.

[705] You know, by the time that happens, how do you know that person's human?

[706] That avatar is human.

[707] You know, I believe in free speech.

[708] I don't believe in free speech for robots and bots.

[709] And like, look what just happened over the last six years.

[710] You know, we had bots pretending to be Black Lives Matter activists just to sow some division, or, you know, Texas secessionists, or, you know, organizing anti-Hillary protests, just to sow more division, to tie us up in our own politics,

[711] so that we're so paralyzed, we can't get anything done.

[712] We can't make any progress, and we definitely can't handle our adversaries and their long-term thinking.

[713] It really scares me, and here's where I just come back to just because we can create the metaverse, you know, just because it sounds like the next logical step in our digital revolution.

[714] Do I really want my child's most significant moments to be online?

[715] They weren't for me. You know, so maybe I'm just stuck in that old school thinking, or maybe I've seen too much.

[716] And I'm really sick of being the guinea pig parent generation for these things.

[717] I mean, it's hard enough with screen time.

[718] Like, thinking about how to manage the metaverse as a parent to a young boy, like, I can't even let my head go there.

[719] That's so terrifying for me. But we've never stopped any new technology just because it introduces risks.

[720] We've always said, okay, the promise of this technology means we should keep going, keep pressing ahead.

[721] We just need to figure out new ways to manage that risk.

[722] And, you know, that's the blockchain right now.

[723] Like, when I was covering all of these ransomware attacks, I thought, okay, this is going to be it for cryptocurrency.

[724] You know, governments are going to put the kibosh down.

[725] They're going to put the hammer down and say, enough is enough.

[726] Like, we have to put this genie back in the bottle because it's enabled ransomware.

[727] I mean, five years ago, they would hijack your PC and they'd say, go to the local pharmacy, get an e-gift card, and tell us what the PIN is, and then we'll get your $200.

[729] Now it's pay us, you know, five Bitcoin.

[730] And so there's no doubt cryptocurrency has enabled ransomware attacks. But after the Colonial Pipeline ransom was seized, because if you remember, the FBI was actually able to go in and claw some of it back from DarkSide, which was the ransomware group that hit it.

[731] And I spoke to these guys at TRM Labs.

[732] So they're one of these blockchain intelligence companies.

[733] And a lot of people that work there used to work at the Treasury.

[734] And what they said to me was, yeah, cryptocurrency has enabled ransomware.

[735] But to track down that ransom payment would have taken, you know, if we were dealing with fiat currency, it would have taken us years to get to that one bank account belonging to that one front company in the Seychelles.

[736] And now, thanks to the blockchain, we can track the movement of those funds in real time.

[737] And you know what?

[738] These payments are not as anonymous as people think.

[739] Like we still can use our old hacking ways and zero days and, you know, old school intelligence methods to find out who owns that private wallet and how to get to it.
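The tracing she's describing works because every transaction sits on a public ledger: starting from a known ransom address, investigators can walk the transaction graph out to the wallets and exchanges the funds reach. Here's a toy sketch of that idea; the addresses, amounts, and graph are made up for illustration and are not TRM Labs' actual methods.

```python
from collections import deque

# Hypothetical public ledger: (sender, receiver, amount) records.
# On a real blockchain these are all visible to anyone.
transactions = [
    ("ransom_addr", "mixer_1", 5.0),
    ("mixer_1", "front_co", 3.0),
    ("mixer_1", "exchange_A", 2.0),
    ("front_co", "exchange_B", 3.0),
]

def trace(start: str) -> set:
    """Breadth-first walk of every address reachable from `start`."""
    seen, queue = {start}, deque([start])
    while queue:
        addr = queue.popleft()
        for src, dst, _amount in transactions:
            if src == addr and dst not in seen:
                seen.add(dst)
                queue.append(dst)
    return seen

reached = trace("ransom_addr")
# Regulated exchanges are the endpoints where funds can be identified and seized.
assert {"exchange_A", "exchange_B"} <= reached
```

With fiat, each hop would mean a subpoena to a bank; here the whole graph is public, which is why the funds can be followed in close to real time.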

[740] So it's a curse in some ways and that it's an enabler, but it's also a blessing.

[741] And they said that same thing to me that I just said to you.

[742] They said, we've never shut down a promising new technology because it introduced risk; we just figured out how to manage that risk.

[743] And I think that's where the conversation unfortunately has to go, is how do we, in the metaverse, use technology to fix things.

[744] So maybe we'll finally be able to, not finally, but figure out a way to solve the identity problem on the internet, meaning, like, a blue checkmark for an actual human, connected to identity, like a fingerprint, so you can prove you're you, and yet do it in a way that doesn't involve the company having all your data.

[745] So giving you, allowing you to maintain control over your data, or if you don't, then there's a complete transparency of how that data is being used, all those kinds of things.

[746] And maybe as you educate more and more people, they would demand in a capitalist society that the companies that they give their data to will respect that data.

[747] Yeah.

[748] I mean, there is this company, and I hope they succeed.

[749] Their name's Piiano, like PII, piano.

[750] And they want to create a vault for your personal information inside every organization.

[751] And ultimately, if I'm going to call Delta Airlines to book a flight, they don't need to know my social security number.

[752] They don't need to know my birth date.

[753] They're just going to send me a one -time token to my phone.

[754] My phone's going to say, or my, you know, FIDO key is going to say, yep, it's her.

[755] And then we're going to talk about my identity like a token, you know, some random token.

[756] They don't need to know exactly who I am.

[757] They just need to trust that I am who I say I am.

[758] But they don't get access to my PII data.

[759] They don't get access to my social security number, my location, or the fact I'm a Times journalist.
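The vault idea she's describing can be sketched in a few lines: the business stores only an opaque token, and the PII itself never leaves the vault. This is a hypothetical illustration, assuming a simple in-memory vault, not Piiano's actual API.

```python
import secrets

class PIIVault:
    """Hypothetical PII vault: personal data lives only in here."""
    def __init__(self):
        self._records = {}  # token -> PII, never exposed outside the vault

    def tokenize(self, pii: dict) -> str:
        token = secrets.token_hex(8)  # opaque handle that stands in for the person
        self._records[token] = pii
        return token

    def detokenize(self, token: str) -> dict:
        # Privileged vault-side call; ordinary services never invoke this.
        return self._records[token]

class Airline:
    """The airline only ever stores the token, not the PII itself."""
    def __init__(self):
        self.bookings = {}  # booking id -> customer token

    def book_flight(self, customer_token: str) -> str:
        booking_id = secrets.token_hex(4)
        self.bookings[booking_id] = customer_token  # no SSN, no birth date
        return booking_id

vault = PIIVault()
token = vault.tokenize({"name": "N. P.", "ssn": "***-**-****"})
airline = Airline()
booking = airline.book_flight(token)
assert airline.bookings[booking] == token  # airline's database holds only the token
```

A real deployment would add the one-time confirmation step she mentions (a code to your phone, or a FIDO key) before anyone acts on a token, plus access controls and audit logs around `detokenize`.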

[760] You know, I think that's the way the world's going to go.

[761] Enough is enough on sort of losing our personal information everywhere, letting data marketing companies track our every move.

[762] You know, they don't need to know who I am.

[763] You know, okay, I get it.

[764] You know, we're stuck in this world where the internet runs on ads.

[765] so ads are not going to go away.

[766] But they don't need to know I'm Nicole Perlroth.

[767] They can know that I'm token number, you know, X567.

[768] And they can let you know what they know and give you control about removing the things they know.

[769] Yeah, right to be forgotten.

[770] To me, you should be able to walk away with a single press of a button.

[771] And I also believe that most people, given the choice to walk away, won't walk away.

[772] They'll just feel better about having the option to walk away when they understand the tradeoffs.

[773] If you walk away, you're not going to get some of the personalized experiences that you would otherwise get, like a personalized feed and all those kinds of things.

[774] But the freedom to walk away is, I think, really powerful.

[775] And obviously what you're saying, there's all of these HTML forms.

[776] We have to enter our phone number and email and private information, from Delta to every single airline.

[777] New York Times, I have so many opinions on this. Just the friction in the sign-up and all those kinds of things. This has to do with everything. This has to do with payment, too. The payment should be trivial. It should be one click: one click to unsubscribe and subscribe, and one click to provide all of your information that's necessary for the subscription service, for the transaction service, whatever that is, getting a ticket. As opposed to, I have all these fake phone numbers and emails that I use to sign up, because, you know, you never know. If one site is hacked, then it's just going to propagate to everything else.

[778] Yeah.

[779] And, you know, there's low-hanging fruit, and I hope Congress does something, and frankly, I think it's negligent they haven't, on the fact that elderly people are getting spammed to death on their phones these days with fake car warranty scams.

[780] And I mean, my dad was in the hospital last year, and I was in the hospital room, and his phone kept buzzing. And I look at it, and it's just spam attack after spam attack, people nonstop calling about his freaking car warranty.

[781] Why? They're trying to get his social security number.

[782] They're trying to get his PII.

[783] They're trying to get this information.

[784] We need to figure out how to put those people in jail for life.

[786] And we need to figure out why in the hell we are being required or asked to hand over our social security number and our home address and our passport, you know, all of that information to every retailer who asks.

[787] I mean, that's insanity.

[788] And there's no question.

[789] They're not protecting it, because it keeps showing up in, you know, spam or identity theft or credit card theft or worse.

[791] Well, the spam is getting better.

[792] And maybe I need to, as a side note, make a public announcement.

[793] Please clip this out, which is: if you get an email or a message from Lex Fridman saying how much I, Lex, you know, appreciate you and love you and so on,

[794] And please connect with me on my WhatsApp number and I will give you Bitcoin or something like that, please do not click.

[795] And I'm aware that there's a lot of this going on, a very large amount.

[796] I can't do anything about it.

[797] This is on every single platform.

[798] It's happening more and more, and I've been recently informed that they're now emailing.

[799] So it's cross platform.

[800] They're taking people's, they're somehow, this is fascinating to me, because they are taking people who comment on various social platforms, and they somehow reverse engineer, they figure out what their email is, and they send an email to that person, saying it's from Lex Fridman, and it's like a heartfelt email with links.

[801] It's fascinating because it's cross platform now.

[802] It's not just a spam bot that's messaging and a comment that's in a reply.

[803] They are saying, okay, this person cares about this other person on social media.

[804] So I'm going to find another channel, which in their mind probably increases, and it does, the likelihood that they'll get the people to click. And they do. I don't know what to do about that. It makes me really, really sad, especially with podcasting. There's an intimacy, people feel connected, and they get really excited, okay, cool, I want to talk to Lex, and they click. And, like, I get angry at the people that do this.

[805] I mean, you're, um, it's like the John that gets hired, uh, the fake employee.

[806] I mean, I don't know what to do about that.

[807] I mean, I suppose the solution is education.

[808] It's, uh, telling people to be skeptical about what they click.

[809] Uh, it's, that's, that balance with the technology solution of creating a, um, maybe like two -factor authentication and, um, maybe helping identify things that are likely to be spam.

[810] I don't know.

[811] But then the machine learning there is tricky, because you don't want to add a lot of extra friction that just annoys people, because they'll turn it off. You have the accept-cookies thing, right, that everybody has to click on now, so now they completely ignore it and just accept cookies.

[812] It's very difficult to find that frictionless security.

[813] You mentioned Snowden.

[814] You've talked about looking through the NSA documents he leaked and doing the hard work of that.

[816] What do you make of Edward Snowden?

[817] What have you learned from those documents?

[818] What do you think of him?

[819] In the long arc of history, is Edward Snowden a hero or a villain?

[820] I think he's neither.

[821] I have really complicated feelings about Edward Snowden.

[822] On the one hand, I'm a journalist at heart.

[823] And more transparency is good.

[824] And I'm grateful for the conversations that we had in the post-Snowden era about the limits to surveillance and how critical privacy is.

[825] And when you have no transparency and you don't really know in that case what our secret courts were doing, how can you truly believe that our country is taking our civil liberties seriously?

[826] So on the one hand, I'm grateful that he cracked open these debates.

[827] On the other hand, when I walked into the storage closet of classified NSA secrets, I had just spent two years covering Chinese cyber espionage almost every day, and this sort of advancement of Russian attacks that were just getting worse and worse and more destructive.

[828] And there were no limits to Chinese cyber espionage and Chinese surveillance of its own citizens.

[829] And there seemed to be no limit to what Russia was willing to do in terms of cyber attacks, and also in some cases assassinating journalists.

[831] So when I walked into that room, there was a part of me, quite honestly, that was relieved to know that the NSA was as good as I hoped they were.

[832] And we weren't using that knowledge to, as far as I know, assassinate journalists.

[833] We weren't using our access to, you know, take out pharmaceutical companies.

[834] For the most part, we were using it for traditional espionage.

[835] Now, that set of documents also set me on the journey of my book because to me, the American people's reaction to the Snowden documents was a little bit misplaced.

[836] You know, they were upset about the phone call metadata.

[837] collection program.

[838] Angela Merkel, I think, rightfully was upset that we were hacking her cell phone, but in sort of the spy-eat-spy world, hacking world leaders' cell phones is pretty much what most spy agencies do.

[839] And there wasn't a lot that I saw in those documents that was beyond what I thought a spy agency does.

[840] And I think if there was another 9/11 tomorrow, God forbid, we would all say, how did the NSA miss this?

[841] Why weren't they spying on those terrorists?

[842] Why weren't they spying on those world leaders?

[843] You know, there's some of that too.

[844] But I think that there was great damage done to the U.S.'s reputation.

[845] I think we really lost our halo as a protector of civil liberties.

[846] And I think a lot of what was reported was unfortunately reported in a vacuum.

[847] That was my biggest gripe, that we were always reporting the NSA has this program and here's what it does, and the NSA is in Angela Merkel's cell phone, and the NSA can do this. And no one was saying, and by the way, China has been hacking into our pipelines,

[848] and they've been making off with all of our intellectual property, and Russia's been hacking into our energy infrastructure, and they've been using the same methods to spy on, track, and in many cases kill their own journalists, and the Saudis have been doing this to their own critics and dissidents. And so you can't talk about any of these countries in isolation.

[849] It is really like spy versus spy out there.

[850] And so I just have complicated feelings.

[851] And the other thing is, and I'm sorry, this is a little bit of a tangent, but the amount of documents that we had, like thousands of documents, most of which were just crap, but had people's names on them.

[852] You know, part of me wishes that those documents had been released in a much more targeted, limited way.

[853] It's just, a lot of it just felt like a PowerPoint that was taken out of context, and you just sort of wish that there had been a little bit more thought into what was released, because I think a lot of the impact from Snowden was just the volume of the reporting. But I think, you know, based on what I saw personally, there was a lot of stuff where I just don't know why that particular thing got released.

[854] As a whistleblower, what's the better way to do it?

[855] Because, I mean, there's fear.

[856] There's, it takes a lot of effort to do a more targeted release.

[857] You know, if there's proper channels, you're afraid that those channels would be manipulated.

[858] Like, who do you trust?

[859] What's a better way to do this, do you think?

[860] As a journalist, this is almost like a journalistic question.

[861] How do you reveal some fundamental flaw in the system without destroying the system?

[862] I bring up, you know, again, Mark Zuckerberg and Meta, there was a whistleblower that came out about Instagram internal studies.

[863] And I'm also torn about how to feel about that whistleblower.

[864] Because from a company perspective, that's an open culture. How can you operate successfully if you have an open culture where any one whistleblower can come out, take a study out of context, whether it represents the larger context or not, and the press eats it up?

[865] And then that creates a narrative that is, just like with the NSA, you said, out of context, very targeted, to where, well, Facebook is evil, clearly, because of this one leak.

[866] It's really hard to know what to do there, because we're now in a society that deeply distrusts institutions.

[867] And so narratives by whistleblowers make that whistleblower and their forthcoming book very popular.

[868] And so there's a huge incentive to take stuff out of context and to tell stories that don't represent the full context, the full truth.

[869] It's hard to know what to do with that.

[870] Because then that forces Facebook, meta, and governments to be much more conservative, much more secretive.

[871] It's like a race to the bottom.

[872] I don't know.

[873] I don't know if you can comment on any of that: how to be a whistleblower ethically and properly?

[874] I don't know.

[875] I mean, these are hard questions.

[876] And, you know, even for myself, like, in some ways, I think of my book as sort of blowing the whistle on the underground zero day market.

[877] But, you know, it's not like I was in the market myself.

[878] It's not like I had access to classified data when I was reporting out that book.

[879] You know, as I say in the book, like, listen, I'm just trying to scrape the surface here so we can have these conversations before it's too late.

[880] And, you know, I'm sure there's plenty in there, and someone who's, you know, the U.S. intelligence agencies' preeminent zero-day broker probably has some voodoo doll of me out there.

[881] And, you know, you're never going to get at 100%.

[882] But I really applaud whistleblowers like the whistleblower who blew the whistle on the Trump call with Zelensky.

[883] I mean, people needed to know about that, that we were basically in some ways blackmailing an ally to try to influence an election.

[884] I mean, they went through the proper channels.

[885] They weren't trying to profit off of it, right?

[886] There was no book that came out afterwards from that whistleblower.

[887] That whistleblower went through the channels.

[888] They're not living in Moscow.

[889] You know, let's put it that way.

[890] Can I ask you a question?

[891] You mentioned NSA.

[892] One of the things those documents showed is that they're pretty good at what they do.

[893] Again, this is a touchy subject, I suppose, but there's a lot of conspiracy theories about intelligence agencies. From your understanding of intelligence agencies,

[894] CIA, NSA, and the equivalents in other countries: one question, and this could be a dangerous question, are they competent, are they good at what they do?

[895] And two, are they malevolent in any way?

[896] Sort of, I've recently had a conversation about tobacco companies that kind of see their customers as dupes.

[897] Like, they can just play games with people.

[899] Conspiracy theories tell a similar story about intelligence agencies, that they're interested in manipulating the populace for whatever ends the powerful decide in dark, cigar-smoke-filled rooms.

[900] What's your sense?

[901] Do these conspiracy theories have kind of any truth to them?

[902] Or are intelligence agencies, for the most part, good for society?

[903] Okay, well, that's an easy one.

[904] Is it?

[905] No. I think, you know, it depends which intelligence agency.

[906] Think about the Mossad.

[907] You know, they're killing every Iranian nuclear scientist they can over the years, you know.

[908] But have they delayed the time horizon before Iran gets the bomb?

[909] Yeah.

[910] Have they probably staved off terror attacks on their own citizens?

[911] Yeah.

[912] You know, with none of these intelligence agencies can you just say, like, they're malevolent or they're heroes.

[914] You know, everyone I have met in this space is not like the pound-your-chest patriot that you see on, you know, the beach on the Fourth of July.

[916] A lot of them have complicated feelings about their former employers.

[917] One of them, at least, at the NSA, reminded me: to do what we were accused of doing after Snowden, to spy on Americans?

[918] You have no idea the amount of red tape and paperwork and bureaucracy it would have taken to do what everyone thinks that we were supposedly doing.

[919] But then, you know, we find out in the course of the Snowden reporting about a program called LOVEINT, where a couple of the NSA analysts were using their access to spy on their ex-girlfriends.

[920] So, you know, there's an exception to every case.

[921] Generally, I will probably get, you know, accused of my Western bias here again, but I think you can hardly compare some of these Western intelligence agencies to China, for instance.

[922] And the surveillance that they're deploying on the Uyghurs, to the level they're deploying it.

[923] And the surveillance they're starting to export abroad with some of the programs like the watering hole attack I mentioned earlier, where it's not just hitting the Uyghurs inside China, it's hitting anyone interested in the Uyghur plight outside China.

[924] I mean, it could be an American high school student writing a paper on the Uyghurs.

[925] They want to spy on that person too.

[926] You know, there's no rules in China really limiting the extent of that surveillance.

[927] And we all better pay attention to what's happening with the Uyghurs, because just as Ukraine has been Russia's test kitchen for its cyber attacks, the Uyghurs are China's test kitchen for surveillance.

[929] And there's no doubt in my mind that they're testing these surveillance tools on the Uyghurs.

[930] The Uyghurs are their petri dish, and eventually they will export that level of surveillance overseas.

[931] I mean, in 2015, Obama and Xi Jinping reached a deal where basically the White House said, you better cut it out on intellectual property theft.

[933] And so they made this agreement that they would not hack each other for commercial benefit.

[934] And for a period of about 18 months, we saw this huge drop -off in Chinese cyber attacks on American companies.

[935] But some of them continued.

[936] Where did they continue?

[937] They continued on aviation companies, on hospitality companies like Marriott.

[938] Why?

[939] Because that was still considered fair game to China.

[940] It wasn't IP theft they were after.

[942] They wanted to know who was staying in this city at this time, when Chinese citizens were staying there, so they could cross-match, for counterintelligence, who might be a likely Chinese spy.

[943] I'm sure we're doing some of that too.

[944] Counterintelligence is counterintelligence.

[945] It's considered fair game.

[946] But where I think it gets evil is when you use it for censorship, you know, to suppress any dissent, to do what I've seen the UAE do to its citizens, where people who've gone on Twitter just to advocate for better voting rights, more enfranchisement, suddenly find their passports confiscated.

[947] You know, I talked to one critic, Ahmed Mansoor, and he told me, you know, you might find yourself labeled a terrorist one day.

[948] You don't even know how to operate a gun.

[949] I mean, he had been beaten up every time he tried to go somewhere.

[950] His passport had been confiscated.

[951] By that point, it turned out they'd already hacked into his phone, so they were listening to us talking.

[952] They'd hacked into his baby monitor.

[953] So they're spying on his child.

[954] And they stole his car.

[955] And then they created a new law that you couldn't criticize the ruling family or the ruling party on Twitter.

[956] And he's been in solitary confinement every day since, on hunger strike.

[958] So that's evil.

[959] You know, that's evil.

[960] And we still, we don't do that here.

[961] You know, we have rules here.

[962] We don't cross that line.

[963] So yeah, in some cases, like, I won't go to Dubai.

[964] You know, I won't go to Abu Dhabi.

[965] If I ever want to go to the Maldives, like too bad, like most of the flights go through Dubai.

[966] So there's some lines we're not willing to cross.

[967] But then again, just like you said, there's individuals within NSA, within CIA, and they may have power.

[969] And to me, there's levels of evil.

[970] To me personally, and this is the stuff of conspiracy theories, the things you've mentioned as evil are more direct attacks.

[971] But there's also psychological warfare.

[972] So blackmail.

[973] So what does spying allow you to do?

[974] It allows you to collect information, to find something that's embarrassing.

[975] Or, if you go by the Jeffrey Epstein conspiracy theories, active, what is it, manufacture of embarrassing things, and then using blackmail to manipulate the population or all the powerful people involved.

[976] It troubles me deeply that MIT allowed somebody like Jeffrey Epstein in their midst, especially some of the scientists I admire that they would hang out with that person at all.

[977] And so, you know, I'll talk about it sometimes.

[978] and then a lot of people tell me, well, obviously, Jeffrey Epstein is a front for intelligence.

[979] And I just, I struggle to see that level of competence and malevolence.

[980] But, you know, who the hell am I?

[981] And I guess I was trying to get to that point.

[982] You said that there's bureaucracy and so on, which makes some of these things very difficult.

[983] I wonder how much malevolence, how much competence there is in these institutions.

[984] Like how far, this takes us back to the hacking question.

[985] How far are people willing to go if they have the power?

[986] This has to do with social engineering.

[987] This has to do with hacking.

[988] This has to do with manipulating people, attacking people, doing evil onto people, psychological warfare and stuff like that.

[989] I don't know.

[990] I believe that most people are good.

[991] And I don't think that's possible in a free society.

[993] There's something that happens when you have a centralized government where power corrupts over time, and, you know, surveillance programs become kind of a slippery slope that over time starts to use both fear and direct manipulation to control the populace.

[994] But in a free society, it's difficult for me to imagine that you can have somebody like a Jeffrey Epstein. I don't know what I'm asking you, but I have a hope that, for the most part, intelligence agencies are trying to do good, and are actually doing good for the world when you view it in the full context of the complexities of the world.

[995] But then again, if they're not, would we know?

[996] That's why Edward Snowden might be a good thing.

[997] Let me ask you a personal question.

[998] You have investigated some of the most powerful organizations and people in the world of cyber warfare, cybersecurity.

[999] Are you ever afraid for your own life, your own well -being, digital, or physical?

[1000] I mean, I've had my moments.

[1001] You know, I've had our security team at the Times call me at one point and say, someone's on the dark web offering, you know, good money to anyone who can hack your phone or your laptop.

[1002] I describe in my book how when I was at that hacking conference in Argentina, I came back and I brought a burner laptop with me, but I'd kept it in the safe anyway, and it didn't have anything on it, but someone had broken in and it was moved.

[1003] You know, I've had all sorts of scary moments, and then I've had moments where I think I went just way too far into the paranoid side. I mean, I remember writing about the Times hack by China, and I'd just covered a number of Chinese cyber attacks where they'd gotten into the thermostat at someone's corporate apartment, and, you know, they'd gotten into all sorts of stuff. And I was living by myself.

[1005] I was single in San Francisco, and my cable box on my television started making some weird noises in the middle of the night, and I got up and I ripped it out of the wall, and I think I said something embarrassing like, fuck you, China, you know. And then I went back to bed, and I woke up to this beautiful morning light. I mean, I'll never forget it, like, this glimmering morning light is shining on my cable box, which has now been ripped out and is sitting on my floor. And I was just like, no, no, no, like, I'm not going down that road. Basically, I came to a fork in the road where I could either go full tinfoil hat, go live off the grid, never have a car with navigation, never use Google Maps, never own an iPhone, never order diapers off Amazon, you know, create an alias, or I could just do the best I can and live in this new digital world we're living in.

[1006] And what does that look like for me?

[1007] I mean, what are my crown jewels?

[1008] This is what I tell people.

[1009] What are your crown jewels?

[1010] Because just focus on that.

[1011] You can't protect everything, but you can protect your crown jewels.

[1012] For me, for the longest time, my crown jewels were my sources.

[1013] I was nothing without my sources.

[1014] So I had some sources.

[1015] I would meet at the same dim sum place, or maybe it was a different restaurant, on the same date, you know, every quarter.

[1016] And we would never drive there.

[1017] We would never Uber there.

[1018] We wouldn't bring any devices.

[1019] I could bring a pencil and a notepad.

[1020] And if someone wasn't in town, like there were a couple times where I'd show up and the source never came.

[1021] But we never communicated digitally.

[1022] And those were the lengths I was willing to go to protect that source.

[1023] But you can't do it for everyone.

[1024] So for everyone else, you know, it was Signal, using two-factor authentication, you know, keeping my devices up to date, not clicking on phishing emails, using a password manager, all the things that, you know, we know we're supposed to do.
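
As a side note to the transcript: the two-factor authentication codes mentioned here are typically generated with the time-based one-time password scheme (TOTP, RFC 6238), where your phone and the server each derive a short code from a shared secret and the current time, so the code itself is never stored or sent in advance. A minimal sketch (the function name and parameters are my own, for illustration only):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, t=None, digits=6, step=30):
    """Compute an RFC 6238 TOTP code: HMAC-SHA1 over the 30-second time counter."""
    key = base64.b32decode(secret_b32.upper())
    # Counter = number of `step`-second intervals since the Unix epoch.
    counter = int((time.time() if t is None else t) // step)
    msg = struct.pack(">Q", counter)  # 8-byte big-endian counter
    mac = hmac.new(key, msg, hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): pick 4 bytes at an offset from the MAC.
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)
```

A real authenticator app adds clock-drift tolerance and the server adds rate limiting; the point is only that the short code is derived from a secret plus the time, not stored anywhere.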

[1025] And that's what I tell everyone.

[1026] Like, don't go crazy because then that's like the ultimate hack.

[1027] Then they've hacked your mind, whoever they is for you.

[1028] But just do the best you can.

[1029] Now, my whole risk model changed when I had a kid.

[1030] You know, now it's, oh God, you know, if anyone threatened my family, God help them.

[1031] But.

[1033] But it's, uh, it changes you.

[1034] And, you know, unfortunately, there are some things like I was really scared to go deep on like Russian cybercrime, you know, like Putin himself, you know.

[1035] And it's interesting.

[1036] Like I have a mentor who's an incredible person who was the Times Moscow bureau chief during the Cold War.

[1037] And after I wrote a series of stories about Chinese cyber espionage, he took me out to lunch.

[1038] And he told me that when he was living in Moscow, he would drop his kids off at preschool when they were my son's age now.

[1039] And the KGB would follow him.

[1040] And they would make a really, like, loud show of it.

[1041] You know, they'd tail him.

[1042] They'd, you know, honk.

[1043] They'd just make a ruckus.

[1044] And he said, you know what, they never actually did anything.

[1045] But they wanted me to know that they were following me. And I operated accordingly.

[1046] And he says, that's how you should operate in the digital world.

[1047] Know that there are probably people following you.

[1048] Sometimes they'll make a little bit of noise.

[1049] But one thing you need to know is that while you're at the New York Times, you have a little bit of an invisible shield on you.

[1050] You know, if something were to happen to you, that would be a really big deal.

[1051] That would be an international incident.

[1052] So I kind of carried that invisible shield with me for years.

[1053] And then Jamal Khashoggi happened.

[1054] And that destroyed my vision of my invisible shield.

[1055] You know, sure, you know, he was a Saudi, but he was a Washington Post columnist.

[1056] You know, for the most part, he was living in the United States.

[1057] He was a journalist.

[1058] And for them to do what they did to him pretty much in the open and get away with it.

[1059] And for the United States to let them get away with it because we wanted to preserve diplomatic relations with the Saudis, that really threw my worldview upside down.

[1060] And, you know, I think that sent a message to a lot of countries that it was sort of open season on journalists.

[1061] And to me, that was one of the most destructive things that happened under the previous administration.

[1062] And, you know, I don't really know what to think of my invisible shield anymore.

[1063] Like you said, that really worries me on the journalism side that people would be afraid to dig deep on fascinating topics.

[1064] And, you know, I have my own, that's part of the reason, I, like, I would love to have kids, I would love to have a family.

[1065] Part of the reason I'm a little bit afraid, there's many ways to phrase this, is the loss of freedom to do all the crazy shit that I naturally do, which, I would say, is kind of the ethic of journalism: doing crazy shit without really thinking about it.

[1066] This is letting your curiosity really allow you to be free and explore.

[1067] It's, I mean, whether it's stupidity or fearlessness, whatever it is, that's what great journalism is.

[1068] And all the concerns about security risks have made me, like, become a better person.

[1069] The way I approach it is just make sure you don't have anything to hide.

[1070] I know this is not a thing.

[1071] This is not, this is not an approach to security.

[1072] I'm just, this is like a motivational speech or something.

[1073] It's just like, you can lose, you can be hacked at any moment.

[1074] Just don't be a douchebag secretly.

[1075] Just be like a good person.

[1076] Because then I see this actually with social media in general.

[1077] Just present yourself in the most authentic way possible, meaning be the same person online as you are privately, have nothing to hide.

[1078] That's one, not the only, but one of the ways to achieve security.

[1079] Maybe I'm totally wrong on this, but don't be secretly weird.

[1080] If you're weird, be publicly weird, so it's impossible to blackmail you.

[1081] That's my approach to the security.

[1082] Yeah, well, they call it the New York Times front page phenomenon. You know, don't put anything in email, or I guess social media these days, that you wouldn't want to read on the front page of the New York Times.

[1083] And that works, but, you know, sometimes I even get carried away. I mean, I have, I don't have as many followers as you, but a lot of followers, and sometimes even I get carried away, to be emotional and stuff, to say something. Yeah, yeah. I mean, just the cortisol response on Twitter. You know, Twitter is basically designed to elicit those responses.

[1084] I mean, every day I turn on my computer, I look at my phone, I look at what's trending on Twitter, and it's like, what are the topics that are going to make people the most angry today, you know?

[1085] And, you know, it's easy to get carried away.

[1086] But it's also just, that sucks too, that you have to be constantly censoring yourself.

[1087] And maybe it's for the better.

[1088] Maybe you can't be a secret asshole.

[1089] And we can put that in the good bucket.

[1090] But at the same time, you know, there is a danger to that other voice, to creativity, you know, to being weird.

[1091] There is a danger to that little whispered voice that's like, well, how would people read that?

[1092] You know, how could that be manipulated?

[1093] How could that be used against you?

[1094] And that stifles creativity and innovation and free thought.

[1095] And, you know, that is on a very micro level.

[1096] And that's something I think about a lot.

[1097] And that's actually something that Tim Cook has talked about a lot.

[1098] And why he has, you know, said he goes full force on privacy is, it's just that little voice that is at some level censoring you.

[1100] And what is sort of the long -term impact of that little voice over time?

[1101] I think there's a way, I think that self-censorship is an attack vector that there are solutions to.

[1102] The way I'm really inspired by Elon Musk, the solution to that is just be privately and publicly the same person and be ridiculous.

[1103] Embrace the full weirdness and show it more and more.

[1104] So, you know, there's memes, there's, like, ridiculous humor. And I think, uh, if there is something you really want to hide, deeply consider if you want to be that. Like, why are you hiding it? What exactly are you afraid of? Because I think my hopeful vision for the internet is, the internet loves authenticity. They want to see you weird, so be that, and, like, live that fully. Because I think that gray area where you're kind of censoring yourself, that's where the destruction is.

[1105] You have to go all the way.

[1106] Step over.

[1107] Be weird.

[1108] And then it feels it can be painful because people can attack you and so on, but just ride it.

[1109] I mean, that's just, like, a skill on the social psychological level that ends up being an approach to security, which is, like, remove the attack vector of having private information by being your full weird self publicly.

[1111] What advice would you give to young folks today, you know, operating in this complicated space, about how to have a successful life, a life they can be proud of, a career they can be proud of, maybe somebody in high school or college thinking about what they're going to do?

[1112] Be a hacker.

[1113] You know, if you have any interest, become a hacker and apply yourself to defense. You know, we do have these amazing scholarship programs, for instance, where, you know, they find you early, they'll pay your college as long as you commit to some kind of federal commitment to sort of help federal agencies with cybersecurity. And where does everyone want to go every year from the scholarship program? They want to go work at the NSA or Cyber Command. You know, they want to go work on offense. They want to go do the sexy stuff. It's really hard to get people to work on defense.

[1115] It's just, it's always been more fun to be a pirate than to be the Coast Guard, you know. And so we have a huge deficit when it comes to filling those roles.

[1116] There are 3.5 million unfilled cybersecurity positions around the world.

[1117] I mean, talk about job security, like be a hacker and work on cybersecurity.

[1118] You will always have a job.

[1119] And we're actually at a huge deficit and disadvantage as a free market economy, because we can't match cybersecurity salaries at Palantir or Facebook or Google or Microsoft.

[1120] And so it's really hard for the United States to fill those roles.

[1121] And other countries have had this workaround where they basically have forced conscription on some level.

[1122] You know, China tells people, like, you do whatever you're going to do during the day, work at Alibaba, you know, if you need to do some ransomware, okay.

[1124] But the minute we tap you on the shoulder and ask you to come do this sensitive operation for us, the answer is yes.

[1125] You know, same with Russia.

[1126] You know, a couple years ago when Yahoo was hacked and they laid it all out in an indictment, it came down to two cybercriminals and two guys from the FSB.

[1127] The cybercriminals were allowed to have their fun, but the minute they came across the username and password for the personal Yahoo account of someone who worked at the White House or the State Department or the military, they were expected to pass that over to the FSB.

[1128] So we don't do that here.

[1129] And it's even worse on defense.

[1130] We really can't fill these positions.

[1131] So, you know, if you are a hacker, if you're interested in code, if you're a tinkerer, you know, learn how to hack.

[1132] There are all sorts of amazing hacking competitions you can do through the SANS organization, for example, S-A-N-S.

[1133] And then use those skills for good, you know. Neuter the bugs in that code that get used by autocratic regimes to make people's lives, you know, a living prison. You know, plug those holes, you know, defend industrial systems, defend our water treatment facilities from hacks where people are trying to come in and poison the water.

[1134] You know, that I think is just an amazing, it's an amazing job on so many levels.

[1135] It's intellectually stimulating.

[1136] You can tell yourself you're serving your country.

[1137] You can tell yourself you're saving lives and keeping people safe.

[1138] And you'll always have amazing job security.

[1139] And if you need to go get that job that pays you, you know, two million bucks a year, you can do that too.

[1140] And you can have a public profile, more of a public profile.

[1141] You can be a public rock star.

[1142] I mean, it's the same thing as sort of the military. There's a lot of, there's a lot of well-known sort of people commenting on the fact that veterans are not treated as well as they should be, but it's still the fact that soldiers are deeply respected for defending the country, the freedoms, the ideals that we stand for.

[1143] And in the same way, I mean, in some ways, cybersecurity defenders are the soldiers of the future.

[1144] Yeah.

[1145] And, you know, it's interesting.

[1146] I mean, in cybersecurity, the difference is oftentimes you see the more interesting threats in the private sector, because that's where the attacks come.

[1147] You know, when cybercriminals and nation-state adversaries come for the United States, they don't go directly for Cyber Command or the NSA.

[1148] No, they go for banks.

[1149] They go for Google.

[1150] They go for Microsoft.

[1151] They go for critical infrastructure.

[1152] And so those companies, those private sector companies, see some of the most advanced, sophisticated attacks out there.

[1153] And, you know, if you're working at FireEye and you're calling out the SolarWinds attack, for instance, I mean, you just saved God knows how many systems from, you know, that compromise turning into something that more closely resembles sabotage.

[1154] So, you know, go be a hacker and or go be a journalist.

[1155] So you wrote the book.

[1156] This is how they tell me the world ends, as we've been talking about, of course, referring to cyber war, cybersecurity.

[1157] What gives you hope about the future of our world if it doesn't end?

[1158] How will it not end?

[1159] That's a good question.

[1160] I mean, I have to have hope, right?

[1161] Because I have a kid and I have another on the way.

[1162] And if I didn't have hope, I wouldn't be having kids.

[1163] But it's a scary time to be having kids.

[1164] And now it's like pandemic, climate change, disinformation, increasingly advanced, perhaps deadly cyber attacks.

[1165] What gives me hope is that I share your worldview that I think people are fundamentally good.

[1166] And sometimes, and this is why the metaverse scares me to death, when I'm reminded of that, it's not online.

[1167] Like online, I get the opposite.

[1168] You know, you start to lose hope in humanity when you're on Twitter half your day.

[1169] It's like when I go to the grocery store or I go on a hike or like someone smiles at me or, you know, or someone just says something nice, you know, people are fundamentally good.

[1170] We just don't hear from those people enough.

[1171] And my hope is, you know, I just think our current political climate, like, we've hit rock bottom. You know, this is as bad as it gets. We can't do anything.

[1172] Don't jinx it. Well, but I think it's a generational thing. You know, I think baby boomers, like, it's time to move along. I think it's time for a new generation to come in. And I actually have a lot of hope when I look at, you know, I'm sort of, I guess they call me a geriatric millennial or a young Gen X. But, like, we have this unique responsibility, because I grew up without the Internet and without social media, but I'm native to it.

[1173] So I know the good and I know the bad.

[1174] And that's true on so many different things.

[1175] You know, I grew up without climate change anxiety and now I'm feeling it and I know it's not a given.

[1176] We don't have to just resign ourselves to climate change.

[1177] you know, same with disinformation.

[1178] And I think a lot of the problems we face today have just exposed the sort of inertia that there has been on so many of these issues.

[1179] And I really think it's a generational shift that has to happen.

[1180] And I think this next generation is going to come in and say, like, we're not doing business like you guys did it anymore.

[1181] You know, we're not just going to like rape and pillage the earth and try and turn everyone against each other and play dirty tricks and let lobbyists dictate, you know, what we do or don't do as a country anymore.

[1182] And that's really where I see the hope.

[1183] It feels like there's a lot of low -hanging fruit for young minds to step up and create solutions and lead.

[1184] So whenever, like, politicians or leaders that are older, like you said, are acting shitty, I see that as a positive.

[1186] They're inspiring a large number of young people to replace them.

[1187] And so it's, I think you're right.

[1188] There's going to be, it's almost like you need people to act shitty to remind them, oh, wow, we need good leaders.

[1189] We need great creators and builders and entrepreneurs and scientists and engineers and journalists.

[1190] Yeah.

[1191] You know, all the discussions about how journalism is, quote unquote, broken and so on, that's just an inspiration for new institutions to rise up that do journalism better, new journalists to step up and do journalism better.

[1192] So I, and I've been constantly, when I talk to young people, I'm constantly impressed by the ones that dream to build solutions.

[1193] And so that's, that's ultimately where I put the hope.

[1194] But the world is a messy place like we've been talking about.

[1195] It's a scary place.

[1196] Yeah, and I think you hit something, hit on something earlier, which is authenticity.

[1197] Yes.

[1198] Like, no one that is plastic is going to rise above anymore.

[1200] You know, people are craving authenticity.

[1201] You know, the benefit of the Internet is it's really hard to hide who you are on every single platform, you know, on some level it's going to come out who you really are.

[1202] And so you hope that, you know, by the time my kids are grown, like, no one's going to care if they made one mistake online so long as they're authentic.

[1203] You know, and I used to worry about this.

[1204] My nephew was born the day I graduated from college.

[1205] And I just always, you know, he's like born into Facebook.

[1206] And just think like, how is a kid like that ever going to be president of the United States of America?

[1207] Because if Facebook had been around when I was in college, you know, like Jesus.

[1208] You know, how are those kids ever going to be president?

[1209] There's going to be some photo of, you know, them at some point making some mistake and that's going to be all over for them.

[1210] And now I take that back.

[1211] Now it's like, no, everyone's going to make mistakes.

[1212] There's going to be a picture for everyone.

[1213] And we're all going to have to come and grow up to the view that as humans, we're going to make huge mistakes.

[1214] And hopefully they're not so big that they're going to ruin the rest of your life.

[1215] But we're going to have to come around to this view that we're all human, and we're going to have to be a little bit more forgiving and a little bit more tolerant when people mess up, and we're going to have to be a little bit more humble when we do, and, like, keep moving forward. Otherwise, you can't, like, cancel everyone. Nicole, this is an incredible, hopeful conversation, also, um, one that reveals that in the shadows there's a lot of challenges to be solved. So I really appreciate that you took on this really difficult subject with your book. That's journalism at its best.

[1216] So I'm really grateful that you took the risk that you took that on and that you plugged the cable box back in.

[1217] That means you have hope.

[1218] And thank you so much for spending your valuable time with me today.

[1219] Thank you.

[1220] Thanks for having me. Thanks for listening to this conversation with Nicole Perlroth.

[1221] To support this podcast, please check out our sponsors in the description.

[1222] And now, let me leave you with some words from Nicole herself.

[1223] Here we are, entrusting our entire digital lives, passwords, texts, love letters, banking records, health records, credit cards, sources, and deepest thoughts to this mystery box, whose inner circuitry most of us would never vet.

[1225] Run by code written in a language most of us will never fully understand.

[1226] Thank you for listening and hope to see you next time.